Robert Brown (Robert Lawson Brown)
Robert's posts

#ifihadglass I would stream some of my real life into my Second Life(tm), via a shared media object showing RL within SL. Perhaps like a keyhole from one World to the Other. Virtual Realism.

A new word may be invented when the language is advanced by such invention. Accordingly, I submit the word "eor" which is to be defined as "joining two phrases where one and only one of the phrases is true". The word "eor" is not to be confused with the acronym EOR, nor with the ancient rune eor, which will be clear from context. The new word will reduce the use of the "and/or" construction, which henceforth will be just an ordinary "or", with a consequent great reduction in the loss of human life over this issue. To deploy this new word into language, please immediately gather up all previously published dictionaries, appropriately revise the content, and republish. If we start now, perhaps this can be accomplished by Saturday.
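For the logically inclined, the "eor" being proposed is simply the exclusive-or. A minimal sketch in Python (the function name `eor` is just this post's coinage, not a standard operator):

```python
def eor(p, q):
    """True when one and only one of p, q is true (exclusive or)."""
    return bool(p) != bool(q)

# The plain "or" is inclusive: true when at least one phrase is true.
print(True or True)      # inclusive: True
print(eor(True, True))   # exclusive: False, since both phrases are true
print(eor(True, False))  # True: exactly one phrase is true
```

The inequality trick (`!=` on the two truth values) is the idiomatic way to express exclusive-or over booleans.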

I have been reading recently (and in the distant past) about the "post personal computer" era, said to be forthcoming in the next technology cycle, or thereabouts. Several noted technology journalists and pundits have given credible reasons for a shift from individually owned desktop or laptop computers to other devices, in particular devices in the mould of iOS™ devices. If you search for references to "post PC", you will find around 200,000 mentions on the web for the past year alone.

Having evolved my own workflow over forty-three years of using computers, from big iron with all of 54 KB of workspace and all the storage you need on Hollerith cards, to my nice little iMac with 16 GB of memory and 9 TB of storage, I can appreciate crawling out of the sea onto land, and later taking to the air.

To begin, consider the nature of the classical computer. Such a device is self-contained. It has means to accept information, to store information, to process that information, and to reveal information. Most people today will go with a more restrictive description: it has a keyboard, mouse, hard drive or SSD, CPU, graphics card, and display. Microphone, webcam, speakers, Bluetooth, and WiFi are also de rigueur.

Both my iMac and my iPad fit this description. So where is the evolutionary demarcation? It is in two major aspects.

First is the location of the computer. It is no longer tied to your desk (or the basement, in the case of old big iron). Instead, it is with you, in your pocket or in your hand, all of the time. Checking something on the web is a casual decision: you can do it at any urban place, most suburban places, and usually find a place within an hour or so in a rural environment. You can post a short message, read the responses, put numbers into a spreadsheet, dictate a letter, see what is nearby, buy something, record some audio or video, and on and on. Even without web access, your music library is right there, your photo library is right there, your book library is right there.

Second is the location of information. You do not have to keep it all on the computer. It can be safely and securely stored in professionally maintained bunkers of data servers, i.e. the cloud. Wherever you have a connection to the web, you have access. If your own computer is destroyed or stolen, you can still get at the information you have accumulated.

The above is true for most of you. Perhaps not for most of those who read my postings, but in the greater community, yes.  There are deviations from this cozy scenario.

The most important is that while mobile devices can handle many of the typical personal transactions, interactions, and computations, they would be a pain to use for developing, writing, and testing new code and applications. Similarly, a mobile device is not a good graphic design workstation, nor a full audio or video production studio. I feel I can generalize this: much of our creative activity will require computing on a greater scale than mobile devices provide. If the mobile devices have capability at level "X", then to create the media and applications that are used on them, you will require a workstation of capability N*X, where N is much greater than one. The creative people will always want more than the targeted system.

The next deviation from the mobile device scenario involves those who have to maintain large amounts of personal data. This need is likely just temporary. The available cloud services keep increasing the amount of data they will store for you. But I have several terabytes allocated to video, audio, and images. When really good high-speed connections are available most everywhere, and when the cloud services can reasonably offer tens of terabytes to individuals, I will happily transfer my data mass to the cloud.

But all of this is the future, perhaps as much as five or six months away.

I happened to notice a post promoting the wearing of a bit of jewelry, a necklace pendant, that proclaims the wearer as being an atheist. The idea, it seems, is to make atheism more visible. Christians wear crosses, fish, and other symbols, and other forms of faith have their own proclamation jewelry as well. It makes them socially visible and promotes social acceptability. Should not atheists do so as well?

I think so, but I can not wear such, because I am not an atheist; I am an agnostic. Atheism requires that the proponent make an absolute statement: "there is no god". Not that there is necessarily no spirit, no backstage to the Universe, but there is no god. People do tend to extend atheism to become equivalent to an absolute materialist stance, but the traditional accepted term is more limited, in my opinion.

Accordingly, as I can not state with certainty that there is no god, I can not honestly wear the badge of an atheist. I do not know whether or not there is a god. I can not even come up with a good definition of god to believe or not believe in.

Modern atheism seems to hold that since physics has shown a god to be unnecessary, its existence is thereby forbidden -- anything not compulsory is forbidden. Strangely, not so long ago, a famous physicist (Gell-Mann) said "Everything not forbidden is compulsory." Of course, he was speaking in particular of the necessary inclusion of all possible interactions for quantum particles. But it is possible to ask: if god is not required, yet not forbidden, could there be a god? That is why I feel the agnostic stance is the appropriate one.

However, bear in mind that physics is an overwhelmingly successful, purely materialist view, by which I mean nothing happens without observable cause, and all such causes fit in the realm of knowable physical laws. God is not required. Spiritualism, in the strong sense, is not required. No unknowable backstage to the Universe.

If something is always unknowable, perhaps I am wrong to even consider it, for something that is both unknowable and has no observable effect is almost equivalent to nonexistent, except by sophomoric hair-splitting. If it is knowable, does it not become part of physics, and hence not a god but a phenomenon, reducible to physical law? On the other hand, does an observable effect without observable cause prove god?

Bottom line: we need to take the stance that there are things that are temporarily unknown, maybe even unknowable. I can say what I observe is real, but I can not say what I have not observed is not real.

What would be a good pendant for the agnostic? A forked path? An infinity symbol? Or just a question mark?

It appears to be time to retire my old Mac Pro tower and advance to a better, newer system. I will not fully retire it -- it runs Mac OS X Lion just fine, and can certainly maintain its usefulness as a Mathematica(tm) processor. But it just is not managing my video editing tasks under the latest Final Cut Pro X, especially when ingesting and transcoding AVCHD. So I will add a workstation for video to my suite. I am thinking in terms of this specification, which is available more or less off the shelf at any Apple retailer:

iMac 27 inch Intel Core i7
2TB serial ATA drive
AMD Radeon HD 6970M 2GB GDDR5
Magic Trackpad
Wireless keyboard

With the Thunderbolt capability, I can add external drive clusters to keep my video files. I tend to need around 4 TB for current projects.

The interesting point in this is that I find the tower configuration is no longer necessary. I foresee more use of external components, e.g. drives, audio interfaces, video interfaces, via optical Thunderbolt, taking over the role formerly provided by PCI buses and cards. So having used towers for roughly a quarter of a century, I believe the one I currently have is the last I will own.

I am of two minds -- actually more than two minds -- concerning the Scott Thompson affair. I am sure everyone in my circles is aware, but just to recap, it is alleged that Scott Thompson, CEO of Yahoo, wrote on his resume that he obtained a computer science degree back in 1979, when in fact he did not obtain it, nor even attend classes for it.

First, I do not undervalue the "things learned in life" versus academic credentials. I say this as a person who earned a doctorate in physics back in 1978. I fully respect that twenty or thirty years of career work may be a person's most valid credential.

However, while I do not, at this stage in my life and my career, practically apply my knowledge of atomic and condensed matter, nor the bits and pieces I have in quantum mechanics, relativity and the standard model, the process of obtaining that doctoral credential was (is?) key to my success. The ability to engage in lifelong study, learning, and application of the new is the real advantage of my degree. Accordingly, I am not shy to show that credential, though many might disparage it as a mere "wall plaque".

Looking at some commentary I find on the web, and especially in online forums, I find three main lines of thought on the issue of people fabricating academic standing. Some take a hard line -- the man lied -- this shows he can not be trusted. A second view seems to be the let it be stance -- sure he lied, but 'everyone' is doing it. Finally, there is the academics is crap anyway view, so why not lie to undo the 'artificial and unfair' advantage 'those people' have in the job market -- let a guy's career prove his worth.

Starting with the first view, I find it very disappointing that a person, especially one of my generation, would lie about obtaining an academic credential. When I was younger and just starting out, this was a rare event. Sure, a tiny fraction might try it. It was considered horribly dishonorable. In addition, back then, the personnel department (it was not called Human Resources, it was Personnel) would expect to fetch your transcripts straight from the bursar's office of whatever institution you claimed on your resume. They would call up former employers as well, and there would be none of this "we are not legally permitted to comment on former employees". You would actually get an honest answer: "yes, we hated to see him leave us, but he was great" or "Yeah, we caught him stealing ketchup packets from the cafeteria".

So, for lying in the first place, Scott Thompson gets a bad mark, unless the "inadvertent" claim was due to someone else, not him.

The second view -- that it is a commonplace gambit to climb the corporate ladder -- is just sad. It is an attempt to legitimize the money-for-nothing philosophy -- the gonna make my nut without being the chump that has to actually work for it. It is the same philosophy that says you do not have to be productive, merely clever and sneaky. Ride the backs of those who are too stupid to realize it is all about what you can finagle. Sociopaths rule.

Toward the third view -- that "life work" is more important than schooling -- as I have stated, to some degree I can accept that. But I do not see a real distinction that makes the eleven years of university work leading to a doctorate in physics not part of "life work". It seems that when the anti-academics claim that academic degrees are meaningless distinctions, they themselves are making a meaningless distinction.

It would be nice if those who could not get into academic programs, or did not find traditional academic learning beneficial to them, could get recognition of the same caliber for their life work. The problem is one of vetting such work. Maybe some universities could consider a life-work vetting program. The candidate could submit a chronology of work, which university professors might examine, confirm, and consider. The candidate might visit the university to offer an oral dissertation on such work. Perhaps a written thesis might be required. The degree granted might be a new category, perhaps "master-at-large", where the "at large" appellation is understood as meaning "not resident at the university".

So here are my multiple minds on Scott Thompson. I believe that he likely does fit into the "master at large" category in computer science. That is unproven, but it seems strongly indicated. Nonetheless, the lie on his resume does reflect on his personal integrity. This must be answered. Finally, there is a feeling that maybe he is just acting as the accepted ethics of our current society dictate, a world where appearances are more important than accomplishment.


I recently read about flipped classrooms for high school education. In brief, students view online lectures at any time they wish, anywhere they wish, using smartphones, tablets, or desktop computers. When they come to the classroom, instead of a lecture, they work on assignments such as essays or math or analytical problems. The teacher is available to help each student individually with problem spots. In effect, homework is now classwork and vice versa, hence the flipped classroom.

The main objection that might be raised is that students will simply not bother with watching the lectures when left to their own devices. However, in early implementations of flipped classrooms, that does not appear to be the case. It may be that a student finds school lectures more appealing when he or she can treat the lecture like a watch-at-home movie, pausing, repeating, or breaking it into shorter sessions.

It seems to be good for the teachers as well. They can carefully put together one well polished reusable lecture series, saving the bulk of their time for individual student interaction.

As I read of this, it occurred to me that it is like the proposed online college-level courses, such as the MITx project. We could see a college/university system where the curriculum portion is self-paced and self-monitored. A practicum portion (expanding the definition of practicum to cover all hands-on work) would be in the classroom, with lots of individual instruction. Certification and degree granting might be by traditional testing, or by thesis submission and oral examination. It is conceivable that the "resident" time of the student at the college could be drastically reduced, but during that time the availability of the professors for that student would be similarly increased.

A recent study (see the Science section of the New York Times for January 2nd, 2012, "In Classic vs. Modern..." by Nicholas Wade) suggests that Stradivarius violins made a third of a millennium ago are not necessarily better, musically, than well-made modern violins. A quick check reveals that this was not the first such study to obtain this conclusion -- there have been others. It is known that "reproduction" instruments have come very close to matching the best of the old instruments. Certainly objections can be made to this recent study, and to the ones that preceded it: Was an "average" Stradivarius compared to the best of the modern field? Were the test conditions in performance venues? Were the musicians serving as judges really capable of perceiving the differences?

However, taking the study's conclusion as stated, it raises this question: why do we associate quality with rarity or high cost or with being old school, or being just plain difficult?

This happens in many situations. I remember when CD technology was introduced. Some audiophiles claimed it was unacceptable, because they could hear the 44.1 kHz sampling "like a buzz saw screaming in their heads." Never mind neuro-physics. Thirty years later, those audiophiles keep treasured CDs as their master sound references. Some photography schools continue to turn up their noses at digital media, insisting that only with film and paper photo chemistry can "real" art be done.
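As a quick sanity check on the audiophile claim: the Nyquist theorem says a sample rate can faithfully represent frequencies up to half that rate, and CD audio samples at 44.1 kHz. The ~20 kHz hearing limit below is the commonly cited adult upper bound, an approximation I am assuming here:

```python
# Nyquist: a sample rate fs can represent frequencies up to fs / 2.
CD_SAMPLE_RATE_HZ = 44_100
HEARING_LIMIT_HZ = 20_000  # commonly cited adult upper limit (approximate)

nyquist_hz = CD_SAMPLE_RATE_HZ / 2
print(nyquist_hz)                     # 22050.0
print(nyquist_hz > HEARING_LIMIT_HZ)  # True: CD sampling spans the audible range
```

So the sampling frequency itself sits well above anything human ears can register, which is the "never mind neuro-physics" point.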

I invite you to add in the comments cases where rarity or difficulty truly make a difference, and cases where it patently does not.