Profile

Robert Brown (Robert Lawson Brown)
Attended University of Virginia (Ph.D., Physics)
Lives in Palo Alto, California
69 followers | 8,727 views

Stream

Robert Brown

#ifihadglass I would stream some of my real life into my Second Life(tm), via a shared media object showing RL within SL. Perhaps like a keyhole from one World to the Other. Virtual Realism.

Robert Brown

I happened to notice a post promoting the wearing of a bit of jewelry, a necklace pendant, that proclaims the wearer to be an atheist. The idea, it seems, is to make atheism more visible. Christians wear crosses, fish, and other symbols, and other forms of faith have their own proclamation jewelry as well. It makes them socially visible and promotes social acceptability. Should not atheists do so as well?

I think so, but I cannot wear such, because I am not an atheist; I am an agnostic. Atheism requires the proponent to make an absolute statement: "there is no god". Not that there is necessarily no spirit, no backstage to the Universe, but that there is no god. People do tend to extend atheism to be equivalent to an absolute materialist stance, but the traditional accepted term is more limited, in my opinion.

Accordingly, as I cannot state with certainty that there is no god, I cannot honestly wear the badge of an atheist. I do not know whether or not there is a god. I cannot even come up with a good definition of god in which to believe or not believe.

Modern atheism seems to hold that as physics has shown a god to be unnecessary, its existence is thereby forbidden -- anything not compulsory is forbidden. Strangely, not so long ago, a famous physicist (Gell-Mann) said "Everything not forbidden is compulsory." Of course, he was speaking in particular of the necessary inclusion of all possible interactions for quantum particles. But it is possible to ask: if god is not required, yet not forbidden, could there be a god? That is why I feel the agnostic stance is the appropriate one.

However, bear in mind that physics is an overwhelmingly successful, purely materialist view, by which I mean nothing happens without observable cause, and all such causes fit in the realm of knowable physical laws. God is not required. Spiritualism, in the strong sense, is not required. No unknowable backstage to the Universe.

If something is always unknowable, perhaps I am wrong to even consider it, for something that is both unknowable and has no observable effect is almost equivalent to nonexistent, except as sophomoric hair-splitting. If it is knowable, does it not become part of physics, and hence not a god but a phenomenon, reducible to physical law? On the other hand, does an observable effect without observable cause prove god?

Bottom line: we need to take the stance that there are things that are temporarily unknown, maybe even unknowable. I can say what I observe is real, but I cannot say what I have not observed is not real.

What would be a good pendant for the agnostic? A forked path? An infinity symbol? Or just a question mark?
2 comments
Evidence is sometimes individualized. I have experienced many instances where something beyond the visual has affected my life and even saved my life. Too many times to dismiss as luck. Religion calls this God or Allah or spirits. No matter what you call it, there is more than meets the eye. Science exists in a time frame, and religion teaches us that God exists in eternity, which has no beginning nor end. The conflicts between religion and science are very much time-frame oriented. First there was light, then water (science suggests water may have come from above, possibly via meteors), then plant and animal life, and finally mankind.

The atheists I know do not believe in an afterlife. I do. My best friend died believing he was "dust in the wind".

My main problem with evolution and the purely scientific way of looking at things is my sense of self. Where did that come from, and why do I, as an individual with a sense of self, exist?

Robert Brown

I am of two minds -- actually more than two minds -- concerning the Scott Thompson affair. I am sure everyone in my circles is aware, but just to recap: it is alleged that Scott Thompson, CEO of Yahoo, wrote on his resume that he obtained a computer science degree back in 1979, a degree he did not in fact obtain, nor did he even attend the classes.

First, I do not undervalue the "things learned in life" versus academic credentials. I say this as a person who earned a doctorate in physics back in 1978. I fully respect that twenty or thirty years of career work may be a person's most valid credential.

However, while I do not, at this stage in my life and career, practically apply my knowledge of atomic and condensed matter physics, nor the bits and pieces I have in quantum mechanics, relativity, and the standard model, the process of obtaining that doctoral credential was (is?) key to my success. The ability to engage in lifelong study, learning, and application of the new is the real advantage of my degree. Accordingly, I am not shy to show that credential, though many might disparage it as a mere "wall plaque".

Looking at some commentary I find on the web, and especially in online forums, I find three main lines of thought on the issue of people fabricating academic standing. Some take a hard line -- the man lied, and this shows he cannot be trusted. A second view is the "let it be" stance -- sure he lied, but 'everyone' is doing it. Finally, there is the "academics is crap anyway" view: why not lie to undo the 'artificial and unfair' advantage 'those people' have in the job market -- let a guy's career prove his worth.

Starting with the first view, I find it very disappointing that a person, especially one of my generation, would lie about obtaining an academic credential. When I was younger and just starting out, this was a rare event. Sure, a tiny fraction might try it. It was considered horribly dishonorable. In addition, back then, the personnel department (it was not called Human Resources, it was Personnel) would expect to fetch your transcripts straight from the bursar's office of whatever institution you claimed on your resume. They would call up former employers as well, and there was none of this "we are not legally permitted to comment on former employees". You would actually get an honest answer: "Yes, we hated to see him leave us, but he was great" or "Yeah, we caught him stealing ketchup packets from the cafeteria".

So, for lying in the first place, Scott Thompson gets a bad mark, unless the "inadvertent" claim was due to someone else, not him.

The second view -- that it is a commonplace gambit to climb the corporate ladder -- is just sad. It is an attempt to legitimize the money-for-nothing philosophy -- the "gonna make my nut without being the chump that has to actually work for it". It is the same philosophy that says you do not have to be productive, merely clever, sneaky. Ride the backs of those who are too stupid to realize it is all about what you can finagle. Sociopaths rule.

Toward the third view -- that "life work" is more important than schooling -- as I have stated, to some degree I can accept that. But I do not see a real distinction that excludes the eleven years of university work leading to a doctorate in physics from "life work". It seems that when the anti-academics claim that academic degrees are meaningless distinctions, they themselves are making a meaningless distinction.

It would be nice if those who could not get into academic programs, or did not find traditional academic learning beneficial to them, could get recognition of the same caliber for their life work. The problem is one of vetting such work. Maybe some universities could consider a life-work vetting program. The candidate could submit a chronology of work, which university professors might examine, confirm, and consider. The candidate might visit the university to offer an oral dissertation on such work. Perhaps a written thesis might be required. The degree granted might be a new category, perhaps "master-at-large", where the "at large" appellation is understood as meaning "not resident at the university".

So here are my multiple minds on Scott Thompson. I believe it likely that he does fit into the "master at large" category in computer science. That is unproven, but it seems strongly indicated. Nonetheless, the lie on his resume does reflect on his personal integrity. This must be answered. Finally, there is a feeling that maybe he is just acting as the accepted ethics of our current society dictate, in a world where appearances are more important than accomplishment.
Yes -- since I hold a master's-degree job with only a BA, I have a master's (if not a doctorate) in job experience that far outweighs those who work for me who have or will soon obtain a doctorate. They cannot fill my shoes at this time. Their formal education is too much theory and not enough practicality. Of course, if I had the MA then I might earn $5,000 more a year, or have been promoted before I was ready.

Oddly, though, I am the only supervisor with a course in supervision. Somehow they all missed that one in their pursuit of their more theoretical degrees.

If I ran a business, you would start at the bottom regardless of degree (as I would want you to know the business from bottom to top), and hopefully your higher education would give you the skills to move up quicker than others, but that is not always true. This method also allows future supervisors to really understand what their workers experience on the job.

I have 7 full-time employees, 5 of whom have higher degrees than mine. They all admit that there are skills in my job that they lack.

I have 9 part-time employees, 2 of whom have higher degrees than mine.

Robert Brown

I recently read about flipped classrooms for high school education. In brief, students view online lectures at any time they wish, anywhere they wish, using smartphones, tablets, or desktop computers. When they come to the classroom, instead of a lecture, they work on assignments such as essays or math or analytical problems. The teacher is available to help each student individually with problem spots. In effect, homework is now classwork and vice versa; hence the flipped classroom.

The main objection that might be raised is that students will simply not bother with watching the lectures when left to their own devices. However, in early implementations of flipped classrooms, that does not appear to be the case. It may be that a student finds school lectures more appealing when he or she can treat the lecture like a watch-at-home movie, pausing, repeating, or breaking it into shorter sessions.

It seems to be good for the teachers as well. They can carefully put together one well-polished, reusable lecture series, saving the bulk of their time for individual student interaction.

As I read of this, it occurred to me that it is like the proposed online college-level courses, such as the MITx project. We could see a college/university system where the curriculum portion is self-paced and self-monitored. A practicum portion (expanding the definition of practicum to cover all hands-on work) would be in the classroom, with lots of individual instruction. Certification and degree granting might be by traditional testing, or by thesis submission and oral examination. It is conceivable that the "resident" time of the student at the college could be drastically reduced, but during that time the availability of the professors for that student would be similarly increased.
I am a believer in self-paced education and no social promotions. Solve the sports problems by putting sports in parks and rec. I also believe that education has to move beyond the education factories designed for 1900, not 2000+.

Robert Brown

A recent study (see the Science section of the New York Times for January 2nd, 2012, "In Classic vs. Modern..." by Nicholas Wade) suggests that Stradivarius violins made a third of a millennium ago are not necessarily better, musically, than well-made modern violins. A quick check reveals that this was not the first such study to reach this conclusion -- there have been others. It is known that "reproduction" instruments have come very close to matching the best of the old instruments. Certainly objections can be made to this recent study, and to the ones that preceded it: Was an "average" Stradivarius compared to the best of the modern field? Were the test conditions in performance venues? Were the musicians serving as judges really capable of perceiving the differences?

However, taking the study's conclusion as stated, it raises this question: why do we associate quality with rarity or high cost or with being old school, or being just plain difficult?

This happens in many situations. I remember when CD technology was introduced. Some audiophiles claimed it was unacceptable, because they could hear the 44.1 kHz sampling "like a buzz saw screaming in their heads." Never mind neuro-physics. Three decades later, the audiophiles keep treasured CDs as their master sound references. Some photography schools continue to turn up their noses at digital media, insisting that only with film and paper photo chemistry can "real" art be done.
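As an aside on the physics: by the Nyquist criterion, a sampling rate can faithfully represent frequencies up to half that rate, which for CDs comfortably spans the audible band. Here is a back-of-the-envelope check in Python (the ~20 kHz hearing limit is the commonly cited figure, my assumption here, not a number from the study):

    # Nyquist check for CD audio -- a minimal sketch.
    fs = 44_100             # CD sampling rate, in samples per second
    nyquist = fs / 2        # highest frequency the sampling can represent, in Hz
    hearing_limit = 20_000  # commonly cited upper bound of adult hearing, in Hz

    print(nyquist)                  # 22050.0
    print(nyquist > hearing_limit)  # True: the format covers the audible band

In other words, whatever those audiophiles heard, it was not the sampling rate itself.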

I invite you to add in the comments cases where rarity or difficulty truly make a difference, and cases where it patently does not.

Robert Brown

David Segal of Demand Progress (demandprogress.org) has reported significant progress toward preventing the passage of SOPA by the U.S. Congress. http://act.demandprogress.org/sign/sopa_win/

Robert Brown

A new word may be invented when the language is advanced by such invention. Accordingly, I submit the word "eor" which is to be defined as "joining two phrases where one and only one of the phrases is true". The word "eor" is not to be confused with the acronym EOR, nor with the ancient rune eor, which will be clear from context. The new word will reduce the use of the "and/or" construction, which henceforth will be just an ordinary "or", with a consequent great reduction in the loss of human life over this issue. To deploy this new word into language, please immediately gather up all previously published dictionaries, appropriately revise the content, and republish. If we start now, perhaps this can be accomplished by Saturday.
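Programmers, of course, have had this connective all along as exclusive or. A minimal Python sketch of the proposed semantics (the spelling "eor" comes from the post; everything else is illustrative):

    def eor(a: bool, b: bool) -> bool:
        """True when one and only one of the two phrases is true."""
        return a != b  # equivalently: bool(a) ^ bool(b)

    # Ordinary "or" is inclusive: true when at least one side is true.
    print(True or True)       # True
    # "eor" is exclusive: false when both sides are true.
    print(eor(True, True))    # False
    print(eor(True, False))   # True
    print(eor(False, False))  # False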
4 comments
As for pronunciation, I suggest "ee-or". (Think "E R".)

Robert Brown

I have been reading recently (and in the distant past) about the "post personal computer" era, said to be forthcoming in the next technology cycle, or thereabouts. Several noted technology journalists and pundits have given credible reasons for a shift from individually owned fixed or laptop computers to other devices, in particular devices in the mould of iOS™ devices. If you search for references to "post PC", you will get around 200,000 mentions found on the web for the past year alone.

Having evolved my own workflow over forty-three years of using computers, from big iron with all of 54 KB of workspace and all the storage you need on Hollerith cards, to my nice little iMac with 16 GB memory and 9 TB of storage, I can appreciate crawling out of the sea onto land, and later taking to the air.

To begin, consider the nature of the classical computer. Such a device is self contained. It has means to accept information, to store information, to process that information, and to reveal information. Most people today will go with a more restrictive description: it has a keyboard, mouse, hard drive or SSD, CPU, graphics card, and display. Microphone, webcam, speakers, Bluetooth and WiFi are also de rigueur. 

Both my iMac and my iPad fit this description. So where is the evolutionary demarcation? It is in two major aspects.

First is the location of the computer. It is no longer tied to your desk (or the basement, in the case of old big iron). Instead, it is with you, in your pocket or in your hand, all of the time. Checking something on the web is a casual decision; you can do it at any urban place, most suburban places, and usually find a place within an hour or so in a rural environment. You can post a short message, read the responses, put numbers into a spreadsheet, dictate a letter, see what is nearby, buy something, record some audio or video, and on and on and on. Even without web accessibility, your musical library is right there, your photo library is right there, your book library is right there.

Second is the location of information. You do not have to keep it all on the computer. It can be safely and securely stored in professionally maintained bunkers of data servers, i.e. the cloud. Wherever you have a connection to the web, you have access. If your own computer is destroyed or stolen, you can still get at the information you have accumulated.

The above is true for most of you. Perhaps not for most of those who read my postings, but in the greater community, yes.  There are deviations from this cozy scenario.

The most important is that while mobile devices can handle many of the typical personal transactions, interactions, and computations, a mobile device would be a pain to use to develop, write, and test new computer code and applications. Similarly, a mobile device is not a good graphic design workstation, nor a full audio or video production studio. I feel I can generalize this: much of our creative activity will require computing on a greater scale than mobile devices provide. If the mobile devices have capability at level "X", then to create the media and applications that are used on the mobile device, you will require a workstation of capability N*X, where N is much greater than one. The creative people will always want more than the targeted system.

The next deviation from the mobile device scenario is those who have to maintain large amounts of personal data. This need is likely just temporary. The available cloud services keep increasing the amount of data they will store for you. But I have several terabytes allocated to video, audio, and images. When really good high-speed connections are available most everywhere, and when the cloud services can reasonably offer tens of terabytes to individuals, I will happily transfer my data mass to the cloud.

But all of this is the future, perhaps as much as five or six months away.

Robert Brown

It appears to be time to retire my old Mac Pro tower and advance to a better, newer system. I will not fully retire it -- it runs Mac OS X Lion great, and can certainly maintain its usefulness as a Mathematica(tm) processor. But it just is not managing my video editing tasks under the latest Final Cut Pro X, especially when ingesting and transcoding AVCHD. So I will add a workstation for video to my suite. I am thinking in terms of this specification, which is available more or less off the shelf at any Apple retailer:


  • iMac 27-inch, Intel Core i7
  • 16 GB RAM
  • 2 TB Serial ATA drive
  • AMD Radeon HD 6970M, 2 GB GDDR5
  • Magic Trackpad
  • Wireless keyboard

With the Thunderbolt capability, I can add external drive clusters to keep my video files. I tend to need around 4 TB for current projects.

The interesting point in this is that I find the tower configuration is no longer necessary. I foresee more use of external components, e.g. drives, audio interfaces, video interfaces, via optical Thunderbolt, taking over the role formerly provided by PCI buses and cards. So, having used towers for roughly a quarter of a century, I believe the one I currently have is the last I will own.
2 comments
Is it a PPC G5 tower?

Robert Brown

My hometown of Palo Alto has a 71-year tradition of placing small Christmas trees along two blocks of Fulton Street, one in front of every house. The residents often decorate their homes with elaborate displays, which some say are the best in the Bay Area. The complete story of this tradition may be found on the web site: http://www.christmastreelane.org/index.php

These pictures are ones I took on my stroll down the lane on December 23rd.
People
Have him in circles
69 people
Chi Quan, Richard Aguilar, Erik Driscoll, Cortland Klein, Margaret Hill, William Baldwin, George Cleveland
Work
Occupation
scientist, programmer
Employment
  • scientist, programmer, present
Places
Currently
Palo Alto, California
Previously
Tuscaloosa, Alabama - Glassboro, New Jersey - Newark, Delaware - Charlottesville, Virginia - Chicago, Illinois
Story
Introduction
I am a physicist (U.Va. Ph.D., 1978). I am half experimentalist, half theorist, half mathematician, and half computer programmer.

I have worked in atomic studies, condensed matter, magnetic properties, semiconductor optics, plasma etching, semiconductor process development, metrology, and computer programming. 

These days, I mostly work on Web 2.0. I prefer to work in Java, but I also actively work in PHP. Python has begun to speak to me. In the past, I worked in Fortran, Basic, Forth, Pascal, and even Prolog.

I am known as Scire Gaea in Second Life.

group.as participant

This is my general Google Apps account, used primarily for the Google+ network, and YouTube. 

I am also online as "robert@rlbrown.org" and "robert@rlbrown-corporate.com", which exist for the appropriate purposes inferred from the domains.
Education
  • University of Virginia (Ph.D., Physics)
  • University of Delaware (B.S., Physics)
Basic Information
Gender
Male