Google shows off "OK Glass" voice control system for its electronic eyewear. Pretty interesting. I wonder if the highly personal aspect of the device from the user's perspective will overcome the alienation others might feel at talking to cyborg people.
Probably not for a few generations. In Germany at least there are too many people who are afraid of technology and especially anything created by Google.
I think society will adjust rather quickly. We got used to people running around with headphones on their ears after the Walkman was introduced, nowadays everyone stares into their smartphone - all that seemed pretty alien in the beginning too.
People wear glasses all the time, so I think they're trying to capitalize on that familiarity. A few things:

No one else in the video seems to be wearing Google Glass. Is Google predicting a low-adoption device?

What if you already wear prescription lenses?

What about activities where you need your full field of view? We already have anti-texting laws.

Isn't my cell more convenient and less obtrusive for all of this already?
Google is working on integration with prescription glasses. They have dark-glasses versions. The display only takes up a little bit of your field of view, so I don't see a problem as currently designed for most situations. Re: texting, the display isn't terribly practical for reading much; Google wants you to operate Glass with your voice and have texts read to you, I believe. I still think 90% of the driver-distraction problem with electronics is operating the gizmo at all rather than operating it with your hands, so I think Glass is potentially an incremental worsening of the situation. Fortunately, we'll all be spacing out in Google self-driving cars so it won't matter.

I agree people wear glasses a lot, but I don't think Google is trying to capitalize on that per se; it more boils down to: where else are you going to put it? I do think it's different with the little camera peering at you, and if people's attention wanders to the display, you'll notice it just like you notice when the person you're talking to checks their mobile phone mid-conversation. None of these are insurmountable, but I think there will be adjustments required.

Your mobile phone is more convenient, and you can put it away, which I think is a feature, not a bug, at least in many human-interaction situations. You can't take videos and photos using your phone unless you have hands free, though. Driving directions will be wherever you set your phone down, not hovering up above your field of vision. There are a lot of compelling advantages to Project Glass IMO. I wouldn't be surprised if it ends up being more an adjunct to your phone than a standalone device, though.
Yesterday, I was sitting working in a coffee shop for 5 hours plugged into my phone via noise-blocking headphones. Using that interface, I could see running voice navigation as well as other location-based, voice-delivered services. (Perhaps the headphones would be less noise-blocking in those scenarios.)

So, I can see your point about having Glass as an adjunct to a central phone-like device. The question then becomes: what is the increment of the HUD plus video-capture capabilities in the eyeglass component?

Note, I'm just being analytical here. I'm 70% of the way to applying for the #ifihadglass contest.