I agree that the solution is better apps integration, but I don't know that putting apps on the camera is the right approach.
I always envisioned the personal cloud as a set of interconnected devices, each good at some things and not so good at others. Some devices have a nice screen (glasses and/or a tablet). At least one device has a solid app library and effective control UX (it could be the same device with the nice display, but maybe another device if your best display is in your glasses). At least one device has network connections to the rest of the planet. One device has good audio and a mic and is usually on or near your head. You've got some storage.
And when you wanted it, you could carry a camera that just worked. It had a level of control complexity suited to your needs. It took amazing images (stills or video). And it talked to the rest of your devices. Images and video were available for review on your displays, and could be shared trivially with the rest of the world via the apps you've installed on your core device, etc.
But trying to control something like Android apps through the camera UI? Color me skeptical that that's the solution we'll end up with. I'm also highly skeptical that smartphone cameras, even with better optics and apps, can replace anything but snapshot cameras (good riddance to those, IMHO).
For now, just integrate the camera with your phone via an app that lets the external camera act as if it were a device on the phone (with all the app integration that entails). The rest will follow.