Yes, +Shauna Myers, flagging would allow those extra-sensitive users to "opt out" of your posts. It's a great idea, but I wonder whether it would really work as well as it would have to for Google to lift the general censorship. Flickr, for example, has had this kind of mechanism for a long time, and it still mostly annoys the hell out of people. Some users flag their pictures as 'moderate' or 'restricted', some don't, and then the usual armada of 'moral' people with nothing better to do all day than hunt for 'disturbing' pictures to report get regular users' pictures, or even whole streams, flagged and thus rendered unviewable in many countries (including, for some reason, Germany — perhaps to get rid of people in certain banned uniforms, I don't know). The whole thing causes constant annoyance and drama, and extra work for staff just to handle the disputes. And all that on a platform that is very mature and extremely focused on just one thing.
Google+, on the other hand, is an all-new community that is about, well, everything. There is no chance in hell users will flag their posts as 'moderate' or 'restricted' according to some common standard. There probably isn't even a good common standard: some countries, like the U.S., are terrified by the slightest hint of skin, while others, for example the Scandinavian countries, have no problem with nudity but tend to disapprove of violence. And by what standard, and for whom, would you flag your pictures as 'inappropriate'? What does the average Joe or Jane know about what's OK or not in countries like Yemen, Israel, South Korea or, say, Ireland? And why should he or she even have to care?
No, I think the best and simplest solution would be for Google to implement the standard SafeSearch filter they already have for Google Search, set it to 'moderate' by default, and then let people post and read whatever they prefer. Least work for everyone, no need to define a common standard, and no need for a special department to handle people's complaints and rants.