Identity & Reputation

Do you agree?

The other day, I read a perceptive article, "In Defense of Friction," arguing that "automated trust systems undermine trust by incentivizing cooperation because of the fear of punishment rather than actual trust." That's a profound point. If we rely on computational systems for a trust framework, we actually lose our instincts and capacity for personal trust; even more, we cease to care about it. And there's a big difference between trusting someone and relying on a system that says they're trustworthy.
+Jeff Jockisch That's a very good point, and an important one. People have to be careful not to lose their ability to generate and understand trust through interpersonal actions by over-relying on outside mechanisms.
I really hate the bulk auto-sharing features for music and games and the like. But I love automating posts... There are a lot of things going on here...

Seems like automated trust systems COULD make things worse, as could the Filter Bubble effect. But if built well, with transparency and consumer control, they could be dynamite!
I don't do FB, and this article doesn't have much traction... How is this relevant to Gplus?
I certainly agree that this is a much needed discussion. As Google (and no doubt soon others) position themselves into "identity services" it is important to decide how society should accept and use such tools. Or, perhaps, whether we should at all.
AJ Kohn
Well, first off, I'm not sure they are mutually exclusive.

For instance, if you were to install something that simply created a 'read' share based on the page you were on, that would create a footprint of your surfing behavior.

But then you'd overlay that with active signals such as commenting on that piece, or Liking, +1'ing, Saving, Emailing, etc. Those would provide a higher threshold.

The assumption here is that frictionless sharing will reduce the incentive to share. I'm just not sure that's true, particularly since you, yourself, aren't going to be actively reviewing your feed to see what you did or didn't share.

It's sort of like a documentary filmmaker. At first you are always noticing the camera, but over time you begin to forget that it's there and continue to behave as you would without it being there.

Obviously a lot of this is about how it is implemented and presented, but in general I don't agree. Frictionless sharing simply provides a backdrop and context for our active sharing and may provide some interesting insight into what areas a person is more connected to (e.g. - passing interest in X but active interest in Y).
+Gary Tivey Fair point about this applying less to G+. But the issue has wider significance, b/c Google will soon be using their own internally-generated trust metrics to influence your feed... how much control do you want over that?
+Bob O'Bob I think they are coming regardless of our desire. But we CAN influence how they work and how much control we are able to exercise over them.
+AJ Kohn Great insight. Love this thought: "frictionless sharing simply provides a backdrop and context for our active sharing and may provide some interesting insight into what areas a person is more connected to"
As for music, I don't ever want anything which pumps my choices out to other people. But I might be happy with something which lets the tiny few who might actually be interested look to see, such as by reading my profile with a "what's playing now" kind of entry. Automation is great ... except when it's awful.
Trust is still a decision that's made by the participant, not the social network. I'd argue that it's not the end of social... but that machine-based authority and popularity is where the problem lies. Lady Gaga may be at the top of the social networks for popularity and celeb authority... but that doesn't mean that she's trusted.

There are lots of popular sites and social profiles that don't actually equate to conversions. In other words, you can be popular, but not necessarily trusted. That judgment is still a personal one.
Interesting revelation +Jeff Jockisch

Did I miss something? What specific proof do you have that Google will be using trust metrics?

Influence, especially by Google, is highly overrated. They keep wanting me to add folks I've already added; do you think that's working out?
+Jeff Jockisch - to the question "how much control do you want over that?" my answer is I want a dial that lets me take total control or turn over as much as I choose to. I want the ability to critically examine all the choices which others are offering to make for me, and to turn those on and off when I decide how much I trust them. I want trusting people to be able to say "do it for me" and I want other people to have the choice to do it all for themselves.
That's a large factor in why I have practically abandoned my FB account. They are making filtering decisions without consulting me in any way, assuming my consent when I have never actually been given any choice.
+Gary Tivey Google has already indicated they are going to use your network and the shares and +1's of you and your Circles to influence rankings of results in Google Search. They are already starting some of this by displaying if a result has been shared or +1'ed by someone.
I agree with the researcher's warning about "overreliance on security and assurance structures as replacements for interpersonal trust". Of course, just because the structures exist doesn't mean that we as individuals have to over-rely on them; but in an online environment, where most people don't have much experience or intuition about making trust decisions or building trust, a lot of people do by default.

+Bob O'Bob have you also abandoned Google search? They too make filtering decisions without consulting you -- that's what PageRank and all the other "signals" that feed into their algorithm are.

+AJ Kohn in theory I agree that "frictionless sharing" can be the backdrop and context. in practice, with finite time and energy, how much will it actually work that way, and how much will it replace other kinds of sharing and attention? TBD.
Search rankings have little to do with the stream on Gplus. Search results are already impacted by Gplus Pages and members who are posting in publications. Don't see any problems with that, in fact many in my circles are writing the news.
I see this as simply a numbers game to placate the SEO / marketing crowd that is looking for parity with FB.
Easy to see an ongoing struggle between Quantity and Quality on Gplus. Inevitable.
AJ Kohn
You're absolutely right about finite time and energy +Jon Pincus. And it will depend on how frictionless sharing is implemented.

However, in some ways, if I have a finite amount of time and energy to share, a well implemented 'frictionless' sharing feature would potentially provide users with slightly more time to actively share what is truly important.

Perhaps people won't think that way but, if I knew that I could easily go back and find what pages I'd read then maybe I wouldn't bookmark or save as many marginal pieces and spend more time on (and share) the really good stuff.
Today's algorithmic 3rd party trust and reputation systems are inherently broken. They pursue mirage-like objectivity while being inherently subjective.

When people wake up and start taking control of their personal Identity and associated resources online, today's false starts -- in the Web 2.0 realm -- will become crystal clear.

First stop, Personal Data Spaces (or Data Lockers). There is a magic to being "You!"
The Web of Trust (WOT) is not something that should or can be machine automated. It can and should only be machine facilitated via #WebID and #ACLs made possible through #LinkedData. Individuals need to remain in control of their relationships and their trust networks by curating these resources. This helps others learn whom to trust.

See this W3C resource for more info on WOT
+Jon Pincus:
I am unhappy that the evolution of search has created a whole new and entirely parasitic industry, but that's pretty irrelevant to this thread. Those choices were made, over time, by the market. "Relying on a system which says they are trustworthy" is now the only practical choice.
AJ Kohn
I wouldn't paint the SEO community with such a broad and stereotypical brush. I'm an SEO and I'm proud of it.

There's little doubt that Google is going to use Google+ as a way to inform search results. It is an identity platform which combined with authorship and engagement analysis will provide something akin to AuthorRank (or TrustRank as it is referred to in Google Patent docs.)

Here's my original take on this topic:

And then a follow-up specifically about Google developing an influence metric:
What about experience? If you experience something that is intelligent, enlightening or beautiful, you plus it, share it, etc. Am I understanding this article to say that an algorithm will be telling me what I can see online? As in a kind of weird filter? Attempting to show me what I've shown I have an interest in?

Unintended consequences of this kind of number model are exactly what I'm getting lately in email, because I bought a piece of equipment online for my son, who is a sound engineer. I get ads for all sorts of weird musical things every day.

And Google is constantly sending me ads for online degrees (I am an online professor and HAVE my degrees, thank you.)

In other words, I "get" the SEO, and I hope it will get better for advertising; I do NOT get the filtering by what things I already like. How I learned to like most of the things I like was a chance encounter with something I didn't even know existed. (One stupid example, photography apps for my iPhone.)
+Jason Hurtado Daniels Good point. And because I've been doing this from the beginning, I worked hard at the circle part because I could see that this was going to be the KIND of social network I would enjoy (I love discussion, and have learned how to disagree without being disagreeable (for the most part), the art of a good conversation). I just read +AJ Kohn 's first article and I have always said, from the beginning that I agreed with +Vic Gundotra 's insistence on there being a real person behind the conversation. (What I do not know is how one gets "verified" by Google (the checkmark).) For all the possible damage that could be done politically, I think the ability to relate to people authentically trumps that. (And I specifically agree with the suggestion that if people do not want to be "real," then maybe this isn't the place for them to be.)

As to +AJ Kohn 's experience of getting a pretty nasty anonymous response to something he'd written in the newspaper, same thing happened to me BEFORE the Internet, I was in the Toronto newspaper for having won a business award along with another woman. In those days there were telephone books and on Christmas Eve, I got a call from someone saying he was coming to murder me. I called the police of course. They would do nothing. (This was before Star 69 too.) So, I went to a hotel that Christmas Eve. Just saying that there are a lot of bad actors out there and social life requires that we judge the character of people (the original article).

Finally, just a heads-up. On a different thread, someone used his real name, but he was a felon and convicted terrorist, convicted of stealing the driver's license software from his state and making IDs for bad guys. Identification is very important. I blocked him before I realized that I should have alerted Google. I asked the other person in the thread to contact Google because he could still see him. Didn't hear back.

Only bringing this to your attention to point out all the paradoxes involved in real names.
AJ Kohn
I couldn't agree more with +Meg Tufano when it comes to serendipity. You just don't know what is going to make a connection for you, what will pique your interest. One day you like alternative music and the next you're listening to Cuban fusion because you ate at a great Cuban restaurant last night.

The biggest fallacy I see is that past behavior predicts future behavior. For some portion of it, it does; for much of the rest, it doesn't.
I want Google and others to help me filter/control/prioritize my stream. Of course the devil is in the details of implementation.

I hear you, +Gary Tivey and +Meg Tufano and +Jeff Sayre, I too fear that these systems can easily dehumanize us, guide us like lemmings, build a brand and celeb-driven popularity index.. +Kingsley Idehen I agree that algorithmic 3rd party trust and reputation systems COULD be bad.

But with the right combination of transparency, user controls, distributed platforms, I think they might also be beautiful, even liberating. Giving us more time, more control, more value...

How do we get to that world? We have to go out and build it. We have to scream about the importance of transparency, user control, and more.
Friction aids self-calibration. At the end of the day, self-calibration of one's vulnerabilities (online or offline) is what privacy is all about. This is why the solution ultimately lies in fusing PKI and trust logics, as demonstrated by WebID. We need our own Personal Data Spaces (cloud or wherever), distinct from the applications and services to which access is granted. BTW -- this is how things used to work before WWW ubiquity within enterprises.

Luckily, the architecture of the WWW is so dexterous that it also enables federated identity that will ultimately be controlled by "You" rather than "Them".
Could someone please define "friction" for me? I'm missing something.
+Meg Tufano 'Friction' here means that you have to agree/act/click/intend to share something or connect to someone, rather than an algo or a bot deciding that for you.
Thanks. (I just learned the term "negative entropy" and I think my brain is starting to reject new words. ;'))