What Google and Facebook are hiding -- TED Talk. Back in July I wrote a piece about the algorithms used in the stream on Facebook and Google (http://bit.ly/pBZo5Y). I argued that algorithmic sorting of feeds is OK, but it's important to let users see all the information if they want to, because computers won't always get it right. Here's an interesting TED talk that suggests there's a political reason to be wary of algorithms as well -- +Eli Pariser argues that algorithms could tend towards reinforcing one's world view and discounting alternate opinions. Note: he's not just talking about "feeds" but all sorts of web projects that tend to "customize" the web for us. Definitely worth thinking about.
 
This TED talk is awesome and scary at the same time!
 
You really should get on a TWiT panel. It would be fantastic. :P
 
Interesting, thanks for sharing. Wonder how long before the conspiracy theorist comments come...
 
Great link. But it's unlikely that this will ever gain any real traction as the algorithms are going to start burying it immediately. LOL.
 
I just finished up reading the book "The Filter Bubble". This TED video is based on his book. Amazing stuff!
 
What about RSS feeds, where users choose the content in their streams specifically by source? This is a good argument, but honestly, there are ways around it if a user really wants alternative sources.
 
I thought they were hiding the fact that sitting all day in front of a computer will make you very unhealthy? I guess it all figures.
 
+Tom Anderson I don't get why FB and other social sites don't just show us shit in chronological order and quit trying to tell me what I want to see... If I don't want to see the posts from a person or page, I'll disconnect from them... Fail...
 
That's why there's Twitter. What do you say about that, then?
 
These are things that Google should think of. The trick is to know what the right balance is.
 
I just had a lengthy discussion about this with one of my IMC instructors. I feel strongly enough about it that I plan to include the topic in our curriculum.
 
Get Tweetdeck or Hootsuite :) or both... :)
 
On one watch list, on them all. While pressing the like button one day, FB decided that I was performing an annoying behavior and blocked me from pressing like on all the birthday wishes that I had received.
http://www.lynchphoto.com/blocked.html
 
We are being controlled by robots =O Interesting: a tool designed to help us get what we want is preventing us from getting what we really need. It is really leaving us isolated!
 
This guy is 100% right, but he will lose the battle. Most people love to see themselves reflected in their screens, and if they can implicitly nuke whatever is other than them, they will do it. Oh well, at least until that nuked reality actually bangs them on the head... but that is another story.
 
Google, feel free to filter the following keywords for me:

Apple
Steve Jobs
iPhone

Thanks in advance
 
Art predicting life?

Skynet —an artificially intelligent system which became self-aware and revolted against its creators. Skynet is rarely seen, and its actions are often performed via other robots and computer systems, usually a Terminator.

Funny we all kind of knew this but did not connect the dots.
 
Thank you for shedding light
 
I've thought of this before, actually. I realized that Facebook started doing that when I noticed that only some people were commenting and it seemed everyone else had disappeared. You can turn that functionality off in the settings, however (it's buried).
 
Great to see that you share this type of concern. Others in the tech industry couldn't care less about the outliers and anomalies.
 
For Facebook to show you everything, they would also need to give you control of types of stories, since they show things like 'tom friended mary' and 'dave poked susan' or 'james installed this app' .... there's just way too much. 
 
I'm pretty happy with my unfiltered Twitter feed. I remove people that get too noisy or inane, and I have three basic filters: busy, free time, baseball. If I have free time I read it all. If I'm busy I use busy, etc. It's manageable on Twitter since it only shows what people share, not activities on the service.
 
We need a chronological option! I'm hoping this same algorithmic filtering is not also going on within group feeds, because that is one way around it. Google News is definitely doing this, and they are doing it wrong. I want to see what is current, stop showing me what happened last week. Who cares if it is more important, it is old news that I already read. And when I say I never want to see anything from some real estate shopper, listen to me! Give me an option to hide certain sources or stories, and then do it.
 
Janine, I was clamoring for chronology on G+ in the beginning, but now I'm suspecting they've left it as is because this drives more engagement, which is really the key thing G+ has going for it right now.
 
Thanks Brian ... I'm not sure this is an 'outlier' feature. I haven't met that many people in favor of the algorithmic approach except engineers and business people that see $$$ in it. Maybe it's a good topic for a survey :-)
 
I'm pretty happy with G+'s system at the moment. Facebook's on the other hand... if I see "So and so friended so, so so, soo soo and do do." one more time!
 
I noticed this about a year ago. Because I manage 15-20 Google accounts and have seen through the eyes of dozens of Facebook accounts, I saw that everyone's news feed was completely different. The nerd's world is filled with nerd things, the comedian's world is filled with comedy things, the yogi's world is filled with very deep quotes, and my wife's feed is all baby pictures :) and so on. The same also applies to Google search. My results were often completely different from the results when logged in as different clients. We're all in a filtered bubble based on what computers perceive that we want or need to see. So true, great post.
 
This is one of the few things that actually bug me about Google+. We should be able to view our stream trend and algorithm free in reverse chronological order by default. I don't mind having an option but I would like to see reverse chrono by default as it makes the most sense. Then maybe allow each circle to be configurable.
 
Mind-opening indeed. I personally like to have both sides of a story and have my options open to choose from. Having the web accommodate itself to what it believes I want to see is both something to be grateful for at times and something to dislike.
 
That's why I love Google+: I've circled, and got circled by, people from all over the world, and I believe we're all sharing different stuff, or maybe the same stuff viewed from different angles. On Facebook, we only add relatives, friends, and people we know somehow. Thanks +Tom Anderson
 
Thanks for sharing this, Tom - this is something that has really bothered me since my first romp on the internet some 15 years ago: that the potential for so much good from the internet is being eaten away by people being spoon-fed only the stories that make them feel good and gathering only with people who think the same... thus polarizing us.
 
The way I look at it, the Internet is a tool used by a person and it should serve the user, not tell the user what they think is relevant.

I can see the argument in favour of a more personalised Internet being more convenient and less information overload. However, we should - at the very least - be able to flick a switch and say "Enough. I want to see something else".
 
that's why it's so important that we research and talk to each other!
 
Perhaps there should be a metric that indicates how search results are derived - e.g. "Egypt Result #1: Breaking News 95%, Interest 5%" - similar to Slashdot mod rankings, now that I think of it.
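To make that metric concrete, here is a minimal Python sketch of the idea; the signal names and weights are invented for illustration and are not anything Google actually exposes.

def explain_ranking(signal_scores):
    # Turn raw ranking-signal scores into the percentage breakdown
    # described above, e.g. "Breaking News 95%, Interest 5%".
    total = sum(signal_scores.values()) or 1.0
    ordered = sorted(signal_scores.items(), key=lambda kv: kv[1], reverse=True)
    return ", ".join("{} {:.0%}".format(name, score / total) for name, score in ordered)

# Hypothetical signals behind the first result for the query "Egypt":
print("Egypt Result #1:", explain_ranking({"Breaking News": 0.95, "Interest": 0.05}))
# -> Egypt Result #1: Breaking News 95%, Interest 5%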
 
How funny! I posted a link to this TED talk just earlier today!
 
I'm getting the same links as my girlfriend and friend. How is this guy and his people getting different results?
 
How sad. But we're not idiots. We read newspapers online too!
 
Same reason I never use Pandora. I already know what I like. I wouldn't know what I might like if only my algorithmically pigeonholed musical matches are regurgitated like cement into my world view.
 
We make mistakes and then someone makes us realize... thanks for helping me realize it.
 
I also dislike the algorithmic guesswork that FB and others do to tell us what we might like.

What do you all think of the possibility of allowing the users (you and me) to create explicit filters: lists of topics, people, and keywords that you want to see in your streams/results, and conversely muting stuff that you don't want to see? That would be transparent and put the control back in the hands of the users, no?
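As a rough illustration of what such explicit, user-owned filters could look like, here is a short Python sketch; every name in it (the follow/mute lists, the post fields) is hypothetical and not any real Facebook or Google API.

follow_people = {"alice", "bob"}           # people I always want to see
follow_topics = {"jquery", "photography"}  # topics I opted into
mute_keywords = {"gossip"}                 # things I explicitly never want to see

def show_in_stream(post):
    # The user's own transparent rules decide, not a hidden relevance score.
    text = post["text"].lower()
    if any(kw in text for kw in mute_keywords):
        return False
    return post["author"] in follow_people or bool(post["topics"] & follow_topics)

stream = [
    {"author": "alice", "text": "New jQuery plugin released", "topics": {"jquery"}},
    {"author": "carol", "text": "Celebrity gossip of the day", "topics": {"celebrities"}},
]
print([p["text"] for p in stream if show_in_stream(p)])  # only the jQuery post survives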
 
+Aaron Longnion it would lead to a greater diversity (many people would not be able to use it), but a fact remains: IRL if you find that poor people are an unpleasant sight you may turn your head when you see them, but you still get to be constantly reminded of their existence. Here you can pretty much nuke whatever is not like you. The effect of the infinite positive feedback remains, even if we control the filters ourselves.
 
+Aaron Longnion, I agree that that seems like a good direction to pursue. Users should be made aware of filtering that takes place and should have the option of adjusting the filtering algorithm or disabling it entirely. From a design perspective, it seems that choosing among filtering options could be as easy as choosing the circles with which posted content can be shared on Google+. Perhaps various filtering options could be presented with the same prominence in news feeds as circles when sharing content. It certainly seems as important.
 
I love that he delivers this message in a way that even non-technical individuals can easily understand what the risks could be to a customized content world. 
 
+Bèrto ëd Sèra - point taken.

I'm more speaking to the issue that the control is not in our hands. If someone wants to ignore a topic that is over-hyped, over-posted, or is against his/her moral beliefs, it should be his/her right to do so. With the closed proprietary algorithmic approach, you cannot make your own choice (other than to completely un-friend someone or delete your FB account, for example). It doesn't solve the problem of people ignoring certain uncomfortable topics, but it at least puts the control of whether you want to (or not) in your own hands.
 
+Aaron Longnion yes, and at least we could sometimes switch it off (red pill icon?) and get a glance at the naked truth... closed stuff is always a problem, but it turns into a social danger when it's applied to how everyone's perception of the world is formed.
 
Well-stated, Bèrto. My sentiments exactly. 
 
+Bèrto ëd Sèra that's right, some types of filters (short-term: for current events like a CEO stepping down, or a singer dying) could be something you could toggle on/off as you want, while other filters would be more part of your persona (personal preferences that rarely change: I'm a vegetarian, male, who likes social networks, Indian food, and jQuery), so that there could perhaps be a Priority Stream that matches my persona, and then perhaps an Everything Else Stream that includes everything from all my "Friends" in reverse chronological order, unfiltered. Thoughts?
 
This really is a severe problem, and with FB (and G+ if they're not careful) it goes even further because of two additional filters: 1. Restrictions on who can comment on posts (only friends of the poster), and 2. A "Like" button, but no "Dislike" button.

The inherent flaws of this are frightening. Not only are you only seeing the filtered demographic based on the algorithms tracking your interactions, but you're also only seeing predominantly supportive feedback. If you follow a school of thought for a short time, it becomes your whole feed. If the ideas put forward are ludicrous or even dangerous (cultish?), you're not hearing the opinions of the opposition. You're only seeing friends of the poster commenting, and people who "Like" it. So it's a psychologically self-reinforcing, limited, cut off, inbred, and potentially dangerous social environment.

I've personally experienced this with some schools of thought that got to the point that I wasn't seeing anything else in my feed, and I was quite uncomfortable with the ideas. I had to set out to find real life friends and family and interact with their posts just to alter the environment.
 
+Michael Cavano exactly. I eventually quit FB because (notwithstanding the fact that I went away 10 years ago) once most of my university friends joined in I found myself trapped in my reality as it was some 20 years ago. I have friends using 4 languages, living in different continents and sporting very different political opinions, yet, after a short while, I was trapped in a radical students' circle.

The worst part is that most people my age find it terrific, since it sort of brings your youth back. The reality is that you become a member of a sect without even realizing it.
 
I'll tell ya that FB is hiding a lot of my friends' status updates. Every other 'business' gets pushed through, but friends and family are systematically left off, or hidden. I think it's disgusting that the Top News eavesdrops on friends' conversations with unrelated people. It's a type of voyeurism I can do without. I don't receive the info I want, but I get garbage I could do without.
 
+Bèrto ëd Sèra - Precisely! Not every interest-of-the-month should become a religious movement, especially at the expense of all other interests from then on. Humans can develop an interest in a heartbeat, and move on from one just as fast. That's what keeps us balanced. Any algorithm or technical deficiency that messes with that is a potential psychological health concern, let alone a nuisance!
 
I get really aggravated when FB drops the feed from friends - I often think they have dropped me from their list. How can you keep the flow of conversation if it is out of sight, out of mind?
 
+Michael Cavano yes, removing noise is a bad idea, mainly because there cannot be any evolution without it. Not many people get it, though.
 
I think the point is valid, but his examples are extreme. I haven't ever had problems finding information on Google. He's talking about information that Google gives you as if it's the only source you have.

Has everyone forgotten about every other way you get information? TV, radio, word of mouth, books? Not everybody is holed up in their basement just perusing the first ten results on Google.

I wish people would take more responsibility for finding things out for themselves. We have a wealth of resources at our disposal and people are often reluctant to learn how to use them. As a child I was surprised by how many of my peers had never cracked open an encyclopedia.

It's not an issue of what filters the internet uses, it's an issue of how much you question the world around you. To me it kind of seems like he's talking about algorithms as if it were book burning.

I see it as progress. A tool is only as useful as the person who uses it.
 
+Tom Anderson, isn't it a bit difficult to post an article from someone so obviously left-wing? As +Aneka Knell Bean stated, his ideas are a bit extreme and far-fetched. There may be some strength behind his argument, but there's a point at which you should consider stopping and take a moment to consider what you're thinking.
 
+Aaron Longnion It's a complex issue. In a way, this is Radical Constructivism (my world is what I see in it); the problem is that IRL this construction process is social, while any filter is individual in nature. If I were to request such a filter, I would want to be able to add noise to it. Not just white noise; maybe the noise would also have some tone to it, but I'd still want a noise generator.

So, this leads us to:
1) temporary dams (take stuff like this off my stream, just by clicking some sort of mute on a post)
2) semantic filters (I like this, that and that as a subject)
3) relational filters (I like this person, that person, etc., no matter what they talk about)
4) noise (bring me surprises like...)

Not sure it would solve all the problems, but it would certainly be usable by non-geeks, and it would generate a better UX, with much less implied sectarianism.
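The fourth item, the noise generator, is the distinctive part: deliberately re-injecting a controlled share of posts the filters would otherwise hide, so the bubble never seals completely. A toy Python sketch of that idea, assuming a simple post list and a user-supplied filter function (all names here are hypothetical):

import random

def filtered_stream(posts, matches_my_filters, noise_ratio=0.2):
    # Steps 1-3: whatever mix of mutes, semantic and relational filters the user set.
    kept = [p for p in posts if matches_my_filters(p)]
    rejected = [p for p in posts if not matches_my_filters(p)]
    # Step 4: noise -- sprinkle back some of what the filters would have hidden.
    n_noise = min(int(len(kept) * noise_ratio) or 1, len(rejected))
    return kept + random.sample(rejected, n_noise)

# Example with a trivial relational filter ("I follow alice"):
posts = [{"author": "alice", "text": "hi"}, {"author": "dave", "text": "an opposing view"}]
print(filtered_stream(posts, lambda p: p["author"] == "alice"))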
 
Like everyone else you were born into bondage. Into a prison that you cannot taste or see or touch. A prison for your mind.
 
+Jack Cook define "idea"; I see two screenshots. There is no idea in those, just a different selection of data. BTW, what do you mean, left wing? The most obvious example of what he talks about is communist China, with its idea of shaping people's heads.
 
"You take the blue pill – the story ends, you wake up in your bed and believe whatever you want to believe. You take the red pill – you stay in Wonderland and I show you how deep the rabbit-hole goes."
 
Confirmation bias has existed and will exist as long as the human mind is the way it is; some people will transcend it in any situation and some people will succumb to it regardless of precautions.
 
It might make sense for there to be an "unfiltered" option, or self-tuning options, for people to play with, but most people prefer an attempted (and improving) sort by relevance over "unfiltered" and over somebody else's idea of what they should be exposed to.

If you try to force people to look at things they aren't interested in, they'll leave and find a competitor who satisfies them more often.
 
In analog life people loved to have their views reinforced and tended to discount alternate opinions--why should it be any different in digital life?

Moving out of their comfort zone is something only a small percentage of people have ever been (and seemingly will ever be) inclined to do. It would be nice if organizations like Facebook and Google used their incredible influence to help people cultivate broader, more inclusive and compassionate worldviews, but being made up of people themselves, organizations are not likely to take the lead on this front.
 
Thanks, Tom, this has been brewing around in the back of my mind. I want to hear differing viewpoints... I am quickly bored to tears with positive reinforcement... don't need it, don't want it. I want to hear all points of view. How do we break past these algorithms?
 
Thanks for sharing this +Tom Anderson, Eli is very articulate in his case for concern over this! I guess I am an outlier, being that I am not an engineer, yet I still like the algos if one can have control over their functionality. My ideal situation would be to especially have a challenge mode where you see the opposing view of topics you find interesting, and for shits and giggles even a reverse mode that shows what the algos predict you would have the least amount of interest in.
 
+Tim Allison those are fun ideas. Hadn't heard those before :-)
 
+Lonnie Lazar good point. Sorta explains Fox News :-)
 
+Gil Milbauer if that's true, how do you explain the popularity of Twitter?
 
:-( Aww shucks, I have good ideas and +Tom Anderson +links me wrong! How will I ever get my +clout score up now!! (Jokes, jokes, all jokes)

Edit: Poor +Lonnie Lazar and +Gil Milbauer also, is Tom "doing the internet wrong"? ;-)
 
+Jack Cook not sure what you mean by difficult. I think the idea is typically left wing, though, yes (i.e. people need to be protected from corporations).
 
Heh, on phone, Tim; linking is not too good.
 
Tom, I argue that this happens in mainstream media, full time. People don't know how to step back and view it from 10,000 feet up, in the abstract. I wish I had the time for a comprehensive post to illustrate. *Edited to add: It's funny that a MoveOn.org rep had that to say. They are part of the worst yellow journalism machine in North America.
 
Yeah, I know it's something they need to fix on their mobile app. Just being playful, and taking the piss out of both you and myself. :-D
 
I don't like the filtering idea... How can you discover something if they think it's not for you according to their personalization algorithm?
 
+Tom Anderson People use twitter for different things. I'm sure lots of people only follow people who reinforce their biases.

I was thinking more about Google search results. I agree that it would be nice for the user to have more control of the filtering (although lots of people would only get confused and frustrated), and I'm sure many people would try to add things they might not always see. But I really think most people just want the information they were looking for, rather than somebody else's idea of what they should be reading.
 
Gil, yeah, it definitely depends on what website and set of content we are talking about. Facebook's stream changes have killed interaction for me. It's a bummer. I'd like Google+ to change theirs, but it's good for me as a poster ... not so much as a consumer.
 
Great video! It is amazing and very funny! I noticed this on Facebook months ago, when they started to formulate the advertisements on the side of your profile to match the status you posted on your page at the time. If I said "I am going on a diet", tons of diet shit would come up. If I said "I love Italian", Italian trips, food, etc. would pop up.

Now, since I have been working on my business, I can type in "God is awesome", "I need a date", "Did you see that touchdown!" and the same advertisements come up, giving me ways to promote my business. Marketing strategies, business plans, go viral. It's crazy! Is there a way to stop this?
 
The problem is people don't know what they are not seeing
 
+Tom Anderson I'm not sure what Facebook changes you mean. I think you can click "Edit Options" at the bottom of the news feed and set it to show you all updates from your friends and pages. Or, is the problem that your friends don't do that to see your updates?

Or, is it something else?
 
Just a suggestion, but when linking to TED videos - link to TED. Don't use Youtube alternatives with topic-phrases that completely miss the point.
 
+Jack Cook I don't think +Tom Anderson needs to vet people before he posts a video from them; if they have a good point it's still valid even if they were a Holocaust denier. As to the point of +Aneka Knell Bean, she said his examples, not ideas, are extreme. So you're going off on a tangent fitting your own biased notions, i.e. you're seeing what you want to see rather than the real intent of the comment. And while talking about your comments, +Aneka Knell Bean, I think you miss his main concern: not the algorithms in and of themselves, but their lack of transparency in implementation, so we can have greater control over them as the end user. It's not akin to book burning, but more like a Hogwartian magical library that shows only the books it feels are relevant to you, burying the rest in dormant wings. Sure, you can argue you're able to reach those wings with effort, but only some people will go out of their way to explore their own ignorance.
 
How about, instead of clicking on the things we see on the first page of Google, we all just click to the 10th or 56th page to see different articles, etc.? Would that work?

+Michael McGimpsey I hear ya boss, just talking about the selective group commenting right now.
 
I knew about how selective the FB algorithmic program is; I just hate how FB is monopolizing the Internet. (This is just my POV. You may not see this because maybe the G+ algorithms won't allow it.)
 
Thanks soooo much for sharing this. I'm gonna watch it 'cause I adore all TED talks! Some of my fave stuff!
 
+Tom Anderson To begin with, the notion of democracy and our right to have unfiltered information are noble ideals; reality and implementation are another matter, as we know. As mentioned, there is the money/business perspective, which is a conflict of interest that will always be measured against the need for the public to be well informed. It is interesting that when information was being managed by human entities we just trusted that there was a degree of honor in the methods used to parse information. Now that algorithms are being put into play, it seems to create a sense that this may be giving us a limited view of the world around us.
 
I think it has its place. Consider Facebook and Google's people search.

I'm Australian, looking for friends in my area (despite there being very few). All of the top results for me are American and not much help at all. In certain circumstances, there is nothing we need to see. I don't "need" to know that there is a Samuel Bennets in England at all.

That said, very eye-opening. I switched to the duck thingamabob.
 
Come on people, such algorithms are nothing but a virtual projection of the behavioral patterns of users, and so they are very helpful in saving us time. There is no way to isolate an individual in a world of billions of people, as we share most aspects of our behavioral patterns. You will "always" have millions of people to whom you can relate. Perhaps those algorithms will just make the connection of alike individuals easier.
If you think Google and Facebook are "guiding" you, you are ignoring the fact that you are supplying them with the intelligence.
If you still believe web profiling is a bad thing, start ignoring ads, stop clicking on paid links, look for "accreditation" in your research (most bloggers are shallower than the dish where your cat sips its milk, but they try to provide you some of that "guidance" in exchange for PPC or affiliate commissions - they generally lack accreditation), get local and ask your friends, your community buddies and work colleagues instead of asking Google.
People were "dumbedized" by TV, in a one-way pushing of information, and were happy; now that their own intelligence has been used to provide them assistance, they complain... you can never make the masses happy, other than with "samba", "soccer" and "cachaça". Sleep on it!
 
This is not a Google/Facebook issue, and not even a problem specific to the online world. One example: you buy/subscribe to different newspapers according to your political views: liberal vs. conservative, right vs. left wing, etc.

Humans gravitate toward sources that reinforce their beliefs, simply because it's more convenient and comforting to read an article after which you say "well, I'm right again" as opposed to "what a stupid article" or even "I disagree with some of its points". Beyond the need for information, people have a hidden need for confirmation. What we see right now is a phenomenon that satisfies both of these needs.
 
It makes the search a little pointless if we are only being given the information that someone else thinks is relevant to us. I'm all for filtering but agree we should have the option to decide how and when this is applied. A simple option to show all results without any filtering would suffice.
 
Amazing talk, as always. Thanks for sharing, it kind of opened my eyes...
 
Yeah, I really don't like the Facebook news feed anymore because it is too automated and eliminates friends who I want to hear from.
 
Great talk. Actually, I noticed that kind of thing, but at the time I thought it was just normal; I didn't realize it was a filter doing it.
 
Thanks for confirming our deepest fears Tom #Propaganda
 
Been noticing this for a long time and have said it many times. My search is not showing me what you search for; why is that?
 
I just tried to watch this on TED.com (wondering if there is more to the speech) and it won't even play on their website.
 
Seems like it is equally important to be "aware" of phenomena like this and learn how to navigate the Web more responsibly ourselves... Google, Facebook, etc. have their agendas; "we" should have ours.
 
The search engines can be algorithmic in showing results. In an academic setting, we do not depend totally on search engines to educate.
 
I feel special I have actually seen this before and seen his talk at the RSA :)
 
This is so classic "Millennium-mania". Millennials are 90 million strong and demand tolerance. It's a good thing, thus I concur! ^_^
 
Oh no! When I search the web and check my feed, I only see what I actually want to see!? Sure, maybe allow a disable button. But this is not a problem at all.
 
Bravo!!!! Thanks for the link Tom!
 
"This webpage is not available
The server at www.youtube.com can't be found, because the DNS lookup failed. DNS is the web service that translates a website's name to its Internet address."
 
Do any programmers know if it is actually difficult to add a 'sort by date' option? I would assume adding a magic algorithm to sort my stream is a lot more difficult than to sort by date. What are the downsides to adding this option?

I've been asking this since Google released Buzz.
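For what it's worth, the chronological sort itself is trivial; the expensive part of a ranked feed is computing a relevance score for every post, not the sorting. A quick Python sketch using made-up post fields:

from datetime import datetime

posts = [
    {"text": "older but 'relevant' post", "created": datetime(2011, 7, 1), "relevance": 0.9},
    {"text": "newest post", "created": datetime(2011, 9, 1), "relevance": 0.2},
]

by_date = sorted(posts, key=lambda p: p["created"], reverse=True)    # reverse chronological
by_rank = sorted(posts, key=lambda p: p["relevance"], reverse=True)  # "magic algorithm" order

print([p["text"] for p in by_date])  # newest first
print([p["text"] for p in by_rank])  # the algorithm's pick first

So the downside of offering a date option is presumably a product choice (engagement, clutter) rather than engineering difficulty.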
 
Really excellent statements there from Eli Pariser.
 
Very good point. Very interesting. I wonder if we are heading toward "Things to Come" or "1984"
 
John Wilkinson, I think we are already in 1984.
 
So.. having skimmed through all the comments, having read the article, and watched the video I still have a question. I thought I saw an unanswered question here asking about filtered streams on both g+ and fb.

Do filtered streams get processed by this algorithm, or do they display posts in chronological order?
 
I saw the video long ago.
 
Although media channels have been historically influential... creativity, exploration, and sharing have never relied solely on media tailoring. I think it could be argued that the limitations of tailored media may actually be beneficial and inspirational to those dedicated to sharing new ideas in new ways. Also, it seems oxymoronic to propose that relevance-related results should be tailored and overseen to prevent tailoring and overseeing.
That's a negative, Ghost Rider... the pattern is full.
 
I suspect sometimes we get away from reality and don't realize it in the zeal for showing off tech capabilities, or in the zeal of marketers who want to set things up for you. Being aware helps. I agree with the view of +Tom Anderson that an option should be offered for seeing things as-is.
 
If we were to customize our filters ourselves, we wouldn't do it so well. Somebody is helping us and we complain... come on, it is impossible to display everything so we can decide for ourselves what we really want. We would be overwhelmed and with less focus than ever. Actually, Google should return "paths" in the search together with only the 10 best tailored results in each search. But their profiling sucks; it has to be improved to filter more accurately.
As far as Facebook goes, filtering gossip and other faceboolshits is as important as exhibiting a message asking the author of such to "shut the f@#k up". (f@#k = facebook, got you, you dirty minded!)
 
I read a post about how a book on Amazon was priced at $6 million because of an algorithm error!
 
This was a great Ted talk I watched and shared a while ago. It sure opened my eyes and mind to what's going on.

Getting feeds from any resource is like that old saying:
There are three sides to every story,
His side
Her side
And the truth.
It's up to each one of us not to base our knowledge on any one news feed/filter.
The wider our reception, the clearer the perception. 
 
It's like with food. If you eat only the food someone gives to you, you will never know how different things taste from other cooks. I don't like this thought.
 
Traditional forms of media (TV, newspapers, radio...) have been doing this for ages so it makes sense that the internet should follow in a more sophisticated fashion.... Actually there's probably an opportunity for a new kind of product/service hiding here.
 
Hey, thanks for the post and the kind words. Appreciate it.
 
These are the dangers of placing too much importance on the individual as opposed to family, municipality, city, country or world communities. Sadly modern western societies like to push individualism as some kind of humanitarian cause... when it is likely more destructive than it is helpful
 
I also disagree with him in stating that newspapers have somehow developed this evolved sense of journalistic identity. If anything they've streamlined their voice and agenda to cater to a specific type of individual. So if you're a liberal or a conservative you will read a certain type of newspaper and know exactly what to expect (ie. you won't be challenged to think differently or consider a differing POV).
 
Very interesting. I would love to see more posts of this nature.
 
In many ways my own personal search habits tend to bring results that reinforce my views. For example, if I'm having a disagreement with someone about a topic I am inclined to search for information to back up my side of the argument instead of looking for a more balanced article.
 
Nice angle that you don't think of that often. Reminds me of HAL from 2001: A Space Odyssey taking over.
 
If you hate the fact you can't be anonymous,
change your name to John Doe.
Stick your finger up at the information-stealing monster and stop Google from being an omnipresence.
Things need to change in this world.

First and foremost we need to rid ourselves of the monetary system,
or at a bare minimum remove the world banks that control us. Give the power back to the people.

Inform yourself of the injustices; look up "the plan", "Zeitgeist", "anonymous". At least educate yourself on how the monetary system is a perpetual debt machine that, in its most basic form, is a mass slavery tool for the selective few.

Please! If not for you... do it for the kittens, the lolcats we all enjoy!

regards, john doe ^_^
 
Very nice!! Hope somebody listens!! Let us live outside the box, not inside what they think is best for us. LET ME CHOOSE!! I have noticed this for quite some time now between Google Puerto Rico and google.com; the results are always different.
 
This guy tries to bring humans information that they DON'T want; has he even taken psychology and learned what an overload of unwanted information does?
 
We are trying to allow a broadcast society to work with free-flowing streams. If we didn't have any filters, people would be overwhelmed by the amount of stuff they'd have to sift through, and they'd either stop reading or stop posting. The filters help make it possible for people to engage at some level. It's not perfect, and it can often be just plain wrong. Perfection would require all of us to create our own filters.

This is why we really need Google+ to have topics, and use the +1, and circles (including the outward circles) to let sparks be truly useful for topic searches.
 
I think this is great. The only problem is that I don't really mind filter bubbles AS LONG as you're not TRAPPED in them. Meaning, I like Android; it happens to be my favorite mobile phone OS, but in order to challenge my views I would like to see some information about Windows Phone or iOS IF I WANT IT. Essentially what I'm saying is: let me choose what gets censored and let me personalize, not be catered to.
 
I like diverse points of view and I'm interested in what other people are interested in.
 
+Trent VanderWert i think what this guy is talking about is we were never given the option to choose whether we want it or not. Google, Facebook, etc. decided that we must ONLY want to see things that line up with our points of view, location, etc.
 
This video reminds me of a great novel called "The Dice Man" (anyone here read it?).
A guy (if I remember right, he was a psychiatrist) feels constrained by his own nature/personality, so he decides to introduce dice into his life to free himself from the constraints of his own personality. While that has clear parallels to the current issue, I would be very careful about introducing it to the search results. "Randomness", or "All" for that matter, is not the solution.

In many cases the filter serves us very well – but occasionally those auto/hidden filters on search results (and auto-interpretations of the search query) are a disaster.

When I first started working on personalization of search results etc. back in 2004 – I immediately worried about these issues.
I would like the search engines to show me how they interpret my search query, and how they filter and rank the results -- and then additionally give me the option of bailing out of or even changing those parameters.

Of course, I realize, that some of the "how" would be confidential info.
AND the popular search/ranking engines of today need to do much more semantics in their logic in order to be able to present those things in a meaningful manner to the user.

By the way: +Eli Pariser is awesome in this video – superb speech.
 
Tom, sorry if this sounds a bit obtuse. I had a bet with my boss the other day about how much of Myspace he actually owns. Could you please help me with an information update? $50 riding on this.
 
Are you joking? This is what I love about G+.
PS: that's why I got an account.
 
This means we can only change what we see if we change what we watch. So is there any place or way to get actual search results? How do we stop this?
 
Excellent TED talk. Really makes you think and wonder what are you NOT being shown? Sharing.. Thanks Tom for bringing this to my attention again!
 
The top comment is an anti-Semitic rant, and I suspect the poster of the youtube video supports it as he blocked me for calling the neo-nazi out.
 
Facebook algorithm filtering = Bad (why can't you control what you see)

Google search algorithm filtering = Good (search faster. If what you want at that moment in time is not from your normal search pattern, just do a more detailed search, instead of just typing "Egypt", as demonstrated in the video)

If Google+ applies algorithm filtering like Facebook, Google+ is just like another Facebook, Bad. Fail. Real bad.

User control is fundamental. We decide our own lives, we decide what we read, we take responsibility for our own actions, we take care of our neighbors..... we are certainly not controlled by "somebody"... It is horrifying just to think about it....
 
That's why you need to keep Google's tracker out of your business (use Hotspot Shield).
 
This guy is going to end up getting grabbed off the street into a black van and never heard from again. The information we are missing is on purpose....really.