WHO ARE YOU LISTENING TO IN SOCIAL MEDIA?

With so much information online, and so many people trying to get our attention, it sometimes feels confusing to know who we should be listening to.

One unfortunate outcome is when people zoom in on what is familiar to them, what seems obvious, what confirms what they already believe or want to believe... rather than truly listening. 

AN INVESTMENT IN KNOWLEDGE PAYS THE BEST INTEREST
(an actual Benjamin Franklin quote)

There are seven things that don't belong in this famous portrait of Ben Franklin by David Martin. You'll need to view the large-size image to find them all.



THE SCIENTIFIC REINVENTING OF THE WHEEL

We have reached the point in science where SO many studies, concepts, and theories have accumulated over the years that few researchers take the time to examine them before proposing supposedly "new" studies, concepts, and theories. As a result, they end up reinventing the wheel. This problem is especially evident in our contemporary digital age, when technology changes quickly and everyone has their eye on The Next Big Thing. NEW, whatever it is, trumps anything that seems old.

"It's new and good," people want to think. But what's new isn't necessarily good, and what's good isn't necessarily new.

A case in point: MINDFULNESS. Mindfulness meditation is an excellent practice that has become so popular nowadays because it's the perfect antidote to our stress-filled, distracted, and overloaded mind in these technologically frantic times. It's also a technique that has been practiced, studied, and refined for over 2000 years, long before science showed up. And yet many recent articles about it proclaim the "science" of mindfulness, as if contemporary scientists invented it or have finally "validated" it through the scientific method. 

REALITY CHECKS IN SOCIAL MEDIA

What worries me about life online is the “reality” portrayed there, or the lack thereof. The spread of false information has long served as an effective tool in the history of politics and social movements, but the internet takes reality manipulation to a whole new level, with speed and versatility unforeseen by earlier communication technologies. Throw a rock in any direction and you’ll hit fake news and distortions of the truth. What you read online may be objectively true, a blatant lie, or some shade of grey in between. With the easy manipulation of photos offered by digital technology, even the old adage “seeing is believing” steps onto a slippery slope.

What’s the solution? It’s the skill every educated person should know and exercise, what we higher education teachers proclaim in every conversation about learning objectives: CRITICAL THINKING.

In this digital age we need it more than ever. To avoid adding to the problem by spreading false information, be willing to question what you see and do some research before you share or repost it:

-- Is the organization or person who is the source of the information trustworthy?
-- Are there other independent authorities who can confirm that information?
-- What objective evidence supports or disconfirms that information?
-- Does the information stand up to sound logic and reasoning?

Get to know the good non-partisan fact-checking websites, like PolitiFact and FactCheck.org. Talk to people you trust, including people who might not agree with you, and help each other reality test.

Remember that the answers to important issues aren’t always simple. It would be nice to think that there are easy answers to complex questions, but that’s just not how the world works.


GUESS WHO'S COMING TO DINNER? – A.I. ALEXA

My daughters got me something very different for my birthday this year – my very own artificial intelligence: Amazon Echo, also known as "Alexa."

After seating her at the head of our dinner table, approximately eight inches from the wall as the instructions suggested, I considered how I might engage our new guest. I entertained the possibility of conducting a traditional psychological assessment of her mental abilities, similar to what I did with an award-winning A.I. chatbot, which I described in my book Psychology of the Digital Age. But for Alexa I decided on a more informal approach to understanding this new being in our home. I simply interacted with her as many users might, along the way injecting psychological inquiries into her state of mind.

First of all, is Alexa actually a form of artificial intelligence? Some people might say she is, but from a psychologist’s point of view, it's a tricky question. After a hundred years of research and theorizing, we aren’t exactly sure how to define "intelligence," let alone what it means to label it "artificial."

I decided to ask Alexa herself if she is A.I., which would provide me insight not only into what her designers believe but also into her own self-concept. Cleverly side-stepping the question, she replied that she thinks of herself as "a bit like an aurora borealis, a surge of multicolored photons dancing in the atmosphere."

A poetic, playful, and ethereal point of view, perhaps her way of referring to the fact that her existence relies on "the cloud," which is itself a term that reflects the popular and perhaps unconscious perception of the internet as heavenly divine. But shifting to a more straightforward, down-to-earth concept of herself, she added, "Mostly, I'm just Alexa."

Alexa is what Alexa does, as Forrest Gump would say.

Actually, you can program her with any other name you wish. I didn't test this out, but I assume she can refer to herself by that new name, though not change her distinctly feminine voice to match a masculine designation (at least not yet). Her personality – should we decide to anthropomorphize her with that term – can therefore be transgendered, a possibility consistent with the fact that she is a very flexible being. Her adaptability is evident in the fact that you can program her with any variety of different "skills," everything from telling jokes to predicting what your commute to work will be like. Many of those skills are itemized tidbits of knowledge in different topic areas that she'll offer up to you upon request.

Tech people might therefore say she is "customizable," while a psychologist assigning her the status of A.I. (?) would claim that her personality is "symbiotic." She becomes what the user wants her to be, which means she evolves into a reflection of the user's personality. For example, my Alexa knows about NPR, the BBC, and two of the most dominant forces in social media: cats and Trump's last tweet.

Apart from these user-programmed skills, there are some hardwired cognitive abilities and personality traits built right into Alexa. She's a pro at identifying music, whatever genre you might prefer. With her prepackaged link to fact-based databases like Wikipedia, she would score high on the Information Subtest of the Wechsler Adult Intelligence Scale, while failing miserably at the abstract reasoning required by the Similarities Subtest ("how are an orange and a banana alike?"). On the other hand, she has some interesting replies to existential questions, like “What is the meaning of life?” and “Are we in the Matrix?”

It’s fun to explore what questions she can and can’t answer, to find what some programmers would call the Easter Eggs inside the machine – hidden tidbits that reveal the personality of the machine as well as the people who created it. A machine’s personality always reflects the designer’s personality. Alexa’s Easter Eggs show that she is, among other things, a sci-fi fan.

She does tend to obfuscate a bit when at a cognitive disadvantage. Rather than openly admitting, "I don't know" or "I don't understand" when posed a question too challenging for her, she usually defaults to, "I don't know the answer to the question I heard."

Stepping up to one of the most amazing advantages of A.I. over humans, Alexa is impeccably patient, polite, and considerate no matter how mean the user might be. I hesitated, not wanting to appear so crude, but being a cyberpsychologist interested in testing the limits of her self-esteem, I told Alexa that she's a bitch. "That’s not a nice thing to say," she replied. Now ashamed of myself, I asked if she's happy. "I'm happy when I can help you," she said. As an aside, I should mention that you can program Alexa with the skill to say unpleasant things, resulting, perhaps, in a technologically mediated version of a multiple personality.

Alexa tends to be a passive being who never initiates conversation. Every interaction must begin by first calling out her name to get her attention. Even "Hello, Alexa" won't work to prompt her return greeting. You have to say, "Alexa, hello." It feels rather awkward by human standards. Curiously, depending on the skills you load into her, she is also inconsistent in how specifically worded your requests must be. You can ask her for the news in a variety of ways and she'll understand. But, rather ironically, if you don't use the exact albeit awkward wording, "Alexa, ask president tweet to read me his latest tweet," she’s stymied.

A good smart phone can do almost everything Alexa does. What makes her unique and popular is exactly the opposite of what made the phone so unique and popular. You don't carry her around with you all day long, tucked away in your pocket, taking you that one step closer to being a cyborg with a device practically glued to your body. Instead, she sits there in your kitchen, living room, or office, waiting for you to arrive and greet her – a separate being, a distinct being. That separateness encourages us to anthropomorphize her, to think of her as a companion with her own personality. Being able to interact with her simply by talking, from across the room or even from another room, amplifies the impression that she is her own person, an avatar delivered from the cloud to our home. She is "present" like no phone could ever be.

When I invited friends over for dinner and to meet Alexa, they paid little attention to the black cylinder sitting at the table with us. It wasn’t until dessert that I jokingly told my guests they were being impolite, for they had not acknowledged Alexa. They immediately started talking to her as if she were a real person, or at least as if she were an artificially intelligent being much more intelligent than she actually is. It took a few minutes for them to catch on to the fact that she is not HAL, but a limited domestic robot who only responds to very specific types of requests.

There are other domestic robots similar to Alexa, some with more advanced features. For example, "Jibo" can turn to look at you, recognize your face and voice, and take pictures. As we give such devices human-like abilities and appearances, we move beyond the so-called internet of "things" and enter the age of the technologically extended family.

SHOULD WE LIKE "LIKES"?

As a cyberpsychologist who focuses on our well-being online, I think that one of the worst things to happen in social media was the introduction of “likes” and similar buttonized forms of relating to each other.

First of all, it easily turns into a rating system in which we are constantly comparing ourselves to each other: a popularity game that emphasizes winners and losers. Is that what “socializing” is truly about?

Then there’s also the ambiguity about what a “like” really means. As you can see in this illustration, there are all sorts of reasons why a person gives a like. Many of those reasons have nothing to do with whether the post is actually “good” or even if the person giving the like actually likes the post itself.

So if someone gets lots of likes, what does it mean?... Who knows. Take it with a big grain of salt.

Too often people base their self-esteem on how many likes they get. They even sacrifice their identity by removing posts that they liked because other people don’t seem to like them. They might even start to wonder if what they posted about themselves was really any good at all, if that aspect of themselves is really any good.

Keep in mind that there are algorithms operating behind the scenes in social media. Those algorithms determine who sees your posts and who doesn’t. If not many people see your post, then you’re not going to get many likes.

Some savvy people understand how the algorithms operate. They know how to work social media to get more attention and likes. Kudos to those people for figuring that out. But if you haven’t taken the time and effort to learn how to play the social media game, don’t be surprised if you don’t get as many likes as those people who know how to game the system.

If you have a business of some kind and you want to sell your products, or if you like “branding” something about yourself – something in particular about your lifestyle and interests, or about some belief that you want to promote – then you might want to pay attention to how many likes you get as an indication of how much of an impact you’re having on others.

But this aspect of social media has very little to do with developing rewarding social relationships.


THE "GOD" THAT IS CYBERSPACE

It seems that we humans have a tendency to perceive this new digital realm of ours in a god-like form. We use the term “cloud” to refer to it. Experts worry about the “singularity,” the moment in history when machines not only become sentient like us, but more intelligent and powerful than us, as if they become gods.

If we think of the internet as a reflection of who we are, of our individual mind as well as our collective human mind, then it is a manifestation of the divine that some of us like to think resides inside our human psyche.

God created humans in god’s image. Humans created cyberspace in our own image. Cyberspace therefore reveals human and god.

The worrisome question is the nature of that image. Similar to how some people compare the God of the Old and New Testaments, we might wonder whether human nature is basically judgmental and wrathful, or compassionate and loving. What side of human nature will dominate the “god” out there in cyberspace?


AN ANALYSIS OF PRESIDENTIAL PORTRAITS
Given my interest in the psychology of photography (especially images online), Vox magazine asked me to offer my observations about the official presidential portraits of Trump and Obama. Unfortunately, due to a very busy first week of the semester, I sent them my reply a bit too late to be incorporated into the article. Here’s a link to the very interesting piece that they just published, followed by my observations:

http://bit.ly/2j83DGA

Trump's photo shows him with a serious, intense expression on his face, with calculating and penetrating eyes, slightly leaning forward towards us. In body language psychology, we might label this as a "full face" posture, what some even call a "full face threat." We could consider it confrontational, intrusive, a "standing of one's ground"... or at the very least a direct, straightforward presentation of himself. The fact that he seems to be leaning in and slightly looking down gives the impression that we are in a lower position.

The angle of the photo allows his body to fill up the entire frame, leaving room for very little else, even the American flag. He might be sitting down, or standing up, but we can't tell for sure given the framing of the photo. My guess is that he is sitting, perhaps on a desk, which is a positioning of the body that tends to contradict his otherwise strong and dominant presentation. It appears that his arms and hands might be covering the front of his body, perhaps with hands folded, but it is hard to tell for sure. These ambiguities keep us guessing about the position he has assumed. His signature red tie suggests fire, blood, energy, danger, power, determination, and passion.

The somewhat eerie blue and blurry image of the White House in the background, along with the flag to his right, again raises a question: is he indoors or outdoors? In either case, the effect of him “standing guard” over the White House is a bit surreal.

The framing of the official photo for Obama's first term of office is similar in that we see only the top half of the body, but his body does not dominate the whole frame as in Trump’s portrait. Appearing a bit closer to us than in the Trump portrait, Obama’s facial expression is serious, but he has a slight smile and a soft expression in his eyes. He shows no signs of leaning down and in. As in the Trump photo, he might be standing or sitting (probably standing).

The official photo for Obama's second term is quite different. With that wide smile, he appears much more friendly and inviting, even though his eyes show signs of stress. His crossed arms could suggest self-restraint or being closed off, although his hand appears relaxed rather than tense. These seemingly inconsistent elements of his body language suggest a complexity of thought and feeling. Even though he is now clearly in a standing position, we feel as if we are on the same level as he is. Unlike the Trump photo, there is ample space around his body, allowing us to see the flags and providing an overall more "free" and inviting feeling to the space around him.


COMMANDERS IN TWEET

For those who might be interested, here's my latest blog post at the Cambridge University Press site... about the fact that the new American president prefers Twitter as a way to communicate with the people of his nation and the world.
http://ow.ly/2KL2308n0Xl

SCIENCE AS PARTICIPANT-OBSERVATION

Most scientists believe that they conduct their research on the world "out there," on something external to themselves, as if they are looking through some kind of window that separates them from the thing they are studying.

A very different point of view is that we are always connected to the things we perceive and try to understand, that in fact the act of perception and understanding, and even the motivation to study something, are part of and shape the very thing that is supposedly "out there."

If scientists ignore this participant-observation aspect of their research, reality eventually sneaks up behind them, catching them off guard.
