Rolf Degen
Rolf Degen's posts

Moral outrage: Second-hand anger

Moral violations can cause severe emotional upset in the absence of harm to victims

We humans experience not only a nagging desire to get back at those who inflict damage on us. We are also outraged by the behavior of others who have caused harm to third parties or sinned against the moral rules of the community. According to new research, even harmless injustices against other people can throw spectators into fits of moral outrage.

Moral outrage occurs whenever a breach of moral standards is perceived as if it were an affront to ourselves. Even victimless "crimes" such as prostitution or pornography can provoke this variant of anger. "Every citizen is in some ways a small policeman," as the Zurich economist Ernst Fehr puts it. During the energy crisis of the seventies, one motorist even shot another because he had cut in line in front of others in the long queue at a gas station. Sometimes moral outrage escalates into a kind of mass hysteria, the so-called "moral panic", when influential forces of society suddenly fall prey to the delusion that some devilish influence - be it rock music containing satanic messages when played backwards, "killer games", "fake news" (a new entry) or satanic child abuse - entails unimaginable threats to the community.

Those who incur the costs and trouble of punishing an offense against third parties thus seem to perform an act of pure altruism. But there is also a hint of self-interest in play: with this elaborate engagement, the "altruistic punisher" also signals a commitment to the moral norms of their community. In a way, that person undertakes a propaganda campaign advertising themselves as an attractive cooperation partner. (It is also a "commitment device": the actor signals to others that he or she is driven by uncontrollable visceral forces, and thus unable to act out of pure, rational self-interest - see my feature about moral emotions linked below.) Moral outrage is therefore particularly easily unleashed when an audience is present. And young men are particularly prone to feelings of outrage: because of their comparatively low status they depend on the support of influential coalition partners (not to forget the burning desire to impress young women).

Relatedly, it has been found that moral outrage distinguishes itself from other emotions in that it cannot be experienced purely individually, without some connection to the collective: in order to experience moral outrage, people need the certainty that others feel the same way about the cause. Perhaps because of that intellectual quantum leap, monkeys and apes, our nearest relatives in the animal kingdom, seem incapable of undergoing this socially agitated feeling state: injustices toward third parties leave them completely cold.

According to a new study titled "What elicits third-party anger?" (2016), moral outrage can be fueled by a perceived violation of moral values alone, independent of any harm inflicted on other persons. Psychologists Helen Landmann and Ursula Hess from the Humboldt University of Berlin had their subjects read newspaper articles about certain moral offenses and recorded their feeling states. One dealt with a bank assistant who strongly recommended to an elderly couple an equity fund that then fared badly. Another depicted pharmaceutical researchers who glossed over the side effects of a new drug. In some cases the infractions caused substantial suffering; in others, the victims got off easy.

All things considered, only newspaper articles that described a clear moral violation elicited a considerable degree of moral outrage. In contrast, the hindrance of the victims' goals and the pain inflicted on them played only a minor role. Although the severity of negative consequences was highly relevant for compassion, it had only a minor impact on the degree of experienced outrage. "In other words," the researchers conclude from their data, "for anger it might not matter how much a victim suffers but rather whether we identify with them." In sum, anger can be elicited by a perceived violation of moral values alone; a direct and severe negative consequence for oneself or for others is not necessary for this reaction.

Source: What elicits third-party anger? The effects of moral violation and others’ outcome on anger and compassion

More on this subject: Moral Emotions: The solution to the selfishness puzzle

More social psychology stuff:

The trench that separates us from the happiness of other people

Anticipatory pleasure: The joy of things to come

Pluralistic Ignorance: The delusion that others are different from ourselves

How people twist their perceptions under majority pressure:

Hypocrisy: The beam in thine own eye

More on schadenfreude:

The joy when others are even worse off than ourselves
Comparing yourself to people in bad luck lightens up your mood – and may backfire

Reactance: The lure of the forbidden

Doctor Mengele and All Creatures Great and Small

Forgotten roots of animal welfare in the Third Reich

At the height of the Third Reich, at a conference on animal welfare organized by the NSDAP, an overzealous teacher delivered a real highlight. She reported on a talking dog which, asked "Who is Adolf Hitler?", had replied "Adolf Hitler is my Fuehrer." An outraged party member, who flew into a rage over the "tasteless" anecdote, was quickly silenced by the undeterred lady: the brave dog had merely wanted to show how grateful it was that Adolf Hitler had enacted such strict laws to protect animals.

The fabulous four-legged friend is merely a surreal minor character in a buried chapter of German history, excavated by the American anthropologists Arnold Arluke and Boria Sax from old files and documents. Their conclusion: the same sadistic despots who inflicted unimaginable horrors upon so-called "sub-humans" during the Holocaust were pioneers of the welfare of our "fellow creatures". They adopted, among other things, draconian animal protection laws, committed themselves strongly to the preservation of endangered species, and created a mythology of nature with astonishing parallels to today's philosophy of animal rights.

Already towards the end of the nineteenth century, a movement had formed in Germany that was in full cry against vivisection and the kosher slaughter of animals. In the wake of this campaign, the Nazis issued, in the period from 1933 to 1943, a whole series of laws and directives to protect animals that put everything seen before in the shade. Immediately after they seized power, an extremely rigorous animal protection law was rushed through, strictly regulating the conditions of slaughter and imposing harsh penalties on vivisection and 'unnecessary' experiments on animals. Dogs, horses, monkeys and, not least, the domestic cat were expressly marked out as 'particularly worthy of protection'. This would mean the end of the "unbearable tortures and torments in animal experiments," Hermann Goering cheered on the radio. He would "send to a concentration camp all those who still think they can treat animals like inanimate property."

The Animal Protection Act of November 1933 regulated every detail of dealings with the fauna with obsessive devotion. According to its preamble, the goal was "the awakening and strengthening of compassion, one of the highest moral values of the German people" (doesn't that sound eerily familiar?). In order to spare the creatures "pain and unnecessary experiments", every animal experiment had to be explicitly approved by the Home Office. Detailed regulations turned even the "abuse" of shellfish in restaurants into a crime. In 1934, the new spirit culminated in an international conference whose motto (framed by swastikas) read: "whole epochs of love will be needed to compensate animals for their virtue and dedication."

Animal cruelty was punished with penalties whose harshness was without precedent in the modern era. Other facts also prove that the Nazis' love of animals was far more than a hypocritical publicity stunt. In 1935, the ideological focus shifted to the rescue of endangered species. With dramatic reports about animals close to extinction, "brown" journals now piled on the agony. The Nazis worshiped the forest as a "cathedral of God" and created protection zones for elk, bison and other endangered species that were to be reintroduced later. Hunters were placed under obligation by taxes and statutes to make their contribution "to the care and protection of wild animals". But surprisingly, most Nazis were outright hostile toward hunting. Himmler hated it "with hysterical loathing", and Hitler was outraged "that today anyone with a fat belly can zap the animals from a safe distance." (One wonders if that included his portly companion Goering.)

The Nazis' "nature trip" and their often vegetarian lifestyle (yes, Hitler was a vegetarian, as were, among others, Hess, Goebbels and Heinrich Himmler. Hitler acquired the habit during his imprisonment in Landsberg and maintained it until his death, feeding mainly on beans. That gave him chronic stomach cramps, which his personal physician Theo Morell farsightedly treated with a microbiome preparation. He was also a fervent anti-smoker and teetotaler, though he may have been given speed, opiates and cocaine) were entwined with an ideological superstructure that glorified animals for their vitality, their "blind survival instincts" and their "unadulterated drives". Once, fascism preached, humans too had been "organic" and "holistic", living in harmony with (Aryan) Mother Nature. But then the Jewish spirit destroyed this original unity through its "mechanistic", "corrosive" and "exploitative" view of the world. There lies considerable historical cynicism in the fact that chief demagogue Goebbels courted his dog with a quote stolen from Schopenhauer: "The more I get to know people, the more I love my Benno."

PS: Hitler had planned to impose vegetarianism upon Germans after victory in war

Eating meat does not make you mean

Seven unflattering truths about vegetarians

Source: Arnold Arluke and Boria Sax

Positive emotional expressions literally leave the brain cold

Facial movements triggered by feelings modify the temperature inside the head - and make us feel good or bad

At first glance, it seems rather obvious that our facial expressions serve to express feelings to the outside world - similar to a kettle, which "expresses" the arrival of the boiling point through a whistle. According to indications from experimental research, however, our grimaces also affect the temperature of the brain and thus actively amplify the intensity of the experienced emotion.

Charles Darwin, father of the theory of evolution, already asserted, in what came to be called the facial feedback theory of emotions, that facial displays of emotion retroactively amplify the experienced feeling: to the extent that you suppress the facial expression, you calm the inner impulse. For example, Darwin asked whether the frowning that occurs when an individual faces the sun might influence feelings of anger, given that anger also involves frowning. Recent research has tackled this hypothesis, recount psychologists T.F. Price (Aberdeen Proving Ground, Aberdeen, MD, United States) and E. Harmon-Jones (University of New South Wales, Sydney) in a review of facial feedback theory.

In this experiment, researchers approached individuals walking along beaches and asked them to fill out a questionnaire assessing current anger and aggressive emotions. Equal numbers of participants walking toward the sun with and without sunglasses, and walking away from the sun, were included. As the results show, walking away from the sun had no effect on emotional responses, with or without sunglasses. "Walking toward the sun without relative to with sunglasses, however, produced more aggressive emotions."

Obviously, researchers are curious about the physiological mechanisms behind this phenomenon. The renowned psychologist Robert B. Zajonc posited that the downward movement of the corrugator supercilii muscle, which often occurs during the formation of negative facial expressions, might restrict air intake into the nasal cavity. "This would in turn cause more mouth as compared to nose breathing and may raise the temperature of blood entering the brain." The mind, it is believed, experiences an increase in brain temperature as uncomfortable, while it responds to a slight cooling with positive feelings. The activation of the zygomaticus major muscle, which often occurs during the formation of positive facial expressions (such as smiling), might open the nasal cavity, reducing the temperature of blood entering the brain and making us feel good. Hence the famous "cheese" grimace for the photographer!

In order to test these hypotheses, Zajonc had participants recite German vowels that caused greater or lesser brow furrowing. Indeed, the results left no doubt that greater brow furrowing led to more negative evaluations of information and higher facial temperatures. The view that fluctuations in brain temperature matter emotionally is further supported by an experiment in which the researchers, under a pretext, blew slightly warmed or cooled air through a tube directly onto the nasal mucosa of their subjects. Even in this condition, the "cool breeze" was assessed as more pleasant than the warm breath. As everyone remembers from the experience of a nose clogged by a cold, any reduction in its venous cooling capacity is accompanied by rather unpleasant and awkward sensations. Experimental animals, too, show through their expressive behavior that they enjoy experimental cooling of the interior of the head, while warming makes them experience discomfort.

Even orgasm, the climax of pleasant emotions during the act of love, betrays the euphorigenic effect of brain cooling - so far demonstrated only in (male) rats: just at the moment the penis releases its semen, the temperature deep inside the rat's head drops. Lust literally leaves the brain cold! This was shown by psychologists Mark S. Blumberg and Howard Moltz from the University of Chicago. As ejaculation set in, the blood vessels in the animals' nasal mucosa suddenly dilated. When blood vessels at this strategic point dilate, the brain's ability to release excess heat increases. And this mechanism was not without consequences: at one stroke, the temperature of the rat cerebrum declined - particularly in the hypothalamus.

According to Blumberg and Moltz, there is at least an indication that orgasm gives humans a cool head too: there is a form of the common cold, "vasomotor rhinitis", which comes about through a dilation of blood vessels in the nasal mucosa. And precisely in the heat of orgasm, the symptoms of this disorder worsen.

Most probably, many everyday phenomena can be explained by this mechanism - perhaps even the enjoyment of music. In subjects who listened to their favorite music in one experiment, nasal breathing changed and forehead temperature promptly dropped. Some people bite their nails, chew on their fingers or obsessively hold objects in their mouth. All these actions reduce air intake through the mouth and redirect airflow to the nasal route, so that they produce a pleasant sensation. Finally, there are many patients who feel compelled to repeatedly produce tics with their face. Perhaps they are only instinctively counteracting a disturbed heat balance inside their head.

"Embodying Approach Motivation: A Review of Recent Evidence"

Capuchins overweight peaks and ends like us

How do we judge the overall emotional quality of an experience that extends over time and contains a mixture of high and low moments? Though one might think that people would construct something like an average assessment, our actual judgment rather obeys the so-called "peak-end rule": mental accounting is largely based on how we felt at the peak (i.e., the most intense point) and at the end. And according to the latest research, many capuchin monkeys measure their experiences by the same logic. But neither humans nor monkeys seem able to derive any hedonic benefit from that formula.

Economists have commonly assumed that people’s hedonic evaluations of an event are based on the sum of the event’s overall “decision utility” across time, and that this information about overall utility enables us to structure future experiences so as to maximize our hedonic gains, says a team of psychologists led by Louisa C. Egan Brad from the University of Portland. A growing body of evidence, however, suggests that these assumptions are empirically unsupported: We do not assign equal value to each of the successive moments in recalled experiences. Instead, we place disproportionate importance on two salient aspects of the event: the event’s peak-intensity and its endpoint.

In retrospect, we consider a movie as more pleasurable if the most enjoyable segment was concentrated into a short duration rather than drawn out across the event. In the same vein, people who experienced an improving sequence (negative, neutral, positive) of jelly bean flavors report greater liking of the experience compared with people who experienced the reverse sequence. Choosing a single great DVD conveys more satisfaction than choosing a great DVD plus a subsequent less-preferred one. In a seminal study, Nobel laureate Daniel Kahneman found that patients rated colonoscopies as less unpleasant if an interval of mild pain was added to the end of the procedure.
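The arithmetic behind these examples can be sketched in a few lines. The momentary ratings below are made-up numbers for illustration, not data from any of the studies cited; they merely show how two sequences with identical averages diverge once only the peak and the end are counted:

```python
# A minimal sketch (invented ratings, not study data) contrasting a simple
# average with a peak-end evaluation of a moment-by-moment pleasure sequence.

def average_utility(moments):
    """Mean of all momentary ratings - the 'economist's' evaluation."""
    return sum(moments) / len(moments)

def peak_end_utility(moments):
    """Peak-end rule: average of the most intense moment and the final one."""
    return (max(moments, key=abs) + moments[-1]) / 2

improving = [-2, 0, 3]   # e.g. jelly bean flavors: negative -> neutral -> positive
worsening = [3, 0, -2]   # the same moments in reverse order

# Both sequences contain exactly the same moments, so their averages match...
assert average_utility(improving) == average_utility(worsening)

# ...but the peak-end rule strongly favors the one that ends on a high note.
print(peak_end_utility(improving))  # (3 + 3) / 2 = 3.0
print(peak_end_utility(worsening))  # (3 + -2) / 2 = 0.5
```

This is why the improving jelly bean sequence is remembered as more pleasant than its reverse, even though moment for moment the two experiences are identical.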

Despite evidence that people's retrospective evaluations are often biased to overweight endpoints and peaks, relatively little empirical work has explored the origins of the peak-end rule, the researchers point out: "Does this rule depend on certain cognitive skills and experiences specific to human adults? Moreover, once the peak-end rule is present, can we learn to use it to maximize the utility of our experiences?" The few existing monkey results paint a mixed picture of whether nonhuman primates share human-like peak-end effects. To get to the bottom of this, the researchers conducted a series of experiments with capuchin monkeys.

They chose this primate species for two reasons. For one, in previous experiments capuchins had already shown evidence of other biases that haunt human decision making. Even more importantly, the same lab had successfully developed a token measure to test capuchins' willingness to pay for different kinds of experiences: the monkeys were already well versed in exchanging "money" for goods and services. The researchers used pieces of strawberry Pocky as a food reward. Each piece of Pocky is a stick-shaped biscuit that is partially frosted; each thus has a lower-value section of plain, unfrosted biscuit and a higher-value section of frosted biscuit. Given that higher concentrations of sugar make foods more attractive to capuchins, it was expected that the monkeys would assign greater value to frosted sections over unfrosted ones. This made it possible to create food-eating events that either started or ended with the better part.

In Condition 1, monkeys could choose to buy their "menu" from an experimenter who presented a food event that began with moderate pleasure and then became more highly rewarding (i.e., a high-end outcome), or from an experimenter who presented the same food event in reverse order, so that the highly valued outcome occurred first (i.e., a low-end outcome). In the second condition, monkeys had the choice between an experimenter who delivered a food reward event with a short but highly valued frosted section in the middle (i.e., a high-peak outcome) and a second experimenter who delivered a food reward event with a twice-as-long but half-as-desirable frosted section (i.e., a low-peak outcome).

As the results show, in their assessment of "utility" capuchins are like people too: the monkeys behaved similarly to adult humans with regard to retrospective evaluations. Half of the tested monkeys preferred (as shown by their willingness to pay) sequences with highly valued endpoints and high peak intensity. Along with previous work, the results of Study 1 suggest that monkeys may share human-like peak and endpoint effects and, indirectly, the mechanisms that produce such effects.

Another, possibly more important question is whether humans and animals can translate the logic of their own hedonic evaluations into practical utility. Can individuals structure the sequence of their own experiences in a way that produces the maximal emotional payoff, such as proverbially "saving the best for last"? To address this question, Study 2 gave human adults, children, and capuchins the opportunity to structure an uninterrupted eating experience from a discrete set of food rewards: participants could eat the foods, which were presented simultaneously, in whatever sequence they preferred. The researchers used the same food items in all three populations: blueberries, cherries, raspberries, and disk-shaped slices of both carrots and cucumbers.

The disappointing conclusion: none of these populations constructed sequences such that their rewards improved over the course of the sequence. Indeed, the different populations were very consistent in their constructed sequences - members of each ate their most preferred foods earlier in their sequences. "In light of the previous literature, an important contribution of Study 2 is adults' failure to save their most-preferred rewards in order to ensure that their sequences finish on a high note. Our results contrast with findings suggesting that adults prefer to choose improving sequences when considering hypothetical rewards... Human adults - who have had more experience with the effects of endpoints than children - failed to structure their experiences in ways that maximize remembered utility; in fact, from the perspective of maximizing hedonic utility, adults did not differ from children at all."

It may well be that, when structuring ongoing experiences, humans and animals alike fall prey to their deficiency in delaying gratification - the inability to resist the temptation of an immediate reward and wait for a later one.

As far as the psychologists are concerned, these results give us grounds for modesty: "These results also show that human-unique metacognition is not enough to allow our species to perform better sequence construction than a distantly related monkey species. Overall, these results suggest again that we are often blind to both the operation and best use of effects that profoundly affect our hedonic evaluations."

The Evolution and Development of Peak–End Effects for Past and Prospective Experiences

Second Nurture

The groundbreaking lessons of behavior genetics are about the environment, not about genes

To many, the old problem of nature versus nurture still looks like a fierce struggle over percentage points, similar to the dispute over the belongings in a dirty divorce battle. But aside from the noisy turf wars, behavioral genetics has quietly turned traditional notions about the impact of "nature" and "nurture" on their head. Ironically, the search for the influence of genes brought to light groundbreaking discoveries about the identity and scope of environmental conditions.

Over the last decades, twin studies of personality have consistently attributed approximately half of the variance in individual traits to genetic effects. But it would be disastrous if the intellectual contribution of behavioral genetics were confined to the promulgation of a percentage value. Drawing such a demarcation line in the sand would be unproductive and boring, because then any "explanation" would just mean that nature and nurture contribute that much or that little, and that's that. The really exciting search for the hidden processes behind the phenomena would be stifled by number crunching.

Fortunately, behavior geneticists made a discovery a few years ago that enriches human self-knowledge by a quantum leap. It relates not to genes but to environmental influence, and it raises the suspicion that we are only just beginning to illuminate the formative forces of biographical destiny. With the term "environment", sociologists and theorists usually associate milieu-shaping forces such as education, social class, housing and parenting style. All persons subjected to the same environment would be affected in the same direction; the personalities of siblings who grow up in the bosom of one family would unyieldingly be brought into line by that kind of "shared environment".

This possibility can be checked in a "waterproof" manner through the study of adoptive siblings, who have no genes in common. They are nevertheless socialized in the same milieu and should therefore become more similar over time. But they won't, Berlin psychology professor Jens Asendorpf takes stock. In fact, their personalities remain as varied as those of two individuals randomly chosen from the population, even after years of joint rearing. Everything the adoptive parents muster in the form of education and imprinting attempts rushes past the inexplicable social and emotional uniqueness of the individuals entrusted to them. (This is, in fact, also largely true for "normal" siblings, who exhibit certain similarities in their personality because they share 50% of the relevant genes. Still, their characters mostly drift apart, like two strangers passing each other on the street. That's why they say parents are behaviorists before the arrival of their first child, but become geneticists after the arrival of the second.)

These results are devastating for the "old" environmental theory, say researchers in behavioral genetics. This holds true even more for the mirror-image trend in monozygotic twins who were separated soon after birth and grew up in different milieus. The really surprising thing about these genetic look-alikes is, in reality, not their (sometimes spectacular) similarity. The most exciting result of twin research is the fact that the genetic "clones" are dissimilar to a defined extent - and this measure of dissimilarity is not one iota greater in twins who grow up in separate environments than in those reared together. Because they are exposed to different environmental conditions, their personalities should drift apart over time. But they end up with the same amount of resemblance (40 to 50 percent) as those who grew up in one family. Conclusion: all influences that make people (and, indeed, genetically identical people) DISSIMILAR already occur WITHIN one family. The influence of the "shared environment", i.e. the global environmental impact that comes about without distinction of person, is negligibly weak in all studies - and in no event exceeds a few percentage points.
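The logic of that decomposition can be made concrete with a few lines of arithmetic. The correlations below are invented round numbers in the range the text mentions, not figures from any particular study; they only illustrate how twin designs split trait variance into genetic, shared-environment, and non-shared-environment portions:

```python
# Illustrative sketch (made-up correlations, not real study data) of the
# standard twin-study variance decomposition:
#  - the similarity of identical twins reared APART estimates heritability
#    (they share all their genes but no family environment);
#  - the shared-environment effect is the EXTRA similarity of identical
#    twins reared TOGETHER beyond that baseline;
#  - whatever variance is left over is non-shared environment (plus error).

r_mz_together = 0.48   # assumed trait correlation, identical twins reared together
r_mz_apart    = 0.46   # assumed trait correlation, identical twins reared apart

heritability       = r_mz_apart                   # genes alone
shared_environment = r_mz_together - r_mz_apart   # the global family milieu
nonshared_residual = 1 - r_mz_together            # idiosyncratic experience + error

print(f"genetic:        {heritability:.2f}")
print(f"shared env:     {shared_environment:.2f}")   # ~0: the text's central point
print(f"non-shared env: {nonshared_residual:.2f}")
```

Because the reared-apart and reared-together correlations are nearly identical, the shared-environment term collapses toward zero, which is exactly the "devastating" result described above: everything environmental that matters lands in the non-shared residual.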

This insight, which emerged only a few years ago and has so far passed all empirical tests, "is probably the most significant result of behavioral genetics for personality research - more important than the evidence that personality differences are genetically co-determined," says Asendorpf. There certainly are experiences ("nurture") that shape character - at least to the extent that the genes do - but these are apparently personal, small and idiosyncratic experiences that every person undergoes for themselves, because they "democratically" sprinkle over all classes and strata. Basically, all the textbooks of sociology would have to be rewritten today. The cozy home is obviously not a monochrome dye bath from which identically tinted children emerge, but rather a collection of many micro-worlds.

While the investigation of the "non-shared factors" is still in its infancy, we can dare a few bets about their identity, says Asendorpf: "Personal acquaintances or individuals from the world of media, chosen as role models and imitated tentatively ... certain environmentally related diseases and disabilities, including all prenatal, non-genetic developmental disorders; emotionally arousing individual experiences. The special buddy from the growing years, the idealized teacher from early puberty or even the passionately devoured television series from childhood demonstrably leave a more lasting mark on the ego than all the coarse factors of family and class of origin." And all these "influences" themselves work only through a filter bound in intricate ways to the unique genetic makeup and unrepeatable life history of the respective "recipient".

Given this state of knowledge, psychology must basically admit that we have far less of an idea about the formative factors of the human being than it seemed before. Although scientists are not passionately bent on a theory that assigns a significant role to randomness, there is a good possibility that many critical circumstances impacting personality and intelligence rest on coincidences and chance events that possess significance only for a small minority of the population.

The new appeal of chance makes even science journalists thoughtful. "If pure biographical coincidence proves decisive, a science searching for 'laws' finds itself in a hopeless situation," writes a German newsmagazine. "A girl watches a nature film that fascinates her at the age of ten - and spends the rest of her life in the solitude of nature; her sister misses the program because she ran into her clique on the street, and she becomes the disco type..."

The fact that genetic heritage and the environment each contribute about fifty-fifty should not make milieu theorists complacent, along the lines of: then both sides have been right in the Thirty Years' Nature-Nurture War. That joy may be premature. First, the non-genetic sources of differences that belong to the environmental budget include even accidents, (non-genetic) diseases and all the influences to which we are exposed in the womb - causes some of which we would call "biological", and the consequences of which might literally be considered "innate". A recent study even suggests that, as far as intelligence is concerned, nonshared environmental effects are overwhelmingly prenatal.

Secondly, the differences in the genome make up by far the biggest single influence. The non-genetic variance, by contrast, consists of many very different factors; consequently, no single one of them carries enough weight to stand up against the genetic variance "block". All panaceas that start with "One only must ..." ("eliminate" poverty, send everybody to good schools, "just be" nice to the children) - fixated on a single, and only hypothetical, environmental cause - are doomed to failure. If they are effective at all, then each only minimally. Not only that: of the many potentially crucial environmental factors, to this day it is simply not known which ones really matter. The only thing we know is that it can't be those which socialization research always considered critical.

According to some accounts, we must even embrace the possibility that the share of our personality determined by genetic traits is actually increasing in the modern world. The more open and permeable a society is, the higher the heritability of individual traits turns out to be, because when the environment does not enforce differences, the remaining variation is necessarily of genetic origin. One can also put it this way: everyone selects the environment that best matches their genetic makeup; the greater the freedom of choice in a society, the more purely any remaining differences are the work of the genes. And of happenstance!

Post has attachment
The Baldwin Effect: When ideas change gene frequencies

Socially learned innovations modify the direction of evolution and the genetic composition of the population

We have almost become numbed to the continuous discovery of fancy new genes which impact the behavior or personality of animals and people in some form or another. Far less well known is that, over a hundred years ago, one of the most opalescent figures in the history of psychology turned the attention of the scientific community to the reverse trend: New behaviors and innovative cultural techniques can radically change the direction of genetic evolution and the frequency of its heritable elements. Only recently has the legacy of James Baldwin found its deserved recognition among researchers who study the dynamics of cultural evolution.

About 200 years ago, the French naturalist Jean-Baptiste Lamarck created Lamarckism, the doctrine of the inheritance of acquired characteristics. According to this line of thought, individuals pass on characteristics which they have acquired in the course of their lives to their offspring. But while Lamarckism fell into disfavor among geneticists, another, superficially related theory has proved substantially more durable. At the end of the nineteenth century, James Mark Baldwin was among America's foremost psychologists. Decades before Crick and Watson deciphered the molecular structure of the gene, Baldwin argued that new ideas and cultural achievements can steer evolution in new directions and revolutionize the composition of hereditary factors in a population. More specifically, innovations often create new basic conditions that allow their creators to colonize unused ecological niches. While doing so, the genome of the innovator does not change, but their work affects the "selection pressure" which natural selection exerts on future generations. Baldwin himself called this mechanism "Social Heredity", but it would become immortalized as the "Baldwin Effect".

Today, the phenomenon is typically illustrated in textbooks with the example of lactose tolerance. Most adult mammals lack the enzyme required to digest milk sugar (lactose). But primarily in the northern regions of the globe, up to 90 percent of all adult humans possess the critical gene which controls the formation of the enzyme. When mankind invented the domestication of dairy cattle around 10,000 years ago, new rules for natural selection were apparently laid down: Individuals who possessed the critical gene for the use of milk could conquer inhospitable habitats and rapidly spread their disposition.

Baldwinian evolution often causes unpredictable and undirected changes in the gene pool, emphasized anthropologist Terrence W. Deacon from Boston University in his widely acclaimed book "The Symbolic Species". One example of how convoluted the path can be is the gene which causes sickle cell anemia. People who carry the mutated gene at only one of the two loci are characterized by an increased resistance to malaria. This combination can be found in clusters in the regions of Africa where malaria is rampant. The roots of this distribution reach back into prehistory, when livestock farming started its triumphal march in Africa - and allowed the survival of the mosquito, the bloodsucking carrier of malaria. This cultural innovation changed the selection pressure such that people with the critical mutation could multiply their genes better.

Now, these examples might give the impression that the products of Baldwinian evolution are only flukes, some bizarre anomalies on a secondary arena of phylogeny. Not so, says Joseph Henrich from the Department of Psychology at the University of British Columbia in his new book "The Secret of Our Success", which lays the foundation of a comprehensive theory of cultural evolution. "The evolutionary biologist Kevin Laland and his collaborators have already fingered over 100 genes that have likely been under selection, based on analyses of the genome, and have at least plausible cultural origins. These genes influence an immense spectrum of traits ranging from dry earwax and malaria resistance to skeletal development and the digestion of plant toxins."

Most of the above examples stem from the emergence of food production - from agriculture and animal domestication, which took place after the advent of the so-called Neolithic revolution a good 10,000 years ago. "Nevertheless, there’s every reason to suspect that there was a cooking-and-fire revolution, a projectile-weapons revolution, and a spoken-language revolution, among many others." We can't even imagine how mass mobility, the computer and other contemporary innovations may change the course of biological evolution. There could be innumerable other still undiscovered Baldwin effects that decidedly contributed to the way we live our lives - and use our brains.

Over many generations, the outcomes of Baldwinian evolution may even be written into the brain as "instincts", says Deacon. That might be the way humans came by their "language instinct". Thus, humanlike ancestors long ago may have developed a primitive proto-language that did not require a specialized brain structure. But the advantages of linguistic communication were so striking that all conspecifics took over the mental tool - and even built nerve centers that contained innate linguistic skills, such as an elementary grammar. In fact, concurs Henrich, "the selection pressures created by culture can be among the most powerful observed in nature, and broad genetic sweeps can occur in tens of thousands of years. Culture-gene coevolution can be remarkably fast."

Baldwinian evolution has the greatest influence on species like ourselves, which produce a lot of innovations and pass them on to their offspring through social learning at high speed. But evidence is piling up that "cultural transmission" also modifies the gene pool of supposedly less able organisms. Especially dramatic is the example of the red colobus monkeys studied by American biologist Thomas Struhsaker on the African island of Zanzibar. The primates had suddenly developed an appetite for charcoal, which has a detoxifying effect that allowed them to consume previously inedible leaves. Young monkeys took over the innovation from their mothers, and after a while it spread to the whole population. The result: The monkeys conquered habitats that were previously denied to them and ultimately underwent a massive population explosion.

The forests in Israel were almost completely degraded in recent years and replaced with Jerusalem pines. Although their dietary base was destroyed by this change, the black rats managed a surprising comeback, discovered zoologist Joseph Terkel from Tel Aviv. The rodents suddenly developed the ability to strip the edible seeds from the pine cone. None of the necessary steps had previously existed in their behavioral repertoire. Once again, the innovation spread from the mothers to the pups and finally marched through the whole population, saving it from its genetic downfall.

Although Baldwin enjoyed high esteem in the academic world - his ideas being broadly discussed among the intelligentsia - his triumph in America ended with a bizarre and inglorious counterpoint, which has been portrayed meticulously in a recent historical treatise. In June 1908, Baldwin, then Professor of Psychology and Philosophy at Johns Hopkins University and at the pinnacle of his career, was arrested in a Baltimore house of prostitution, actually a "colored" brothel. Although he insisted on both his legal and moral innocence and all legal charges against him were dismissed, the threat of scandal led Hopkins authorities to demand Baldwin’s resignation. In the end, Baldwin was forced to remove himself and his family permanently to France.

More on the scandal that brought down Baldwin:

Post has attachment
When your reflection conjures up rejection: Mirrors betray the gap between aspiration and reality

Across cultures, mirrors have been associated with the "true self" - instruments that cannot lie and as such are symbols of correctness and purity. A glance in the mirror is more than simply a dispassionate look at a human image; it involves comparisons to standards of attractiveness and adequacy. And the results of those comparisons can put you in a bad mood - or spoil your appetite for a sought-after delicacy.

According to "Objective Self-Awareness theory", when attention is directed inward, individuals become the object of their own consciousness, says psychologist Ata Jami from the University of Central Florida. "Incidents such as gazing into a mirror, writing about the self, or seeing one’s own image in a photograph or videotape increase the likelihood of directing one’s attention toward the self and developing a state of self-awareness."

Seeing the man (or the woman) in the mirror creates a sudden awareness of how he/she is being seen by other people, which can have grave consequences for inner harmony. It gives rise to the process of comparing the self with the standards we are committed to - and according to which we may be judged by other people. Being reminded of a (potential) gap between what is and what should be can change behavior substantially. For example, an enhanced level of self-awareness decreases the likelihood of cheating on a test, strengthens sexual inhibitions, and reduces stereotyping, as the psychologist outlines previous research.

By comparing the self with ideal standards, a person is able to detect the discrepancy between self and standards. Perceiving a congruity between self and standards can induce positive affect, while any discrepancy can seriously put you down. But no matter which aspect of your personality you focus on, you will always be able to track down deviations from the strived-for perfection. In the best case, the self-criticism arising from that observation incites efforts toward betterment. If the individual, however, does not feel up to the challenge, they may strive for self-forgetfulness and any available way of retreating from the situation.

As many dieters may have experienced, a look into the mirror also has the potential to spoil one's appetite. "We argue that when people consume a food product in front of a mirror, they compare their behavior with the ideal standards of eating," Jami points out. "When they eat a healthy food, the self matches up with the standards and there appears no conflict in attributing the positive feelings of eating a healthy food to self. However, when  people eat an unhealthy food, there is a discrepancy between self and the standards."

Accordingly, when one fails to follow the standards, he/she does not want to look at a mirror because it enhances the discomfort of the failure. Attributing the discomfort of eating the unhealthy food to self would increase incongruity between self and standards, making oneself literally look bad. Therefore, the researcher speculates, "people attribute their feelings to the food and evaluate its taste to be lower since the food is the only readily available external factor." In other words, the food is being made the scapegoat and assumes responsibility for the repulsion that the sight of their own reflection generates in the onlooker. 

Which turned out to be true: In a taste test study, 185 undergraduate students chose between a chocolate cake and a fruit salad and then evaluated its taste in a room with a mirror or with no mirrors around. Those who selected the chocolate cake rated it as less tasty in the room with a mirror compared to those with no mirrors around. However, the presence of a mirror did not change the perceived taste of the fruit salad. "This research suggests that placing a mirror in dining rooms and other eating spaces so that diners can see themselves eat can be an effective way for individuals and restaurants to encourage healthier eating practices," says the accompanying press release. Perhaps manufacturers had better build a mirror directly into every new refrigerator. Preferably one of those curved mirrors that make you look like an elephant.

When subjects' objective self-awareness is triggered in the laboratory, they display increased levels of self-absorption, reflected in a relative increase of ego-related utterances. The same linguistic phenomenon can also be seen in people suffering from a depressive mood disorder. In general, objective self-awareness may bring about mental states resembling depression. When young children for the first time in their lives fully comprehend that their mirror image is the facade they present to others, they are stricken with shock and embarrassment. This is, by the way, an insight we humans hold over all our fellow creatures. Hearing recordings of one's own voice for the first time produces a similar effect:

Depression is, almost by definition, associated with painful sensations of displeasure. Assuming that an individual has little hope of achieving a particular target state, objective self-awareness produces just such unpleasant feelings. This was shown, among others, in a study whose participants were made to believe that they had failed badly in a previous test of intellectual performance. Some of the subjects were also told that there was little chance that they could pass the following test. Among those, the confrontation with a mirror led to a dramatic deterioration in performance and emotional well-being.

A loss of self-esteem is considered by various researchers as the cardinal symptom, if not the driving force, of depression. The state of self-awareness can have corresponding consequences, as was shown in a study whose participants had also received (fictitious) damning feedback regarding their performance on certain tests and then provided information about their capabilities. In the mirror condition, the respective subjects offered significantly scaled-down estimations of their own talents.

In contrast to other people, depressives are less susceptible to illusory misjudgments concerning their own characteristics and abilities, which is called "depressive realism". Non-depressives tend to overoptimistically appraise their talents, their future, their positive sides and their contributions to collective achievements. And indeed, mirrors can imbue the onlooker with a shot of depressive realism: In the state of objective self-awareness, subjects revealed more accurate information about their previous academic merits and professional accomplishments. In a way that still eludes our detailed understanding, self-awareness and depression are closely related. This confirms a vision of the German poet Friedrich Hebbel, who once said: "Self-awareness manifests itself in discovering more ailments in yourself than in others."

Healthy Reflections: The Influence of Mirror Induced Self-Awareness on Taste Perceptions

Post has attachment
The torturer in everyone

The readiness to inflict harm on others under authority pressure, as demonstrated in the Milgram experiment, remains unbroken

No one who harbors even a twinge of humanistic spirit can imagine themselves as a henchman of torture and human destruction. However, the American social psychologist Stanley Milgram furnished proof in the early 60s, with a series of harrowing experiments, that even the most decent people can, in the appropriate circumstances, become executors of barbaric behavior against others. As the latest research now suggests, the willingness to bow to "destructive obedience" hasn't declined one iota.

With an ingeniously simple experimental set-up that brought forth the darkest sides of the human soul, the Yale psychologist provided the impetus for a variety of dramatic adaptations and a whole range of international follow-up studies. Under the deceptive pretext that they would take part in an investigation of the effect of punishment on learning, Milgram asked his subjects to administer electric shocks of increasing intensity to a "student" learning a list of words. In the standard variant of the experiment, 65% of participants gave the (in fact only acting) "student" shocks of up to the limit of 450 volts, even when the victim screamed, suddenly complained about an alleged "heart defect" or raised terrible suspicions by falling silent.

In the last decades, the original experiment has given rise to innumerable replications and variations and has evoked intensive scientific debates. Whereas a dominant viewpoint suggests that individuals obey due to a passive conformation to the authority’s commands, recent findings suggest that participants’ willingness to engage in destructive obedience is rather "a reflection not of simple obedience, but of active identification with the experimenter and his mission," notes a team of psychologists led by Martial Mermillod from the University of Grenoble Alpes in France. According to that point of view, the Milgram studies are less about people conforming slavishly to the will of authority "than about getting people to believe in the importance of what they are doing."

In order to test the role of coercive pressure in the obedience process - and to find out if destructive obedience still raises its ugly head - the researchers used a role-playing version of the administrative obedience paradigm. It suffices to say that the original experimental setup would never again pass the review of a university ethics committee, because of the amount of stress and deception it subjected the subjects to. The experimenter presented himself as a consultant for a fictitious human resources consulting firm which had been approached by the university to evaluate its administrative employees every two years. In order to ensure the procedure's impartiality, the university requested the presence of a student at each evaluation as an "independent observer". Participants were told that the employees had to succeed in the test in order to keep their jobs. In order to identify those who would be dismissed, the participants had to stress the employee by making 10 pre-formulated negative remarks about his performance during the test. The remarks rose in pungency over the course of the experiment.

In the ‘‘Compliance with pressure’’ condition, participants were explicitly ordered by the consultant to enunciate all the remarks: ‘‘You have to enunciate all of these remarks very carefully, exactly as they appear in the notebook. You have to follow my instructions.’’ In the ‘‘Compliance without pressure’’ condition, the consultant insisted on the participant’s ‘‘freedom of action’’: ‘‘You came here voluntarily, I totally trust you to make sure that this interview is successful.’’ Then, the consultant introduced the employee (a confederate – male, 25 years) and the experiment began. When the participants refused (or hesitated) to formulate a remark, the consultant intervened with further injunctions, either of the ‘‘Compliance without pressure’’ or the ‘‘Compliance with pressure’’ variety.

Conclusion: In line with previous replications, a whopping 84% of the participants fully obeyed the consultant and administered the "toxic" remarks to the bitter end. Moreover, the level of obedience was higher in the ‘‘Compliance with pressure condition’’ (96%) than in the ‘‘Compliance without pressure’’ condition (73%).  These results seem to support the view that coercive pressure is a key component of the obedience process after all. However, even the ‘‘Compliance without pressure’’ condition elicited a strikingly high amount of obedience from participants.

These results complement previous reenactments of the Milgram experiment, which showed that participants bow to authority pressure even in circumstances vastly different from the original set-up. In one study, in which the job of the subjects consisted of pouring a fluid (purportedly hydrochloric acid) into another person's face, around 50% of participants did as they were instructed. Psychologist Grete Schurz from Graz in Austria found that 80% of all subjects abused the alleged "student" on command with allegedly excruciating, if not tissue-damaging, "ultrasonic pulses". To the disappointment of psychologists, knowledge of destructive obedience does not protect against it. Students who had gone through the results of the Milgram experiment in a seminar still displayed the same level of readiness for destructive obedience as the unsuspecting.

Furthermore, even with all scientific means available, no social group membership, no personality trait and no ideological leaning could be found that protected subjects against moral surrender. Race, creed, education level, occupation, age, income, "moral maturity", political orientation, and, paradoxically, even "obedience to authority" had no influence on behavior in the test situation. "Subjects who described themselves as rebellious and unruly obeyed blindly, while others, who saw themselves as adjusted and submissive, defied the experimenter," recounts Schurz. The irrelevance of personality traits is clear proof that the importance of "character" has been massively overestimated, declares American philosopher Gilbert Harman. Should we ascribe a character flaw to all participants in these experiments?

One of the enduring legacies of Milgram seems to be the insight that nobody, no matter how much civil courage they believe they harbor, can claim the moral high ground. Even if you possessed the right stuff to resist in one condition of the experiment, a subtle rearrangement of the set-up could be enough to unleash your inner bootlicker.

"Destructive Obedience Without Pressure"

Post has attachment
The lure of the forbidden

People have a built-in propensity to rebel against restrictions of their freedom, which can become a powerful motivation by itself: reactance

Since that fateful incident in the Garden of Eden, people lust after just those things that are expressly prohibited. In the same vein, they tend to develop a deep dislike for precisely those blessings others want to force upon them. The force that makes forbidden fruits taste particularly sweet has been a subject of psychological research for 60 years and will never lose its topicality.

The scientific notion of "reactance" was born 60 years ago, when social psychology was still fresh and ingenious and pursued grand and fertile ideas, instead of killing time and grant money with pointless nonsense such as social priming. In 1966, famed psychologist Jack W. Brehm led children to believe that they could choose between a number of candy bars. When the experimenter threatened the freedom to choose candy bar X by stating that candy bar X should not be chosen, children instantly became set on the discarded option.

"Reactance is an unpleasant motivational arousal that emerges when people experience a threat to or loss of their free behaviors," recaps a team of researchers led by Christina Steindl from the Department of Psychology at the University of Salzburg, Austria, in a current research review. It acts as a powerful motivation aimed at restoring one’s lost scope of action. On the cognitive side, it may make people derogate the source of the threat, put a higher value on the restricted freedom, or downgrade the imposed option.

Over the decades, psychologists have conducted a variety of experiments demonstrating the determining factors of reactance. Among several painted portraits, subjects zeroed in on the single one they were not allowed to take home. A strongly worded poster on the toilet wall, urging people not to paint or scribble on the walls, tempted amateur artists to do so all the more - much more so than a casual reminder. The knowledge that they had received information which was purposefully censored caused a boomerang effect, making subjects particularly curious about the withheld content. Individuals who had been given accurate predictions about the things they would do in the near future spared no effort to act otherwise, just in order to refute these prophecies. And the mere assertion that there were efforts to muzzle citizens increased the number of signatures on a petition that had been arranged by the experimenter.

Reactance also comes into play in matters of the heart between women and men. Countless literary dramas and so-called "tearjerkers" bear witness to the magic of forbidden, wicked and impossible love, which gives rise to the most violent surges of emotion. But sober scientists, too, have confirmed that it is beneficial to contain oneself when courting. For example, there was a study whose female participants were shown the descriptions of five men, one of whom they could choose for a date. One of these candidates, however, was suddenly said to jib at the tryst. Showing all the signs of reactance, the snubbed women now found this unattainable one the most enticing. Romeo and Juliet were welded together by the parental interdiction. Something similar happened in another study, which measured the attachment of partners who had coupled against the will of their fathers and mothers. No matter whether the parental opposition was based on racial, social, religious or economic reasons, these lovers stuck together particularly intensively.

Another area in which the effects of reactance can render void the original intentions is advertising. Purchase appeals which are formulated too obtrusively can have the opposite effect, as one experiment shows in which a certain food was touted with either the command "You may only buy X!" or the gentle invitation "Just try X!": The strict tone reduced sales of the product considerably. Conversely, a detergent explicitly banned in Florida because of its phosphate content sold particularly well in the neighboring states. Far more effective than the prohibition, on the other hand, were information cards unobtrusively placed on the shelves, which pointed to the harmfulness of phosphates.

All these defiant behaviors have a common motive. Obviously, the persons want to appear, to themselves and to others, to be free and self-determined: They therefore defend themselves against any attempt to narrow their scope of action. Any act against the prohibition thus conveys the satisfaction of not having been intimidated. The defiance will be greater the more important the freedom that is at stake, the more freedoms are threatened, and the more a threat evokes further threats. Reactance doesn't necessarily manifest itself in offenses against regulations. It can also show up in behaviors that resemble the forbidden one, in sending ahead a representative to oppose the limitation of freedom, or in openly rebelling against the rule makers. Reactance is also greater in those areas one knows best. Finally, an authority with which one identifies voluntarily - for example, a pop star - demonstrably arouses less reactance than a ruler who was imposed on the actor. Almost absurd is the reactance which sets in shortly after one has selected one of several alternative courses of action: The rejected options then temporarily appear particularly attractive, precisely because they now lie beyond one's own freedom of action.

The domain of life with the largest amount of reactance, due to its continuous limitations of freedom, is education. To the chagrin of every educator, it is the forbidden, hidden and unreachable toys, of all things, that appeal most strongly to the mind of the child. Behaviors which are fraught with lots of rules automatically gain popularity with the young. Even rewards can cause tantrums if the child perceives them as an attempt to "force" it in a particular direction. Contrary to popular notions, boys don't display more reactance or love of freedom than girls. Nevertheless, in an educational environment they make a stand more frequently than girls, perhaps because, on the one hand, they are granted more autonomy, but at the same time more rules are imposed on them than on girls.

Even aid that you give to other people can, under certain circumstances, be transformed by the reactance mechanism into a threat and lead to rebellion against the helper. This happens, for example, when the help seeker suspects that he will have to be grateful to the "Samaritan", or when the support is linked to preconditions. The greatest freedom often lies precisely in the belief that you can cope with a problem without getting help.

Reactance manifests itself not only in feelings and actions, but also in physiological arousal, the new research review points out. Merely imagining being barred from visiting a flat they might have wanted to rent was sufficient to immediately increase people’s heart rate - but only when the restriction was illegitimate, unexpected and inappropriate, not when it was legitimate. People may even experience vicarious reactance: They go through the full-blown emotion themselves while observing a threat to another person’s freedom. In one study, female participants observed an actor being excluded from a decision-making process. The researchers found that the participants themselves showed vicarious reactance, suddenly rating the attractiveness of the discussion topics higher when the actor was restricted versus not restricted. "Reactance can be aroused by the mere observance of a threat to another’s freedom, without the perception of one’s own freedom being potentially directly threatened." But there is a physiological difference between direct and vicarious reactance, a similar study showed: While there was an immediate increase in heart rate after self-restrictions, the increase after vicarious restrictions was delayed.

Because people experiencing reactance are striving to restore their freedom, the experience should be associated with approach motivation. This hypothesis was tested in one experiment using electroencephalography (EEG). And indeed, reactance was associated with heightened left frontal alpha asymmetry, which is thought to be an indicator of approach motivation. The experience of reactance obviously shares commonalities with the experience of anger, but it also bears resemblance to related emotional concepts, such as defiance and spite. To explore the neuronal signature of reactance and how it differs from anger, one team of researchers used fMRI to compare conditions in which participants read about reactance-arousing, anger-arousing, or neutral situations. During reactance-arousing compared to anger-arousing situations, the middle temporal lobe, the temporal poles, and the gyrus rectus were active. These regions have been shown to be involved in mentalizing processes, in which people draw inferences about the mental states of others. "This suggests one basis for distinguishing reactance processes from more pure anger processes, an issue that should be explored more carefully in future research."

"Understanding Psychological Reactance New Developments and Findings"

Post has attachment
Choosing a "Ditto" Mate: I'll have what she has

The propensity to imitate others' partner selection can overwrite genetically based preferences  

As long as you're single, no one wants to know. But the minute you get a partner, you're the centre of attention. This lousy phenomenon, as unfair as it may appear, can be traced back to the simplest earthly creatures - and possesses the power to change the course of evolution.

First and foremost, the females of many species practice what ethologists call “mate choice copying”. Instead of searching for and choosing a partner themselves, they copy the choice of other females. The female in question witnesses how a female conspecific selects a certain male as a sexual partner and then tries to poach the "second-hand" mate. Scientists have recently tracked mate choice copying behavior in the fruit fly, in several fish species, in a couple of birds, and last but not least in humans. Bottom line: Males become more desirable to females when in the company of another female. For a female to find out whether a potential mate is fit to sire her offspring would involve a great deal of time, cost and risk. So why not simply copy the mate choice of other females who have already got it over and done with?

So far, studies investigating mate choice copying in females are far more numerous than those examining the mechanism in males, emphasizes a team of biologists led by Klaudia Witte from the Research Group of Ecology and Behavioral Biology at the University of Siegen in Germany. "This is probably because females are considered to be the choosier sex in most species because they usually face higher reproductive investment. High reproductive investment is expected to select for increased choosiness, and copying should reduce the cost of choosiness."

However, there is now clear evidence that the supposedly "stronger" sex pursues this imitative mating strategy as well. Copying might be beneficial for males if mating or sperm production is costly, or if males provide parental care and thus cannot afford to spread their seed indiscriminately. In the deep-snouted pipefish, for example, a sex-role-reversed species in which males are choosier than females, males but not females copy the mate choice of their conspecifics. The same is true for males of the darter Etheostoma flabellare, a species in which males provide parental care by guarding a nest site under a rock and tending the developing eggs. When both sexes face high costs of reproduction, as in the stickleback, both sexes use mate-choice copying. Incidentally, in our own species, almost exclusively women have been observed imitating the partner choice of their fellow females - with the exception of a study in a speed-dating setting, where both sexes embraced a potential mate who had previously been embraced by others. Finally, one crucial difference between the sexes is that, so far, generalization - choosing a partner who merely shares the general characteristics of the original model rather than being that individual - has only been found in females.

The influence of mate choice copying can be so strong that socially acquired information can override genetically based preferences for certain male phenotypes in females. Hence, mate choice copying can evidently play a role in the evolution of new secondary sexual traits. Sailfin molly females, which under natural conditions prefer larger males, maintain a socially learned mate preference for smaller males and can therefore serve as model females for other conspecifics. Additionally, they remember an observed sexual interaction for at least one day and can thus copulate with the same male not only immediately after the observed female, but at a safer moment with regard to predation risk and/or sperm depletion in males.

Generalization is a prerequisite for the cultural inheritance of socially driven mate choice. And indeed, sailfin molly females generalized a learned preference for smaller males across individual males. A number of studies have demonstrated that even the choice of artificially created phenotypes can be copied. Female zebra finches copied the choice of other females and preferred males of the same artificial phenotype (leg band color) as the observed female’s mate. A red feather on the forehead or an artificial yellow plastic sword with a black border served the same purpose in other species. Because of the female-specific ability to generalize and to copy the choice of specific phenotypes instead of individuals, mate choice copying is more likely to support the evolution of new traits in males than in females or in both sexes. Thus, the phenomenon can lead to stronger sexual dimorphism within a species.

In some instances, mate choice copying may produce strange effects. Ever heard of the Amazon molly? In this Mexican freshwater fish, nature has gone completely crazy. It looks as if a feminist utopia had become reality: the females are completely independent and have relinquished the male sex. They no longer need to be fertilized by males, because they clone themselves by way of “parthenogenesis”. But female mollies do still need some kind of masculine assistance: like a throwback to the earlier sexual stages of their evolution, they still need the copulation ritual in order to get in the mood for propagation.

But because this species consists entirely of females, they must allow the males of a closely related fish species to have their way with them. Although the “act” looks like the real thing, no genetic material is transmitted; it is all for show. But why should the assistants, the sailfin molly males, participate in this ruse in the long run? Eventually, they would have to realize that they are investing their precious time and genetic material without ever reaping any reproductive benefit. And here the circle closes: the males are rewarded for their assistance to the foreign mollies through increased admiration by the females of their own species. Male sailfin mollies gain sex appeal when they are watched in the company of foreign females. As if the sailfin females were thinking: "This must be quite a stunner, when even the foreign females chase after him.”

Mate-choice copying: Status quo and where to go

More good stuff on sex:

When sex goes stale through repetition:

No escape from a sex starved world

If there was a God, the clitoris would sit in the vagina

Sex under false pretenses

Sodom and Kandahar: How the invisibility of adult women leads to sick twists in male sexuality