Fire Hydrant of Freedom


Title: Evolutionary biology/psychology
Post by: Crafty_Dog on November 01, 2006, 05:41:09 AM
An Evolutionary Theory of Right and Wrong
 
By NICHOLAS WADE
NY Times
Published: October 31, 2006
Who doesn't know the difference between right and wrong? Yet that essential knowledge, generally assumed to come from parental teaching or religious or legal instruction, could turn out to have a quite different origin.



Primatologists like Frans de Waal have long argued that the roots of human morality are evident in social animals like apes and monkeys. The animals' feelings of empathy and expectations of reciprocity are essential behaviors for mammalian group living and can be regarded as a counterpart of human morality.

Marc D. Hauser, a Harvard biologist, has built on this idea to propose that people are born with a moral grammar wired into their neural circuits by evolution. In a new book, "Moral Minds" (HarperCollins 2006), he argues that the grammar generates instant moral judgments which, in part because of the quick decisions that must be made in life-or-death situations, are inaccessible to the conscious mind.

People are generally unaware of this process because the mind is adept at coming up with plausible rationalizations for why it arrived at a decision generated subconsciously.

Dr. Hauser presents his argument as a hypothesis to be proved, not as an established fact. But it is an idea that he roots in solid ground, including his own and others' work with primates and in empirical results derived by moral philosophers.

The proposal, if true, would have far-reaching consequences. It implies that parents and teachers are not teaching children the rules of correct behavior from scratch but are, at best, giving shape to an innate behavior. And it suggests that religions are not the source of moral codes but, rather, social enforcers of instinctive moral behavior.

Both atheists and people belonging to a wide range of faiths make the same moral judgments, Dr. Hauser writes, implying "that the system that unconsciously generates moral judgments is immune to religious doctrine." Dr. Hauser argues that the moral grammar operates in much the same way as the universal grammar proposed by the linguist Noam Chomsky as the innate neural machinery for language. The universal grammar is a system of rules for generating syntax and vocabulary but does not specify any particular language. That is supplied by the culture in which a child grows up.

The moral grammar too, in Dr. Hauser's view, is a system for generating moral behavior and not a list of specific rules. It constrains human behavior so tightly that many rules are in fact the same or very similar in every society: do as you would be done by; care for children and the weak; don't kill; avoid adultery and incest; don't cheat, steal or lie.

But it also allows for variations, since cultures can assign different weights to the elements of the grammar's calculations. Thus one society may ban abortion, another may see infanticide as a moral duty in certain circumstances. Or as Kipling observed, "The wildest dreams of Kew are the facts of Katmandu, and the crimes of Clapham chaste in Martaban."

Matters of right and wrong have long been the province of moral philosophers and ethicists. Dr. Hauser's proposal is an attempt to claim the subject for science, in particular for evolutionary biology. The moral grammar evolved, he believes, because restraints on behavior are required for social living and have been favored by natural selection because of their survival value.

Much of the present evidence for the moral grammar is indirect. Some of it comes from psychological tests of children, showing that they have an innate sense of fairness that starts to unfold at age 4. Some comes from ingenious dilemmas devised to show a subconscious moral judgment generator at work. These are known by the moral philosophers who developed them as "trolley problems."

Suppose you are standing by a railroad track. Ahead, in a deep cutting from which no escape is possible, five people are walking on the track. You hear a train approaching. Beside you is a lever with which you can switch the train to a sidetrack. One person is walking on the sidetrack. Is it O.K. to pull the lever and save the five people, though one will die?

Most people say it is.

Assume now you are on a bridge overlooking the track. Ahead, five people on the track are at risk. You can save them by throwing down a heavy object into the path of the approaching train. One is available beside you, in the form of a fat man. Is it O.K. to push him to save the five?

Most people say no, although lives saved and lost are the same as in the first problem.

Why does the moral grammar generate such different judgments in apparently similar situations? It makes a distinction, Dr. Hauser writes, between a foreseen harm (the train killing the person on the track) and an intended harm (throwing the person in front of the train), despite the fact that the consequences are the same in either case. It also rates killing an animal as more acceptable than killing a person.
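The foreseen/intended contrast described above can be made concrete as a toy decision rule. The sketch below is a hypothetical illustration of the kind of computation Hauser attributes to the moral grammar, not his actual model; the scenario fields and the rule itself are my own simplifications.

```python
# Toy sketch of the foreseen/intended distinction from the trolley problems.
# This illustrates the *kind* of rule Hauser posits; it is not his model.

def judge(scenario):
    """Return 'permissible' or 'impermissible' for a trolley-style dilemma.

    The rule: harm that is merely a foreseen side effect of saving others
    is tolerated when more lives are saved than lost; harm used as the
    *means* of saving them is not, regardless of the body count.
    """
    if scenario["harm_is_means"]:        # the victim's death is the instrument
        return "impermissible"
    if scenario["lives_saved"] > scenario["lives_lost"]:
        return "permissible"             # foreseen side effect, net lives saved
    return "impermissible"

# The two dilemmas from the article: same lives saved and lost,
# but the harm plays a different causal role.
switch = {"lives_saved": 5, "lives_lost": 1, "harm_is_means": False}
push   = {"lives_saved": 5, "lives_lost": 1, "harm_is_means": True}

print(judge(switch))  # most people agree: permissible
print(judge(push))    # most people agree: impermissible
```

The point of the sketch is that identical outcomes (5 saved, 1 lost) yield opposite judgments once the causal role of the harm is part of the input, which is exactly the asymmetry the two dilemmas expose.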

Many people cannot articulate the foreseen/intended distinction, Dr. Hauser says, a sign that it is being made at inaccessible levels of the mind. This inability challenges the general belief that moral behavior is learned. For if people cannot articulate the foreseen/intended distinction, how can they teach it?

Dr. Hauser began his research career in animal communication, working with vervet monkeys in Kenya and with birds. He is the author of a standard textbook on the subject, "The Evolution of Communication." He began to take an interest in the human animal in 1992 after psychologists devised experiments that allowed one to infer what babies are thinking. He found he could repeat many of these experiments in cotton-top tamarins, allowing the cognitive capacities of infants to be set in an evolutionary framework.

His proposal of a moral grammar emerges from a collaboration with Dr. Chomsky, who had taken an interest in Dr. Hauser's ideas about animal communication. In 2002 they wrote, with Dr. Tecumseh Fitch, an unusual article arguing that the faculty of language must have developed as an adaptation of some neural system possessed by animals, perhaps one used in navigation. From this interaction Dr. Hauser developed the idea that moral behavior, like language behavior, is acquired with the help of an innate set of rules that unfolds early in a child's development.

Social animals, he believes, possess the rudiments of a moral system in that they can recognize cheating or deviations from expected behavior. But they generally lack the psychological mechanisms on which the pervasive reciprocity of human society is based, like the ability to remember bad behavior, quantify its costs, recall prior interactions with an individual and punish offenders. "Lions cooperate on the hunt, but there is no punishment for laggards," Dr. Hauser said.

The moral grammar now universal among people presumably evolved to its final shape during the hunter-gatherer phase of the human past, before the dispersal from the ancestral homeland in northeast Africa some 50,000 years ago. This may be why events before our eyes carry far greater moral weight than happenings far away, Dr. Hauser believes, since in those days one never had to care about people remote from one's environment.

Dr. Hauser believes that the moral grammar may have evolved through the evolutionary mechanism known as group selection. A group bound by altruism toward its members and rigorous discouragement of cheaters would be more likely to prevail over a less cohesive society, so genes for moral grammar would become more common.

Many evolutionary biologists frown on the idea of group selection, noting that genes cannot become more frequent unless they benefit the individual who carries them, and a person who contributes altruistically to people not related to him will reduce his own fitness and leave fewer offspring.

But though group selection has not been proved to occur in animals, Dr. Hauser believes that it may have operated in people because of their greater social conformity and willingness to punish or ostracize those who disobey moral codes.

"That permits strong group cohesion you don't see in other animals, which may make for group selection," he said.

His proposal for an innate moral grammar, if people pay attention to it, could ruffle many feathers. His fellow biologists may raise eyebrows at his proposing such a big idea when much of the supporting evidence has yet to be acquired. Moral philosophers may not welcome a biologist's bid to annex their turf, despite Dr. Hauser's expressed desire to collaborate with them.

Nevertheless, researchers? idea of a good hypothesis is one that generates interesting and testable predictions. By this criterion, the proposal of an innate moral grammar seems unlikely to disappoint.


Title: Looking at Flipper
Post by: Crafty_Dog on November 03, 2006, 01:18:38 PM
Looking at Flipper, Seeing Ourselves
By FRANS de WAAL
Published: October 9, 2006
Atlanta

NO one blinks when a celebrity is called "vacuous" or a politician a
"moron" - but when headlines screamed that dolphins are "dimwits" and
"flippin' idiots," I was truly shocked. Is this a way to talk about an
animal so revered that there are several Web domain names that include
"smart dolphin"?

This is not to say that one should believe everything about them. For
example, their supposed "smile" is fake (they lack the facial musculature
for expressions), and all we seem to have learned from chatting "dolphinese"
with them is that lone male dolphins are keenly interested in female
researchers.

Nevertheless, it's going too far to say that dolphins are dimwits. Yet this
is the claim of Paul Manger, a South African scientist who says that
dolphins' relatively large brains are due simply to a preponderance of fatty
glial cells. These glia produce heat, which allows the brain's neurons to do
their job in the cold ocean.

Based on this observation, Professor Manger couldn't resist speculating that
the intelligence of dolphins and other cetaceans (like whales and porpoises)
is vastly overrated. He offered gems of insight, such as that dolphins are
too stupid to jump over a slight barrier (as when they are trapped in a tuna
net), whereas most other animals will. Even a goldfish will jump out of its
bowl, he noted.

If we skip the technicalities - such as that glial cells are not simply
insulation, that they add connectivity to the brain, and that humans, too,
have many more glial cells than neurons - the question remains why the
prospect of animal intelligence sets off such controversy. Could it be that
the huge size of the dolphin brain, which exceeds ours by 15 percent or
more, threatens the human ego? Are we to ignore the billions and billions of
neurons that dolphins do possess?

The goldfish remark reminded me of a common strategy of those who play down
animal intelligence. They love to "demonstrate" remarkable cognitive feats
in small-brained species: if a rat or pigeon can do it, it can't be that
special. Thus, some pigeons have been trained to use "symbolic
communication" by pecking a key marked "thank you!" that delivered food to
another pigeon. And they have also been conditioned to peck at their own
bodies in front of a mirror, supporting the claim that they are
"self-aware."

Clearly, pigeons are trainable. But is this truly comparable to the actions
of Presley, a dolphin at the New York Aquarium, who, without any rewards,
reacted to being marked with paint by taking off at high speed to a distant
part of his tank where a mirror was mounted? There he spun round and round,
the way we do in a dressing room, appearing to check himself out.

What is so upsetting to some people about the closeness between animal and
human intelligence, or between animal and human emotions, for that matter?
Just saying that animals can learn from each other, and hence have
rudimentary cultures, or that they can be jealous or empathic is taken by
some as a personal affront. Accusations of anthropomorphism will fly, and we'll
be urged to be parsimonious in our explanations. The message is that animals
are not humans.

That much is obvious. But it is equally true that humans are animals. Is it
so outlandish, from an evolutionary standpoint, to assume that if a
large-brained mammal acts similarly to us under similar circumstances, the
psychology behind its behavior is probably similar, too? This is true
parsimony in the scientific sense, the idea that the simplest explanation is
often the best. Those who resist this framework are in "anthropodenial" -
they cling to unproven differences.

Since Aristotle, humans have known that dolphins are incredibly social. Each
individual produces its own unique whistle sound by which the others
recognize him or her. They enjoy lifelong bonds and reconcile after fights
by means of "petting." The males form power-seeking coalitions, not unlike
the politics of chimpanzees and humans. Dolphins also support sick
companions near the surface, where they can breathe. They may encircle a
school of herring, driving the fish together in a compact ball and releasing
bubbles to keep them in place, after which they pick their food like fruit
from a tree.

In captivity, dolphins are known to imitate the gait and gestures of people
walking by, and to outsmart their keepers. One female dolphin that was
rewarded with a fish for every piece of debris she collected from her tank
managed to con her trainers into a bounty of snacks. They discovered she
had been hiding large items like newspapers underwater, only to rip small
pieces from them, bringing these to her trainer one by one.

There are tons of such observations, which is why most of us believe in
dolphin intelligence - glia or no glia. It also explains why the slaughter
of dolphins, as still occurs every year in Japan, arouses such strong
emotions and controversy.

Still, I must admit that the whole dolphin affair has also offered me some
fresh insights. From now on, if I find my goldfish thrashing on the floor, I
will congratulate him before dropping him back into his bowl.

Frans de Waal, a professor of psychology at Emory University, is the author
of "Our Inner Ape."
Title: Neanderthal Genome to be Mapped
Post by: Body-by-Guinness on November 09, 2006, 12:24:14 PM
Once this is sequenced it'll be interesting to compare and contrast it to the human genome:


Scientists Create Neanderthal Genome

Wednesday, 8th November 2006, 19:06
Scientists are reconstructing the genome of Neanderthals - the close relations of modern man.

The ambitious project involves isolating genetic fragments from fossils of the prehistoric beings who originally inhabited Europe to map their complete DNA.

The Neanderthal people were believed to have died out about 35,000 years ago - at a time when modern humans were advancing across the continent.

Lead researcher Dr Svante Paabo, an evolutionary geneticist at the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany, said: "This would be the first time we have sequenced the entire genome of an extinct organism."

But the prospect of using the genome to produce a living Neanderthal has been ruled out.

A popular caricature portrays Neanderthals as beetle-browed brutes - but this is far from the truth, reports New Scientist.

"Neanderthals were sophisticated stone-tool makers and made razor-sharp knives out of flint," said Dr Richard Klein, an anthropologist at Stanford University, California.

"They made fires when and where they wanted and seem to have made a living by hunting large mammals such as bison and deer."

Neanderthals also buried their dead, which, fortunately for researchers, increases the odds of the bones being preserved.

"By sequencing their entire genome we can begin to learn more about their biology," said Dr Eddy Rubin, a geneticist at the Lawrence Berkeley National Laboratory in Walnut Creek, California.

The genetic questions could also solve the biggest mystery of all - why did Neanderthals die out while modern humans went on to conquer the globe?

Dr Paabo and colleagues pioneered the genetic study of Neanderthals by extracting and decoding fragments of
mitochondrial DNA (mtDNA) from the bones of the original specimen, discovered in 1856 in the Neander Valley in Germany.

The mtDNA Dr Paabo sequenced suggested humans split from Neanderthals about 500,000 years ago - which fits neatly with the fossil record. It also suggested Neanderthals did not interbreed with our ancestors.

Dr Paabo's team have selected two Neanderthal specimens to work on because
both have relatively uncontaminated, "clean" DNA.

One is a 38,000-year-old fossil from Vindija, Croatia. The other is the original specimen, which, despite being
extensively handled, has unusually clean DNA in its right upper arm bone.

During its lifetime the individual lost the use of its left arm after breaking it and had to rely on the right arm - causing the bones to grow thicker and denser than usual.

After death this shielded the DNA from contamination. The researchers are also hunting for new specimens that can be sampled before other people get their hands on them.

They have so far sequenced about a million base pairs of nuclear DNA from the Croatian fossil and hope to publish a draft of the whole genome in two years.

"It is definitely possible to sequence the entire genome from such well-preserved specimens," said Dr Eske Willerslev, an expert in ancient DNA at the University of Copenhagen, Denmark.

"Perhaps the biggest difficulty will be verifying the sequences obtained are genuinely from the Neanderthal genome and not a contaminant - as so much of it will be identical to the human genome."

The genome is sure to fuel the particularly intense controversy that has surrounded a
much-vaunted aspect of human uniqueness - language.

"There's been a debate going for more than 30 years about the speech capabilities of Neanderthals," says Dr Philip
Lieberman, a cognitive scientist at Brown University in Providence, Rhode Island.

"It's clear from the fossil record and comparisons with modern humans that Neanderthals could speak."

But the prospect of the genome providing the blueprint for resurrecting a living "Jurassic-Park-style" Neanderthal is unlikely.

Dr Paabo said: "We would be able to create a physical Neanderthal genome but we will not be able to recreate a Neanderthal - even if we wanted to."
Title: Nutrition & Genetic Inheritance
Post by: Body-by-Guinness on November 13, 2006, 09:10:06 PM
2:00 13 November 2006
NewScientist.com news service
Roxanne Khamsi


A mother's diet can change the behaviour of a specific gene for at least two subsequent generations, a new study demonstrates for the first time.

Feeding mice an enriched diet during pregnancy silenced a gene for light fur in their pups. And even though these pups ate a standard, un-enriched diet, the gene remained less active in their subsequent offspring.

The findings could help explain the curious results from recent studies of human populations, including one showing that the grandchildren of well-fed Swedes had a greater risk of diabetes.

The new mouse experiment lends support to the idea that we inherit not only our genes from our parents, but also a set of instructions that tell the genes when to become active. These instructions appear to be passed on through "epigenetic" changes to DNA: genes can be activated or silenced according to the chemical groups that are added onto them.

Gene silencer

David Martin at the Children's Hospital Oakland Research Institute in California, US, and colleagues used a special strain of genetically identical mice with an overactive version of a gene that influences fur colour. Mice with the AVY version of this gene generally have golden fur.

Half of the mice were given a diet enriched with nutrients such as vitamin B12 and zinc. These nutrients are known to increase the availability of the "methyl" chemical groups that are responsible for silencing genes. The rest of the mice received a standard diet.

The pups of mice on the standard diet generally had golden fur. But a high proportion of those born to mice on the enriched diet had dark brown fur.

Martin believes that the nutrient-rich maternal diet caused silencing of the pups' AVY genes while they developed in the womb.

Passed down

Intriguingly, even though all of the pups in this generation received a standard diet, those exposed to a high-nutrient diet in the womb later gave birth to dark-coated offspring. Their control counterparts, by comparison, produced offspring with golden fur.

This shows that environmental factors, such as an enriched diet, can affect the activity of the AVY gene for at least two generations, the researchers say.
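The two-generation effect described above can be illustrated with a toy simulation. Every probability below is invented for illustration only; these are not the study's measured frequencies, and the real mechanism (methyl-group marks inherited at the AVY locus) is far more complex than a single silencing probability.

```python
# Toy model of the experiment described above: an enriched maternal diet
# raises the chance that the AVY coat-colour gene is methylated (silenced),
# and a methylated mother passes an elevated silencing chance to her pups
# even when everyone has returned to a standard diet.
# All probabilities are invented for illustration.
import random

def offspring_coat(mother_methylated, enriched_diet, rng):
    """Return (pup_methylated, coat_colour) for one pup."""
    p_silenced = 0.1                      # baseline chance of silencing
    if enriched_diet:
        p_silenced += 0.6                 # extra methyl donors in the diet
    if mother_methylated:
        p_silenced += 0.5                 # epigenetic state inherited
    methylated = rng.random() < min(p_silenced, 1.0)
    return methylated, ("dark brown" if methylated else "golden")

rng = random.Random(0)

# Generation 1: mothers on enriched vs standard diet.
gen1_enriched = [offspring_coat(False, True, rng) for _ in range(1000)]
gen1_control  = [offspring_coat(False, False, rng) for _ in range(1000)]

# Generation 2: all mice now eat a standard diet, but inherit methylation.
gen2_enriched = [offspring_coat(m, False, rng) for m, _ in gen1_enriched]
gen2_control  = [offspring_coat(m, False, rng) for m, _ in gen1_control]

def dark(pups):
    return sum(colour == "dark brown" for _, colour in pups) / len(pups)

print(f"gen 1: enriched {dark(gen1_enriched):.0%} vs control {dark(gen1_control):.0%} dark")
print(f"gen 2: enriched {dark(gen2_enriched):.0%} vs control {dark(gen2_control):.0%} dark")
```

Run it and the enriched lineage stays darker in generation 2 despite the standard diet, which is the qualitative pattern the mouse study reports.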

"The results make it clear that a nutritional status can affect not only that individual, but that individual's children as well," says study member Kenneth Beckman.

Skin colour

Beckman notes that the AVY gene is linked to weight and diabetes risk. He adds that there is some evidence that a related gene in humans might affect skin colour, but it is unknown if it also affects weight.

Even though humans may have a similar gene, they should not make dietary changes based on the results of the mouse experiment, researchers stress. ?It would be irresponsible to make any prescriptions about human behaviour based on these findings,? says Martin.

An earlier Swedish study, which used historical data on harvests in Sweden, found that a youngster had a quadrupled risk of diabetes if their grandfather had good access to food during his own boyhood (see Grandad's diet affects descendants' health).

Journal reference: Proceedings of the National Academy of Sciences (DOI: 10.1073/pnas.0607090103)

Title: Can't Think what you Can't Say, Part I
Post by: Body-by-Guinness on November 17, 2006, 01:24:02 AM
So broad in scope it seems misfiled; Dalrymple is becoming one of my favorite essayists.

The Gift of Language

Theodore Dalrymple

No, Dr. Pinker, it's not just from nature.

Now that I've retired early from medical practice in a slum hospital and the prison next door, my former colleagues sometimes ask me, not without a trace of anxiety, whether I think that I made the right choice or whether I miss my previous life. They are good friends and fine men, but it is only human nature not to wish unalloyed happiness to one who has chosen a path that diverges, even slightly, from one's own.

Fortunately, I do miss some aspects of my work: if I didn't, it would mean that I had not enjoyed what I did for many years and had wasted a large stretch of my life. I miss, for instance, the sudden illumination into the worldview of my patients that their replies to simple questions sometimes gave me. I still do a certain amount of medico-legal work, preparing psychiatric reports on those accused of crimes, and recently a case reminded me of how sharply a few words can bring into relief an entire attitude toward life and shed light on an entire mental hinterland.

A young woman was charged with assault, under the influence of alcohol and marijuana, on a very old lady about five times her age. Describing her childhood, the young accused mentioned that her mother had once been in trouble with the police.

"What for?" I asked.

"She was on the Social [Security] and working at the same time."

"What happened?" I asked.

"She had to give up working." The air of self-evidence with which she said this revealed a whole world of presuppositions. For her, and those around her, work was the last resort; economic dependence on state handouts was the natural condition of man.

I delighted in what my patients said. One of them always laced his statements with proverbs, which he invariably mangled. "Sometimes, doctor," he said to me one day, "I feel like the little boy with his finger in the dike, crying wolf." And I enjoyed the expressive argot of prison. The prison officers, too, had their own language. They called a loquacious prisoner "verbal" if they believed him to be mad, and "mouthy" if they believed him to be merely bad and willfully misbehaving.

Brief exchanges could so entertain me that on occasion they transformed duty into pleasure. Once I was called to the prison in the early hours to examine a man who had just tried to hang himself. He was sitting in a room with a prison officer. It was about three in the morning, the very worst time to be roused from sleep.

"The things you have to do for Umanity, sir," said the prison officer to me.

The prisoner, looking bemused, said to him, "You what?"

"U-manity," said the prison officer, turning to the prisoner. "You're Uman, aren't you?"

It was like living in a glorious comic passage in Dickens.

For the most part, though, I was struck not by the verbal felicity and invention of my patients and those around them but by their inability to express themselves with anything like facility: and this after 11 years of compulsory education, or (more accurately) attendance at school.

With a very limited vocabulary, it is impossible to make, or at least to express, important distinctions and to examine any question with conceptual care. My patients often had no words to describe what they were feeling, except in the crudest possible way, with expostulations, exclamations, and physical displays of emotion. Often, by guesswork and my experience of other patients, I could put things into words for them, words that they grasped at eagerly. Everything was on the tip of their tongue, rarely or never reaching the stage of expression out loud. They struggled even to describe in a consecutive and logical fashion what had happened to them, at least without a great deal of prompting. Complex narrative and most abstractions were closed to them.

In their dealings with authority, they were at a huge disadvantage: a disaster, since so many of them depended upon various public bureaucracies for so many of their needs, from their housing and health care to their income and the education of their children. I would find myself dealing on their behalf with those bureaucracies, which were often simultaneously bullying and incompetent; and what officialdom had claimed for months or even years to be impossible suddenly, on my intervention, became possible within a week. Of course, it was not my mastery of language alone that produced this result; rather, my mastery of language signaled my capacity to make serious trouble for the bureaucrats if they did not do as I asked. I do not think it is a coincidence that the offices of all those bureaucracies were increasingly installing security barriers against the physical attacks on the staff by enraged but inarticulate dependents.

All this, it seems to me, directly contradicts our era's ruling orthodoxy about language. According to that orthodoxy, every child, save the severely brain-damaged and those with very rare genetic defects, learns his or her native language with perfect facility, adequate to his needs. He does so because the faculty of language is part of human nature, inscribed in man's physical being, as it were, and almost independent of environment. To be sure, today's language theorists concede that if a child grows up completely isolated from other human beings until the age of about six, he will never learn language adequately; but this very fact, they argue, implies that the capacity for language is "hardwired" in the human brain, to be activated only at a certain stage in each individual's development, which in turn proves that language is an inherent biological characteristic of mankind rather than a merely cultural artifact. Moreover, language itself is always rule-governed; and the rules that govern it are universally the same, when stripped of certain minor incidentals and contingencies that superficially appear important but in reality are not.

It follows that no language or dialect is superior to any other and that modes of verbal communication cannot be ranked according to complexity, expressiveness, or any other virtue. Thus, attempts to foist alleged grammatical "correctness" on native speakers of an "incorrect" dialect are nothing but the unacknowledged and oppressive exercise of social control: the means by which the elites deprive whole social classes and peoples of self-esteem and keep them in permanent subordination. If they are convinced that they can't speak their own language properly, how can they possibly feel other than unworthy, humiliated, and disenfranchised? Hence the refusal to teach formal grammar is both in accord with a correct understanding of the nature of language and is politically generous, inasmuch as it confers equal status on all forms of speech and therefore upon all speakers.

The locus classicus of this way of thinking, at least for laymen such as myself, is Steven Pinker's book The Language Instinct. A bestseller when first published in 1994, it is now in its 25th printing in the British paperback version alone, and its wide circulation suggests a broad influence on the opinions of the intelligent public. Pinker is a professor of psychology at Harvard University, and that institution's great prestige cloaks him, too, in the eyes of many. If Professor Pinker were not right on so important a subject, which is one to which he has devoted much study and brilliant intelligence, would he have tenure at Harvard?

Pinker nails his colors to the mast at once. His book, he says, "will not chide you about proper usage . . ." because, after all, "[l]anguage is a complex, specialized skill, which . . . is qualitatively the same in every individual. . . . Language is no more a cultural invention than is upright posture," and men are as naturally equal in their ability to express themselves as in their ability to stand on two legs. "Once you begin to look at language . . . as a biological adaptation to communicate information," Pinker continues, "it is no longer as tempting to see language as an insidious shaper of thought." Every individual has an equal linguistic capacity to formulate the most complex and refined thoughts. We all have, so to speak, the same tools for thinking. "When it comes to linguistic form," Pinker says, quoting the anthropologist Edward Sapir, "Plato walks with the Macedonian swineherd, Confucius with the head-hunting savage of Assam." To put it another way, "linguistic genius is involved every time a child learns his or her mother tongue."

The old-fashioned and elitist idea that there is a "correct" and "incorrect" form of language no doubt explains the fact that "[l]inguists repeatedly run up against the myth that working-class people . . . speak a simpler and a coarser language. This is a pernicious illusion. . . . Trifling differences between the dialect of the mainstream and the dialect of other groups . . . are dignified as badges of 'proper grammar.'" These are, in fact, the "hobgoblins of the schoolmarm," and ipso facto contemptible. In fact, standard English is one of those languages that "is a dialect with an army and a navy." The schoolmarms he so slightingly dismisses are in fact but the linguistic arm of a colonial power, the middle class, oppressing what would otherwise be a much freer and happier populace. "Since prescriptive rules are so psychologically unnatural that only those with access to the right schooling can abide by them, they serve as shibboleths, differentiating the elite from the rabble."

Children will learn their native language adequately whatever anyone does, and the attempt to teach them language is fraught with psychological perils. For example, to "correct" the way a child speaks is potentially to give him what used to be called an inferiority complex. Moreover, when schools undertake such correction, they risk dividing the child from his parents and social milieu, for he will speak in one way and live in another, creating hostility and possibly rejection all around him. But happily, since every child is a linguistic genius, there is no need to do any such thing. Every child will have the linguistic equipment he needs, merely by virtue of growing older.

I need hardly point out that Pinker doesn't really believe anything of what he writes, at least if example is stronger evidence of belief than precept. Though artfully sown here and there with a demotic expression to prove that he is himself of the people, his own book is written, not surprisingly, in the kind of English that would please schoolmarms. I doubt very much whether it would have reached its 25th printing had he chosen to write it in the dialect of rural Louisiana, for example, or of the slums of Newcastle-upon-Tyne. Even had he chosen to do so, he might have found the writing rather difficult. I should like to see him try to translate a sentence from his book that I have taken at random, "The point that the argument misses is that although natural selection involves incremental steps that enhance functioning, the enhancements do not have to be an existing module," into the language of the Glasgow or Detroit slums.

In fact, Pinker has no difficulty in ascribing greater or lesser expressive virtues to languages and dialects. In attacking the idea that there are primitive languages, he quotes the linguist Joan Bresnan, who describes English as "a West Germanic language spoken in England and its former colonies" (no prizes for guessing the emotional connotations of this way of so describing it). Bresnan wrote an article comparing the use of the dative in English and Kivunjo, a language spoken on the slopes of Mount Kilimanjaro. Its use is much more complex in the latter language than in the former, making far more distinctions. Pinker comments: "Among the clever gadgets I have glimpsed in the grammars of so-called primitive groups, the complex Cherokee pronoun system seems especially handy. It distinguishes among 'you and I,' 'another person and I,' 'several other people and I,' and 'you, one or more other persons, and I,' which English crudely collapses into the all-purpose pronoun we." In other words, crudity and subtlety are concepts that apply between languages. And if so, there can be no real reason why they cannot apply within a language, why one man's usage should not be better, more expressive, subtler, than another's.

Similarly, Pinker attacks the idea that the English of the ghetto, Black English Vernacular, is in any way inferior to standard English. It is rule-governed like (almost) all other language. Moreover, "If the psychologists had listened to spontaneous conversations, they would have rediscovered the commonplace fact that American black culture is highly verbal; the subculture of street youths in particular is famous in the annals of anthropology for the value placed on linguistic virtuosity." But in appearing to endorse the idea of linguistic virtuosity, he is, whether he likes it or not, endorsing the idea of linguistic lack of virtuosity. And it surely requires very little reflection to come to the conclusion that Shakespeare had more linguistic virtuosity than, say, the average contemporary football player. Oddly enough, Pinker ends his encomium on Black English Vernacular with a schoolmarm's pursed lips: "The highest percentage of ungrammatical sentences [are to be] found in the proceedings of learned academic conferences."

Title: Can't Think what you Can't Say, Part II
Post by: Body-by-Guinness on November 17, 2006, 01:24:42 AM
Over and over again, Pinker stresses that children do not learn language by imitation; rather, they learn it because they are biologically predestined to do so. "Let us do away," he writes, with what one imagines to be a rhetorical sweep of his hand, "with the folklore that parents teach their children language." It comes as rather a surprise, then, to read the book's dedication: "For Harry and Roslyn Pinker, who gave me language."

Surely he cannot mean by this that they gave him language in the same sense as they gave him hemoglobin, that is to say, that they were merely the sine qua non of his biological existence as Steven Pinker. If so, why choose language of all the gifts that they gave him? Presumably, he means that they gave him the opportunity to learn standard English, even if they did not speak it themselves.

It is utterly implausible to suggest that imitation of parents (or other social contacts) has nothing whatever to do with the acquisition of language. I hesitate to mention so obvious a consideration, but Chinese parents tend to have Chinese-speaking children, and Portuguese parents Portuguese-speaking ones. I find it difficult to believe that this is entirely a coincidence and that imitation has nothing to do with it. Moreover, it is a sociological truism that children tend to speak not merely the language but the dialect of their parents.

Of course, they can escape it if they choose or need to do so: my mother, a native German-speaker, arrived in England aged 18 and learned to speak standard English without a trace of a German accent (which linguists say is a rare accomplishment) and without ever making a grammatical mistake. She didn't imitate her parents, perhaps, but she imitated someone. After her recent death, I found her notebooks from 1939, in which she painstakingly practiced English, the errors growing fewer until there were none. I don't think she would have been favorably impressed by Professor Pinker's disdainful grammatical latitudinarianism, the latitudinarianism that, in British schools and universities, now extends not only to grammar but to spelling, as a friend of mine discovered recently.

A teacher in a state school gave his daughter a list of spellings to learn as homework, and my friend noticed that three out of ten of them were wrong. He went to the principal to complain, but she looked at the list and asked, "So what? You can tell what the words are supposed to mean." The test for her was not whether the spellings were correct but whether they were understandable. So much for the hobgoblins of contemporary schoolmarms.

The contrast between a felt and lived reality (in this case, Pinker's need to speak and write standard English because of its superior ability to express complex ideas) and the denial of it, perhaps in order to assert something original and striking, is characteristic of an intellectual climate in which the destruction of moral and social distinctions is proof of the very best intentions.

Pinker's grammatical latitudinarianism, when educationists like the principal of my friend's daughter's school take it seriously, has the practical effect of encouraging those born in the lower reaches of society to remain there, to enclose them in the mental world of their particular milieu. Of course, this is perfectly all right if you also believe that all stations in life are equally good and desirable and that there is nothing to be said for articulate reflection upon human existence. In other words, grammatical latitudinarianism is the natural ideological ally of moral and cultural relativism.

It so happens that I observed the importance of mastering standard, schoolmarmly grammatical speech in my own family. My father, born two years after his older brother, had the opportunity, denied his older brother for reasons of poverty, to continue his education. Accordingly, my father learned to speak and write standard English, and I never heard him utter a single word that betrayed his origins. He could discourse philosophically without difficulty; I sometimes wished he had been a little less fluent.

My uncle, by contrast, remained trapped in the language of the slums. He was a highly intelligent man and what is more a very good one: he was one of those rare men, much less common than their opposite, from whom goodness radiated almost as a physical quality. No one ever met him without sensing his goodness of heart, his generosity of spirit.

But he was deeply inarticulate. His thoughts were too complex for the words and the syntax available to him. All through my childhood and beyond, I saw him struggle, like a man wrestling with an invisible boa constrictor, to express his far from foolish thoughts, thoughts of a complexity that my father expressed effortlessly. The frustration was evident on his face, though he never blamed anyone else for it. When, in Pinker's book, I read the transcript of an interview by the neuropsychologist Howard Gardner with a man who suffered from expressive dysphasia after a stroke, that is to say, an inability to articulate thoughts in language, I was, with great sadness, reminded of my uncle. Gardner asked the man about his job before he had a stroke.

"I'm a sig . . . no . . . man . . . uh, well, . . . again." These words were emitted slowly, and with great effort. . . . "Let me help you," I interjected. "You were a signal . . ." "A sig-nal man . . . right," [he] completed my phrase triumphantly. "Were you in the Coast Guard?" "No, er, yes, yes . . . ship . . . Massachu . . . chusetts . . . Coast-guard . . . years."

It seemed to me that it was a cruel fate for such a man as my uncle not to have been taught the standard English that came so naturally to my father. As Montaigne tells us, there is no torture greater than that of a man who is unable to express what is in his soul.

Beginning in the 1950s, Basil Bernstein, a London University researcher, demonstrated the difference between the speech of middle- and working-class children, controlling for whatever it is that IQ measures. Working-class speech, tethered closely to the here and now, lacked the very aspects of standard English needed to express abstract or general ideas and to place personal experience in temporal or any other perspective. Thus, unless Pinker's despised schoolmarms were to take the working-class children in hand and deliberately teach them another speech code, they were doomed to remain where they were, at the bottom of a society that was itself much the poorer for not taking full advantage of their abilities, and that indeed would pay a steep penalty for not doing so. An intelligent man who can make no constructive use of his intelligence is likely to make a destructive, and self-destructive, use of it.

If anyone doubts that inarticulacy can be a problem, I recommend reading a report by the Joseph Rowntree Trust about British girls who get themselves pregnant in their teens (and sometimes their early teens) as an answer to their existential problems. The report is not in the least concerned with the linguistic deficiencies of these girls, but they are evident in the transcript in every reply to every question. Without exception, the girls had had a very painful experience of life and therefore much to express from hearts that must have been bursting. I give only one example, but it is representative. A girl, aged 17, explains why it is wonderful to have a baby:

Maybe it's just, yeah, because maybe just, might be (um) it just feels great when, when like, you've got a child who just, you know, following you around, telling you they love you and I think that's, it's quite selfish, but that's one of the reasons why I became a mum because I wanted someone who'll, you know, love 'em to bits 'cos it's not just your child who's the centre of your world, and that feels great as well, so I think, it's brilliant. It is fantastic because, you know, they're, the child's dependent on you and you know that (um), that you, if you, you know, you've gotta do everything for the child and it just feels great to be depended on.

As I know from the experience of my patients, there is no reason to expect her powers of expression to increase spontaneously with age. Any complex abstractions that enter her mind will remain inchoate, almost a nuisance, like a fly buzzing in a bottle that it cannot escape. Her experience is opaque even to herself, a mere jumble from which it will be difficult or impossible to learn because, for linguistic reasons, she cannot put it into any kind of perspective or coherent order.

I am not of the ungenerous and empirically mistaken party that writes off such people as inherently incapable of anything better or as already having achieved so much that it is unnecessary to demand anything else of them, on the grounds that they naturally have more in common with Shakespeare than with speechless animal creation. Nor, of course, would I want everyone to speak all the time in Johnsonian or Gibbonian periods. Not only would it be intolerably tedious, but much linguistic wealth would vanish. But everyone ought to have the opportunity to transcend the limitations of his linguistic environment, if it is a restricted one, which means that he ought to meet a few schoolmarms in his childhood. Everyone, save the handicapped, learns to run without being taught; but no child runs 100 yards in nine seconds, or even 15 seconds, without training. It is fatuous to expect that the most complex of human faculties, language, requires no special training to develop it to its highest possible power.

http://www.city-journal.org/html/16_4_urbanities-language.html
Title: Re: Evolutionary biology/psychology
Post by: Crafty_Dog on December 10, 2006, 11:39:38 AM
http://www.nytimes.com/2006/12/10/us/10dna.html?th&emc=th

SOUTH NAKNEK, Alaska — The National Geographic Society’s multimillion-dollar research project to collect DNA from indigenous groups around the world in the hopes of reconstructing humanity’s ancient migrations has come to a standstill on its home turf in North America.


A review board stopped DNA research in South Naknek, Alaska.

Billed as the “moon shot of anthropology,” the Genographic Project intends to collect 100,000 indigenous DNA samples. But for four months, the project has been on hold here as it scrambles to address questions raised by a group that oversees research involving Alaska natives.

At issue is whether scientists who need DNA from aboriginal populations to fashion a window on the past are underselling the risks to present-day donors. Geographic origin stories told by DNA can clash with long-held beliefs, threatening a world view some indigenous leaders see as vital to preserving their culture.

They argue that genetic ancestry information could also jeopardize land rights and other benefits that are based on the notion that their people have lived in a place since the beginning of time.

“What if it turns out you’re really Siberian and then, oops, your health care is gone?” said Dr. David Barrett, a co-chairman of the Alaska Area Institutional Review Board, which is sponsored by the Indian Health Service, a federal agency. “Did anyone explain that to them?”

Such situations have not come up, and officials with the Genographic Project discount them as unlikely. Spencer Wells, the population geneticist who directs the project, says it is paternalistic to imply that indigenous groups need to be kept from the knowledge that genetics might offer.

“I don’t think humans at their core are ostriches,” Dr. Wells said. “Everyone has an interest in where they came from, and indigenous people have more of an interest in their ancestry because it is so important to them.”

But indigenous leaders point to centuries of broken promises to explain why they believe their fears are not far-fetched. Scientific evidence that American Indians or other aboriginal groups came from elsewhere, they say, could undermine their moral basis for sovereignty and chip away at their collective legal claims.

“It’s a benefit to science, probably,” said Dr. Mic LaRoque, the Alaska board’s other co-chairman and a member of the Turtle Mountain Chippewa Tribe of North Dakota. “But I’m not convinced it’s a benefit to the tribes.”


The pursuit of indigenous DNA is driven by a desire to shed light on questions for which the archeological evidence is scant. How did descendants of the hunter-gatherers who first left humanity’s birthplace in east Africa some 65,000 years ago come to inhabit every corner of the Earth? What routes did they take? Who got where, and when?

As early humans split off in different directions, distinct mutations accumulated in the DNA of each population. Like bread crumbs, these genetic markers, passed on intact for millennia, can reveal the trail of the original pioneers. All non-Africans share a mutation that arose in the ancestors of the first people to leave the continent, for instance. But the descendants of those who headed north and lingered in the Middle East carry a different marker from those who went southeast toward Asia.
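The breadcrumb logic described above can be sketched as a toy example. Each population carries the markers of every ancestor on its branch, so the markers two populations share identify the stretch of the journey their ancestors traveled together. The marker names (M1, M2, . . .) and population groupings here are hypothetical placeholders, not real haplogroup data:

```python
# Toy illustration of tracing shared ancestry by accumulated genetic markers.
# Marker names (M1, M2, ...) are hypothetical, not real haplogroup labels.

populations = {
    "east_african": {"M1"},
    "middle_eastern": {"M1", "M2", "M3"},
    "european": {"M1", "M2", "M3", "M4"},
    "east_asian": {"M1", "M2", "M5"},
}

def shared_trail(a, b):
    """Markers common to both populations: the legs of the journey
    their ancestors traveled together before splitting apart."""
    return populations[a] & populations[b]

# Europeans and East Asians share the out-of-Africa markers (M1, M2)
# but split before the later, route-specific mutations arose.
print(sorted(shared_trail("european", "east_asian")))
```

Because each new mutation is passed on intact to all descendants, the deeper the shared set, the more recently the two groups parted ways; that is the inference the project's geneticists run, at scale, on real markers.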

Most of the world’s six billion people, however, are too far removed from wherever their ancestors originally put down roots to be useful to population geneticists. The Genographic Project is focusing on DNA from people still living in their ancestral homelands because they provide the crucial geographic link between genetic markers found today and routes traveled long ago.

In its first 18 months, the project’s scientists have had considerable success, persuading more than 18,000 people in off-the-grid places like the east African island of Pemba and the Tibesti Mountains of Chad to donate their DNA. When the North American team arrived in southwestern Alaska, they found volunteers offering cheek swabs and family histories for all sorts of reasons.

The council members of the Native Village of Georgetown, for instance, thought the project could bolster a sense of cultural pride.


Glenn Fredericks, president of the Georgetown tribe, was eager for proof of an ancient unity between his people and American Indians elsewhere that might create greater political power. “They practice the same stuff, the lower-48 natives, as we do,” Mr. Fredericks said. “Did we exchange people? It would be good to know.”


Others said the test would finally force an acknowledgment that they were here first, undermining those who see the government as having “given” them their land.
Still others were interested in the mechanics of migration: “Were the lands all combined? Did they get here by boat?”

For many nonindigenous Americans who feel disconnected from their roots, the project has also struck a chord: nearly 150,000 have scraped cells from their cheek and sent them to the society with $100 to learn what scientists know so far about how and where their individual forebears lived beyond the mists of prehistory.

By giving the broader public a way to participate, though it is likely to generate little scientific payoff, the project has created an unusual set of stakeholders with a personal interest in its success. More details, the project explains in the ancestral sketches it gives individuals, will come only with more indigenous DNA.

“I think you have to be sensitive to these cultures,” said Jesse R. Sweeney, 32, a bankruptcy lawyer in Detroit who hopes the millennia-size gaps in his own ancestors’ story will eventually be filled in. “But hopefully they will change their mind and contribute to the research.”

Mr. Sweeney’s DNA places his maternal ancestors in the Middle East about 50,000 years ago. After that, they may have gone north. Or maybe south: “This is where the genetic clues get murky and your DNA trail goes cold,” read the conclusion to his test results on the project’s Web site. “By working together with indigenous peoples around the globe, we are learning more about these ancient migrations.”

The first large effort to collect indigenous DNA since federal financing was withdrawn from a similar proposal amid indigenous opposition in the mid-1990s, the Genographic Project has drawn quiet applause from many geneticists for resurrecting scientific ambitions that have grown more pressing. As indigenous groups intermarry and disperse at an ever-accelerating pace, many scientists believe the chance to capture human history is fast disappearing.

“Everyone else had given up,” said Mark Stoneking, a professor at the Max Planck Institute for Evolutionary Anthropology. “If they get even a fraction of what they are trying for, it will be very useful.”

Unlike the earlier Human Genome Diversity Project, condemned by some groups as “biocolonialism” because scientists may have profited from genetic data that could have been used to develop drugs, the Genographic Project promises to patent nothing and to avoid collecting medical information. The project has designated half the proceeds from the sale of kits to the public for programs designed to preserve traditional cultures and language.

In May, project officials held a stormy meeting in New York with the indigenous rights group Cultural Survival while protestors carried signs reading “National Geographic Sucks Indigenous Blood.” Shortly after, the United Nations Permanent Forum on Indigenous Issues recommended suspending the project.

On the ground, every region has its challenges. To make scientific progress, the project’s geneticists are finding they must first navigate an unfamiliar tangle of political, religious and personal misgivings.

Pierre Zalloua, the project director in the Middle East, faces suspicion that he is an emissary of an opposing camp trying to prove their lineages are not important. Himla Soodyall, the project’s South African director, finds herself trying to explain to people who worship their ancestors what more her research could add. In Australia, some aboriginal groups have refused to cooperate.

But among the 10 geneticists the society has given the task of collecting 10,000 samples each by the spring of 2010, Theodore G. Schurr, the project’s North American director, is in last place. Fewer than 100 vials of DNA occupy a small plastic box in his laboratory’s large freezer at the University of Pennsylvania, where he is an assistant professor of anthropology. And at the request of the Alaska review board, he has sent back the 50 or so samples that he collected in Alaska to be stored in a specimen bank under its care until he can satisfy their concerns.

American Indians, Dr. Schurr says, hold the answer to one of the more notable gaps in the prehistoric migration map. Although most scientists accept that the first Americans came across the Bering Strait land bridge that connected Siberia and Alaska some 20,000 years ago, there is no proof of precisely where those travelers came from, or of the route they took south once they arrived.

Comparing the DNA of large numbers of American Indians might reveal whether their ancestors were from a single founding population, and when they reached the Americas. And knowing the routes and timing of migrations within the Americas would provide a foundation for studying how people came to be so different so quickly.

But almost every federally recognized tribe in North America has declined or ignored Dr. Schurr’s invitation to take part. “What the scientists are trying to prove is that we’re the same as the Pilgrims except we came over several thousand years before,” said Maurice Foxx, chairman of the Massachusetts Commission on Indian Affairs and a member of the Mashpee Wampanoag. “Why should we give them that openly?”

Some American Indians trace their suspicions to the experience of the Havasupai Tribe, whose members gave DNA for a diabetes study that University of Arizona researchers later used to link the tribe’s ancestors to Asia. To tribe members raised to believe the Grand Canyon is humanity’s birthplace, the suggestion that their own DNA says otherwise was deeply disturbing.

When Dr. Schurr was finally invited to a handful of villages in Alaska, he eagerly accepted. But by the time he reached South Naknek, a tiny native village on the Alaska Peninsula, to report his analysis of the DNA he had taken on an earlier mission, the Alaska review board had complained to his university supervisors.

The consent form all volunteers must sign, the Alaska board said, should contain greater detail about the risks, including the fact that the DNA would be stored in a database linked to tribal information.

Dr. Schurr’s latest attempt at a revised form is to be reviewed this month by the board in Alaska and by the University of Pennsylvania board supervising the project.

In the meantime, his early results have surprised some of the Alaskans who gave him their DNA. In South Naknek, Lorianne Rawson, 42, found out her DNA contradicted what she had always believed. She was not descended from the Aleuts, her test results suggested, but from their one-time enemies, the Yup’ik Eskimos.

The link to the Yup’iks, Ms. Rawson said, only made her more curious. “We want them to do more research,” she added, offering Dr. Schurr more relatives to be tested.

But she will have to wait.
Title: Re: Evolutionary biology/psychology
Post by: Crafty_Dog on December 11, 2006, 05:11:13 AM
Today's NY Times

Study Detects Recent Instance of Human Evolution
             
 
By NICHOLAS WADE
Published: December 10, 2006
A surprisingly recent instance of human evolution has been detected among the peoples of East Africa. It is the ability to digest milk in adulthood, conferred by genetic changes that occurred as recently as 3,000 years ago, a team of geneticists has found.


Convergent Adaptation of Human Lactase Persistence in Africa and Europe (Nature Genetics)

The finding is a striking example of a cultural practice — the raising of dairy cattle — feeding back into the human genome. It also seems to be one of the first instances of convergent human evolution to be documented at the genetic level. Convergent evolution refers to two or more populations acquiring the same trait independently.

Throughout most of human history, the ability to digest lactose, the principal sugar of milk, has been switched off after weaning because there is no further need for the lactase enzyme that breaks the sugar apart. But when cattle were first domesticated 9,000 years ago and people later started to consume their milk as well as their meat, natural selection would have favored anyone with a mutation that kept the lactase gene switched on.

Such a mutation is known to have arisen among an early cattle-raising people, the Funnel Beaker culture, which flourished some 5,000 to 6,000 years ago in north-central Europe. People with a persistently active lactase gene have no problem digesting milk and are said to be lactose tolerant.

Almost all Dutch people and 99 percent of Swedes are lactose-tolerant, but the mutation becomes progressively less common in Europeans who live at increasing distance from the ancient Funnel Beaker region.

Geneticists wondered if the lactose tolerance mutation in Europeans, first identified in 2002, had arisen among pastoral peoples elsewhere. But it seemed to be largely absent from Africa, even though pastoral peoples there generally have some degree of tolerance.

A research team led by Sarah Tishkoff of the University of Maryland has now resolved much of the puzzle. After testing for lactose tolerance and genetic makeup among 43 ethnic groups of East Africa, she and her colleagues have found three new mutations, all independent of each other and of the European mutation, which keep the lactase gene permanently switched on.

The principal mutation, found among Nilo-Saharan-speaking ethnic groups of Kenya and Tanzania, arose 2,700 to 6,800 years ago, according to genetic estimates, Dr. Tishkoff’s group is to report in the journal Nature Genetics on Monday. This fits well with archaeological evidence suggesting that pastoral peoples from the north reached northern Kenya about 4,500 years ago and southern Kenya and Tanzania 3,300 years ago.

Two other mutations were found, among the Beja people of northeastern Sudan and tribes of the same language family, Afro-Asiatic, in northern Kenya.

Genetic evidence shows that the mutations conferred an enormous selective advantage on their owners, enabling them to leave almost 10 times as many descendants as people without them. The mutations have created “one of the strongest genetic signatures of natural selection yet reported in humans,” the researchers write.
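The speed implied by such a strong signature can be checked with a back-of-the-envelope sketch. Assuming a simple one-locus selection model in which carriers of the mutation leave (1 + s) offspring for every one left by non-carriers, the allele frequency p updates each generation as p' = p(1 + s) / (p(1 + s) + (1 − p)). The numbers below (initial frequency, selection coefficient, generation time) are illustrative assumptions, not figures from the study:

```python
# Back-of-the-envelope sketch: how fast a strongly favored allele spreads.
# Simple one-locus selection model with illustrative parameters (not the
# study's estimates): carriers leave (1 + s) offspring per non-carrier's one.

def generations_to_spread(p0=0.001, s=0.05, target=0.9):
    """Generations for an allele starting at frequency p0, with selective
    advantage s per generation, to exceed the target frequency."""
    p, gens = p0, 0
    while p < target:
        p = p * (1 + s) / (p * (1 + s) + (1 - p))
        gens += 1
    return gens

# Even a 5% per-generation advantage carries a rare allele to high
# frequency within a few thousand years (~25 years per generation).
gens = generations_to_spread()
print(gens, "generations, roughly", gens * 25, "years")
```

On these assumptions the allele crosses 90 percent frequency in under 200 generations, i.e. on the order of 5,000 years, which is why a mutation arising only a few thousand years ago can already dominate a population under strong selection.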

The survival advantage was so powerful perhaps because those with the mutations not only gained extra energy from lactose but also, in drought conditions, would have benefited from the water in milk. People who were lactose-intolerant could have risked losing water from diarrhea, Dr. Tishkoff said.

Diane Gifford-Gonzalez, an archaeologist at the University of California, Santa Cruz, said the new findings were “very exciting” because they “showed the speed with which a genetic mutation can be favored under conditions of strong natural selection, demonstrating the possible rate of evolutionary change in humans.”

The genetic data fitted in well, she said, with archaeological and linguistic evidence about the spread of pastoralism in Africa. The first clear evidence of cattle in Africa is from a site 8,000 years old in northwestern Sudan. Cattle there were domesticated independently from two other domestications, in the Near East and the Indus valley of India.

Both Nilo-Saharan speakers in Sudan and their Cushitic-speaking neighbors in the Red Sea hills probably domesticated cattle at the same time, since each has an independent vocabulary for cattle items, said Dr. Christopher Ehret, an expert on African languages and history at the University of California, Los Angeles. Descendants of each group moved southward and would have met again in Kenya, Dr. Ehret said.

Dr. Tishkoff detected lactose tolerance among both Cushitic speakers and Nilo-Saharan groups in Kenya. Cushitic is a branch of Afro-Asiatic, the language family that includes Arabic, Hebrew and ancient Egyptian.

Dr. Jonathan Pritchard, a statistical geneticist at the University of Chicago and the co-author of the new article, said that there were many signals of natural selection in the human genome, but that it was usually hard to know what was being selected for. In this case Dr. Tishkoff had clearly defined the driving force, he said.

The mutations Dr. Tishkoff detected are not in the lactase gene itself but a nearby region of the DNA that controls the activation of the gene. The finding that different ethnic groups in East Africa have different mutations is one instance of their varied evolutionary history and their exposure to many different selective pressures, Dr. Tishkoff said.

“There is a lot of genetic variation between groups in Africa, reflecting the different environments in which they live, from deserts to tropics, and their exposure to very different selective forces,” she said.

People in different regions of the world have evolved independently since dispersing from the ancestral human population in northeast Africa 50,000 years ago, a process that has led to the emergence of different races. But much of this differentiation at the level of DNA may have led to the same physical result.

As Dr. Tishkoff has found in the case of lactose tolerance, evolution may use the different mutations available to it in each population to reach the same goal when each is subjected to the same selective pressure. “I think it’s reasonable to assume this will be a more general paradigm,” Dr. Pritchard said.

Title: Re: Evolutionary biology/psychology
Post by: Crafty_Dog on December 26, 2006, 02:50:26 PM
Devious Butterflies, Full-Throated Frogs and Other Liars
 
Joe McDonald/Corbis
The green frog has been known to deceive eavesdroppers with its croak.

By CARL ZIMMER
Published: December 26, 2006
If you happen across a pond full of croaking green frogs, listen carefully. Some of them may be lying.

A croak is how male green frogs tell other frogs how big they are. The bigger the male, the deeper the croak. The sound of a big male is enough to scare off other males from challenging him for his territory.

While most croaks are honest, some are not. Some small males lower their voices to make themselves sound bigger. Their big-bodied croaks intimidate frogs that would beat them in a fair fight.

Green frogs are only one deceptive species among many. Dishonesty has been documented in creatures ranging from birds to crustaceans to primates, including, of course, Homo sapiens. “When you think of human communication, it’s rife with deception,” said Stephen Nowicki, a biologist at Duke University and the co-author of the 2005 book “The Evolution of Animal Communication.” “You just need to read a Shakespeare play or two to see that.”

As Dr. Nowicki chronicled in his book, biologists have long puzzled over deception. Dishonesty should undermine trust between animals. Why, for example, do green frogs keep believing that a big croak means a big male? New research is offering some answers: Natural selection can favor a mix of truth and lies, particularly when an animal has a big audience. From one listener to the next, honesty may not be the best policy.

“I think it could explain a lot of mysteries in the evolution of communication in animals, including humans,” said Stephen P. Ellner, a mathematical biologist at Cornell University.

Tales of animal deception reach back at least as far as Aesop’s fables. In the late 19th century, the naturalist George Romanes made a semi-scientific study of deceptive animals. In his 1883 book, “Mental Evolution in Animals,” Romanes wrote about how one of his correspondents had sent him “several examples of the display of hypocrisy of a King Charles spaniel.”

By the mid-1900s, scientists had documented deception in cases where one species fooled another. Some nonpoisonous butterflies, for example, evolved the same wing patterns that poisonous species used to warn off birds. Within a species, however, honesty usually prevailed. Animals gave each other alarm calls to warn of predators; males signaled their prowess in fighting; babies let their parents know they were hungry. Honesty benefited both the sender and the receiver.

“The point of signaling was to get information across,” Dr. Nowicki said. “Deception was almost not an issue.”

There was just one hole in this happy arrangement: it presented a great opportunity for liars. Shrikes, for example, regularly use alarm calls to warn one another of predators. But sometimes the birds will use false alarm calls to scare other shrikes away from food.

Imagine that a shrike fools other shrikes with a false alarm. It eats more, and therefore may hatch more babies. Meanwhile, the gullible, less-nourished shrikes hatch fewer babies. If false alarms become common, natural selection should favor shrikes that are not fooled by them.

When scientists created mathematical models of this theory, they found that dishonesty could undermine many vital kinds of communication. The challenge, then, was to find out how honesty countered the advantage of deception. “The liars ought to be able to take advantage of the system, so that you’d have selection on the listeners to ignore the signals,” said Jonathan Rowell, a postdoctoral researcher at the University of Tennessee.

Amotz Zahavi, a biologist at Tel Aviv University, proposed a way for honesty to prevail. His idea was that honesty won out only because lying carried a relatively large cost. His theory eventually led to elaborate mathematical models and experiments that confirmed it.

Roosters attract hens, for example, with their large red combs. Hens benefit from choosing mates in good condition, because their chicks will tend to be in good condition as well. The bigger and brighter a comb, the better condition the rooster is in.

Theoretically, a weak rooster could fool hens by growing a deceptively large comb. But it costs a weak rooster more than it does a strong one to build a big comb. This tradeoff leads to honest signals from weak and strong roosters alike.
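The cost tradeoff behind Zahavi's idea can be sketched numerically. The payoffs and condition values below are invented purely for illustration (they come from no study mentioned here); the point is only that when the cost of a signal falls with the signaler's condition, each rooster does best by signaling honestly.

```python
# Toy sketch of the handicap principle: the mating benefit of a comb is the
# same for every rooster, but the cost of growing it depends on condition.
# All numbers are illustrative assumptions, not data from the article.

def payoff(condition, comb):
    """Net payoff = mating benefit of the comb minus its condition-dependent cost."""
    benefit = {"small": 1.0, "big": 3.0}[comb]
    cost = {"small": 0.2, "big": 2.0}[comb] / condition  # weak birds pay more
    return benefit - cost

for condition, label in [(2.0, "strong"), (0.4, "weak")]:
    best = max(["small", "big"], key=lambda c: payoff(condition, c))
    print(f"{label} rooster: best comb = {best}")
```

With these assumed numbers the strong rooster's best choice is the big comb (payoff 2.0 versus 0.9) while the weak rooster's is the small one (0.5 versus -2.0), so comb size ends up tracking true condition without any enforcement.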

“The mystery of why there is honesty was suddenly solved,” Dr. Ellner said. “All the big problems fell away.”

But if scientists had explained why honesty prevailed, they had not explained why deception continued to thrive. “We couldn’t explain all the dishonesty,” Dr. Ellner said.

Dr. H. Kern Reeve, an evolutionary biologist at Cornell, said that “deception is popping up with a surprising frequency.”

Even crustaceans can lie. Male stomatopods dig burrows, to which they try to attract females. Some males choose to try to evict other stomatopods from their burrows and take them over. These conflicts are dangerous because stomatopods can deliver crushing blows with their claw-like appendages. But the stomatopods rarely come to blows. Instead, males raise themselves up and extend their appendages, like a boxer raising his gloves. The sight of big appendages causes smaller stomatopods to back down.



Yet even the biggest, meanest stomatopod has his moments of weakness. Like all crustaceans, stomatopods must molt. A freshly molted stomatopod has a soft, tender exoskeleton. Even in this vulnerable state, however, males will still raise their claws in a bold crustacean bluff.

Dr. Rowell recently created a more complicated model of animal signals that may explain why deception is so common. Previous models examined only a single animal sending a signal to a single receiver. But real signals are rarely so private. “They’re not happening in a one-on-one situation,” Dr. Rowell said. “They’re really happening in public.”

A signaler may have different relationships with different listeners. In some cases, honest signals are best. But eavesdroppers may be able to use honest signals for their own advantage.

To capture this extra layer of complexity, Dr. Rowell built a mathematical model with two receivers instead of one. The signaling animal could choose to be honest or dishonest. The receivers could respond to the signal as an honest one or a dishonest one.

Working with Dr. Ellner and Dr. Reeve, Dr. Rowell discovered that honesty and deception could reach a stable coexistence in the model. The signalers could sometimes be dishonest, and yet the receivers continued to believe the signals despite the deception.

Dr. Rowell and his colleagues published the details of their model in the December issue of The American Naturalist.

“It’s really important,” Dr. Nowicki said of the study. “They’re coming up with new angles that could explain how you could have more deception and keep it stable.”

Dr. Rowell argues that real-world cases of deception, like bluffing, support the model. When a male green frog or stomatopod bluffs, other males have to decide whether to heed the signal or to ignore it and attack. Attacking is risky, because it is possible that the signaler is not bluffing.

“The challenger isn’t willing to take that gamble,” Dr. Rowell said.
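The challenger's gamble can be made concrete with a back-of-the-envelope expected-value calculation. The probabilities and payoffs below are hypothetical, not taken from the frog or stomatopod studies; they simply show why heeding a signal can beat calling a bluff when most signals are honest and losing a fight is costly.

```python
# Why a challenger heeds a deep croak even though some croaks are bluffs.
# All numbers are assumed for illustration only.

p_bluff = 0.2          # assumed fraction of deep croaks that are bluffs
win_territory = 5.0    # payoff for attacking a bluffer and winning
lose_fight = -20.0     # payoff for attacking a genuinely big male and losing
back_down = 0.0        # payoff for retreating

ev_attack = p_bluff * win_territory + (1 - p_bluff) * lose_fight
print(ev_attack)  # -15.0: on average, retreating (payoff 0.0) beats attacking
```

Under these assumptions the occasional liar rides free on the credibility of the honest majority, which is exactly the mixed outcome the two-receiver model predicts can remain stable.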

The model also showed how deception could be used against eavesdroppers. Green frogs — along with many other frogs and toads — attract females with a distinctive mating call. Dr. Ellner’s rough translation of their call: “I’m looking for female frogs, and if you come on my lily pad, I’ll show you a good time.”

In most cases, male frogs follow up on their mating calls by courting the females they attract. But sometimes they attack instead. This deceptive reaction may be a way for the males to cope with other males that eavesdrop on them. Such eavesdroppers, instead of holding onto their own territory, sneak around and try to intercept females attracted to the mating calls of other males.

If males are always honest in their mating calls, they may lose out to sneaky males. But if they attack, they can ambush the sneaky males and drive them away. Natural selection thus favors deception, despite the fact that the frogs sometimes attack potential mates. The females, meanwhile, are better off trusting the mating calls than ignoring them.

Dr. Reeve cautioned that the model was only the first step in understanding how networks of listeners can drive the evolution of deception. “Right now it needs to be tested in detail, experimentally,” he said.

Different species may be prone to different levels of deception. Solitary animals may evolve to be more honest than animals that spend long lives in big societies. If that is true, then humans may be exquisitely primed to deceive.

“We’re in a network of individuals watching us,” Dr. Reeve said. “If you provide a signal to one individual, it’s being eavesdropped on by lots of other people.”

Dr. Rowell is exploring cases of human deception with his model. In one case, he examines how terrorist organizations communicate to their sleeper cells.

“Your two listeners are the government and terrorist sleeper cells,” Dr. Rowell explained. “The sleeper cells don’t have a direct communication with whoever your terrorist signaler is.

“They might give something out over the Web, and the government picks it up. You find that you can very easily get a level of dishonesty from the terrorist signaler to get the government to waste resources on phantom attacks. You can see this evolution going on between sleeper cells and the government.”
Title: Re: Evolutionary biology/psychology
Post by: Crafty_Dog on February 09, 2007, 06:31:11 AM
Today's NY Times:

Our ancestors have arrived at the American Museum of Natural History. They are very old, and we are only beginning to recognize them and ourselves in them. They remind us of our origins long ago and how we have emerged as modern humans in the fullness of time.

 
The museum’s new permanent exhibition on human origins, which opens tomorrow, merges notable achievements in paleontology and genetics, sciences that have made their own robust evolutionary strides in recent years. Each introduces evidence supporting the other in establishing a genealogy extending back to protohuman species that arose in Africa from earlier primates some six to seven million years ago.

These two scientific threads run through the exhibition like the strands of the DNA double helix.

Ellen V. Futter, the museum’s president, said the “mutually reinforcing evidence” was organized in the exhibition to address three fundamental questions: Where did we come from? Who are we? And what lies ahead for us?

Turn right at the entrance of the new installation, the Anne and Bernard Spitzer Hall of Human Origins, and you see paleontology’s side of the story. More than 200 casts of prehuman and human fossils and artifacts illustrate stages in physical and behavioral evolution. Four life-size tableaus depict scenes in the lives of human predecessors, the realism stamped by the presence of pesky flies on their shoulders.

Some of the most striking displays are reconstructions from fossil and other evidence of what these ancestors probably looked like. Museum scientists and technicians have recreated the faces and bodies of the famous Lucy skeleton and Neanderthals — even the controversial Hobbits, the tiny specimens of what may be a previously unknown extinct species found recently in Indonesia.

The reconstruction of Turkana Boy is especially evocative. Based on one of the most complete ancestral skeletons ever excavated, the fleshed-out Homo ergaster, a species that lived in Africa 1.9 to 1.4 million years ago, is almost six feet tall, with a body form remarkably like that of modern humans.

“The fossils on which the reconstructions are based are witnesses to a dynamic history,” said Ian Tattersall, a paleoanthropologist at the museum and co-curator of the exhibition. “Now we have a much larger story to tell, with the addition of what we are learning from molecular biology.”

Bear left in the hall, and there is the sign “DNA Tells Us About Human Origins.” Below are three tubes containing particles of DNA in a milky white solution. The samples are not particularly impressive, until you think that this is the stuff of encoded information shaping an entire organism and the material that has transformed the study of genetics, or genomics, and revealed the place of humans in the rest of life.

One of the vials holds human DNA, and another a chimpanzee’s. The analysis of their genetic material has confirmed what comparative anatomy predicted, showing that human DNA is 98.8 percent identical to that of chimps and bonobos, our closest living relatives. And our DNA is, on average, 96 percent identical to our most distant primate kin, some of which are mounted on the wall.
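Identity figures like 98.8 percent come from aligning DNA sequences and counting matching positions. A minimal sketch of that counting step, using made-up ten-base sequences (real comparisons align billions of bases and must also handle insertions and deletions, which this toy function ignores):

```python
# Minimal sketch of percent identity between two already-aligned DNA sequences.
# Toy example only; real genome comparisons use alignment tools, not raw strings.

def percent_identity(seq_a, seq_b):
    """Percentage of aligned positions where the two sequences match."""
    assert len(seq_a) == len(seq_b), "sequences must be aligned to equal length"
    matches = sum(a == b for a, b in zip(seq_a, seq_b))
    return 100.0 * matches / len(seq_a)

print(percent_identity("ACGTACGTAC", "ACGTACGTAT"))  # 90.0
```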

The third vial contains a DNA sample from a 40,000-year-old Neanderthal, the extinct close cousin of Homo sapiens. The discovery of a Neanderthal skull in 1856 led to the recognition that different kinds of humans once lived on Earth. This rare DNA specimen, on display in this country for the first time, was donated by the Max Planck Institute in Leipzig, Germany, the first laboratory to succeed in extracting the genetic material from Neanderthal bones.

Standing nearby are the skeletons of a chimpanzee, a Neanderthal and a modern human, and stations with interactive electronic displays are ready, at the touch of a screen, to explain the differences and similarities between the bones, brains and DNA of the three species.

Other computer animations offer insights into how scientists decode the hereditary information, how it is transmitted through generations, and how mutations of mitochondrial DNA, the traits inherited through the mother’s lineage, reveal relationships through time and migrations. A video of a “tree of life” changes before your eyes, like a kaleidoscope, showing the branching interrelationships among 479 species.

Rob DeSalle, the exhibition’s other curator and a molecular biologist at the museum, said genomics is leading to the discovery of “the history between other species and humans and the relationships of humans to each other.”


The genetics side of the exhibition is not as visually compelling as the fossils and reconstructed life in other sections. Plan to invest more time with the interactive displays and videos, which convey the truly new contributions to understanding the science of human evolution and the complexity and connectivity of life.

The Hall of Human Origins occupies the galleries of its predecessor, the Hall of Human Biology and Evolution, which had its opening 12 years ago, before many of the advances in genomics and a number of major fossil discoveries. That exhibition closed in September 2005 to make way for its more up-to-date replacement, supported by a gift from the Spitzers, the parents of Gov. Eliot Spitzer of New York.

Some of the cast of fossil characters may be familiar to regular museum visitors, but they have been revitalized in new settings. For example, the Australopithecus couple that left tracks walking 3.5 million years ago across a plain at Laetoli, Tanzania, appear here. The surprise is that they are so small, no more than three feet tall. Yet the discovery of their footprints was the first clear evidence that prehumans were walking upright well before they made tools.

In the habitat displays, two Homo ergasters butcher a carcass and fight off a vulture and a jackal trying to steal the meat, and a Homo erectus, Peking Man, crouches and is about to be pounced on by a hyena. The curators said these were reminders that early human ancestors were prey rather than predator for much of their history.

Toward the back of the gallery, the cultural aspects of evolution are illustrated. An exact reproduction of the painted animals from the cave art at Lascaux in France stretches across the wall. Other displays include a replica of a 75,000-year-old piece of ochre decorated with geometric patterns, a recent discovery in South Africa and one of the earliest examples of symbolic thinking and creativity in modern humans. In this context the exhibition reviews the elements that make humans different from other life: tool use, language, music and writing, as well as art and other forms of creative expression.

Off in a side room, the Spitzer Hall has an educational laboratory with microscopes and laptops ready for visitors, guided by instructors, to try their hands at examining fossils and learning how to decode DNA. The lab is designed with young people and student groups in mind, but anyone is free to experience something of what it is like to delve into the human past. Elsewhere a multimedia bulletin board offers news of the latest developments in research into the human past.

One issue cannot be entirely sidestepped in any public presentation of human evolution: that many people in this country doubt and vocally oppose the very concept. In a corner of the hall, several scientists are shown in video interviews professing the compatibility of their evolution research with their religious beliefs.

Standing nearby at the end of a tour of the exhibition, Michael J. Novacek, a paleontologist and the museum’s senior vice president, said that a previous show on Darwin had been a reassuring test case. The exhibition was popular, he said, and provoked “very little negative response.”

Dr. Novacek said the new hall was “an emphatic statement about the theory of evolution and its power to tell us our origins and history.”

“We emphasize that a scientific theory is an argument that is very carefully tested against scientific evidence,” he continued, “and this one has withstood much scrutiny.”

The modern human capacity for symbolic and creative expression has brought forth different narratives to explain where we came from, drawn from myth, religion and pre-Darwin science. The exhibition’s parallel lines of fossil and molecular evidence have the cumulative effect of solidifying the foundation for the more recent scientific narrative of human evolution.

There are still many gaps in knowledge, and unsolved mysteries. But seeing ourselves in the train of preceding species, we also recognize the degree of our separation from other animals, even our earliest ancestors. Only modern Homo sapiens in our time could present with such newfound authority the epic narrated through the museum’s Hall of Human Origins.

 
The Anne and Bernard Spitzer Hall of Human Origins will open tomorrow at the American Museum of Natural History, Central Park West and 79th Street. Museum hours: daily, 10 a.m. to 5:45 p.m. (to 8:45 p.m. on Fridays). Suggested museum admission: $14; $10.50 for students and 60+; $8 for children 2 to 12; free for members. (212) 769-5100 or (212) 769-5200; amnh.org.

Title: Re: Evolutionary biology/psychology
Post by: Crafty_Dog on March 05, 2007, 08:17:45 AM

UCLA Study on Friendship among Women

By Gale Berkowitz

A landmark UCLA study suggests friendships between women are special. See the following article: Taylor, S. E., Klein, L. C., Lewis, B. P., Gruenewald, T. L., Gurung, R. A. R., & Updegraff, J. A. (2000). "Female Responses to Stress: Tend and Befriend, Not Fight or Flight," Psychological Review, 107(3), 411-429.

They [friendships between women] shape who we are and who we are yet to be. They soothe our tumultuous inner world, fill the emotional gaps in our marriage, and help us remember who we really are. By the way, they may do even more.

Scientists now suspect that hanging out with our friends can actually counteract the kind of stomach-quivering stress most of us experience on a daily basis. A landmark UCLA study suggests that women respond to stress with a cascade of brain chemicals that cause us to make and maintain friendships with other women.

It's a stunning find that has turned five decades of stress research - most of it on men - upside down. "Until this study was published, scientists generally believed that when people experience stress, they trigger a hormonal cascade that revs the body to either stand and fight or flee as fast as possible," explains Laura Cousino Klein, Ph.D., now an Assistant Professor of Bio-behavioural Health at Penn State University and one of the study's authors. "It's an ancient survival mechanism left over from the time we were chased across the planet by sabre-toothed tigers."

Now the researchers suspect that women have a larger behavioural repertoire than just "fight or flight." "In fact," says Dr. Klein, "it seems that when the hormone oxytocin is released as part of the stress response in a woman, it buffers the 'fight or flight' response and encourages her to tend children and gather with other women instead. When she actually engages in this tending or befriending, studies suggest her body releases more oxytocin, which further counters stress and produces a calming effect. This calming response does not occur in men," says Dr. Klein, "because testosterone, which men produce in high levels when they're under stress, seems to reduce the effects of oxytocin. Estrogen," she adds, "seems to enhance it."

The discovery that women respond to stress differently than men was made in a classic "aha!" moment shared by two women scientists who were talking one day in a lab at UCLA. "There was this joke that when the women who worked in the lab were stressed, they came in, cleaned the lab, had coffee, and bonded,” says Dr. Klein. "When the men were stressed, they holed up somewhere on their own. I commented one day to fellow researcher Shelley Taylor that nearly 90% of the stress research is on males. I showed her the data from my lab, and the two of us knew instantly that we were onto something."

The women cleared their schedules and started meeting with one scientist after another from various research specialties. Very quickly, Drs. Klein and Taylor discovered that by not including women in stress research, scientists had made a huge mistake: The fact that women respond to stress differently than men has significant implications for our health.

It may take some time for new studies to reveal all the ways that oxytocin encourages us to care for children and hang out with other women, but the "tend and befriend" notion developed by Drs. Klein and Taylor may explain why women consistently outlive men. Study after study has found that social ties reduce our risk of disease by lowering blood pressure, heart rate, and cholesterol. "There's no doubt," says Dr. Klein, "that friends are helping us live." In one study, for example, researchers found that people who had no friends increased their risk of death over a 6-month period. In another study, those who had the most friends over a 9-year period cut their risk of death by more than 60%. Friends are also helping us live better. The famed Nurses' Health Study from Harvard Medical School found that the more friends women had, the less likely they were to develop physical impairments as they aged, and the more likely they were to be leading a joyful life.

In fact, the results were so significant, the researchers concluded, that not having close friends or confidantes was as detrimental to your health as smoking or carrying extra weight. And that is not all: when the researchers looked at how well the women functioned after the death of their spouse, they found that even in the face of this biggest stressor of all, those women who had a close friend and confidante were more likely to survive the experience without any new physical impairments or permanent loss of vitality. Those without friends were not always so fortunate.

Yet if friends counter the stress that seems to swallow up so much of our life these days, if they keep us healthy and even add years to our life, why is it so hard to find time to be with them? That is a question that also troubles researcher Ruthellen Josselson, Ph.D., co-author of "Best Friends: The Pleasures and Perils of Girls' and Women's Friendships" (Three Rivers Press, 1998).

"Every time we get overly busy with work and family, the first thing we do is let go of friendships with other women," explains Dr. Josselson." We push them right to the back burner. That is really a mistake because women are such a source of strength to each other. We nurture one another. In addition, we need to have unpressured space in which we can do the special kind of talk that women do when they are with other women. It's a very healing experience."
Title: Re: Evolutionary biology/psychology
Post by: Crafty_Dog on March 06, 2007, 02:01:26 AM
A friend brought this website to my attention and I have just begun surfing it a bit and find it to have some distinctive takes on various matters.  Check it out.

http://neuropolitics.org/
Title: Re: Evolutionary biology/psychology
Post by: Crafty_Dog on March 08, 2007, 09:57:34 AM
I just realized that this subject has a thread on the "Politics & Religion" forum too at

http://dogbrothers.com/phpBB2/index.php?topic=523.0 which I will bring over here when I get my wife to remind me how to do that.  :oops:

Please continue to post here, but know that there are many interesting posts there too.

Title: What's so Funny?
Post by: Crafty_Dog on March 13, 2007, 06:46:12 AM
Today's NY Times:

So there are these two muffins baking in an oven. One of them yells, “Wow, it’s hot in here!”
And the other muffin replies: “Holy cow! A talking muffin!”

Did that alleged joke make you laugh? I would guess (and hope) not. But under different circumstances, you would be chuckling softly, maybe giggling, possibly guffawing. I know that’s hard to believe, but trust me. The results are just in on a laboratory test of the muffin joke.

Laughter, a topic that stymied philosophers for 2,000 years, is finally yielding to science. Researchers have scanned brains and tickled babies, chimpanzees and rats. They’ve traced the evolution of laughter back to what looks like the primal joke — or, to be precise, the first stand-up routine to kill with an audience of primates.

It wasn’t any funnier than the muffin joke, but that’s not surprising, at least not to the researchers. They’ve discovered something that eluded Plato, Aristotle, Hobbes, Kant, Schopenhauer, Freud and the many theorists who have tried to explain laughter based on the mistaken premise that they’re explaining humor.

Occasionally we’re surprised into laughing at something funny, but most laughter has little to do with humor. It’s an instinctual survival tool for social animals, not an intellectual response to wit. It’s not about getting the joke. It’s about getting along.

When Robert R. Provine tried applying his training in neuroscience to laughter 20 years ago, he naïvely began by dragging people into his laboratory at the University of Maryland, Baltimore County, to watch episodes of “Saturday Night Live” and a George Carlin routine. They didn’t laugh much. It was what a stand-up comic would call a bad room.

So he went out into natural habitats — city sidewalks, suburban malls — and carefully observed thousands of “laugh episodes.” He found that 80 percent to 90 percent of them came after straight lines like “I know” or “I’ll see you guys later.” The witticisms that induced laughter rarely rose above the level of “You smell like you had a good workout.”

“Most prelaugh dialogue,” Professor Provine concluded in “Laughter,” his 2000 book, “is like that of an interminable television situation comedy scripted by an extremely ungifted writer.”

He found that most speakers, particularly women, did more laughing than their listeners, using the laughs as punctuation for their sentences. It’s a largely involuntary process. People can consciously suppress laughs, but few can make themselves laugh convincingly.

“Laughter is an honest social signal because it’s hard to fake,” Professor Provine says. “We’re dealing with something powerful, ancient and crude. It’s a kind of behavioral fossil showing the roots that all human beings, maybe all mammals, have in common.”

The human ha-ha evolved from the rhythmic sound — pant-pant — made by primates like chimpanzees when they tickle and chase one another while playing. Jaak Panksepp, a neuroscientist and psychologist at Washington State University, discovered that rats emit an ultrasonic chirp (inaudible to humans without special equipment) when they’re tickled, and they like the sensation so much they keep coming back for more tickling.

He and Professor Provine figure that the first primate joke — that is, the first action to produce a laugh without physical contact — was the feigned tickle, the same kind of coo-chi-coo move parents make when they thrust their wiggling fingers at a baby. Professor Panksepp thinks the brain has ancient wiring to produce laughter so that young animals learn to play with one another. The laughter stimulates euphoria circuits in the brain and also reassures the other animals that they’re playing, not fighting.

“Primal laughter evolved as a signaling device to highlight readiness for friendly interaction,” Professor Panksepp says. “Sophisticated social animals such as mammals need an emotionally positive mechanism to help create social brains and to weave organisms effectively into the social fabric.”

Humans are laughing by the age of four months and then progress from tickling to the Three Stooges to more sophisticated triggers for laughter (or, in some inexplicable cases, to Jim Carrey movies). Laughter can be used cruelly to reinforce a group’s solidarity and pride by mocking deviants and insulting outsiders, but mainly it’s a subtle social lubricant. It’s a way to make friends and also make clear who belongs where in the status hierarchy.

Which brings us back to the muffin joke. It was inflicted by social psychologists at Florida State University on undergraduate women last year, during interviews for what was ostensibly a study of their spending habits. Some of the women were told the interviewer would be awarding a substantial cash prize to a few of the participants, like a boss deciding which underling deserved a bonus.

The women put in the underling position were a lot more likely to laugh at the muffin joke (and others almost as lame) than were women in the control group. But it wasn’t just because these underlings were trying to manipulate the boss, as was demonstrated in a follow-up experiment.

This time each of the women watched the muffin joke being told on videotape by a person who was ostensibly going to be working with her on a task. There was supposed to be a cash reward afterward to be allocated by a designated boss. In some cases the woman watching was designated the boss; in other cases she was the underling or a co-worker of the person on the videotape.

When the woman watching was the boss, she didn’t laugh much at the muffin joke. But when she was the underling or a co-worker, she laughed much more, even though the joke-teller wasn’t in the room to see her. When you’re low in the status hierarchy, you need all the allies you can find, so apparently you’re primed to chuckle at anything even if it doesn’t do you any immediate good.

“Laughter seems to be an automatic response to your situation rather than a conscious strategy,” says Tyler F. Stillman, who did the experiments along with Roy Baumeister and Nathan DeWall. “When I tell the muffin joke to my undergraduate classes, they laugh out loud.”

Mr. Stillman says he got so used to the laughs that he wasn’t quite prepared for the response at a conference in January, although he realizes he should have expected it.

“It was a small conference attended by some of the most senior researchers in the field,” he recalls. “When they heard me, a lowly graduate student, tell the muffin joke, there was a really uncomfortable silence. You could hear crickets.”


Title: Re: Evolutionary biology/psychology
Post by: Crafty_Dog on March 20, 2007, 06:14:20 AM
Scientist Finds the Beginnings of Morality in Primate Behavior
 Illustration by Edel Rodriguez based on source material from Frans de Waal

Social Order: Chimpanzees have a sense of social structure and rules of behavior, most of which involve the hierarchy of a group, in which some animals rank higher than others. Social living demands a number of qualities that may be precursors of morality.

By NICHOLAS WADE
Published: March 20, 2007

Some animals are surprisingly sensitive to the plight of others. Chimpanzees, who cannot swim, have drowned in zoo moats trying to save others. Given the chance to get food by pulling a chain that would also deliver an electric shock to a companion, rhesus monkeys will starve themselves for several days.

The Beginnings of Morality?

Biologists argue that these and other social behaviors are the precursors of human morality. They further believe that if morality grew out of behavioral rules shaped by evolution, it is for biologists, not philosophers or theologians, to say what these rules are.

Moral philosophers do not take very seriously the biologists’ bid to annex their subject, but they find much of interest in what the biologists say and have started an academic conversation with them.

The original call to battle was sounded by the biologist Edward O. Wilson more than 30 years ago, when he suggested in his 1975 book “Sociobiology” that “the time has come for ethics to be removed temporarily from the hands of the philosophers and biologicized.” He may have jumped the gun about the time having come, but in the intervening decades biologists have made considerable progress.

Last year Marc Hauser, an evolutionary biologist at Harvard, proposed in his book “Moral Minds” that the brain has a genetically shaped mechanism for acquiring moral rules, a universal moral grammar similar to the neural machinery for learning language. In another recent book, “Primates and Philosophers,” the primatologist Frans de Waal defends against philosopher critics his view that the roots of morality can be seen in the social behavior of monkeys and apes.

Dr. de Waal, who is director of the Living Links Center at Emory University, argues that all social animals have had to constrain or alter their behavior in various ways for group living to be worthwhile. These constraints, evident in monkeys and even more so in chimpanzees, are part of human inheritance, too, and in his view form the set of behaviors from which human morality has been shaped.

Many philosophers find it hard to think of animals as moral beings, and indeed Dr. de Waal does not contend that even chimpanzees possess morality. But he argues that human morality would be impossible without certain emotional building blocks that are clearly at work in chimp and monkey societies.

Dr. de Waal’s views are based on years of observing nonhuman primates, starting with work on aggression in the 1960s. He noticed then that after fights between two combatants, other chimpanzees would console the loser. But he was waylaid in battles with psychologists over imputing emotional states to animals, and it took him 20 years to come back to the subject.

He found that consolation was universal among the great apes but generally absent from monkeys — among macaques, mothers will not even reassure an injured infant. To console another, Dr. de Waal argues, requires empathy and a level of self-awareness that only apes and humans seem to possess. And consideration of empathy quickly led him to explore the conditions for morality.

Though human morality may end in notions of rights and justice and fine ethical distinctions, it begins, Dr. de Waal says, in concern for others and the understanding of social rules as to how they should be treated. At this lower level, primatologists have shown, there is what they consider to be a sizable overlap between the behavior of people and other social primates.

Social living requires empathy, which is especially evident in chimpanzees, as well as ways of bringing internal hostilities to an end. Every species of ape and monkey has its own protocol for reconciliation after fights, Dr. de Waal has found. If two males fail to make up, female chimpanzees will often bring the rivals together, as if sensing that discord makes their community worse off and more vulnerable to attack by neighbors. Or they will head off a fight by taking stones out of the males’ hands.

Dr. de Waal believes that these actions are undertaken for the greater good of the community, as distinct from person-to-person relationships, and are a significant precursor of morality in human societies.

Macaques and chimpanzees have a sense of social order and rules of expected behavior, mostly to do with the hierarchical natures of their societies, in which each member knows its own place. Young rhesus monkeys learn quickly how to behave, and occasionally get a finger or toe bitten off as punishment. Other primates also have a sense of reciprocity and fairness. They remember who did them favors and who did them wrong. Chimps are more likely to share food with those who have groomed them. Capuchin monkeys show their displeasure if given a smaller reward than a partner receives for performing the same task, like a piece of cucumber instead of a grape.

These four kinds of behavior — empathy, the ability to learn and follow social rules, reciprocity and peacemaking — are the basis of sociality.

Dr. de Waal sees human morality as having grown out of primate sociality, but with two extra levels of sophistication. People enforce their society’s moral codes much more rigorously with rewards, punishments and reputation building. They also apply a degree of judgment and reason, for which there are no parallels in animals.

Religion can be seen as another special ingredient of human societies, though one that emerged thousands of years after morality, in Dr. de Waal’s view. There are clear precursors of morality in nonhuman primates, but no precursors of religion. So it seems reasonable to assume that as humans evolved away from chimps, morality emerged first, followed by religion. “I look at religions as recent additions,” he said. “Their function may have to do with social life, and enforcement of rules and giving a narrative to them, which is what religions really do.”

As Dr. de Waal sees it, human morality may be severely limited by having evolved as a way of banding together against adversaries, with moral restraints being observed only toward the in group, not toward outsiders. “The profound irony is that our noblest achievement — morality — has evolutionary ties to our basest behavior — warfare,” he writes. “The sense of community required by the former was provided by the latter.”

Dr. de Waal has faced down many critics in evolutionary biology and psychology in developing his views. The evolutionary biologist George Williams dismissed morality as merely an accidental byproduct of evolution, and psychologists objected to attributing any emotional state to animals. Dr. de Waal convinced his colleagues over many years that the ban on inferring emotional states was an unreasonable restriction, given the expected evolutionary continuity between humans and other primates.

His latest audience is moral philosophers, many of whom are interested in his work and that of other biologists. “In departments of philosophy, an increasing number of people are influenced by what they have to say,” said Gilbert Harman, a Princeton University philosopher.

Dr. Philip Kitcher, a philosopher at Columbia University, likes Dr. de Waal’s empirical approach. “I have no doubt there are patterns of behavior we share with our primate relatives that are relevant to our ethical decisions,” he said. “Philosophers have always been beguiled by the dream of a system of ethics which is complete and finished, like mathematics. I don’t think it’s like that at all.”

But human ethics are considerably more complicated than the sympathy Dr. de Waal has described in chimps. “Sympathy is the raw material out of which a more complicated set of ethics may get fashioned,” he said. “In the actual world, we are confronted with different people who might be targets of our sympathy. And the business of ethics is deciding who to help and why and when.”

Many philosophers believe that conscious reasoning plays a large part in governing human ethical behavior and are therefore unwilling to let everything proceed from emotions, like sympathy, which may be evident in chimpanzees. The impartial element of morality comes from a capacity to reason, writes Peter Singer, a moral philosopher at Princeton, in “Primates and Philosophers.” He says, “Reason is like an escalator — once we step on it, we cannot get off until we have gone where it takes us.”

That was the view of Immanuel Kant, Dr. Singer noted, who believed morality must be based on reason, whereas the Scottish philosopher David Hume, followed by Dr. de Waal, argued that moral judgments proceed from the emotions.

But biologists like Dr. de Waal believe reason is generally brought to bear only after a moral decision has been reached. They argue that morality evolved at a time when people lived in small foraging societies and often had to make instant life-or-death decisions, with no time for conscious evaluation of moral choices. The reasoning came afterward as a post hoc justification. “Human behavior derives above all from fast, automated, emotional judgments, and only secondarily from slower conscious processes,” Dr. de Waal writes.


However much we may celebrate rationality, emotions are our compass, probably because they have been shaped by evolution, in Dr. de Waal’s view. For example, he says: “People object to moral solutions that involve hands-on harm to one another. This may be because hands-on violence has been subject to natural selection whereas utilitarian deliberations have not.”

Philosophers have another reason biologists cannot, in their view, reach the heart of morality: biological analyses cannot cross the gap between “is” and “ought,” between the description of some behavior and the issue of why it is right or wrong. “You can identify some value we hold, and tell an evolutionary story about why we hold it, but there is always that radically different question of whether we ought to hold it,” said Sharon Street, a moral philosopher at New York University. “That’s not to discount the importance of what biologists are doing, but it does show why centuries of moral philosophy are incredibly relevant, too.”

Biologists are allowed an even smaller piece of the action by Jesse Prinz, a philosopher at the University of North Carolina. He believes morality developed after human evolution was finished and that moral sentiments are shaped by culture, not genetics. “It would be a fallacy to assume a single true morality could be identified by what we do instinctively, rather than by what we ought to do,” he said. “One of the principles that might guide a single true morality might be recognition of equal dignity for all human beings, and that seems to be unprecedented in the animal world.”

Dr. de Waal does not accept the philosophers’ view that biologists cannot step from “is” to “ought.” “I’m not sure how realistic the distinction is,” he said. “Animals do have ‘oughts.’ If a juvenile is in a fight, the mother must get up and defend her. Or in food sharing, animals do put pressure on each other, which is the first kind of ‘ought’ situation.”

Dr. de Waal’s definition of morality is more down to earth than Dr. Prinz’s. Morality, he writes, is “a sense of right and wrong that is born out of groupwide systems of conflict management based on shared values.” The building blocks of morality are not nice or good behaviors but rather mental and social capacities for constructing societies “in which shared values constrain individual behavior through a system of approval and disapproval.” By this definition, chimpanzees in his view do possess some of the behavioral capacities built into our moral systems.

“Morality is as firmly grounded in neurobiology as anything else we do or are,” Dr. de Waal wrote in his 1996 book “Good Natured.” Biologists ignored this possibility for many years, believing that because natural selection was cruel and pitiless it could only produce people with the same qualities. But this is a fallacy, in Dr. de Waal’s view. Natural selection favors organisms that survive and reproduce, by whatever means. And it has provided people, he writes in “Primates and Philosophers,” with “a compass for life’s choices that takes the interests of the entire community into account, which is the essence of human morality.”


Title: Re: Evolutionary biology/psychology
Post by: Crafty_Dog on April 20, 2007, 08:42:25 AM
This URL about the psychology of risk-taking, http://risktaking.co.uk/intro.htm, was just posted in the Parkour thread on the Martial Arts forum, but it seems to me like it belongs here as well.
Title: Re: Evolutionary biology/psychology
Post by: sgtmac_46 on April 21, 2007, 06:24:22 PM
Along the line of evolutionary biology/psychology, but from a purely philosophical vantage point, has anyone read Robert Pirsig's follow-up to 'Zen and the Art of Motorcycle Maintenance', entitled 'Lila: An Inquiry into Morals'?  It deals with how the universe developed morality, and what morality actually is.

In short, Pirsig developed a metaphysics of morality, whereby the universe has an innate desire to move from a state of lower quality to a state of higher quality.

Pirsig developed a hierarchy of quality:

inorganic quality
organic quality
social quality
intellectual quality
dynamic quality

inorganic to organic (why the universe develops life from lifelessness)
organic to social (organisms evolve from singular entities in to group entities)
social to intellectual (organisms develop reason and rationality)

In short, Pirsig stated that each level of quality is in conflict with both the level below it and the level above it.

For example, organic quality comprises all the adaptations the individual organism has developed in order to survive at that level.  The organic level of quality is always in conflict with inorganic quality (death, a return to static inorganic quality) on one end, and with social quality (the group) on the other.

Each quality is adaptive when viewed from the level below and maladaptive when viewed from the level above: extreme aggression, for example, is seen as a detriment by social quality, but is adaptive in keeping the individual organism alive.

Pirsig, I think, was taking in from a wide perspective what the sciences were viewing up close: how each of these phenomena, which appear disconnected when viewed through the microscope of science, fit together when viewed from a big-picture perspective.



Taking Pirsig's metaphysics of moral quality as a template, a whole host of moral questions can be answered.  Stealing, for example, has biological quality: it allows an individual biological entity to gain a resource advantage.  From a purely biological perspective there is nothing immoral about stealing.

From the social level of quality, which is higher than the biological, stealing throws the social order into chaos, as a society cannot exist without boundaries on individual behavior.  Once individuals see that a society is unable to maintain social order over biological quality, the individual biological entities see no benefit in continuing to abide by the social order, which results in its deterioration.

Each evolutionary ratchet step of morality is more moral than the one below it.  Biological order is a lower level of quality than social order; social order is a lower level of quality than intellectual order.

Pirsig used the example of Fascism versus Communism.  Fascism represented social order, the subservience of the individual to the state.  Communism, however, represented an idea, an intellectual idea.  As such, Communism was a higher level of quality than Fascism, and hence more moral.  Likewise, western liberal democracy, being an intellectual idea manifested as government, is a higher level of quality than Fascism.
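The ranking logic described in this post can be sketched as a toy program. To be clear about what is assumed: the five level names and their ordering come from Pirsig as summarized above, while everything else (the `Quality` class, the `more_moral` helper) is my own illustrative invention, not anything from Pirsig.

```python
# Toy encoding of Pirsig's hierarchy of quality, as summarized in the post
# above. Illustrative only: names and helper are invented for this sketch.
from enum import IntEnum


class Quality(IntEnum):
    """Pirsig's five levels, ordered from lowest to highest quality."""
    INORGANIC = 1
    ORGANIC = 2
    SOCIAL = 3
    INTELLECTUAL = 4
    DYNAMIC = 5


def more_moral(a: Quality, b: Quality) -> bool:
    """In this scheme, the higher level of quality is the more moral."""
    return a > b


# Pirsig's example: Communism (an intellectual idea) versus
# Fascism (social order) -- the intellectual level ranks higher.
print(more_moral(Quality.INTELLECTUAL, Quality.SOCIAL))  # True
```

Using `IntEnum` makes the ordering explicit and comparable, which is the whole content of the "each level is more moral than the one below" claim; the conflicts between adjacent levels that the post describes are not captured here.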
Title: Evo-Devo part one
Post by: Crafty_Dog on June 27, 2007, 06:26:08 AM
From a Few Genes, Life's Myriad Shapes
By CAROL KAESUK YOON
Published: June 26, 2007
NY Times

Since its humble beginnings as a single cell, life has evolved into a
spectacular array of shapes and sizes, from tiny fleas to towering
Tyrannosaurus rex, from slow-soaring vultures to fast-swimming swordfish,
and from modest ferns to alluring orchids. But just how such diversity of
form could arise out of evolution's mess of random genetic mutations - how a
functional wing could sprout where none had grown before, or how flowers
could blossom in what had been a flowerless world - has remained one of the
most fascinating and intractable questions in evolutionary biology.

Now finally, after more than a century of puzzling, scientists are finding
answers coming fast and furious and from a surprising quarter, the field
known as evo-devo. Just coming into its own as a science, evo-devo is the
combined study of evolution and development, the process by which a nubbin
of a fertilized egg transforms into a full-fledged adult. And what these
scientists are finding is that development, a process that has for more than
half a century been largely ignored in the study of evolution, appears to
have been one of the major forces shaping the history of life on earth.

For starters, evo-devo researchers are finding that the evolution of complex
new forms, rather than requiring many new mutations or many new genes as had
long been thought, can instead be accomplished by a much simpler process
requiring no more than tweaks to already existing genes and developmental
plans. Stranger still, researchers are finding that the genes that can be
tweaked to create new shapes and body parts are surprisingly few. The same
DNA sequences are turning out to be the spark inciting one evolutionary
flowering after another. "Do these discoveries blow people's minds? Yes,"
said Dr. Sean B. Carroll, biologist at the Howard Hughes Medical Institute
at the University of Wisconsin, Madison. "The first response is 'Huh?' and
the second response is 'Far out.' "

"This is the illumination of the utterly dark," Dr. Carroll added.

The development of an organism - how one end gets designated as the head or
the tail, how feet are enticed to grow at the end of a leg rather than at
the wrist - is controlled by a hierarchy of genes, with master genes at the
top controlling a next tier of genes, controlling a next and so on. But the
real interest for evolutionary biologists is that these hierarchies not only
favor the evolution of certain forms but also disallow the growth of others,
determining what can and cannot arise not only in the course of the growth
of an embryo, but also over the history of life itself.

"It's been said that classical evolutionary theory looks at survival of the
fittest," said Dr. Scott F. Gilbert, a developmental biologist at Swarthmore
College. By looking at what sorts of organisms are most likely or impossible
to develop, he explained, "evo-devo looks at the arrival of the fittest."

Charles Darwin saw it first. He pointed out well over a century ago that
developing forms of life would be central to the study of evolution. Little
came of it initially, for a variety of reasons. Not least of these was the
discovery that perturbing the process of development often resulted in a
freak show starring horrors like bipedal goats and insects with legs growing
out of their mouths, monstrosities that seemed to shed little light on the
wonders of evolution.

But the advent of molecular biology reinvigorated the study of development
in the 1980s, and evo-devo quickly got scientists' attention when early
breakthroughs revealed that the same master genes were laying out
fundamental body plans and parts across the animal kingdom. For example,
researchers discovered that genes in the Pax6 family could switch on the
development of eyes in animals as different as flies and people. More recent
work has begun looking beyond the body's basic building blocks to reveal how
changes in development have resulted in some of the world's most celebrated
of evolutionary events.

In one of the most exciting of the new studies, a team of scientists led by
Dr. Cliff Tabin, a developmental biologist at Harvard Medical School,
investigated a classic example of evolution by natural selection, the
evolution of Darwin's finches on the Galápagos Islands.

Like the other organisms that made it to the remote archipelago off the
coast of Ecuador, Darwin's finches have flourished in their isolation,
evolving into many and varied species. But, while the finches bear his name
and while Darwin was indeed inspired to thoughts of evolution by animals on
these islands, the finches left him flummoxed. Darwin did not realize for
quite some time that these birds were all finches or even that they were
related to one another.


He should be forgiven, however. For while the species are descendants of an
original pioneering finch, they no longer bear its characteristic short,
slender beak, which is excellent for hulling tiny seeds. In fact, the
finches no longer look very finchlike at all. Adapting to the strange new
foods of the islands, some have evolved taller, broader, more powerful
nut-cracking beaks; the most impressive of the big-beaked finches is
Geospiza magnirostris. Other finches have evolved longer bills that are
ideal for drilling holes into cactus fruits to get at the seeds; Geospiza
conirostris is one species with a particularly elongated beak.

But how could such bills evolve from a simple finch beak? Scientists had
assumed that the dramatic alterations in beak shape, height, width and
strength would require the accumulation of many chance mutations in many
different genes. But evo-devo has revealed that getting a fancy new beak can
be simpler than anyone had imagined.

Genes are stretches of DNA that can be switched on so that they will produce
molecules known as proteins. Proteins can then do a number of jobs in the
cell or outside it, working to make parts of organisms, switching other
genes on and so on. When genes are switched on to produce proteins, they can
do so at a low level in a limited area or they can crank out lots of protein
in many cells.

What Dr. Tabin and colleagues found, when looking at the range of beak
shapes and sizes across different finch species, was that the thicker and
taller and more robust a beak, the more strongly it expressed a gene known
as BMP4 early in development. The BMP4 gene (its abbreviation stands for
bone morphogenetic protein, No. 4) produces the BMP4 protein, which can
signal cells to begin producing bone. But BMP4 is multitalented and can also
act to direct early development, laying out a variety of architectural plans
including signaling which part of the embryo is to be the backside and which
the belly side. To verify that the BMP4 gene itself could indeed trigger the
growth of grander, bigger, nut-crushing beaks, researchers artificially
cranked up the production of BMP4 in the developing beaks of chicken
embryos. The chicks began growing wider, taller, more robust beaks similar
to those of a nut-cracking finch.

In the finches with long, probing beaks, researchers found at work a
different gene, known as calmodulin. As with BMP4, the more that calmodulin
was expressed, the longer the beak became. When scientists artificially
increased calmodulin in chicken embryos, the chicks began growing extended
beaks, just like a cactus driller.

So, with just these two genes, not tens or hundreds, the scientists found
the potential to recreate beaks, massive or stubby or elongated.

"So now one wants to go in a number of directions," Dr. Tabin said. "What
happens in a stork? What happens in a hummingbird? A parrot?" For the
evolution of beaks, the main tool with which a bird handles its food and
makes its living, is central not only to Darwin's finches, but to birds as a
whole.

BMP4's reach does not stop at the birds, however.

In lakes in Africa, the fish known as cichlids have evolved so rapidly into
such a huge diversity of species that they have become one of the best known
evolutionary radiations. The cichlids have evolved in different shapes and
sizes, and with a variety of jaw types specialized for eating certain kinds
of food. Robust, thick jaws are excellent at crushing snails, while longer
jaws work well for sucking up algae. As with the beaks of finches, a range
of styles developed.

Now in a new study, Dr. R. Craig Albertson, an evolutionary biologist at
Syracuse University, and Dr. Thomas D. Kocher, a geneticist at the
University of New Hampshire, have shown that more robust-jawed cichlids
express more BMP4 during development than those with more delicate jaws. To
test whether BMP4 was indeed responsible for the difference, these
scientists artificially increased the expression of BMP4 in the zebrafish,
the lab rat of the fish world. And, reprising the beak experiments,
researchers found that increased production of BMP4 in the jaws of embryonic
zebrafish led to the development of more robust chewing and chomping parts.


And if being a major player in the evolution of African cichlids and Darwin's
finches - two of the most famous evolutionary radiations of species - were
not enough for BMP4, Dr. Peter R. Grant, an evolutionary biologist at
Princeton University, predicted that the gene would probably be found to
play an important role in the evolution of still other animals. He noted
that jaw changes were a crucial element in the evolution of lizards, rabbits
and mice, among others, making them prime candidates for evolution via BMP4.

"This is just the beginning," Dr. Grant said. "These are exciting times for
us all."

Used to lay out body plans, build beaks and alter fish jaws, BMP4
illustrates perfectly one of the major recurring themes of evo-devo. New
forms can arise via new uses of existing genes, in particular the control
genes or what are sometimes called toolkit genes that oversee development.
It is a discovery that can explain much that has previously been mysterious,
like the observation that without much obvious change to the genome over
all, one can get fairly radical changes in form.

"There aren't new genes arising every time a new species arises," said Dr.
Brian K. Hall, a developmental biologist at Dalhousie University in Nova
Scotia. "Basically you take existing genes and processes and modify them,
and that's why humans and chimps can be 99 percent similar at the genome
level."

Evo-devo has also begun to shine a light on a phenomenon with which
evolutionary biologists have long been familiar, the way in which different
species will come up with sometimes jaw-droppingly similar solutions when
confronted with the same challenges.

The placental mammals of the Americas and the marsupials of Australia, for
example, have evolved the same sorts of animals independently: beasts that
burrowed, loping critters that grazed, creatures that had long snouts for
eating ants, and versions of the wolf.

In the same way, the cichlids have evolved pairs of matching species,
arising independently in separate lakes in Africa. In Lake Malawi, for
example, there is a long and flat-headed species with a deep underbite that
looks remarkably like an unrelated species that lives a similar lifestyle in
Lake Tanganyika. There is another cichlid with a bulging brow and frowning
lips in Lake Malawi with, again, an unrelated but otherwise extremely
similar-looking cichlid in Lake Tanganyika. The same jaws, heads, and ways
of living can be seen to evolve again and again.

The findings of evo-devo suggest that such parallels might in fact be
expected. For cichlids are hardly coming up with new genetic solutions to
eating tough snails as they each crank up the BMP4 or tinker with other
toolkit genes. Instead, whether in Lake Malawi or Lake Tanganyika, they may
be using the same genes to develop the same forms that provide the same
solutions to the same ecological challenges. Why not, when even the beaked
birds flying overhead are using the very same genes?

Evo-devo has even begun to give biologists new insight into one of the most
beautiful examples of recurring forms: the evolution of mimicry.

It has long been a source of amazement how some species seem so able to
evolve near-perfect mimicry of another. Poisonous species often evolve
bright warning colors, which have been reproduced by nonpoisonous species or
by other, similarly poisonous species, hoping to fend off curious predators.

Now in a new study of Heliconius butterflies, Dr. Mathieu Joron, an
evolutionary biologist at the University of Edinburgh, and colleagues, found
evidence that the mimics may be using some of the same genes to produce
their copycat warning colors and patterns.

The researchers studied several species of tropical Heliconius butterflies,
all of which are nasty-tasting to birds and which mimic one another's color
patterns. Dr. Joron and colleagues found that some of the main elements of
the patterns - a yellow band in Heliconius melpomene and Heliconius erato
and a complex tiger-stripe pattern in Heliconius numata - are controlled by
a single region of DNA, a tightly linked set of genes known as a supergene.

Dr. Joron said he and colleagues were still mapping the details of color
pattern control within the supergene. But if this turned out to function, as
researchers suspected, like a toolkit gene turning the patterns on and off,
it could explain both the prevalence of mimicry in Heliconius and the
apparent ease with which these species have been shown to repeatedly evolve
such superbly matching patterns.

One of evo-devo's greatest strengths is its cross-disciplinary nature,
bridging not only evolutionary and developmental studies but gaps as broad
as those between fossil-hunting paleontologists and molecular biologists.
One researcher whose approach epitomizes the power of such synthesis is Dr.
Neil Shubin, an evolutionary biologist at the University of Chicago and the
Field Museum.
Title: Evo-Devo part two
Post by: Crafty_Dog on June 27, 2007, 07:03:58 AM

Last year, Dr. Shubin and colleagues reported the discovery of a fossil fish
on Ellesmere Island in northern Canada. They had found Tiktaalik, as they
named the fish, after searching for six years. They persisted for so long
because they were certain that they had found the right age and kind of rock
where a fossil of a fish trying to make the transition to life on land was
likely to be found. And Tiktaalik appeared to be just such a fish, but it
also had a few surprises for the researchers.

"Tiktaalik is special," Dr. Shubin said. "It has a flat head with eyes on
top. It has gills and lungs. It's an animal that's exploring the interface
between water and land."

But Tiktaalik was a truly stunning discovery because this water-loving fish
bore wrists, an attribute thought to have been an innovation confined
strictly to animals that had already made the transition to land.

"This was telling us that a piece of the toolkit, to make arms, legs, hand
and feet, could very well be present in fish limbs," Dr. Shubin said. In
other words, the genetic tools or toolkit genes for making limbs to walk on
land might well have been present long before fish made that critical leap.
But as fascinating as Tiktaalik was, it was also rock hard and provided no
DNA that might shed light on the presence or absence of any particular gene.

So Dr. Shubin did what more and more evo-devo researchers are learning to
do: take off one hat (paleontologist) and don another (molecular biologist).
Dr. Shubin oversees one of what he says is a small but growing number of
laboratories where old-fashioned rock-pounding takes place alongside
high-tech molecular DNA studies.

He and colleagues began a study of the living but ancient fish known as the
paddlefish. What they found, reported last month in the journal Nature, was
that these thoroughly fishy fish were turning on control genes known as Hox
genes, in a manner characteristic of the four-limbed, land-loving beasts
known as tetrapods.

Tetrapods include cows, people, birds, rodents and so on. In other words,
the potential for making fingers, hands and feet, crucial innovations used
in emerging from the water to a life of walking and crawling on land,
appears to have been present in fish, long before they began flip-flopping
their way out of the muck. "The genetic tools to build fingers and toes were
in place for a long time," Dr. Shubin wrote in an e-mail message. "Lacking
were the environmental conditions where these structures would be useful."
He added, "Fingers arose when the right environments arose."

And here is another of the main themes to emerge from evo-devo. Major events
in evolution like the transition from life in the water to life on land are
not necessarily set off by the arising of the genetic mutations that will
build the required body parts, or even the appearance of the body parts
themselves, as had long been assumed. Instead, it is theorized that the
right ecological situation, the right habitat in which such bold, new forms
will prove to be particularly advantageous, may be what is required to set
these major transitions in motion.

So far, most of the evo-devo work has been on animals, but researchers have
begun to ask whether the same themes are being played out in plants.

Of particular interest to botanists is what Darwin described as an
"abominable mystery": the origin of flowering plants. A critical event in
the evolution of plants, it happened, by paleontological standards, rather
suddenly.

So what genes were involved in the origin of flowers? Botanists know that
during development, the genes known as MADS box genes lay out the
architecture of the blossom. They do so by turning on other genes, thereby
determining what will develop where - petals here, reproductive parts there
and so on, in much the same manner that Hox genes determine the general
layout of parts in animals. Hox genes have had an important role in the
evolution of animal form. But have MADS box genes had as central a role in
the evolution of plants?

================

So far, said Dr. Vivian F. Irish, a developmental biologist at Yale
University, the answer appears to be yes. There is a variety of
circumstantial evidence, the most interesting of which is the fact that the
MADS box genes exploded in number right around the time that flowering
plants first appeared.

"It's really analogous to what's going on in Hox genes," said Dr. Irish,
though she noted that details of the role of the MADS box genes remained to
be worked out. "It's very cool that evolution has used a similar strategy in
two very different kingdoms."

Amid the enthusiastic hubbub, cautionary notes have been sounded. Dr. Jerry
Coyne, an evolutionary biologist at the University of Chicago, said that as
dramatic as the changes in form caused by mutations in toolkit genes can be,
it was premature to credit these genes with being the primary drivers of the
evolution of novel forms and diversity. He said that too few studies had
been done so far to support such broad claims, and that it could turn out
that other, more mundane workaday genes, of the sort that were being studied
long before evo-devo appeared on the scene, would play equally or even more
important roles.

"I urge caution," Dr. Coyne said. "We just don't know."

All of which goes to show that, like all emerging fields, evo-devo will
see its significance and the uniqueness of its contributions continually
reassessed. It remains to be seen how separate its findings will stay
from, or how fully they will be incorporated into, the rest of
evolutionary thinking.
Paradoxically, it was during just such a flurry of intellectual synthesis
and research activity, the watershed known as the New or Modern Synthesis in
which modern evolutionary biology was born in the last century, that
developmental thinking was almost entirely ejected from the science of
evolution.

But perhaps today synthesizers can do better, broadening their focus without
constricting their view of evolution as they try to take in all of the great
pageant that is the history of life.

"We're still a very young field," Dr. Gilbert said. "But I think this is a
new evolutionary synthesis, an emerging evolutionary synthesis. I think we're
seeing it."
Title: Re: Evolutionary biology/psychology
Post by: Crafty_Dog on June 29, 2007, 11:54:59 AM
By RANDOLPH E. SCHMID, AP Science Writer
Mon Jun 25, 5:00 PM ET

WASHINGTON - Researchers studying Neanderthal DNA say it should be possible to construct a complete genome of the ancient hominid despite the degradation of the DNA over time.  There is also hope for reconstructing the genome of the mammoth and cave bear, according to a research team led by Svante Paabo of the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany. Their findings are published in this week's online edition of Proceedings of the National Academy of Sciences.

Debate has raged for years about whether there is any relationship between Neanderthals and modern humans. Some researchers believe that Neanderthals were simply replaced by early modern humans, while others argue the two groups may have interbred.

Sequencing the genome of Neanderthals, who lived in Europe until about 30,000 years ago, could shed some light on that question.  In studies of Neanderthals, cave bear and mammoth, a majority of the DNA recovered was that of microorganisms that colonized the tissues after death, the researchers said.  But they were able to identify some DNA from the original animal, and Paabo and his colleagues were able to determine how it broke down over time. They also developed procedures to prevent contamination by the DNA of humans working with the material.

"We are confident that it will be technically feasible to achieve a reliable Neanderthal genome sequence," Paabo and his researchers reported.

They said the problem of damaged areas in some DNA could be overcome by using a sufficient amount of Neanderthal DNA from different individuals, so that the whole genome can be determined.

"The contamination and degradation of DNA has been a serious issue for the last 10 years," observed Erik Trinkaus, a professor at Washington University in St. Louis. "This is a serious attempt to deal with that issue and that's welcome.  I'm not sure they have completely solved the problem, but they've made a big step in that direction," said Trinkaus, who was not involved in the research.

Anthropologist Richard Potts of the Smithsonian's National Museum of Natural History called the work "a very significant technical study of DNA decay."

The researchers "have tried to answer important questions about the potential to sequence ancient DNA," said Potts, who was not part of the research.

Milford Wolpoff, a University of Michigan anthropologist, said creating a complete Neanderthal genome is a great goal.

But it is "sample intensive," he said, and he isn't sure enough DNA is available to complete the work. Curators don't like to see their specimens ground up, he said.

The research was funded by the Max Planck Society and the National Institutes of Health.

http://news.yahoo.com:80/s/ap/20070625/ap_on_sc/neanderthal_dna
Title: Re: Evolutionary biology/psychology
Post by: Crafty_Dog on July 09, 2007, 06:11:42 AM

The International Hoplology site http://www.hoplology.com/about.htm#three is well worth a look.

Its "Three Axioms" break down the aggressive instinct differently from Konrad Lorenz.

To refresh memories, KL wrote of three categories: Territory, Hierarchy, Reproduction.  To these three, in the case of humans, I have added Hunting: a criminal stealing money, for example, is in effect taking food, and his behaviors will be those of a hunter.

In contrast, as seen below, Hoplology apparently has two categories.  My first intuitive response is that their approach also seems to have merit.

TAC,
CD
==============

Three Axioms of Hoplology

1. The foundation of human combative behavior is rooted in our evolution. To gain a realistic understanding of human combative behavior, it is necessary to have a basic grasp of its evolutionary background.

2. The two basic forms of human combative behavior are predatory and affective. Predatory combative behavior is that combative/aggressive behavior rooted in our evolution as a hunting mammal. Affective combative behavior is that aggressive/combative behavior rooted in our evolution as a group-social animal.

3. The evolution of human combative behavior and performance is integral with the use of weapons. That is, behavior and performance are intrinsically linked to and reflect the use of weapons.
Title: In games, an insight into the Rules of Evolution: NYT
Post by: Crafty_Dog on July 31, 2007, 01:35:47 PM
In Games, an Insight Into the Rules of Evolution

By CARL ZIMMER
Published: July 31, 2007
When Martin Nowak was in high school, his parents thought he would be a nice boy and become a doctor. But when he left for the University of Vienna, he abandoned medicine for something called biochemistry. As far as his parents could tell, it had something to do with yeast and fermenting. They became a little worried. When their son entered graduate school, they became even more worried. He announced that he was now studying games.

In the end, Dr. Nowak turned out all right. He is now the director of the Program for Evolutionary Dynamics at Harvard. The games were actually versatile mathematical models that Dr. Nowak could use to make important discoveries in fields as varied as economics and cancer biology.

“Martin has a passion for taking informal ideas that people like me find theoretically important and framing them as mathematical models,” said Steven Pinker, a Harvard linguist who is collaborating with Dr. Nowak to study the evolution of language. “He allows our intuitions about what leads to what to be put to a test.”

On the surface, Dr. Nowak’s many projects may seem randomly scattered across the sciences. But there is an underlying theme to his work. He wants to understand one of the most puzzling yet fundamental features of life: cooperation.

When biologists speak of cooperation, they speak more broadly than the rest of us. Cooperation is what happens when someone or something gets a benefit because someone or something else pays a cost. The benefit can take many forms, like money or reproductive success. A friend takes off work to pick you up from the hospital. A sterile worker bee tends to eggs in a hive. Even the cells in the human body cooperate. Rather than reproducing as fast as it can, each cell respects the needs of the body, helping to form the heart, the lungs or other vital organs. Even the genes in a genome cooperate, to bring an organism to life.

In recent papers, Dr. Nowak has argued that cooperation is one of the three basic principles of evolution. The other two are mutation and selection. On their own, mutation and selection can transform a species, giving rise to new traits like limbs and eyes. But cooperation is essential for life to evolve to a new level of organization. Single-celled protozoa had to cooperate to give rise to the first multicellular animals. Humans had to cooperate for complex societies to emerge.

“We see this principle everywhere in evolution where interesting things are happening,” Dr. Nowak said.

While cooperation may be central to evolution, however, it poses questions that are not easy to answer. How can competing individuals start to cooperate for the greater good? And how do they continue to cooperate in the face of exploitation? To answer these questions, Dr. Nowak plays games.

His games are the intellectual descendants of a puzzle known as the Prisoner’s Dilemma. Imagine two prisoners are separately offered the same deal: if one of them testifies and the other doesn’t talk, the talker will go free and the holdout will go to jail for 10 years. If both refuse to talk, the prosecutor will only be able to put them in jail for six months. If each prisoner rats out the other, they will both get five-year sentences. Not knowing what the other prisoner will do, how should each one act?
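The deal can be written out as a small payoff table. A minimal sketch in Python (the jail terms are from the article; the function and dictionary names are mine) shows why defection dominates even though both prisoners would do better by staying silent:

```python
# A minimal sketch of the deal described above, using years in jail as the
# cost (lower is better). The sentence lengths come from the article; the
# names are my own.

SENTENCES = {
    # (my choice, other's choice) -> my jail time in years
    ("talk", "talk"):     5.0,   # each rats out the other: five years apiece
    ("talk", "silent"):   0.0,   # I testify and go free; the holdout gets 10
    ("silent", "talk"):  10.0,   # I hold out while the other talks
    ("silent", "silent"): 0.5,   # both refuse to talk: six months each
}

def best_response(other_choice):
    """The choice that minimizes my sentence, given the other prisoner's move."""
    return min(("talk", "silent"), key=lambda me: SENTENCES[(me, other_choice)])

# Whatever the other prisoner does, talking (defecting) is the better reply,
# even though mutual silence (cooperating) beats mutual defection.
assert best_response("talk") == "talk"
assert best_response("silent") == "talk"
```

That tension, in which each player's individually best reply undermines the outcome both prefer, is exactly what makes the evolution of cooperation puzzling.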

The way the Prisoner’s Dilemma pits cooperation against defection distills an important feature of evolution. In any encounter between two members of the same species, each one may cooperate or defect. Certain species of bacteria, for example, spray out enzymes that break down food, which all the bacteria can then suck up. It costs energy to make these enzymes. If one of the microbes stops cooperating and does not make the enzymes, it can still enjoy the meal. It can gain a potential reproductive edge over bacteria that cooperate.

The Prisoner’s Dilemma may be abstract, but that’s why Dr. Nowak likes it. It helps him understand fundamental rules of evolution, just as Isaac Newton discovered that objects in motion tend to stay in motion.

“If you were obsessed with friction, you would have never discovered this law,” Dr. Nowak said. “In the same sense, I try to get rid of what is inessential to find the essential. Truth is simple.”

Dr. Nowak found his first clues to the origin of cooperation in graduate school, collaborating with his Ph.D. adviser, Karl Sigmund. They built a version of the Prisoner’s Dilemma that captured more of the essence of how organisms behave and evolve.

In their game, an entire population of players enters a round-robin competition. The players are paired up randomly, and each one chooses whether to cooperate or defect. To make a choice, they can recall their past experiences with other individual players. Some players might use a strategy in which they had a 90-percent chance of cooperating with a player with whom they have cooperated in the past.
===========

The players get rewarded based on their choices. The most successful players get to reproduce. Each new player has a small chance of randomly mutating its strategy. If that strategy turns out to be more successful, it can dominate the population, wiping out its ancestors.

Dr. Nowak and Dr. Sigmund observed this tournament through millions of rounds. Often the winners used a strategy that Dr. Nowak called “win-stay, lose-shift.” If they did well in the previous round, they did the same thing again. If they did not do so well, they shifted. Under some conditions, this strategy caused cooperation to become common among the players, despite the short-term payoff of defecting.
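The winning rule takes only a few lines to state. The payoff numbers and the "doing well" threshold below are standard Prisoner's Dilemma conventions assumed for illustration, not values from Nowak's tournament:

```python
# "Win-stay, lose-shift" in a repeated Prisoner's Dilemma. C = cooperate,
# D = defect. Payoffs and the aspiration level are conventional values
# assumed for illustration.

PAYOFF = {("C", "C"): 3, ("C", "D"): 0, ("D", "C"): 5, ("D", "D"): 1}
ASPIRATION = 2  # a payoff at or above this counts as "doing well"

def wsls(prev_move, prev_payoff):
    """Repeat the last move after a good round; switch after a bad one."""
    if prev_payoff >= ASPIRATION:
        return prev_move                      # win: stay
    return "D" if prev_move == "C" else "C"   # lose: shift

def play(start=("C", "C"), rounds=10):
    """History of moves for two win-stay, lose-shift players."""
    a, b = start
    history = []
    for _ in range(rounds):
        history.append((a, b))
        pay_a, pay_b = PAYOFF[(a, b)], PAYOFF[(b, a)]
        a, b = wsls(a, pay_a), wsls(b, pay_b)
    return history

# Two such players sustain cooperation, and even recover it after a
# defection: (D,C) -> (D,D) -> (C,C) -> (C,C) ...
assert all(moves == ("C", "C") for moves in play())
assert play(start=("D", "C"))[2:] == [("C", "C")] * 8
```

The recovery step is the strategy's appeal: after a stray defection both players land on mutual defection, do badly, shift together, and cooperation resumes.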

In order to study this new version of the Prisoner’s Dilemma, Dr. Nowak had to develop new mathematical tools. It turned out that these tools also proved useful for studying cancer. Cancer and the Prisoner’s Dilemma may seem like apples and oranges, but Dr. Nowak sees an intimate connection between the two. “Cancer is a breakdown of cooperation,” he said.

Mutations sometimes arise in cells that cause them to replicate quickly, ignoring signals to stop. Some of their descendants acquire new mutations, allowing them to become even more successful as cancer cells. They evolve, in other words, into more successful defectors. “Cancer is an evolution you don’t want,” Dr. Nowak said.

To study cancer, however, Dr. Nowak had to give his models some structure. In the Prisoner’s Dilemma, the players usually just bump into each other randomly. In the human body, on the other hand, cells only interact with cells in their neighborhood.

A striking example of these neighborhoods can be found in the intestines, where the lining is organized into millions of tiny pockets. A single stem cell at the bottom of a pocket divides, and its daughter cells are pushed up the pocket walls. The cells that reach the top get stripped away.

Dr. Nowak adapted a branch of mathematics known as graph theory, which makes it possible to study networks, to analyze how cancer arises in these local neighborhoods. “Our tissue is actually organized to delay the onset of cancer,” he said.

Pockets of intestinal cells, for example, can only hold a few cell generations. That lowers the chances that any one will turn cancerous. All the cells in each pocket are descended from a single stem cell, so that there’s no competition between lineages to take over the pocket.

As Dr. Nowak developed this neighborhood model, he realized it would help him study human cooperation. “The reality is that I’m much more likely to interact with my friends, and they’re much more likely to interact with their friends,” Dr. Nowak said. “So it’s more like a network.”

Dr. Nowak and his colleagues found that when they put players into a network, the Prisoner’s Dilemma played out differently. Tight clusters of cooperators emerge, and defectors elsewhere in the network are not able to undermine their altruism. “Even if outside our network there are cheaters, we still help each other a lot,” Dr. Nowak said. That is not to say that cooperation always emerges. Dr. Nowak identified the conditions when it can arise with a simple equation: B/C>K. That is, cooperation will emerge if the benefit-to-cost (B/C) ratio of cooperation is greater than the average number of neighbors (K).
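The rule itself is easy to check in code. The little friendship network below is an illustrative assumption, not data from the study:

```python
# Nowak's simple rule for cooperation on networks: cooperation can take
# hold when the benefit-to-cost ratio of helping, B/C, exceeds the
# average number of neighbors, K.

def cooperation_favored(benefit, cost, degrees):
    """degrees maps each individual to their number of neighbors."""
    k = sum(degrees.values()) / len(degrees)  # average degree K
    return benefit / cost > k

# A hypothetical friendship network with average degree K = 2.4.
degrees = {"ann": 2, "bob": 3, "eve": 2, "dan": 3, "kim": 2}

assert cooperation_favored(benefit=5, cost=1, degrees=degrees)      # 5.0 > 2.4
assert not cooperation_favored(benefit=2, cost=1, degrees=degrees)  # 2.0 < 2.4
```

The intuition matches the clustering result: the fewer neighbors each player has, the easier it is for a tight cluster of cooperators to keep the benefits of helping among its own.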

“It’s the simplest possible thing you could have expected, and it’s completely amazing,” he said.

Another boost for cooperation comes from reputations. When we decide whether to cooperate, we don’t just rely on our past experiences with that particular person. People can gain reputations that precede them. Dr. Nowak and his colleagues pioneered a version of the Prisoner’s Dilemma in which players acquire reputations. They found that if reputations spread quickly enough, they could increase the chances of cooperation taking hold. Players were less likely to be fooled by defectors and more likely to benefit from cooperation.

In experiments conducted by other scientists with people and animals, Dr. Nowak’s mathematical models seem to fit. Reputation has a powerful effect on how people play games. People who gain a reputation for not cooperating tend to be shunned or punished by other players. Cooperative players get rewarded.

“You help because you know it gives you a reputation of a helpful person, who will be helped,” Dr. Nowak said. “You also look at others and help them according to whether they have helped.”

The subject of human cooperation is important not just to mathematical biologists like Dr. Nowak, but to many people involved in the current debate over religion and science. Some claim that it is unlikely that evolution could have produced humans’ sense of morality, the altruism of heroes and saints. “Selfless altruism presents a major challenge for the evolutionist,” Dr. Francis S. Collins, the director of the National Human Genome Research Institute, wrote in his 2006 book, “The Language of God.”

Dr. Nowak believes evolutionary biologists should study average behavior rather than a few extreme cases of altruism. “Saintly behavior is unfortunately not the norm,” Dr. Nowak said. “The current theory can certainly explain a population where some people act extremely altruistically.” That does not make Dr. Nowak an atheist, however. “Evolution describes the fundamental laws of nature according to which God chose to unfold life,” he declared in March in a lecture titled “Evolution and Christianity” at the Harvard Divinity School. Dr. Nowak is collaborating with theologians there on a project called “The Evolution and Theology of Cooperation,” to help theologians address evolutionary biology in their own work.

Dr. Nowak sometimes finds his scientific colleagues astonished when he defends religion. But he believes the astonishment comes from a misunderstanding of the roles of science and religion. “Like mathematics, many theological statements do not need scientific confirmation. Once you have the proof of Fermat’s Last Theorem, it’s not like we have to wait for the scientists to tell us if it’s right. This is it.”

Title: A theory of affluence
Post by: Crafty_Dog on August 07, 2007, 07:02:37 AM
In Dusty Archives, a Theory of Affluence
By NICHOLAS WADE
Published: August 7, 2007

For thousands of years, most people on earth lived in abject poverty, first as hunters and gatherers, then as peasants or laborers. But with the Industrial Revolution, some societies traded this ancient poverty for amazing affluence.

Breaking Out of a Malthusian Trap

Historians and economists have long struggled to understand how this transition occurred and why it took place only in some countries. A scholar who has spent the last 20 years scanning medieval English archives has now emerged with startling answers for both questions.

Gregory Clark, an economic historian at the University of California, Davis, believes that the Industrial Revolution — the surge in economic growth that occurred first in England around 1800 — occurred because of a change in the nature of the human population. The change was one in which people gradually developed the strange new behaviors required to make a modern economy work. The middle-class values of nonviolence, literacy, long working hours and a willingness to save emerged only recently in human history, Dr. Clark argues.

Because they grew more common in the centuries before 1800, whether by cultural transmission or evolutionary adaptation, the English population at last became productive enough to escape from poverty, followed quickly by other countries with the same long agrarian past.

Dr. Clark’s ideas have been circulating in articles and manuscripts for several years and are to be published as a book next month, “A Farewell to Alms” (Princeton University Press). Economic historians have high praise for his thesis, though many disagree with parts of it.

“This is a great book and deserves attention,” said Philip Hoffman, a historian at the California Institute of Technology. He described it as “delightfully provocative” and a “real challenge” to the prevailing school of thought that it is institutions that shape economic history.

Samuel Bowles, an economist who studies cultural evolution at the Santa Fe Institute, said Dr. Clark’s work was “great historical sociology and, unlike the sociology of the past, is informed by modern economic theory.”

The basis of Dr. Clark’s work is his recovery of data from which he can reconstruct many features of the English economy from 1200 to 1800. From this data, he shows, far more clearly than has been possible before, that the economy was locked in a Malthusian trap: each time new technology increased the efficiency of production a little, the population grew, the extra mouths ate up the surplus, and average income fell back to its former level.
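The trap is a feedback loop, and a toy version can be simulated in a few lines: income above subsistence breeds more mouths, and more mouths pull income back down. All parameter values here are illustrative assumptions, not Dr. Clark's data:

```python
# A toy Malthusian economy: output per person drives population growth,
# and population growth dilutes output per person. Numbers are
# illustrative only.

def simulate(tech=1.0, pop=1.0, subsistence=1.0, years=200, tech_bump=0.0):
    for year in range(years):
        income = tech / pop                         # output per person
        pop += 0.05 * pop * (income - subsistence)  # surplus food -> more mouths
        if year == 50:
            tech *= 1 + tech_bump                   # a one-off efficiency gain
    return tech / pop, pop                          # final income, final population

# A 10% technology gain is eaten up: the population grows, and income
# per person settles back to roughly the subsistence level.
income_flat, pop_flat = simulate(tech_bump=0.0)
income_bump, pop_bump = simulate(tech_bump=0.1)
assert abs(income_flat - 1.0) < 0.01 and abs(income_bump - 1.0) < 0.01
assert pop_bump > pop_flat  # the gain became extra people, not extra income
```

Escaping the trap, on this simple model, requires efficiency to grow faster than the population can absorb it, which is exactly what the article says first happened around 1800.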

This income was pitifully low in terms of the amount of wheat it could buy. By 1790, the average person’s consumption in England was still just 2,322 calories a day, with the poor eating a mere 1,508. Living hunter-gatherer societies enjoy diets of 2,300 calories or more.

“Primitive man ate well compared with one of the richest societies in the world in 1800,” Dr. Clark observes.

The tendency of population to grow faster than the food supply, keeping most people at the edge of starvation, was described by Thomas Malthus in a 1798 book, “An Essay on the Principle of Population.” This Malthusian trap, Dr. Clark’s data show, governed the English economy from 1200 until the Industrial Revolution and has in his view probably constrained humankind throughout its existence. The only respite was during disasters like the Black Death, when population plummeted, and for several generations the survivors had more to eat.

Malthus’s book is well known because it gave Darwin the idea of natural selection. Reading of the struggle for existence that Malthus predicted, Darwin wrote in his autobiography, “It at once struck me that under these circumstances favourable variations would tend to be preserved, and unfavourable ones to be destroyed. ... Here then I had at last got a theory by which to work.”

Given that the English economy operated under Malthusian constraints, might it not have responded in some way to the forces of natural selection that Darwin had divined would flourish in such conditions? Dr. Clark started to wonder whether natural selection had indeed changed the nature of the population in some way and, if so, whether this might be the missing explanation for the Industrial Revolution.
==========

The Industrial Revolution, the first escape from the Malthusian trap, occurred when the efficiency of production at last accelerated, growing fast enough to outpace population growth and allow average incomes to rise. Many explanations have been offered for this spurt in efficiency, some economic and some political, but none is fully satisfactory, historians say.

Dr. Clark’s first thought was that the population might have evolved greater resistance to disease. The idea came from Jared Diamond’s book “Guns, Germs and Steel,” which argues that Europeans were able to conquer other nations in part because of their greater immunity to disease.

In support of the disease-resistance idea, cities like London were so filthy and disease ridden that a third of their populations died off every generation, and the losses were restored by immigrants from the countryside. That suggested to Dr. Clark that the surviving population of England might be the descendants of peasants.

A way to test the idea, he realized, was through analysis of ancient wills, which might reveal a connection between wealth and the number of progeny. The wills did that, but in quite the opposite direction to what he had expected.

Generation after generation, the rich had more surviving children than the poor, his research showed. That meant there must have been constant downward social mobility as the poor failed to reproduce themselves and the progeny of the rich took over their occupations. “The modern population of the English is largely descended from the economic upper classes of the Middle Ages,” he concluded.

As the progeny of the rich pervaded all levels of society, Dr. Clark considered, the behaviors that made for wealth could have spread with them. He has documented that several aspects of what might now be called middle-class values changed significantly from the days of hunter-gatherer societies to 1800. Work hours increased, literacy and numeracy rose, and the level of interpersonal violence dropped.

Another significant change in behavior, Dr. Clark argues, was an increase in people’s preference for saving over instant consumption, which he sees reflected in the steady decline in interest rates from 1200 to 1800.

“Thrift, prudence, negotiation and hard work were becoming values for communities that previously had been spendthrift, impulsive, violent and leisure loving,” Dr. Clark writes.

Around 1790, a steady upward trend in production efficiency first emerges in the English economy. It was this significant acceleration in the rate of productivity growth that at last made possible England’s escape from the Malthusian trap and the emergence of the Industrial Revolution.

In the rest of Europe and East Asia, populations had also long been shaped by the Malthusian trap of their stable agrarian economies. Their workforces easily absorbed the new production technologies that appeared first in England.

It is puzzling that the Industrial Revolution did not occur first in the much larger populations of China or Japan. Dr. Clark has found data showing that their richer classes, the Samurai in Japan and the Qing dynasty in China, were surprisingly infertile and so would have failed to generate the downward social mobility that spread production-oriented values in England.

After the Industrial Revolution, the gap in living standards between the richest and the poorest countries started to accelerate, from a wealth disparity of about 4 to 1 in 1800 to more than 50 to 1 today. Just as there is no agreed explanation for the Industrial Revolution, economists cannot account well for the divergence between rich and poor nations or they would have better remedies to offer.

Many commentators point to a failure of political and social institutions as the reason that poor countries remain poor. But the proposed medicine of institutional reform “has failed repeatedly to cure the patient,” Dr. Clark writes. He likens the “cult centers” of the World Bank and International Monetary Fund to prescientific physicians who prescribed bloodletting for ailments they did not understand.

If the Industrial Revolution was caused by changes in people’s behavior, then populations that have not had time to adapt to the Malthusian constraints of agrarian economies will not be able to achieve the same production efficiencies, his thesis implies.

=================

Dr. Clark says the middle-class values needed for productivity could have been transmitted either culturally or genetically. But in some passages, he seems to lean toward evolution as the explanation. “Through the long agrarian passage leading up to the Industrial Revolution, man was becoming biologically more adapted to the modern economic world,” he writes. And, “The triumph of capitalism in the modern world thus may lie as much in our genes as in ideology or rationality.”

What was being inherited, in his view, was not greater intelligence — being a hunter in a foraging society requires considerably greater skill than the repetitive actions of an agricultural laborer. Rather, it was “a repertoire of skills and dispositions that were very different from those of the pre-agrarian world.”

Reaction to Dr. Clark’s thesis from other economic historians seems largely favorable, although few agree with all of it, and many are skeptical of the most novel part, his suggestion that evolutionary change is a factor to be considered in history.

Historians used to accept changes in people’s behavior as an explanation for economic events, like Max Weber’s thesis linking the rise of capitalism with Protestantism. But most have now swung to the economists’ view that all people are alike and will respond in the same way to the same incentives. Hence they seek to explain events like the Industrial Revolution in terms of changes in institutions, not people.

Dr. Clark’s view is that institutions and incentives have been much the same all along and explain very little, which is why there is so little agreement on the causes of the Industrial Revolution. In saying the answer lies in people’s behavior, he is asking his fellow economic historians to revert to a type of explanation they had mostly abandoned and in addition is evoking an idea that historians seldom consider as an explanatory variable, that of evolution.

Most historians have assumed that evolutionary change is too gradual to have affected human populations in the historical period. But geneticists, with information from the human genome now at their disposal, have begun to detect ever more recent instances of human evolutionary change like the spread of lactose tolerance in cattle-raising people of northern Europe just 5,000 years ago. A study in the current American Journal of Human Genetics finds evidence of natural selection at work in the population of Puerto Rico since 1513. So historians are likely to be more enthusiastic about the medieval economic data and elaborate time series that Dr. Clark has reconstructed than about his suggestion that people adapted to the Malthusian constraints of an agrarian society.

“He deserves kudos for assembling all this data,” said Dr. Hoffman, the Caltech historian, “but I don’t agree with his underlying argument.”

The decline in English interest rates, for example, could have been caused by the state’s providing better domestic security and enforcing property rights, Dr. Hoffman said, not by a change in people’s willingness to save, as Dr. Clark asserts.

The natural-selection part of Dr. Clark’s argument “is significantly weaker, and maybe just not necessary, if you can trace the changes in the institutions,” said Kenneth L. Pomeranz, a historian at the University of California, Irvine. In a recent book, “The Great Divergence,” Dr. Pomeranz argues that tapping new sources of energy like coal and bringing new land into cultivation, as in the North American colonies, were the productivity advances that pushed the old agrarian economies out of their Malthusian constraints.

Robert P. Brenner, a historian at the University of California, Los Angeles, said although there was no satisfactory explanation at present for why economic growth took off in Europe around 1800, he believed that institutional explanations would provide the answer and that Dr. Clark’s idea of genes for capitalist behavior was “quite a speculative leap.”

Dr. Bowles, the Santa Fe economist, said he was “not averse to the idea” that genetic transmission of capitalist values is important, but that the evidence for it was not yet there. “It’s just that we don’t have any idea what it is, and everything we look at ends up being awfully small,” he said. Tests of most social behaviors show they are very weakly heritable.

He also took issue with Dr. Clark’s suggestion that the unwillingness to postpone consumption, called time preference by economists, had changed in people over the centuries. “If I were as poor as the people who take out payday loans, I might also have a high time preference,” he said.

Dr. Clark said he set out to write his book 12 years ago on discovering that his undergraduates knew nothing about the history of Europe. His colleagues have been surprised by its conclusions but also interested in them, he said.

“The actual data underlying this stuff is hard to dispute,” Dr. Clark said. “When people see the logic, they say ‘I don’t necessarily believe it, but it’s hard to dismiss.’ ”

NY TIMES
Title: Re: Evolutionary biology/psychology
Post by: Crafty_Dog on August 28, 2007, 05:56:23 AM
A good discussion of human evolutionary biology/psychology of aggression:

http://www.gresham.ac.uk/event.asp?PageId=108&EventId=291
Title: Migrating Genomes?
Post by: Body-by-Guinness on August 31, 2007, 12:17:03 PM
August 27, 2007

One Species' Genome Discovered Inside Another's

Bacterial to Animal Gene Transfers Now Shown to be Widespread, with Implications for Evolution and Control of Diseases and Pests
Scientists at the University of Rochester and the J. Craig Venter Institute have discovered a copy of the genome of a bacterial parasite residing inside the genome of its host species.

The research, reported in today's Science, also shows that lateral gene transfer—the movement of genes between unrelated species—may happen much more frequently between bacteria and multicellular organisms than scientists previously believed, posing dramatic implications for evolution.

Such large-scale heritable gene transfers may allow species to acquire new genes and functions extremely quickly, says Jack Werren, a principal investigator of the study. If such genes provide new abilities in species that cause or transmit disease, they could provide new targets for fighting these diseases.

The results also have serious repercussions for genome-sequencing projects. Bacterial DNA is routinely discarded when scientists are assembling invertebrate genomes, yet these genes may very well be part of the organism's genome, and might even be responsible for functioning traits.

"This study establishes the widespread occurrence and high frequency of a process that we would have dismissed as science fiction until just a few years ago," says W. Ford Doolittle, Canada Research Chair in Comparative Microbial Genomics at Dalhousie University, who is not connected to the study. "This is stunning evidence for increased frequency of gene transfer."

"It didn't seem possible at first," says Werren, professor of biology at the University of Rochester and a world-leading authority on the parasite, called wolbachia. "This parasite has implanted itself inside the cells of 70 percent of the world's invertebrates, coevolving with them. And now, we've found at least one species where the parasite's entire or nearly entire genome has been absorbed and integrated into the host's. The host's genes actually hold the coding information for a completely separate species."

Wolbachia may be the most prolific parasite in the world—a "pandemic," as Werren calls it. The bacterium invades a member of a species, most often an insect, and eventually makes its way into the host's eggs or sperm. Once there, the wolbachia is ensured passage to the next generation of its host, and any genetic exchanges between it and the host also are much more likely to be passed on.

Since wolbachia typically live within the reproductive organs of their hosts, Werren reasoned that gene exchanges between the two would frequently pass on to subsequent generations. Based on this and an earlier discovery of a wolbachia gene in a beetle by the Fukatsu team at the University of Tokyo, Japan, the researchers in Werren's lab and collaborators at J. Craig Venter Institute (JCVI) decided to systematically screen invertebrates. Julie Dunning-Hotopp at JCVI found evidence that some of the wolbachia genes seemed to be fused to the genes of the fruitfly, Drosophila ananassae, as if they were part of the same genome.

Michael Clark, a research associate at Rochester, then brought a colony of ananassae into Werren's lab to look into the mystery. To isolate the fly's genome from the parasite's, Clark fed the flies a simple antibiotic, killing the wolbachia. To confirm the ananassae flies were indeed cured of the wolbachia, Clark tested a few samples of DNA for the presence of several wolbachia genes.

To his dismay, he found them.

"For several months, I thought I was just failing," says Clark. "I kept administering antibiotics, but every single wolbachia gene I tested for was still there. I started thinking maybe the strain had grown antibiotic resistance. After months of this I finally went back and looked at the tissue again, and there was no wolbachia there at all."

Clark had cured the fly of the parasite, but a copy of the parasite's genome was still present in the fly's genome. Clark was able to see that wolbachia genes were present on the second chromosome of the insect.

Clark confirmed that the wolbachia genes are inherited like "normal" insect genes in the chromosomes, and Dunning-Hotopp showed that some of the genes are "transcribed" in uninfected flies, meaning that copies of the gene sequence are made in cells that could be used to make wolbachia proteins.

Werren doesn't believe that the wolbachia "intentionally" insert their genes into the hosts. Rather, it is a consequence of cells routinely repairing their damaged DNA. As cells go about their regular business, they can accidentally absorb bits of DNA into their nuclei, often sewing those foreign genes into their own DNA. But integrating an entire genome was definitely an unexpected find.

Werren and Clark are now looking further into the huge insert found in the fruitfly, and whether it is providing a benefit. "The chance that a chunk of DNA of this magnitude is totally neutral, I think, is pretty small, so the implication is that it has imparted some selective advantage to the host," says Werren. "The question is, are these foreign genes providing new functions for the host? This is something we need to figure out."

Evolutionary biologists will certainly take note of this discovery, but scientists conducting genome-sequencing projects around the world also may have to readjust their thinking.

Before this study, geneticists knew of examples where genes from a parasite had crossed into the host, but such an event was considered a rare anomaly except in very simple organisms. Bacterial DNA is very conspicuous in its structure, so if scientists sequencing a nematode genome, for example, come across bacterial DNA, they would likely discard it, reasonably assuming that it was merely contamination—perhaps a bit of bacteria in the gut of the animal, or on its skin.

But those genes may not be contamination. They may very well be in the host's own genome. This is exactly what happened with the original sequencing of the genome of the ananassae fruitfly—the huge wolbachia insert was discarded from the final assembly, despite the fact that it is part of the fly's genome.

In the early days of the Human Genome Project, some studies appeared to show bacterial DNA residing in our own genome, but those were indeed shown to be caused by contamination. Wolbachia is not known to infect any vertebrates such as humans.

"Such transfers have happened before in the distant past," notes Werren. "In our very own cells and those of nearly all plants and animals are mitochondria, special structures responsible for generating most of our cells' supply of chemical energy. These were once bacteria that lived inside cells, much like wolbachia does today. Mitochondria still retain their own, albeit tiny, DNA, and most of the genes moved into the nucleus in the very distant past. Like wolbachia, they have passively exchanged DNA with their host cells. It's possible wolbachia may follow in the path of mitochondria, eventually becoming a necessary and useful part of a cell.

"In a way, wolbachia could be the next mitochondria," says Werren. "A hundred million years from now, everyone may have a wolbachia organelle."

"Well, not us," he laughs. "We'll be long gone, but wolbachia will still be around."

This research was funded by the National Science Foundation's Frontiers in Integrative Biological Research program, which supports large, integrative projects addressing major questions in biology.
Title: Re: Evolutionary biology/psychology
Post by: Crafty_Dog on August 31, 2007, 04:36:36 PM
That is fascinating!
Title: Baboons
Post by: Crafty_Dog on October 09, 2007, 05:33:28 AM
NY Times
By NICHOLAS WADE
Published: October 9, 2007
Royal is a cantankerous old male baboon whose troop of some 80 members lives in the Moremi Game Reserve in Botswana. A perplexing event is about to disturb his day.

From the bushes to his right, he hears a staccato whoop, the distinctive call that female baboons always make after mating. He recognizes the voice as that of Jackalberry, the current consort of Cassius, a male who outranks Royal in the strict hierarchy of male baboons. No hope of sex today.

But then, surprisingly, he hears Cassius’s signature greeting grunt to his left. His puzzlement is plain on the video made of his reaction. You can almost see the wheels turn slowly in his head:

“Jackalberry here, but Cassius over there. Hmm, Jackalberry must be hooking up with someone else. But that means Cassius has left her unguarded. Say what — this is my big chance!”

The video shows him loping off in the direction of Jackalberry’s whoop. But all that he will find is the loudspeaker from which researchers have played Jackalberry’s recorded call.

The purpose of the experiment is not to ruin Royal’s day but to understand what goes on in a baboon’s mind, in this case how carefully the animals keep track of transient relationships.

Dorothy Cheney and Robert Seyfarth, a husband-and-wife team of biologists at the University of Pennsylvania, have spent 14 years observing the Moremi baboons. Through ingenious playback experiments performed by themselves and colleagues, the researchers say they have worked out many aspects of what baboons use their minds for, along with their limitations.

Reading a baboon’s mind affords an excellent grasp of the dynamics of baboon society. But more than that, it bears on the evolution of the human mind and the nature of human existence. As Darwin jotted down in a notebook of 1838, “He who understands baboon would do more towards metaphysics than Locke.”

Dr. Cheney and Dr. Seyfarth are well known for a 1990 book on vervet monkeys, “How Monkeys See the World,” in which they showed how much about the animals’ mental processes could be deduced from careful experiments.

When a baby vervet’s call is played to three females, for instance, the mother looks to the source of the sound. The two others look to the mother, evidence that vervets know whose baby is whose.

An experiment like this — recording the sounds, waiting until the animals are in the right place and performing numerous controls — can take months to complete, but the results are widely admired by other biologists. “Any work of Dorothy and Robert’s is going to be as good as you get in the field,” said Robert M. Sapolsky, a Stanford biologist and an author who has studied baboons in the wild for many years.

“There is no one else in the area of animal behavior who does such incredibly interesting experiments in the field,” said Marc Hauser, a biologist at Harvard who was their first student.

Dr. Cheney and Dr. Seyfarth have summed up their new cycle of research in a book titled, after Darwin’s comment, “Baboon Metaphysics.” Their conclusion, based on many painstaking experiments, is that baboons’ minds are specialized for social interaction, for understanding the structure of their complex society and for navigating their way within it.

The shaper of a baboon’s mind is natural selection. Those with the best social skills leave the most offspring.

“Monkey society is governed by the same two general rules that governed the behavior of women in so many 19th-century novels,” Dr. Cheney and Dr. Seyfarth write. “Stay loyal to your relatives (though perhaps at a distance, if they are an impediment), but also try to ingratiate yourself with the members of high-ranking families.”

Baboon society revolves around mother-daughter lines of descent. Eight or nine matrilines are in a troop, each with a rank order. This hierarchy can remain stable for generations.

By contrast, the male hierarchy, which consists mostly of baboons born in other troops, is always changing as males fight among themselves and with new arrivals.

Rank among female baboons is hereditary, with a daughter assuming her mother’s rank.

News of that fact gave great satisfaction to a member of the British royal family, Princess Michael of Kent. She visited Dr. Cheney and Dr. Seyfarth in Botswana, remarking to them, they report: “I always knew that when people who aren’t like us claim that hereditary rank is not part of human nature, they must be wrong. Now you’ve given me evolutionary proof!”

=====

Baboons live with danger on every side. Many fall prey to lions, leopards, pythons and the crocodiles that in the wet season stalk the fords where baboons cross from one island to another. Baboon watchers are subject to the same hazards. Dr. Cheney and Dr. Seyfarth say their rules are not to work alone or to wade into water deeper than knee high. They often find themselves sitting in a tree with baboons waiting out a lion below. But going into New York is more petrifying, they contend, than dodging Botswana’s predators.

The baboons will bark to warn of lions and leopards, but pay no attention to some other species dangerous to humans like buffalo and elephant. On two occasions, baboons have attacked animals, a leopard and a honey badger, that threatened their human companions. “We haven’t lost any post-docs,” Dr. Seyfarth said.

For female baboons, another constant worry besides predation is infanticide. Their babies are put in peril at each of the frequent upheavals in the male hierarchy. The reason is that new alpha males enjoy brief reigns, seven to eight months on average, and find at first that the droits de seigneur they had anticipated are distinctly unpromising. Most of the females are not sexually receptive because they are pregnant or nurturing unweaned children.

An unpleasant fact of baboon life is that the alpha male can make mothers re-enter their reproductive cycles, and boost his prospects of fatherhood, by killing their infants. The mothers can secure some protection for their babies by forming close bonds with other females and with male friends, particularly those who were alpha when their children were conceived and who may be the father. Still, more than half of all deaths among baby baboons are from infanticide.

So important are these social skills that it is females with the best social networks, not those most senior in the hierarchy, who leave the most offspring.

Although the baboon and human lines of descent split apart some 30 million years ago, the species have much in common. Both are primates whose ancestors came down from the trees and learned to survive on the ground in large social groups. The baboon mind may therefore shed considerable light on the early stages of the evolution of the human mind.

In some of their playback experiments, Dr. Cheney and Dr. Seyfarth have tested baboons’ knowledge of where everyone stands in the hierarchy. In a typical interaction, a dominant baboon gives a threat grunt, and its inferior screams. From their library of recorded baboon sounds, the researchers can fabricate a sequence in which an inferior baboon’s threat grunt is followed by a superior’s scream.

Baboons pay little attention when a normal interaction is played to them but show surprise when they hear the fabricated sequence implying their social world has been turned upside down.

This simple reaction says a lot about what is going on in the baboon’s mind. That the animal can construe “A dominates B,” and distinguish it from “B dominates A,” means it must be able to break a stream of sounds down into separate elements, recognize the meaning of each, and combine the meanings into a sentence-like thought.

“That’s what we do when we parse a sentence,” Dr. Seyfarth said. Human language seems unique because no other species is capable of anything like speech. But when it comes to perceiving and deconstructing sounds, as opposed to making them, baboons’ ability seems much more language-like.

Assuming that early humans inherited the same ability from their joint ancestor with baboons, then when humans first started to combine sounds in the beginning of spoken language, “their listeners were all ready to perceive them,” Dr. Seyfarth said.

Baboons may be good at perceiving and thinking in a combinative way, but their vocal output consists of single sounds that are never combined, like greeting grunts, the females’ sexual whoop and the males’ competitive “wahoo!” cry. Why did language, expressed in combinations of sounds, evolve in humans but not in baboons?

A possible key to the puzzle lies in what animal psychologists call theory of mind, the ability to infer what another animal does or does not know. Baboons seem to have a very feeble theory of mind. When they cross from one island to another, ever fearful of crocodiles, the adults will often go first, leaving the juveniles fretting at the water’s edge. However much the young baboons call, their mothers never come back to help, as if unable to divine their children’s predicament.

But people have a very strong ability to recognize the mental states of others, and this could have prompted a desire to communicate that drove the evolution of language. “If I know you don’t know something, I am highly motivated to communicate it,” Dr. Seyfarth said.

It is far from clear why humans acquired a strong theory of mind faculty and baboons did not. Another difference between the two species is brain size. Some biologists have suggested that the demands of social living were the evolutionary pressure that enhanced the size of the brain. But the largest brains occur in chimpanzees and humans, who live in smaller groups than baboons.

But both chimps and humans use tools. Possibly social life drove the evolution of the primate brain to a certain point, and the stimulus of tool use then took over. Use of tools would have spurred communication, as the owner of a tool explained to others how to use it. But that requires a theory of mind, and Dr. Cheney and Dr. Seyfarth are skeptical of claims that chimpanzees have a theory of mind, in part because the experiments supporting that position have been conducted on captive chimps. “It’s bewildering to us that none of the people who study ape cognition have been motivated to study wild chimpanzees,” Dr. Cheney said.

“Baboons provide you with an example of what sort of social and cognitive complexity is possible in the absence of language and a theory of mind,” she said. “The selective forces that gave rise to our large brains and our full-blown theory of mind remain mysterious, at least to us.”
Title: Re: Evolutionary biology/psychology
Post by: Crafty_Dog on November 20, 2007, 04:08:57 AM
http://www.msnbc.msn.com/id/21882948/?gt1=10547

Despite flash, males are simple creatures

Females evolve slower, but it's because they're more complex


By Jeanna Bryner
updated 11:06 a.m. ET, Mon., Nov. 19, 2007

The secret to why male organisms evolve faster than their female counterparts comes down to this: Males are simple creatures.

In nearly all species, males seem to ramp up glitzier garbs, more graceful dance moves and more melodic warbles in a never-ending bid to woo the best mates. Called sexual selection, the result is typically a showy male and a plain-Jane female. Evolution speeds along in the males compared to females.

The idea that males evolve more quickly than females has been around since 19th century biologist Charles Darwin observed the majesty of a peacock’s tail feather in comparison with those of the drab peahen.

How and why males exist in evolutionary overdrive despite carrying essentially the same genes as females has long puzzled scientists.

New research on fruit flies, detailed online last week in the journal Proceedings of the National Academy of Sciences, finds males have fewer genetic obstacles to prevent them from responding quickly to selection pressures in their environments.

"It’s because males are simpler," said lead author Marta Wayne, a zoologist at the University of Florida in Gainesville. "The mode of inheritance in males involves simpler genetic architecture that does not include as many interactions between genes as could be involved in female inheritance."

The finding could also shed light on why diseases show up differently in men and women.

Complicated chromosomes
Wayne and her colleagues examined more than 8,500 genes shared by both sexes of the fruit fly Drosophila melanogaster. Of those genes, about 7,600 are expressed differently, doing different jobs in males and females.

The flies were identical genetically, except for their sex chromosomes.

In flies and humans, thousands of genes made up of DNA are packaged into tiny units called chromosomes. In humans, each parent contributes one set of 23 chromosomes to offspring, resulting in little ones with 23 father-given chromosomes and 23 mother-given chromosomes — 46 total; fruit flies carry just four pairs. One of these pairs is the sex chromosomes: females have two X chromosomes (XX) and males an X and a Y (XY).

Many genes are found on the X chromosome, whereas few are associated with the Y chromosome. For female fruit flies, the X-chromosome genes can come in two flavors called alleles that not only interact with each other but also with other genes.

For instance, if one allele is dominant over the other, that allele would get "expressed" while the recessive allele would stay hidden. Though under cover, the recessive allele kind of hitches a ride on the X chromosome and can be passed on to future generations.

That's not the case with males.

"We find direct evidence that the expression of the genes on the X has this covering behavior in females whereas in males they're out in the open," said study team member Lauren McIntyre, also of UF.

Males only have one X chromosome, so what you see is what you get. If that particular gene gives the male a boost in terms of sexual selection, say a gene responsible for fluffier feathers, the gene would be selected for in the game of natural selection over successive generations. But if the gene is no good for males, it would get selected against over time.

"Having one X means your genes are more open to selection in males," UF researcher Marina Telonis-Scott said in a telephone interview. "So in a female if you have a recessive allele that confers a sickness, it can be concealed within the two X's but if you've only got one, such as the male, you're more open to selection."

And the reason males are genetic simpletons, it turns out, is sex. The researchers suggest this uncomplicated (compared with females) genetic pathway allows males to respond at the drop of a hat to the pressures of sexual selection. That way they can win females, produce more offspring and start the cycle over again.
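The "covering" effect described above can be put in rough numbers. The following is an illustrative sketch only — the allele frequency and the framing are my assumptions, not figures from the PNAS study — showing why a recessive X-linked allele is far more visible to selection in males (one X: what you see is what you get) than in females, where heterozygotes conceal it:

```python
# Toy calculation: among individuals carrying a recessive X-linked
# allele at frequency q, what fraction actually express it (and so
# can be "seen" by selection)?

def male_exposure(q):
    # Males are hemizygous (one X), so every male carrier expresses
    # the allele, whatever its frequency.
    return 1.0

def female_exposure(q):
    # Among female carriers, only the q*q homozygotes express the
    # allele; the 2*q*(1-q) heterozygotes hide it behind the
    # dominant copy on their other X.
    hom = q * q
    het = 2 * q * (1 - q)
    return hom / (hom + het)

q = 0.1  # hypothetical frequency of a recessive allele
print(f"carrier males expressing the allele:   {male_exposure(q):.0%}")
print(f"carrier females expressing the allele: {female_exposure(q):.0%}")
```

With a rare allele the asymmetry is stark: every carrier male expresses it, while only about one carrier female in twenty does, which is the sense in which selection can act on males "at the drop of a hat."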

While not as prominent a trend, they also found a similar pattern in so-called autosomal genes, which are those found on any chromosome save the sex chromosomes. Many of the fruit-fly autosomal genes, however, did work in concert with genes located on the X chromosome.

Human implications
The "elephant lurking in these results," of course, is how they would apply to men and women.

The researchers caution the results don't directly translate to humans. "The X function is thought to be quite different in flies than humans," McIntyre told LiveScience. In humans, one of the X chromosomes gets inactivated in females, though research is finding this inactivation isn't always absolute.

However, the results could help explain differences in symptoms and responses to diseases in men and women, the authors say. Sexual selection does occur in humans, they note. In addition, fruit flies and humans share an evolutionary history, the authors point out, which is the reason why we share more than 65 percent of our genes with the tiny insects.

"If we see a mechanism in flies it may also be true in everything that shares that evolutionary history," McIntyre said.

On a basic level, the genetic machinery works in a similar manner in flies and us.

"There's a health aspect in figuring out differences in gene expression between the sexes," Wayne said. "To make a male or a female, even in a fly, it's all about turning things on — either in different places or different amounts or at different times — because we all basically have the same starting set of genes."
Title: Re: Evolutionary biology/psychology
Post by: Crafty_Dog on January 22, 2008, 07:16:22 AM
NY Times

As the candidates have shown us in the succulent telenovela that is the 2008 presidential race, there are many ways to parry for political power. You can go tough and steely in an orange hunter’s jacket, or touchy-feely with a Kleenex packet. You can ally yourself with an alpha male like Chuck Norris, befriend an alpha female like Oprah Winfrey or split the difference and campaign with your mother. You can seek the measured endorsement of the town elders or the restless energy of the young, showily handle strange infants or furtively slam your opponents.

Just as there are myriad strategies open to the human political animal with White House ambitions, so there are a number of nonhuman animals that behave like textbook politicians. Researchers who study highly gregarious and relatively brainy species like rhesus monkeys, baboons, dolphins, sperm whales, elephants and wolves have lately uncovered evidence that the creatures engage in extraordinarily sophisticated forms of politicking, often across large and far-flung social networks.

Male dolphins, for example, organize themselves into at least three nested tiers of friends and accomplices, said Richard C. Connor of the University of Massachusetts at Dartmouth, rather like the way human societies are constructed of small kin groups allied into larger tribes allied into still larger nation-states. The dolphins maintain their alliances through elaborately synchronized twists, leaps and spins like Blue Angel pilots blazing their acrobatic fraternity on high.

Among elephants, it is the females who are the born politicians, cultivating robust and lifelong social ties with at least 100 other elephants, a task made easier by their power to communicate infrasonically across miles of savanna floor. Wolves, it seems, leaven their otherwise strongly hierarchical society with occasional displays of populist umbrage, and if a pack leader proves a too-snappish tyrant, subordinate wolves will collude to overthrow the top cur.

Wherever animals must pool their talents and numbers into cohesive social groups, scientists said, the better to protect against predators, defend or enlarge choice real estate or acquire mates, the stage will be set for the appearance of political skills — the ability to please and placate, manipulate and intimidate, trade favors and scratch backs or, better yet, pluck those backs free of botflies and ticks.

Over time, the demands of a social animal’s social life may come to swamp all other selective pressures in the environment, possibly serving as the dominant spur for the evolution of ever-bigger vote-tracking brains. And though we humans may vaguely disapprove of our political impulses and harbor “Fountainhead” fantasies of pulling free in full glory from the nattering tribe, in fact for us and other highly social species there is no turning back. A lone wolf is a weak wolf, a failure, with no chance it will thrive.

Dario Maestripieri, a primatologist at the University of Chicago, has observed a similar dilemma in humans and the rhesus monkeys he studies.

“The paradox of a highly social species like rhesus monkeys and humans is that our complex sociality is the reason for our success, but it’s also the source of our greatest troubles,” he said. “Throughout human history, you see that the worst problems for people almost always come from other people, and it’s the same for the monkeys. You can put them anywhere, but their main problem is always going to be other rhesus monkeys.”

As Dr. Maestripieri sees it, rhesus monkeys embody the concept “Machiavellian” (and he accordingly named his recent popular book about the macaques “Macachiavellian Intelligence”).

“Individuals don’t fight for food, space or resources,” Dr. Maestripieri explained. “They fight for power.” With power and status, he added, “they’ll have control over everything else.”

Rhesus monkeys, midsize omnivores with ruddy brown fur, long bearded faces and disturbingly humanlike ears, are found throughout Asia, including in many cities, where they, like everybody else, enjoy harassing the tourists. The monkeys typically live in groups of 30 or so, a majority of them genetically related females and their dependent offspring.

A female monkey’s status is usually determined by her mother’s status. Male adults, as the ones who enter the group from the outside, must establish their social positions from scratch — scratch, bite, baring of canines and, most importantly, rallying their bases.
“Fighting is never something that occurs between two individuals,” Dr. Maestripieri said. “Others get involved all the time, and your chances of success depend on how many allies you have, how wide is your network of support.”

Monkeys cultivate relationships by sitting close to their friends, grooming them at every possible opportunity and going to their aid — at least, when the photo op is right. “Rhesus males are quintessential opportunists,” Dr. Maestripieri said. “They pretend they’re helping others, but they only help adults, not infants. They only help those who are higher in rank than they are, not lower. They intervene in fights where they know they’re going to win anyway and where the risk of being injured is small.”

In sum, he said, “they try to gain maximal benefits at minimal cost, and that’s a strategy that seems to work” in advancing status.

Not all male primates pursue power by appealing to the gents. Among olive baboons, for example, a young male adult who has left his natal home and seeks to be elected into a new baboon group begins by making friendly overtures toward a resident female who is not in estrous at the moment and hence not being contested by other males of the troop.

“If the male is successful in forming a friendship with a female, that gives him an opening with her relatives and allows him to work his way into the whole female network,” said Barbara Smuts, a biologist at the University of Michigan. “In olive baboons, friendships with females can be much more important than political alliances with other males.”

Because males are often the so-called dispersing sex, while females stay behind in the support network of their female kin, females form the political backbone among many social mammals; the longer-lived the species, the denser and more richly articulated that backbone is likely to be.

With life spans rivaling ours, elephants are proving to possess some of the most elaborate social networks yet observed, and their memories for far-flung friends and relations are well in line with the species’ reputation. Elephant society is organized as a matriarchy, said George Wittemyer, an elephant expert at the University of California, Berkeley, with a given core group of maybe 10 elephants led by the eldest resident female. That core group is together virtually all the time, traveling over considerable distances, stopping to dig water holes, looking for fresh foliage to uproot and devour.

“They’re constantly making decisions, debating among themselves, over food, water and security,” Dr. Wittemyer said. “You can see it in the field. You can hear them vocally disagree.” Typically, the matriarch has the final say, and the others abide by her decision. If a faction disagrees strongly enough and wants to try a different approach, “the group will split up and meet back again later,” said Dr. Wittemyer.

Age has its privileges, he said, and the older females, even if they are not the biggest, will often get the best spots to sleep and the best food to eat. But it also has its responsibilities, and a matriarch is often the one to lead the charge in the face of conflicts with other elephants or predatory threats, sometimes to lethal effect.

Hal Whitehead of Dalhousie University and his colleagues have found surprising parallels between the elephant and another mammoth mammal, the sperm whale, possessor of the largest brain, in absolute terms, that the world has ever known. As with elephants, sperm whale society is sexually segregated, the females clustering in oceanic neighborhoods 40 degrees north or south of the Equator, and the males preferring waters around the poles.

As with elephants, the core social unit is a clan of some 10 or 12 females and their offspring. Sperm whales also are highly vocal. They communicate with one another using a Morse code-like pattern of clicks. Each clan, Dr. Whitehead said, has a distinctive click dialect that the members use to identify one another and that adults pass to the young. In other words, he said, “It looks like they have a form of culture.”

Nobody knows what the whales may have to click and clack about, but it could be a form of voting — time to stop here and synchronously dive down in search of deep water squid, now time to resurface, move on, dive again. Clans also seem to caucus on which males they like and will mate with more or less as a group and which ones they will collectively spurn. By all appearances, female sperm whales are terrible size queens. Over the generations, they have consistently voted in favor of enhanced male mass. Their dream candidate nowadays is some fellow named Moby, and he’s three times their size.


Title: The Texas Cult
Post by: Crafty_Dog on May 21, 2008, 04:53:22 AM
Civilization and the Texas Cult
By LIONEL TIGER
May 21, 2008; Page A17

The desperate tragedy involving polygamous cultists in Texas has attracted a growing phalanx of lawyers, judges, law enforcers and assorted psychologists.

Those responsible for coping with this astonishing disaster would be well-advised to add a primatologist to the team. The fact is that, despite all the blather about faith and freedom of religion, the men operating the various compounds in question are behaving in virtually the same manner as countless dominant males in countless primate troops observed over the years.

The essence of the case is that the men who control the politics of the group (as well as the hapless women and children who live there) have used junk theology about heaven, hell, paradise and salvation to maintain their unquestioned access to all females of reproductive age (or younger).

That's the reproductive fantasy of any adult male primate.

In this blow to simple decency, the Texas polygamists are not pathfinders. Multiple wives are of course permitted in the Islamic religion, and co-wives are a feature of dozens of human groups in which powerful men control sufficient resources to be able to support more than one woman.

This is usually because the societies in which they live are sharply unequal. Sex and offspring flow to those with resources.

One of the triumphs of Western arrangements is the institution of monogamy, which has in principle made it possible for each male and female to enjoy a plausible shot at the reproductive outcome which all the apparatus of nature demands. Even Karl Marx did not fully appreciate the immense radicalism of this form of equity.

The Texans' faith-flaunting is morally disgraceful and crudely cynical. It also raises bewildering questions about human gullibility on one hand and the efficacy of the Big Lie on the other.

Can anyone really believe that the notorious communal bed to which senior men command 16-year-old girls is part of some holy temple apparatus? Apparently some people do, and the few escapees from the fetid zoo have testified to the power the ridiculous theory wields.

The victims are not only young women but young men too. They are reproductively and productively disenfranchised, and are in effect forced to leave the communities to become hopeless, ill-schooled misfits in the towns of normal life. No dignified lives as celibate monks with colorful costumes for them.

Again, the issue is cross-cultural. Osama bin Laden has at least five wives, which means that four young men of his tribe have no date on Saturday night and forever. They may become willing jihadists, or desperate suicides eager to soothe their god by killing infidels and Americans.

Elsewhere, preference for sons has meant a sharp shortage of women in China. It is known that raiding parties from there cross into bordering countries with more regular sex ratios to steal women.

The deranged cults have been operating in plain sight for years in Texan communities whose police forces have been earnestly writing parking tickets while ignoring what is obvious major criminality. Some 400 young children have been drastically separated from their mothers – who among other derogations of civil life are allegedly part of longstanding welfare fraud engineered by their sexual tyrants.

And now what? It will be intensely depressing but probably useful to acknowledge this is at bottom a natural matter, a product of our inner behavioral nature. Understanding the shadowy sources of this nightmare may help our community cope with its victims.

Mr. Tiger teaches anthropology at Rutgers and is the author of "The Decline of Males" (St. Martin's, 2000).

Title: Phrenology 2.0?
Post by: Body-by-Guinness on August 20, 2008, 07:47:55 AM
A rounder face 'means men are more aggressive'
By Roger Highfield, Science Editor
Last Updated: 12:01am BST 20/08/2008


Men with round faces tend to be more aggressive, a study of sportsmen has shown.

The male sex hormone testosterone makes faces more circular and now scientists have studied whether this characteristic is also linked to behaviour.

A Canadian team studied 90 ice hockey players and found the rounder the face, the more aggressive the players.

For male varsity and professional hockey players, the facial ratio was linked in a statistically significant way with the number of penalty minutes per game, report Justin Carre and Prof Cheryl McCormick of Brock University, Ontario.

The penalties were incurred by players for violent acts including slashing, elbowing, checking from behind, fighting and so on.

However, there was not a link between facial shape and aggression in women.

"The facial structure of a man provides an indication of how aggressive he will be in a competitive situation," says Prof McCormick.

"Therefore, we are able to predict, with some accuracy, the behaviour of men on the basis of their facial features.

"If men's faces are providing cues as to their potential for aggression, then people are probably picking up on this cue, although likely on a subconscious level."

The findings, published in the Proceedings of the Royal Society B: Biological Sciences, suggest that the shape of the face may have been honed by evolution as a marker of the propensity for aggressive behaviour: ancestors who did not pick up this warning sign could have found out to their cost that they were dealing with a more volatile and violent person.

By one theory, testosterone is responsible for the development of rugged looks, a jutting jaw and brow, a deep voice and other trappings of masculinity, but it also damps down the body's protective immune system, so only high-quality men (that is, those with healthy, 'good' genes) can afford to display these macho characteristics.

But the hormone affects more than appearance and a range of earlier work has shown that testosterone levels affect behaviour, other than aggression.

For example, women's judgements of the extent to which a man was interested in infants based on his face predicted his actual interest in infants: more feminised faces were seen as more trustworthy.

People also show some accuracy at identifying 'cheaters' from their looks in an idealised game of cooperation. "Together, these findings suggest that people can make accurate inferences about others' personality traits and behavioural dispositions based on certain signals conveyed by the face," say the researchers.

However, there is a long and fraught history of attempting to read a personality from the way someone looks.

The Crime Museum at Scotland Yard in central London has more than 30 casts made of the heads of those hanged for murder at Newgate prison during the 19th century to provide evidence to back the then "scientific" theory of phrenology, which said that character and criminality could be determined by the shape of a person's head.

Phrenologists believed that the brain had different "brain organs" which represented a person's personality traits.

These were thought to be proportional to a person's propensities, as reflected by "bumps" in the skull. This work, now written off as pseudoscience, was used to back the idea that some people are "born criminal" and could be identified.

But today's study shows that there may be a bit more to looks than we thought. "Given that people readily make judgements of others based on their looks, and that we have evidence that the face may actually be providing relevant information, it will be fascinating to see if people's judgements of faces are accurate," says Prof McCormick.

"Although we naturally wince a bit at the comparison to phrenology, the comparison is certainly one that has crossed our minds."

http://www.telegraph.co.uk/earth/main.jhtml?xml=/earth/2008/08/20/sciface120.xml

Title: Evolutionary Attention Spans
Post by: Body-by-Guinness on August 25, 2008, 07:45:35 AM
What's Sexier than Public Policy?

Ronald Bailey | August 25, 2008, 10:09am

Sex, of course. That's why newspapers obsess over Sen. Larry Craig's (R-ID) (alleged) public bathroom romances but not his position on the Medicare prescription drug benefit program, which is costing taxpayers billions. And why CNN ran 24/7 coverage of Gov. Eliot Spitzer's (D-NY) high-cost hotel dalliances, but not his serial abuses of prosecutorial discretion.



The Washington Post's Shankar Vedantam's always interesting Department of Human Behavior feature delves into the question of why media tend to focus on sex over policy. Evolutionary psychologists argue that thanks to our evolutionary biology gossip is what interests readers, listeners, and viewers. As Vedantam explains:

[University of Guelph in Ontario psychologist Hank] Davis and other evolutionary psychologists argue that the reason John Edwards's adultery has more zing in our heads than a dry policy dispute that could cost taxpayers billions of dollars is that the human brain evolved in a period where there were significant survival advantages to finding out the secrets of others. Since humans lived in small groups, the things you learned about other people's character could tell you whom to trust when you were in a tight spot.

"We are continuing to navigate through the modern world with a Stone Age mind," Davis said.

In the Pleistocene era, he added, there was no survival value in being able to decipher a health-care initiative, but there was significant value in information about "who needs a favor, who is in a position to offer one, who is trustworthy, who is a liar, who is available sexually, who is under the protection of a jealous partner, who is likely to abandon a family, who poses a threat to us."

We may consciously know that we are no longer living in small hunter-gatherer groups and that it no longer makes sense to evaluate someone like Edwards as we might a friend or intimate partner, but our reptilian brain doesn't realize this. Our prefrontal cortex might reason that a man who cheats on his wife while she is fighting cancer could make a perfectly fine president in a complex world, but the visceral distaste people feel about Edwards stems from there being an ancient part of the human brain that says, "Gee, I don't want to get mixed up with this guy, because even in my hour of greatest need I might not be able to count on him," said Frank T. McAndrew, an evolutionary social psychologist at Knox College in Illinois.

Most Americans, of course, will never have any personal interaction with the people they elect president. Nonetheless, if the evolutionary psychologists are correct, people will tend to choose leaders they can relate to personally -- and reject the leaders with whom they cannot see having a personal relationship.

"The human brain does not have any special module for evaluating welfare policy or immigration policy, but it has modules for evaluating people on the basis of character," said Satoshi Kanazawa, an evolutionary psychologist at the London School of Economics. "That is probably why we have this gut reaction to affairs and marriages and lying. All of those things existed in the ancestral environment 100,000 years ago."

Whole Vedantam feature here.

http://www.reason.com/blog/printer/128249.html
Title: Re: Evolutionary biology/psychology
Post by: Karsk on August 25, 2008, 10:09:21 AM
Regarding the face width = aggression piece above, here is the actual paper:

http://journals.royalsociety.org/content/h80173234257qq01/fulltext.pdf (http://journals.royalsociety.org/content/h80173234257qq01/fulltext.pdf)

I have lots of questions about studies like this, especially when they get reported in the popular press. Popular reiterations of research like this often report on the sensational aspects of it.


The test groups were actually pretty homogeneous (hockey players of the same race for the most part). This is good for the experiment because it controls for such differences. But how well does this correlation hold outside of the test group? Some cultural and racial groups have rounder faces than others. Are there similar findings among those groups?


Correlation does not imply causation.  Just because two things vary with one another does not necessarily mean much.  Here is an explanation of the method used in this study:

"The main result of a correlation is called the correlation coefficient (or "r"). It ranges from -1.0 to +1.0. The closer r is to +1 or -1, the more closely the two variables are related.

If r is close to 0, it means there is no relationship between the variables. If r is positive, it means that as one variable gets larger the other gets larger. If r is negative it means that as one gets larger, the other gets smaller (often called an "inverse" correlation).

While correlation coefficients are normally reported as r = (a value between -1 and +1), squaring them makes them easier to understand. The square of the coefficient (or r square) is equal to the percent of the variation in one variable that is related to the variation in the other. After squaring r, ignore the decimal point. An r of .5 means 25% of the variation is related (.5 squared =.25). An r value of .7 means 49% of the variance is related (.7 squared = .49).

A correlation report can also show a second result of each test - statistical significance. In this case, the significance level will tell you how likely it is that the correlations reported may be due to chance in the form of random sampling error. If you are working with small sample sizes, choose a report format that includes the significance level. This format also reports the sample size.

A key thing to remember when working with correlations is never to assume a correlation means that a change in one variable causes a change in another. Sales of personal computers and athletic shoes have both risen strongly in the last several years and there is a high correlation between them, but you cannot assume that buying computers causes people to buy athletic shoes (or vice versa)."  from  http://www.surveysystem.com/correlation.htm (http://www.surveysystem.com/correlation.htm)
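To make the quoted numbers concrete, here is a minimal sketch in Python of how r and r-squared are computed. The face-ratio and penalty figures below are invented for illustration; they are not the study's data:

```python
import math
from statistics import mean

def pearson_r(xs, ys):
    """Pearson correlation coefficient r between two equal-length samples."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Invented face width-to-height ratios and penalty minutes per game.
ratios = [1.80, 1.85, 1.90, 1.95, 2.00, 2.10]
penalties = [0.5, 1.2, 0.9, 1.6, 1.4, 2.1]

r = pearson_r(ratios, penalties)
r_squared = r ** 2  # share of the variation in one variable related to the other
```

For this toy data r comes out around 0.9, far stronger than the mild correlation the hockey study reports; squaring r, per the quote above, gives the share of related variation.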


Just because someone has a round face does not mean he will be violent. Making that assumption would place a whole bunch of people in a category that is not accurate for them. There are stronger correlations out there that mean more, and even those invite incorrect presumptions.

So they found a mild correlation between face shape and the number of penalties accrued.  So the next question is "is it significant or important in any way?"

Are the guys with the round faces more prone to dastardly deeds or are they the "heroes" of the hockey team?  Lots of times the accruers of penalties are the enforcers.  Hockey by design is a rough game. Maybe these guys are actually the ones doing the job that is expected of them.

The other part of the study tried to predict the number of penalties as a function of a questionnaire that was supposed to indicate "trait dominance":

"Participants completed a 10-item questionnaire assessing trait dominance (International Personality Item Pool scales; Goldberg et al. 2006). Some examples of items include 'Like having authority over others' and 'Want to be in charge'. Responses were scored on a Likert scale ranging from −2 (very inaccurate) to +2 (very accurate), and had high reliability (Cronbach's alpha = 0.82)."
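The "Cronbach's alpha" reliability figure in that passage follows a standard formula: alpha = (k / (k − 1)) × (1 − sum of the item variances / variance of the respondents' total scores). Here is a minimal sketch in Python, with invented Likert responses rather than the study's data:

```python
from statistics import pvariance

def cronbach_alpha(items):
    """Cronbach's alpha; items is a list of k items, each a list of one score per respondent."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]  # each respondent's total score
    item_variance_sum = sum(pvariance(scores) for scores in items)
    return (k / (k - 1)) * (1 - item_variance_sum / pvariance(totals))

# Invented responses on a -2..+2 Likert scale: 3 items, 5 respondents.
items = [
    [2, 1, 0, -1, 2],
    [1, 1, -1, -2, 2],
    [2, 0, 0, -1, 1],
]
alpha = cronbach_alpha(items)
```

An alpha near 1 means the items hang together as a single scale; the 0.82 quoted above is ordinarily taken as high reliability.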

What does it mean that this test gave an insignificant result? Why isn't trait dominance correlated with the number of penalties as well? It's a question the authors raise, and there is no explanation for it.


Anyways,

nothing wrong with the paper. But scientific studies are often portrayed as though they are more than they are by the popular press. I don't see much in this paper that would pique my interest beyond being mildly interesting. It won't make me look askance at my cheerful Polish friend with the gentle disposition and extremely wide face, other than to send him the article and innocently ask him if he has Neanderthal roots as a joke   :-D


Karsk




Title: Re: Evolutionary biology/psychology
Post by: Crafty_Dog on August 26, 2008, 04:51:26 AM
Nice post.
Title: Jews and their DNA,1
Post by: Body-by-Guinness on September 07, 2008, 10:18:44 AM
Jews and Their DNA

Hillel Halkin, from the September 2008 issue

Eight years ago, I published an article in these pages called “Wandering Jews—and Their Genes” (September 2000). At the time I was working on a book about a Tibeto-Burmese ethnic group in the northeast Indian states of Mizoram and Manipur, many of whose members believe that they descend from the biblical tribe of Manasseh, and about a group of Judaizers among them known as the B’nei Menashe, over a thousand of whom live today in Israel as converts to Judaism.

This led me to an interest in Jewish historical genetics, then a new discipline. Historical genetics itself was still a pioneering field, launched by the discovery that two sources of DNA in the human body, the Y chromosome that determines male sex and the mitochondria that aid cell metabolism, never change (barring rare mutations) in their transmission from fathers to sons and from mothers to children of both sexes. This made it possible to trace paternal and maternal lines of descent far into the past and to learn about the movements and interactions of human populations that originated hundreds, thousands, and even tens of thousands of years ago.

In my article, I observed that preliminary studies in Jewish genetics had both “shored up” and “undermined” some conventional ideas about Jewish history. On the one hand, they had indicated that there was a high degree of Y-chromosome similarity among Jewish males from all over the world, coupled with a much lower degree when the comparison was made between Jews and non-Jews in the same region. The one part of the globe in which Jews correlated as highly with many non-Jews as they did with other Jews was the Middle East—precisely what one might expect of a people that claimed to have originated in Palestine (or in Ur of the Chaldees, if you go back to Abraham) and to have spread from it.

Other studies established that the Y chromosomes of kohanim—male Jews said to descend from the priestly caste whose supposed progenitor was the biblical Aaron—had their own unique DNA signature, labeled the Cohen Modal Haplotype. Not only did half of all kohanim, who comprise about four percent of the world’s Jewish population, share this DNA configuration, but minor mutations in it pointed to a common ancestor who lived a few centuries before or after 1000 B.C.E.—that is, close to the period in which Aaron and his brother Moses are situated by biblical chronology.

Such evidence seemed to confirm traditional notions of Jewish origins. It suggested that the Jews, while certainly not a “race,” were indeed, despite the skepticism of many modern historians, the highly endogamous people they had always considered themselves to be, one that had admixed with outsiders relatively little during long centuries of wandering in the Diaspora. It also strengthened the reliability of the Bible as a historical source. Modern critics who contended that the Bible was a late document that imagined a largely non-existent past had always singled out the priestly codes of the Pentateuch as a prime illustration of this. But if the priesthood was really an institution going back to early Israelite history, rather than the backward projection in time of later generations, revisionist Bible criticism itself needed to be revised.

Yet there was contrary evidence, too. Early studies of mitochondrial DNA reported that Jewish women, unlike Jewish men, did not correlate well with one another globally. Furthermore, the greatest demographic mystery of Jewish history—that of the origins of the Ashkenazi population of Central and Eastern Europe—had only appeared to deepen.

The standard Jewish version of these origins was that Ashkenazi Jewry had first crystallized in the late first millennium of the Christian era in the French-German borderland along the Rhine; that it had reached the Rhineland from southern France, to which it had come in earlier centuries either directly from Palestine or via Italy and Spain; and that it had then migrated eastward and northward into Central and Eastern Europe.

Even before the advent of historical genetics, however, this account had been challenged. There were linguists who argued that East European Yiddish, the Germanic language of most Ashkenazi Jews, had more in common with the dialects of southern and southeastern Germany than with those of the Rhineland in the west. There were demographers who contended that the Jewish population of the Rhineland prior to the appearance of East European Jewry, which would eventually become the world’s largest Jewish community, was too small to account for the latter’s rapid growth.

The early genetic findings appeared to support the challengers. If the Rhineland theory was correct, Ashkenazi DNA should have had greater affinities with non-Jewish DNA from northern France and western Germany than with non-Jewish DNA from elsewhere; no one denied, after all, that wherever and whenever Jews had lived, some Gentiles must have joined them or begotten children with them. Yet there was no sign of this. Where, then, had Ashkenazi Jewry come from?

It was a mixed picture. Since then, eight years have gone by, historical genetics has greatly refined its methods and taxonomy, and several major new studies in Jewish genetic history have been published. What, viewed from their perspective, does Jewish history look like now?

_____________

 

Two new books address this question. One, David B. Goldstein’s Jacob’s Legacy: A Genetic View of Jewish History, is the work of a scientist who teaches at Duke University and has been personally involved in much Jewish genetic research.1 The other, Jon Entine’s Abraham’s Children: Race, Identity, and the DNA of the Chosen People, is by a layman and journalist.2 Yet since Entine has done a serious and responsible job of reporting, and Goldstein has written a non-technical survey for the general reader, the difference between them is one more of style than of substance. They agree on most major points, starting with the puzzling disparity in the distribution patterns of Jewish Y-chromosome and mitochondrial DNA.

The fact of this disparity is now solidly established. There is no doubt that statistically (and only statistically: it is important to keep in mind that any randomly chosen Jewish individual may prove an exception to the rule), Jewish males with antecedents in such widely separated places as Yemen, Georgia, and Bukhara in Central Asia are far more likely to share similar Y-chromosome DNA with one another than with Yemenite, Georgian, or Bukharan non-Jews. Jewish females from the same backgrounds, on the other hand, yield opposite results: their mitochondrial DNA has markedly less resemblance to that of Jewish women from elsewhere than it does to that of non-Jewish women in the countries their families hailed from. The main difference between them and these Gentile women is that their mitochondrial DNA is less varied—that is, they descend from a small number of maternal ancestors. Geneticists call such a phenomenon, in which a sizable population has developed from a very small number of progenitors, a “founder” or “bottleneck” effect. (In “bottlenecks,” these few progenitors are survivors of larger groups that were drastically reduced by war, famine, plague, or other calamities.)

This calls for a new understanding of the spread of Jewish settlement in the Diaspora. Until now, it has been assumed that nearly all of the world’s Jewish communities began with the migration of cross-sections of older communities, which took their families, institutions, and practices with them and perpetuated their lives in new surroundings. Now, it would seem, as David Goldstein writes, that

[some] Jewish men . . . travel[ed] long distances to establish small Jewish communities [by themselves]. They would settle in new lands and, if unmarried, take local women for wives. The communities might [at a later date] have been augmented by additional male travelers from Jewish source populations. Once they were established, however, the barriers would go up against further input of new mitochondrial DNA, precisely because of female-defined ethnicity [i.e., the halakhic practice of determining Jewishness by the mother]; few [additional] females would be permitted to join.
Presumably, these adventurous bachelors setting out (perhaps on business ventures) for far lands could not persuade Jewish women to come with them, or else they traveled to their destinations with no intention of staying there. In the absence of rabbis to perform conversions, they married local women who, while consenting to live as Jews, were not halakhically Jewish. By halakhic standards, therefore, their descendants were not Jewish, either, even though their Jewishness was not challenged by the rabbinical authorities. Although such communities must, in their first generations, have known the truth about themselves, this does not appear to have bothered them or anyone else very much.

_____________


In a class by itself is the mitochondrial DNA of Ashkenazi women. It does not correlate closely with the DNA of non-Jewish women in Western, Central, or Eastern Europe and it has a large Middle Eastern component. Yet in their maternal lineage, Ashkenazim, too, exhibit a strong “founder effect.” Over forty percent of them, a 2005 study showed, descend from just four “founding mothers” having Middle-Eastern-profile mitochondrial DNA. Since Ashkenazi Y-chromosome DNA does not exhibit so dramatic a founder’s effect, one can assume that Ashkenazi Jewry, too, began with the migration of a preponderantly male group of Jews to new territories. Because these territories, however, were more contiguous with the old ones than were far-flung regions like Bukhara or Yemen, the men were more able to import wives from existing Jewish communities and less dependent on marrying local Gentiles.

But where did Ashkenazi Jewry, male and female alike, derive from if not from the Rhineland? One possibility that is more consistent with the linguistic data is that it entered southern Germany from northern Italy and pushed further north from there into the Slavic-speaking areas of Europe. Another is that Jews migrated to Slavic lands from the Byzantine Empire. These hypotheses, which are not mutually exclusive, can now claim a measure of scientific support, since the Y chromosomes of Ashkenazi Jews have more in common with those of Italians and Greeks than with those of West Europeans.

A more dramatic scenario, popularized by Arthur Koestler in his 1976 book The Thirteenth Tribe, has to do with the Khazars, a Turkish people living between the Black and Caspian Seas, whose royal house adopted Judaism (with what degree of rabbinical supervision, we have no way of knowing) in the 8th century c.e. A great deal is obscure in the history of the Khazar kingdom, which at its apogee ruled much of present-day Ukraine, and the degree of the Judaization of its population is uncertain. Yet Koestler and a small number of historians on whom he based himself were convinced that, following the destruction of this kingdom in the 11th century by its Slavic enemies, many of its Jews fled westward to form the nucleus of what was to become East European Jewry.3

The Khazar theory never had many backers in scholarly circles; there was little evidence to support it and good reasons to be dubious about it. Why, for instance, does medieval rabbinic literature almost never mention the Khazars? Why, if they spoke a Turkish language, did East European Jewry become Yiddish-speaking? “Like virtually every academic I have ever consulted on the subject,” David Goldstein writes, “I was initially quite dismissive of Koestler’s identification of the Khazars [with] Ashkenazi Jewry.” Yet, he continues, “I am no longer so sure. The Khazar connection seems no more farfetched than the spectacular continuity of the Cohen line.”

This is one of the few occasions on which Jon Entine disagrees with him. Abraham’s Children declares:

The studies of the Y-chromosome and [mitochondrial] DNA do not support the . . . notion that Jews are descended in any great numbers from the Khazars or some Slavic group, although it’s evident some Jews do have Khazarian blood. The Khazarian theory has been put to rest, or at least into perspective.
_____________

Who is right? Either could be, for the latest evidence is ambiguous. It consists of two studies. One, “Y-Chromosome Evidence for a Founder Effect in Ashkenazi Jews,” was published in 2004 in the European Journal of Human Genetics by a small team from the Hebrew University of Jerusalem. The other was the work of a larger, American-Israeli-British group to which Goldstein belonged; its report, “Multiple Origins of Ashkenazi Levites: Y-Chromosome Evidence for Both Near Eastern and European Ancestries,” appeared in the American Journal of Human Genetics in 2003. Both studies discuss a mutation, widely found in Poland, Lithuania, Belarus, and Ukraine, that occurs in a Y-chromosome classification known as Haplogroup R, at a DNA site labeled M117.

The Hebrew University study states:

Recent genetic studies . . . showed that Ashkenazi Jews are more closely related to other Jewish and Middle Eastern groups than to their host populations in Europe. However, Ashkenazim have an elevated frequency of R-M117, the dominant Y-chromosome haplogroup in Eastern Europeans, suggesting possible gene flow [into the Ashkenazi population]. In the present study of 495 Y chromosomes of Ashkenazim, 57 (11.5 percent) were found to belong to R-M117.
As for the American-Israeli-British study, it was designed to ascertain whether Levites, who functioned as priests’ assistants in the ancient Temple and, like priests, supposedly descend from the tribe of Levi, have a worldwide genetic signature similar to or the same as the Cohen Modal Haplotype.4 The answer turned out to be negative, since the Y chromosomes of Levites from different geographical backgrounds proved to correlate no better with one another than they did with the Y chromosomes of non-Levitic Jews. And yet, rather astonishingly, Ashkenazi Levites, when taken separately, do have a “modal haplotype” of their own—and it is the same R-M117 mutation on which the Hebrew University study centered! Fifty-two percent of them have this mutation, which is rarely found in non-Ashkenazi Jews and has a clear non-Jewish provenance.

_____________

What is one to make of this finding? An 11.5-percent incidence of R-M117 among Ashkenazi Jews in general is easily explainable: the mutation could have entered the Jewish gene pool slowly, in small increments in every generation, during the thousand years of Ashkenazi Jewry’s existence. (This need not necessarily have been via conversion to Judaism and marriage to Jewish women. Pre- and extra-marital sexual relations, and even rape, widespread in times of anti-Jewish violence, were in all likelihood more common.) But the 52-percent rate among Levites is something else. Here we are dealing not with a gradual, long-term process (for no imaginable process could have produced such results), but with a one-time event of some sort.
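The contrast between the two scenarios can be made concrete with a back-of-the-envelope model (a sketch only: the per-generation inflow rate m and the donor-population frequency p below are illustrative assumptions, not figures from the studies). Under steady gene flow at rate m from a population in which the marker has frequency p, the marker's frequency after n generations is p(1 − (1 − m)^n):

```python
def admixed_frequency(m, p, generations):
    """Frequency of a donor-population marker after repeated
    per-generation gene flow: f' = (1 - m) * f + m * p,
    starting from f = 0."""
    f = 0.0
    for _ in range(generations):
        f = (1 - m) * f + m * p
    return f

# With roughly 40 generations to a millennium and an assumed donor
# frequency p = 0.5, a per-generation inflow of only 0.6 percent
# already yields a frequency near 11 percent -- whereas no constant
# rate plausibly yields 52 percent in one sub-population alone.
print(round(admixed_frequency(0.006, 0.5, 40), 3))  # 0.107
```

The loop simply iterates the recurrence; it agrees with the closed form p(1 − (1 − m)^n), which is why a small steady trickle suffices to explain the overall 11.5 percent but not the Levite anomaly.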

Such an event could obviously not have been a sudden influx of Levites into the Jewish community from a Gentile society. Both of our studies, therefore, raise the possibility that the original R-M117 Levites were Khazarian Jews who migrated westward upon the fall of the Khazar kingdom. Of course, since all or most Khazarian Jews were converts (although some may have been Jews who came from elsewhere), few could have descended from Aaron. Yet it is quite possible that some became, or were designated, “honorary” Levites in the course of the Judaization of the Khazarian population. As the American-Israeli-British study observes, Jews traditionally held to “a lesser degree of stringency for the assumption of Levite status than for the assumption of Cohen status,” so that self-declared Khazarian Levites might have fathered lineages whose Levitic pedigree came to be accepted.

But if R-M117 did enter the East European Jewish gene pool via a lineage of Khazar Levites, how many Khazars can be assumed to have joined the Ashkenazi community? At this point, it becomes pure guesswork. Analyzing the data, the American-Israeli-British study concludes that the number of R-M117 Levites absorbed by Ashkenazi Jewry ranged from one to fifty individuals. But as much as we might like to do the rest of the arithmetic ourselves, we can’t. For one thing, we have no way of knowing what the percentage of Levites in the Khazarian Jewish population was. Nor do we know the percentage of Khazars possessing M117, which is found in 12 or 13 percent of Russian and Ukrainian males today. If these were also its proportions among the Khazars, there would have been seven non-M117 Khazars joining or founding Ashkenazi Jewry for every Khazar who had the mutation.
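The closing arithmetic can be spelled out (taking 12.5 percent as a midpoint of the article's "12 or 13 percent"): if a fraction p of Khazar males carried M117, then each carrier implies (1 − p) / p non-carriers.

```python
def noncarriers_per_carrier(p):
    """Non-M117 males accompanying each M117 male when a fraction p
    of the male population carries the mutation."""
    return (1 - p) / p

print(noncarriers_per_carrier(0.125))  # 7.0
```

At p = 0.125 this gives exactly the article's seven non-M117 Khazars for every carrier.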

In sum, even if the R-M117 Levites are traceable to Khazaria, the total flow of Khazarians into the East European Jewish population could have been anywhere from a single person to many thousands. If it was the latter, the Khazar input was significant, as David Goldstein suspects it was; if the former, it was trivial, as Jon Entine believes. The last eight years of research in Jewish historical genetics have not left us any wiser in this respect.

_____________
Title: Jews and their DNA, 2
Post by: Body-by-Guinness on September 07, 2008, 10:19:26 AM
Traditional accounts of Jewish history, it would appear, are part true and part myth. Despite their dispersion in space and time, the Jews have continued to be that most curious (and in the eyes of many, preposterous) of combinations: at once a people or nation, fellow communicants in the world’s oldest monotheistic religion, and a family or tribe belonged to only by those born or married into it. They could not have remained such an amalgam had they not clung to strict rules of membership and admission.

Yet these rules were not observed everywhere or always. There were periods and places in which a blind eye was turned to them, most often when violations were not remediable. Had a rabbi arrived in Yemen or Bukhara soon after the founding of its Jewish community, he might have been able to insist on the halakhic conversion of its handful of Jews. But this would no longer have been practicable after several generations had gone by, especially since Yemenite and Bukharan Jews would have forgotten by then that their maternal progenitors were not halakhically Jewish and would have reacted with resentment to such a demand. Similarly, Khazars identifying themselves as Levites were accepted as such without inquiries into their past. It is an old rabbinic adage that one does not inflict demands on the public that the public is incapable of meeting. Better a tolerated myth than an intolerable truth.

Such, at any rate, was the attitude of a pre-modern age in which all Jews accepted rabbinic authority, so that all rabbis felt obliged to find solutions for all Jews. Since the mid-19th century, however, this has progressively ceased to be true. Rabbinic authority itself has fractured and dissipated. Most Jews no longer want rabbis to be responsible for them, and most rabbis no longer feel responsible for most Jews. The consequence of this, as reflected in the “Who Is A Jew?” debate that has racked world Jewry for the past several decades, is that the Jewish tribe is breaking up. In the United States, Orthodox rabbis do not recognize the Jewishness of converts to Reform or Conservative Judaism, Conservative rabbis do not recognize the Jewishness of children born to Jewish fathers but not to Jewish mothers, and Reform rabbis routinely preside over the marriages of Jewish men to non-Jewish women even though they may be creating future generations that they alone will consider Jewish.

In Israel, where non-Orthodox marriages and conversions cannot be performed, the problem is even more severe, for Jewishness in a Jewish state is a secular legal category as well. Israel’s Law of Return, for example, guarantees the right to immigrate and acquire Israeli citizenship to every Jew and his immediate family, including the first two generations of his descendants. Yet the more contentious the question of who is a Jew becomes, the more this law divides Jews rather than unites them.

Meanwhile, already living in Israel are hundreds of thousands of halakhically non-Jewish immigrants, most from the former Soviet Union, who entered the country under the Law of Return because they were either married to Jews or had a Jewish father or grandfather. As matters stand now, they and their children cannot have a Jewish wedding in Israel. Many of them, probably most, would like recognition as Jews, and not a few would be willing to convert in order to obtain it. But Israel’s Orthodox rabbinate has made the conversion procedure so difficult, in part by hinging it on the promise to live an Orthodox life, that most prospective converts have been deterred. Recently, perhaps for the first time in Jewish history, a conversion was retroactively annulled by the rabbinate on the grounds that such a promise was not kept.

For its part, the rabbinate insists that it has been forced to adopt more rigorous standards by the secular nature of Israeli society, which precludes the kind of “honor system” for determining Jewish identity that was operative in Jewish life in the past. Even Israelis whose Jewishness might appear to be beyond question now find themselves questioned about it.

_____________

To take a small personal example: my Israeli-born daughter, whose Israeli ID card lists her as “Jewish” and who is getting married in Israel this month, has been required to provide a letter from an Orthodox rabbi in the United States, where I and my wife were born and raised, attesting to the Orthodox ceremony in which we were wed in New York. The reasoning behind this is simple. Had we been married in Israel, this would have been considered proof of our daughter’s Jewishness, since our own Jewishness would already have been rabbinically certified. But if we were married in a non-Orthodox ceremony in the United States, we would have to bring further proof of our Jewishness since no non-Orthodox rabbi could be trusted to have vetted us properly.

And what could such further proof be? If we could find no Orthodox rabbi to speak for us, it would indeed be difficult to supply. My daughter would then have had the option of either arduously trying to assemble convincing evidence or of getting married outside of Israel (in which case her marriage would be recognized by Israeli secular law). Yet if she were to choose the second of these courses, as an increasing number of young Israelis are doing nowadays in their disinclination to deal with the rabbinate, she would in effect be choosing it for her children, too, since by the time they reached marriageable age, proof of their Jewishness would be even more difficult. In this manner, a growing public is being created in Israel that is losing its Jewish status in the eyes of rabbinic law.

The rabbinate’s position is understandable. Once, when there was no secular advantage in being Jewish, there was no reason to suspect anyone’s declaration of Jewishness; now, such avowals can no longer be taken at face value. And understandable, too, is the position of Israeli secularists who are indifferent to the rabbinate’s attitude or even welcome it.

For such secular Israelis, the idea of biological Jewishness is an embarrassing anachronism. Secular Zionism, after all, set out to normalize Jewish existence. Surely, they reason, its goal should therefore be to make Israelis a people whose identity is based, like that of other peoples, on territory, language, and culture rather than on shared blood ties. If Orthodoxy wishes to hasten this process, so much the better. Perhaps one day Israel will become the “state of all its citizens” that democratic values require it to be, a country of Hebrew-speaking Jews, Muslims, and Christians, all equal before the law. Although the great majority of secular Israelis do not yet subscribe to this point of view, more and more will come to it if things continue on their present course.

As far as much of the rest of the world is concerned, biological Jewishness has always been an embarrassing anachronism—at least ever since the time of the Roman Empire and early Christianity. For the most part, Jews have nevertheless managed to go their own unembarrassed way. The genetic record shows that they have on the whole succeeded. But this is only, the same record shows, because they have made a point in the past of not embarrassing one another. There is a lot of DNA in the Jewish people that came in, as it were, through the back door. Unless ways are found to keep this door open, the walls of the house may have to be torn down.

_____________

In 2003, a year after the publication of my book Across the Sabbath River, I became involved in a historical genetics-research project myself. I did so at the invitation of two geneticists whose names appear on many of the scientific papers mentioned in this article: Professor Karl Skorecki and Dr. Doron Behar of Rambam Hospital and the Rappaport Research Institute in Haifa. They had read my book and wanted to know how I felt about taking part in a DNA study of the Mizo and Kuki people of northeast India, the purpose of which would be to determine whether there was evidence for a “Jewish”—that is, a Middle Eastern—origin for any of them.

Both men had qualms about the matter. Unlike other genetic investigations they had participated in, this one might have practical consequences. The B’nei Menashe believe that they descend from one of the “ten lost tribes” of Israel that was driven into exile by the Assyrians in the 8th century b.c.e. This belief, which first surfaced in Mizoram and Manipur in the 1950’s, is basic to their identity. Because of it, they have chosen to live Jewish lives and to convert once they have managed to reach Israel.

In my book I had come to the unexpected conclusion that there was a kernel of historical truth in their claim, although I did not think that more than a tiny fraction of Mizos and Kukis might have distant Israelite ancestors. What would happen, Skorecki and Behar asked, when our study was published? Whatever its findings, they would be certain to disappoint the B’nei Menashe and perhaps even to undermine their sense of Jewishness. And what if these findings were seized on by those in the Israeli government who wished to shut the country’s gates to the B’nei Menashe? Did we have the moral right to take such risks?

I answered that I thought we did. (This is the only basis I can imagine for David Goldstein’s strange statement in Jacob’s Legacy that I “agitated for the Mizos to undergo DNA tests in order to vindicate their claims.”) Israel’s gates had already been shut—and, apart from briefly swinging open again in 2006-7, have remained so—and even if one did not agree that the scientific truth was worth pursuing at all costs, someone else would pursue it in this case if we didn’t. It was best for the work to be done by an Israeli team that was sensitive to the issues involved.

In the end, we went ahead. Three rounds of sampling, based on the theories in my book and involving approximately 500 people, were carried out in India in 2003, 2006, and 2007. Although Goldstein writes (on what grounds, I again don’t know) that “most” Mizos and Kukis “resisted” genetic testing, I am aware of only one case in which someone who was asked to be sampled refused to cooperate. The difficulties were of an entirely different nature, such as a suitcase full of samples that was lost for several days in Tashkent, or the fact that at a critical juncture one of our samplers was murdered for reasons having nothing to do with our study.

The final lab results are now being tabulated. They will not, so it seems, be earth-shaking. Nearly all of the samples have turned out to have typically Tibeto-Burmese DNA. Although a very few look Middle Eastern, there may be no way of absolutely ruling out other possible sources for them. After all our effort, the results are inconclusive. And in any case, as historical geneticists are fond of saying, “absence of evidence is not evidence of absence.” There are many reasons why an originally small input of DNA might not turn up in a study: its bearers may have failed to reproduce their lineage, or the sample may be too small, or a crucial population group may be missing from it.

It has not been my impression, however, that the B’nei Menashe are waiting for the results with bated breath. In the five years that have passed since the study was commenced, scarcely any of them has contacted me to ask about it, and there has been, as far as I know, little discussion of it in their community. There appears to be no reason to think that, when eventually published, it will have much of an impact on them or their fate.

This comes as a relief. Despite my assurances to Skorecki and Behar, I too had my doubts. But the B’nei Menashe are more grounded in their own beliefs than we had feared. They will stick to them regardless of what two highly professional geneticists and one sadly amateur historian say in some scientific journal.

_____________

This, I think, is as it should be. There may be a few people who can subsist on an austere regimen of all truth and no myth, and there are all too many people who live on a flabby diet of all myth and no truth. But some indeterminably proportioned combination of the two dispositions is what most of us require for our health. This is as true of societies as it is of individuals.

I myself have long suspected, starting far before I knew anything of historical genetics or Arthur Koestler’s The Thirteenth Tribe, that I have Khazar blood in me. One of my father’s sisters had distinctly slanty eyes. In one of her daughters, these are even more pronounced. The daughter’s daughter has features that could come straight from the steppes of Asia.

I rather like the idea of Khazar forefathers. Far from deconstructing my Jewishness, it romanticizes it even more. The thought that my distant ancestors on the plains of Russia had the intelligence and folly to choose Judaism for their religion; that they prayed to a Jewish God as they rode into battle; that (as the historians tell us) they held back the Muslim invasion of Europe from the east and helped keep the West safe for Dante and Shakespeare. Does it make me feel that, as Arab propaganda would have it, I don’t belong in Palestine? Why should it? We Khazars threw in our lot with the Jews and the Jews embraced us. Since then, we’ve also been Jews.

And who is we? Each of us has had many thousands of forebears, and each of those had many thousands in turn. The traces of millions of human beings are in our minds, our hair, our eyes and noses, our inner organs, the shape of our toes, our trillions of cells. By pure chance, two of these trillions are passed on unchanged and can be given labels like R-M117. Instructive as they are, we needn’t make too much of them.

_____________

ABOUT THE AUTHOR

Hillel Halkin is a columnist for the New York Sun and a long-time contributor to COMMENTARY. His “How Not to Repair the World” appeared in our July-August issue.
FOOTNOTES

1 Yale, 176 pp., $26.00.

2 Grand Central, 432 pp., $27.99.

3 An assimilationist Jew and at one time of his life an idiosyncratic Zionist, Koestler was attracted to this theory because it demonstrated, so he thought, that the Jews of the Diaspora were a “pseudo-nation” held together by “a system of traditional beliefs based on racial and historical premises which turn out to be illusory.” Either, therefore, they should emigrate to Israel or they should cease to exist. Ironically, however, Koestler’s book was soon enlisted by Arab propaganda in its war against Israel and Zionism. What claim could the Jews have to Palestine, Arab spokesmen asked, if their original ancestors came from southern Russia?

4 Constituting, like priests, about four percent of the world’s Jews, Levites can easily be identified because, again like priests, they are assigned minor tasks in Jewish ritual to this day, so that every religiously observant Levite knows he is one.

https://www.commentarymagazine.com/viewarticle.cfm/jews-and-their-dna-12496?page=2

Title: Genetic Link in Delinquent Peer Group Affiliation
Post by: Body-by-Guinness on October 01, 2008, 11:05:40 AM
Public release date: 1-Oct-2008

Contact: Kevin Beaver
kbeaver@fsu.edu
850-644-9180
Florida State University
 
Study reveals specific gene in adolescent men with delinquent peers

But family environment can tip the balance for better or worse

TALLAHASSEE, Fla. -- Birds of a feather flock together, according to the old adage, and adolescent males who possess a certain type of variation in a specific gene are more likely to flock to delinquent peers, according to a landmark study led by Florida State University criminologist Kevin M. Beaver.

"This research is groundbreaking because it shows that the propensity in some adolescents to affiliate with delinquent peers is tied up in the genome," said Beaver, an assistant professor in the FSU College of Criminology and Criminal Justice.

Criminological research has long linked antisocial, drug-using and criminal behavior to delinquent peers -- in fact, belonging to such a peer group is one of the strongest correlates to both youthful and adult crime. But the study led by Beaver is the first to establish a statistically significant association between an affinity for antisocial peer groups and a particular variation (called the 10-repeat allele) of the dopamine transporter gene (DAT1).

However, the study's analysis of family, peer and DNA data from 1,816 boys in middle and high school found that the association between DAT1 and delinquent peer affiliation held primarily for those who had both the 10-repeat allele and a high-risk family environment (one marked by a disengaged mother and an absence of maternal affection).

In contrast, adolescent males with the very same gene variation who lived in low-risk families (those with high levels of maternal engagement and warmth) showed no statistically relevant affinity for antisocial friends.

"Our research has confirmed the importance of not only the genome but also the environment," Beaver said. "With a sample comprised of 1,816 individuals, more than usual for a genetic study, we were able to document a clear link between DAT1 and delinquent peers for adolescents raised in high-risk families while finding little or no such link in those from low-risk families. As a result, we now have genuine empirical evidence that the social and family environment in an adolescent's life can either exacerbate or blunt genetic effects."

Beaver and research colleagues John Paul Wright, an associate professor and senior research fellow at the University of Cincinnati, and Matt DeLisi, an associate professor of sociology at Iowa State University, have described their novel findings in the paper "Delinquent Peer Group Formation: Evidence of a Gene X Environment Correlation," which appears in the September 2008 issue of the Journal of Genetic Psychology.

The biosocial data analyzed by Beaver and his two co-authors derived from "Add Health," an ongoing project focused on adolescent health that is administered by the University of North Carolina-Chapel Hill and funded largely by the National Institute of Child Health and Human Development. Since the program began in 1994, a total of nearly 2,800 nationally representative male and female adolescents have been genotyped and interviewed.

"We can only hypothesize why we saw the effect of DAT1 only in male adolescents from high-risk families," said Beaver, who will continue his research into the close relationship between genotype and environmental factors -- a phenomenon known in the field of behavioral genetics as the "gene X environment correlation."

"Perhaps the 10-repeat allele is triggered by constant stress or the general lack of support, whereas in low-risk households, the variation might remain inactive," he said. "Or it's possible that the 10-repeat allele increases an adolescent boy's attraction to delinquent peers regardless of family type, but parents from low-risk families are simply better able to monitor and control such genetic tendencies."

Among female adolescents who carry the 10-repeat allele, Beaver and his colleagues found no statistically significant affinity for antisocial peers, regardless of whether the girls lived in a high-risk or low-risk family environment.

http://www.eurekalert.org/pub_releases/2008-10/fsu-srs100108.php
Title: Dawn of Disease?
Post by: Body-by-Guinness on October 19, 2008, 06:24:36 PM
Genetic-based Human Diseases Are An Ancient Evolutionary Legacy, Research Suggests

ScienceDaily (Oct. 19, 2008) — Tomislav Domazet-Lošo and Diethard Tautz from the Max Planck Institute for Evolutionary Biology in Plön, Germany, have systematically analysed the time of emergence for a large number of genes - genes which can also initiate diseases. Their studies show for the first time that the majority of these genes were already in existence at the origin of the first cells.

This facilitates the search for further disease genes, particularly those involved in multi-factorial diseases. Furthermore, the results confirm that the basic functional interconnections of disease-causing genes can also be found in model organisms (Molecular Biology and Evolution).

The Human Genome Project, which deciphered the human genetic code, uncovered thousands of genes that, if mutated, are involved in human genetic diseases. The genomes of many other organisms were deciphered in parallel. This now allows the evolution of these disease-associated genes to be studied systematically.

Tomislav Domazet-Lošo and Diethard Tautz from the Max Planck Institute for Evolutionary Biology in Plön (Germany) have used for this analysis a novel statistical method, "phylostratigraphy" that was developed by Tomislav Domazet-Lošo at the Ruđer Bošković Institute in Zagreb (Croatia). The method allows the point of origin for any existing gene to be determined by tracing the last common ancestor in which this gene existed. Based on this information, it is then possible to determine the minimum age for any given gene.
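The logic of the method can be sketched as a lookup over ordered ancestral strata: a gene is assigned to the oldest ancestor whose descendants carry a detectable homolog, and that stratum's age becomes the gene's minimum age. The strata names and ages below are illustrative placeholders, not the categories used by Domazet-Lošo and Tautz.

```python
# Ordered oldest -> youngest; ages are rough minimums in millions of
# years, placeholders for illustration only.
PHYLOSTRATA = [
    ("cellular life", 3500),
    ("eukaryotes", 1800),
    ("multicellular animals", 1000),
    ("bony fishes", 400),
    ("mammals", 200),
]

def minimum_gene_age(strata_with_homolog):
    """Assign a gene the age of the oldest stratum in which a
    homolog of it is detected -- its minimum age."""
    for name, age in PHYLOSTRATA:
        if name in strata_with_homolog:
            return name, age
    return None

# A gene with homologs reaching back to bacteria dates to the first cells:
print(minimum_gene_age({"cellular life", "bony fishes", "mammals"}))
# ('cellular life', 3500)
```

The real method rests on sequence-similarity searches across many genomes; the lookup above only captures the dating step once homolog presence or absence is known.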

Applying this method to disease genes, the scientists from Plön came to surprising findings. The vast majority of these genes trace back to the origin of the first cell. Other large groups emerged more than one billion years ago around the first appearance of multi-cellular organisms, as well as at the time of origin of bony fishes about 400 million years ago. Surprisingly, they found almost no disease associated genes among those that emerged after the origin of mammals.

These findings suggest that genetic diseases primarily affect ancient cellular processes that emerged during the early stages of life on Earth. This leads to the conclusion that not only humans but all living organisms today are affected by similar genetic diseases. It also implies that genetically caused diseases will never be beaten completely, because they are linked to ancient evolutionary processes.

Although it was already known that many disease associated genes occur also in other organisms distant to humans, such as the fruitfly Drosophila or the round worm Caenorhabditis, the analysis of Domazet-Lošo and Tautz shows now for the first time that this is systematically true for the vast majority of these genes. At present it remains unknown why the more recently evolved genes, for example those involved in the emergence of the mammals, do not tend to cause diseases when mutated.

The research results of the scientists from Plön also have some practical consequences. It will now be easier to identify candidates for further disease genes, in particular for those involved in multi-factorial diseases. Furthermore, the results confirm that the functional knowledge gained about such genes from remote model organisms is also relevant for understanding the genes in humans.

http://www.sciencedaily.com/releases/2008/10/081016124043.htm
Title: Re: Evolutionary biology/psychology
Post by: Crafty_Dog on October 19, 2008, 06:47:18 PM
That's deep  :-o
Title: No Junk in the Trunk?
Post by: Body-by-Guinness on November 07, 2008, 09:47:18 AM
'Junk' DNA Proves Functional; Helps Explain Human Differences From Other Species

ScienceDaily (Nov. 5, 2008) — In a paper published in Genome Research on Nov. 4, scientists at the Genome Institute of Singapore (GIS) report that what was previously believed to be "junk" DNA is one of the important ingredients distinguishing humans from other species.

More than 50 percent of human DNA has been referred to as "junk" because it consists of copies of nearly identical sequences. A major source of these repeats is internal viruses that have inserted themselves throughout the genome at various times during mammalian evolution.

Using the latest sequencing technologies, GIS researchers showed that many transcription factors, the master proteins that control the expression of other genes, bind specific repeat elements. The researchers showed that from 18 to 33% of the binding sites of five key transcription factors with important roles in cancer and stem cell biology are embedded in distinctive repeat families.
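A figure like "18 to 33%" is, in essence, an interval-overlap count: the fraction of binding-site coordinates that fall within annotated repeat intervals. The sketch below shows the computation; all coordinates are invented for illustration.

```python
def fraction_in_repeats(sites, repeats):
    """Fraction of binding sites (start, end) that overlap any
    annotated repeat interval (start, end), half-open coordinates."""
    def overlaps(a, b):
        return a[0] < b[1] and b[0] < a[1]
    hits = sum(any(overlaps(s, r) for r in repeats) for s in sites)
    return hits / len(sites)

sites = [(100, 110), (250, 260), (400, 410), (900, 910)]
repeats = [(90, 200), (880, 950)]
print(fraction_in_repeats(sites, repeats))  # 0.5
```

Genome-scale versions of this use sorted intervals or interval trees rather than the quadratic scan above, but the quantity computed is the same.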

Over evolutionary time, these repeats were dispersed within different species, creating new regulatory sites throughout these genomes. Thus, the set of genes controlled by these transcription factors is likely to significantly differ from species to species and may be a major driver for evolution.

This research also shows that these repeats are anything but "junk DNA," since they provide a great source of evolutionary variability and might hold the key to some of the important physical differences that distinguish humans from all other species.

The GIS study also highlighted the functional importance of portions of the genome that are rich in repetitive sequences.

"Because a lot of the biomedical research use model organisms such as mice and primates, it is important to have a detailed understanding of the differences between these model organisms and humans in order to explain our findings," said Guillaume Bourque, Ph.D., GIS Senior Group Leader and lead author of the Genome Research paper.

"Our research findings imply that these surveys must also include repeats, as they are likely to be the source of important differences between model organisms and humans," added Dr. Bourque. "The better our understanding of the particularities of the human genome, the better our understanding will be of diseases and their treatments."

"The findings by Dr. Bourque and his colleagues at the GIS are very exciting and represent what may be one of the major discoveries in the biology of evolution and gene regulation of the decade," said Raymond White, Ph.D., Rudi Schmid Distinguished Professor at the Department of Neurology at the University of California, San Francisco, and chair of the GIS Scientific Advisory Board.

"We have suspected for some time that one of the major ways species differ from one another – for instance, why rats differ from monkeys – is in the regulation of the expression of their genes: where are the genes expressed in the body, when during development, and how much do they respond to environmental stimuli," he added.

"What the researchers have demonstrated is that DNA segments carrying binding sites for regulatory proteins can, at times, be explosively distributed to new sites around the genome, possibly altering the activities of genes near where they locate. The means of distribution seem to be a class of genetic components called 'transposable elements' that are able to jump from one site to another at certain times in the history of the organism. The families of these transposable elements vary from species to species, as do the distributed DNA segments which bind the regulatory proteins."

Dr. White also added, "This hypothesis for formation of new species through episodic distributions of families of gene regulatory DNA sequences is a powerful one that will now guide a wealth of experiments to determine the functional relationships of these regulatory DNA sequences to the genes that are near their landing sites. I anticipate that as our knowledge of these events grows, we will begin to understand much more how and why the rat differs so dramatically from the monkey, even though they share essentially the same complement of genes and proteins."

http://www.sciencedaily.com/releases/2008/11/081104180928.htm
Title: Cross Species DNA Transfer?
Post by: Body-by-Guinness on November 07, 2008, 09:49:02 AM
Second Post:

'Space invader' DNA infiltrated mammalian genomes
22:00 20 October 2008
NewScientist.com news service
Jessica Griggs

Parts of mammalian DNA are so alien they have been dubbed "space invaders" by the researchers that found them. The discovery, if confirmed, will change our understanding of evolution.

We normally get our genes "vertically" – handed down from our parents and theirs before them. Bacteria get theirs in this way too, but also "horizontally" – passed from one, unrelated individual to another.

Now biologists at the University of Texas, Arlington, have found the unexpected: horizontal gene transfer has occurred in mammals and amphibians too.

The culprit is a kind of "parasitic" DNA found in all our cells, known as a transposon. Study leader Cédric Feschotte says that what he calls space invader transposons jumped sideways millions of years ago into several species by piggybacking onto a virus.

The transposon then assimilated itself into sex chromosomes, ensuring that it would get passed on to future generations. "It is very interesting conceptually – the idea that some parts of a mammal's DNA don't come from an ancestral species," he says.

Alien invasion

Out of 26 animal genomes, the team found a near-identical length of DNA, known as the hAT transposon, in seven species, separated by some 340 million years of evolution.

These include species as widely diverged as a bush baby, a South American opossum, an African clawed frog and a tenrec – a mammal that looks like a hedgehog, but is actually more closely related to elephants.

The fact that invasive DNA was seen in a bush baby but not in any other primates, and in a tenrec but not in elephants, hints that something more exotic than standard inheritance is going on.

However, this patchy distribution by itself does not rule out the traditional method, as some of the species could have lost the transposon DNA throughout evolutionary history.

So the team looked at the position of the hAT transposon – if it had been inherited from a common ancestor it would have been found in the same position, with respect to other genes, in each species. But they could not find a single case of this.

Since first entering the genome, the hAT has been able to reproduce dramatically – in the tenrec, 99,000 copies were found, making up a significant chunk of its DNA. Feschotte speculates that this must have had a dramatic effect on its evolutionary development.

"It's like a bombardment", he says. "It must have been evolutionarily significant because the transposon generated a huge amount of DNA after the initial transfer."

Feschotte says he expects many more reports of horizontal gene jumping. "We're talking about a paradigm shift because, until now, horizontal transfer has been seen as very rare in animal species. It's actually a lot more common than we think."

Mammal extinctions?

The team thinks that the hAT transposon invasion occurred about 30 million years ago and spread across at least two continents. "It's like a pandemic, and one that can infect species that weren't genetically or geographically close. It's puzzling, scary almost," Feschotte says.

It may not be a coincidence that the time of the invasion coincides with a period in evolutionary history that saw mass mammal extinctions. This is usually attributed to climate change, Feschotte says, but it is not crazy to suppose that this type of invasion could contribute to species extinction.

The hAT transposon does not occur in humans, but some 45% of our genome is of transposon origin.

Feschotte's work on the hAT transposon is the first time that a "jumping gene" has been shown to have entered mammalian genomes, and the first time it has been shown to do so at around the same time, in a range of unrelated species, in different parts of the world.

Feschotte admits that we cannot rule out another transposon offensive occurring in mammals, and thinks that bats are the animals most likely to be the source. For some reason, he says, they seem to be most susceptible to picking up transposons – possibly because of the viruses they carry.

'Rather scary'

"Bats are notorious reservoir species for a plethora of viruses, including some very nasty to humans like rabies, SARS and perhaps Ebola," he says.

"Since these bats are full of active DNA transposons and are frequently involved in viral spill-over, the door for the transfer of an active DNA transposon to humans seems wide open. Rather scary."

Greg Hurst, an evolutionary biologist at the University of Liverpool, UK, says that the arrival of a new transposable element can be evolutionarily significant, because new elements tend to be more active. "They will jump a fair bit more than older elements, which the resident genome will have evolved to suppress."

Most of the consequences of having a transposon jump around in your genome will be deleterious, Hurst says, but some will be advantageous. "The evolutionary life of the species could certainly hit the fast lane for a bit when it happens."

Journal reference: Proceedings of the National Academy of Sciences (DOI: 10.1073/pnas.0806548105)


http://www.newscientist.com/channel/life/dn14992-space-invader-dna-infiltrated-mammalian-genomes.html?DCMP=ILC-arttsrhcol&nsref=specrt13_bar
Title: Evolution Shaped by Warfare?
Post by: Body-by-Guinness on November 13, 2008, 05:25:36 PM
Perhaps a chew toy for Crafty:

How warfare shaped human evolution

12 November 2008 by Bob Holmes


IT'S a question at the heart of what it is to be human: why do we go to war? The cost to human society is enormous, yet for all our intellectual development, we continue to wage war well into the 21st century.

Now a new theory is emerging that challenges the prevailing view that warfare is a product of human culture and thus a relatively recent phenomenon. For the first time, anthropologists, archaeologists, primatologists, psychologists and political scientists are approaching a consensus. Not only is war as ancient as humankind, they say, but it has played an integral role in our evolution.

The theory helps explain the evolution of familiar aspects of warlike behaviour such as gang warfare. It even suggests that the cooperative skills we've had to develop to be effective warriors have turned into the modern ability to work towards a common goal.

These ideas emerged at a conference last month on the evolutionary origins of war at the University of Oregon in Eugene. "The picture that was painted was quite consistent," says Mark Van Vugt, an evolutionary psychologist at the University of Kent, UK. "Warfare has been with us for at least several tens, if not hundreds, of thousands of years." He thinks it was already there in the common ancestor we share with chimps. "It has been a significant selection pressure on the human species," he says. In fact several fossils of early humans have wounds consistent with warfare.

Studies suggest that warfare accounts for 10 per cent or more of all male deaths in present-day hunter-gatherers. "That's enough to get your attention," says Stephen LeBlanc, an archaeologist at Harvard University's Peabody Museum in Boston.

Primatologists have known for some time that organised, lethal violence is common between groups of chimpanzees, our closest relatives. Whether between chimps or hunter-gatherers, however, intergroup violence is nothing like modern pitched battles. Instead, it tends to take the form of brief raids using overwhelming force, so that the aggressors run little risk of injury. "It's not like the Somme," says Richard Wrangham, a primatologist at Harvard University. "You go off, you make a hit, you come back again." This opportunistic violence helps the aggressors weaken rival groups and thus expand their territorial holdings.

Such raids are possible because humans and chimps, unlike most social mammals, often wander away from the main group to forage singly or in smaller groups, says Wrangham. Bonobos - which are as closely related to humans as chimps are - have little or no intergroup violence because they tend to live in habitats where food is easier to come by, so that they need not stray from the group.

If group violence has been around for a long time in human society then we ought to have evolved psychological adaptations to a warlike lifestyle. Several participants presented the strongest evidence yet that males - whose larger and more muscular bodies make them better suited for fighting - have evolved a tendency towards aggression outside the group but cooperation within it. "There is something ineluctably male about coalitional aggression - men bonding with men to engage in aggression against other men," says Rose McDermott, a political scientist at Stanford University in California.

Aggression in women, she notes, tends to take the form of verbal rather than physical violence, and is mostly one on one. Gang instincts may have evolved in women too, but to a much lesser extent, says John Tooby, an evolutionary psychologist at the University of California at Santa Barbara. This is partly because of our evolutionary history, in which men are often much stronger than women and therefore better suited for physical violence. This could explain why female gangs only tend to form in same-sex environments such as prison or high school. But women also have more to lose from aggression, Tooby points out, since they bear most of the effort of child-rearing.

Not surprisingly, McDermott, Van Vugt and their colleagues found that men are more aggressive than women when playing the leader of a fictitious country in a role-playing game. But Van Vugt's team observed more subtle responses in group bonding. For example, male undergraduates were more willing than women to contribute money towards a group effort - but only when competing against rival universities. If told instead that the experiment was to test their individual responses to group cooperation, men coughed up less cash than women did. In other words, men's cooperative behaviour only emerged in the context of intergroup competition (Psychological Science, vol 18, p 19).

Some of this behaviour could arguably be attributed to conscious mental strategies, but anthropologist Mark Flinn of the University of Missouri at Columbia has found that group-oriented responses occur on the hormonal level, too. He found that cricket players on the Caribbean island of Dominica experience a testosterone surge after winning against another village. But this hormonal surge, and presumably the dominant behaviour it prompts, was absent when the men beat a team from their own village, Flinn told the conference. "You're sort of sending the signal that it's play. You're not asserting dominance over them," he says. Similarly, the testosterone surge a man often has in the presence of a potential mate is muted if the woman is in a relationship with his friend. Again, the effect is to reduce competition within the group, says Flinn. "We really are different from chimpanzees in our relative amount of respect for other males' mating relationships."

The net effect of all this is that groups of males take on their own special dynamic. Think soldiers in a platoon, or football fans out on the town: cohesive, confident, aggressive - just the traits a group of warriors needs.

Chimpanzees don't go to war in the way we do because they lack the abstract thought required to see themselves as part of a collective that expands beyond their immediate associates, says Wrangham. However, "the real story of our evolutionary past is not simply that warfare drove the evolution of social behaviour," says Samuel Bowles, an economist at the Santa Fe Institute in New Mexico and the University of Siena, Italy. The real driver, he says, was "some interplay between warfare and the alternative benefits of peace".

Though women seem to help broker harmony within groups, says Van Vugt, men may be better at peacekeeping between groups.

Our warlike past may have given us other gifts, as well. "The interesting thing about war is we're focused on the harm it does," says Tooby. "But it requires a super-high level of cooperation." And that seems to be a heritage worth hanging on to.

http://www.newscientist.com/article/mg20026823.800-how-warfare-shaped-human-evolution.html?full=true
Title: Mammoth's Genome Sequenced
Post by: Body-by-Guinness on November 19, 2008, 11:26:09 AM
News -  November 19, 2008

Scientists Sequence Half the Woolly Mammoth's Genome
Study could be a step toward resurrecting a long-extinct animal
By Kate Wong


Editor's note: This story will appear in our January issue but is being posted early because of a publication in today's Nature.

Thousands of years after the last woolly mammoth lumbered across the tundra, scientists have sequenced a whopping 50 percent of the beast’s nuclear genome, they report in a new study. Earlier attempts to sequence the DNA of these icons of the Ice Age produced only tiny quantities of code. The new work marks the first time that so much of the genetic material of an extinct creature has been retrieved. Not only has the feat provided insight into the evolutionary history of mammoths, but it is a step toward realizing the science-fiction dream of being able to resurrect a long-gone animal.

Researchers led by Webb Miller and Stephan C. Schuster of Pennsylvania State University extracted the DNA from hair belonging to two Siberian woolly mammoths and ran it through a machine that conducts so-called high-throughput sequencing. Previously, the largest amount of DNA from an extinct species comprised around 13 million base pairs—not even 1 percent of the genome. Now, writing in the November 20 issue of Nature, the team reports having obtained more than three billion base pairs. “It’s a technical breakthrough,” says ancient-DNA expert Hendrik N. Poinar of McMaster University in Ontario.

Interpretation of the sequence is still nascent, but the results have already helped overturn a long-held assumption about the proboscidean past. Received wisdom holds that the woolly mammoth was the last of a line of species in which each one begat the next, with only one species existing at any given time. The nuclear DNA reveals that the two mammoths that yielded the DNA were quite different from each other, and they seem to belong to populations that diverged 1.5 million to two million years ago. This finding confirms the results of a recent study of the relatively short piece of DNA that resides in the cell’s energy-producing organelles—called mitochondrial DNA—which suggested that multiple species of woolly mammoth coexisted. “It looks like there was speciation that we were previously unable to detect” using fossils alone, Ross D. E. MacPhee of the American Museum of Natural History in New York City observes.

Thus far the mammoth genome exists only in bits and pieces: it has not yet been assembled. The researchers are awaiting completion of the genome of the African savanna elephant, a cousin of the woolly mammoth, which will serve as a road map for how to reconstruct the extinct animal’s genome.

Armed with complete genomes for the mammoth and its closest living relative, the Asian elephant, scientists may one day be able to bring the mammoth back from the beyond. “A year ago I would have said this was science fiction,” Schuster remarks. But as a result of this sequencing achievement, he now believes one could theoretically modify the DNA in the egg of an elephant to match that of its furry cousin by artificially introducing the appropriate substitutions to the genetic code. Based on initial comparisons of mammoth and elephant DNA, he estimates that around 400,000 changes would produce an animal that looks a lot like a mammoth; an exact replica would require several million.

(The recent cloning of frozen mice is not applicable to woolly mammoths, Schuster believes, because whereas mice are small and therefore freeze quickly, a mammoth carcass would take many days to ice over—a delay that would likely cause too much DNA degradation for cloning.)

In the nearer term, biologists are hoping to glean insights into such mysteries as how woolly mammoths were adapted to their frigid world and what factors led to their demise. Miller notes that by studying the genomes of multiple mammoths from different time periods, researchers will be able to chart the decrease in genetic diversity as the species died out. The downfall of the mammoths and other species may contain lessons for modern fauna in danger of disappearing, he says.

Indeed, the team is now sequencing DNA they have obtained from a thylacine, an Australian marsupial that went extinct in 1936, possibly as a result of infection. They want to compare its DNA with that of the closely related Tasmanian devil, which is currently under threat from a devastating facial cancer.

“We’re hoping to learn why one species went extinct and the other didn’t and then use that [knowledge] in conservation efforts,” Miller says. If the research turns up genes associated with survival, scientists can use that information to develop a breeding program for the Tasmanian devil that maximizes the genetic diversity of the population—and increases the frequency of genes that confer immunity. Perhaps the greatest promise of ancient DNA is not raising the dead but preserving the living.

http://www.sciam.com/article.cfm?id=woolly-mammoth-genome-sequenced
Title: Re: Evolutionary biology/psychology
Post by: Crafty_Dog on November 19, 2008, 05:47:59 PM
What an extraordinary world we live in!

What does "proboscidean" mean?
Title: Re: Evolutionary biology/psychology
Post by: Body-by-Guinness on November 19, 2008, 06:54:47 PM
I assumed a root of "proboscis" and so figured "of bigged nosed creatures."
Title: Re: Evolutionary biology/psychology
Post by: Crafty_Dog on November 20, 2008, 12:46:00 AM
Duh :oops:
Title: Somerville: Aping their betters
Post by: Crafty_Dog on December 06, 2008, 06:09:35 AM
Margaret Somerville | Friday, 5 December 2008
Aping their betters
If animals co-operate to benefit their community, does it mean they are ethical beings?

Recently, I participated in a round-table discussion, “Apes or Angels: What is the Origin of Ethics?” at McGill University. It was billed as honouring the 150th anniversary of the publication of Charles Darwin's Theory of Natural Selection.

The issue on the table was whether the ethical system that underlies "our unique social and economic system ... that leads us to rely on the support and co-operation of other individuals, largely unknown to one another" is simply the result of evolution through natural selection and a more advanced form of the social co-operation we see in animals; or whether "our social behaviour and the ethics on which it is based [are] uniquely human and owe nothing to the processes that govern societies of ants or bacteria. Our bodies may have evolved, but our ethics requires another kind of explanation."

In short, are ethics and morality in humans just one more outcome of natural selection through evolution, or do they have some other origin?

My co-panelists included world-renowned evolutionary biologists; distinguished academics specializing in researching the relation of economics and evolutionary biology; an anthropologist with expertise on co-operative behaviour in apes and monkeys; and a global leader in the field of evolution education, whose expert witness testimony in the U.S. federal trial on biological evolution, education and the U.S. constitution, contributed to the court ruling that the teaching of intelligent design in high-school science classes was unconstitutional.

I was a loner as an ethicist and, possibly, the only person who thought that humans were not just an improved version of other animals in terms of ethical behaviour.

First, we discussed whether we could say animals had a sense of ethics. My co-panelists referred to research that shows primates perceive and become angry when they can see they are not being treated fairly -- for instance, when one gets a bigger reward for a certain response than another. They explained that animals form community and act to maximize benefit to the community, including through self-sacrifice. They proposed that these behaviours were early forms of ethical conduct and that it was relevant in tracing and understanding the evolution of ethics in humans to know when these behaviours first appeared, in which animals, and at what point on the evolutionary tree.

This approach reflects a range of crucial assumptions.

First, that ethics -- and one assumes morality, as ethics is based on morality -- is just a genetically determined characteristic not unique to humans. Genetic reductionism is a view that we are nothing more than "gene machines", including with respect to our most "human" characteristics, such as ethics.

We probably have genes that give us the capacity to seek ethics. (These genes might need to be activated by certain experiences or learning. We can imagine them as being like a TV set: we need it to see a telecast, but it doesn't determine what we see.) I propose, however, that ethics consists of more than just a genetically programmed response.

Ethics require moral judgment. That requires deciding between right and wrong. As far as we know, animals are not capable of doing that. There's a major difference between engaging in social conduct that benefits the community, as some animals do, and engaging in that same conduct because it would be ethically wrong not to do so, as humans do.

My colleagues believed ethics were not unique to humans. Definition is a problem here. If ethics are broadly defined to encompass certain animal behaviour, they are correct. But if ethics are the practical application of morality, then to say animals have ethics is to attribute a moral instinct to them.

My colleagues' approach postulates an ethics continuum on which humans are just more "ethically advanced" than animals -- that is, there is only a difference in degree, not a difference in kind, between humans and animals with respect to having a capacity to be ethical.

Whether animals and humans are just different-in-degree or different-in-kind ("special" and, therefore, deserve "special respect") is at the heart of many of the most important current ethical conflicts, including those about abortion, human embryonic stem cell research, new reproductive technologies, and euthanasia.

Princeton philosopher Peter Singer is an "only a difference in degree" adherent. He says we are all animals and, therefore, giving preferential treatment to humans is "speciesism" -- wrongful discrimination on the basis of species identity. Animals and humans deserve the same respect. What we wouldn't do to humans we shouldn't do to animals; and what we would do for animals -- for instance, euthanasia -- we should do for humans.

MIT artificial intelligence and robotics scientist, Rodney Brooks, argues the same on behalf of robots. He claims that those which are more intelligent than us will deserve greater respect than we do.

In contrast, I believe that humans are "special" (different-in-kind) as compared with other animals and, consequently, deserve "special respect".

Traditionally, we have used the idea that humans have a soul and animals don't to justify our differential treatment of humans and animals in terms of the respect they deserve. But soul is no longer a universally accepted concept.

Ethics can, however, be linked to a metaphysical base without needing to invoke religious or supernatural features or beliefs. We could speak of a secular "human spirit" nature or, as German philosopher Jurgen Habermas describes it, an "ethics of the human species". I propose that ethics necessarily involve some transcendent experience, one that humans can have and animals cannot.

And I want to make clear that we can believe in evolution and also believe in God. The dichotomy often made in the media between being "atheist-anti-religion/pro-evolution," on the one hand, and "believer-pro-religion/anti-evolution," on the other, does not reflect reality. Evolution and a belief in God are not, as Richard Dawkins argues, incompatible.

The argument that it is dangerous to abandon the ideas that humans are special and that a moral instinct and the search for ethics are uniquely human was greeted with great skepticism by my colleagues, who seemed to think that only religious people would hold such views.

To conclude: "Do ants have ethics?" -- that is, Does the behaviour, bonding and the formation of community in animals have a different base from that in humans? How we answer that question is of immense importance, because it will have a major impact on the ethics we hand on to future generations.

Margaret Somerville is director of the Centre for Medicine, Ethics and Law at McGill University and author of The Ethical Imagination: Journeys of the Human Spirit.

Title: "Altruistic Punishment"
Post by: Body-by-Guinness on December 09, 2008, 02:22:47 PM
Spare the Rod, Spoil Society
Does punishing free riders increase long-run cooperation?

Ronald Bailey | December 9, 2008
Want to punish the fat cats on Wall Street who have allegedly wrecked the economy? Of course you do! And it's only natural, according to research done by two economists whose work focuses on the puzzle of human cooperation. As Swiss researchers Ernst Fehr and Simon Gächter noted in 2002, "Unlike other creatures, people frequently cooperate with genetically unrelated strangers, often in large groups." They argued that one significant key to cooperation is the existence of "altruistic punishment."

During the course of human evolution, people frequently engaged in cooperative activities such as big game hunting and the preservation of common property resources like fisheries. But it's all too easy for individuals to free ride on such projects. So how does true cooperation occur? Fehr and Gächter point to altruistic punishers: people who respond with strong emotion or even violence when someone else benefits from the labor of others without contributing something themselves. It may cost something to act as altruistic punisher, but going postal on non-cooperators does encourage everybody to contribute to the public good. The New York Times even speculated that this drive to punish free riders was behind the American public's disinclination to support the Congressional bailout proposals back in September.

To test their hypothesis, Fehr and Gächter set up a series of public goods experiments in which each player chose how much money to contribute to a joint investment—without knowing beforehand how much the other players would contribute. If everyone puts in a lot, they maximize their profits. However, the games were rigged such that non-cooperators could gain even more by taking a share of the profits while retaining their own initial endowments. The researchers found that when punishing non-cooperators was possible (say, spend $1 to reduce the free riders' endowments by $3), it substantially increased the amount that nearly all subjects invested in the public good. For example, in experiments done in 2000, where free riders could be punished, experimental subjects contributed two to four times more than when there was no punishment option.
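The incentive structure described above can be sketched in a few lines of code. This is a minimal illustration only: the specific endowment and pool-multiplier values are assumptions for the sake of the example, not the exact figures Fehr and Gächter used, though the 1:3 punishment ratio matches the "$1 to reduce by $3" scheme mentioned in the article.

```python
def public_goods_payoffs(contributions, endowment=20, multiplier=1.6):
    """Each player keeps (endowment - contribution) and receives an
    equal share of the multiplied common pool."""
    pool = sum(contributions) * multiplier
    share = pool / len(contributions)
    return [endowment - c + share for c in contributions]

def apply_punishment(payoffs, punishments, cost=1, impact=3):
    """punishments[i][j] = points player i spends to punish player j.
    Each point costs the punisher `cost` and removes `impact`
    from the target (the 1:3 ratio described in the article)."""
    out = list(payoffs)
    n = len(out)
    for i in range(n):
        for j in range(n):
            p = punishments[i][j]
            out[i] -= cost * p
            out[j] -= impact * p
    return out

# Three cooperators each contribute 10; one free rider contributes 0.
base = public_goods_payoffs([10, 10, 10, 0])
# Before punishment, the free rider earns the most.
assert base[3] == max(base)

# If each cooperator spends 2 points punishing the free rider,
# the free rider ends up worse off than everyone else.
pun = [[0] * 4 for _ in range(4)]
for i in range(3):
    pun[i][3] = 2
after = apply_punishment(base, pun)
assert after[3] == min(after)
```

The sketch shows why free riding is individually rational without punishment (the free rider pockets the endowment plus a full share of the pool) and why even costly punishment can flip that incentive.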

However, Dutch experimenters noted that public goods games where punishment was allowed actually produced lower overall returns than did games in which no punishment occurred. Why? Because the destruction of the non-cooperators' resources was greater than the subsequent gains from cooperation. Punishment increases cooperation, but it also makes the group poorer. This is not a particularly inspiring outcome.

In a new study published last week in Science, however, Gächter and his colleagues show that the lower overall returns from games in which punishment is possible may be an experimental artifact resulting from the number of rounds in which the games are played. In most experimental conditions, public goods games are played for ten rounds or less. The new research compares the outcomes of games lasting for 10 rounds versus 50 rounds, both when punishment is possible and when it is not.

What happens when players have only ten rounds in which to invest? As satisfying as it is to punish free riders, the average payoffs after ten rounds are indeed lower than when no punishment is allowed. The results are quite different when the games last 50 rounds. Interestingly, the payoffs in the initial rounds when punishment is possible are lower than the payoffs when punishment can't occur. However, as rounds of play accumulate, the payoffs in the games where free riders can be punished rise rapidly, while the payoffs in games in which free riders are not punished drop throughout the duration of play.

Another happy result is that once players understand that they can be punished for free riding, they start investing enthusiastically in the common pool, causing the costs of punishment to drop to near zero. "Overall, our experiments show that punishment not only increases cooperation, it also makes groups and individuals better off in the long run because the costs of punishment become negligible and are outweighed by the increased gains from cooperation," conclude the researchers. "These results support group selection models of cooperation and punishment, which require that punishment increases not only cooperation but also group average payoffs."

So punishing free riders increases cooperation and boosts incomes over the long-run. But that is not always the case (at least in shorter-term games). Earlier this year, Gächter and his colleagues reported the results from a series of public goods games using players from 16 different societies. Their research turned up profound cross-cultural differences in response to punishment. All groups punished free riders, but the free riders did not all respond with increased cooperation. Instead, some sought revenge by punishing their punishers—if you whack me, I'll whack you, in other words. So a cycle of vendettas broke out.

For example, players from Muscat (in Oman), Greece, and Saudi Arabia were the most vengeful. On the other hand, players from the United States, Australia, and Britain were the least vengeful and most likely to respond to punishment with increased cooperation. The researchers concluded that revenge is stronger among participants from "societies with weak norms of civic cooperation and a weak rule of law." Not surprisingly, the overall payoffs were significantly lower in the games in which participants indulged in cycles of vengeance.

We've come a long way from the bands of Pleistocene hunter-gatherers in which these psychological tendencies evolved. In today's complex economy, which encompasses globe-spanning webs of cooperation, how do people correctly identify free riders who merit punishment? Are the investment bankers with big bonuses free riders? What about hedge fund managers? Government agencies? Politicians? Perhaps the good news from experimental economics is that while Americans want to punish free riders as much as the next guys do, we are unlikely to engage in a self-defeating cycle of financial vengeance that will make us all poorer.

Ronald Bailey is reason's science correspondent. His book Liberation Biology: The Scientific and Moral Case for the Biotech Revolution is now available from Prometheus Books.

http://www.reason.com/news/show/130472.html
Title: Predictions confirmed
Post by: Crafty_Dog on December 11, 2008, 01:07:43 PM
Evolutionary theory predictions confirmed

http://chem.tufts.edu/AnswersInScience/evo_science.html
Title: A HISTORY OF VIOLENCE Stephen Pinker
Post by: rachelg on December 14, 2008, 05:30:22 PM
http://www.edge.org/3rd_culture/pinker07/pinker07_index.html


A HISTORY OF VIOLENCE

In sixteenth-century Paris, a popular form of entertainment was cat-burning, in which a cat was hoisted in a sling on a stage and slowly lowered into a fire. According to historian Norman Davies, "[T]he spectators, including kings and queens, shrieked with laughter as the animals, howling with pain, were singed, roasted, and finally carbonized." Today, such sadism would be unthinkable in most of the world. This change in sensibilities is just one example of perhaps the most important and most underappreciated trend in the human saga: Violence has been in decline over long stretches of history, and today we are probably living in the most peaceful moment of our species' time on earth.

In the decade of Darfur and Iraq, and shortly after the century of Stalin, Hitler, and Mao, the claim that violence has been diminishing may seem somewhere between hallucinatory and obscene. Yet recent studies that seek to quantify the historical ebb and flow of violence point to exactly that conclusion.

Some of the evidence has been under our nose all along. Conventional history has long shown that, in many ways, we have been getting kinder and gentler. Cruelty as entertainment, human sacrifice to indulge superstition, slavery as a labor-saving device, conquest as the mission statement of government, genocide as a means of acquiring real estate, torture and mutilation as routine punishment, the death penalty for misdemeanors and differences of opinion, assassination as the mechanism of political succession, rape as the spoils of war, pogroms as outlets for frustration, homicide as the major form of conflict resolution—all were unexceptionable features of life for most of human history. But, today, they are rare to nonexistent in the West, far less common elsewhere than they used to be, concealed when they do occur, and widely condemned when they are brought to light.

At one time, these facts were widely appreciated. They were the source of notions like progress, civilization, and man's rise from savagery and barbarism. Recently, however, those ideas have come to sound corny, even dangerous. They seem to demonize people in other times and places, license colonial conquest and other foreign adventures, and conceal the crimes of our own societies. The doctrine of the noble savage—the idea that humans are peaceable by nature and corrupted by modern institutions—pops up frequently in the writing of public intellectuals like José Ortega y Gasset ("War is not an instinct but an invention"), Stephen Jay Gould ("Homo sapiens is not an evil or destructive species"), and Ashley Montagu ("Biological studies lend support to the ethic of universal brotherhood"). But, now that social scientists have started to count bodies in different historical periods, they have discovered that the romantic theory gets it backward: Far from causing us to become more violent, something in modernity and its cultural institutions has made us nobler.

To be sure, any attempt to document changes in violence must be soaked in uncertainty. In much of the world, the distant past was a tree falling in the forest with no one to hear it, and, even for events in the historical record, statistics are spotty until recent periods. Long-term trends can be discerned only by smoothing out zigzags and spikes of horrific bloodletting. And the choice to focus on relative rather than absolute numbers brings up the moral imponderable of whether it is worse for 50 percent of a population of 100 to be killed or 1 percent in a population of one billion.

Yet, despite these caveats, a picture is taking shape. The decline of violence is a fractal phenomenon, visible at the scale of millennia, centuries, decades, and years. It applies over several orders of magnitude of violence, from genocide to war to rioting to homicide to the treatment of children and animals. And it appears to be a worldwide trend, though not a homogeneous one. The leading edge has been in Western societies, especially England and Holland, and there seems to have been a tipping point at the onset of the Age of Reason in the early seventeenth century.

At the widest-angle view, one can see a whopping difference across the millennia that separate us from our pre-state ancestors. Contra leftist anthropologists who celebrate the noble savage, quantitative body-counts—such as the proportion of prehistoric skeletons with axemarks and embedded arrowheads or the proportion of men in a contemporary foraging tribe who die at the hands of other men—suggest that pre-state societies were far more violent than our own. It is true that raids and battles killed a tiny percentage of the numbers that die in modern warfare. But, in tribal violence, the clashes are more frequent, the percentage of men in the population who fight is greater, and the rates of death per battle are higher. According to anthropologists like Lawrence Keeley, Stephen LeBlanc, Phillip Walker, and Bruce Knauft, these factors combine to yield population-wide rates of death in tribal warfare that dwarf those of modern times. If the wars of the twentieth century had killed the same proportion of the population that die in the wars of a typical tribal society, there would have been two billion deaths, not 100 million.
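The two-billion figure is a proportional-scaling claim, and it can be checked with back-of-the-envelope arithmetic. The inputs below are illustrative assumptions (a roughly 15 percent rate of death in tribal warfare, often attributed to Keeley's data, and very roughly 13 billion people alive at some point during the twentieth century); only the 100 million actual toll comes from the text:

```python
# Back-of-the-envelope check of the proportional-scaling claim above.
tribal_war_death_rate = 0.15    # assumed: ~15% of people in tribal societies die in warfare
people_in_20th_century = 13e9   # assumed: people alive at some point in the century, very rough
actual_war_deaths = 100e6       # figure given in the text

hypothetical_deaths = tribal_war_death_rate * people_in_20th_century
print(hypothetical_deaths / 1e9)                # on the order of 2 billion
print(hypothetical_deaths / actual_war_deaths)  # roughly a 20-fold difference
```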

Political correctness from the other end of the ideological spectrum has also distorted many people's conception of violence in early civilizations—namely, those featured in the Bible. This supposed source of moral values contains many celebrations of genocide, in which the Hebrews, egged on by God, slaughter every last resident of an invaded city. The Bible also prescribes death by stoning as the penalty for a long list of nonviolent infractions, including idolatry, blasphemy, homosexuality, adultery, disrespecting one's parents, and picking up sticks on the Sabbath. The Hebrews, of course, were no more murderous than other tribes; one also finds frequent boasts of torture and genocide in the early histories of the Hindus, Christians, Muslims, and Chinese.

At the century scale, it is hard to find quantitative studies of deaths in warfare spanning medieval and modern times. Several historians have suggested that there has been an increase in the number of recorded wars across the centuries to the present, but, as political scientist James Payne has noted, this may show only that "the Associated Press is a more comprehensive source of information about battles around the world than were sixteenth-century monks." Social histories of the West provide evidence of numerous barbaric practices that became obsolete in the last five centuries, such as slavery, amputation, blinding, branding, flaying, disembowelment, burning at the stake, breaking on the wheel, and so on. Meanwhile, for another kind of violence—homicide—the data are abundant and striking. The criminologist Manuel Eisner has assembled hundreds of homicide estimates from Western European localities that kept records at some point between 1200 and the mid-1990s. In every country he analyzed, murder rates declined steeply—for example, from 24 homicides per 100,000 Englishmen in the fourteenth century to 0.6 per 100,000 by the early 1960s.

On the scale of decades, comprehensive data again paint a shockingly happy picture: Global violence has fallen steadily since the middle of the twentieth century. According to the Human Security Brief 2006, the number of battle deaths in interstate wars has declined from more than 65,000 per year in the 1950s to less than 2,000 per year in this decade. In Western Europe and the Americas, the second half of the century saw a steep decline in the number of wars, military coups, and deadly ethnic riots.

Zooming in by a further power of ten exposes yet another reduction. After the cold war, every part of the world saw a steep drop-off in state-based conflicts, and those that do occur are more likely to end in negotiated settlements rather than being fought to the bitter end. Meanwhile, according to political scientist Barbara Harff, between 1989 and 2005 the number of campaigns of mass killing of civilians decreased by 90 percent.

The decline of killing and cruelty poses several challenges to our ability to make sense of the world. To begin with, how could so many people be so wrong about something so important? Partly, it's because of a cognitive illusion: We estimate the probability of an event from how easy it is to recall examples. Scenes of carnage are more likely to be relayed to our living rooms and burned into our memories than footage of people dying of old age. Partly, it's an intellectual culture that is loath to admit that there could be anything good about the institutions of civilization and Western society. Partly, it's the incentive structure of the activism and opinion markets: No one ever attracted followers and donations by announcing that things keep getting better. And part of the explanation lies in the phenomenon itself. The decline of violent behavior has been paralleled by a decline in attitudes that tolerate or glorify violence, and often the attitudes are in the lead. As deplorable as they are, the abuses at Abu Ghraib and the lethal injections of a few murderers in Texas are mild by the standards of atrocities in human history. But, from a contemporary vantage point, we see them as signs of how low our behavior can sink, not of how high our standards have risen.

The other major challenge posed by the decline of violence is how to explain it. A force that pushes in the same direction across many epochs, continents, and scales of social organization mocks our standard tools of causal explanation. The usual suspects—guns, drugs, the press, American culture—aren't nearly up to the job. Nor could it possibly be explained by evolution in the biologist's sense: Even if the meek could inherit the earth, natural selection could not favor the genes for meekness quickly enough. In any case, human nature has not changed so much as to have lost its taste for violence. Social psychologists find that at least 80 percent of people have fantasized about killing someone they don't like. And modern humans still take pleasure in viewing violence, if we are to judge by the popularity of murder mysteries, Shakespearean dramas, Mel Gibson movies, video games, and hockey.

What has changed, of course, is people's willingness to act on these fantasies. The sociologist Norbert Elias suggested that European modernity accelerated a "civilizing process" marked by increases in self-control, long-term planning, and sensitivity to the thoughts and feelings of others. These are precisely the functions that today's cognitive neuroscientists attribute to the prefrontal cortex. But this only raises the question of why humans have increasingly exercised that part of their brains. No one knows why our behavior has come under the control of the better angels of our nature, but there are four plausible suggestions.

The first is that Hobbes got it right. Life in a state of nature is nasty, brutish, and short, not because of a primal thirst for blood but because of the inescapable logic of anarchy. Any beings with a modicum of self-interest may be tempted to invade their neighbors to steal their resources. The resulting fear of attack will tempt the neighbors to strike first in preemptive self-defense, which will in turn tempt the first group to strike against them preemptively, and so on. This danger can be defused by a policy of deterrence—don't strike first, retaliate if struck—but, to guarantee its credibility, parties must avenge all insults and settle all scores, leading to cycles of bloody vendetta. These tragedies can be averted by a state with a monopoly on violence, because it can inflict disinterested penalties that eliminate the incentives for aggression, thereby defusing anxieties about preemptive attack and obviating the need to maintain a hair-trigger propensity for retaliation. Indeed, Eisner and Elias attribute the decline in European homicide to the transition from knightly warrior societies to the centralized governments of early modernity. And, today, violence continues to fester in zones of anarchy, such as frontier regions, failed states, collapsed empires, and territories contested by mafias, gangs, and other dealers of contraband.
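The deterrence logic above is essentially a two-by-two game in which striking first strictly dominates waiting, until a third party penalizes aggression. A minimal sketch, with payoffs invented purely to exhibit that structure (they come from no cited model):

```python
# payoff[a][b] is a player's payoff when it plays action a ("wait" or
# "strike") and the neighbor plays action b. All numbers are assumptions
# chosen only to illustrate the Hobbesian-trap logic.

def dominant_action(payoff):
    """Return the action that is a strictly better reply to every opposing
    action, or None if neither action dominates."""
    actions = list(payoff)
    for a in actions:
        others = [x for x in actions if x != a]
        if all(payoff[a][b] > payoff[o][b] for o in others for b in actions):
            return a
    return None

# Anarchy: striking first grabs resources and pre-empts being attacked.
anarchy = {
    "wait":   {"wait": 3, "strike": 0},
    "strike": {"wait": 4, "strike": 1},
}
assert dominant_action(anarchy) == "strike"   # both strike; each gets 1 instead of 3

# A state fines any attacker 3 tokens: the "disinterested penalty" above.
fine = 3
leviathan = {a: {b: anarchy[a][b] - (fine if a == "strike" else 0)
                 for b in anarchy} for a in anarchy}
assert dominant_action(leviathan) == "wait"   # peace becomes each player's best reply

print("under anarchy:", dominant_action(anarchy))
print("with a state:", dominant_action(leviathan))
```

The point of the sketch is only that a penalty applied impartially to aggressors flips the dominant strategy, which removes the rationale for preemption and retaliation alike.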

Payne suggests another possibility: that the critical variable in the indulgence of violence is an overarching sense that life is cheap. When pain and early death are everyday features of one's own life, one feels fewer compunctions about inflicting them on others. As technology and economic efficiency lengthen and improve our lives, we place a higher value on life in general.

A third theory, championed by Robert Wright, invokes the logic of non-zero-sum games: scenarios in which two agents can each come out ahead if they cooperate, such as trading goods, dividing up labor, or sharing the peace dividend that comes from laying down their arms. As people acquire know-how that they can share cheaply with others and develop technologies that allow them to spread their goods and ideas over larger territories at lower cost, their incentive to cooperate steadily increases, because other people become more valuable alive than dead.

Then there is the scenario sketched by philosopher Peter Singer. Evolution, he suggests, bequeathed people a small kernel of empathy, which by default they apply only within a narrow circle of friends and relations. Over the millennia, people's moral circles have expanded to encompass larger and larger polities: the clan, the tribe, the nation, both sexes, other races, and even animals. The circle may have been pushed outward by expanding networks of reciprocity, à la Wright, but it might also be inflated by the inexorable logic of the golden rule: The more one knows and thinks about other living things, the harder it is to privilege one's own interests over theirs. The empathy escalator may also be powered by cosmopolitanism, in which journalism, memoir, and realistic fiction make the inner lives of other people, and the contingent nature of one's own station, more palpable—the feeling that "there but for fortune go I".

Whatever its causes, the decline of violence has profound implications. It is not a license for complacency: We enjoy the peace we find today because people in past generations were appalled by the violence in their time and worked to end it, and so we should work to end the appalling violence in our time. Nor is it necessarily grounds for optimism about the immediate future, since the world has never before had national leaders who combine pre-modern sensibilities with modern weapons.

But the phenomenon does force us to rethink our understanding of violence. Man's inhumanity to man has long been a subject for moralization. With the knowledge that something has driven it dramatically down, we can also treat it as a matter of cause and effect. Instead of asking, "Why is there war?" we might ask, "Why is there peace?" From the likelihood that states will commit genocide to the way that people treat cats, we must have been doing something right. And it would be nice to know what, exactly, it is.

[First published in The New Republic, 3.19.07.]
Title: Re: Evolutionary biology/psychology
Post by: Crafty_Dog on December 15, 2008, 12:35:35 AM
In contrast to the notions of this article are the writings of Austrian ethologist (ethology: the study of animal behavior) and Nobel Laureate Konrad Lorenz who, like psychologist Carl Jung and Sandhurst military historian John Keegan (A History of Warfare), points out that over time war has become ever more efficient in its brutality.  Working from memory: the deaths of the American Civil War exceeded what went before, yet were exceeded by the trench warfare of WW1, then by the 20 million or so killed by Stalin and the tens of millions killed by Mao, then by WW2 (including the use of nuclear weapons, etc.)

Against the long-term trend, it is risky to see the last few decades as a historical turning point.  It could be, but there's plenty to suggest otherwise as well.
Title: Re: Evolutionary biology/psychology
Post by: rachelg on December 17, 2008, 06:17:36 PM
Against the long-term trend, it is risky to see the last few decades as a historical turning point.  It could be, but there's plenty to suggest otherwise as well.


It may certainly be too soon to tell if the last few decades are a turning point, but the statistics are currently not in favor of the idea that feminism/working mothers are causing rising violence and the breakdown of society.


I'm a little "overbooked" on economic stuff lately, but I will add Wright's book to my to-be-read list.
Title: Re: Evolutionary biology/psychology
Post by: Crafty_Dog on December 17, 2008, 11:46:29 PM
"It may certainly be too soon to tell if the last few decades are a turning point, but the statistics are currently not in favor of the idea that feminism/working mothers are causing rising violence and the breakdown of society."

Delivered with panache and wit :lol: but I still insist upon the point that mothers matter, and when they disappear from their children's lives the consequences are profound.
Title: Lionel Tiger: Monkeys and Utopia
Post by: Crafty_Dog on December 26, 2008, 11:52:53 PM
By LIONEL TIGER
Reveries about human perfection do not exist solely in the enthusiastic systems confected by Karl Marx, or in the REM sleep of Hugo Chávez, or through the utopian certainties of millenarians. There has been a persistent belief through countless societies that life is better, much better, somewhere else. In some yet-unfound reality there is an expression of our best natures -- our loving, peaceful, lyrically fair human core.

Anthropologists have been at the center of this quest, its practitioners sailing off to find that elusive core of perfection everywhere else corrupted by civilization. In the 1920s, Margaret Mead found it in Samoa, where the people, she said, enjoyed untroubled lives. Adolescents in particular were not bothered by the sexual hang-ups that plague our repressive society. Decades later an Australian researcher, Derek Freeman, retraced her work and successfully challenged its validity. Still, Mead's work and that of others reinforced the notion that our way of life was artificial, inauthentic, just plain wrong.

Enter primatology, which provided yet more questions about essential hominid nature -- and from which species we could, perhaps, derive guidance about our inner core. First studied in the wild were the baboons, which turned out to have harsh power politics and sexual inequity. Then Jane Goodall brought back heartwarming film of African chimps who were loving, loyal, fine mothers, with none of the militarism of the big bad baboons. But her subjects were well fed, and didn't need to scratch for a living in their traditional way. Later it became clear that chimps in fact formed hunting posses. They tore baby baboons they captured limb from limb, and seemed to enjoy it.

Where to look now for that perfect, pacifistic and egalitarian core? Frans de Waal, a talented and genial primatologist, observed the behavior of bonobos at Emory University's primate lab in the 1980s. These chimpanzees, he found, engaged in a dramatic amount of sexual activity both genital and oral, heterosexual and homosexual -- and when conflicts threatened to arise a bout of sex settled the score and life went on. Bonobos made love, not war. No hunting, killing, male dominance, or threats to the sunny paradise of a species so closely related to us. His research attracted enormous attention outside anthropology. Why not? How can this lifestyle not be attractive to those of us struggling on a committee, in a marriage, and seeking lubricious resolution?

Alas, Mr. de Waal also hadn't studied his species in the wild. And, with a disappointing shock in some quarters, for the past five years bonobos have been studied in their natural habitat in a national park in the Congo.

There, along with colleagues, Gottfried Hohmann of the Max Planck Institute for Evolutionary Anthropology in Leipzig has seen groups of bonobos engage in clearly willful and challenging hunts. Indeed, female bonobos took full part in the roughly 10 organized hunts that have been observed thus far. Another paradise lost.

Reveries about hidden human perfection centered in primate life have been sharply curtailed by what we've learned about the Malibu ape -- when it seeks its own food, doesn't live in an easy-hook-up dormitory, and may confront severe challenges in life.

Bonobo, we hardly know you.

Mr. Tiger is the Charles Darwin professor of anthropology at Rutgers University.
Title: Better Recipes Cause Progress, I
Post by: Body-by-Guinness on January 07, 2009, 01:18:37 PM
Interesting interview, one that causes me to reflect on the dark age panic mongers would impose upon the world.

'Chiefs, Thieves, and Priests'
Science writer Matt Ridley on the causes of poverty and prosperity

Ronald Bailey | February 2009 Print Edition

Matt Ridley, an Oxford-educated zoologist, turned to journalism in 1983, when he got a job as The Economist’s science reporter. He soon became the magazine’s Washington correspondent and eventually served as its American editor. This time in the United States had a profound intellectual effect on Ridley, ultimately leading him to become a self-described classical liberal, a “person who believes in economic freedom and social freedom, too.”

Ridley, 50, has written several superb books that combine clear explanations of complex biology with discussions of the science’s implications for human society. In The Origins of Virtue: Human Instincts and the Evolution of Cooperation (1997), Ridley showed how natural selection led to human morality, including the development of property rights and our propensity to exchange. At the end he warned that government can subvert our natural tendency to cooperate. “We are not so nasty that we need to be tamed by intrusive government, nor so nice that too much government does not bring out the worst in us,” he concluded. Reviewing the book for reason, the UCLA economist Jack Hirshleifer noted that “Ridley leans in the anarchist direction.”

Written just before researchers announced the completed sequencing of the human genome, Ridley’s Genome: The Autobiography of a Species in 23 Chapters (2000) toured our 23 chromosome pairs to illustrate how genes cause disease, direct the production of proteins, and influence intelligence. While pointing out the differential heritability of many human characteristics, Ridley condemned genetic determinism and eugenics as unscientific. “Many modern accounts of the history of eugenics present it as an example of the dangers of letting science, genetics especially, out of control,” he wrote. “It is much more an example of the danger of letting government out of control.” Ridley further deflated genetic determinism in Nature via Nurture: Genes, Experience, and What Makes Us Human (2003), which explained how genes change their expression in response to environmental influences.

Ridley is now working on a book about how and why progress happens. During a visit to Blagdon Hall, Ridley’s home outside Newcastle upon Tyne, I took advantage of the author’s weakened state (he had broken his collarbone falling from a horse) to talk about the new book.

reason: What’s the book about?

Matt Ridley: My last three or four books have all argued that there is such a thing as an evolved human nature which is true all over the world and has been true throughout history. But something changes. Clearly, my life is completely different from what it would’ve been if I was an Ice Age hunter-gatherer. Technology changes. Society changes. Prosperity changes.

What I want to do is turn the question on its head and come at it from the point of view of an evolutionary biologist who looks at this species—man—which has a constant nature but has somehow acquired an ever-changing lifestyle. I want to understand what’s driving that change. Let’s give it the obvious word, even though it’s a very unfashionable one: progress. The book is about where progress came from, how it works, and, most important, how long it can continue in the future.

My major themes are specialization, exchange, technology, energy, and then population. Human beings have progressed in material living standards, on the whole, since the Stone Age, but they’ve also progressed enormously in terms of the number of people on the planet. That’s because we got better at turning the energy available into people, and the denser the population has got, the more things we’ve been able to invent that we wouldn’t have been able to invent with a sparse population. For example, if you’re going to smelt metals, you need a fairly dense population of customers before it’s worth building kilns.

Population density can also lead to reductions in the standard of living. There must be cases in history where people have tried to live at too high a density for the resources that were available to them. They’ve either then suffered one of Malthus’ positive checks—war, famine, and disease—or, and this is a slightly more original point, they’ve reduced their division of labor, i.e., they’ve returned to self-sufficiency.

If you look at the Bronze Age empires in Mesopotamia or Egypt, or the Roman Empire, or some of the Chinese dynasties, at a certain point the population density gets too high for people to be able to generate a surplus of consumption income to support trade and specialization by others, and you have to go back to being self-sufficient. Essentially that’s what happened to every surge in productivity, wealth, and technology up to the one that came around 1800, the Industrial Revolution.

At some point there’s something you’re relying on that gets more and more expensive. If you look at Mesopotamia, it deforested itself. It has to go further and further for wood, for construction. Maybe it’s food.

The English Industrial Revolution had been bubbling along very nicely in the 18th century, with fantastic increases of productivity, particularly with respect to cotton textiles. We saw a quintupling of cotton cloth output in two consecutive decades, in the 1780s and 1790s, none of it based on fossil fuels yet but based on water power.

At some point, you run out of dams. You run out of rivers in Lancashire to dam. At some point England would suffer the fate of Holland, or Venice before that, or of China, Egypt, or Japan. What did England do that others didn’t? It started using fossil fuels.

By 1870 Britain is consuming the coal equivalent to 850 million human laborers. It could have done everything it did with coal with trees, with timber, but not from its own land. Timber was bound to get more expensive the more you used of it. Coal didn’t get more expensive the more you used of it. It didn’t get particularly cheaper either, but it didn’t get more expensive, so you don’t get diminishing returns the more you use of it.

reason: One of the things that Marco Polo reported to the amazement of Europe was that those Chinese people are burning rocks. So the Chinese had access to coal already, and that extra energy didn’t make them wealthy.

Ridley: That’s right. [University of California at Irvine historian] Kenneth Pomeranz’s answer to that is very straightforward: The coal was in Shanxi and Inner Mongolia in the far northwest. Those areas got hit very soon after Marco Polo was there by a peculiar combination of barbarians and plague. They were hit much harder than the rest of China and were totally depopulated. When China revived as an economy, it was a long way away from the coal, so it had a wood-based iron industry, for example on the Yangtze, which was impossibly far from the coal mines in the far northwest.

The north of England happened to have a coal field that was near the surface and near navigable water. Remember, you cannot transport anything by bulk in the 18th century unless it’s within a very short distance of water. It happened to have a huge demand on its doorstep too.

The fossil fuel industry itself did not get much more efficient. A miner in the early 20th century is using a pony and a lamp and a pick ax like he was in the 18th century, and the product he’s producing is not a lot cheaper. But it’s not more expensive, and it’s hugely larger in volume.

reason: What institutional environment favors progress?

Ridley: It’s very clear from history that markets bring forth innovation. If you’ve got free and fair exchange with decent property rights and a sufficiently dense population, then you get innovation. That’s what happens in west Asia around 50,000 years ago: the Upper Paleolithic Revolution.

The only institution that really counts is trust, if you like. And something’s got to allow that to build. Property rights are just another expression of trust, aren’t they? I trust you to deliver this property to me. I trust somebody else to allow me to keep this property if I acquire it from you.

But human beings are spectacularly good at destroying trust-generating institutions. They do this through three creatures: chiefs, thieves, and priests.

Chiefs think, “I’m in charge, I own everything, I’m taking over, I’m going to tell everyone how to do it, and I’m going to confiscate property whenever I feel like it.” That’s what happens again and again in the Bronze Age. You get a perfectly good trading diaspora and somebody turns it into an empire.

A classic example is the Chinese retreat in the 1400s, 1500s. China got rich and technologically sophisticated around the year 1000 A.D. That’s when it’s working at its best.

Interestingly, it’s just come out of a period when it’s not unified. Once you’re unified, people keep imposing monopolies and saying there’s only one way of doing things and you’ve got to do it this way. Whereas when you’re fragmented, as Europe remained throughout this period, people can move from one polity to another until they find one they like.

If you want a recipe for how to shut down an economy, just read what the early Ming emperors did. They nationalized foreign trade. They forbade population movements within the country, so villagers weren’t allowed to migrate to towns. They forbade merchants from trading on their own account without specific permission to do specific things. You had to actually register your inventories with the imperial bureaucrats every month, that kind of thing. And they did the usual idiotic thing of building walls, invading Vietnam.

Thieves—one of the reasons for the growth of the Arab civilization in the seventh and eighth centuries must be the fact that the Red Sea was increasingly infested with pirates. It became increasingly difficult to trade with India. Byzantium was having a real problem doing it, and the Arabs had come up with a great new technology for crossing the desert called the camel train. So the rule of law to prevent thievery is also important; but the rule of too much law, to allow chiefs to take everything, is equally a risk.

Priests—well, I must admit I don’t think one can necessarily blame religion for shutting down trust, trade, and exchange. But there’s little doubt that it didn’t help in the Middle Ages, surely. I won’t go further than that.

reason: They did try to adjust prices in the marketplace. Whether that actually had an effect I don’t know.

Ridley: Usury laws and that sort of thing. That’s exactly right.

reason: So periods of rising productivity are choked off by institutional barriers. You get an over-elaboration of rules and regulations and taxation. And that’s what killed them off, not lack of fuel or lack of ingenuity, but governance that just got so bad that it stopped it. Is that plausible?

Ridley: I think that’s a big part of it. How does that fit with my story that it shuts down because of a Malthusian thing or diminishing returns on sources of energy? Do they go together, or does one explain one collapse and another explain another? I don’t know. The problem with history is it tends to be overdetermined. You’ve got lots of different things happening at once.

If we were having this conversation in 1800, I think I would have very good reason for telling you that however wonderful the prosperity you can generate by elaborating division of labor and specialization and exchange, you’re never going to be able to escape this trap that living standards don’t seem to be able to go up faster than the population. But we’re not having this conversation in 1800, and we’ve had 200 years in which we’ve shown that you can actually have a dramatic transformation of living standards in a very large portion of the world purely by elaborating the division of labor, as long as you’ve got energy amplification in there too.

Title: Better Recipes Cause Progress, II
Post by: Body-by-Guinness on January 07, 2009, 01:19:08 PM
reason: [Yale economist] William Nordhaus would say that at least 50 percent of economic growth in the 20th century is because we’re using better recipes, which is better technology.

Ridley: Absolutely. The compact fluorescent light bulb is a better recipe than the filament light bulb, which was better than the kerosene lamp, which was better than the tallow candle. If I overemphasized energy, maybe it’s just because I’ve been recently reading and writing on that subject. The proximate cause of our prosperity is technology. I quite agree.

The ultimate cause of technology is division of labor, though. The man who made a mango slicing machine in 1800 would have been lucky to sell 20, because he only had access to his village. Now he can have access through the Internet to the world, so it pays him to make something as specialized as a mango slicing device. And that makes living standards rise. My standard of living has risen because a man has made a mango slicing device that I really can use.

But I also need an awful lot of watts to run my lifestyle: to turn on the lights, to drive the machine that made my mango slicing device, to provide me with the transport that I deem necessary to make my life interesting, but in particular, to drive those container ships that are bringing my mango slicing devices from Korea.

The fact that I can now earn an hour of reading light with half a second of work, if I’m on the average American wage, whereas it took eight seconds in the 1950s, releases me to go and spend another seven and a half seconds consuming some other kind of energy, like driving my power boat across a lake where I have a recreation home which I’ve driven to in my 4x4, or even just deciding to leave the light on all night so that my daughter doesn’t have to worry about being left in the dark.
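The work-time arithmetic in that example is simple enough to check. A minimal sketch, using the figures as quoted in the interview:

```python
# Seconds of work, at the average American wage, needed to "buy"
# an hour of reading light -- figures as quoted in the interview.
cost_1950s = 8.0   # seconds of work per hour of light, 1950s
cost_today = 0.5   # seconds of work per hour of light, today

print(cost_1950s / cost_today)   # 16.0 -> light is 16x cheaper in work time
print(cost_1950s - cost_today)   # 7.5  -> seconds freed for other consumption
```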

reason: Flipping this around a little bit, what’s the cause of poverty in the modern world?

Ridley: I think lack of access to networks of exchange and specialization is the principal cause of poverty. If you find yourself in a position where you make everything yourself rather than buy it from someone else, then you are by definition poor.

Now, I buy the argument that it is possible to be poorer in the modern world than it was a couple of hundred years ago because the diseases that would’ve killed you a couple of hundred years ago can be prevented. It is conceivable that some people in Africa are living at a lower standard of living than anyone was 200 years ago.

reason: Of course, living might be considered a higher standard of living than dying.

Ridley: Well, exactly. To get hopeful, is Africa really that different from South Asia in the ’60s and ’70s? The standard of living is rising in most of Africa. There are parts where it’s not—in Congo it’s not, but in Kenya and Ghana it is. They’re not great, these countries, but they’re not regressing. The health outcomes are improving pretty dramatically, child mortality in particular. Fertility is falling, as it does after child mortality has started falling.

And you also have got the beginnings of an explosion of entrepreneurship that will allow them to leapfrog onto new technologies that were not available. The lack of decent telephone networks means that they’re going straight into a mobile world. Mobile telephones are amazingly ubiquitous in Africa, even among people who are not particularly well off, often in a form of shared ownership. Just look at the effect that that’s had on Kenyan farmers finding markets for their produce. They call ahead and find where the best prices are and send their produce there.

reason: I was at a Cato Institute function where the British development economist Peter Bauer was giving a lecture, and I had a really smart-ass question: Isn’t the problem with a lot of poor countries, Africa in particular, that there’s corruption and we have to get rid of corruption? And he leaned back on the podium and smiled and shook his head, no. And he said when the United States and Britain were developing in the 19th century, their governments were as corrupt as anything you’d find in Africa, but the governments in Britain and the United States had control of 1 percent or 2 percent of the economy when those countries were growing. In many African countries, the government controls over 60 percent of the economy. That’s the difference.

Ridley: Very nice point. I find myself completely surrounded by pessimists, people who think that Africa is never going to get rich, that it’s deteriorating rather than improving, that living standards are about to get worse. And they’re not convinced they have been getting better in the last few years because things like congestion at airports have gotten worse. There’s a tremendous tendency to take improvements for granted and to notice deteriorations.

There are a lot of people who think, “Ah, we are in a uniquely dangerous situation in my generation. Back in my parents’ generation, they looked forward to the future with confidence and happiness.” That ain’t true either. If you go back and look at every generation, it was dominated by pessimists. There is this wonderful quote from Lord Macaulay in 1830, who says, why is it that with nothing but improvement behind us we anticipate nothing but disaster before us?

What the precautionary principle [the idea that when science has not yet determined whether a new product or process is safe, the government should prohibit or restrict its use] misses is the danger that in not progressing you might miss out on future improvements in living standards for poor people in Africa. I’m desperately hoping to persuade the world, not that everything’s going to be fine, but that there’s a chance everything’s going to be better for everybody and that we should be very careful not to cut ourselves off from that chance.

reason: How would you describe your politics?

Ridley: I’m a good old-fashioned 19th-century liberal. I love progress, and I love change. What makes what I’ve just said seem right-wing, particularly in Europe, is that it seems to be more concerned with wealth creation than social justice, i.e., with baking another cake rather than cutting up the existing cake. Actually, to some extent, I am an egalitarian. I think that there are ways in which you have to keep equal opportunities in life in order to generate the incentives for people to generate wealth. But I think I’m that classically underrepresented voter, the person who believes in economic freedom and social freedom, too.

I lived in America for three years, which is not a long time, but it was a very influential time for me. I arrived there a pretty standard statist in my views of the world and left a—not a completely convinced libertarian but a person who had suddenly started thinking about politics from the individual’s point of view much more than I had before. Meeting Julian Simon and Aaron Wildavsky and people from the Property and Environment Research Center and George Mason University had an influence on me. I encountered a view that’s hard to come across in Europe.

The fall of the Berlin Wall was also a very important moment in my life. It told me that all those people who said the Soviet Union was actually a much better place than it was made out to be (and I’d come across tons of them in my life) were plain wrong, not just a little bit wrong.

I recall one conversation from around 1985. A singer who is now a famous labor activist and a highly respected elder statesman, Billy Bragg—I happened to sit next to him on an airplane. He had just come back from playing East Berlin. He was perfectly friendly, but he spent most of that plane ride trying to persuade me that East Germans were much happier than West Germans and it was complete bollocks, this propaganda from the West that they were unhappy. And he’s hugely respected still as a Labour Party grandee.

reason: What would you say to people who say that progress is simply unsustainable, that the Africans and the Indians and the Chinese will never be able to live at the same living standards as we do?

Ridley: I’d respond to that by saying that in a sense they’re absolutely right. If we go on as we are, it’ll be very difficult to sustain things. But we won’t go on as we are. That’s what we never do. We always change what we do and we always get much more efficient at using things—energy, resources, etc.

Just take land area for feeding the world. If we’d gone on as we were, as hunter-gatherers, we’d have needed about 85 Earths to feed 6 billion people. If we’d gone on as early slash-and-burn farmers, we’d have needed a whole Earth, including all the oceans. If we’d gone on as 1950 organic farmers without a lot of fertilizer, we’d have needed 82 percent of the world’s land area for cultivation, as opposed to the 38 percent that we farm at the moment.

Sure, if every office in China uses as much paper as every office does in America now and there’re just as many of them, then we’re going to run out of trees to chop down to make the paper. Well, I’m willing to bet that we’ll have found ways of recycling paper or making paper from less material or not using so much paper. It might take paper getting expensive before that happens.

Ronald Bailey is reason’s science correspondent.

http://www.reason.com/news/show/130848.html

Title: Re: Evolutionary biology/psychology
Post by: Crafty_Dog on January 07, 2009, 07:01:26 PM
I've read Ridley's book "The Red Queen" and recommend it highly; I have another of his books in my ever-growing "to read" pile.  Thanks for the nice find.
Title: Irrational Intelligence
Post by: Body-by-Guinness on January 30, 2009, 08:42:19 AM
Hmm, think I've seen some of that manifested around here.

From the issue dated January 30, 2009
NOTA BENE
Irrational Intelligence; Get Smarter



By KACIE GLENN

Ever bought a 12-foot Christmas tree for a 10-foot-high apartment? Picked up a hitchhiker in a nasty part of town? Or, perhaps, taken out a mortgage you couldn't afford? The good news is that poor decision-making skills may have little effect on your IQ score, according to Keith E. Stanovich, author of What Intelligence Tests Miss: The Psychology of Rational Thought (Yale University Press). The bad news? He thinks you'd lose a few points on a more-accurate gauge of intelligence.

Stanovich, an adjunct professor of human development and applied psychology at the University of Toronto, believes that the concept of intelligence, as measured by IQ tests, fails to capture key aspects of mental ability. But that doesn't mean he discounts the tests' credibility: "Readers might well expect me to say that IQ tests do not measure anything important, or that there are many kinds of intelligence, or that all people are intelligent in their own way," he writes. After all, theories about emotional and social intelligence — which weigh interpersonal skills, the ability to empathize, and other "supracognitive" characteristics — have gained popularity in recent years, in part by de-emphasizing the importance of IQ.

Instead, Stanovich suggests that IQ tests focus on valuable qualities and capacities that are highly relevant to our daily lives. But he believes the tests would be far more effective if they took into account not only mental "brightness" but also rationality — including such abilities as "judicious decision making, efficient behavioral regulation, sensible goal prioritization ... [and] the proper calibration of evidence."

Our understanding of intelligence, he writes, has been muddled by the discrepancy between the vague, comprehensive vernacular term, which encompasses all the functions and manifestations of "smarts," and the narrower theories that "confine the concept of intelligence to the set of mental abilities actually tested on extant IQ tests." The latter conceptualization allows intelligence to coexist with foolishness because IQ tests do not measure the rationality required to abstain from dumb decisions, according to the author. Casual observers, however, usually define intelligence broadly and are confused by inconsistencies: "Blatantly irrational acts committed by people of obvious intelligence ... shock and surprise us and call out for explanation."

The author notes that because most people — even educators and psychologists — accept test-defined intelligence as a fair assessment of mental faculties, we tend to dismiss inconsistencies between a person's IQ scores and rationality as indicators of a disorder or learning disability. So persistent is that faulty logic that "we are almost obligated to create a new disability category when an important skill domain is found to be somewhat dissociated from intelligence." As long as we continue to worship IQ tests that do not assess rational thought processes, we will continue to misjudge our own and others' cognitive abilities, warns the scholar.

In an earlier work, Stanovich coined his own term — dysrationalia — for "the inability to think and behave rationally despite adequate intelligence." That "disorder," he suggests, might afflict some of the smartest people you know.

***

In an age of Baby Einstein DVD's and French lessons for 5-year-olds, it may seem passé to suggest that a child's IQ is determined primarily by genetics. But until recently, writes Richard E. Nisbett in Intelligence and How to Get It: Why Schools and Cultures Count (Norton), most scientists who studied intelligence believed "that the overwhelming importance of heritability meant that the environment could do little and that social programs intended to improve intelligence were doomed to failure." Nisbett argues that a variety of social, cultural, and economic factors can significantly affect a child's IQ, and suggests ways to improve intelligence scores, as well as grades, by manipulating those factors.

Often-cited studies have shown that the difference in IQ between identical twins raised apart is only slightly less than the difference between twins raised together, whereas the correlation between the intelligence scores of a parent who adopts a child and that child is slim. Yet, Nisbett reminds us, even separated twins are likelier to grow up under similar economic and social conditions than two people chosen at random, and they might even be treated similarly because of shared looks and other characteristics in common. At the same time, most adoptive families are well-off and nurturing. The consistency of those environmental factors makes their impact on a child's intelligence seem smaller than it really is.

Opinions have changed over the last few years, and many scientists would now agree, "If you were to average the contribution of genetics to IQ over different social classes, you would probably find 50 percent to be the maximum contribution of genetics," says Nisbett, a professor of psychology at the University of Michigan at Ann Arbor. Class is a crucial determinant of intelligence; adoption studies, for example, have indicated that "raising someone in an upper-middle-class environment versus a lower-class environment is worth 12 to 18 points of IQ — a truly massive effect," he says. Children of middle-class parents are read to, spoken to, and encouraged more than children of working-class parents, all experiences that influence intellectual development.

Intelligence and How to Get It also examines how better schooling boosts IQ scores and how school systems can improve. Nisbett cautions that more money does not always equate to higher-quality education, and that parents who take advantage of vouchers to move their children to better schools are a self-selecting group of people who are motivated to help their children excel academically, which leads some researchers to overestimate the vouchers' effectiveness. On the other hand, he finds that class size and teachers' experience and skills can make a big difference, especially for poor and minority children. He notes, too, that children who are exposed to "instructional technologies" in the classroom benefit intellectually; working with word-processing programs, for example, can help students learn to read faster, which leads to further advantages.

The psychologist maintains that there are myriad ways to enhance a child's intelligence by changing his or her learning environment. Young kids who emulate their parents' self-control go on to achieve better grades and higher SAT scores than those who don't. They also learn better, and therefore are more successful in school and have a higher IQ, when they are praised for working hard but not offered incentives to do activities they already show interest in: The danger is turning play and learning into work. It couldn't hurt to angle for access to the best schools and most-experienced teachers, either, Nisbett suggests.

"Intellectual capital" — which more fully captures academic potential than IQ, he says — "is the result of stimulation and support for exploration and achievement in the home, the neighborhood, and the schools." Nurturing young people's minds might not override their DNA, the author admits, but it does help them achieve their intellectual potential.

http://chronicle.com
Section: The Chronicle Review
Volume 55, Issue 21, Page B18

http://chronicle.com/temp/reprint.php?id=6pfm8ytzbg1p8n5p2vl4rrcmwvckp31x
Title: Re: Evolutionary biology/psychology
Post by: Crafty_Dog on February 10, 2009, 04:04:03 PM
James Q. Wilson
The DNA of Politics
Genes shape our beliefs, our values, and even our votes.


Studies of identical twins, like Polish president Lech Kaczyński, right, and former prime minister Jaroslaw, show that 40 percent of our political views have a genetic component.

Children differ, as any parent of two or more knows. Some babies sleep through the night, others are always awake; some are calm, others are fussy; some walk at an early age, others after a long wait. Scientists have proved that genes are responsible for these early differences. But people assume that as children get older and spend more time under their parents’ influence, the effect of genes declines. They are wrong.

For a century or more, we have understood that intelligence is largely inherited, though even today some mistakenly rail against the idea and say that nurture, not nature, is all. Now we know that much of our personality, too, is inherited and that many social attitudes have some degree of genetic basis, including our involvement in crime and some psychiatric illnesses. Some things do result entirely from environmental influences, such as whether you follow the Red Sox or the Yankees (though I suspect that Yankee fans have a genetic defect). But beyond routine tastes, almost everything has some genetic basis. And that includes politics.

When scholars say that a trait is “inherited,” they don’t mean that they can tell what role nature and nurture have played in any given individual. Rather, they mean that in a population—say, a group of adults or children—genes explain a lot of the differences among individuals.

There are two common ways of reaching this conclusion. One is to compare adopted children’s traits with those of their biological parents, on the one hand, and with those of their adoptive parents, on the other. If a closer correlation exists with the biological parents’ traits, then we say that the trait is to that degree inherited.

The other method is to compare identical twins’ similarity, with respect to some trait, with the similarity of fraternal twins, or even of two ordinary siblings. Identical twins are genetic duplicates, while fraternal twins share only about half their genes and are no more genetically alike than ordinary siblings are. If identical twins are more alike than fraternal twins, therefore, we conclude that the trait under consideration is to some degree inherited.

Three political science professors—John Alford, Carolyn Funk, and John Hibbing—have studied political attitudes among a large number of twins in America and Australia. They measured the attitudes with something called the Wilson-Patterson Scale (I am not the Wilson after whom it was named), which asks whether a respondent agrees or disagrees with 28 words or phrases, such as “death penalty,” “school prayer,” “pacifism,” or “gay rights.” They then compared the similarity of the responses among identical twins with the similarity among fraternal twins. They found that, for all 28 taken together, the identical twins did indeed agree with each other more often than the fraternal ones did—and that genes accounted for about 40 percent of the difference between the two groups. On the other hand, the answers these people gave to the words “Democrat” or “Republican” had a very weak genetic basis. In politics, genes help us understand fundamental attitudes—that is, whether we are liberal or conservative—but do not explain what party we choose to join.
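The arithmetic behind a heritability estimate like this can be sketched with the classical Falconer formula, which doubles the gap between identical-twin and fraternal-twin correlations. This is a simplified illustration, not necessarily the study's actual model, and the correlation values below are hypothetical, chosen only so the result lands near the 40 percent figure:

```python
# Falconer's classical estimate of heritability from twin data.
# Identical twins share ~100% of their genes, fraternal twins ~50%,
# so doubling the difference in their trait correlations gives a
# rough estimate of the genetic share of the variance:
#   h^2 = 2 * (r_identical - r_fraternal)

def falconer_heritability(r_identical: float, r_fraternal: float) -> float:
    """Rough share of trait variance attributable to genes."""
    return 2.0 * (r_identical - r_fraternal)

# Hypothetical correlations, picked for illustration:
h2 = falconer_heritability(0.60, 0.40)
print(round(h2, 2))  # 0.4, i.e. genes explain about 40% of the variation
```

Note that this is also why the assortative-mating point made later in the article matters: if fraternal twins are more genetically alike than the formula assumes, the gap shrinks and heritability is underestimated.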

Genes also influence how frequently we vote. Voting has always puzzled scholars: How is it rational to wait in line on a cold November afternoon when there is almost no chance that your ballot will make any difference? Apparently, people who vote often feel a strong sense of civic duty or like to express themselves. But who are these people? James Fowler, Laura Baker, and Christopher Dawes studied political participation in Los Angeles by comparing voting among identical and fraternal twins. Their conclusion: among registered voters, genetic factors explain about 60 percent of the difference between those who vote and those who do not.

A few scholars, determined to hang on to the belief that environment explains everything, argue that such similarities occur because the parents of identical twins—as opposed to the parents of fraternal twins—encourage them to be as alike as possible as they grow up. This is doubtful. First, we know that many parents make bad guesses about their children’s genetic connection—thinking that fraternal twins are actually identical ones, or vice versa. When we take twins’ accurate genetic relationships into account, we find that identical twins whom parents wrongly thought to be fraternal are very similar, while fraternal twins wrongly thought to be identical are no more alike than ordinary siblings.

Moreover, studying identical twins reared apart by different families, even in different countries, effectively shows that their similar traits cannot be the result of similar upbringing. The University of Minnesota’s Thomas Bouchard has done research on many identical twins reared apart (some in different countries) and has found that though they never knew each other or their parents, they proved remarkably alike, especially in personality—whether they were extroverted, agreeable, neurotic, or conscientious, for example.

Some critics complain that the fact that identical twins live together with their birth parents, at least for a time, ruins Bouchard’s findings: during this early period, they say, parenting must influence the children’s attitudes. But the average age at which the identical twins in Bouchard’s study became separated from their parents was five months. It is hard to imagine parents teaching five-month-old babies much about politics or religion.

The gene-driven ideological split that Alford and his colleagues found may, in fact, be an underestimate, because men and women tend to marry people with whom they agree on big issues—assortative mating, as social scientists call it. Assortative mating means that the children of parents who agree on issues will be more likely to share whatever genes influence those beliefs. Thus, even children who are not identical twins will have a larger genetic basis for their views than if their parents married someone with whom they disagreed. Since we measure heritability by subtracting the similarity among fraternal twins from the similarity among identical ones, this difference may neglect genetic influences that already exist on fraternal twins. And if it does, it means that we are underestimating genetic influences on attitudes.

When we step back and look at American politics generally, genes may help us understand why, for countless decades, about 40 percent of all voters have supported conservative causes, about 40 percent have backed liberal ones, and the 20 percent in the middle have decided the elections. On a few occasions, the winning presidential candidate has won about 60 percent of the vote. But these days we call a 55 percent victory a “landslide.” It is hard to imagine a purely environmental force that would rule out a presidential election in which one candidate got 80 percent of the vote and his rival only 20 percent. Something deeper must be going on.

All of this leaves open the question: Which genes help create which political attitudes? Right now, we don’t know. To discover the links will require lengthy studies of the DNA of people with different political views. Scientists are having a hard time locating the specific genes that cause diseases; it will probably be much harder to find the complex array of genes that affects politics.

There are problems with the observed link between genes and politics. One is that it is fairly crude so far. Liberals and conservatives come in many varieties: one can be an economic liberal and a social conservative, say, favoring a large state but opposing abortion; or an economic conservative and a social liberal, favoring the free market but supporting abortion and gay rights. If we add attitudes about foreign policy to the mix, the combinations double. Most tests used in genetic studies of political views do not allow us to make these important distinctions. As a result, though we know that genes affect ideology, that knowledge is clumsy. In time, I suspect, we will learn more about these subtleties.

Further, it’s important to emphasize that biology is not destiny. Genetic influences rarely operate independently of environmental factors. Take the case of serotonin. People who have little of this neurotransmitter are at risk for some psychological problems, but for many of them, no such problems occur unless they experience some personal crisis. Then the combined effect of genetic influences and disruptive experiences will trigger a deep state of depression, something that does not happen to people who either do not lack serotonin or who do lack it but encounter no crisis. Recently, in the first study to find the exact genes that affect political participation, Fowler and Dawes found two genes that help explain voting behavior. One of the genes, influencing serotonin levels, boosts turnout by 10 percent—if the person also attends church frequently. Nature and nurture interact.

The same is probably true of political ideology. When campus protests and attacks on university administrators began in the late 1960s, it was not because a biological upheaval had increased the number of radicals; it was because such people encountered events (the war in Vietnam, the struggle over civil rights) and group pressures that induced them to take strong actions. By the same token, lynchings in the South did not become common because there were suddenly more ultra-racists around. Rather, mob scenes, media frenzies, and the shock of criminal events motivated people already skeptical of civil rights to do terrible things.

Another challenge is politicized assessment of the genetic evidence. Ever since 1950, when Theodor Adorno and his colleagues published The Authoritarian Personality, scholars have studied right-wing authoritarianism but neglected its counterpart on the left. In his study of identical twins reared apart, Bouchard concludes that right-wing authoritarianism is, to a large degree, inherited—but he says nothing about the Left. This omission is puzzling, since as Bouchard was studying twins at the University of Minnesota, he was regularly attacked by left-wing students outraged by the idea that any traits might be inherited. A few students even threatened to kill him. When I pointed this out to him, he suggested, in good humor, that I was a troublemaker.

Yet if you ask who in this country has prevented people from speaking on college campuses, it is overwhelmingly leftists. If you ask who storms the streets and shatters the windows of Starbucks coffee shops to protest the World Trade Organization, it is overwhelmingly leftists. If you ask who produces campus codes that infringe on free speech, it is overwhelmingly leftists. If you ask who invaded the classroom of my late colleague Richard Herrnstein and tried to prevent him from teaching, it was overwhelmingly leftists.

A better way to determine if authoritarianism is genetic would be to ask people what the country’s biggest problems are. Liberals might say the inequality of income or the danger of global warming; conservatives might indicate the tolerance of abortion or the abundance of pornography. You would then ask each group what they thought should be done to solve these problems. An authoritarian liberal might say that we should tax high incomes out of existence and close down factories that emit greenhouse gases. A conservative authoritarian might suggest that we put abortion doctors in jail and censor books and television programs. This approach would give us a true measure of authoritarianism, left and right, and we would know how many of each kind existed and something about their backgrounds. Then, if they had twins, we would be able to estimate the heritability of authoritarianism. Doing all this is a hard job, which may explain why no scholars have done it.

Genes shape, to varying degrees, almost every aspect of human behavior. The struggle by some activists to deny or downplay that fact is worrisome. The anti-gene claim is ultimately an ill-starred effort to preserve the myth that, since the environment can explain everything, political causes that attempt to alter the environment can bring about whatever their leaders desire.

The truth is that though biology is not destiny, neither is it an easily changed path to utopia.

James Q. Wilson, formerly a professor at Harvard and at UCLA, now lectures at Pepperdine University. In 2003, he was awarded the Presidential Medal of Freedom.
Title: WSJ: Last Minute Changes
Post by: Crafty_Dog on February 13, 2009, 11:36:32 AM
By CHRISTOPHER F. CHABRIS
The debate over the validity of evolutionary theory may be real enough when it comes to religious belief and cultural outlook. But it has nothing to do with science. No evidence seriously contradicts the idea that the plant and animal species found on Earth today are descended from common ancestors that existed long ago. Indeed, the evidence for natural selection is infinitely stronger than it was when Charles Darwin proposed it 150 years ago, mainly because later discoveries in the field of genetics supplied the biological mechanisms to explain the patterns that Darwin and his contemporaries were observing.

But scientists do disagree over the pace and time-span of human evolution. Gregory Cochran and Henry Harpending begin "The 10,000 Year Explosion" with a remark from the paleontologist Stephen Jay Gould, who said that "there's been no biological change in humans for 40,000 or 50,000 years." They also cite the evolutionist Ernst Mayr, who agrees that "man's evolution towards manness suddenly came to a halt" in the same epoch. Such claims capture the consensus in anthropology, too, which dates the emergence of "behaviorally modern humans" -- beings who acted much more like us than like their predecessors -- to about 45,000 years ago.

But is the timeline right? Did human evolution really stop? If not, our sense of who we are -- and how we got this way -- may be radically altered. Messrs. Cochran and Harpending, both scientists themselves, dismiss the standard view. Far from ending, they say, evolution has accelerated since humans left Africa 40,000 years ago and headed for Europe and Asia.

Evolution proceeds by changing the frequency of genetic variants, known as "alleles." In the case of natural selection, alleles that enable their bearers to leave behind more offspring will become more common in the next generation. Messrs. Cochran and Harpending claim that the rate of change in the human genome has been increasing in recent millennia, to the point of turmoil. Literally hundreds or thousands of alleles, they say, are under selection, meaning that our social and physical environments are favoring them over other -- usually older -- alleles. These "new" variants are sweeping the globe and becoming more common.
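The arithmetic behind that claim is easy to sketch. Below is a minimal, hypothetical simulation -- not from the book; the 5 percent fitness edge and the generation count are invented purely for illustration -- of how a rare advantageous allele becomes common under selection:

```python
# One-locus selection sketch: carriers of the advantageous allele leave
# 5% more offspring per generation than carriers of the older allele.
# The next generation's frequency is the allele's share of total offspring.

def next_freq(p, w_adv=1.05, w_old=1.0):
    """Return the advantageous allele's frequency after one generation."""
    mean_fitness = p * w_adv + (1 - p) * w_old
    return p * w_adv / mean_fitness

p = 0.01  # a new variant starts out rare
for generation in range(200):
    p = next_freq(p)

print(round(p, 2))  # → 0.99
```

Even a modest reproductive edge, compounded over a couple of hundred generations, carries a variant from 1 percent of the population to near fixation -- which is why a few thousand years of farming is, on this view, plenty of time for selection to act.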

 The 10,000 Year Explosion
By Gregory Cochran and Henry Harpending
(Basic, 288 pages, $27)
But genomes don't just speed up their evolution willy-nilly. So what happened, the authors ask, to keep human evolution going in the "recent" past? Two crucial events, they contend, had to do with food production. As humans learned the techniques of agriculture, they abandoned their diffuse hunter-gatherer ways and established cities and governments. The resulting population density made humans ripe for infectious diseases like smallpox and malaria. Alleles that helped protect against disease proved useful and won out.

The domestication of cattle for milk production also led to genetic change. Among people of northern European descent, lactose intolerance -- the inability to digest milk in adulthood -- is unusual today. But it was universal before a genetic mutation arose about 8,000 years ago that allowed lactase production, and thus milk digestion, to continue beyond childhood. Since you can get milk over and over from a cow, but can get meat from it only once, you can harvest a lot more calories over time for the same effort if you are lactose tolerant. Humans who had this attribute would have displaced those who didn't, all else being equal. (If your opponent has guns and you don't, drinking milk won't save you.)

To make their case for evolution having continued longer than is usually claimed, Messrs. Cochran and Harpending remind us that dramatic changes in human culture appeared about 40,000 years ago, resulting in painting, sculpture, and better tools and weapons. A sudden change in the human genome, they suggest, made for more creative, inventive brains. But how could such a change come about? The authors propose that the humans of 40,000 years ago occasionally mated with Neanderthals living in Europe, before the Neanderthals became extinct. The result was an "introgression" of Neanderthal alleles into the human lineage. Some of those alleles may have improved brain function enough to give their bearers an advantage in the struggle for survival, thus becoming common.

In their final chapter, Messrs. Cochran and Harpending venture into recorded history by observing two interesting facts about Ashkenazi Jews (those who lived in Europe after leaving the Middle East): They are disproportionately found among intellectual high-achievers -- Nobel Prize winners, world chess champions, people who score well on IQ tests -- and they are victims of rare genetic diseases, like Gaucher's and Tay-Sachs. The authors hypothesize that these two facts are connected by natural selection.

Just as sickle-cell anemia results from having two copies of an allele that protects you against malaria if you have just one, perhaps each Ashkenazi disease occurs when you have two copies of an allele that brings about something useful when you have just one. That useful thing, according to Messrs. Cochran and Harpending, is higher cognitive ability. They argue that the rare diseases are unfortunate side-effects of natural selection for intelligence, which Messrs. Cochran and Harpending think happened during the Middle Ages in Europe, when Jews rarely intermarried with other Europeans.

"The 10,000 Year Explosion" is important and fascinating but not without flaw. Messrs. Cochran and Harpending do not stop often enough to acknowledge and rebut the critics of their ideas. And though the authors cite historical sources and scientific articles in support of their thesis, they too often write in a speculative voice, qualifying claims with "possible," "likely," "might" and "probably." This voice is inevitable in any discussion of events tens of thousands of years ago. But it leads to another problem: The authors don't say enough about the developments in genetic science that allow them to make inferences about humanity's distant past. Readers will wonder, for instance, exactly how it is possible to recognize ancient Neanderthal DNA in our modern genomes. Despite all this, the provocative ideas in "The 10,000 Year Explosion" must be taken seriously by anyone who wants to understand human origins and humanity's future.

Mr. Chabris is a psychology professor at Union College in Schenectady, N.Y.

Title: Brain Focus and Neural Inhibitions
Post by: Body-by-Guinness on February 25, 2009, 10:31:06 AM
My ex-wife, who could belabor a point to a mind-numbing degree, used to get quite annoyed when I'd tune out her long-winded meanderings. From my end the process was an unbidden one: some mechanism conducted a signal-to-noise evaluation and filtered out the noise, at which point my brain would cast about for something germane to focus on, at least until interrupted by an "are you listening to me!?" Be that as it may, those memories caused me to mull this piece:

Brain mechanism recruited to reduce noise during challenging tasks

New research reveals a sophisticated brain mechanism that is critical for filtering out irrelevant signals during demanding cognitive tasks. The study, published by Cell Press in the February 26 issue of the journal Neuron, also provides some insight into how disruption of key inhibitory pathways may contribute to schizophrenia.

"The ability to keep track of information and one's actions from moment to moment is necessary to accomplish even the simple tasks of everyday life," explains senior study author Dr. Helen Barbas of Boston University and its School of Medicine. "Equally important is the ability to focus on relevant information and ignore noise."

Dr. Barbas and colleague, Dr. Maria Medalla, were interested in examining the synaptic mechanisms for selection and suppression of signals involved in working memory. They focused on the fine synaptic interactions of pathways with excitatory and inhibitory neurons in brain areas involved in attention.

"The primate dorsolateral prefrontal cortex (DLPFC) and anterior cingulate cortex (ACC) are brain regions that focus attention on relevant signals and suppress noise in cognitive tasks. However, their synaptic communication and unique roles in cognitive control are largely unknown," explains Dr. Barbas.

The researchers found that a pathway linking two related prefrontal areas within DLPFC and a pathway from the functionally distinct ACC to DLPFC similarly innervated excitatory neurons associated with paying attention to relevant stimuli. Interestingly, large nerve fiber endings from ACC selectively contacted inhibitory neurons that help suppress "noisy" excitatory neurons nearby.

These observations suggest that ACC has a greater impact in reducing noise in dorsolateral areas during challenging cognitive tasks involving conflict, error, or reversing decisions. These mechanisms are often disrupted in schizophrenia, and previous functional imaging studies by others have shown that schizophrenia is associated with reduced activity in ACC.

The authors conclude that ACC pathways may help reduce noise by stimulating inhibitory neurons in DLPFC. "The present data provide a circuit mechanism to suggest that pathology in the output neurons of ACC in schizophrenia might reduce excitatory drive to inhibitory neurons of dorsolateral prefrontal cortices, perturbing the delicate balance of excitation and inhibition," offers Dr. Barbas.

http://www.eurekalert.org/pub_releases/2009-02/cp-bmr022309.php
Title: Sexual Insanity
Post by: Crafty_Dog on February 28, 2009, 09:39:16 PM
Bill Muehlenberg | Friday, 27 February 2009
Sexual insanity
A 13-year-old father? A woman with 14 IVF children? We can’t say we were not warned.
Week by week the stories become more sensational. Blogs were still buzzing over California’s “octomom”, Nadya Suleman, when the story of Alfie Patten, a baby-faced British 13-year-old and putative father, grabbed the international headlines. In Australia, where I live, an appeal court has awarded a lesbian duo hundreds of thousands of dollars in compensation for getting two babies from IVF treatment rather than one.

Strangely enough, such dramatic consequences of the erosion of marriage and the explosion of out-of-control sexuality were foreseen -- in some instances long ago. In 1968 Will and Ariel Durant's important book, The Lessons of History, appeared. In it they wrote: "The sex drive in the young is a river of fire that must be banked and cooled by a hundred restraints if it is not to consume in chaos both the individual and the group."

Although the sexual revolution took off in the mid-60s, other social commentators had made similar warnings earlier on. In 1956 Harvard sociologist Pitirim Sorokin put it this way:

This sex revolution is as important as the most dramatic political or economic upheaval. It is changing the lives of men and women more radically than any other revolution of our time… Any considerable change in marriage behaviour, any increase in sexual promiscuity and sexual relations, is pregnant with momentous consequences. A sex revolution drastically affects the lives of millions, deeply disturbs the community, and decisively influences the future of society.

And back in 1927, J.D. Unwin of Cambridge University made similar remarks:

The whole of human history does not contain a single instance of a group becoming civilised unless it has been completely monogamous, nor is there any example of a group retaining its culture after it has adopted less rigorous customs. Marriage as a life-long association has been an attendant circumstance of all human achievement, and its adoption has preceded all manifestations of social energy… Indissoluble monogamy must be regarded as the mainspring of all social activity, a necessary condition of human development.

But these warnings have fallen on deaf ears, and our sexual decline is now gathering speed. Let’s look again at the stories I mentioned at the beginning of this article -- reported in the media within days of each other. Any one of them reveals a culture in crisis, but taken together they show a West on a slide to sexual suicide.

The case of Nadya Suleman, America’s single mom extraordinaire, is so well publicised we need only briefly recap here. Nadya had “a dream…to have a large family, huge family”, so she went right ahead and got herself six children with the aid of a sperm donor and IVF. But that was not enough; she went back again to the clinic and, wonder of wonders, produced octuplets. The 33-year-old California woman is unrepentant. “This is my choice to be a single parent,” she said.

It’s hard to know who has been more reckless and irresponsible, the woman or her IVF doctor. He recently implanted a 49-year-old woman with seven embryos, who is now pregnant with quadruplets. One can understand there are those in the IVF industry simply happy to make money, regardless of the consequences. Now that the consequence in this case is a single mother with 14 children, they will no doubt try to wash their hands of the whole affair and let society pick up the tab for supporting them.

Equally famous is the case of Alfie Patten, the 13-year-old English father who was just twelve when he conceived the child. He and his 15-year-old girlfriend are now parents, but seemingly clueless as to what all this entails. And now it turns out that there is a question as to who the real father is. Evidently, two other young boys (14 and 16) are now claiming to be the father. Speculation is rife about lucrative publicity deals to be made. Meanwhile, a child has been born into social and sexual chaos.

There is something sadly predictable about Alfie’s case, but my first Australian example of sexual insanity is truly startling. It concerns two lesbians who successfully sued a Canberra IVF doctor for creating two babies instead of one. The case actually has three stages, one sane and two outrageous. In 2007 the lesbian pair outrageously sued the doctor, claiming they only wanted one child, and that two would damage their livelihood (even though their combined income is more than $100,000).

In July 2008 the ACT Supreme Court, sanely, rejected their claim, but an appeals court recently, and again outrageously, reversed the decision, ordering the doctor to pay the lesbians $317,000 in compensation. The women said having two children damaged their relationship. (Mind you, in the light of the Nadya Suleman story it is difficult to feel sorry for IVF doctors.)


Our last story involves the growing trend of rental agreements in Australian cities involving sex instead of rent money. It seems that some men are taking advantage of the rental crisis by placing online ads which offer women free rooms in exchange for sex. One ad, for a Melbourne townhouse, offered "free rent for someone special: instead of rent, I am looking for someone to help me with certain needs/requirements on a regular basis''.

The Sunday Telegraph explains: “The zero-rent ads, targeting desperate women looking for somewhere to live, are becoming increasingly common on popular ‘share house’ rental websites. Although there have been numerous complaints about the ads, which some website users have dubbed ‘offensive’, they do not breach policy guidelines for sites such as flatmates.com.au.”


“Desperate women”? Let’s not be too ready to excuse those who accept what amounts to an invitation to prostitution, thereby putting themselves in danger and contributing to the environment of sexual insanity. Like the previous examples, the blatant sexual pitch in these flatmate ads is a sign of a society which is fast losing all bearings concerning things sexual or things moral.

Our wiser, saner and more moral forebears provided plenty of warning about these things, but we have chosen to ignore such warnings and now each passing day seems to bring out another horror story of sexual insanity.

As G.K. Chesterton wrote a century ago: "A society that claims to be civilized and yet allows the sex instinct free-play is inoculating itself with a virus of corruption which sooner or later will destroy it. It is only a question of time." He is worth quoting at length:

What had happened to the human imagination, as a whole, was that the whole world was coloured by dangerous and rapidly deteriorating passions; by natural passions becoming unnatural passions. Thus the effect of treating sex as only one innocent natural thing was that every other innocent natural thing became soaked and sodden with sex. For sex cannot be admitted to a mere equality among elementary emotions or experiences like eating and sleeping. The moment sex ceases to be a servant it becomes a tyrant. There is something dangerous and disproportionate in its place in human nature, for whatever reason; and it does really need a special purification and dedication. The modern talk about sex being free like any other sense, about the body being beautiful like any tree or flower, is either a description of the Garden of Eden or a piece of thoroughly bad psychology, of which the world grew weary two thousand years ago.

We are today witnessing the bitter fruit of allowing sex to become a tyrant. Each day new headlines testify to the fact that when we abuse the wonderful gift of sex, we abuse ourselves and our neighbours. The question is, how much more abuse can we take as a culture before society can no longer function? One suspects that we should find this out quite soon.

Bill Muehlenberg is a lecturer in ethics and philosophy at several Melbourne theological colleges and a PhD candidate at Deakin University.
Title: New Adult Stem Cell Technique
Post by: Body-by-Guinness on March 02, 2009, 12:42:02 PM
Wonderful Stem Cell News

Ronald Bailey | March 2, 2009, 10:15am

Canadian and British stem cell researchers are reporting an exciting new method for producing stem cells from adult cells without using viruses. In 2006, researchers in Japan and Wisconsin discovered how to use viruses to ferry four genes that turn adult cells into stem cells that act very much like embryonic stem cells. Like stem cells derived from embryos, the induced pluripotent stem (iPS) cells can differentiate into various cell types that could be used as transplants to replace diseased or damaged tissues. In addition, since the stem cells are produced using adult cells taken from individual patients, they would be genetic matches for each patient. This would mean that transplants of such cells would not risk being rejected by a patient's immune system.

However, researchers worried that using viruses to produce iPS cells might result in cancer. The new technique uses the piggyBac transposon, originally isolated from the cabbage looper moth, to incorporate into skin cells the suite of four genes necessary to transform them into stem cells. (A transposon is a mobile DNA sequence that can move from one site in a chromosome to another, or between different chromosomes.) Once the genes are installed, the transposon can be completely eliminated from the cells. If iPS cells work out, another tremendous advantage to them is that they can be produced without using scarce human eggs.

In addition, opponents of human embryonic stem cell research argue that the new iPS cells are not morally problematic (from their point of view) because they are not derived from human embryos. On the other hand, it might be that iPS cells produced from skin cells could become embryos capable of developing into babies if implanted in a womb. The possibility that a soul can enter a specific cell evidently may depend on whether or not a single genetic switch is on or off.

In any case, the new research is a very promising avenue to the development of regenerative medicine.

http://www.reason.com/blog/printer/131989.html
Title: Future Planning in the Animal Kingdom
Post by: Body-by-Guinness on March 10, 2009, 06:31:08 AM
Last line of this piece bothers me quite a bit, but otherwise it contains many interesting tidbits.

Arsenal Confirms Chimp's Ability to Plan, Study Says
Animal at Swedish Zoo Collects Stones to Hurl at Visitors
By David Brown
Washington Post Staff Writer
Tuesday, March 10, 2009; A06

Santino evidently knows he's going to get upset, so he plans ahead.

The 30-year-old chimpanzee, who has lived in a Swedish zoo most of his life, sometimes gets agitated when zoo visitors begin to gather on the other side of the moat that surrounds his enclosure, where he is the dominant -- and only -- male in a group that includes half a dozen females.

He shows his displeasure by flinging stones or bits of concrete at the human intruders, but finding a suitable weapon on the spur of the moment perhaps isn't so easy. To prepare, Santino often begins his day by roaming the enclosure, finding stones and stacking them in handy piles.

On some days, he's barraged visitors with up to 20 projectiles thrown in rapid succession, always underhand. Several times he has hit spectators standing 30 feet away across the water-filled moat.

The behavior, witnessed dozens of times, has made Santino something of a local celebrity.

It also made him the subject of a scientific paper, published yesterday, documenting one of the more elaborate examples of contingency planning in the animal world.

"Many animals plan. But this is planning for a future psychological state. That is what is so advanced," said Mathias Osvath, director of the primate research station at Lund University and author of the paper in the journal Current Biology.

The animal's preparations include not only stockpiling the stones he finds but also, more recently, fashioning projectiles from pieces of concrete he has broken off artificial rocks in his habitat.

Others have observed great apes planning, both in the wild and in captivity. Some birds in the corvid family, which includes jays and ravens, also plan for future contingencies. In general, though, planning by animals is thought to occur only when the payoff is immediate and more or less certain.

"People always assume that animals live in the present. This seems to indicate that they don't live entirely in the present," said Frans de Waal, a primatologist at Emory University in Atlanta, who was not involved in the research.

Santino was born in a zoo in Munich in 1978 but has lived all but five years of his life at the Furuvik Zoo, about 60 miles north of Stockholm.

He began throwing stones at age 16 when he became the sole -- and therefore dominant -- male in the group. None of the other chimpanzees, including a male that was in the group briefly, stored or threw stones.

The troop's habitat is an island surrounded by a moat. The stone-throwing is more frequent early in the season when the zoo reopens after the winter and Santino sees crowds of people across the water for the first time in months. Sometimes particular individuals seem to bother him, Osvath said.

On some days, zookeepers have found as many as five caches, containing three to eight stones each, along the shore facing the viewing area. Once, a hidden observer saw him gather stones five mornings in a row before the zoo opened.

Most of the stones are taken from the shallows at the edge of the moat. About a year after his storing and throwing began, however, Santino began tapping stones against the concrete artificial rocks, evidently listening for a hollow sound that indicates a fissure. He would then hit the concrete harder until a piece chipped off, occasionally then hitting it again to make it fist-size.

"I have seen him going around doing this. It is very impressive," Osvath said.

The throwing behavior is part of a normal display of dominance and territorial protection by male chimpanzees that occasionally involves throwing feces. Osvath doesn't think this animal is particularly smart or aggressive.

"I don't think he is unusual in any way. If anything, chimpanzees in the wild would plan more, I suspect," he said.

Osvath and others have tested chimpanzees' ability to plan. In one experiment, the animals were given a choice between eating grapes at the moment and getting and storing a rubber hose they could use sometime in the future to gain access to fruit soup, one of their favorite foods. Many chose the hose.

De Waal, who is also affiliated with the Yerkes National Primate Research Center in Atlanta, said he's observed a female chimp at a zoo in the Netherlands that in cold weather -- but not warm -- would bring an armful of straw from her enclosure when she went outside in order to have something to sit on.

Amy Fultz, a primatologist at Chimp Haven, a sanctuary in Louisiana for animals once used for entertainment or research, said she also has seen planning in some of the 132 chimpanzees living there.

As in the wild, some fashion tools from stalks of plants that they use to fish ants from anthills.

"One, named Karin, will gather up a particular species of verbena and save it in a place in her habitat. I have watched her go back and get them later in the day, or even later in the week," Fultz said.

One expert said planning by chimpanzees has been observed often enough in the wild that she questioned the novelty of Santino's behavior.

Sue Taylor Parker, a retired professor of biological anthropology at California's Sonoma State University who has compared the cognitive development of humans and primates, said wild chimpanzees sometimes carry rocks long distances to "anvil sites" for future use in cracking nuts. Cooperative hunting also implies a certain minimum of planning.

"Chimpanzee behavior that is at the edge of their highest abilities is always interesting to read about. I just question the uniqueness of this," she said. She added that the level of planning seen in Santino is roughly the same as that of 3-to-5-year-old children.

Unusual or not, Santino's rock-throwing may not be in evidence when spring comes to Sweden this year and he again sees visitors across the water.

In order to decrease his agitation, which was fueled in part by high testosterone levels characteristic of dominant males, the animal was castrated last fall.

http://www.washingtonpost.com/wp-dyn/content/article/2009/03/09/AR2009030901458.html?nav=hcmodule
Title: NYT: Can we increase our intelligence?
Post by: Crafty_Dog on March 11, 2009, 05:53:19 AM
Guest Column: Can We Increase Our Intelligence?

Many thanks to Steve Quake for four stimulating articles on some of the dilemmas facing scientists today. He now hands off to Sandra Aamodt and Sam Wang, two neuroscientists famous for their award-winning book, “Welcome to Your Brain: Why You Lose Your Car Keys But Never Forget How to Drive and Other Puzzles of Everyday Life.” Sandra and Sam will be writing their articles together; please welcome them.


By Sam Wang and Sandra Aamodt

It’s an honor to be invited to fill in for Olivia. We’ll be writing about slow and fast forces that shape the brain: natural selection, operating relatively slowly over many generations; and environmental influences, whose effects are visible across a few generations or even within one individual’s lifetime.

We’re often asked whether the human brain is still evolving. Taken at face value, it sounds like a silly question. People are animals, so selection pressure would presumably continue to apply across generations.

But the questioners are really concerned about a larger issue: how our brains are changing over time — and whether we have any control over these developments. This week we discuss intelligence and the “Flynn effect,” a phenomenon that is too rapid to be explained by natural selection.

It used to be believed that people had a level of general intelligence with which they were born that was unaffected by environment and stayed the same, more or less, throughout life. But now it’s known that environmental influences are large enough to have considerable effects on intelligence, perhaps even during your own lifetime.

A key contribution to this subject comes from James Flynn, a moral philosopher who has turned to social science and statistical analysis to explore his ideas about humane ideals. Flynn’s work usually pops up in the news in the context of race issues, especially public debates about the causes of racial differences in performance on intelligence tests. We won’t spend time on the topic of race, but the psychologist Dick Nisbett has written an excellent article on the subject.

Flynn first noted that standardized intelligence quotient (I.Q.) scores were rising by three points per decade in many countries, and even faster in some countries like the Netherlands and Israel. For instance, in verbal and performance I.Q., an average Dutch 14-year-old in 1982 scored 20 points higher than the average person of the same age in his parents’ generation in 1952. These I.Q. increases over a single generation suggest that the environmental conditions for developing brains have become more favorable in some way.

What might be changing? One strong candidate is working memory, defined as the ability to hold information in mind while manipulating it to achieve a cognitive goal. Examples include remembering a clause while figuring out how it relates to the rest of a sentence, or keeping track of the solutions you’ve already tried while solving a puzzle. Flynn has pointed out that modern times have increasingly rewarded complex and abstract reasoning. Differences in working memory capacity account for 50 to 70 percent of individual differences in fluid intelligence (abstract reasoning ability) in various meta-analyses, suggesting that it is one of the major building blocks of I.Q. (Ackerman et al; Kane et al; Süss et al.) This idea is intriguing because working memory can be improved by training.


A common way to measure working memory is called the “n-back” task. Presented with a sequential series of items, the person taking the test has to report when the current item is identical to the item that was presented a certain number (n) of items ago in the series. For example, the test taker might see a sequence of letters like

L K L R K H H N T T N X

presented one at a time. If the test is an easy 1-back task, she should press a button when she sees the second H and the second T. For a 3-back task, the right answers are K and N, since they are identical to items three places before them in the list. Most people find the 3-back condition to be challenging.
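The scoring rule is mechanical enough to express in a few lines of code. Here is a minimal sketch -- the function name is ours, and the letter sequence is simply the article's example, not from any published test battery:

```python
def n_back_targets(items, n):
    """Return the 0-indexed positions where the current item matches the
    item presented n places earlier -- the 'hits' a test-taker should report."""
    return [i for i in range(n, len(items)) if items[i] == items[i - n]]

sequence = list("LKLRKHHNTTNX")

# 1-back: the second H and the second T match their immediate predecessors
print([sequence[i] for i in n_back_targets(sequence, 1)])  # → ['H', 'T']

# 3-back: K and N match the items three places before them
print([sequence[i] for i in n_back_targets(sequence, 3)])  # → ['K', 'N']
```

What makes the human version hard, of course, is that the test-taker has no list to scan: the last n items must be held and continuously updated in working memory.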

A recent paper reported that training on a particularly fiendish version of the n-back task improves I.Q. scores. Instead of seeing a single series of items like the one above, test-takers saw two different sequences, one of single letters and one of spatial locations. They had to report n-back repetitions of both letters and locations, a task that required them to simultaneously keep track of both sequences. As the trainees got better, n was increased to make the task harder. If their performance dropped, the task was made easier until they recovered.
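The adaptive rule described above -- raise n when performance is good, lower it when performance drops -- is a simple staircase procedure. A hypothetical sketch (the 90 and 70 percent accuracy thresholds are invented for illustration; the paper's actual criteria may differ):

```python
# Staircase difficulty adjustment: after each block of trials, move n up
# on a strong block, down (never below 1) on a weak one, else hold steady.

def adjust_n(n, accuracy, up=0.90, down=0.70):
    """Return the next block's n given this block's proportion correct."""
    if accuracy >= up:
        return n + 1
    if accuracy < down:
        return max(1, n - 1)
    return n

n = 3  # a typical starting level
for block_accuracy in [0.95, 0.92, 0.60, 0.85, 0.91]:
    n = adjust_n(n, block_accuracy)
print(n)  # → 5
```

The point of such a rule is to keep every trainee working at the edge of his or her capacity, which is presumably why gains tracked days of practice rather than starting ability.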

Each day, test-takers trained for 25 minutes. On the first day, the average participant could handle the 3-back condition. By the 19th day, average performance reached the 5-back level, and participants showed a four-point gain in their I.Q. scores.

The I.Q. improvement was larger in people who’d had more days of practice, suggesting that the effect was a direct result of training. People benefited across the board, regardless of their starting levels of working memory or I.Q. scores (though the results hint that those with lower I.Q.s may have shown larger gains). Simply practicing an I.Q. test can lead to some improvement on the test, but control subjects who took the same two I.Q. tests without training improved only slightly. Also, increasing I.Q. scores by practice doesn’t necessarily increase other measures of reasoning ability (Ackerman, 1987).

Since the gains accumulated over a period of weeks, training is likely to have drawn upon brain mechanisms for learning that can potentially outlast the training. But this is not certain. If continual practice is necessary to maintain I.Q. gains, then this finding looks like a laboratory curiosity. But if the gains last for months (or longer), working memory training may become as popular as — and more effective than — games like sudoku among people who worry about maintaining their cognitive abilities.

Now, some caveats. The results, though tantalizing, are not perfect. It would have been better to give the control group some other training not related to working memory, to show that the hard work of training did not simply motivate the experimental group to try harder on the second I.Q. test. The researchers did not test whether working memory training improved problem-solving tasks of the type that might occur in real life. Finally, they did not explore how much improvement would be seen with further training.

Research on working memory training, as well as Flynn’s original observations, raises the possibility that the fast-paced modern world, despite its annoyances (or even because of them), may be improving our reasoning ability. Maybe even multitasking — not the most efficient way to work — is good for your brain because of the mental challenge. Something to think about when you’re contemplating retirement on a deserted island.

**********

NOTES:

C. Jarrold and J.N. Towse (2006) Individual differences in working memory. Neuroscience 139:39–50.

P.L. Ackerman, M.E. Beier, and M.O. Boyle (2005) Working memory and intelligence: the same or different constructs? Psychological Bulletin 131:30–60.

M.J. Kane, D.Z. Hambrick, and A.R.A. Conway (2005) Working memory capacity and fluid intelligence are strongly related constructs: comment on Ackerman, Beier, and Boyle (2005). Psychological Bulletin 131:66–71.

H.-M. Süss, K. Oberauer, W.W. Wittmann, O. Wilhelm, and R. Schulze (2002) Working-memory capacity explains reasoning ability—and a little bit more. Intelligence 30:261–288.

S.M. Jaeggi, M. Buschkuehl, J. Jonides, and W.J. Perrig (2008) Improving fluid intelligence with training on working memory. Proceedings of the National Academy of Sciences USA 105:6829–6833.

D.A. Bors, F. Vigneau (2003) The effect of practice on Raven’s Advanced Progressive Matrices. Learning and Individual Differences 13:291–312.

P.L. Ackerman (1987) Individual differences in skill learning: An integration of psychometric and information processing perspectives. Psychological Bulletin 102:3–27.
Title: Remembering jokes
Post by: Crafty_Dog on March 17, 2009, 09:52:17 AM
Basics
In One Ear and Out the Other
NYT
Published: March 16, 2009

By all accounts, my grandfather Nathan had the comic ambitions of a Jack Benny but the comic gifts of a John Kerry. Undeterred, he always kept a few blank index cards in his pocket, so that if he happened to hear a good joke, he’d have someplace to write it down.

How I wish I knew where Nathan stashed that deck.

Like many people, I can never remember a joke. I hear or read something hilarious, I laugh loudly enough to embarrass everybody else in the library, and then I instantly forget everything about it — everything except the fact, always popular around the dinner table, that “I heard a great joke today, but now I can’t remember what it was.”

For researchers who study memory, the ease with which people forget jokes is one of those quirks, those little skids on the neuronal banana peel, that end up revealing a surprising amount about the underlying architecture of memory.

And there are plenty of other similarly illuminating examples of memory’s whimsy and bad taste — like why you may forget your spouse’s birthday but will go to your deathbed remembering every word of the “Gilligan’s Island” theme song. And why you must chop a string of data like a phone number into manageable and predictable chunks to remember it and will fall to pieces if you are in Britain and hear a number read out as “double-four, double-three.” And why your efforts to fill in a sudden memory lapse by asking your companions, “Hey, what was the name of that actor who starred in the movie we saw on Friday?” may well fail, because (what useless friends!) now they’ve all forgotten, too.

Welcome to the human brain, your three-pound throne of wisdom with the whoopee cushion on the seat.

In understanding human memory and its tics, Scott A. Small, a neurologist and memory researcher at Columbia, suggests the familiar analogy with computer memory.

We have our version of a buffer, he said, a short-term working memory of limited scope and fast turnover rate. We have our equivalent of a save button: the hippocampus, deep in the forebrain, is essential for translating short-term memories into a more permanent form.

Our frontal lobes perform the find function, retrieving saved files to embellish as needed. And though scientists used to believe that short- and long-term memories were stored in different parts of the brain, they have discovered that what really distinguishes the lasting from the transient is how strongly the memory is engraved in the brain, and the thickness and complexity of the connections linking large populations of brain cells. The deeper the memory, the more readily and robustly an ensemble of like-minded neurons will fire.

This process, of memory formation by neuronal entrainment, helps explain why some of life’s offerings weasel in easily and then refuse to be spiked. Music, for example. “The brain has a strong propensity to organize information and perception in patterns, and music plays into that inclination,” said Michael Thaut, a professor of music and neuroscience at Colorado State University. “From an acoustical perspective, music is an overstructured language, which the brain invented and which the brain loves to hear.”

A simple melody with a simple rhythm and repetition can be a tremendous mnemonic device. “It would be a virtually impossible task for young children to memorize a sequence of 26 separate letters if you just gave it to them as a string of information,” Dr. Thaut said. But when the alphabet is set to the tune of the ABC song with its four melodic phrases, preschoolers can learn it with ease.

And what are the most insidious jingles or sitcom themes but cunning variations on twinkle twinkle ABC?

Really great jokes, on the other hand, punch the lights out of do re mi. They work not by conforming to pattern recognition routines but by subverting them. “Jokes work because they deal with the unexpected, starting in one direction and then veering off into another,” said Robert Provine, a professor of psychology at the University of Maryland, Baltimore County, and the author of “Laughter: A Scientific Investigation.” “What makes a joke successful are the same properties that can make it difficult to remember.”

This may also explain why the jokes we tend to remember are often the most clichéd ones. A mother-in-law joke? Yes, I have the slot ready and labeled.

Memory researchers suggest additional reasons that great jokes may elude common capture. Daniel L. Schacter, a professor of psychology at Harvard and the author of “The Seven Sins of Memory,” says there is a big difference between verbatim recall of all the details of an event and gist recall of its general meaning.

“We humans are pretty good at gist recall but have difficulty with being exact,” he said. Though anecdotes can be told in broad outline, jokes live or die by nuance, precision and timing. And while emotional arousal normally enhances memory, it ends up further eroding your attention to that one killer frill. “Emotionally arousing material calls your attention to a central object,” Dr. Schacter said, “but it can make it difficult to remember peripheral details.”

As frustrating as it can be to forget something new, it’s worse to forget what you already know. Scientists refer to this as the tip-of-the-tongue phenomenon, when you know something but can’t spit it out, and the harder you try the more noncompliant the archives.

It’s such a virulent disorder that when you ask friends for help, you can set off so-called infectious amnesia. Behind the tying up of tongues are the too-delicate nerves of our brain’s frontal lobes and their sensitivity to anxiety and the hormones of fight or flight. The frontal lobes that rifle through stored memories and perform other higher cognitive tasks tend to shut down when the lower brain senses danger and demands that energy be shunted its way.

For that reason anxiety can be a test taker’s worst foe, and the anxiety of a pop quiz from a friend can make your frontal lobes freeze and your mind go blank. That is also why you’ll recall the frustratingly forgotten fact later that night, in the tranquillity of bed.

Memories can be strengthened with time and practice, practice, practice, but if there’s one part of the system that resists improvement, it’s our buffers, the size of our working memory on which a few items can be temporarily cached. Much research suggests that we can hold in short-term memory only five to nine data chunks at a time.

The limits of working memory again encourage our pattern-mad brains, and so we strive to bunch phone numbers into digestible portions and could manage even 10-digit strings when they had area codes with predictable phrases like a middle zero or one. But with the rise of atonal phone numbers with random strings of 10 digits, memory researchers say the limits of working memory have been crossed. Got any index cards?
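The chunking strategy the article describes is simple enough to sketch. The group sizes below just mirror the familiar North American area-code / exchange / line pattern — three chunks to hold in mind instead of ten separate digits.

```python
def chunk(digits, sizes=(3, 3, 4)):
    """Break a digit string into groups, e.g. the area-code/exchange/line pattern."""
    out, i = [], 0
    for size in sizes:
        out.append(digits[i:i + size])
        i += size
    return out

print(chunk("2125551212"))  # ['212', '555', '1212']
```

Three chunks sit comfortably inside the five-to-nine-item limit of working memory; ten unpatterned digits do not.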

Title: The End of Philosophy
Post by: Crafty_Dog on April 07, 2009, 08:46:16 AM
The End of Philosophy
DAVID BROOKS
Published: April 6, 2009
Socrates talked. The assumption behind his approach to philosophy, and the approaches of millions of people since, is that moral thinking is mostly a matter of reason and deliberation: Think through moral problems. Find a just principle. Apply it.

One problem with this kind of approach to morality, as Michael Gazzaniga writes in his 2008 book, “Human,” is that “it has been hard to find any correlation between moral reasoning and proactive moral behavior, such as helping other people. In fact, in most studies, none has been found.”

Today, many psychologists, cognitive scientists and even philosophers embrace a different view of morality. In this view, moral thinking is more like aesthetics. As we look around the world, we are constantly evaluating what we see. Seeing and evaluating are not two separate processes. They are linked and basically simultaneous.

As Steven Quartz of the California Institute of Technology said during a recent discussion of ethics sponsored by the John Templeton Foundation, “Our brain is computing value at every fraction of a second. Everything that we look at, we form an implicit preference. Some of those make it into our awareness; some of them remain at the level of our unconscious, but ... what our brain is for, what our brain has evolved for, is to find what is of value in our environment.”

Think of what happens when you put a new food into your mouth. You don’t have to decide if it’s disgusting. You just know. You don’t have to decide if a landscape is beautiful. You just know.

Moral judgments are like that. They are rapid intuitive decisions and involve the emotion-processing parts of the brain. Most of us make snap moral judgments about what feels fair or not, or what feels good or not. We start doing this when we are babies, before we have language. And even as adults, we often can’t explain to ourselves why something feels wrong.

In other words, reasoning comes later and is often guided by the emotions that preceded it. Or as Jonathan Haidt of the University of Virginia memorably wrote, “The emotions are, in fact, in charge of the temple of morality, and ... moral reasoning is really just a servant masquerading as a high priest.”

The question then becomes: What shapes moral emotions in the first place? The answer has long been evolution, but in recent years there’s an increasing appreciation that evolution isn’t just about competition. It’s also about cooperation within groups. Like bees, humans have long lived or died based on their ability to divide labor, help each other and stand together in the face of common threats. Many of our moral emotions and intuitions reflect that history. We don’t just care about our individual rights, or even the rights of other individuals. We also care about loyalty, respect, traditions, religions. We are all the descendants of successful cooperators.

The first nice thing about this evolutionary approach to morality is that it emphasizes the social nature of moral intuition. People are not discrete units coolly formulating moral arguments. They link themselves together into communities and networks of mutual influence.

The second nice thing is that it entails a warmer view of human nature. Evolution is always about competition, but for humans, as Darwin speculated, competition among groups has turned us into pretty cooperative, empathetic and altruistic creatures — at least within our families, groups and sometimes nations.

The third nice thing is that it explains the haphazard way most of us lead our lives without destroying dignity and choice. Moral intuitions have primacy, Haidt argues, but they are not dictators. There are times, often the most important moments in our lives, when in fact we do use reason to override moral intuitions, and often those reasons — along with new intuitions — come from our friends.

The rise and now dominance of this emotional approach to morality is an epochal change. It challenges all sorts of traditions. It challenges the bookish way philosophy is conceived by most people. It challenges the Talmudic tradition, with its hyper-rational scrutiny of texts. It challenges the new atheists, who see themselves involved in a war of reason against faith and who have an unwarranted faith in the power of pure reason and in the purity of their own reasoning.

Finally, it should also challenge the very scientists who study morality. They’re good at explaining how people make judgments about harm and fairness, but they still struggle to explain the feelings of awe, transcendence, patriotism, joy and self-sacrifice, which are not ancillary to most people’s moral experiences, but central. The evolutionary approach also leads many scientists to neglect the concept of individual responsibility and makes it hard for them to appreciate that most people struggle toward goodness, not as a means, but as an end in itself.

Title: Neuroworld
Post by: Body-by-Guinness on April 10, 2009, 05:51:24 AM
Interesting blog that tracks odd neurological tidbits:

http://trueslant.com/ryansager/
Title: NYT: Animal "regret"?
Post by: Crafty_Dog on June 02, 2009, 07:02:10 AM
In That Tucked Tail, Real Pangs of Regret?

By JOHN TIERNEY
Published: June 1, 2009
If you own a dog, especially a dog that has anointed your favorite rug, you know that an animal is capable of apologizing. He can whimper and slouch and tuck his tail and look positively mortified — “I don’t know what possessed me.” But is he really feeling sorry?



Could any animal feel true pangs of regret? Scientists once scorned this notion as silly anthropomorphism, and I used to side with the skeptics who dismissed these displays of contrition as variations of crocodile tears. Animals seemed too in-the-moment, too busy chasing the next meal, to indulge in much self-recrimination. If old animals had a song, it would be “My Way.”

Yet as new reports keep appearing — moping coyotes, rueful monkeys, tigers that cover their eyes in remorse, chimpanzees that second-guess their choices — the more I wonder if animals do indulge in a little paw-wringing.

Your dog may not share Hamlet’s dithering melancholia, but he might have something in common with Woody Allen.

The latest data comes from brain scans of monkeys trying to win a large prize of juice by guessing where it was hidden. When the monkeys picked wrongly and were shown the location of the prize, the neurons in their brain clearly registered what might have been, according to the Duke University neurobiologists who recently reported the experiment in Science.

“This is the first evidence that monkeys, like people, have ‘would-have, could-have, should-have’ thoughts,” said Ben Hayden, one of the researchers. Another of the authors, Michael Platt, noted that the monkeys reacted to their losses by shifting their subsequent guesses, just like humans who respond to a missed opportunity by shifting strategy.

“I can well imagine that regret would be highly advantageous evolutionarily, so long as one doesn’t obsess over it, as in depression,” Dr. Platt said. “A monkey lacking in regret might act like a psychopath or a simian Don Quixote.”

In earlier experiments, both chimpanzees and monkeys that traded tokens for cucumbers responded negatively once they saw that other animals were getting a tastier treat — grapes — for the same price. They made angry sounds and sometimes flung away the cucumbers or their tokens, reported Sarah Brosnan, a psychologist at Georgia State University.

“I think animals do experience regret, as defined as the recognition of a missed opportunity,” Dr. Brosnan said. “In the wild, these abilities may help them to recognize when they should forage in different areas or find a different cooperative partner who will share the spoils more equitably.”

No one knows, of course, exactly how this sense of regret affects an animal emotionally. When we see a dog slouching and bowing, we like to assume he’s suffering the way we do after a faux pas, but maybe he’s just sending a useful signal: I messed up.

“It’s possible that this kind of social signal in animals could have evolved without the conscious experience of regret,” said Sam Gosling, a psychologist at the University of Texas, Austin. “But it seems more plausible that there is some kind of conscious experience even if it’s not the same kind of thing that you or I feel.”

Marc Bekoff, a behavioral ecologist at the University of Colorado, says he’s convinced that animals feel emotional pain for their mistakes and missed opportunities. In “Wild Justice,” a new book he wrote with the philosopher Jessica Pierce, Dr. Bekoff reports on thousands of hours of observation of coyotes in the wild as well as free-running domesticated dogs.

When a coyote recoiled after being bitten too hard while playing, the offending coyote would promptly bow to acknowledge the mistake, Dr. Bekoff said. If a coyote was shunned for playing unfairly, he would slouch around with his ears slightly back, head cocked and tail down, tentatively approaching and then withdrawing from the other animals. Dr. Bekoff said the apologetic coyotes reminded him of the unpopular animals skulking at the perimeter of a dog park.

“These animals are not as emotionally sophisticated as humans, but they have to know what’s right and wrong because it’s the only way their social groups can work,” he said. “Regret is essential, especially in the wild. Humans are very forgiving to their pets, but if a coyote in the wild gets a reputation as a cheater, he’s ignored or ostracized, and he ends up leaving the group.” Once the coyote is on his own, Dr. Bekoff discovered, the coyote’s risk of dying young rises fourfold.

If our pets realize what soft touches we are, perhaps their regret is mostly just performance art to sucker us. But I like to think that some of the ruefulness is real, and that researchers will one day compile a list of the Top 10 Pet Regrets. (You can make nominations at TierneyLab, at nytimes.com/tierneylab.) At the very least, I’d like to see researchers tackle a few of the great unanswered questions:

When you’re playing fetch with a dog, how much regret does he suffer when he gives you back the ball? As much as when he ends the game by hanging on to the ball?

Do animal vandals feel any moral qualms? After seeing rugs, suitcases and furniture destroyed by my pets, I’m not convinced that evolution has endowed animals with any reliable sense of property rights. But I’m heartened by Eugene Linden’s stories of contrite vandals in his book on animal behavior, “The Parrot’s Lament.”

He tells of a young tiger that, after tearing up all the newly planted trees at a California animal park, covered his eyes with his paws when the zookeeper arrived. And there were the female chimpanzees at the Tulsa Zoo that took advantage of a renovation project to steal the painters’ supplies, don gloves and paint their babies solid white. When confronted by their furious keeper, the mothers scurried away, then returned with peace offerings and paint-free babies.

How awkward is the King Kong Syndrome? Both male and female gorillas have become so fond of their human keepers that they’ve made sexual overtures — one even took to dragging his keeper by her hair. After the inevitable rebuff, do they regret ruining a beautiful friendship?

Do pet cats ever regret anything?
Title: Younger women good for you
Post by: Crafty_Dog on June 03, 2009, 10:07:42 PM
Men 'live longer' if they marry a younger woman
Men are likely to live longer if they marry a younger woman, new research suggests.


By Murray Wardrop
Published: 7:31AM BST 02 Jun 2009

A man's chances of dying early are cut by a fifth if his bride is 15 to 17 years his junior.

The risk of premature death is reduced by 11 per cent if he marries a woman seven to nine years younger.

The study at Germany's Max Planck Institute also found that men marrying older women are more likely to die early.

The results suggest that women do not experience the same benefits of marrying a toy boy or a sugar daddy.

Wives with husbands older or younger by between seven and nine years increase their chances of dying early by 20 per cent.

This rises to 30 per cent if the age difference is between 15 and 17 years.

Scientists say the figures for men may be the result of natural selection – that only the healthiest, most successful older men are able to attract younger mates.

"Another theory is that a younger woman will care for a man better and therefore he will live longer," said institute spokesman Sven Drefahl.

The study examined deaths between 1990 and 2005 for the entire population of Denmark.

On average in Europe, most men marry women around three years younger.
Title: Re: Evolutionary biology/psychology
Post by: matinik on June 05, 2009, 03:03:17 PM

Boys with 'Warrior Gene' More Likely to Join Gangs

LiveScience.com

Boys who have a so-called "warrior gene" are more likely to join gangs and also more likely to be among the most violent members and to use weapons, a new study finds.

"While gangs typically have been regarded as a sociological phenomenon, our investigation shows that variants of a specific MAOA gene, known as a 'low-activity 3-repeat allele,' play a significant role," said biosocial criminologist Kevin M. Beaver of Florida State University.

In 2006, the controversial warrior gene was implicated in the violence of the indigenous Maori people in New Zealand, a claim that Maori leaders dismissed.

But it's no surprise that genes would be involved in aggression. Aggression is a primal emotion like many others, experts say, and like cooperation, it is part of human nature, something that's passed down genetically. And almost all mammals are aggressive in some way or another, said Craig Kennedy, professor of special education and pediatrics at Vanderbilt University in Tennessee, whose research last year suggested that humans crave violence just like they do sex, food or drugs.

"Previous research has linked low-activity MAOA variants to a wide range of antisocial, even violent, behavior, but our study confirms that these variants can predict gang membership," says Beaver, the Florida State researcher. "Moreover, we found that variants of this gene could distinguish gang members who were markedly more likely to behave violently and use weapons from members who were less likely to do either."

The MAOA gene affects levels of neurotransmitters such as dopamine and serotonin that are related to mood and behavior, and those variants that are related to violence are hereditary, according to a statement from the university.

The new study examined DNA data and lifestyle information drawn from more than 2,500 respondents to the National Longitudinal Study of Adolescent Health. Beaver and colleagues from Florida State, Iowa State and Saint Louis universities will detail their findings in a forthcoming issue of the journal Comprehensive Psychiatry.

A separate study at Brown University from earlier this year found that individuals with the warrior gene display higher levels of aggression in response to provocation.

Over networked computers, 78 test subjects were asked to cause physical pain to an opponent they believed had taken money from them by administering varying amounts of hot sauce. While the results were not dramatic, low-activity MAOA subjects displayed slightly higher levels of aggression overall, the researchers said.

The Brown University results, published in the journal Proceedings of the National Academy of Sciences, support previous research suggesting that MAOA influences aggressive behavior, the scientists said.

I wonder if this "warrior gene" is now being studied and/or synthesized by some egghead to be applied as some sort of supersoldier serum
(shades of Capt. America!) :-D
Title: Women: The choosier sex?
Post by: rachelg on July 08, 2009, 07:42:48 PM
http://www.salon.com/mwt/broadsheet/feature/2009/07/08/mate_selection/index.html
 
I recently listened to a great podcast at EconTalk where Alan Wolfe made the point that some evolutionary biologists are really atheist Calvinists: everything is predestined and there is no free will.
 http://www.econtalk.org/archives/2009/05/wolfe_on_libera.html

I'm not against all evolutionary biology, but I think it sometimes shares the pitfall of all social sciences: taking three Post-it notes' worth of data and writing a textbook's worth of material.

Women: The choosier sex?
That isn't the case in speed dating, where ladies approach men, says a new study

Tracy Clark-Flory
http://www.salon.com/mwt/broadsheet/feature/2009/07/08/mate_selection/print.html

Jul. 08, 2009 |

You've likely encountered this question many a time before: When it comes to sex, why do men do the chasing while women do the choosing? Maybe the query was first answered by your mother: Men have to fight for women because it's the fairer sex that gets pregnant, gives birth and does all the work of raising the kids! Perhaps at some point you got the sober evo-psych explanation: Females are more selective because they bear the greater reproductive burden. Or, maybe you're more familiar with pickup artist parlance: Chicks are choosier 'cause they're the ones who get knocked up. Most of us have heard the same answer put a number of different ways -- but now a team of researchers are casting doubt on our assumption about the push-pull of human courtship.

In a new study from Northwestern University, 350 college-age men and women attended speed-dating events. In half of the games of romantical chairs, the guys went from girl to girl; in the other half, the girls went from guy to guy. Each pair got four minutes to chat, after which they evaluated their interest in each other. When it came to the events where men worked the room, everyone performed just as expected: The men were less selective than the women. But when the usual speed-dating routine was turned on its head and the women made the rounds, the guys were more selective and the ladies were less picky.

The study's press release puts the findings simply: "Regardless of gender, the participants who rotated experienced greater romantic desire for and chemistry with their partners, compared to participants who sat throughout the event." Researcher Eli J. Finkel says the results suggest that research revealing women as the choosier sex might be best explained by "the roles men and women play in the opening seconds of new romantic contacts."

Now, don't go discarding the theory of human sexual selection just yet! Note that the study doesn't show that the sexes are equally selective. It does, however, raise some interesting questions: Could the disparity in sexual selectivity be a result of nurture (as in, "go out and get some nookie, you stud!") rather than nature ("man need sex -- grunt, scratch")? Are evolutionary tendencies easily overthrown by simple social engineering? One thing is for sure: This study will set the so-called seduction community abuzz with debate on how to recreate this speed-dating reversal in everyday life.

-- Tracy Clark-Flory
Title: Re: Evolutionary biology/psychology
Post by: Crafty_Dog on July 09, 2009, 07:15:23 AM
Wouldn't birth control have a lot to do with it?

Also, are current behaviors shown to be evolutionarily successful?  Intuitively it seems to me that there is a correlation between them and birth rates below population maintenance.
Title: Speed Dating and the end of the world
Post by: rachelg on July 09, 2009, 06:49:14 PM
Marc,
The birth control issue would have affected the moving men and women and the non-moving speed-dating men and women the same.
 
Are you saying the moving men and women would not have been affected without birth control? It is possible but probably unknowable.
 
Birth control being convenient, legal, safe and effective (though half of all pregnancies are unplanned) is a recent development.
Birth control (onanism) has existed at least since biblical times. There have also been herbs, pessaries, etc. used for birth control and/or abortion for a very long time.


I do not think the problem with our society is either birth control or women chasing men. I think the problem is how our society defines success and happiness and what is valuable. Money is often treated as the highest good and seen as more valuable than relationships. Children are seen as a drain and not worth the effort.
 
We don't exactly praise those who get married young and have large families.
 
I do know religious women who control the size and timing of their family with birth control and have large families.

It would seem better to me that children would mostly be planned or wanted additions to families. I do realize that unplanned children often end up being the best thing that ever happened to some people.

Everything has consequences, but I am strongly in favor of both families and family planning.

Also, why do we even care about the fate of our genes? Why is what is best for our genes necessarily best for us?



I would be content to blame all of society's ills on speed dating. I have never participated, but I have heard it was awful.
Title: Evolution & Level Playing Fields
Post by: Body-by-Guinness on July 11, 2009, 09:02:13 AM
I think the jury's still out on the evolutionary success of safe, effective, easily reversible contraception. There's a lot of research out there that associates lower birthrates caused by contraception with an increased standard of living, though which is cause and which is effect is disputed. Until just the past few decades, most humans had what would be considered by today's standards limited choices for leisure while subsistence demanded time and energy be devoted primarily to issues linked directly to survival. Children past the age of 5 or so were a labor asset, while below that age their mortality was high. Entertainment options were few and far between, with horizontal recreation being one of the few diversions consistently and easily available.

All that has changed with the introduction of modern contraception, cheap consumer goods, and then the resources to obtain 'em. Indeed, I think the margin worth keeping an eye on here is the first world/third world one: will the downward trend in the price of relative luxury items continue and the leisure time to pursue them increase, or will environmental and religious zealots maintain or roll back the first world status quo and, in doing so, preserve the nasty, brutish and short third world status quo?

As mentioned, this will be interesting to watch. Think long experience has demonstrated that handing third world kleptocracies money does little to increase the standard of living of their citizens, though the falling cost of consumer goods makes products accessible in all sorts of unlikely places. Will environmental zealots via regulation or religious zealots via proscription derail the trickle down of consumer goods and the resultant rise in the standard of living? Don't think we'll begin to have an answer to the evolutionary impact question until the playing field is more level, and think first world/third world margins will continue to be conflict points until a ubiquitous distribution of consumer goods is achieved.
Title: Security, Group Size, and the Human Brain
Post by: Crafty_Dog on July 15, 2009, 04:27:37 AM
      Security, Group Size, and the Human Brain



If the size of your company grows past 150 people, it's time to get name
badges. It's not that larger groups are somehow less secure, it's just
that 150 is the cognitive limit to the number of people a human brain
can maintain a coherent social relationship with.

Primatologist Robin Dunbar derived this number by comparing neocortex --
the "thinking" part of the mammalian brain -- volume with the size of
primate social groups. By analyzing data from 38 primate genera and
extrapolating to the human neocortex size, he predicted a human "mean
group size" of roughly 150.
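Dunbar's extrapolation is essentially a log-log regression: fit log(group size) against log(neocortex ratio) across primate genera, then plug in the human value. A minimal sketch in Python follows; the data points are made-up illustrative numbers, not Dunbar's actual dataset, and the human neocortex ratio of ~4.1 is the commonly cited figure.

```python
import math

# Hypothetical (neocortex ratio, mean group size) pairs for a few primate
# genera -- illustrative stand-ins, not Dunbar's real measurements.
data = [
    (1.2, 4.0),
    (2.0, 14.0),
    (2.6, 30.0),
    (3.0, 80.0),
]

# Least-squares fit of log(group size) on log(neocortex ratio).
xs = [math.log(r) for r, _ in data]
ys = [math.log(g) for _, g in data]
n = len(data)
mx, my = sum(xs) / n, sum(ys) / n
slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
intercept = my - slope * mx

# Extrapolate to the human neocortex ratio (~4.1, the commonly cited figure).
human_ratio = 4.1
predicted = math.exp(intercept + slope * math.log(human_ratio))
print(round(predicted))
```

With these toy numbers the fit extrapolates to a group size in the neighborhood of Dunbar's ~150; the actual prediction of course depends on the real comparative data.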

This number appears regularly in human society; it's the estimated size
of a Neolithic farming village, the size at which Hittite settlements
split, and the basic unit in professional armies from Roman times to the
present day. Larger group sizes aren't as stable because their members
don't know each other well enough. Instead of thinking of the members as
people, we think of them as groups of people. For such groups to
function well, they need externally imposed structure, such as name badges.

Of course, badges aren't the only way to determine in-group/out-group
status. Other markers include insignia, uniforms, and secret handshakes.
They have different security properties and some make more sense than
others at different levels of technology, but once a group reaches 150
people, it has to do something.

More generally, there are several layers of natural human group size
that increase with a ratio of approximately three: 5, 15, 50, 150, 500,
and 1500 -- although, really, the numbers aren't as precise as all that,
and groups that are less focused on survival tend to be smaller. The
layers relate to both the intensity and intimacy of relationship and the
frequency of contact.
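That "ratio of approximately three" is easy to sanity-check. The sketch below (Python, purely illustrative; the helper function is my own framing, not from the essay) computes the successive ratios between the quoted layers and picks the smallest canonical layer that can hold a group of a given size:

```python
# The canonical layer sizes quoted in the essay.
LAYERS = [5, 15, 50, 150, 500, 1500]

# Each successive layer is roughly 3x the previous one
# (the ratios alternate between 3.0 and ~3.33).
ratios = [b / a for a, b in zip(LAYERS, LAYERS[1:])]
assert all(2.9 < r < 3.4 for r in ratios)

def smallest_layer_holding(n: int) -> int:
    """Return the smallest canonical layer that can hold a group of n
    people (or the largest layer if n exceeds them all)."""
    for size in LAYERS:
        if n <= size:
            return size
    return LAYERS[-1]

print(smallest_layer_holding(140))   # 150 -- still name-badge-free territory
print(smallest_layer_holding(600))   # 1500 -- formal structure needed
```

As the essay notes, the real numbers aren't as precise as all that; the point is only that each layer sits about a factor of three above the one before it.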

The smallest, three to five, is a "clique": the number of people from
whom you would seek help in times of severe emotional distress. The
twelve to 20 group is the "sympathy group": people with whom you have
special ties. After that, 30 to 50 is the typical size of
hunter-gatherer overnight camps, generally drawn from the same pool of
150 people. No matter what size company you work for, there are only
about 150 people you consider to be "co-workers." (In small companies,
Alice and Bob handle accounting. In larger companies, it's the
accounting department -- and maybe you know someone there personally.)
The 500-person group is the "megaband," and the 1,500-person group is
the "tribe." Fifteen hundred is roughly the number of faces we can put
names to, and the typical size of a hunter-gatherer society.

These numbers are reflected in military organization throughout history:
squads of 10 to 15 organized into platoons of three to four squads,
organized into companies of three to four platoons, organized into
battalions of three to four companies, organized into regiments of three
to four battalions, organized into divisions of two to three regiments,
and organized into corps of two to three divisions.

Coherence can become a real problem once organizations get above about
150 in size.  So as group sizes grow across these boundaries, they have
more externally imposed infrastructure -- and more formalized security
systems. In intimate groups, pretty much all security is ad hoc.
Companies smaller than 150 don't bother with name badges; companies
greater than 500 hire a guard to sit in the lobby and check badges.  The
military have had centuries of experience with this under rather trying
circumstances, but even there the real commitment and bonding invariably
occurs at the company level. Above that you need to have rank imposed by
discipline.

The whole brain-size comparison might be bunk, and a lot of evolutionary
psychologists disagree with it. But certainly security systems become
more formalized as groups grow larger and their members less known to
each other. When do more formal dispute resolution systems arise: town
elders, magistrates, judges? At what size boundary are formal
authentication schemes required? Small companies can get by without the
internal forms, memos, and procedures that large companies require; when
does what tend to appear? How does punishment formalize as group size
increases? And how do all these things affect group coherence? People act
differently on social networking sites like Facebook when their list of
"friends" grows larger and less intimate. Local merchants sometimes let
known regulars run up tabs. I lend books to friends with much less
formality than a public library. What examples have you seen?

An edited version of this essay, without links, appeared in the
July/August 2009 issue of IEEE Security & Privacy.

A copy of this essay, with all embedded links, is here:
http://www.schneier.com/blog/archives/2009/07/security_group.html
Title: Crowd Control Rethinking
Post by: Body-by-Guinness on July 19, 2009, 07:12:23 PM
Why cops should trust the wisdom of the crowds

17 July 2009 by Michael Bond
Magazine issue 2717.

Would letting crowds manage themselves be a better alternative? (Image: Simon Dack/Brighton Argus)
THE protests that took place on the streets of London on the eve of the G20 summit in April lived up to many people's expectations. Around 2000 protestors turned up, and were heavily marshalled by police. There was a bit of trouble, but the police tactics - specifically, the decision to corral the entire crowd into a small area near the Bank of England, an approach known as "kettling" - kept a lid on the violence.

That, at least, is the official version of events, and it reflects a belief about crowds that is shared by police, governments and to a large degree the general public across the world: that they are hotbeds of trouble and must be contained. Trouble is seen as especially likely when something goes wrong at a large gathering. Under such circumstances, the expectation is that the crowd will lose its head and all hell will break loose.

The "unruly mob" concept is usually taken as read and used as the basis for crowd control measures and evacuation procedures across the world. Yet it is almost entirely a myth. Research into how people behave at demonstrations, sports events, music festivals and other mass gatherings shows not only that crowds nearly always act in a highly rational way, but also that when facing an emergency, people in a crowd are more likely to cooperate than panic. Paradoxically, it is often actions such as kettling that lead to violence breaking out. Often, the best thing authorities can do is leave a crowd to its own devices.

"In many ways, crowds are the solution," says psychologist Stephen Reicher, who studies group behaviour at the University of St Andrews, UK. Rather than being prone to irrational behaviour and violence, members of a crowd undergo a kind of identity shift that drives them to act in the best interests of themselves and everyone around them. This identity shift is often strongest in times of danger or threat. "The 'mad mob' is not an explanation, but a fantasy," says Reicher.

All this has profound implications for policing and the management of public events. "The classic view of crowd psychology, which is still widespread, talks about the loss of selfhood, leaving people at best out of control and at worst generically violent," says Reicher. "That is not only wrong, it's also counterproductive. If you believe all crowds are irrational, and that even rational people are liable to be dangerous in them, then you'll treat them accordingly, often harshly, and stop people doing things they have a right to do. And that can lead to violence."

If you believe all crowds are irrational and treat them accordingly, it can lead to violence
All that said, there's no question that being part of a group can sometimes lead people to do appalling things that they would usually abhor. Examples of crowd-fuelled violence abound, from Hutu death-squads in the Rwandan genocide to racist lynch mobs in the southern states of the US. Likewise, the cover crowds offer can attract individuals who are intent on causing trouble. We can all too easily be led astray by the influence of others (New Scientist, 14 April 2007, p 42).

However, crowd violence is actually extremely rare. "If 100 football matches happen on a Saturday and there is violence at one of them, we know which will appear on the front pages the next day," says Reicher. Widespread panic during crowd emergencies is also uncommon and only occurs in special circumstances, such as when escape routes start to close, says Tricia Wachtendorf of the Disaster Research Center at the University of Delaware in Newark. In most situations - particularly those involving large numbers of strangers - the crowd ends up behaving remarkably sensibly.

Evidence against the irrationality of crowds has been building for some time, largely from studies of emergencies. In a study to be published in the British Journal of Social Psychology (DOI: 10.1348/014466608X357893), a team led by John Drury at the University of Sussex, UK, talked to survivors of 11 crowd-based disasters or near-disasters, including the 1989 Hillsborough stadium crush that killed 96 soccer fans, and a free concert by Fatboy Slim on Brighton beach in 2002 that was swamped by 250,000 people, four times as many as expected, and led to around 100 injuries. In each case, most interviewees recalled a strong sense of unity with those around them as a result of their shared experience. Rather than being competitive or antagonistic, people did their best to be orderly and courteous - and went out of their way to help strangers. Researchers think that without such cooperation, more people could have been injured and killed.

The team found a similar pattern of solidarity and cooperative behaviour in a study of the suicide attacks in London on 7 July 2005, which led to crowds of commuters being trapped underground (International Journal of Mass Emergencies and Disasters, vol 27, p 66). "The public in general and crowds specifically are more resilient than they are given credit for," says Drury. During disasters, governments should treat them as the "fourth emergency service", he adds.

If anything, a crowd's disinclination to panic can work against it. "It's often difficult to get people to move and act," says Wachtendorf. An analysis of the 9/11 attacks on the World Trade Center, for example, performed by the US National Institute of Standards and Technology, showed that most people prevaricated for several minutes after the planes struck, making phone calls, filing papers or shutting down their computers before attempting to escape.

Having established that unruly mob behaviour is the exception, researchers are now getting to grips with the psychological processes that can transform hundreds or thousands of individuals into a unit. The key, according to Drury, Reicher and others, is the recognition that you share something important with those around you, which forces you to identify with them in a meaningful way. "It is a cognitive shift, a difference in self-appraisal, in understanding who you are and how you stand in relation to others," says Reicher.

The trigger is often a dramatic situational change such as a fire in a public place or aggressive police tactics at a protest march, but group solidarity can also arise from seemingly inconsequential circumstances, such as being stuck together in a train carriage. Reicher describes it as a shift towards intimacy: "People start agreeing with each other, trusting each other," he says. At the point when members of a crowd start to share a common social identity, the crowd goes from being a mere physical entity to a psychological unit, according to Clifford Stott at the University of Liverpool, UK, who specialises in the behaviour of soccer crowds.

United by circumstances

A study carried out by Drury, Reicher and David Novelli of the University of Sussex, to be published in the British Journal of Social Psychology, provides a graphic illustration of how quickly and easily we throw ourselves into "psychological crowds" united by circumstances. The researchers divided a group of volunteers into two according to whether they overestimated or underestimated the number of dots in a pattern - a deliberately arbitrary distinction. They then told each person that they would be talking to someone either from their own group or the other, and that they should arrange some chairs in preparation. Those who had been told they would be talking to a member of their own group placed the chairs on average 20 per cent closer together than those who had been told they would be talking to a member of the other group (DOI: 10.1348/014466609X449377). "We want to be closer to fellow group members, not only metaphorically but also physically, and physical proximity is a precondition for any kind of action coordination," says Reicher.

The fluidity of group psychology was also demonstrated in a 2005 experiment on English soccer fans by Mark Levine at the University of Lancaster, UK. He found that supporters of Manchester United who had been primed to think about how they felt about their team were significantly more likely to help an injured stranger if he was wearing a Manchester United shirt, rather than an unbranded shirt or one of rival team Liverpool. However, fans who were primed to think about their experience of being a football fan in general were equally likely to help strangers in Liverpool shirts and Manchester United shirts, but far less likely to help someone wearing an unbranded one (Personality and Social Psychology Bulletin, vol 31, p 443). This shows the potency of group membership, and also how fluid the boundaries can be.

This also happens in the real world, resulting in group bonding which, though transient, can override social, racial and political differences. A good example is the poll tax riots in London in 1990, when protestors from a wide spectrum of backgrounds and interest groups joined forces in the face of what they saw as overly aggressive police tactics. "You had people who were previously antagonistic - anarchists, conservatives, class-war activists - who in the context of the baton charges were united in common group membership," says Stott. This temporary homogenisation is common: think of the cohesiveness of soccer fans supporting an international team who might be hostile when supporting their own local clubs.

People who were previously antagonistic - anarchists, conservatives, class war activists - end up uniting in common group membership
Not everyone agrees. One criticism is that the cohesiveness of crowds is superficial, and that people preferentially draw close to those they know or are related to and remain far less attached to strangers around them. Anthony Mawson, an epidemiologist at the University of Mississippi Medical Center in Jackson, maintains that people's typical response in times of threat is to seek out people familiar to them (Public Health Reports, vol 123, p 555). Strangers can develop a shared identity only when they are together "for long enough that a sense of camaraderie develops among them", he says.

Yet studies by Drury and others suggest the bonds that form between strangers in crowds are very robust, and although people might help family members first in an emergency, they will also help others irrespective of their connection to them. "What is really of interest," says Drury, "is why so many people - strangers without any formal organisation, hierarchy or means of communication - join together and act as one."

So where does this inclination come from to empathise so strongly with others on the basis of shared fate alone? Nobody is really sure, though it appears to be uniquely human. As Mark van Vugt at the University of Kent, UK, and Justin Park at the University of Groningen in the Netherlands point out, no other species appears to have the capacity to form rapid emotional attachments to large, anonymous groups (The Psychology of Prosocial Behaviour, published by Wiley-Blackwell next month). The tendency of people to form strong social bonds while experiencing terror together also appears a universal human trait. "This is well known in traditional societies where boys going through puberty rituals in the transition to manhood are often put through frightening experiences," says Robin Dunbar, who studies the evolution of sociality at the University of Oxford.

Control and contain

What are the lessons from all this? One of the most important is that the current approach to managing crowds, which is all about control and containment, can be counterproductive. Police tend to assume that people in crowds are prone to random acts of violence and disorder, and treat them accordingly. But aggressive policing is likely to trigger an aggressive response as the crowd reacts collectively against the external threat. This is why many researchers consider kettling to be a bad idea. "You're treating the crowd indiscriminately, and that can change the psychology of the crowd, shifting it towards rather than away from violence," says Stott. He has found that low-profile policing can significantly reduce the aggressiveness of football crowds, and that if left alone they will usually police themselves.

Emergency services should also take note: in a situation such as a terrorist attack or fire, a crowd left to its own devices will often find the best solution. Attempts to intervene to prevent people panicking, such as restricting their movements, could make panic more likely. The key, says Wachtendorf, is to give crowds as much information as possible, as they are likely to use it wisely.

If you find yourself in a crowd emergency, the worst thing you can do is resist the group mentality. One of Drury's conclusions from his research into disasters is that the more people try to act individualistically - which results in competitive and disruptive behaviour - the lower everyone's chances of survival are. This is what some researchers believe happened in August 1985 when a British Airtours plane caught fire on the runway at Manchester Airport, UK, killing 55. Non-cooperative behaviour among passengers may have made it harder for people to reach the exits.

It can be hard to shake off the idea of crowds as inherently violent or dangerous, but it is worth remembering that they have also been responsible for just about every major societal change for the good in recent history, from the success of the US civil rights movement to the overthrowing of communist regimes in eastern Europe. Good leadership and individual heroics are all very well, but if you're looking for a revolution - or even just a good way out of a difficult situation - what you really need, it seems, is a crowd.

Michael Bond is a New Scientist consultant in London

http://www.newscientist.com/article/mg20327171.400-why-cops-should-trust-the-wisdom-of-the-crowds.html?full=true
Title: Seeking/How the brain hard-wires us to love Google, Twitter, and texting.
Post by: rachelg on August 16, 2009, 07:45:12 AM
I almost always tend to think technology advances are making the world a much better place, but this was a little frightening.

Science
Seeking
How the brain hard-wires us to love Google, Twitter, and texting. And why that's dangerous.
By Emily Yoffe
Posted Wednesday, Aug. 12, 2009, at 5:40 PM ET

Seeking. You can't stop doing it. Sometimes it feels as if the basic drives for food, sex, and sleep have been overridden by a new need for endless nuggets of electronic information. We are so insatiably curious that we gather data even if it gets us in trouble. Google searches are becoming a cause of mistrials as jurors, after hearing testimony, ignore judges' instructions and go look up facts for themselves. We search for information we don't even care about. Nina Shen Rastogi confessed in Double X, "My boyfriend has threatened to break up with me if I keep whipping out my iPhone to look up random facts about celebrities when we're out to dinner." We reach the point that we wonder about our sanity. Virginia Heffernan in the New York Times said she became so obsessed with Twitter posts about the Henry Louis Gates Jr. arrest that she spent days "refreshing my search like a drugged monkey."

We actually resemble nothing so much as those legendary lab rats that endlessly pressed a lever to give themselves a little electrical jolt to the brain. While we tap, tap away at our search engines, it appears we are stimulating the same system in our brains that scientists accidentally discovered more than 50 years ago when probing rat skulls.

In 1954, psychologist James Olds and his team were working in a laboratory at McGill University, studying how rats learned. They would stick an electrode in a rat's brain and, whenever the rat went to a particular corner of its cage, would give it a small shock and note the reaction. One day they unknowingly inserted the probe in the wrong place, and when Olds tested the rat, it kept returning over and over to the corner where it received the shock. He eventually discovered that if the probe was put in the brain's lateral hypothalamus and the rats were allowed to press a lever and stimulate their own electrodes, they would press until they collapsed.

Olds, and everyone else, assumed he'd found the brain's pleasure center (some scientists still think so). Later experiments done on humans confirmed that people will neglect almost everything—their personal hygiene, their family commitments—in order to keep getting that buzz.

But to Washington State University neuroscientist Jaak Panksepp, this supposed pleasure center didn't look very much like it was producing pleasure. Those self-stimulating rats, and later those humans, did not exhibit the euphoric satisfaction of creatures eating Double Stuf Oreos or repeatedly having orgasms. The animals, he writes in Affective Neuroscience: The Foundations of Human and Animal Emotions, were "excessively excited, even crazed." The rats were in a constant state of sniffing and foraging. Some of the human subjects described feeling sexually aroused but didn't experience climax. Mammals stimulating the lateral hypothalamus seem to be caught in a loop, Panksepp writes, "where each stimulation evoked a reinvigorated search strategy" (and Panksepp wasn't referring to Bing).

It is an emotional state Panksepp tried many names for: curiosity, interest, foraging, anticipation, craving, expectancy. He finally settled on seeking. Panksepp has spent decades mapping the emotional systems of the brain he believes are shared by all mammals, and he says, "Seeking is the granddaddy of the systems." It is the mammalian motivational engine that each day gets us out of the bed, or den, or hole to venture forth into the world. It's why, as animal scientist Temple Grandin writes in Animals Make Us Human, experiments show that animals in captivity would prefer to have to search for their food than to have it delivered to them.

For humans, this desire to search is not just about fulfilling our physical needs. Panksepp says that humans can get just as excited about abstract rewards as tangible ones. He says that when we get thrilled about the world of ideas, about making intellectual connections, about divining meaning, it is the seeking circuits that are firing.

The juice that fuels the seeking system is the neurotransmitter dopamine. The dopamine circuits "promote states of eagerness and directed purpose," Panksepp writes. It's a state humans love to be in. So good does it feel that we seek out activities, or substances, that keep this system aroused—cocaine and amphetamines, drugs of stimulation, are particularly effective at stirring it.

Ever find yourself sitting down at the computer just for a second to find out what other movie you saw that actress in, only to look up and realize the search has led to an hour of Googling? Thank dopamine. Our internal sense of time is believed to be controlled by the dopamine system. People with hyperactivity disorder have a shortage of dopamine in their brains, which a recent study suggests may be at the root of the problem. For them even small stretches of time seem to drag. An article by Nicholas Carr in the Atlantic last year, "Is Google Making Us Stupid?" speculates that our constant Internet scrolling is remodeling our brains to make it nearly impossible for us to give sustained attention to a long piece of writing. Like the lab rats, we keep hitting "enter" to get our next fix.

University of Michigan professor of psychology Kent Berridge has spent more than two decades figuring out how the brain experiences pleasure. Like Panksepp, he, too, has come to the conclusion that what James Olds' rats were stimulating was not their reward center. In a series of experiments, he and other researchers have been able to tease apart that the mammalian brain has separate systems for what Berridge calls wanting and liking.

Wanting is Berridge's equivalent for Panksepp's seeking system. It is the liking system that Berridge believes is the brain's reward center. When we experience pleasure, it is our own opioid system, rather than our dopamine system, that is being stimulated. This is why the opiate drugs induce a kind of blissful stupor so different from the animating effect of cocaine and amphetamines. Wanting and liking are complementary. The former catalyzes us to action; the latter brings us to a satisfied pause. Seeking needs to be turned off, if even for a little while, so that the system does not run in an endless loop. When we get the object of our desire (be it a Twinkie or a sexual partner), we engage in consummatory acts that Panksepp says reduce arousal in the brain and temporarily, at least, inhibit our urge to seek.

But our brains are designed to more easily be stimulated than satisfied. "The brain seems to be more stingy with mechanisms for pleasure than for desire," Berridge has said. This makes evolutionary sense. Creatures that lack motivation, that find it easy to slip into oblivious rapture, are likely to lead short (if happy) lives. So nature imbued us with an unquenchable drive to discover, to explore. Stanford University neuroscientist Brian Knutson has been putting people in MRI scanners and looking inside their brains as they play an investing game. He has consistently found that the pictures inside our skulls show that the possibility of a payoff is much more stimulating than actually getting one.

Just how powerful (and separate) wanting is from liking is illustrated in animal experiments. Berridge writes that studies have shown that rats whose dopamine neurons have been destroyed retain the ability to walk, chew, and swallow but will starve to death even if food is right under their noses because they have lost the will to go get it. Conversely, Berridge discovered that rats with a mutation that floods their brains with dopamine learned more quickly than normal rats how to negotiate a runway to reach the food. But once they got it, they didn't find the food more pleasurable than the nonenhanced rats. (No, the rats didn't provide a Zagat rating; scientists measure rats' facial reactions to food.)

That study has implications for drug addiction and other compulsive behaviors. Berridge has proposed that in some addictions the brain becomes sensitized to the wanting cycle of a particular reward. So addicts become obsessively driven to seek the reward, even as the reward itself becomes progressively less rewarding once obtained. "The dopamine system does not have satiety built into it," Berridge explains. "And under certain conditions it can lead us to irrational wants, excessive wants we'd be better off without." So we find ourselves letting one Google search lead to another, while often feeling the information is not vital and knowing we should stop. "As long as you sit there, the consumption renews the appetite," he explains.

Actually all our electronic communication devices—e-mail, Facebook feeds, texts, Twitter—are feeding the same drive as our searches. Since we're restless, easily bored creatures, our gadgets give us in abundance qualities the seeking/wanting system finds particularly exciting. Novelty is one. Panksepp says the dopamine system is activated by finding something unexpected or by the anticipation of something new. If the rewards come unpredictably—as e-mail, texts, updates do—we get even more carried away. No wonder we call it a "CrackBerry."

The system is also activated by particular types of cues that a reward is coming. In order to have the maximum effect, the cues should be small, discrete, specific—like the bell Pavlov rang for his dogs. Panksepp says a way to drive animals into a frenzy is to give them only tiny bits of food: This simultaneously stimulating and unsatisfying tease sends the seeking system into hyperactivity. Berridge says the "ding" announcing a new e-mail or the vibration that signals the arrival of a text message serves as a reward cue for us. And when we respond, we get a little piece of news (Twitter, anyone?), making us want more. These information nuggets may be as uniquely potent for humans as a Froot Loop to a rat. When you give a rat a minuscule dose of sugar, it engenders "a panting appetite," Berridge says—a powerful and not necessarily pleasant state.

If humans are seeking machines, we've now created the perfect machines to allow us to seek endlessly. This perhaps should make us cautious. In Animals in Translation, Temple Grandin writes of driving two indoor cats crazy by flicking a laser pointer around the room. They wouldn't stop stalking and pouncing on this ungraspable dot of light—their dopamine system pumping. She writes that no wild cat would indulge in such useless behavior: "A cat wants to catch the mouse, not chase it in circles forever." She says "mindless chasing" makes an animal less likely to meet its real needs "because it short-circuits intelligent stalking behavior." As we chase after flickering bits of information, it's a salutary warning.
Emily Yoffe is the author of What the Dog Did: Tales From a Formerly Reluctant Dog Owner. You can send your Human Guinea Pig suggestions or comments to emilyyoffe@hotmail.com.

Article URL: http://www.slate.com/id/2224932/
Title: An Appendix isn't a Vestige?
Post by: Body-by-Guinness on August 25, 2009, 07:37:40 AM
Conventional wisdom does have a habit of getting turned on its head:

The Appendix: Useful and in Fact Promising
By Charles Q. Choi, Special to LiveScience
posted: 24 August 2009 07:05 am ET
The body's appendix has long been thought of as nothing more than a worthless evolutionary artifact, good for nothing save a potentially lethal case of inflammation.

Now researchers suggest the appendix is a lot more than a useless remnant. Not only was it recently proposed to actually possess a critical function, but scientists now find it appears in nature a lot more often than before thought. And it's possible some of this organ's ancient uses could be recruited by physicians to help the human body fight disease more effectively.

In a way, the idea that the appendix is an organ whose time has passed has itself become a concept whose time is over.

"Maybe it's time to correct the textbooks," said researcher William Parker, an immunologist at Duke University Medical Center in Durham, N.C. "Many biology texts today still refer to the appendix as a 'vestigial organ.'"

Slimy sac

The vermiform appendix is a slimy dead-end sac that hangs between the small and large intestines. No less than Charles Darwin first suggested that the appendix was a vestigial organ from an ancestor that ate leaves, theorizing that it was the evolutionary remains of a larger structure, called a cecum, which once was used by now-extinct predecessors for digesting food.

"Everybody likely knows at least one person who had to get their appendix taken out — slightly more than 1 in 20 people do — and they see there are no ill effects, and this suggests that you don't need it," Parker said.

However, Parker and his colleagues recently suggested that the appendix still served as a vital safehouse where good bacteria could lie in wait until they were needed to repopulate the gut after a nasty case of diarrhea. Past studies had also found the appendix can help make, direct and train white blood cells.

Now, in the first investigation of the appendix over the ages, Parker explained they discovered that it has been around much longer than anyone had suspected, hinting that it plays a critical function.

"The appendix has been around for at least 80 million years, much longer than we would estimate if Darwin's ideas about the appendix were correct," Parker said.

Moreover, the appendix appears in nature much more often than previously acknowledged. It has evolved at least twice, once among Australian marsupials such as the wombat and another time among rats, lemmings, meadow voles, Cape dune mole-rats and other rodents, as well as humans and certain primates.

"When species are divided into groups called 'families,' we find that more than 70 percent of all primate and rodent groups contain species with an appendix," Parker said.

Several living species, including several lemurs, certain rodents and the scaly-tailed flying squirrel, still have an appendix attached to a large cecum, which is used in digestion. Darwin had thought appendices appeared in only a small handful of animals.

"We're not saying that Darwin's idea of evolution is wrong — that would be absurd, as we're using his ideas on evolution to do this work," Parker told LiveScience. "It's just that Darwin simply didn't have the information we have now."

He added, "If Darwin had been aware of the species that have an appendix attached to a large cecum, and if he had known about the widespread nature of the appendix, he probably would not have thought of the appendix as a vestige of evolution."

What causes appendicitis?

Darwin was also not aware that appendicitis, or a potentially deadly inflammation of the appendix, is not due to a faulty appendix, but rather to cultural changes associated with industrialized society and improved sanitation, Parker said.

"Those changes left our immune systems with too little work and too much time on their hands — a recipe for trouble," he said. "Darwin had no way of knowing that the function of the appendix could be rendered obsolete by cultural changes that included widespread use of sewer systems and clean drinking water."

Now that scientists are uncovering the normal function of the appendix, Parker notes a critical question to ask is whether anything can be done to prevent appendicitis. He suggests it might be possible to devise ways to incite our immune systems today in much the same manner that they were challenged back in the Stone Age.

"If modern medicine could figure out a way to do that, we would see far fewer cases of allergies, autoimmune disease, and appendicitis," Parker said.

The scientists detailed their findings online August 12 in the Journal of Evolutionary Biology.

http://www.livescience.com/health/090824-appendix-evolution.html

http://news.yahoo.com/s/livescience/20090824/sc_livescience/theappendixusefulandinfactpromising
Title: Madam, I'm Adam
Post by: Crafty_Dog on September 15, 2009, 05:44:51 AM
New Clues to Sex Anomalies in How Y Chromosomes Are Copied

NY Times
By NICHOLAS WADE
Published: September 14, 2009
The first words ever spoken, so fable holds, were a palindrome and an introduction: “Madam, I’m Adam.”


A few years ago palindromes — phrases that read the same backward as forward — turned out to be an essential protective feature of Adam’s Y, the male-determining chromosome that all living men have inherited from a single individual who lived some 60,000 years ago. Each man carries a Y from his father and an X chromosome from his mother. Women have two X chromosomes, one from each parent.

The new twist in the story is the discovery that the palindrome system has a simple weakness, one that explains a wide range of sex anomalies from feminization to sex reversal similar to Turner’s syndrome, the condition of women who carry only one X chromosome.

The palindromes were discovered in 2003 when the Y chromosome’s sequence of bases, represented by the familiar letters G, C, T and A, was first worked out by David C. Page of the Whitehead Institute in Cambridge, Mass., and colleagues at the DNA sequencing center at Washington University School of Medicine in St. Louis.

They came as a total surprise but one that immediately explained a serious evolutionary puzzle, that of how the genes on the Y chromosome are protected from crippling mutations.

Unlike the other chromosomes, which can repair one another because they come in pairs, one from each parent, the Y has no evident backup system. Nature has prevented it from recombining with its partner, the X, except at its very tips, lest its male-determining gene should sneak into the X and cause genetic chaos.

Discovery of the palindromes explained how the Y chromosome has managed over evolutionary time to discard bad genes: it recombines with itself. Its essential genes are embedded in a series of eight giant palindromes, some up to three million DNA units in length. Each palindrome readily folds like a hairpin, bringing its two arms together. The cell’s DNA control machinery detects any difference between the two arms and can convert a mutation back to the correct sequence, saving the Y’s genes from mutational decay.
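In DNA, "palindrome" means something slightly stricter than in prose: the sequence must equal its own reverse complement, which is what lets the two arms base-pair with each other when the strand folds into a hairpin. A minimal Python sketch of that property (an illustration only, not the method used in the sequencing study):

```python
# Map each DNA base to its Watson-Crick partner.
COMPLEMENT = {"A": "T", "T": "A", "G": "C", "C": "G"}

def reverse_complement(seq):
    """Return the reverse complement of a DNA sequence."""
    return "".join(COMPLEMENT[base] for base in reversed(seq))

def is_dna_palindrome(seq):
    """True if the sequence reads the same as its own reverse complement."""
    return seq == reverse_complement(seq)

print(is_dna_palindrome("GAATTC"))   # True: reverse complement of GAATTC is GAATTC
print(is_dna_palindrome("GATTACA")) # False
```

Short restriction sites such as GAATTC are the classic textbook examples; the Y chromosome's palindromes obey the same rule at a scale of up to millions of bases.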

After Dr. Page discovered the palindromes, he wondered whether the system had weaknesses that might explain the male sex chromosome anomalies that are a major object of his studies. In the current issue of Cell, with Julian Lange and others, he describes what they call the “Achilles’ heel” of the Y chromosome and the wide variety of sexual disorders that it leads to.

The danger of the palindrome protection system occurs when a cell has duplicated all its chromosomes prior to cell division, and each pair is held together at a site called the centromere. Soon, the centromere will split, with each half and its chromosome tugged to opposite sides of the dividing cell.

Before the split, however, a serious error can occur. Palindromes on one Y chromosome can occasionally reach over and form a fatal attraction with the counterpart palindrome on its neighbor. The two Y's fuse at the point of joining, and everything from the juncture to the end of the chromosome is lost.

The double-Y’s so generated come in a range of lengths, depending on which of the palindromes makes the unintended liaison. Like other chromosomes, the Y has a left arm and a right arm with the centromere in between. The male-determining gene lies close to the end of the left arm. If the palindromes at the very end of the right arm make the join, a very long double-Y results in which the two centromeres are widely separated. But if the joining palindromes are just to the right of the centromere, a short double-Y is formed in which the two centromeres lie close together.

Dr. Page detected among his patients both short and long double-Y’s and those of all lengths in between. He and his colleagues then noticed a surprising difference in the patients’ sexual appearance that depended on the length between the centromeres of their double-Y’s.

The patients in whom the distance between the Y’s two centromeres is short are males. But the greater the distance between the centromeres, the more likely the patients are to be anatomically feminized. A few of the patients were so feminized that they had the symptoms of Turner’s syndrome, a condition in which women are born with a single X chromosome.

The explanation for this spectrum of results, in Dr. Page’s view, lies in how the double-Y’s are treated in dividing cells and in the consequences for determining the sex of the fetus.

When the centromeres are close together, they are seen as one and dragged to one side of the dividing cell. As long as the Y’s male-determining gene is active in the cells of the fetal sex tissue, or gonad, the gonads will turn into testes whose hormones will masculinize the rest of the body.


But when the centromeres lie far apart, chromosomal chaos results. During cell division, both centromeres are recognized by the cell division machinery, and in the tug of war the double-Y chromosome may sometimes survive and sometimes be broken and lost to the cell.

Such individuals can carry a mixture of cells, some of which carry a double-Y and some of which carry no Y chromosome. In the fetal gonads, that mixture of cells produces people of intermediate sex. In many of these cases the patients had been raised as female but had testicular tissue on one side of the body and ovarian tissue on the other.

In the extreme version of this process, the distribution of cells may be such that none of the fetal gonad cells possess a Y chromosome, even though other cells in the body may do so. Dr. Page and his colleagues found five of the feminized patients had symptoms typical of Turner's syndrome. The patients had been brought to Dr. Page's attention because their blood cells contained Y chromosomes. Evidently by the luck of the draw, the blood cell lineage had retained Y chromosomes but the all-important fetal gonad cells had been denied them.

In 75 percent of women with Turner’s syndrome, the single X comes from the mother. “Since they are females, everyone imagines it’s Dad’s X that is missing,” Dr. Page said. “But it could easily be Dad’s Y.”

That the degree of feminization parallels the distance between the two centromeres of the double Y chromosome is “a fantastic experiment of nature,” Dr. Page said. Despite having studied the Y chromosome for nearly 30 years, he has learned that it is always full of surprises.

“I continue to see the Y as an infinitely rich national park where we go to see unusual things, and we are never disappointed,” he said.

Dr. Cynthia Morton, editor of the American Journal of Human Genetics, said the new explanation of Turner’s syndrome was plausible. “It’s another beautiful David Page contribution to the science of genetics,” Dr. Morton said.
Title: Color-Blind Monkeys Get Full Color Vision
Post by: rachelg on September 16, 2009, 06:39:11 PM
Wednesday, September 16, 2009
Color-Blind Monkeys Get Full Color Vision
http://www.technologyreview.com/computing/23483/?a=f
Gene therapy can transform the visual system, even in adults.
By Emily Singer

Squirrel monkeys, which are naturally red-green color-blind, can attain humanlike color vision when injected with the gene for a human photoreceptor. The research, performed in adult animals, suggests that the visual system is much more flexible than previously thought--the monkeys quickly learned to use the new sensory information. Researchers hope these results will also hold true for humans afflicted with color blindness and other visual disorders, expanding the range of blinding diseases that might be treated with gene therapy.

"The core observation here is that the animal can use this extra input on such a rapid timescale and make decisions with it," says Jeremy Nathans, a neuroscientist at Johns Hopkins University in Baltimore. "That's incredibly cool."

"This is an amazing step forward in terms of our ability to modify the retina with genetic engineering," says David Williams, director of the Center for Visual Science at the University of Rochester in New York, who was not involved in the study.

Normal vision in squirrel monkeys is almost identical to red-green colorblindness in humans, making the monkeys excellent subjects for studying the disorder. Most people have three types of color photoreceptors--red, green, and blue--which allow them to see the full spectrum of colors. People with red-green color blindness, a genetic disorder that affects about 5 percent of men and a much smaller percentage of women, lack the light-sensitive protein for either red or green wavelengths of light. Because they have only two color photoreceptors, their color vision is limited--they can't distinguish a red X on a green background, for example.

In the new study, published today in Nature, scientists from the University of Washington in Seattle injected the gene for the human version of the red photopigment directly into two animals' eyes, near the retina. The gene, which sits inside a harmless virus often used for gene therapy, is engineered so that it only becomes active in a subset of green photoreceptors. It begins producing the red pigment protein about nine to 20 weeks after injection, transforming that cell into one that responds to the color red.

Researchers screened the monkeys before and after the treatment, using a test very similar to the one used to assess color blindness in people. Colored shapes were embedded in a background of a different color, and the monkeys touched the screen where they saw the shape. The researchers found that the animals' color vision changed dramatically after the treatment. "Human color vision is very good; you only need a tiny bit of red tint to distinguish two shades," says Jay Neitz, one of the authors of the study. "[The] cured animals are not quite as good as other [types of] monkeys with normal color vision, but they are close."

Both animals described in the study have also retained their new tricolor sensory capacity for more than two years. And neither has shown harmful side effects, such as an immune reaction to the foreign protein. The researchers have since treated four additional animals, with no signs of complications. "The results are quite compelling," says Gerald Jacobson, a neuroscientist at the University of California, Santa Barbara, who was not involved in the study. "There is the potential to do the same for humans."

Gene-therapy trials are already under way for a more severe visual impairment, called Leber congenital amaurosis, in which an abnormal protein in sufferers' photoreceptors severely impairs their sensitivity to light. Whether this research should be converted into a treatment for human color blindness is likely to be controversial. "I think it would be a poor use of medical technology when there are so many more serious problems," says Nathans. "Color-vision variation is one of the kinds of variations that make life more interesting. One may think of it as a deficiency, but color-blind people are also better at some things, such as breaking through camouflage." They may also have slightly improved acuity, he says.

However, both Neitz and Jacobson say they frequently receive calls from color-blind people searching for cures, and they hope the research can eventually be used in humans.

"It seems a trivial defect for those of us who are not color-blind, but it does close a lot of avenues," says Jacobson. People who are color-blind can't become commercial pilots, police officers, or firefighters, for example. "People tell me every day how they feel that they miss out because they don't have normal color vision," says Neitz. "You obviously don't want to risk other aspects of vision, but I think this could get to a point where this could be done relatively without risk."

The findings challenge existing notions about the visual system, which was thought to be hardwired early in development. This is supported, for instance, by the fact that cats deprived of vision in one eye early in life never gain normal use of that eye. "People had explored visual plasticity and development using deprivation in a lot of different ways," says Neitz. "But no one has been able to explore it by adding something that wasn't there."

That flexibility is also important for clinical applications of the technology. The fact that adult monkeys could use their novel sensory information suggests that corrective gene therapies for color blindness need not be delivered early in development, as some had feared. However, it's not yet clear whether color vision will be a unique example of plasticity in the adult visual system, or one of many.

Researchers hope the findings will prove applicable to other retinal diseases. Hundreds of mutations have already been identified that are linked to defects in the photoreceptors and other retinal cells, leading to diseases such as retinitis pigmentosa, a degenerative disease that can lead to blindness. However, unlike color blindness, in which the visual system is intact, save for the missing photopigment, many of these diseases trigger damage to the photoreceptor cells. "I think it's hard to know in what way it will extrapolate to more serious blinding disorders that involve more serious degeneration of retina," says Nathans.

The research also raises the possibility of adding new functionality to the visual system, which might be of particular interest to the military. "You might be able to take people with normal vision and give them a pigment for infrared," says Williams. "I'm sure a lot of soldiers would like to have their infrared camera built right into the retina."

Copyright Technology Review 2009.
Title: Why Women Have Sex
Post by: Body-by-Guinness on October 04, 2009, 02:19:51 PM
The flip tone doesn't lend much, but some of the findings are interesting.

Why women have sex
According to a new book, there are 237 reasons why women have sex. And most of them have little to do with romance or pleasure
Monday 28 September 2009

Do you want to know why women have sex with men with tiny little feet? I am stroking a book called Why Women Have Sex. It is by Cindy Meston, a clinical psychologist, and David Buss, an evolutionary psychologist. It is a very thick, bulging book. I've never really wondered Why Women Have Sex. But after years of not asking the question, the answer is splayed before me.

Meston and Buss have interviewed 1,006 women from all over the world about their sexual motivation, and in doing so they have identified 237 different reasons why women have sex. Not 235. Not 236. But 237. And what are they? From the reams of confessions, it emerges that women have sex for physical, emotional and material reasons; to boost their self-esteem, to keep their lovers, or because they are raped or coerced. Love? That's just a song. We are among the bad apes now.

Why, I ask Meston, have people never really talked about this? Alfred Kinsey, the "father" of sexology, asked 7,985 people about their sexual histories in the 1940s and 50s; Masters and Johnson observed people having orgasms for most of the 60s. But they never asked why. Why?

"People just assumed the answer was obvious," Meston says. "To feel good. Nobody has really talked about how women can use sex for all sorts of resources." She rattles off a list and as she says it, I realise I knew it all along: "promotion, money, drugs, bartering, for revenge, to get back at a partner who has cheated on them. To make themselves feel good. To make their partners feel bad." Women, she says, "can use sex at every stage of the relationship, from luring a man into the relationship, to try and keep a man so he is fulfilled and doesn't stray. Duty. Using sex to get rid of him or to make him jealous."

"We never ever expected it to be so diverse," she says. "From the altruistic to the borderline evil." Evil? "Wanting to give someone a sexually transmitted infection," she explains. I turn to the book. I am slightly afraid of it. Who wants to have their romantic fantasies reduced to evolutionary processes?

The first question asked is: what thrills women? Or, as the book puts it: "Why do the faces of Antonio Banderas and George Clooney excite so many women?"

We are, apparently, scrabbling around for what biologists call "genetic benefits" and "resource benefits". Genetic benefits are the genes that produce healthy children. Resource benefits are the things that help us protect our healthy children, which is why women sometimes like men with big houses. Jane Eyre, I think, can be read as a love letter to a big house.

"When a woman is sexually attracted to a man because he smells good, she doesn't know why she is sexually attracted to that man," says Buss. "She doesn't know that he might have a MHC gene complex complementary to hers, or that he smells good because he has symmetrical features."

So Why Women Have Sex is partly a primer for decoding personal ads. Tall, symmetrical face, cartoonish V-shaped body? I have good genes for your brats. Affluent, GSOH – if too fond of acronyms – and kind? I have resource benefits for your brats. I knew this already; that is how Bill Clinton got sex, despite his astonishing resemblance to a moving potato. It also explains why Vladimir Putin has become a sex god and poses topless with his fishing rod.

Then I learn why women marry accountants; it's a trade-off. "Clooneyish" men tend to be unfaithful, because men have a different genetic agenda from women – they want to impregnate lots of healthy women. Meston and Buss call them "risk-taking, womanising 'bad boys'". So, women might use sex to bag a less dazzling but more faithful mate. He will have fewer genetic benefits but more resource benefits that he will make available, because he will not run away. This explains why women marry accountants. Accountants stick around – and sometimes they have tiny little feet!

And so to the main reason women have sex. The idol of "women do it for love, and men for joy" lies broken on the rug like a mutilated sex toy: it's orgasm, orgasm, orgasm. "A lot of women in our studies said they just wanted sex for the pure physical pleasure," Meston says. Meston and Buss garnish this revelation with so much amazing detail that I am distracted. I can't concentrate. Did you know that the World Health Organisation has a Women's Orgasm Committee? That "the G-spot" is named after the German physician Ernst Gräfenberg? That there are 26 definitions of orgasm?

And so, to the second most important reason why women have sex – love. "Romantic love," Meston and Buss write, "is the topic of more than 1,000 songs sold on iTunes." And, if people don't have love, terrible things can happen, in literature and life: "Cleopatra poisoned herself with a snake and Ophelia went mad and drowned." Women say they use sex to express love and to get it, and to try to keep it.

Love: an insurance policy

And what is love? Love is apparently a form of "long-term commitment insurance" that ensures your mate is less likely to leave you, should your legs fall off or your ovaries fall out. Take that, Danielle Steel — you may think you live in 2009 but your genes are still in the stone age, with only chest hair between you and a bloody death. We also get data which confirms that, due to the chemicals your brain produces — dopamine, norepinephrine and phenylethylamine — you are, when you are in love, technically what I have always suspected you to be — mad as Stalin.

And is the world mad? According to surveys, which Meston and Buss helpfully whip out from their inexhaustible box of every survey ever surveyed, 73% of Russian women are in love, and 63% of Japanese women are in love. What percentage of women in north London are in love, they know not. But not as many men are in love. Only 61% of Russian men are in love and only 41% of Japanese men are in love. Which means that 12% of Russian women and 22% of Japanese women are totally wasting their time.

And then there is sex as man-theft. "Sometimes men who are high in mate value are in relationships or many of them simply pursue a short-term sexual strategy and don't want commitment," Buss explains. "There isn't this huge pool of highly desirable men just sitting out there waiting for women." It's true. So how do we liberate desirable men from other women? We "mate poach". And how do we do that? We "compete to embody what men want" – high heels to show off our pelvises, lip-gloss to make men think about vaginas, and we see off our rivals with slander. We spread gossip – "She's easy!" – because that makes the slandered woman less inviting to men as a long-term partner. She may get short-term genetic benefits but she can sing all night for the resource benefits, like a cat sitting out in the rain. Then – then! – the gossiper mates with the man herself.

We also use sex to "mate guard". I love this phrase. It is so evocative an image – I can see a man in a cage, and a woman with a spear and a bottle of baby oil. Women regularly have sex with their mates to stop them seeking it elsewhere. Mate guarding is closely related to "a sense of duty", a popular reason for sex, best expressed by the Meston and Buss interviewee who says: "Most of the time I just lie there and make lists in my head. I grunt once in a while so he knows I'm awake, and then I tell him how great it was when it's over. We are happily married."

Women often mate guard by flaunting healthy sexual relationships. "In a very public display of presumed rivalry," Meston writes, "in 2008 singer and actress Jessica Simpson appeared with her boyfriend, Dallas Cowboys quarterback Tony Romo, wearing a shirt with the tagline Real Girls Eat Meat. Fans interpreted it as a competitive dig at Romo's previous mate, who is a vegetarian."

Meston and Buss also explain why the girls in my class at school went down like dominoes in 1990. One week we were maidens, the following week, we were not. We were, apparently, having sex to see if we liked it, so we could tell other schoolgirls that we had done it and to practise sexual techniques: "As a woman I don't want to be a dead fish," says one female. Another interviewee wanted to practise for her wedding night.

The authors lubricate this with a description of the male genitalia, again food themed. I include it because I am immature. "In Masters & Johnson's [1966] study of over 300 flaccid penises the largest was 5.5 inches long (about the size of a bratwurst sausage); the smallest non-erect penis was 2.25 inches (about the size of a breakfast sausage)."

Ever had sex out of pity and wondered why? "Women," say Meston and Buss, "for the most part, are the ones who give soup to the sick, cookies to the elderly and . . . sex to the forlorn." "Tired, but he wanted it," says one female. Pause for more amazing detail: fat people are more likely to stay in a relationship because no one else wants them.

Women also mate to get the things they think they want – drugs, handbags, jobs, drugs. "The degree to which economics plays out in sexual motivations," Buss says, "surprised me. Not just prostitution. Sex economics plays out even in regular relationships. Women have sex so that the guy would mow the lawn or take out the garbage. You exchange sex for dinner." He quotes some students from the University of Michigan. It is an affluent university, but 9% of students said they had "initiated an attempt to trade sex for some tangible benefit".

Medicinal sex

Then there is sex to feel better. Women use sex to cure their migraines. This is explained by the release of endorphins during sex — they are a pain reliever. Sex can even help relieve period pains. (Why are periods called periods? Please, someone tell me. Write in.)

Women also have sex because they are raped, coerced or lied to, although we have defences against deception – men will often copulate on the first date, women on the third, so they will know it is love (madness). Some use sex to tell their partner they don't want them any more – by sleeping with somebody else. Some use it to feel desirable; some to get a new car. There are very few things we will not use sex for. As Meston says, "Women can use sex at every stage of the relationship."

And there you have it – most of the reasons why women have sex, although, as Meston says, "There are probably a few more." Probably. Before I read this book I watched women eating men in ignorance. Now, when I look at them, I can hear David Attenborough talking in my head: "The larger female is closing in on her prey. The smaller female has been ostracised by her rival's machinations, and slinks away." The complex human race has been reduced in my mind to a group of little apes, running around, rutting and squeaking.

I am not sure if I feel empowered or dismayed. I thought that my lover adored me. No – it is because I have a symmetrical face. "I love you so much," he would say, if he could read his evolutionary impulses, "because you have a symmetrical face!" "Oh, how I love the smell of your compatible genes!" I would say back. "Symmetrical face!" "Compatible genes!" "Symmetrical face!" "Compatible genes!" And so we would osculate (kiss). I am really just a monkey trying to survive. I close the book.

I think I knew that.

http://www.guardian.co.uk/lifeandstyle/2009/sep/28/sex-women-relationships-tanya-gold
Title: Testosterone & Risk
Post by: Body-by-Guinness on October 31, 2009, 04:20:44 PM
Why testosterone-charged women behave like men (they're hungry for sex and ready to take risks with money)

By Daily Mail Reporter
Last updated at 12:04 PM on 25th August 2009


Women with an appetite for risk may also be hungry for sex, a study suggests.

Scientists found that risk-taking women have unusually high testosterone levels.

The hormone fuels sex-drive in both men and women and is associated with competitiveness and dominance.

Prior research has shown that high levels of testosterone are also linked to risky behaviour such as gambling or excessive drinking.

Scientists in the US measured the amount of testosterone in saliva samples taken from 500 male and female MBA business students at the University of Chicago.

Participants in the study were asked to play a computer game that evaluated their attitude towards risk.

A series of questions allowed them to choose between a guaranteed monetary reward or a risky lottery with a higher potential pay-out.

The students had to decide repeatedly whether to play safe for less or gamble on a bigger win.
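The "sure thing versus lottery" items described above can be modeled with standard expected-utility arithmetic. Below is a minimal sketch of how such an item separates risk-averse from risk-neutral choosers, assuming a CRRA utility function; the payoffs and risk parameters are illustrative, not details from the Chicago study:

```python
import math

def crra_utility(x, rho):
    """Constant relative risk aversion utility; rho > 0 means risk-averse."""
    if rho == 1:
        return math.log(x)
    return x ** (1 - rho) / (1 - rho)

def prefers_lottery(sure_amount, lottery, rho):
    """True if an agent with risk parameter rho takes the gamble.

    lottery is a list of (probability, payoff) pairs.
    """
    eu_lottery = sum(p * crra_utility(x, rho) for p, x in lottery)
    return eu_lottery > crra_utility(sure_amount, rho)

# A risk-neutral agent (rho = 0) takes a fair gamble with higher expected
# value; a risk-averse agent (rho = 2) plays it safe for less.
gamble = [(0.5, 200.0), (0.5, 10.0)]            # expected value 105
print(prefers_lottery(100.0, gamble, rho=0.0))  # True
print(prefers_lottery(100.0, gamble, rho=2.0))  # False
```

Repeating such items while varying the sure amount locates the point where a participant switches from the safe option to the gamble, which is one common way to score risk aversion.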

Women who were most willing to take risks were also found to have the highest levels of testosterone, but this was not true of men.

However, men and women with the same levels of the hormone shared a similar attitude to risk.

The link between risk-taking and testosterone also had a bearing on the students' career choices after graduation.

Testosterone-driven individuals who liked to gamble went on to choose riskier careers in finance.

"This is the first study showing that gender differences in financial risk aversion have a biological basis, and that differences in testosterone levels between individuals can affect important aspects of economic behaviour and career decisions," said Professor Dario Maestripieri, one of the study leaders.

In general, women are known to be more risk-averse than men when it comes to financial decision making. Among the students taking part in the study, 36% of the women chose high-risk financial careers such as investment banking or trading compared with 57% of the men.

Overall, male participants displayed lower risk-aversion than their female counterparts and also had significantly higher levels of salivary testosterone.

The findings are published in the journal Proceedings of the National Academy of Sciences.

Co-author Professor Luigi Zingales said: "This study has significant implications for how the effects of testosterone could impact actual risk-taking in financial markets, because many of these students will go on to become major players in the financial world.

"Furthermore, it could shed some light on gender differences in career choices. Future studies should further explore the mechanisms through which testosterone affects the brain."

http://www.dailymail.co.uk/news/article-1208859/Women-appetite-risk-hungry-sex-study-suggests.html#
Title: Missing Heritability and Other Problems
Post by: Body-by-Guinness on December 04, 2009, 08:55:59 PM
The looming crisis in human genetics
Nov 13th 2009


Some awkward news ahead

Human geneticists have reached a private crisis of conscience, and it will become public knowledge in 2010. The crisis has depressing health implications and alarming political ones. In a nutshell: the new genetics will reveal much less than hoped about how to cure disease, and much more than feared about human evolution and inequality, including genetic differences between classes, ethnicities and races.

About five years ago, genetics researchers became excited about new methods for “genome-wide association studies” (GWAS). We already knew from twin, family and adoption studies that all human traits are heritable: genetic differences explain much of the variation between individuals. We knew the genes were there; we just had to find them. Companies such as Illumina and Affymetrix produced DNA chips that allowed researchers to test up to 1m genetic variants for their statistical association with specific traits. America’s National Institutes of Health and Britain’s Wellcome Trust gave huge research grants for gene-hunting. Thousands of researchers jumped on the GWAS bandwagon. Lab groups formed and international research consortia congealed. The quantity of published GWAS research has soared.

In 2010, GWAS fever will reach its peak. Dozens of papers will report specific genes associated with almost every imaginable trait—intelligence, personality, religiosity, sexuality, longevity, economic risk-taking, consumer preferences, leisure interests and political attitudes. The data are already collected, with DNA samples from large populations already measured for these traits. It’s just a matter of doing the statistics and writing up the papers for Nature Genetics. The gold rush is on throughout the leading behaviour-genetics centres in London, Amsterdam, Boston, Boulder and Brisbane.

GWAS researchers will, in public, continue trumpeting their successes to science journalists and Science magazine. They will reassure Big Pharma and the grant agencies that GWAS will identify the genes that explain most of the variation in heart disease, cancer, obesity, depression, schizophrenia, Alzheimer’s and ageing itself. Those genes will illuminate the biochemical pathways underlying disease, which will yield new genetic tests and blockbuster drugs. Keep holding your breath for a golden age of health, happiness and longevity.

In private, though, the more thoughtful GWAS researchers are troubled. They hold small, discreet conferences on the “missing heritability” problem: if all these human traits are heritable, why are GWAS studies failing so often? The DNA chips should already have identified some important genes behind physical and mental health. They simply have not been delivering the goods.

Certainly, GWAS papers have reported a couple of hundred genetic variants that show statistically significant associations with a few traits. But the genes typically do not replicate across studies. Even when they do replicate, they never explain more than a tiny fraction of any interesting trait. In fact, classical Mendelian genetics based on family studies has identified far more disease-risk genes with larger effects than GWAS research has so far.

Why the failure? The missing heritability may reflect limitations of DNA-chip design: GWAS methods so far focus on relatively common genetic variants in regions of DNA that code for proteins. They under-sample rare variants and DNA regions translated into non-coding RNA, which seems to orchestrate most organic development in vertebrates. Or it may be that thousands of small mutations disrupt body and brain in different ways in different populations. At worst, each human trait may depend on hundreds of thousands of genetic variants that add up through gene-expression patterns of mind-numbing complexity.
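The "thousands of small effects" possibility above can be made concrete with a toy polygenic simulation: build a trait entirely from genetic variants, then measure how little of its variance any single variant explains. Every number below (sample size, variant count, effect sizes) is an arbitrary illustrative assumption, not data from any GWAS:

```python
import random

random.seed(1)
n_people, n_variants = 500, 2000

# Each variant gets a small effect; the trait is the sum of all of them,
# so it is 100% "heritable" by construction in this toy world.
effects = [random.gauss(0, 1) for _ in range(n_variants)]
genotypes = [[random.randint(0, 2) for _ in range(n_variants)]
             for _ in range(n_people)]
traits = [sum(g * b for g, b in zip(row, effects)) for row in genotypes]

def r_squared(xs, ys):
    """Fraction of variance in ys explained by xs (squared correlation)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov * cov / (vx * vy)

# Even the single largest-effect variant explains only a sliver of the trait.
top = max(range(n_variants), key=lambda j: abs(effects[j]))
col = [row[top] for row in genotypes]
print(f"{r_squared(col, traits):.4f}")
```

The trait here is fully genetic, yet each variant's individual r-squared is tiny, which is exactly the situation in which a variant-by-variant association scan struggles to reach significance.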

Political science

We will know much more when it becomes possible to do cheap “resequencing”—which is really just “sequencing” a wider variety of individuals beyond the handful analysed for the Human Genome Project. Full sequencing means analysing all 3 billion base pairs of an individual’s DNA rather than just a sample of 1m genetic variants as the DNA chips do. When sequencing costs drop within a few years below $1,000 per genome, researchers in Europe, China and India will start huge projects with vast sample sizes, sophisticated bioinformatics, diverse trait measures and detailed family structures. (American bioscience will prove too politically squeamish to fund such studies.) The missing heritability problem will surely be solved sooner or later.

The trouble is, the resequencing data will reveal much more about human evolutionary history and ethnic differences than they will about disease genes. Once enough DNA is analysed around the world, science will have a panoramic view of human genetic variation across races, ethnicities and regions. We will start reconstructing a detailed family tree that links all living humans, discovering many surprises about mis-attributed paternity and covert mating between classes, castes, regions and ethnicities.

We will also identify the many genes that create physical and mental differences across populations, and we will be able to estimate when those genes arose. Some of those differences probably occurred very recently, within recorded history. Gregory Cochran and Henry Harpending argued in “The 10,000 Year Explosion” that some human groups experienced a vastly accelerated rate of evolutionary change within the past few thousand years, benefiting from the new genetic diversity created within far larger populations, and in response to the new survival, social and reproductive challenges of agriculture, cities, divisions of labour and social classes. Others did not experience these changes until the past few hundred years when they were subject to contact, colonisation and, all too often, extermination.

If the shift from GWAS to sequencing studies finds evidence of such politically awkward and morally perplexing facts, we can expect the usual range of ideological reactions, including nationalistic retro-racism from conservatives and outraged denial from blank-slate liberals. The few who really understand the genetics will gain a more enlightened, live-and-let-live recognition of the biodiversity within our extraordinary species—including a clearer view of likely comparative advantages between the world’s different economies.



Geoffrey Miller: evolutionary psychologist, University of New Mexico; author of “Spent: Sex, Evolution, and Consumer Behavior” (Viking)

http://www.economist.com/PrinterFriendly.cfm?story_id=14742737
Title: "Change Blindness"
Post by: Body-by-Guinness on December 14, 2009, 02:33:17 PM
Interesting experiment. Though I think I'd catch this, I've met my share of folks who aren't particularly situationally aware.

[youtube]http://www.youtube.com/watch?v=38XO7ac9eSs&feature=player_embedded#[/youtube]
Title: Re: Evolutionary biology/psychology
Post by: Crafty_Dog on December 14, 2009, 06:35:56 PM
 8-) 8-) 8-)
Title: Invertebrate Tool Use
Post by: Body-by-Guinness on December 15, 2009, 09:35:48 AM
Coconut-carrying octopus stuns scientists

Tue Dec 15, 4:46 am ET

SYDNEY (AFP) – Australian scientists on Tuesday revealed the eight-tentacled species can carry coconut shells to use as armour -- the first case of an invertebrate using tools.
Research biologist Julian Finn said he was "blown away" the first time he saw the fist-sized veined octopus, Amphioctopus marginatus, pick up and scoot away with its portable protection along the sea bed.

"We don't normally associate complex behaviours with invertebrates -- with lower life forms I guess you could say," Finn, from Museum Victoria, told AFP.

"And things like tool-use and complex behaviour we generally associate with the higher vertebrates: humans, monkeys, a few birds, that kind of thing.

"This study, if anything, shows that these complex behaviours aren't limited to us. They are actually employed by a wide range of animals."

The use of tools is considered one of the defining elements of intelligence and, although originally considered only present in humans, has since been found in other primates, mammals and birds.
But this is the first time that the behaviour has been observed in an invertebrate, according to an article co-authored by Finn and published in the US-based journal Current Biology.
Finn said when he first saw the octopus walk along awkwardly with its shell, he didn't know whether it was simply a freak example of wacky underwater behaviour by the animal whose closest relative is a snail.

"So over the 10-year period basically we observed about 20 octopuses and we would have seen about four different individuals carrying coconut shells over large distances," he said of his research in Indonesia.

"There were lots that were buried with coconuts in the mud. But we saw four individuals actually pick them up and carry them, jog them across the sea floor carrying them under their bodies. It's a good sight."

Finn said the animals were slower and more vulnerable to predators while carrying the broken shells, which they later used as shelters.

"They are doing it for the later benefit and that's what makes it different from an animal that picks up something and puts it over its head for the immediate benefit," he said.
Other animals were likely to be discovered to exhibit similar behaviours, he said.

(http://d.yimg.com/a/p/afp/20091215/capt.photo_1260866892734-1-0.jpg?x=373&y=345&q=85&sig=AVibTPSGbnsI2tC0ysXxcw--)

An octopus, wrapped around the shell of a coconut, uses it to protect itself on the seabed floor. Australian scientists have revealed that the eight-tentacled species can carry coconut shells to use as armour -- the first case of an invertebrate using tools. (AFP/HO/File/Roger Steene)

http://news.yahoo.com/nphotos/File/photo//091215/photos_sc_afp/0b2621955b75a067f7490914d54ddc41//s:/afp/20091215/sc_afp/scienceaustraliaanimaloctopus_20091215094832;_ylt=ArjI1ZsRmfkf.0rxJk5lsg7QOrgF;_ylu=X3oDMTE5czdndmNvBHBvcwMxBHNlYwN5bl9yX3RvcF9waG90bwRzbGsDY29jb251dC1jYXJy
Title: Neanderthal Genome Sequenced
Post by: Body-by-Guinness on May 06, 2010, 12:33:47 PM
http://reason.com/blog/2010/05/06/the-neaderthal-in-us-neanderth
Reason Magazine


The Neanderthal in Us -- Neanderthal Genome Sequenced

Ronald Bailey | May 6, 2010

A team of genetic researchers led by Svante Pääbo from the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany is publishing today in the journal Science the results of their effort to sequence the genome of the extinct Neanderthal lineage. It turns out that many of us whose ancestors hail from Europe or Asia carry genes from Neanderthals. As the press release describing the study explains:

A unique scientific task lasting four years has been completed: a team of researchers led by Svante Pääbo, Director of the Department of Evolutionary Genetics at the Max Planck Institute for Evolutionary Anthropology in Leipzig, is publishing an initial version of the Neandertal genome sequence in the current
issue of the journal Science.

This is an unprecedented scientific achievement: only ten years after the decoding of the present-day Homo sapiens genome, researchers have managed to do something similar for an extinct hominid that was the closest relative of modern humans. "The comparison of these two genetic sequences enables us to find out where our genome differs from that of our closest relative," says Svante Pääbo.

The Neandertal sequence presented is based on the analysis of over one billion DNA fragments taken from several Neandertal bones found in Croatia, Russia and Spain, as well as from the original Neandertal found in Germany. From the DNA fragments present in the bones the Leipzig researchers developed ways to distinguish true Neandertal DNA from the DNA of microbes that have lived in the bones over the last 40,000 years. Enough DNA fragments were retrieved to account for over 60 percent of the entire Neandertal genome.

An initial comparison of the two sequences has brought some exciting discoveries to light. Contrary to the assumption of many researchers, it would appear that some Neandertals and early modern humans interbred.

According to the researchers’ calculations, between one and four percent of the DNA of many humans living today originate from the Neandertal. "Those of us who live outside Africa carry a little Neandertal DNA in us," says Svante Pääbo. Previous tests carried out on the DNA of Neandertal mitochondria, which represents just a tiny part of the whole genome, had not found any evidence of such interbreeding or "admixture".

For the purpose of the analysis the researchers also sequenced five present day human genomes of European, Asian and African origin and compared them with the Neandertal. To their surprise they found that the Neandertal is slightly more closely related to modern humans from outside Africa than to Africans, suggesting some contribution of Neandertal DNA to the genomes of present-day non-Africans. Interestingly, Neandertals show the same relationship with all humans outside Africa, whether they are from Europe, East Asia or Melanesia. This is puzzling, as no Neandertal remains have been so far found in East Asia. They lived in Europe and Western Asia.

The researchers offer a plausible explanation for this finding. Svante Pääbo: "Neandertals probably mixed with early modern humans before Homo sapiens split into different groups in Europe and Asia." This could have occurred in the Middle East between 100,000 and 50,000 years ago before the human population spread across East Asia. It is known from archaeological findings in the Middle East that Neandertals and modern humans overlapped in time in this region.

Apart from the question as to whether Neandertals and Homo sapiens mixed, the researchers are highly interested in discovering genes that distinguish modern humans from their closest relative and may have given the modern humans certain advantages over the course of evolution.

By comparing Neandertal and modern human genomes, the scientists identified several genes that may have played an important role in modern human evolution. For example, they found genes related to cognitive functions, metabolism and the development of cranial features, the collar bone and the rib cage. However, more detailed analysis needs to be carried out to enable conclusions to be drawn on the actual influence of these genes.

I will mention that my 23andMe genotype scan indicates my maternal haplogroup is U5a2a, which arose some 40,000 years ago; its bearers were among the first Homo sapiens colonizers of ice age Europe.

If you're interested, go here for my column on what rights Neanderthals might claim should we ever succeed in using cloning technologies to bring them back.
Title: Cognitive Biases Guide
Post by: Body-by-Guinness on May 18, 2010, 05:21:33 AM
I've only started sorting through this, but it appears to be an interesting list of the ways humans go about fooling themselves.

http://www.scribd.com/documents/30548590/Cognitive-Biases-A-Visual-Study-Guide-by-the-Royal-Society-of-Account-Planning
Title: Universal Common Ancestor

Post by: Body-by-Guinness on May 18, 2010, 07:00:06 AM
Second post.

First Large-Scale Formal Quantitative Test Confirms Darwin's Theory of Universal Common Ancestry

ScienceDaily (May 17, 2010) — More than 150 years ago, Darwin proposed the theory of universal common ancestry (UCA), linking all forms of life by a shared genetic heritage from single-celled microorganisms to humans. Until now, the theory that makes ladybugs, oak trees, champagne yeast and humans distant relatives has remained beyond the scope of a formal test. Now, a Brandeis biochemist reports in Nature the results of the first large scale, quantitative test of the famous theory that underpins modern evolutionary biology.

The results of the study confirm that Darwin had it right all along. In his 1859 book, On the Origin of Species, the British naturalist proposed that, "all the organic beings which have ever lived on this earth have descended from some one primordial form." Over the last century and a half, qualitative evidence for this theory has steadily grown, in the numerous, surprising transitional forms found in the fossil record, for example, and in the identification of sweeping fundamental biological similarities at the molecular level.

Still, rumblings among some evolutionary biologists have recently emerged questioning whether the evolutionary relationships among living organisms are best described by a single "family tree" or rather by multiple, interconnected trees -- a "web of life." Recent molecular evidence indicates that primordial life may have undergone rampant horizontal gene transfer, which occurs frequently today when single-celled organisms swap genes using mechanisms other than usual organismal reproduction. In that case, some scientists argue, early evolutionary relationships were web-like, making it possible that life sprang up independently from many ancestors.

According to biochemist Douglas Theobald, it doesn't really matter. "Let's say life originated independently multiple times, which UCA allows is possible," said Theobald. "If so, the theory holds that a bottleneck occurred in evolution, with descendants of only one of the independent origins surviving until the present. Alternatively, separate populations could have merged, by exchanging enough genes over time to become a single species that eventually was ancestral to us all. Either way, all of life would still be genetically related."

Harnessing powerful computational tools and applying Bayesian statistics, Theobald found that the evidence overwhelmingly supports UCA, regardless of horizontal gene transfer or multiple origins of life. Theobald said UCA is millions of times more probable than any theory of multiple independent ancestries.
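The logic of comparing "one family tree" against "independent origins" can be sketched with a toy likelihood-ratio calculation. Below, two aligned protein sequences are scored under two models: common ancestry (positions match at some conserved rate) versus independent origins (positions match only by chance, 1 in 20 for amino acids). The sequence length and the 0.5 match rate are illustrative assumptions, not Theobald's actual models or data:

```python
import math

def log_likelihood(n_match, n_total, p_match):
    """Log-likelihood of observing n_match identical positions out of n_total,
    given a per-position match probability p_match."""
    n_miss = n_total - n_match
    return n_match * math.log(p_match) + n_miss * math.log(1 - p_match)

n_total, n_match = 300, 150  # observed: half the positions are identical

ll_common = log_likelihood(n_match, n_total, 0.5)     # common ancestry
ll_indep = log_likelihood(n_match, n_total, 1 / 20)   # chance resemblance

log_ratio = ll_common - ll_indep
print(f"log likelihood ratio: {log_ratio:.1f}")
# exp(log_ratio) is how many times better common ancestry fits this toy data.
```

Even this crude two-model comparison yields an enormous ratio in favor of shared ancestry; the full Bayesian treatment integrates over tree shapes and substitution models rather than fixing a single match rate, but the "millions of times more probable" conclusion rests on the same kind of arithmetic.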

"There have been major advances in biology over the last decade, with our ability to test Darwin's theory in a way never before possible," said Theobald. "The number of genetic sequences of individual organisms doubles every three years, and our computational power is much stronger now than it was even a few years ago."

While other scientists have previously examined common ancestry more narrowly, for example, among only vertebrates, Theobald is the first to formally test Darwin's theory across all three domains of life. The three domains include diverse life forms such as the Eukarya (organisms, including humans, yeast, and plants, whose cells have a DNA-containing nucleus) as well as Bacteria and Archaea (two distinct groups of unicellular microorganisms whose DNA floats around in the cell instead of in a nucleus).

Theobald studied a set of 23 universally conserved, essential proteins found in all known organisms. He chose to study four representative organisms from each of the three domains of life. For example, he researched the genetic links found among these proteins in archaeal microorganisms that produce marsh gas and methane in cows and the human gut; in fruit flies, humans, round worms, and baker's yeast; and in bacteria like E. coli and the pathogen that causes tuberculosis.

Theobald's study rests on several simple assumptions about how the diversity of modern proteins arose. First, he assumed that genetic copies of a protein can be multiplied during reproduction, such as when one parent gives a copy of one of their genes to several of their children. Second, he assumed that a process of replication and mutation over the eons may modify these proteins from their ancestral versions. These two factors, then, should have created the differences in the modern versions of these proteins we see throughout life today. Lastly, he assumed that genetic changes in one species don't affect mutations in another species -- for example, genetic mutations in kangaroos don't affect those in humans.

What Theobald did not assume, however, was how far back these processes go in linking organisms genealogically. It is clear, say, that these processes are able to link the shared proteins found in all humans to each other genetically. But do the processes in these assumptions link humans to other animals? Do these processes link animals to other eukaryotes? Do these processes link eukaryotes to the other domains of life, bacteria and archaea? The answer to each of these questions turns out to be a resounding yes.

Just what did this universal common ancestor look like and where did it live? Theobald's study doesn't answer this question. Nevertheless, he speculated, "to us, it would most likely look like some sort of froth, perhaps living at the edge of the ocean, or deep in the ocean on a geothermal vent. At the molecular level, I'm sure it would have looked as complex and beautiful as modern life."

http://www.sciencedaily.com/releases/2010/05/100512131513.htm
Title: How the parasitic worm has turned
Post by: Freki on June 15, 2010, 06:37:52 AM
Just knowing there is an interaction could lead to some good medical breakthroughs.  I would like to do without the worms....the yuck factor alone....... :-D


How the parasitic worm has turned
June 14, 2010
(PhysOrg.com) -- Parasites in the gut such as whipworm have an essential role in developing a healthy immune system, University of Manchester scientists have found.

 
It has long been known that microbes in the gut help to develop a healthy immune system, hence the rise in popularity of probiotic yoghurts that encourage 'friendly' bacteria. But new research by Professors Richard Grencis and Ian Roberts shows that larger organisms such as parasitic worms are also essential in maintaining our bodily 'ecosystem'.
Professor Roberts, whose work is published in Science, explains: "It is like a three-legged stool - the microbes, worms and immune system regulate each other.
"The worms have been with us throughout our evolution and their presence, along with bacteria, in the ecosystem of the gut is important in the development of a functional immune system."
Professor Grencis adds: "If you look at the incidence of parasitic worm infection and compare it to the incidence of auto-immune disease and allergy, where the body's immune system over-reacts and causes damage, they have little overlap. Clean places in the West, where parasites are eradicated, see problems caused by overactive immune systems. In the developing world, there is more parasitic worm infection but less auto-immune and allergic problems.
"We are not suggesting that people deliberately infect themselves with parasitic worms but we are saying that these larger pathogens make things that help our immune system. We have evolved with both the bugs and the worms and there are consequences of that interaction, so they are important to the development of our immune system."
Whipworm, also known as Trichuris, is a very common type of parasitic worm and infects many species of animals including millions of humans. It has also been with us and animals throughout evolution. The parasites live in the large intestine, the very site containing the bulk of the intestinal bacteria.
Heavy infections of whipworm can cause bloody diarrhoea, with long-standing blood loss leading to iron-deficiency anaemia, and even rectal prolapse. But light infections have relatively few symptoms.
Professors Grencis and Roberts and their team at Manchester's Faculty of Life Sciences investigated the establishment of Trichuris and found it is initiated by an interaction between gut bacteria and the parasite.
They further found that a broad range of gut bacteria were able to induce parasite hatching. In the case of Escherichia coli (E-coli), bacteria bound to specific sites on the egg and rapidly induce parasite hatching. With E-coli, hatching involved specific bacterial cell-surface structures known as fimbriae, which the bacteria normally use to attach to cells of the gut wall.
Importantly, the work also showed that the presence of worms and bacteria altered the immune responses in a way that is likely to protect ourselves, the bacteria and the worms.
Intestinal roundworm parasites are one of the most common types of infection worldwide, although in humans increased hygiene has reduced infection in many countries. High level infections by these parasites can cause disease, but the natural situation is the presence of relatively low levels of infection. The team's work suggests that in addition to bacterial microflora, the natural state of affairs of our intestines may well be the presence of larger organisms, the parasitic roundworms, and that complex and subtle interactions between these different types of organism have evolved to provide an efficient and beneficial ecosystem for all concerned.
Professor Roberts says: "The host uses its immune system to regulate the damage caused by the bacteria and the worms. If the pathogens are missing, the immune system may not give the right response."
Professor Grencis adds: "The gut and its inhabitants should be considered a complex ecosystem, not only involving bacteria but also parasites, not just sitting together but interacting."
More information: 'Exploitation of the Intestinal Microflora by the Parasitic Nematode Trichuris muris', Science.
Provided by University of Manchester (news : web)
Title: Stomach Virus: New Meaning
Post by: Body-by-Guinness on July 17, 2010, 06:59:12 PM
A Viral Wonderland in the Human Gut
by Gisela Telis on July 14, 2010
 
Distinctive signature. Each person's gut carries a different collection of bacteria-infecting viruses that may benefit their hosts, researchers report. The viruses contain DNA fragments whose functions (above) include cell repair and food processing.
Credit: A. Reyes et al., Nature, 466 (15 July 2010)

Snowflakes haven't cornered the market on uniqueness. Researchers report that human guts harbor viruses as unique as the people they inhabit; the viral lineup differs even between identical twins. The discovery offers a first glimpse at the previously unknown viruses and their surprisingly friendly relationships with their hosts.

Microbiologists have known since the late 19th century that human intestines are a crowded and complicated place. Our bacterial denizens outnumber our cells, and many help break down foods and fight off pathogens. For the past decade, microbiologist Jeffrey Gordon of Washington University in St Louis has been mapping the gut's microbial landscape. His studies have linked intestinal bacteria to obesity and have shown that families tend to share their microbial makeup. But scientists hadn't yet explored whether phages—viruses that infect bacteria—were part of this shared community.

Led by graduate student Alejandro Reyes, Gordon's team analyzed fecal samples from four sets of Missouri-born female identical twins and their mothers. The researchers collected and purified the poop three times over the course of a year—to better track the microbial community's changes over time—and then sequenced the viral DNA, or viromes, the poop contained. Only 20% of the viromes matched existing databases; the rest came from previously unknown viruses. And each woman's set of viromes was distinctive, differing even from the viromes of her identical twin, the researchers report in the 15 July issue of Nature. Unlike their bacterial profiles, which overlapped by about 50% (significantly more than between strangers), identical twins had no more viruses in common than did unrelated people.

Equally surprising, Gordon says, was the communities' consistency: the viral makeup changed less than 5% over the course of the year, and the viromes of the most abundant phages changed less than 1%. Rapidly changing viromes would have signaled an "arms race" in which threatened bacteria were adapting to survive phage attacks, and the phages were adapting to avoid bacterial defenses. "The fact that the viromes didn't change," says Gordon, "suggests this is a temperate environment" in which the bacteria and their phages coexist in peace.

That may be because the viruses are actually helping the bacteria. When the viruses latch onto gut bacteria, they take some of their host's genetic material and can change it or move it to other hosts, bringing new and potentially advantageous functions to the bugs. The researchers found that many of the genes the phages carry and transfer are beneficial to the bacteria; some may help them repair cell walls, for example. In return, the bacteria, which don't die from the infections, provide an improved cellular factory to make new viruses.

The researchers don't know where the viruses come from or what causes viromes to differ so dramatically from person to person. But their data indicate that there is a huge diversity of these viruses, and that could explain why even closely related people can harbor very different populations.

Gordon says that understanding the details of the phage-bacteria relationship could help gauge the health of a patient's gut community, because the phages are sensitive to changes in their hosts. But "we still have a lot to learn about viruses" before we can expect any practical applications, says microbiologist Edward DeLong of the Massachusetts Institute of Technology in Cambridge. "This is just a first peek," he says, "but it's a remarkable one. It's the first high-resolution picture of the bacterial-viral dynamic in the human ecosystem, in a huge part of our own ecology that remains terra incognita."

http://news.sciencemag.org/sciencenow/2010/07/a-viral-wonderland-in-the-human-.html
Title: Well This Goes Without Saying. . . .
Post by: Body-by-Guinness on November 05, 2010, 11:31:24 AM
By COURTNEY HUTCHISON
ABC News Medical Unit
Ozzy Osbourne Is a Genetic Mutant
Gene Variants Let the Part-Neanderthal Rocker Party Hard Into His 60s
  Nov. 3, 2010

Despite a lifetime of hard partying, heavy metal rocker Ozzy Osbourne is alive and kicking at 61, and he may have his genes to thank for it. Now that the "Full Osbourne Genome" has been sequenced, the truth is out: the former lead singer of Black Sabbath is a genetic mutant.

The musician has several gene variants that "we've never seen before," said geneticist Nathaniel Pearson, who sequenced the rocker's genome, including variants that could impact how Osbourne's body absorbs methamphetamines and other recreational drugs.

"I've always said that at the end of the world there will be roaches, Ozzy and Keith Richards," Osbourne's wife, Sharon Osbourne, said at Friday's conference. "He's going to outlive us all. That fascinated me -- how his body can endure so much."

Osbourne's resilience also piqued the interest of Knome, Inc., a genomics company that began sequencing the "full Ozzy genome" last July.

"Why not Ozzy?" Jorge Conde, co-founder and chief executive of Cambridge, Mass.-based Knome, told ABCnews.com.

Conde said the company was interested in exploring the genome of someone as musically talented as Osbourne. Of course, trying to figure out if good genes had anything to do with Osbourne's ability to handle his "aggressive" lifestyle was also a major draw for researchers, he said.

The results of Knome's sequencing were discussed on stage last Friday at this year's TEDMED conference in San Diego, with Sharon Osbourne, Pearson, and Ozzy Osbourne all weighing in on what Osbourne's genes can mean for medicine.

 Uncovering the Ozzy Genome

Osbourne initially was skeptical about the project, he wrote Oct. 24 in his Sunday Times of London column, "The Wisdom of Oz," but soon came around to the idea of offering his genetic code to science.

"I was curious ... given the swimming pools of booze I've guzzled over the years -- not to mention all of the cocaine, morphine, sleeping pills, cough syrup, LSD, Rohypnol ... you name it -- there's really no plausible medical reason why I should still be alive. Maybe my DNA could say why," he wrote in his column.

Not surprisingly, the most notable differences in Osbourne's genes had to do with how he processes drugs and alcohol. Genes connected to addiction, alcoholism and the absorption of marijuana, opiates and methamphetamines all had unique variations in Osbourne, a few of which Knome geneticists had never seen before.

"He had a change on the regulatory region of the ADH4 gene, a gene associated with alcoholism, that we've never seen before," Conde told ABCnews.com. "He has an increased predisposition for alcohol dependence of something like six times higher. He also had a slight increased risk for cocaine addiction, but he dismissed that. He said that if anyone had done as much cocaine as he had, they would have been hooked."

The Prince of Darkness also had a 2.6-times increased chance for hallucinations associated with marijuana, though Osbourne said he wouldn't know if that were true because he so rarely smoked marijuana without other drugs also in his system.

Ironically, Osbourne's genes suggest that he is a slow metabolizer of coffee, meaning that he would be more affected by caffeine.

"Turns out that Ozzy's kryptonite is caffeine," Conde said.

Conde and Pearson particularly were interested in looking at Osbourne's nervous system and nervous function, given the musician's lifestyle and his recent experience of Parkinson's-like tremors.

They found a functional change in his TTN gene, which is associated with a number of things in the nervous system, including deafness and Parkinson's.

"Here's a guy who's rocking heavy metal for decades and he can still hear," Conde said. "It would be interesting to know if this gene may impact that. His Parkinsonian tremor -- it's hard to know if that is from his genes or from years of hard living."

And of course, there's the fact that Osbourne has Neanderthal genes in him.

"People thought that [Neanderthals] had no descendants today, but they do," Pearson said at the conference. "In east Asia and Europe, a lot of us have a little Neanderthal ancestry. We found a sliver of the genes in Ozzy. We also looked at [Knome's] founder, George Church, and he has about three times as much as Ozzy does."

To which Sharon Osbourne replied: "I'd like to meet him."

Learning From Our Favorite Neanderthal Rocker

While genomics has come a long way since the first full human genome was sequenced in 2003, interpreting what gene variants mean still involves a lot of guesswork.

"We can read the code, but it takes additional research to decipher what it means," Conde said.

In other words, geneticists know which traits are associated with certain genes, but not how a mutation on a given gene will affect someone. By sequencing those who seem to show unique traits, such as Osbourne's ability to remain relatively healthy despite heavy drug and alcohol abuse, geneticists hope to learn more about how deviations in certain genes create specific traits, susceptibility to disease and reactions to substances.

"What interests me are people who have done something extraordinary with no clear reason as to why," Conde said.

For his next celebrity genome, he would like to pick somebody on the far extreme of intelligence, such as Stephen Hawking. Or he might stick with rock-lifestyle resilience and get Keith Richards, as Sharon Osbourne suggested.

TEDMED is a yearly conference dedicated to increasing innovation in the medical realm "from personal health to public health, devices to design and Hollywood to the hospital," the website said.

http://abcnews.go.com/Health/Wellness/genetic-mutations-ozzy-osbourne-party-hard/story?id=12032552&page=1
Title: Retroviruses & Schizophrenia, I
Post by: Body-by-Guinness on November 12, 2010, 10:22:55 AM
The Insanity Virus

11.08.2010
Schizophrenia has long been blamed on bad genes or even bad parents. Wrong, says a growing group of psychiatrists. The real culprit, they claim, is a virus that lives entwined in every person's DNA.

by Douglas Fox
Steven and David Elmore were born identical twins, but their first days in this world could not have been more different. David came home from the hospital after a week. Steven, born four minutes later, stayed behind in the ICU. For a month he hovered near death in an incubator, wracked with fever from what doctors called a dangerous viral infection. Even after Steven recovered, he lagged behind his twin. He lay awake but rarely cried. When his mother smiled at him, he stared back with blank eyes rather than mirroring her smiles as David did. And for several years after the boys began walking, it was Steven who often lost his balance, falling against tables or smashing his lip.

Those early differences might have faded into distant memory, but they gained new significance in light of the twins’ subsequent lives. By the time Steven entered grade school, it appeared that he had hit his stride. The twins seemed to have equalized into the genetic carbon copies that they were: They wore the same shoulder-length, sandy-blond hair. They were both B+ students. They played basketball with the same friends. Steven Elmore had seemingly overcome his rough start. But then, at the age of 17, he began hearing voices.

The voices called from passing cars as Steven drove to work. They ridiculed his failure to find a girlfriend. Rolling up the car windows and blasting the radio did nothing to silence them. Other voices pursued Steven at home. Three voices called through the windows of his house: two angry men and one woman who begged the men to stop arguing. Another voice thrummed out of the stereo speakers, giving a running commentary on the songs of Steely Dan or Led Zeppelin, which Steven played at night after work. His nerves frayed and he broke down. Within weeks his outbursts landed him in a psychiatric hospital, where doctors determined he had schizophrenia.

The story of Steven and his twin reflects a long-standing mystery in schizophrenia, one of the most common mental diseases on earth, affecting about 1 percent of humanity. For a long time schizophrenia was commonly blamed on cold mothers. More recently it has been attributed to bad genes. Yet many key facts seem to contradict both interpretations.

Schizophrenia is usually diagnosed between the ages of 15 and 25, but the person who becomes schizophrenic is sometimes recalled to have been different as a child or a toddler—more forgetful or shy or clumsy. Studies of family videos confirm this. Even more puzzling is the so-called birth-month effect: People born in winter or early spring are more likely than others to become schizophrenic later in life. It is a small increase, just 5 to 8 percent, but it is remarkably consistent, showing up in 250 studies. That same pattern is seen in people with bipolar disorder or multiple sclerosis.
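The birth-month effect is a small relative increase on an already small baseline risk. A quick back-of-the-envelope calculation makes the absolute numbers concrete; the 1 percent baseline is the article's "about 1 percent of humanity" figure:

```python
baseline_risk = 0.01  # ~1 percent lifetime prevalence, per the article

# A 5-8 percent *relative* increase barely moves the absolute risk.
for rel_increase in (0.05, 0.08):
    winter_risk = baseline_risk * (1 + rel_increase)
    print(f"{rel_increase:.0%} relative increase -> "
          f"{winter_risk:.4%} absolute lifetime risk")
```

An effect this small for any individual is only detectable at all because it shows up consistently across hundreds of studies.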

“The birth-month effect is one of the most clearly established facts about schizophrenia,” says Fuller Torrey, director of the Stanley Medical Research Institute in Chevy Chase, Maryland. “It’s difficult to explain by genes, and it’s certainly difficult to explain by bad mothers.”

The facts of schizophrenia are so peculiar, in fact, that they have led Torrey and a growing number of other scientists to abandon the traditional explanations of the disease and embrace a startling alternative. Schizophrenia, they say, does not begin as a psychological disease. Schizophrenia begins with an infection.

The idea has sparked skepticism, but after decades of hunting, Torrey and his colleagues think they have finally found the infectious agent. You might call it an insanity virus. If Torrey is right, the culprit that triggers a lifetime of hallucinations—that tore apart the lives of writer Jack Kerouac, mathematician John Nash, and millions of others—is a virus that all of us carry in our bodies. “Some people laugh about the infection hypothesis,” says Urs Meyer, a neuroimmunologist at the Swiss Federal Institute of Technology in Zurich. “But the impact that it has on researchers is much, much, much more than it was five years ago. And my prediction would be that it will gain even more impact in the future.”

The implications are enormous. Torrey, Meyer, and others hold out hope that they can address the root cause of schizophrenia, perhaps even decades before the delusions begin. The first clinical trials of drug treatments are already under way. The results could lead to meaningful new treatments not only for schizophrenia but also for bipolar disorder and multiple sclerosis. Beyond that, the insanity virus (if such it proves) may challenge our basic views of human evolution, blurring the line between “us” and “them,” between pathogen and host.

+++
 

Rhoda Torrey and her brother Fuller, who would go on to research schizophrenia.

Courtesy E. Fuller Torrey

Torrey’s connection to schizophrenia began in 1957. As summer drew to a close that year, his younger sister, Rhoda, grew agitated. She stood on the lawn of the family home in upstate New York, looking into the distance. She rambled as she spoke. “The British,” she said. “The British are coming.” Just days before Rhoda should have started college, she was given a diagnosis of schizophrenia. Doctors told the grieving family that dysfunctional household relationships had caused her meltdown. Because his father was no longer alive, it was Torrey, then in college, who shouldered much of the emotional burden.

Torrey, now 72, develops a troubled expression behind his steel-rimmed glasses as he remembers those years. “Schizophrenia was badly neglected,” he says.

In 1970 Torrey arrived at the National Institute of Mental Health in Washington, D.C., having finished his training in psychiatric medicine. At the time, psychiatry remained under the thrall of Freudian psychoanalysis, an approach that offered little to people like Rhoda. Torrey began looking for research opportunities in schizophrenia. The more he learned, the more his views diverged from those of mainstream psychiatry.

A simple neurological exam showed Torrey that schizophrenics suffered from more than just mental disturbances. They often had trouble doing standard inebriation tests, like walking a straight line heel to toe. If Torrey simultaneously touched their face and hand while their eyes were closed, they often did not register being touched in two places. Schizophrenics also showed signs of inflammation in their infection-fighting white blood cells. “If you look at the blood of people with schizophrenia,” Torrey says, “there are too many odd-looking lymphocytes, the kind that you find in mononucleosis.” And when he performed CAT scans on pairs of identical twins with and without the disease—including Steven and David Elmore—he saw that schizophrenics’ brains had less tissue and larger fluid-filled ventricles.

Subsequent studies confirmed those oddities. Many schizophrenics show chronic inflammation and lose brain tissue over time, and these changes correlate with the severity of their symptoms. These things “convinced me that this is a brain disease,” Torrey says, “not a psychological problem.”

By the 1980s he began working with Robert Yolken, an infectious-diseases specialist at Johns Hopkins University in Baltimore, to search for a pathogen that could account for these symptoms. The two researchers found that schizophrenics often carried antibodies for toxoplasma, a parasite spread by house cats; Epstein-Barr virus, which causes mononucleosis; and cytomegalovirus. These people had clearly been exposed to those infectious agents at some point, but Torrey and Yolken never found the pathogens themselves in the patients’ bodies. The infection always seemed to have happened years before.

Torrey wondered if the moment of infection might in fact have occurred during early childhood. If schizophrenia was sparked by a disease that was more common during winter and early spring, that could explain the birth-month effect. “The psychiatrists thought I was psychotic myself,” Torrey says. “Some of them still do.”

While Torrey and Yolken were chasing their theory, another scientist unwittingly entered the fray. Hervé Perron, then a graduate student at Grenoble University in France, dropped his Ph.D. project in 1987 to pursue something more challenging and controversial: He wanted to learn if new ideas about retroviruses—a type of virus that converts RNA into DNA—could be relevant to multiple sclerosis.

Robert Gallo, the director of the Institute of Human Virology at the University of Maryland School of Medicine and co-discoverer of HIV, had speculated that a virus might trigger the paralytic brain lesions in MS. People had already looked at the herpes virus (HHV-6), cytomegalovirus, Epstein-Barr virus, and the retroviruses HTLV-1 and HTLV-2 as possible causes of the disease. But they always came up empty-handed.

Perron learned from their failures. “I decided that I should not have an a priori idea of what I would find,” he says. Rather than looking for one virus, as others had done, he tried to detect any retrovirus, whether or not it was known to science. He extracted fluids from the spinal columns of MS patients and tested for an enzyme, called reverse transcriptase, that is carried by all retroviruses. Sure enough, Perron saw faint traces of retroviral activity. Soon he obtained fuzzy electron microscope images of the retrovirus itself.

His discovery was intriguing but far from conclusive. After confirming his find was not a fluke, Perron needed to sequence its genes. He moved to the National Center for Scientific Research in Lyon, France, where he labored days, nights, and weekends. He cultured countless cells from people with MS to grow enough of his mystery virus for sequencing. MS is an incurable disease, so Perron had to do his research in a Level 3 biohazard lab. Working in this airtight catacomb, he lived his life in masks, gloves, and disposable scrubs.

After eight years of research, Perron finally completed his retrovirus’s gene sequence. What he found on that day in 1997 no one could have predicted; it instantly explained why so many others had failed before him. We imagine viruses as mariners, sailing from person to person across oceans of saliva, snot, or semen—but Perron’s bug was a homebody. It lives permanently in the human body at the very deepest level: inside our DNA. After years slaving away in a biohazard lab, Perron realized that everyone already carried the virus that causes multiple sclerosis.

Other scientists had previously glimpsed Perron’s retrovirus without fully grasping its significance. In the 1970s biologists studying pregnant baboons were shocked as they looked at electron microscope images of the placenta. They saw spherical retroviruses oozing from the cells of seemingly healthy animals. They soon found the virus in healthy humans, too. So began a strange chapter in evolutionary biology.

+++
Viruses like influenza or measles kill cells when they infect them. But when retroviruses like HIV infect a cell, they often let the cell live and splice their genes into its DNA. When the cell divides, both of its progeny carry the retrovirus’s genetic code in their DNA.

In the past few years, geneticists have pieced together an account of how Perron’s retrovirus entered our DNA. Sixty million years ago, a lemurlike animal—an early ancestor of humans and monkeys—contracted an infection. It may not have made the lemur ill, but the retrovirus spread into the animal’s testes (or perhaps its ovaries), and once there, it struck the jackpot: It slipped inside one of the rare germ line cells that produce sperm and eggs. When the lemur reproduced, that retrovirus rode into the next generation aboard the lucky sperm and then moved on from generation to generation, nestled in the DNA. “It’s a rare, random event,” says Robert Belshaw, an evolutionary biologist at the University of Oxford in England. “Over the last 100 million years, there have been only maybe 50 times when a retrovirus has gotten into our genome and proliferated.”

But such genetic intrusions stick around a very long time, so humans are chockablock full of these embedded, or endogenous, retroviruses (ERVs). Our DNA carries dozens of copies of Perron’s virus, now called human endogenous retrovirus W, or HERV-W, at specific addresses on chromosomes 6 and 7.

If our DNA were an airplane carry-on bag (and essentially it is), it would be bursting at the seams. We lug around 100,000 retrovirus sequences inside us; all told, genetic parasites related to viruses account for more than 40 percent of all human DNA. Our body works hard to silence its viral stowaways by tying up those stretches of DNA in tight stacks of proteins, but sometimes they slip out. Now and then endogenous retroviruses switch on and start manufacturing proteins. They assemble themselves like Lego blocks into bulbous retroviral particles, which ooze from the cells producing them.

Title: Retroviruses & Schizophrenia, II
Post by: Body-by-Guinness on November 12, 2010, 10:23:20 AM
Endogenous retroviruses were long considered genetic fossils, incapable of doing anything interesting. But since Perron’s revelation, at least a dozen studies have found that HERV-W is active in people with MS.

By the time Perron made his discovery, Torrey and Yolken had spent about 15 years looking for a pathogen that causes schizophrenia. They found lots of antibodies but never the bug itself. Then Håkan Karlsson, who was a postdoctoral fellow in Yolken’s lab, became interested in studies showing that retroviruses sometimes triggered psychosis in AIDS patients. The team wondered if other retroviruses might cause these symptoms in separate diseases such as schizophrenia. So they used an experiment, similar to Perron’s, that would detect any retrovirus (by finding sequences encoding reverse transcriptase enzyme)—even if it was one that had never been catalogued before. In 2001 they nabbed a possible culprit. It turned out to be HERV-W.

Several other studies have since found similar active elements of HERV-W in the blood or brain fluids of people with schizophrenia. One, published by Perron in 2008, found HERV-W in the blood of 49 percent of people with schizophrenia, compared with just 4 percent of healthy people. “The more HERV-W they had,” Perron says, “the more inflammation they had.” He now sees HERV-W as key to understanding many cases of both MS and schizophrenia. “I’ve been doubting for so many years,” he says. “I’m convinced now.”
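To translate the 49-percent-versus-4-percent finding into a single effect size, the standard epidemiological odds-ratio formula gives a rough sense of its magnitude. This is my calculation from the two percentages quoted above, not a statistic reported in the article:

```python
def odds_ratio(p_cases, p_controls):
    """Odds ratio for an exposure seen in a fraction p_cases of
    patients and p_controls of healthy subjects."""
    return (p_cases / (1 - p_cases)) / (p_controls / (1 - p_controls))

# 49% of schizophrenia patients vs. 4% of healthy controls (Perron, 2008)
or_herv = odds_ratio(0.49, 0.04)
print(round(or_herv, 1))  # 23.1
```

An odds ratio above 20 would be an unusually strong association for a psychiatric disease marker, which is why the finding drew attention.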

Torrey, Yolken, and Sarven Sabunciyan, an epigeneticist at Johns Hopkins, are working to understand how endogenous retroviruses can wreak their havoc. Much of their research revolves around the contents of a nondescript brick building near Washington, D.C. This building, owned by the Stanley Medical Research Institute, maintains the world’s largest library of schizophrenic and bipolar brains. Inside are hundreds of cadaver brains (donated to science by the deceased), numbered 1 through 653. Each brain is split into right and left hemispheres, one half frozen at about –103 degrees Fahrenheit, the other chilled in formaldehyde. Jacuzzi-size freezers fill the rooms. The roar of their fans cuts through the air as Torrey’s team examines the brains to pinpoint where and when HERV-W awakens into schizophrenia.

New high-speed DNA sequencing is making the job possible. In a cramped room at Johns Hopkins Medical Center, a machine the size of a refrigerator hums 24/7 to read gene sequences from samples. Every few minutes the machine’s electric eye scans a digital image of a stamp-size glass plate. Fixed to that plate are 300 million magnetic beads, and attached to each bead is a single molecule of DNA, which the machine is sequencing. In a week the machine churns out the equivalent of six human genomes—enough raw data to fill 40 computer hard drives.
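The throughput figures quoted above can be sanity-checked with simple arithmetic. The genome length below is a standard approximation I am supplying, not a number from the article:

```python
beads_per_plate = 300_000_000  # one DNA molecule per bead, per the article
genome_size_bp = 3.2e9         # approximate human genome length (assumption)
genomes_per_week = 6           # per the article

bases_per_week = genomes_per_week * genome_size_bp
print(f"{bases_per_week:.2e} base pairs of finished sequence per week")

# Raw output is far larger: each base arrives with quality scores,
# redundant read coverage, and metadata, which is how a week's run
# can fill dozens of computer hard drives.
```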

The hard part starts when those sequences arrive at Sabunciyan’s desk. “We got these data right around New Year’s 2009,” Sabunciyan said one day last August as he scrolled through a file containing 2 billion letters of genetic code, equivalent to 2,000 John Grisham novels composed just of the letters G, A, T, and C (making the plot a great deal more confusing). “We’re still looking at it.”

Sabunciyan has found that an unexpectedly large amount of the RNA produced in the brain—about 5 percent—comes from seemingly “junk” DNA, which includes endogenous retroviruses. RNA is a messenger of DNA, a step in the path to making proteins, so its presence could mean that viral proteins are being manufactured in the body more frequently than had been thought.

Through this research, a rough account is emerging of how HERV-W could trigger diseases like schizophrenia, bipolar disorder, and MS. Although the body works hard to keep its ERVs under tight control, infections around the time of birth destabilize this tense standoff. Scribbled onto the marker board in Yolken’s office is a list of infections that are now known to awaken HERV-W—including herpes, toxoplasma, cytomegalovirus, and a dozen others. The HERV-W viruses that pour into the newborn’s blood and brain fluid during these infections contain proteins that may enrage the infant immune system. White blood cells vomit forth inflammatory molecules called cytokines, attracting more immune cells like riot police to a prison break. The scene turns toxic.

In one experiment, Perron isolated HERV-W virus from people with MS and injected it into mice. The mice became clumsy, then paralyzed, then died of brain hemorrhages. But if Perron depleted the mice of immune cells known as T cells, the animals survived their encounter with HERV-W. It was an extreme experiment, but to Perron it made an important point. Whether people develop MS or schizophrenia may depend on how their immune system responds to HERV-W, he says. In MS the immune system directly attacks and kills brain cells, causing paralysis. In schizophrenia it may be that inflammation damages neurons indirectly by overstimulating them. “The neuron is discharging neurotransmitters, being excited by these inflammatory signals,” Perron says. “This is when you develop hallucinations, delusions, paranoia, and hyper-suicidal tendencies.”

The first, pivotal infection by toxoplasmosis or influenza (and subsequent flaring up of HERV-W) might happen shortly before or after birth. That would explain the birth-month effect: Flu infections happen more often in winter. The initial infection could then set off a lifelong pattern in which later infections reawaken HERV-W, causing more inflammation and eventually symptoms. This process explains why schizophrenics gradually lose brain tissue. It explains why the disease waxes and wanes like a chronic infection. And it could explain why some schizophrenics suffer their first psychosis after a mysterious, monolike illness.

+++
The infection theory could also explain what little we know of the genetics of schizophrenia. One might expect that the disease would be associated with genes controlling our synapses or neurotransmitters. Three major studies published last year in the journal Nature tell a different story. They instead implicate immune genes called human leukocyte antigens (HLAs), which are central to our body’s ability to detect invading pathogens. “That makes a lot of sense,” Yolken says. “The response to an infectious agent may be why person A gets schizophrenia and person B doesn’t.”

Gene studies have failed to provide simple explanations for ailments like schizophrenia and MS. Torrey’s theory may explain why. Genes may come into play only in conjunction with certain environmental kicks. Our genome’s thousands of parasites might provide part of that kick.

“The ‘genes’ that can respond to environmental triggers or toxic pathogens are the dark side of the genome,” Perron says. Retroviruses, including HIV, are known to be awakened by inflammation—possibly the result of infection, cigarette smoke, or pollutants in drinking water. (This stress response may be written into these parasites’ basic evolutionary strategy, since stressed hosts may be more likely to spread or contract infections.) The era of writing off endogenous retroviruses and other seemingly inert parts of the genome as genetic fossils is drawing to an end, Perron says. “It’s not completely junk DNA, it’s not dead DNA,” he asserts. “It’s an incredible source of interaction with the environment.” Those interactions may trigger disease in ways that we are only just beginning to imagine.

Torrey’s sister has had a tough go of it. Schizophrenia treatments were limited when she fell ill. Early on she received electroshock therapy and insulin shock therapy, in which doctors induced a coma by lowering her blood sugar level. Rhoda Torrey has spent 40 years in state hospitals. The disease has left only one part of her untouched: Her memory of her brief life before becoming ill—of school dances and sleepovers half a century ago—remains as clear as ever.

Steven Elmore was more fortunate. Drug therapy was widely available when he fell ill, and although he still hears voices from time to time, he has done well. Now 50 years old, he is married, cares for an adopted son and stepson, and works full time. He has avoided common drug side effects like diabetes, although his medications initially caused him to gain 40 pounds.

Torrey and Yolken hope to add a new, more hopeful chapter to this story. Yolken’s wife, Faith Dickerson, is a clinical psychologist at Sheppard Pratt Health System in Baltimore. She is running a clinical trial to examine whether adding an anti-infective agent called artemisinin to the drugs that patients are already taking can lessen the symptoms of schizophrenia. The drug would hit HERV-W indirectly by tamping down the infections that awaken it. “If we can treat the toxoplasmosis,” Torrey says, “presumably we can get a better outcome than by treating [neurotransmitter] abnormalities that have occurred 14 steps down the line, which is what we’re doing now.”

Looking ahead, better prenatal care or vaccinations could prevent the first, early infections that put some people on a path to schizophrenia. For high-risk babies who do get sick, early treatment might prevent psychosis from developing two decades later. Recent work by Urs Meyer, the neuroimmunologist, and his colleague Joram Feldon at the Swiss Federal Institute of Technology drives this point home. When they injected pregnant mice with RNA molecules mimicking viral infections, the pups grew up to resemble schizophrenic adults. The animals’ memory and learning were impaired, they overreacted to startling noises, and their brain atrophied. But this March, Meyer and Feldon reported that treating the baby mice with antipsychotic drugs prevented them from developing some of these abnormalities as adults.

Perron has founded a biotech start-up, GeNeuro, in Geneva, Switzerland, to develop treatments targeting HERV-W. The company has created an antibody that neutralizes a primary viral protein, and it works in lab mice with MS. “We have terrific effects,” Perron says. “In animals that have demyelinating brain lesions induced by these HERV envelope proteins, we see a dramatic stop to this process when we inject this antibody.” He is scheduled to begin a Phase 1 clinical trial in people with MS near the end of this year. A clinical trial with schizophrenics might follow in 2011.

Even after all that, many medical experts still question how much human disease can be traced to viral invasions that took place millions of years ago. If the upcoming human trials work as well as the animal experiments, the questions may be silenced—and so may the voices of schizophrenia.

http://discovermagazine.com/2010/jun/03-the-insanity-virus/
Title: NY Times: This is your brain on metaphors
Post by: Crafty_Dog on November 19, 2010, 08:02:17 AM
Despite rumors to the contrary, there are many ways in which the human brain isn’t all that fancy. Let’s compare it to the nervous system of a fruit fly. Both are made up of cells, of course, with neurons playing particularly important roles. Now one might expect that a neuron from a human will differ dramatically from one from a fly. Maybe the human’s will have especially ornate ways of communicating with other neurons, making use of unique “neurotransmitter” messengers. Maybe compared to the lowly fly neuron, human neurons are bigger, more complex, in some way can run faster and jump higher.

But no. Look at neurons from the two species under a microscope and they look the same. They have the same electrical properties, many of the same neurotransmitters, the same protein channels that allow ions to flow in and out, as well as a remarkably high number of genes in common. Neurons are the same basic building blocks in both species.

So where’s the difference? It’s numbers — humans have roughly one million neurons for each one in a fly. And out of a human’s 100 billion neurons emerge some pretty remarkable things. With enough quantity, you generate quality.
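The "one million to one" figure is consistent with standard counts: a fruit fly has on the order of 100,000 neurons. That fly-side estimate is my assumption; the article gives only the human count and the ratio:

```python
human_neurons = 100_000_000_000  # 100 billion, per the article
fly_neurons = 100_000            # ~1e5, a standard order-of-magnitude estimate

ratio = human_neurons // fly_neurons
print(ratio)  # 1000000 -> "one million neurons for each one in a fly"
```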


Neuroscientists understand the structural bases of some of these qualities. Take language, that uniquely human behavior. Underlying it are structures unique to the human brain — regions like “Broca’s area,” which specializes in language production. Then there’s the brain’s “extrapyramidal system,” which is involved in fine motor control. The complexity of the human version allows us to do something that, say, a polar bear could never accomplish — sufficiently independent movement of digits to play a trill on the piano, for instance. Particularly striking is the human frontal cortex. While occurring in all mammals, the human version is proportionately bigger and denser in its wiring. And what is the frontal cortex good for? Emotional regulation, gratification postponement, executive decision-making, long-term planning. We study hard in high school to get admitted to a top college to get into grad school to get a good job to get into the nursing home of our choice. Gophers don’t do that.

There’s another domain of unique human skills, and neuroscientists are learning a bit about how the brain pulls it off.

Consider the following from J. Ruth Gendler’s wonderful “The Book of Qualities,” a collection of “character sketches” of different qualities, emotions and attributes:

Anxiety is secretive. He does not trust anyone, not even his friends, Worry, Terror, Doubt and Panic … He likes to visit me late at night when I am alone and exhausted. I have never slept with him, but he kissed me on the forehead once, and I had a headache for two years …

Or:

Compassion speaks with a slight accent. She was a vulnerable child, miserable in school, cold, shy … In ninth grade she was befriended by Courage. Courage lent Compassion bright sweaters, explained the slang, showed her how to play volleyball.

What is Gendler going on about? We know, and feel pleasure triggered by her unlikely juxtapositions. Despair has stopped listening to music. Anger sharpens kitchen knives at the local supermarket. Beauty wears a gold shawl and sells seven kinds of honey at the flea market. Longing studies archeology.

Symbols, metaphors, analogies, parables, synecdoche, figures of speech: we understand them. We understand that a captain wants more than just hands when he orders all of them on deck. We understand that Kafka’s “Metamorphosis” isn’t really about a cockroach. If we are of a certain theological ilk, we see bread and wine intertwined with body and blood. We grasp that the right piece of cloth can represent a nation and its values, and that setting fire to such a flag is a highly charged act. We can learn that a certain combination of sounds put together by Tchaikovsky represents Napoleon getting his butt kicked just outside Moscow. And that the name “Napoleon,” in this case, represents thousands and thousands of soldiers dying cold and hungry, far from home.

And we even understand that June isn’t literally busting out all over. It would seem that doing this would be hard enough to cause a brainstorm. So where did this facility with symbolism come from? It strikes me that the human brain has evolved a necessary shortcut for doing so, and with some major implications.

Consider an animal (including a human) that has started eating some rotten, fetid, disgusting food. As a result, neurons in an area of the brain called the insula will activate. Gustatory disgust. Smell the same awful food, and the insula activates as well. Think about what might count as a disgusting food (say, taking a bite out of a struggling cockroach). Same thing.

Now read in the newspaper about a saintly old widow who had her home foreclosed by a sleazy mortgage company, her medical insurance canceled on flimsy grounds, and got a lousy, exploitative offer at the pawn shop where she tried to hock her kidney dialysis machine. You sit there thinking, those bastards, those people are scum, they’re worse than maggots, they make me want to puke … and your insula activates. Think about something shameful and rotten that you once did … same thing. Not only does the insula “do” sensory disgust; it does moral disgust as well. Because the two are so viscerally similar. When we evolved the capacity to be disgusted by moral failures, we didn’t evolve a new brain region to handle it. Instead, the insula expanded its portfolio.

Or consider pain. Somebody pokes your big left toe with a pin. Spinal reflexes cause you to instantly jerk your foot back just as they would in, say, a frog. Evolutionarily ancient regions activate in the brain as well, telling you about things like the intensity of the pain, or whether it’s a sharp localized pain or a diffuse burning one. But then there’s a fancier, more recently evolved brain region in the frontal cortex called the anterior cingulate that’s involved in the subjective, evaluative response to the pain. A piranha has just bitten you? That’s a disaster. The shoes you bought are a size too small? Well, not as much of a disaster.

Now instead, watch your beloved being poked with the pin. And your anterior cingulate will activate, as if it were you in pain. There’s a neurotransmitter called Substance P that is involved in the nuts and bolts circuitry of pain perception. Administer a drug that blocks the actions of Substance P to people who are clinically depressed, and they often feel better, feel less of the world’s agonies. When humans evolved the ability to be wrenched with feeling the pain of others, where was it going to process it? It got crammed into the anterior cingulate. And thus it “does” both physical and psychic pain.

Another truly interesting domain in which the brain confuses the literal and metaphorical is cleanliness. In a remarkable study, Chen-Bo Zhong of the University of Toronto and Katie Liljenquist of Northwestern University demonstrated how the brain has trouble distinguishing between being a dirty scoundrel and being in need of a bath. Volunteers were asked to recall either a moral or immoral act in their past. Afterward, as a token of appreciation, Zhong and Liljenquist offered the volunteers a choice between the gift of a pencil or of a package of antiseptic wipes. And the folks who had just wallowed in their ethical failures were more likely to go for the wipes. In the next study, volunteers were told to recall an immoral act of theirs. Afterward, subjects either did or did not have the opportunity to clean their hands. Those who were able to wash were less likely to respond to a request for help (that the experimenters had set up) that came shortly afterward. Apparently, Lady Macbeth and Pontius Pilate weren’t the only ones to metaphorically absolve their sins by washing their hands.

This potential to manipulate behavior by exploiting the brain’s literal-metaphorical confusions about hygiene and health is also shown in a study by Mark Landau and Daniel Sullivan of the University of Kansas and Jeff Greenberg of the University of Arizona. Subjects either did or didn’t read an article about the health risks of airborne bacteria. All then read a history article that used imagery of a nation as a living organism with statements like, “Following the Civil War, the United States underwent a growth spurt.” Those who read about scary bacteria before thinking about the U.S. as an organism were then more likely to express negative views about immigration.

Another example of how the brain links the literal and the metaphorical comes from a study by Lawrence Williams of the University of Colorado and John Bargh of Yale. Volunteers would meet one of the experimenters, believing that they would be starting the experiment shortly. In reality, the experiment began when the experimenter, seemingly struggling with an armful of folders, asked the volunteer to briefly hold their coffee. As the key experimental manipulation, the coffee was either hot or iced. Subjects then read a description of some individual, and those who had held the warmer cup tended to rate the individual as having a warmer personality, with no change in ratings of other attributes.

Another brilliant study by Bargh and colleagues concerned haptic sensations (I had to look the word up — haptic: related to the sense of touch). Volunteers were asked to evaluate the resumes of supposed job applicants where, as the critical variable, the resume was attached to a clipboard of one of two different weights. Subjects who evaluated the candidate while holding the heavier clipboard tended to judge candidates to be more serious, with the weight of the clipboard having no effect on how congenial the applicant was judged. After all, we say things like “weighty matter” or “gravity of a situation.”

What are we to make of the brain processing literal and metaphorical versions of a concept in the same brain region? Or that our neural circuitry doesn’t cleanly differentiate between the real and the symbolic? What are the consequences of the fact that evolution is a tinkerer and not an inventor, and has duct-taped metaphors and symbols to whichever pre-existing brain areas provided the closest fit?

Jonathan Haidt, of the University of Virginia, has shown how viscera and emotion often drive our decisionmaking, with conscious cognition mopping up afterward, trying to come up with rationalizations for that gut decision. The viscera that can influence moral decisionmaking and the brain’s confusion about the literalness of symbols can have enormous consequences. Part of the emotional contagion of the genocide of Tutsis in Rwanda arose from the fact that when militant Hutu propagandists called for the eradication of the Tutsi, they iconically referred to them as “cockroaches.” Get someone to the point where his insula activates at the mention of an entire people, and he’s primed to join the bloodletting.

But if the brain confusing reality and literalness with metaphor and symbol can have adverse consequences, the opposite can occur as well. At one juncture just before the birth of a free South Africa, Nelson Mandela entered secret negotiations with an Afrikaans general with death squad blood all over his hands, a man critical to the peace process because he led a large, well-armed Afrikaans resistance group. They met in Mandela’s house, the general anticipating tense negotiations across a conference table. Instead, Mandela led him to the warm, homey living room, sat beside him on a comfy couch, and spoke to him in Afrikaans. And the resistance melted away.

This neural confusion about the literal versus the metaphorical gives symbols enormous power, including the power to make peace. The political scientist and game theorist Robert Axelrod of the University of Michigan has emphasized this point in thinking about conflict resolution. For example, in a world of sheer rationality where the brain didn’t confuse reality with symbols, bringing peace to Israel and Palestine would revolve around things like water rights, placement of borders, and the extent of militarization allowed to Palestinian police. Instead, argues Axelrod, “mutual symbolic concessions” of no material benefit will ultimately make all the difference. He quotes a Hamas leader who says that for the process of peace to go forward, Israel must apologize for the forced Palestinian exile in 1948. And he quotes a senior Israeli official saying that for progress to be made, Palestinians need to first acknowledge Israel’s right to exist and to get their anti-Semitic garbage out of their textbooks.

Hope for true peace in the Middle East didn’t come with the news of a trade agreement being signed. It was when President Hosni Mubarak of Egypt and King Hussein of Jordan attended the funeral of the murdered Israeli prime minister Yitzhak Rabin. That same hope came to the Northern Irish, not when ex-Unionist demagogues and ex-I.R.A. gunmen served in a government together, but when those officials publicly commiserated about each other’s family misfortunes, or exchanged anniversary gifts. And famously, for South Africans, it came not with successful negotiations about land reapportionment, but when black South Africa embraced rugby and Afrikaans rugby jocks sang the A.N.C. national anthem.

Nelson Mandela was right when he advised, “Don’t talk to their minds; talk to their hearts.” He meant talk to their insulas and cingulate cortices and all those other confused brain regions, because that confusion could help make for a better world.

(Robert Sapolsky’s essay is the subject of this week’s forum discussion among the humanists and scientists at On the Human, a project of the National Humanities Center.)



--------------------------------------------------------------------------------
Title: Religion & Cooperative Endeavors
Post by: Body-by-Guinness on December 01, 2010, 09:11:18 AM
Hmm, as an almost atheist who often falls into altruistic behaviors, I'm not sure where I fit in this picture:

http://reason.com/archives/2010/11/30/the-eleventh-commandment-punis
Reason Magazine


The Eleventh Commandment: Punish Free Riders

Religion and the evolutionary origin of cooperation

Ronald Bailey | November 30, 2010

Two of the deep puzzles in human evolution are religion and cooperation between genetically unrelated strangers. In recent years, many researchers have come to believe the two phenomena are intimately linked. If people believe they are being watched and judged by an omnipresent supernatural entity, they may be more willing to perform emotionally binding and costly rituals to signal commitment to a group. The same sense of being watched may also encourage people to be helpful to others—even when there is no obvious reproductive payoff. In other words: Science suggests that God—and His followers—hate free riders.

A 2007 study by University of British Columbia psychologists Azim Shariff and Ara Norenzayan found that players in an anonymous economic game were more generous if they were primed with religious concepts before beginning play. In this case, the subjects participated in the dictator game, in which they get to anonymously divvy up $10 between themselves and an unknown individual. The researchers assigned players to three groups. One group was primed with religious concepts by having them unscramble 10 five-word sentences, dropping an extraneous word from each to create a grammatical four-word sentence. For example, “dessert divine was fork the” would become “the dessert was divine.” The religious words were spirit, divine, God, sacred, and prophet. A second group was primed with words connoting secular moral institutions, e.g., civic, jury, court, police, and contract. The third group unscrambled sentences containing neutral words. So what did they find?

Earlier studies using the dictator game consistently found that subjects in general behaved selfishly by taking most of the money for themselves. In this case, players in the neutral game offered an average of $2.56 to other players. However, players who had been primed with religious concepts offered an average of $4.56. Interestingly, players primed with secular moral concepts offered $4.44, nearly as much as players exposed to religious primes. Self-reported belief in God was not a good predictor of generosity in the neutral prime version of the game; it seems believers needed reminders to be more generous.
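The size of the reported priming effect can be sketched as a toy simulation. The three mean offers come from the article; everything else below (the normal-offer assumption, the standard deviation, the sample size) is an illustrative choice, not a figure from the study.

```python
import random

# Mean offers (out of $10) reported in the article for each priming condition.
MEAN_OFFER = {"neutral": 2.56, "religious": 4.56, "secular": 4.44}

def simulate_offers(condition, n=10_000, sd=1.5, seed=0):
    """Draw hypothetical dictator-game offers, clamped to the $0-$10 range.
    The normal distribution and sd are assumptions for illustration only."""
    rng = random.Random(seed)
    mu = MEAN_OFFER[condition]
    return [min(10.0, max(0.0, rng.gauss(mu, sd))) for _ in range(n)]

def mean(xs):
    return sum(xs) / len(xs)

lift = mean(simulate_offers("religious")) - mean(simulate_offers("neutral"))
print(f"religious prime raises the mean offer by about ${lift:.2f}")
```

Under these assumptions the simulated lift lands near the $2.00 gap between the reported neutral and religious means, slightly shrunk because offers below $0 are clamped.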

But how do the invisible omnipresent gods encourage generosity to strangers? Of course, the gods can reward believers for good behavior, but they also punish them for bad behavior. It is how this aspect of religious belief affects cooperation that a team of researchers led by University of London psychologist Ryan McKay attempts to probe in a study released last week, “Wrath of God: Religious primes and punishment.”

One of the chief fears of people who want to cooperate is that they will be chumps who are taken advantage of by free riders. Earlier research using public goods economic games found that cooperation was considerably enhanced if players had an opportunity to punish free riders. In these games, players can invest in a common pool which then grows and is divvied up among all the players. Free riders, however, can make more money by refusing to invest and yet get a share of the growing pool. Research shows that cooperation breaks down completely when such free riders cannot be punished by other players. But when other players can pay to reduce the holdings of free riders, they begin to play fairly and cooperation dramatically increases.
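The free-rider dynamic these games capture can be sketched in a few lines. The endowment, pool multiplier, and token amounts below are illustrative choices, not figures from the studies; only the pay-to-punish structure (spend 1 to deduct 3) follows the article.

```python
def public_goods_payoffs(contributions, multiplier=1.6, endowment=20):
    """Each player keeps what she doesn't contribute, plus an equal share
    of the multiplied common pool (all parameter values are illustrative)."""
    share = sum(contributions) * multiplier / len(contributions)
    return [endowment - c + share for c in contributions]

def punish(payoffs, punisher, target, spend, ratio=3):
    """Costly punishment: the punisher pays `spend`; the target loses `ratio * spend`."""
    out = list(payoffs)
    out[punisher] -= spend
    out[target] -= ratio * spend
    return out

# Three cooperators contribute everything; one free rider contributes nothing.
pay = public_goods_payoffs([20, 20, 20, 0])
print(pay)  # the free rider (index 3) ends up ahead of the cooperators
pay = punish(pay, punisher=0, target=3, spend=7)
print(pay)  # after costly punishment, free riding no longer pays
```

The sketch shows both halves of the finding: without punishment the free rider earns strictly more than any cooperator, and a cooperator willing to pay a small cost can erase that advantage, which is what sustains cooperation in the experiments.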

In the new study, McKay and his colleagues sought to find out if religious priming promotes costly punishment of unfair behavior. In this experiment, one player could choose between splitting a pot of money evenly between herself and a second player or she could choose another option in which the split was about nine to one. If the second player believed the choice was unfair, she could punish the first player by spending a portion of her allocation to reduce the take of the first player at a rate of three to one, e.g., if she spent 50, the first player would lose 150. The players were subliminally primed by words flashing on a computer screen. Divided into four groups, one group was exposed to religious words, another to punishment words, the third to punishment and religious words, and the fourth to neutral words. Afterwards, players were asked about their religious beliefs and if they had donated to a religious organization in the past year.

The results? “Our study reveals that for those who financially support religious institutions, subliminal religious messages strongly increase the costly punishment of unfair behavior, even when such punishment is to their individual material disadvantage,” says McKay in a press release describing the research. Subliminal religious priming did not have a significant effect on other players.

So why does religious priming induce committed believers to punish unfair behavior? The researchers suggest two possibilities. The first is that religious primes trigger the idea that one is being watched by the gods. “In this case primed participants punish unfair behaviors because they sense that not doing so will damage their standing in the eyes of a supernatural agent,” they speculate. The second hypothesis is that religious primes “activate cultural norms pertaining to fairness and its enforcement and occasion behavior consistent with those norms.” McKay and his colleagues acknowledge that religious primes might actually invoke both mechanisms. In either case, while the gods may punish uncooperative sinners, their work is considerably enhanced if believers go out of their way to punish sinners too.

These studies do bolster the idea that ancestral belief in supernatural entities enhanced group cooperation, enabling believers to out-compete other groups. As Shariff and Norenzayan observe, “If the cultural spread of supernatural moralizing agents expanded the circle of cooperation to unrelated strangers, it may well have allowed small groups to grow into large-scale societies, from the early towns of Jericho and Ur to the metropolises of today.”

Ronald Bailey is Reason's science correspondent. His book Liberation Biology: The Scientific and Moral Case for the Biotech Revolution is now available from Prometheus Books.
Title: NYT: Smiles
Post by: Crafty_Dog on January 25, 2011, 05:57:29 AM
In the middle of a phone call four years ago, Paula Niedenthal began to wonder what it really means to smile. The call came from a Russian reporter, who was interviewing Dr. Niedenthal about her research on facial expressions.

“At the end he said, ‘So you are American?’ ” Dr. Niedenthal recalled.
Indeed, she is, although she was then living in France, where she had taken a post at Blaise Pascal University.

“So you know,” the Russian reporter informed her, “that American smiles are all false, and French smiles are all true.”

“Wow, it’s so interesting that you say that,” Dr. Niedenthal said diplomatically. Meanwhile, she was imagining what it would have been like to spend most of her life surrounded by fake smiles.

“I suddenly became interested in how people make these kinds of errors,” Dr. Niedenthal said. But finding the source of the error would require knowing what smiles really are — where they come from and how people process them. And despite the fact that smiling is one of the most common things that we humans do, Dr. Niedenthal found science’s explanation for it to be weak.

“I think it’s pretty messed up,” she said. “I think we don’t know very much, actually, and it’s something I want to take on.”

To that end, Dr. Niedenthal and her colleagues have surveyed a wide range of studies, from brain scans to cultural observations, to build a new scientific model of the smile. They believe they can account not only for the source of smiles, but how people perceive them. In a recent issue of the journal Behavioral and Brain Sciences, they argue that smiles are not simply the expression of an internal feeling. Smiles in fact are only the most visible part of an intimate melding between two minds.

“It’s an impressive, sophisticated analysis,” said Adam Galinsky, a social psychologist at Northwestern University.

Psychologists have studied smiles carefully for decades, but mostly from the outside. When the zygomaticus major muscles in our cheeks contract, they draw up the corners of our mouths. But there’s much more to a smile than that.

“A smile is not this floating thing, like a Cheshire Cat,” said Dr. Niedenthal. “It’s attached to a body.” Sometimes the lips open to reveal teeth; sometimes they stay sealed. Sometimes the eyes crinkle. The chin rises with some smiles, and drops in others.

Cataloging these variations is an important first step, said Dr. Niedenthal, but it can’t deliver an answer to the enigma of smiles. “People like to make dictionaries of the facial muscles to make a particular gesture, but there’s no depth to that approach,” she said.

Some researchers have tried to move deeper, to understand the states of mind that produce smiles. We think of them as signifying happiness, and indeed, researchers do find that the more intensely people contract their zygomaticus major muscles, the happier they say they feel. But this is far from an iron law. The same muscles sometimes contract when people are feeling sadness or disgust, for example.

The link between feelings and faces is even more mysterious. Why should any feeling cause us to curl up our mouths, after all? This is a question that Darwin pondered for years. An important clue, he said, is found in the faces of apes, which draw up their mouths as well. These expressions, Darwin argued, were also smiles. In other words, Mona Lisa inherited her endlessly intriguing smile from the grinning common ancestor she shared with chimpanzees.

Primatologists have been able to sort smiles into a few categories, and Dr. Niedenthal thinks that human smiles should be classified in the same way. Chimpanzees sometimes smile from pleasure, as when baby chimps play with each other. But chimpanzees also smile when they’re trying to strengthen a social bond with another chimpanzee.

Dr. Niedenthal thinks that some human smiles fall into these categories as well. What’s more, they may be distinguished by certain expressions. An embarrassed smile is often accompanied by a lowered chin, for example, while a smile of greeting often comes with raised eyebrows.

Chimpanzees sometimes smile not for pleasure or for a social bond, but for power. A dominant chimpanzee will grin and show its teeth. Dr. Niedenthal argues that humans flash a power grin as well — often raising their chin so as to look down at others.

“ ‘You’re an idiot, I’m better than you’—that’s what we mean by a dominant smile,” said Dr. Niedenthal.

But making a particular facial expression is just the first step of a smile. Dr. Niedenthal argues that how another person interprets the smile is equally important. In her model, the brain can use three different means to distinguish a smile from some other expression.

One way people recognize smiles is by comparing the geometry of a person’s face to a standard smile. A second is by thinking about the situation in which someone is making an expression, judging if it’s the sort where a smile would be expected.

But most importantly, Dr. Niedenthal argues, people recognize smiles by mimicking them. When a smiling person locks eyes with another person, the viewer unknowingly mimics a smile as well. In their new paper, Dr. Niedenthal and her colleagues point to a number of studies indicating that this imitation activates many of the same regions of the brain that are active in the smiler.

A happy smile, for example, is accompanied by activity in the brain’s reward circuits, and looking at a happy smile can excite those circuits as well. Mimicking a friendly smile produces a different pattern of brain activity. It activates a region of the brain called the orbitofrontal cortex, which distinguishes feelings for people with whom we have a close relationship from others. The orbitofrontal cortex becomes active when parents see their own babies smile, for example, but not other babies.

If Dr. Niedenthal’s model is correct, then studies of dominant smiles should reveal different patterns of brain activity. Certain regions associated with negative emotions should become active.

Embodying smiles not only lets people recognize smiles, Dr. Niedenthal argues. It also lets them recognize false smiles. When they unconsciously mimic a false smile, they don’t experience the same brain activity as an authentic one. The mismatch lets them know something’s wrong.

Other experts on facial expressions applaud Dr. Niedenthal’s new model, but a number of them also think that parts of it require fine-tuning. “Her model fits really well along the horizontal dimension, but I have my doubts about the vertical,” said Dr. Galinsky. He questions whether people observing a dominant smile would experience the feeling of power themselves. In fact, he points out, in such encounters, people tend to avoid eye contact, which Dr. Niedenthal says is central to her model.

Dr. Niedenthal herself is now testing the predictions of the model with her colleagues. In one study, she and her colleagues are testing the idea that mimicry lets people recognize authentic smiles. They showed pictures of smiling people to a group of students. Some of the smiles were genuine and others were fake. The students could readily tell the difference between them.

Then Dr. Niedenthal and her colleagues asked the students to place a pencil between their lips. This simple action engaged muscles that could otherwise produce a smile. Unable to mimic the faces they saw, the students had a much harder time telling which smiles were real and which were fake.

The scientists then ran a variation on the experiment on another group of students. They showed the same faces to the second group, but had them imagine the smiling faces belonged to salesclerks in a shoe store. In some cases the salesclerks had just sold the students a pair of shoes — in which they might well have a genuine smile of satisfaction. In other trials, they imagined that the salesclerks were trying to sell them a pair of shoes — in which case they might be trying to woo the customer with a fake smile.

In reality, the scientists used a combination of real and fake smiles for both groups of salesclerks. When the students were free to mimic the smiles, their judgments were not affected by what the salesclerk was doing.

But if the students put a pencil in their mouth, they could no longer rely on their mimicry. Instead, they tended to believe that the salesclerks who were trying to sell them shoes were faking their smiles — even when their smiles were genuine. Likewise, they tended to say that the salesclerks who had finished the sale were smiling for real, even when they weren’t. In other words, they were forced to rely on the circumstances of the smile, rather than the smile itself.

Dr. Niedenthal and her colleagues have also been testing the importance of eye contact for smiles. They had students look at a series of portraits, like the “Laughing Cavalier” by the 17th-century artist Frans Hals. In some portraits the subject looked away from the viewer, while in others, the gaze was eye to eye. In some trials, the students looked at the paintings with bars masking the eyes.

The participants rated how emotional the impact of the painting was. Dr. Niedenthal and her colleagues found, as they had predicted, that people felt a bigger emotional impact when the eyes were unmasked than when they were masked. The smile was identical in each painting, but it was not enough on its own. What’s more, the differences were greater when the portrait face was making direct eye contact with the viewer.

Dr. Niedenthal suspects that she and other psychologists are just starting to learn secrets about smiles that artists figured out centuries ago. It may even be possible someday to understand why Mona Lisa’s smile is so powerful. “I would say the reason it was so successful is because you achieve eye contact with her,” said Dr. Niedenthal, “and so the fact that the meaning of her smile is complicated is doubly communicated, because your own simulation of it is mysterious and difficult.”
Title: Connectomics
Post by: Body-by-Guinness on April 12, 2011, 06:22:20 PM
Connectomics
Published by Steven Novella under Neuroscience
There are approximately 100 billion neurons in the adult human brain. Each neuron makes thousands of connections to other neurons, resulting in approximately 150 trillion connections in the human brain. The pattern of those connections is largely responsible for the functionality of the brain – everything we sense, feel, think, and do. Neuroscientists are attempting to map those connections – in an effort known as connectomics. (Just as genomics is the effort to map the genome, and proteomics is mapping all the proteins that make up an organism.)
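The headline numbers are consistent with a quick back-of-the-envelope check. The figure of 1,500 connections per neuron is not stated in the post; it is the value implied by the two cited totals, assumed here for illustration.

```python
neurons = 100e9                  # ~100 billion neurons in the adult human brain
connections_per_neuron = 1_500   # "thousands"; the value that makes the totals agree
total_connections = neurons * connections_per_neuron
print(f"{total_connections:.1e}")  # 1.5e+14, i.e. the ~150 trillion connections cited
```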
This is no small task. No matter how you look at it, 150 trillion is a lot of connections. One research group working on this project is a team led by Thomas Mrsic-Flogel at the University College London. They recently published a paper in Nature in which they map some of the connections in the mouse visual cortex.
What they did was to first determine the function of specific areas and neurons in the mouse visual cortex in living mice. For example, they determined which orientation each neuron is sensitive to. In the visual cortex, different neurons respond to different orientations (vertical vs horizontal, for example). Once they had mapped the orientation preferences of the neurons, they then mapped the connections between those neurons in vitro (after removing the brain). They found that neurons made more connections to other neurons with the same orientation preference than to neurons tuned to different (orthogonal) orientations.
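The qualitative finding can be caricatured as a simple wiring rule: like-tuned neurons connect with higher probability than orthogonally tuned ones. Every number below (neuron count, the two connection probabilities, the two-orientation simplification) is an illustrative assumption, not a measurement from the paper.

```python
import random

def build_connectome(n=200, p_same=0.3, p_orthogonal=0.1, seed=1):
    """Toy connectome: each neuron gets a preferred orientation (0 or 90
    degrees), and same-preference pairs connect with higher probability.
    All probabilities are illustrative, not measured values."""
    rng = random.Random(seed)
    pref = [rng.choice([0, 90]) for _ in range(n)]
    edges = set()
    for i in range(n):
        for j in range(n):
            if i != j:
                p = p_same if pref[i] == pref[j] else p_orthogonal
                if rng.random() < p:
                    edges.add((i, j))
    return pref, edges

pref, edges = build_connectome()
same = sum(1 for i, j in edges if pref[i] == pref[j])
print(same / len(edges))  # well over half the edges link like-tuned neurons
```

With the probabilities chosen here, roughly three-quarters of the edges end up linking like-tuned neurons, which is the kind of bias the in vitro mapping revealed.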
The techniques used allowed them to make a map of connections in part of the mouse visual cortex and correlate the pattern of those connections to the functionality of that cortex. The resulting connectomics map is still partial and crude, but it is a step in the direction of reproducing the connections in the brain.
One way to think about these kinds of techniques is that they promise to take us a level deeper in our understanding of brain anatomy. At present we have mapped the mammalian, and specifically human, brain to the point that we can identify specific regions of the brain and link them to some specific function. For the more complex areas of the brain we are still refining our map of these brain modules and the networks they form.
To give an example of where we are with this, clinical neurologists are often able to predict where a stroke is located simply by the neurological exam. We can correlate specific deficits with known brain structures, and the availability of MRI scanning means that we get rapid and precise feedback on our accuracy. We are very good at localizing deficits of strength, sensation, vision, and also many higher cortical functions like language, calculations, visuo-spatial reasoning, performing learned motor tasks, and others.
But we are still a long way from being able to reproduce the connections in the brain in fine detail – say, with sufficient accuracy to produce a virtual brain in a computer simulation (even putting aside the question of computing power). And that is exactly the goal of connectomics.
Along the way these research efforts will increase our knowledge of brain anatomy and function, as we learn exactly how different brain regions connect to each other and correlate them with specific functions. Neuroscientists are still picking the low-hanging fruit, such as mapping the visual cortex, which has some straightforward organization that correlates with concepts that are easy to identify and understand – like mapping to an actual layout of the visual field, and to specific features of vision such as contrast and orientation.
For more abstract areas of the brain – those involved in planning, making decisions, directing our attention, feeling as if we are inside our own bodies, etc. – connectomics is likely to be more challenging. Right now we are mainly using fMRI scans for these kinds of studies, which have been very successful but do not produce a fine map of connections (more of a brain-region map). Also, the more abstract the function, the more difficult it will be to use mice or other animals as subjects, and when using humans you cannot use certain techniques, like removing the brain and slicing it up (at least not on living subjects).
The utility of this kind of research is a better understanding of brain function, and all that flows from that. We cannot anticipate all the potential benefits, and the most fruitful outcome may derive from knowledge we are not even aware we are missing.
This also plays into the research efforts to create a virtual representation of the human brain, complete with all the connections. This is one pathway to artificial intelligence. Estimates vary, but it seems like we will have the computer power sometime this century to create a virtual human brain that can function in real time, and then, of course, become progressively faster.
I should note that the connections among neurons in the brain are not the only feature that contributes to brain function. The astrocytes and other “support” cells also contribute to brain function. There is also a biochemical level to brain function – the availability of specific neurotransmitters, for example. So even if we could completely reproduce the neuronal connections in the brain, there are other layers of complexity superimposed upon this.
 
In any case, this is fascinating research and it will be nice to see how it progresses over the next few decades.

http://theness.com/neurologicablog/?p=3096
Title: WSJ: One language mother of all others?
Post by: Crafty_Dog on April 14, 2011, 03:35:38 PM
By GAUTAM NAIK
The world's 6,000 or so modern languages may have all descended from a single ancestral tongue spoken by early African humans between 50,000 and 70,000 years ago, a new study suggests.

The finding, published Thursday in the journal Science, could help explain how the first spoken language emerged, spread and contributed to the evolutionary success of the human species.

Quentin Atkinson, an evolutionary psychologist at the University of Auckland in New Zealand and author of the study, found that the first migrating populations leaving Africa laid the groundwork for all the world's cultures by taking their single language with them—the mother of all mother tongues.

"It was the catalyst that spurred the human expansion that we all are a product of," Dr. Atkinson said.

About 50,000 years ago—the exact timeline is debated—there was a sudden and marked shift in how modern humans behaved. They began to create cave art and bone artifacts and developed far more sophisticated hunting tools. Many experts argue that this unusual spurt in creative activity was likely caused by a key innovation: complex language, which enabled abstract thought. The work done by Dr. Atkinson supports this notion.

His research is based on phonemes, distinct units of sound such as vowels, consonants and tones, and an idea borrowed from population genetics known as "the founder effect." That principle holds that when a very small number of individuals break off from a larger population, there is a gradual loss of genetic variation and complexity in the breakaway group.

Dr. Atkinson figured that if a similar founder effect could be discerned in phonemes, it would support the idea that modern verbal communication originated on that continent and only then expanded elsewhere.

In an analysis of 504 world languages, Dr. Atkinson found that, on average, dialects with the most phonemes are spoken in Africa, while those with the fewest phonemes are spoken in South America and on tropical islands in the Pacific.

The study also found that the pattern of phoneme usage globally mirrors the pattern of human genetic diversity, which also declined as modern humans set up colonies elsewhere. Today, areas such as sub-Saharan Africa that have hosted human life for millennia still use far more phonemes in their languages than more recently colonized regions do.

"It's a wonderful contribution and another piece of the mosaic" supporting the out-of-Africa hypothesis, said Ekkehard Wolff, professor emeritus of African Languages and Linguistics at the University of Leipzig in Germany, who read the paper.

Dr. Atkinson's findings are consistent with the prevailing view of the origin of modern humans, known as the "out of Africa" hypothesis. Bolstered by recent genetic evidence, it says that modern humans emerged in Africa alone, about 200,000 years ago. Then, about 50,000 to 70,000 years ago, a small number of them moved out and colonized the rest of the world, becoming the ancestors of all non-African populations on the planet.

The origin of early languages is fuzzier. Truly ancient languages haven't left empirical evidence that scientists can study. And many linguists believe it is hard to say anything definitive about languages prior to 8,000 years ago, as their relationships would have become jumbled over the millennia.

But the latest Science paper "and our own observations suggest that it is possible to detect an arrow of time" underlying proto-human languages spoken more than 8,000 years ago, said Murray Gell-Mann of the Santa Fe Institute in New Mexico, who read the Science paper and supports it. The "arrow of time" is based on the notion that it is possible to use data from modern languages to trace their origins back 10,000 years or even further.

Dr. Gell-Mann, a Nobel Prize-winning physicist with a keen interest in historical linguistics, is co-founder of a project known as Evolution of Human Languages. He concedes that his "arrow of time" view is a minority one.

Only humans have the biological capacity to communicate with a rich language based on symbols and rules, enabling us to pass on cultural ideas to future generations. Without language, culture as we know it wouldn't exist, so scientists are keen to pin down where it sprang from.

Dr. Atkinson's approach has its limits. Genes change slowly, over many generations, while the diversity of phonemes amid a population group can change rapidly as language evolves. While distance from Africa can explain as much as 85% of the genetic diversity of populations, a similar distance measurement can explain only 19% of the variation in phonemic diversity. Dr. Atkinson said the measure is still statistically significant.
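The 85% and 19% figures above are variance-explained (R-squared) values from regressions of diversity against distance from Africa. Here is a minimal sketch of that kind of calculation, using invented toy numbers rather than Dr. Atkinson's 504-language dataset:

```python
def ols_fit(xs, ys):
    """Ordinary least-squares fit y = a + b*x, returning (a, b, r_squared)."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    b = sxy / sxx          # slope
    a = my - b * mx        # intercept
    ss_res = sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return a, b, 1.0 - ss_res / ss_tot

# Hypothetical data: distance from Africa (thousands of km) vs phoneme count.
distance = [0, 5, 10, 15, 20]
phonemes = [140, 110, 90, 60, 45]
intercept, slope, r2 = ols_fit(distance, phonemes)
# A negative slope means phoneme diversity declines with distance,
# and r2 is the fraction of variation the distance measure explains.
```

With these made-up numbers the fit is nearly perfect; in the real data the relationship is much noisier (the 19% figure), which is why its statistical significance, not its tightness, carries the argument.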

Another theory of the origin of modern humans, known as the multiregional hypothesis, holds that earlier forms of humans originated in Africa and then slowly developed their anatomically modern form in every area of the Old World. This scenario implies that several variants of modern human language could have emerged somewhat independently in different locations, rather than solely in Africa.

Early migrants from Africa probably had to battle significant odds. A founder effect on a breakaway human population tends to reduce its size, genetic complexity and fitness. A similar effect could have limited "the size and cultural complexity of societies at the vanguard of the human expansion" out of Africa, the paper notes.

Write to Gautam Naik at gautam.naik@wsj.com

Title: Re: Evolutionary biology/psychology
Post by: tim nelson on April 15, 2011, 03:18:40 PM
I highly recommend books by Paul Shephard. Nature and Madness, The Tender Carnivore, and Coming Home to the Pleistocene. He was way ahead of his time in my opinion.

He is a proponent of the hunter-gatherer lifestyle being the healthiest. He spends a lot of time comparing our development with that of other primates – socially, physically, nutritionally, etc. – and with other large predators, both social and less social. I liked his idea that we developed as quite a mix: digestive systems like those of true omnivores (raccoons, bears, boars), but a social structure and culture that was more wolf-like. Modern human males especially evolved spending considerable time hunting large game whenever and wherever we lived, so the male psyche was shaped by the complex demands of solving a moving problem, coordinating with others, and that kind of exertion. Without large game to hunt we crave comparable experiences – such as hunting humans in war, fighting, etc.

Anyway, I liked most of his stuff. Lots of convincing evidence, and he is upfront about his bias in favor of hunter-gatherer cultures, which helps.
Title: Welcome to the Family Tree
Post by: Body-by-Guinness on April 22, 2011, 11:19:32 AM
A New Hominin – A. sediba
Published by Steven Novella under Evolution
Following the branching bush of human evolution is getting increasingly difficult. When I studied human evolution in college, things were much simpler. There were a few Australopithecus species followed by a few Homo species, leading to modern humans. It was recognized at the time that these fossil species probably did not represent a nice clean straight line to Homo sapiens, but it seems the family tree has become much bushier than was imagined at the time.
Here is a recent representation of the hominin family tree. We have added more species of Australopithecus and Homo, plus the new genera Kenyanthropus and Paranthropus (not even including older genera that predate Australopithecus).
Now researchers have announced the discovery of yet another species of early hominin, about 2 million years old – likely a late species of Australopithecus named A. sediba. They discovered four individuals – two adults, a child and an infant, who likely fell into a “death trap”  in a cave in what is now Malapa, South Africa.
Each bit of fossil evidence is like a piece to a complex puzzle. As more pieces fit into place, however, the picture becomes more complex and more questions are generated. We are still at the stage where new evidence generates more questions than answers, and we have no idea how complex the final picture that emerges will be.
The new discovery is no exception. A. sediba has a mixture of modern (Homo) and primitive (Australopithecine) traits. It has a small brain like a primitive Australopithecus, but has pelvic structure and hand features that are more modern than other members of the genus.
It should also be noted that the first members of the Homo genus arose about a million years before the age of these specific specimens – so these individuals do not represent a population that is ancestral to our genus.
As always, there are multiple ways to interpret this data. It is possible that A. sediba is the ancestral Australopithecine species that led to Homo – either directly, or closely related to that species (yet to be discovered). In this case, these individuals would be later representatives of that species. Species often persist, even for millions of years, after other species branch off from them. So it is always possible to find representatives of an ancestral species that are more recent than the species that evolved from them.
It is also possible that A. sediba is a separate line of Australopithecines that did not lead to Homo, but developed some similar features. In this case the “modern” features in A. sediba would be analogous to (similar to, but not ancestral to) the modern feature, rather than homologous to (related through evolutionary derivation) the modern Homo features.
Another possibility that was not mentioned in the Science article that I linked to is that these individuals, and possibly A. sediba as a species, or perhaps just one breeding population, represent the results of interbreeding between Homo and Australopithecus species. In this case modern features would literally have been mixed together with more primitive features in A. sediba.
This adds a new layer of complexity to our picture of the human family tree (or any family tree). When species divide the separation is not clean, and later remixing of genes is not only possible but probable. There is genetic evidence, for example, of later mixing of genes between human ancestors and chimpanzee ancestors after the split. So it’s not a stretch to think that hominin populations were at least occasionally interbreeding.
I suspect there are many more hominin species and subspecies to be discovered. The picture that is emerging is fascinating, if it is becoming increasingly difficult to keep track of it all. I’ll just have to muddle through.

http://theness.com/neurologicablog/?p=3139
Title: Belly Bacteria & the Brain
Post by: Body-by-Guinness on April 22, 2011, 12:20:57 PM
Second post.

The Neuroscience of the Gut
Strange but true: the brain is shaped by bacteria in the digestive tract
By Robert Martone | Tuesday, April 19, 2011

Researchers track the gut-brain connection
People may advise you to listen to your gut instincts: now research suggests that your gut may have more impact on your thoughts than you ever realized. Scientists from the Karolinska Institute in Sweden and the Genome Institute of Singapore led by Sven Pettersson recently reported in the Proceedings of the National Academy of Sciences that normal gut flora, the bacteria that inhabit our intestines, have a significant impact on brain development and subsequent adult behavior.

We human beings may think of ourselves as a highly evolved species of conscious individuals, but we are all far less human than most of us appreciate. Scientists have long recognized that the bacterial cells inhabiting our skin and gut outnumber human cells by ten-to-one. Indeed, Princeton University scientist Bonnie Bassler compared the approximately 30,000 genes found in the average human genome to the more than 3 million bacterial genes inhabiting us, concluding that we are at most one percent human. We are only beginning to understand the sort of impact our bacterial passengers have on our daily lives.

Moreover, these bacteria have been implicated in the development of neurological and behavioral disorders. For example, gut bacteria may have an influence on the body’s use of vitamin B6, which in turn has profound effects on the health of nerve and muscle cells. They modulate immune tolerance and, because of this, they may have an influence on autoimmune diseases, such as multiple sclerosis. They have been shown to influence anxiety-related behavior, although there is controversy regarding whether gut bacteria exacerbate or ameliorate stress related anxiety responses. In autism and other pervasive developmental disorders, there are reports that the specific bacterial species present in the gut are altered and that gastrointestinal problems exacerbate behavioral symptoms. A newly developed biochemical test for autism is based, in part, upon the end products of bacterial metabolism.

But this new study is the first to extensively evaluate the influence of gut bacteria on the biochemistry and development of the brain. The scientists raised mice lacking normal gut microflora, then compared their behavior, brain chemistry and brain development to mice having normal gut bacteria. The microbe-free animals were more active and, in specific behavioral tests, were less anxious than microbe-colonized mice. In one test of anxiety, animals were given the choice of staying in the relative safety of a dark box, or of venturing into a lighted box. Bacteria-free animals spent significantly more time in the light box than their bacterially colonized littermates. Similarly, in another test of anxiety, animals were given the choice of venturing out on an elevated and unprotected bar to explore their environment, or remain in the relative safety of a similar bar protected by enclosing walls. Once again, the microbe-free animals proved themselves bolder than their colonized kin.

Pettersson’s team next asked whether the influence of gut microbes on the brain was reversible and, since the gut is colonized by microbes soon after birth, whether there was evidence that gut microbes influenced the development of the brain. They found that colonizing an adult germ-free animal with normal gut bacteria had no effect on its behavior. However, if germ-free animals were colonized early in life, these effects could be reversed. This suggests that there is a critical period in the development of the brain when the bacteria are influential.

Consistent with these behavioral findings, two genes implicated in anxiety -- nerve growth factor-inducible clone A (NGF1-A) and brain-derived neurotrophic factor (BDNF) -- were found to be down-regulated in multiple brain regions in the germ-free animals. These changes in behavior were also accompanied by changes in the levels of several neurotransmitters, chemicals which are responsible for signal transmission between nerve cells. The neurotransmitters dopamine, serotonin and noradrenaline were elevated in a specific region of the brain, the striatum, which is associated with the planning and coordination of movement and which is activated by novel stimuli, while there were no such effects on neurotransmitters in other brain regions, such as those involved in memory (the hippocampus) or executive function (the frontal cortex).

When Pettersson’s team performed a comprehensive gene expression analysis of five different brain regions, they found nearly 40 genes that were affected by the presence of gut bacteria. Not only were these primitive microbes able to influence signaling between nerve cells while sequestered far away in the gut, they had the astonishing ability to influence whether brain cells turn on or off specific genes.   

How, then, do these single-celled intestinal denizens exert their influence on a complex multicellular organ such as the brain? Although the answer is unclear, there are several possibilities: the Vagus nerve, for example, connects the gut to the brain, and it’s known that infection with the Salmonella bacteria stimulates the expression of certain genes in the brain, which is blocked when the Vagus nerve is severed. This nerve may be stimulated as well by normal gut microbes, and serve as the link between them and the brain. Alternatively, those microbes may modulate the release of chemical signals by the gut into the bloodstream which ultimately reach the brain. These gut microbes, for example, are known to modulate stress hormones which may in turn influence the expression of genes in the brain.

Regardless of how these intestinal “guests” exert their influence, these studies suggest that brain-directed behaviors, which influence the manner in which animals interact with the external world, may be deeply influenced by that animal’s relationship with the microbial organisms living in its gut. And the discovery that gut bacteria exert their influence on the brain within a discrete developmental stage may have important implications for developmental brain disorders.

http://www.scientificamerican.com/article.cfm?id=the-neuroscience-of-gut
Title: Language is Innate?
Post by: Body-by-Guinness on April 26, 2011, 08:11:03 PM
Baby Language
Published by Steven Novella under Neuroscience
Recent studies demonstrate that babies 12-18 months old have similar activity in their brains in response to spoken words as do adults, a fact that tells us a lot about the development of language function.
In the typical adult brain language function is primarily carried out in highly specialized parts of the brain – Wernicke’s area (in the dominant, usually left, superior temporal lobe) processes words into concepts and concepts into words, while Broca’s area (in the dominant posterior-inferior frontal lobe) controls speech output. The two areas are connected by the arcuate fasciculus and are fed by both auditory and visual input. Taken as a whole this part of the brain functions as the language cortex. A stroke or other damage to this area in an adult results in loss of one or more aspects of speech, depending on the extent of damage.
Damage to this part of the brain in babies, however, does not have the same effect. When such children grow up they are able to develop essentially normal language function. There are two prevailing theories to explain this. The first is that language function is more widely distributed in infants than in adults, perhaps also involving the same structures on the non-dominant side of the brain. As the brain matures language function becomes confined to the primary language cortex.
The second theory is that brain plasticity allows non-damaged parts of the brain to take over function for the language cortex. Such plasticity exists even in adult brains, but is vastly more significant in babies, whose brains are still developing and wiring themselves. There is still a lot of raw brain material that has not fully specialized yet that can be coopted for whatever functions are needed.
The new research has implications for this debate. If the former theory is correct, then babies who are just learning language would activate their brains more broadly than adults in response to language. If babies show a similar pattern of activation, that would support the plasticity theory.
This latest research firmly supports plasticity as the answer. Researchers at the University of California used functional MRI scans and magnetoencephalography (MEG) to look at the brain activity of 12-18 month old children in response to spoken words. They found that their primary language cortex lit up in a similar pattern to adults. They further tested to see if the children had any sense of the meaning of the words. They showed pictures of common objects with either a correct or incorrect spoken word. The children showed increased language area activity when the words were incongruous to the picture – and the researchers showed this is the same increase in activity as seen in adults.
What this research implies is that the genetic program for brain design comes into effect very early in brain development. The language cortex is destined to be language cortex right from the beginning, as long as nothing goes wrong with this process.
It should also be noted that this study looked only at the response to individual words. What it says about the 12-18 month old stage of development is that children of this age are already programming their language areas and storing up words and their meanings. This research did not look at other aspects of language, such as grammar – the ability to string multiple words together in a specific way in order to create meaning. It also did not look at the visual processing of written words.
Any parent of young children will likely remember with great detail the functional language development of their own children. At this age, and even younger than 12 months, children do seem to be sponges for language. Once they start learning words, they do so very quickly. Young children also seem to understand far more words than they can say. I don’t think this is mere confirmation bias (although that would tend to exaggerate the appearance of word acquisition), and research bears out that children can understand many more words than they can say. The ability to speak comes a bit later than the ability to assign meaning to specific words.
I remember that I played games with my children when they were about one year old, and still in the babbling stage. They could reliably, for example, retrieve specific toys by name (being very careful to avoid the Clever Hans effect). I remember, in fact, being very surprised at how well they performed – they seemed to understand many more words than I would have thought given the rudimentary nature of their babbling. In this case, careful research confirms subjective experience – children learn to understand words spoken to them before they gain the ability to say them.
This makes sense from the point of view that it is very neurologically difficult to articulate. We take it for granted, but it does require dedicated cortex to pull off this feat. Also, think about how easy it is to become dysarthric – we start slurring our words even when we are just a little sleep deprived, or with a moderate amount of alcohol. It does seem to be a function that goes early when brain function is even slightly compromised, which says something about how demanding it is.
One more tangential point – it also strikes me that we tend to naively judge what is going on in people’s heads by what is coming out of their mouths. This is not unreasonable in most circumstances, but there are many reasons why people may be more mentally sharp than is evidenced by their articulation. Young children are just one example – they may be babbling with their mouths, but there is more linguistically going on in their brains.

http://theness.com/neurologicablog/?p=2711
Title: The Dishonest Minority
Post by: Crafty_Dog on May 17, 2011, 11:08:55 AM

      Status Report: "The Dishonest Minority"



Three months ago, I announced that I was writing a book on why security
exists in human societies.  This is basically the book's thesis statement:

     All complex systems contain parasites.  In any system of
     cooperative behavior, an uncooperative strategy will be effective
     -- and the system will tolerate the uncooperatives -- as long as
     they're not too numerous or too effective. Thus, as a species
     evolves cooperative behavior, it also evolves a dishonest minority
     that takes advantage of the honest majority.  If individuals
     within a species have the ability to switch strategies, the
     dishonest minority will never be reduced to zero.  As a result,
     the species simultaneously evolves two things: 1) security systems
     to protect itself from this dishonest minority, and 2) deception
     systems to successfully be parasitic.

     Humans evolved along this path.  The basic mechanism can be
     modeled simply.  It is in our collective group interest for
     everyone to cooperate. It is in any given individual's short-term
     self-interest not to cooperate: to defect, in game theory terms.
     But if everyone defects, society falls apart.  To ensure
     widespread cooperation and minimal defection, we collectively
     implement a variety of societal security systems.

     Two of these systems evolved in prehistory: morals and reputation.
     Two others evolved as our social groups became larger and more
     formal: laws and technical security systems.  What these security
     systems do, effectively, is give individuals incentives to act in
     the group interest.  But none of these systems, with the possible
     exception of some fanciful science-fiction technologies, can ever
     bring that dishonest minority down to zero.

     In complex modern societies, many complications intrude on this
     simple model of societal security. Decisions to cooperate or
     defect are often made by groups of people -- governments,
     corporations, and so on -- and there are important differences
     because of dynamics inside and outside the groups. Much of our
     societal security is delegated -- to the police, for example --
     and becomes institutionalized; the dynamics of this are also
     important.

     Power struggles over who controls the mechanisms of societal
     security are inherent: "group interest" rapidly devolves to "the
     king's interest."  Societal security can become a tool for those
     in power to remain in power, with the definition of "honest
     majority" being simply the people who follow the rules.

     The term "dishonest minority" is not a moral judgment; it simply
     describes the minority who does not follow societal norms.  Since
     many societal norms are in fact immoral, sometimes the dishonest
     minority serves as a catalyst for social change.  Societies
     without a reservoir of people who don't follow the rules lack an
     important mechanism for societal evolution.  Vibrant societies
     need a dishonest minority; if society makes its dishonest minority
     too small, it stifles dissent as well as common crime.
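The cooperate-or-defect logic in the thesis can be sketched with a standard public goods game (my illustration, not Schneier's model): each individual does better by defecting whatever the others do, yet universal defection leaves everyone worse off than universal cooperation. The `public_goods_payoffs` function and its endowment and multiplier values are invented for the sketch.

```python
def public_goods_payoffs(n, contributors, endowment=1.0, multiplier=1.6):
    """Each contributor puts their endowment into a common pot; the pot is
    multiplied and shared equally among all n players.
    Returns (payoff per contributor, payoff per defector)."""
    pot = contributors * endowment * multiplier
    share = pot / n
    # A contributor gives up their endowment and gets back only the share;
    # a defector keeps the endowment AND collects the share.
    return share, endowment + share

coop_each, _ = public_goods_payoffs(10, 10)          # all cooperate
_, defect_each = public_goods_payoffs(10, 0)         # all defect
lone_coop, lone_defect = public_goods_payoffs(10, 9) # one free rider
```

With these numbers the lone defector earns more than any cooperator, yet the all-defect society earns less per person than the all-cooperate one: exactly the gap that morals, reputation, laws, and technical security systems exist to close.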

At this point, I have most of a first draft: 75,000 words.  The
tentative title is still "The Dishonest Minority: Security and its Role
in Modern Society."  I have signed a contract with Wiley to deliver a
final manuscript in November for February 2012 publication.  Writing a
book is a process of exploration for me, and the final book will
certainly be a little different -- and maybe even very different -- from
what I wrote above.  But that's where I am today.

And it's why my other writings -- and the issues of Crypto-Gram --
continue to be sparse.

Lots of comments -- over 200 -- to the blog post.  Please comment there;
I want the feedback.
http://www.schneier.com/blog/archives/2011/02/societal_securi.html
Title: Re: Evolutionary biology/psychology
Post by: Crafty_Dog on August 19, 2011, 11:19:39 AM
TTT for the attention of Richard Lighty
Title: Re: Evolutionary biology/psychology
Post by: prentice crawford on September 08, 2011, 04:10:09 PM
 

  Closest Human Ancestor May Rewrite Steps in Our Evolution
By Charles Q. Choi, LiveScience Contributor

  A startling mix of human and primitive traits found in the brains, hips, feet and hands of an extinct species identified last year makes a strong case for it being the immediate ancestor to the human lineage, scientists have announced.

These new findings could rewrite long-standing theories about the precise steps human evolution took, they added, including the notion that early human female hips changed shape to accommodate larger-brained offspring. There is also new evidence suggesting that this species had the hands of a toolmaker.

Fossils of the extinct hominid known as Australopithecus sediba were accidentally discovered by the 9-year-old son of a scientist in the remains of a cave in South Africa in 2008, findings detailed by researchers last year. Australopithecus means "southern ape," and is a group that includes the iconic fossil Lucy, while sediba means "wellspring" in the South African language Sotho. [See images of human ancestor]

Two key specimens were discovered — a juvenile male as developed as a 10- to 13-year-old human and an adult female maybe in her late 20s or early 30s. The species is both a hominid and a hominin — hominids include humans, chimpanzees, gorillas and their extinct ancestors, while hominins include those species that arose after Homo, the human lineage, split from that of chimpanzees.

To begin to see where Au. sediba might fit on the family tree, researchers pinned down the age of the fossils by dating the calcified sediments surrounding them with advanced uranium-lead dating techniques and a method called paleomagnetic dating, which measures how many times the Earth's magnetic field has reversed. They discovered the fossils were approximately 1.977 million years old, which predates the earliest appearances of traits specific to the human lineage Homo in the fossil record. This places Au. sediba in roughly the same age category as hominids such as Homo habilis and Homo rudolfensis, which were thought to be potential ancestors to Homo erectus, the earliest undisputed predecessor of modern humans. [10 Things That Make Humans Special]
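The uranium-lead dating mentioned above rests on a simple decay relation: the ratio of radiogenic daughter to remaining parent grows as D/P = exp(lambda * t) - 1, so age is t = ln(1 + D/P) / lambda. Here is a hedged sketch with a hypothetical isotope ratio; the real technique involves multiple isotope pairs and corrections well beyond this one-line formula.

```python
import math

# Decay constant of uranium-238 in per-year units (half-life ~4.47 billion years).
LAMBDA_U238 = 1.55125e-10

def u_pb_age(daughter_parent_ratio, lam=LAMBDA_U238):
    """Age in years from the radiogenic daughter/parent ratio,
    using D/P = exp(lam * t) - 1 solved for t."""
    return math.log(1.0 + daughter_parent_ratio) / lam

# A hypothetical ratio of ~3.1e-4 corresponds to roughly 2 million years,
# the age range reported for the A. sediba sediments.
age_years = u_pb_age(3.1e-4)
```

For samples this young the ratio is tiny, which is part of why the researchers needed advanced techniques and a second, independent method (paleomagnetic dating) to pin the age down to 1.977 million years.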

"As the fossil record for early human ancestors increases, the need for more accurate dates is becoming paramount," said researcher Robyn Pickering at the University of Melbourne in Australia.

Small but humanlike brain

Most aspects of Au. sediba display an intriguing mix of both human and more primitive features that hint it might be an intermediary form between Australopithecus and Homo.

"The fossils demonstrate a surprisingly advanced but small brain, a very evolved hand with a long thumb like a human's, a very modern pelvis, but a foot and ankle shape never seen in any hominin species that combines features of both apes and humans in one anatomical package," said researcher Lee Berger, a paleoanthropologist at the University of Witwatersrand in Johannesburg, South Africa. "The many very advanced features found in the brain and body and the earlier date make it possibly the best candidate ancestor for our genus, the genus Homo, more so than previous discoveries such as Homo habilis."

The brain is often thought of as what distinguishes humanity from the rest of the animal kingdom, and the juvenile specimen of Au. sediba had an exceptionally well-preserved skull that could shed light on the pace of brain evolution in early hominins. To find out more, the researchers scanned the space in the skull where its brain would have been using the European Synchrotron Radiation Facility in Grenoble, France; the result is the most accurate scan ever produced for an early human ancestor, with a level of detail of up to 90 microns, or just below the size of a human hair.

The scan revealed Au. sediba had a much smaller brain than seen in human species, with an adult version maybe only as large as a medium-size grapefruit. However, it was humanlike in several ways — for instance, its orbitofrontal region directly behind the eyes apparently expanded in ways that make it more like a human's frontal lobe in shape. This area is linked in humans with higher mental functions such as multitasking, an ability that may contribute to human capacities for long-term planning and innovative behavior.

"We could be seeing the beginnings of those capabilities," researcher Kristian Carlson at the University of Witwatersrand told LiveScience.

These new findings cast doubt on the long-standing theory that brains gradually increased in size and complexity from Australopithecus to Homo. Instead, their findings corroborate an alternative idea — that Australopithecus brains did increase in complexity gradually, becoming more like Homo, and later increased in size relatively quickly.

Modern hips

This mosaic of modern and primitive traits held true with its hips as well. An analysis of the partial pelvis of the female Au. sediba revealed that it had modern, humanlike features.

"It is surprising to discover such an advanced pelvis in such a small-brained creature," said researcher Job Kibii at the University of the Witwatersrand.  "It is short and broad like a human pelvis ... parts of the pelvis are indistinguishable from that of humans."

Scientists had thought the humanlike pelvis evolved to accommodate larger-brained offspring. The new finding of humanlike hips in Au. sediba, despite its small-brained offspring, suggests these pelvises may have instead initially evolved to help this hominin better wander across the landscape, perhaps as grasslands began to expand across its habitat.

When it came to walking, investigating the feet and ankles of the fossils revealed surprises about how Au. sediba might have strode across the world. No hominin ankle has ever been described with so many primitive and advanced features.

"If the bones had not been found stuck together, the team may have described them as belonging to different species," said researcher Bernhard Zipfel at the University of the Witwatersrand.

The researchers discovered that its ankle joint is mostly like a human's, with some evidence for a humanlike arch and a well-defined Achilles tendon, but its heel and shin bones appear to be mostly apelike. This suggested the hominid probably climbed trees yet also walked in a unique way not exactly like that of humans.

Altogether, such anatomical traits would have allowed Au. sediba to walk in perhaps a more energy-efficient way, with tendons storing energy and returning that energy to the next step, said researcher Steve Churchill from Duke University in Durham, N.C. "These are the kinds of things that we see with the genus Homo," he explained.

What nice hands …

Finally, an analysis of Au. sediba's hands suggests it might have been a toolmaker. The fossils — including the most complete hand known in an early hominin, which is missing only a few bones and belonged to the mature female specimen — showed its hand was capable of the strong grasping needed for tree-climbing, but that it also had a long thumb and short fingers. These would have allowed it a precision grip useful for tools, one involving just the thumb and fingers, where the palm does not play an active part.

Altogether, the hand of Au. sediba has more features related to tool-making than that of the first human species thought of as a tool user, the "handy man" Homo habilis, said researcher Tracy Kivell at the Max Planck Institute for Evolutionary Anthropology in Germany. "This suggests to us that sediba may also have been a toolmaker."

Though the scientists haven't excavated the site in search of stone tools, "the hand and brain morphology suggest that Au. sediba may have had the capacity to manufacture and use complex tools," Kivell added.

The researchers do caution that although they suggest Au. sediba was ancestral to the human lineage, all these apparent resemblances between it and us could just be coincidences, with this extinct species evolving similar traits to our lineage due, perhaps, to similar circumstances. [Top 10 Missing Links]

In fact, it might be just as interesting to imagine that Au. sediba was not directly ancestral to Homo, because it opens up the possibility "of independent evolution of the same sorts of features," Carlson said. "Whether or not it's on the same lineage as leading to Homo, I think there are interesting questions and implications."

The scientists detailed their findings in the Sept. 9 issue of the journal Science.

                                                P.C.
Title: WSJ: Ridley: From Phoenicia to the Cloud
Post by: Crafty_Dog on September 24, 2011, 09:09:57 AM
Matt Ridley is an author whom I follow.  I have read his "The Red Queen" (on the evolutionary reasons sex exists and the implications thereof) and "Nature via Nurture" (also quite brilliant, and which has triggered a shift in how I think about these things).



By MATT RIDLEY
The crowd-sourced, wikinomic cloud is the new, new thing that all management consultants are now telling their clients to embrace. Yet the cloud is not a new thing at all. It has been the source of human invention all along. Human technological advancement depends not on individual intelligence but on collective idea sharing, and it has done so for tens of thousands of years. Human progress waxes and wanes according to how much people connect and exchange.

When the Mediterranean was socially networked by the trading ships of Phoenicians, Greeks, Arabs or Venetians, culture and prosperity advanced. When the network collapsed because of pirates at the end of the second millennium B.C., or in the Dark Ages, or in the 16th century under the Barbary and Ottoman corsairs, culture and prosperity stagnated. When Ming China, or Shogun Japan, or Nehru's India, or Albania or North Korea turned inward and cut themselves off from the world, the consequence was relative, even absolute decline.

Knowledge is dispersed and shared. Friedrich Hayek was the first to point out, in his famous 1945 essay "The Use of Knowledge in Society," that central planning cannot work because it is trying to substitute an individual all-knowing intelligence for a distributed and fragmented system of localized but connected knowledge.

So dispersed is knowledge that, as Leonard Read famously observed in his 1958 essay "I, Pencil," nobody on the planet knows how to make a pencil. The knowledge is dispersed among many thousands of graphite miners, lumberjacks, assembly line workers, ferrule designers, salesmen and so on. This is true of everything that I use in my everyday life, from my laptop to my shirt to my city. Nobody knows how to make it or to run it. Only the cloud knows.

One of the things I have tried to do in my book "The Rational Optimist" is to take this insight as far back into the past as I can—to try to understand when it first began to be true. When did human beings start to use collective rather than individual intelligence?

In doing so, I find that the entire field of anthropology and archaeology needs Hayek badly. Their debates about what made human beings successful, and what caused the explosive take-off of human culture in the past 100,000 years, simply never include the insight of dispersed knowledge. They are still looking for a miracle gene, or change in brain organization, that explains, like a deus ex machina, the human revolution. They are still looking inside human heads rather than between them.

"I think there was a biological change—a genetic mutation of some kind that promoted the fully modern ability to create and innovate," wrote the anthropologist Richard Klein in a 2003 speech to the American Association for the Advancement of Science. "The sudden expansion of the brain 200,000 years ago was a dramatic spontaneous mutation in the brain . . . a change in a single gene would have been enough," the neuroscientist Colin Blakemore told the Guardian in 2010.

There was no sudden change in brain size 200,000 years ago. We Africans—all human beings are descended chiefly from people who lived exclusively in Africa until about 65,000 years ago—had slightly smaller brains than Neanderthals, yet once outside Africa we rapidly displaced them (bar acquiring 2.5% of our genes from them along the way).

And the reason we won the war against the Neanderthals, if war it was, is staring us in the face, though it remains almost completely unrecognized among anthropologists: We exchanged. At one site in the Caucasus there are Neanderthal and modern remains within a few miles of each other, both from around 30,000 years ago. The Neanderthal tools are all made from local materials. The moderns' tools are made from chert and jasper, some of which originated many miles away. That means trade.

Evidence from recent Australian artifacts shows that long-distance movement of objects is a telltale sign of trade, not migration. We Africans have been doing this since at least 120,000 years ago. That's the date of beads made from marine shells found a hundred miles inland in Algeria. Trade is 10 times as old as agriculture.

At first it was a peculiarity of us Africans. It gave us the edge over Neanderthals in their own continent and their own climate, because good ideas can spread through trade. New weapons, new foods, new crafts, new ornaments, new tools. Suddenly you are no longer relying on the inventiveness of your own tribe or the capacity of your own territory. You are drawing upon ideas that occurred to anybody anywhere anytime within your trading network.

In the same way, today, American consumers do not have to rely only on their own citizens to discover new consumer goods or new medicines or new music: The Chinese, the Indians, the Brazilians are also able to supply them.

That is what trade does. It creates a collective innovating brain as big as the trade network itself. When you cut people off from exchange networks, their innovation rate collapses. Tasmanians, isolated by rising sea levels about 10,000 years ago, not only failed to share in the advances that came after that time—the boomerang, for example—but actually went backwards in terms of technical virtuosity. The anthropologist Joe Henrich of the University of British Columbia argues that in a small island population, good ideas died faster than they could be replaced. Tierra del Fuego's natives, on a similarly inhospitable and small land, but connected by trading canoes across the much narrower Strait of Magellan, suffered no such technological regress. They had access to a collective brain the size of South America.

Which is of course why the Internet is such an exciting development. For the first time humanity has not just some big collective brains, but one truly vast one in which almost everybody can share and in which distance is no obstacle.

The political implications are obvious: that human collaboration is necessary for society to work; that the individual is not—and has not been for 120,000 years—able to support his lifestyle; that trade enables us to work for each other not just for ourselves; that there is nothing so antisocial (or impoverishing) as the pursuit of self-sufficiency; and that authoritarian, top-down rule is not the source of order or progress.

Hayek understood all this. And it's time most archaeologists and anthropologists, as well as some politicians and political scientists, did as well.

Mr. Ridley writes the Journal's weekly Mind & Matter column. He is the author of "The Rational Optimist: How Prosperity Evolves" (Harper, 2010). This op-ed is adapted from his Hayek Prize lecture, given under the auspices of the Manhattan Institute, to be delivered on Sept. 26.

Title: Snowboarding crow
Post by: Crafty_Dog on January 13, 2012, 06:31:36 AM
This footage seems to me to be quite extraordinary. Apparently the crow has observed humans snowboarding and has taken up the sport himself.

Thus we see:

a) cross-species learning
b) the use of a tool
c) play

http://www.youtube.com/watch?v=3dWw9GLcOeA&feature=share
Title: WSJ: Lionel Tiger on Facebook
Post by: Crafty_Dog on February 06, 2012, 05:38:47 AM

By LIONEL TIGER

When the first phone line linked two New England towns, the inevitable arrogant scold asked if the people of town X had anything to say to the folks of town Y. His implication was "no." Why have more to do with (implicitly fallen) fellow humans than absolutely necessary? Why should technology abet friendliness?

Mr. Scold was wrong. One of the most successful magazine launches of the last decades was People, carefully and endlessly just about that, week in and week out, year after year. Europe boasts a strange menagerie of similar publications that ceaselessly chronicle the libidinous events in the lives of minor Scandinavian royalty and the housing buys and sells of soccer stars before and after their divorces. Magazines pay the price of a used fighter plane for the first photo of the baby of certified stars.

People want to know about this town and that other town too. It's their nature.

Primates always want to know what is going on. If it's over the hill where you can't see for sure what's up, that's even more stimulating and important to secure long-range survival. Primates are intensely interested in each other and other groups. It was pointed out in the 1960s that in some ground-living species, members of the group glanced at the lead primate every 20 or 30 seconds. Think Louis Quatorze or Mick Jagger. Look, look, look—people are always on the lookout.

The human who has most adroitly—if at first innocently, and in the next weeks most profitably—capitalized on this is Facebook founder Mark Zuckerberg.

"Facebook." Get it? Not FootBook or ElbowBook. The face. It gets you a driver's license and stars send it out to fans. We know that many users' first and classical impulse was acquiring convivial acquaintance with young women. Facebook married that ancient Darwinian urgency to a cheap, brilliantly lucid, and endlessly replicable technology.

The result has been virtually incalculable and not only for Mr. Zuckerberg's lunch money. Nearly one-sixth of Homo sapiens are on Facebook. Half of Americans over age 12 are on it. It is world-wide and has been joined by other tools of conviviality such as Twitter. Nearly 15% of Americans already belong to that new tribe. There are others.

Mr. Zuckerberg has re-primatized a group of humans of unprecedented number, diffusion and intensity. His product costs him virtually nothing to produce—it is simply us. We enter his shop, display ourselves as attractively or interestingly as we can, replenish ourselves hourly or daily or by the minute, and do it for nothing. Doesn't cost him a nickel.

And why? Just because we're primates with endlessly deep interest in each other, with a knack and need to groom each other—either physically, as monkeys do, or with "What a nice hairdo/dress/divorce/promotion!" as Facebookworms do. There is much to transmit between towns and between people.

Mr. Zuckerberg bestrides vast business numbers once dreamt of only by toothpaste and soft-drink makers. This reflects a new commercial demography in which the consumer is not someone who wants something necessary, but rather one who seeks to assert simply what he is. And the tool he uses in order to become nothing more or less than an efficient, interesting and socially prosperous primate is the Facebook page.

The technology is new but the passion for connection isn't. In Paris a hundred years ago, pneumatic tubes ran all through the parts of town that could afford them so messages could be written and sent as if by courier. When I was a student in London, there were mail deliveries twice a day and in some environs three. Homo sapiens wants to know, to exchange, to show its face.

And when the counting houses work triple-time recording the riches from all this, it will be sweet comedy to remember that Mr. Zuckerberg became the richest primatologist in the world because he gave his customers nothing new, except the chance to be their old ape selves.

Mr. Tiger, an emeritus professor of anthropology at Rutgers, is the author of "The Decline of Males" (St. Martin's, 2000) and, with Michael McGuire, of "God's Brain" (Prometheus Books, 2010).
Title: When the good do bad
Post by: bigdog on March 21, 2012, 05:58:15 PM
http://www.nytimes.com/2012/03/20/opinion/brooks-when-the-good-do-bad.html?_r=1&ref=davidbrooks


"According to this view, most people are naturally good, because nature is good. The monstrosities of the world are caused by the few people (like Hitler or Idi Amin) who are fundamentally warped and evil.

This worldview gives us an easy conscience, because we don’t have to contemplate the evil in ourselves. But when somebody who seems mostly good does something completely awful, we’re rendered mute or confused.

But of course it happens all the time. That’s because even people who contain reservoirs of compassion and neighborliness also possess a latent potential to commit murder."
Title: Re: Evolutionary biology/psychology
Post by: Crafty_Dog on March 22, 2012, 04:12:49 AM
Indeed BD.  Coincidentally enough I am re-reading right now a rather thick book that is a collection of essays on the concept, Jungian and otherwise, of the shadow.
Title: Evolutionary psychology: America’s false autism epidemic
Post by: DougMacG on April 26, 2012, 12:32:48 PM
America’s false autism epidemic, by Dr. Allen Frances, professor emeritus at Duke University’s department of psychiatry

The apparent epidemic of autism is in fact the latest instance of the fads that litter the history of psychiatry.

We have a strong urge to find labels for disturbing behaviors; naming things gives us an (often false) feeling that we control them. So, time and again, an obscure diagnosis suddenly comes out of nowhere to achieve great popularity. It seems temporarily to explain a lot of previously confusing behavior — but then suddenly and mysteriously returns to obscurity.

Not so long ago, autism was the rarest of diagnoses, occurring in fewer than one in 2,000 people. Now the rate has skyrocketed to 1 in 88 in America (and to a remarkable 1 in 38 in Korea). And there is no end in sight.

Increasingly panicked, parents have become understandably vulnerable to quackery and conspiracy theories. The worst result has been a reluctance to vaccinate kids because of the thoroughly disproved and discredited suggestion that the shots can somehow cause autism.

There are also frantic (and probably futile) efforts to find environmental toxins that might be harming developing brains, explaining the sudden explosion of autism.

Anything is possible, but when rates rise this high and this fast, the best bet is always that there has been a change in diagnostic habits, not a real change in people or in the rate of illness.

So what is really going on to cause this “epidemic”?

Perhaps a third of the huge jump in rates can be explained by three factors: the much-increased public and provider awareness of autism, the much-reduced stigma associated with it and the fact that the definition of autism has been loosened to include milder cases.

Sixteen years ago, when we updated the DSM (the official manual of psych diagnoses) for the fourth edition, we expanded the definition of autism to include Asperger’s. At the time, we expected this to triple the rate of diagnosed cases; instead, it has climbed 20 times higher.

That unexpected jump has three obvious causes. Most important, the diagnosis has become closely linked with eligibility for special school services.

Having the label can make the difference between being closely attended to in a class of four versus being lost in a class of 40. Kids who need special attention can often get it only if they are labeled autistic.

So the autism tent has been stretched to accommodate a wide variety of difficult learning, behavioral and social problems that certainly deserve help — but aren’t really autism. Probably as many as half of the kids labeled autistic wouldn’t really meet the DSM IV criteria if these were applied carefully.

Freeing autism from its too tight coupling with service provision would bring down its rates and end the “epidemic.” But that doesn’t mean that school services should also be reduced. The mislabeled problems are serious in their own right, and call out for help.

The second driver of the jump in diagnosis has been a remarkably active and successful consumer advocacy on autism, facilitated by the power of the Internet. This has had four big upsides: the identification of previously missed cases, better care and education for the identified cases, greatly expanded research and a huge reduction in stigma.

But there are two unfortunate downsides: Many people with the diagnosis don’t really meet the criteria for it, and the diagnosis has become so heterogeneous that it loses meaning and predictive value. This is why so many kids now outgrow their autism. They were never really autistic in the first place.

A third cause has been overstated claims coming from epidemiological research — studies of autism rates in the general population. For reasons of convenience and cost, the ratings in the studies always have to be done by lay interviewers, who aren’t trained as clinicians and so are unable to judge whether the elicited symptoms are severe and enduring enough to qualify as a mental disorder.

It’s important to understand that the rates reported in these studies are always upper limits, not true rates; they exaggerate the prevalence of autism by including people who’d be excluded by careful clinical interview. (This also explains why rates can change so quickly from year to year.)
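
The upper-limit arithmetic is easy to check. A minimal sketch — the sensitivity and specificity figures below are hypothetical, chosen only to illustrate the mechanism, not drawn from any actual survey: when a condition is rare, even a slightly imperfect screen reports a rate dominated by false positives.

```python
def apparent_prevalence(true_prev, sensitivity, specificity):
    """Rate a screening survey reports: true positives plus false positives."""
    return true_prev * sensitivity + (1.0 - true_prev) * (1.0 - specificity)

# Hypothetical figures for illustration only.
true_prev = 1 / 2000        # the old "rarest of diagnoses" rate
sens, spec = 0.95, 0.989    # an imagined lay-administered screen

p = apparent_prevalence(true_prev, sens, spec)
print(f"apparent rate: 1 in {round(1 / p)}")  # 1 in 87: false positives swamp true cases
```

With these assumed numbers, a 1.1% false-positive rate alone is enough to turn a true prevalence of 1 in 2,000 into a reported rate near 1 in 88, which is why rates from lay-interviewer studies can only bound the true figure from above.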

So where do we stand, and what should we do? I am for a more careful and restricted diagnosis of autism that isn’t driven by service requirements. I am also for kids getting the school services they need.

The only way to achieve both goals is to reduce the inordinate power of the diagnosis of autism in determining who gets what educational service. Psychiatric diagnosis is devised for use in clinical settings, not educational ones. It may help contribute to educational decisions but should not determine them.

Human nature changes slowly, if at all, but the ways we label it can change fast and tend to follow fleeting fashions.

Dr. Allen Frances, now a professor emeritus at Duke University’s department of psychiatry, chaired the DSM IV task force.

Read more: http://www.nypost.com/p/news/opinion/opedcolumnists/america_false_autism_epidemic_jfI7XORH94IcUB795b6f7L#ixzz1tB0kPCdK
Title: Re: Evolutionary biology/psychology
Post by: JDN on April 27, 2012, 09:55:38 AM
On a personal note, an acquaintance of mine at a local 4 year college sought and was diagnosed with Attention Deficit Disorder.  He was given private instruction, private tutors, extra time on tests, exemptions from certain requirements, etc.  At what cost?  And frankly, IMHO his degree is suspect.  My heart bleeds for the truly handicapped; but this has become ridiculous. 
Title: Silicon Valley says step away from the device
Post by: Crafty_Dog on July 25, 2012, 07:01:53 PM


Silicon Valley Says Step Away From the Device
Tech firms are uneasy over the effect time online has on relationships.
By MATT RICHTEL
Published: July 23, 2012
 
Stuart Crabb, a director in the executive offices of Facebook, naturally likes to extol the extraordinary benefits of computers and smartphones. But like a growing number of technology leaders, he offers a warning: log off once in a while, and put them down.

The New York Times

In a place where technology is seen as an all-powerful answer, it is increasingly being seen as too powerful, even addictive.

The concern, voiced in conferences and in recent interviews with many top executives of technology companies, is that the lure of constant stimulation — the pervasive demand of pings, rings and updates — is creating a profound physical craving that can hurt productivity and personal interactions.

“If you put a frog in cold water and slowly turn up the heat, it’ll boil to death — it’s a nice analogy,” said Mr. Crabb, who oversees learning and development at Facebook. People “need to notice the effect that time online has on your performance and relationships.”

The insight may not sound revelatory to anyone who has joked about the “crackberry” lifestyle or followed the work of researchers who are exploring whether interactive technology has addictive properties.

But hearing it from leaders at many of Silicon Valley’s most influential companies, who profit from people spending more time online, can sound like auto executives selling muscle cars while warning about the dangers of fast acceleration.

“We’re done with this honeymoon phase and now we’re in this phase that says, ‘Wow, what have we done?’ ” said Soren Gordhamer, who organizes Wisdom 2.0, an annual conference he started in 2010 about the pursuit of balance in the digital age. “It doesn’t mean what we’ve done is bad. There’s no blame. But there is a turning of the page.”

At the Wisdom 2.0 conference in February, founders from Facebook, Twitter, eBay, Zynga and PayPal, and executives and managers from companies like Google, Microsoft, Cisco and others listened to or participated in conversations with experts in yoga and mindfulness. In at least one session, they debated whether technology firms had a responsibility to consider their collective power to lure consumers to games or activities that waste time or distract them.

The actual science of whether such games and apps are addictive is embryonic. But the Diagnostic and Statistical Manual of Mental Disorders, widely viewed as the authority on mental illnesses, plans next year to include “Internet use disorder” in its appendix, an indication researchers believe something is going on but that requires further study to be deemed an official condition.

Some people disagree there is a problem, even if they agree that the online activities tap into deep neurological mechanisms. Eric Schiermeyer, a co-founder of Zynga, an online game company and maker of huge hits like FarmVille, has said he has helped addict millions of people to dopamine, a neurochemical that has been shown to be released by pleasurable activities, including video game playing, but also is understood to play a major role in the cycle of addiction.

But what he said he believed was that people already craved dopamine and that Silicon Valley was no more responsible for creating irresistible technologies than, say, fast-food restaurants were responsible for making food with such wide appeal.

“They’d say: ‘Do we have any responsibility for the fact people are getting fat?’ Most people would say ‘no,’ ” said Mr. Schiermeyer. He added: “Given that we’re human, we already want dopamine.”

Along those lines, Scott Kriens, chairman of Juniper Networks, one of the biggest Internet infrastructure companies, said the powerful lure of devices mostly reflected primitive human longings to connect and interact, but that those desires needed to be managed so they did not overwhelm people’s lives.

“The responsibility we have is to put the most powerful capability into the world,” he said. “We do it with eyes wide open that some harm will be done. Someone might say, ‘Why not do so in a way that causes no harm?’ That’s naïve.”

“The alternative is to put less powerful capability in people’s hands and that’s a bad trade-off,” he added.

Mr. Crabb, the Facebook executive, said his primary concern was that people live balanced lives. At the same time, he acknowledges that the message can run counter to Facebook’s business model, which encourages people to spend more time online. “I see the paradox,” he said.

The emerging conversation reflects a broader effort in the valley to offer counterweights to the fast-paced lifestyle. Many tech firms are teaching meditation and breathing exercises to their staff members to help them slow down and disconnect.

At Cisco, Padmasree Warrior, the chief technology and strategy officer and its former head of engineering, a position where she oversaw 22,000 employees, said she regularly told people to take a break and a deep breath, and did so herself. She meditates every night and takes Saturday to paint and write poetry, turning off her phone or leaving it in the other room.

“It’s almost like a reboot for your brain and your soul,” she said. She added of her Saturday morning digital detox: “It makes me so much calmer when I’m responding to e-mails later.”

Kelly McGonigal, a psychologist who lectures about the science of self-control at the Stanford School of Medicine (and has been invited to lecture at the business school at Stanford), said she regularly talked with leaders at technology companies about these issues. She added that she was impressed that they had been open to discussing a potential downside of their innovations. “The people who are running these companies deeply want their technology and devices to enhance lives,” said Dr. McGonigal. “But they’re becoming aware of people’s inability to disengage.”

She also said she believed that interactive gadgets could create a persistent sense of emergency by setting off stress systems in the brain — a view that she said was becoming more widely accepted.

“It’s this basic cultural recognition that people have a pathological relationship with their devices,” she said. “People feel not just addicted, but trapped.”

Michelle Gale, who recently left her post as the head of learning and development at Twitter, said she regularly coached engineers and executives at the company that their gadgets had addictive properties.

“They said, ‘Wow, I didn’t know that.’ Or, ‘I guess I knew that but I don’t know what to do about it,’ ” recalled Ms. Gale, who regularly organized meditation and improvisation classes at Twitter to encourage people to let their minds wander.

Google has started a “mindfulness” movement at the company to teach employees self-awareness and to improve their ability to focus. Richard Fernandez, an executive coach at Google and one of the leaders of the mindfulness movement, said the risks of being overly engaged with devices were immense.

“It’s nothing less than everything,” he said, adding that if people can find time to occasionally disconnect, “we can have more intimate and authentic relationships with ourselves and those we love in our communities.”

Google, which owns YouTube, earns more ad revenue as people stay online longer. But Mr. Fernandez, echoing others in Silicon Valley, said they were not in business to push people into destructive behavior.

“Consumers need to have an internal compass where they’re able to balance the capabilities that technology offers them for work, for search, with the qualities of the lives they live offline,” he said.

“It’s about creating space, because otherwise we can be swept away by our technologies.”

Title: Ontogeny, phylogeny, Lamarck, Epigenetics
Post by: Crafty_Dog on September 16, 2012, 01:35:11 PM
Proposition:

"Ontogeny recapitulates phylogeny."

True or false?


In a somewhat related vein:

http://en.wikipedia.org/wiki/Lamarckism   I remember reading a comment many years ago that criticized something Konrad Lorenz had said as being Lamarckian, but this past year I read Matt Ridley's "Nature via Nurture" -- a book which I found quite exciting, though certain passages went right over my head with nary a look back -- and it seemed to me to resurrect the question.  In a related vein, there is this http://en.wikipedia.org/wiki/Epigenetics
Title: Recapitulation theory...
Post by: objectivist1 on September 16, 2012, 01:45:00 PM
This is from Wikipedia, which I know is not necessarily the authoritative source, but I've also read articles by modern biologists which state that this theory is not valid:

Haeckel

Ernst Haeckel attempted to synthesize the ideas of Lamarckism and Goethe's Naturphilosophie with Charles Darwin's concepts. While often seen as rejecting Darwin's theory of branching evolution for a more linear Lamarckian "biogenic law" of progressive evolution, this is not accurate: Haeckel used the Lamarckian picture to describe the ontogenic and phylogenic history of the individual species, but agreed with Darwin about the branching nature of all species from one, or a few, original ancestors.[18] Since around the start of the twentieth century, Haeckel's "biogenetic law" has been refuted on many fronts.[7]
Haeckel formulated his theory as "Ontogeny recapitulates phylogeny". The notion later became simply known as the recapitulation theory. Ontogeny is the growth (size change) and development (shape change) of an individual organism; phylogeny is the evolutionary history of a species. Haeckel's recapitulation theory claims that the development of advanced species passes through stages represented by adult organisms of more primitive species.[7] Otherwise put, each successive stage in the development of an individual represents one of the adult forms that appeared in its evolutionary history.
For example, Haeckel proposed that the pharyngeal grooves between the pharyngeal arches in the neck of the human embryo resembled gill slits of fish, thus representing an adult "fishlike" developmental stage as well as signifying a fishlike ancestor. Embryonic pharyngeal slits, which form in many animals when the thin branchial plates separating pharyngeal pouches and pharyngeal grooves perforate, open the pharynx to the outside. Pharyngeal arches appear in all tetrapod embryos: in mammals, the first pharyngeal arch develops into the lower jaw (Meckel's cartilage), the malleus and the stapes. But these embryonic pharyngeal arches, grooves, pouches, and slits in human embryos could not at any stage carry out the same function as the gills of an adult fish.
Haeckel produced several embryo drawings that often overemphasized similarities between embryos of related species. The misinformation was propagated through many biology textbooks, and popular knowledge, even today. Modern biology rejects the literal and universal form of Haeckel's theory.[8]
Haeckel's drawings were disputed by Wilhelm His, who had developed a rival theory of embryology.[19] His developed a "causal-mechanical theory" of human embryonic development.[20]
Darwin's view, that early embryonic stages are similar to the same embryonic stage of related species but not to the adult stages of these species, has been confirmed by modern evolutionary developmental biology.
Modern status

The Haeckelian form of recapitulation theory is now considered defunct.[21] However, embryos do undergo a period where their morphology is strongly shaped by their phylogenetic position, rather than selective pressures.[22]
Title: Pinker: The Decline of Violence
Post by: Crafty_Dog on February 21, 2013, 02:30:11 PM


http://www.pointofinquiry.org/
Title: Morris: Chinese embarking on engineering Chinese master race
Post by: Crafty_Dog on April 05, 2013, 09:01:04 AM

http://www.dickmorris.com/chinas-secret-genetic-engineering-dick-morris-tv-lunch-alert/?utm_source=dmreports&utm_medium=dmreports&utm_campaign=dmreports
Title: Noah's Ark for DNA
Post by: Crafty_Dog on April 22, 2013, 09:05:34 PM


WSJ
By WILLIAM Y. BROWN

DNA was the topic of U.S. Supreme Court argument on April 15. Can a gene be patented if it occurs in nature—which is generally grounds for exclusion—but has been identified by an individual scientist or company and removed from the cells in which it occurs? Lower courts are split on the matter, and the justices didn't tip their hands.

But whether a gene can be patented will be irrelevant if it disappears before anyone has identified it. That is what's happening now and will continue to happen—at a rate perhaps 100 to 200 times faster than in prehistoric days—due to modern man's outsize influence on nature and encroachment on habitat. Unless we have sequenced a species' DNA, extinction means gone forever and never really known. Preservation of the DNA is the simpler, cheaper route, with sequencing to follow. If the Library of Congress is where every book is stored, the world needs the equivalent for species DNA.

Preserving the DNA of known species would provide genetic libraries for research and commerce and for recovery of species that are endangered—the Amur leopard and the northern right whale, for example. Preservation would also offer the potential to restore species that have gone extinct. We currently lack preserved DNA for most of the 1.9 million species that have been named, but that is fewer than the number of people in Houston. No doubt additional species exist, but their DNA can be preserved as they are named. The job is doable.

Just a small fraction of species are maintained as living organisms in cultivation or captivity or are kept frozen as viable seeds or cells. These are the best, because whole, reproducing organisms can be grown from them by planting or cloning. Botanical gardens and zoos keep the living stuff. The Millennium Seed Bank at Kew Gardens in England is on a course to preserve frozen seeds of all vascular plant species, and the Svalbard Seed Vault in Norway is taking seed duplicates from other facilities. The San Diego "Frozen Zoo" has some 20,000 viable cell cultures representing 1,000 vertebrate species, including "Lonesome George," the last Pinta Island Galapagos tortoise, which expired last year. Its DNA would have disintegrated if the Frozen Zoo hadn't mounted a heroic mission after the tortoise's death to get a sample.

For a fraction more species, DNA is kept at low temperature in dead cells or extracted form. The American Museum of Natural History in New York keeps 70,000 samples in liquid nitrogen, the Academy of Natural Sciences in Philadelphia has frozen samples for 4,000 bird species, and the National Museum of Natural History at the Smithsonian has embarked on an ambitious course to freeze species tissues.

Yet the DNA of most species is still not preserved. We need a plan. One might think that preserving the DNA of life on earth would cost a moonshot of money. But a viable cell culture in liquid nitrogen for a species at the Frozen Zoo costs only $200 to $300 to establish and just $1 a year to maintain. Multiplying $250 per species by 1.9 million species comes to $475 million, ignoring what has already been done. The U.S. pays more than twice that daily on the national debt. But let's be real, nobody is throwing new money around, even when the priority is obvious.
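The article's back-of-the-envelope cost estimate can be checked in a few lines. A minimal sketch, using only the figures given above (1.9 million named species, a $250 midpoint of the $200-$300 setup range, and $1 per species per year to maintain):

```python
# Back-of-the-envelope check of the DNA-preservation cost estimate.
SPECIES = 1_900_000        # named species, the article's figure
SETUP_PER_SPECIES = 250    # dollars, midpoint of the $200-$300 range
MAINT_PER_SPECIES = 1      # dollars per species per year

setup_total = SPECIES * SETUP_PER_SPECIES
annual_maintenance = SPECIES * MAINT_PER_SPECIES

print(f"One-time setup: ${setup_total:,}")        # $475,000,000
print(f"Annual upkeep:  ${annual_maintenance:,}") # $1,900,000
```

As the article notes, the $475 million one-time figure ignores work already done, so it is an upper bound on new spending.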

There is another way that could work, and would be much cheaper. First, we could develop a website to track progress on preservation whose key information is managed directly by contributing facilities. It would be a "wiki" site for DNA repositories, and many keepers would be delighted to share information if they could manage it themselves. They could both update holdings and let people know what species they will take and under what conditions.

Second, we can establish new incentives and mandates for contributing specimens, including grant, publication and permit requirements. Some grant makers and publications already require that DNA information be shared with a genetic information bank kept by the National Institutes of Health. Why not tissue too?

Third, donors who care could help develop and fund "citizen science" projects of museums and nonprofit groups to collect, identify and contribute specimens to repositories. The collections would grow, and so might public connection to nature. At the end of it all, we will preserve what we appreciate. And patent lawyers will be happy too, because they'll have something to fight about.

Mr. Brown, a former president of the Academy of Natural Sciences, is a senior fellow at the Brookings Institution.
Title: two monkeys paid differently
Post by: Crafty_Dog on May 06, 2013, 01:03:21 PM


http://www.upworthy.com/2-monkeys-were-paid-unequally-see-what-happens-next?g=2&c=upw1
Title: Physically strong = likely right wing views
Post by: Crafty_Dog on May 16, 2013, 06:45:35 AM
http://www.dailymail.co.uk/health/article-2325414/Men-physically-strong-likely-right-wing-political-views.html
Title: Wired for Culture: Origins of the Human Social Mind by Mark Pagel
Post by: Crafty_Dog on June 20, 2013, 05:07:42 AM
Hat tip to David Gordon


Wired for Culture: Origins of the Human Social Mind by Mark Pagel
W. W. Norton & Company | 2012 | ISBN: 0393065871, 0393344207 | English | 432 pages

A fascinating, far-reaching study of how our species' innate capacity for culture altered the course of our social and evolutionary history.

A unique trait of the human species is that our personalities, lifestyles, and worldviews are shaped by an accident of birth—namely, the culture into which we are born. It is our cultures and not our genes that determine which foods we eat, which languages we speak, which people we love and marry, and which people we kill in war. But how did our species develop a mind that is hardwired for culture—and why?

Evolutionary biologist Mark Pagel tracks this intriguing question through the last 80,000 years of human evolution, revealing how an innate propensity to contribute and conform to the culture of our birth not only enabled human survival and progress in the past but also continues to influence our behavior today. Shedding light on our species’ defining attributes—from art, morality, and altruism to self-interest, deception, and prejudice—Wired for Culture offers surprising new insights into what it means to be human.
Title: S. Pinker; R. Wright
Post by: Crafty_Dog on September 05, 2013, 06:10:17 AM
The Decline of Violence
http://www.ted.com/talks/steven_pinker_on_the_myth_of_violence.html

Non-Zero Sum
http://www.ted.com/talks/robert_wright_on_optimism.html


Title: Evolution - out of God's hands into ours
Post by: ccp on October 16, 2013, 07:32:51 AM
Not long ago many people wondered if we are still "evolving".  How can we be if there is no survival of the fittest?  Even those who are not "fit" still get to survive and reproduce in our society.

Now it is clear.  Not only are we evolving, but evolution will accelerate.  We will soon begin to control our evolution and accelerate it.  From simply choosing the sex of babies, to divesting of flawed DNA, to inserting chosen DNA.  Parents will be able to view menus of traits.  You want your son to be tall and athletic?  How about an IQ of 180?  How about an extrovert?  High energy?

No problem.

Not only will evolution accelerate so that we develop master races of humans; we will be controlling it.

Title: Archibald MacLeish
Post by: Crafty_Dog on October 16, 2013, 09:12:01 AM
There is, in truth, a terror in the world.  Under the hum of the miraculous machines and the ceaseless publications of the brilliant physicists a silence waits and listens and is heard.

It is the silence of apprehension.  We do not trust our time, and the reason we do not trust our time is because it is we who have made the time, and we do not trust ourselves.  We have played the hero's part, mastered the monsters, accomplished the labors, become gods-- and we do not trust ourselves as gods.  We know what we are.

In the old days the gods were someone else; the knowledge of what we are did not frighten us.   There were Furies to pursue the Hitlers, and Athenas to restore the Truth.  But now that we are gods ourselves we bear the knowledge for ourselves-- like that old Greek hero who learned when all his labors had been accomplished that it was he himself who had killed his son.
Title: human skull challenges out of Africa theory
Post by: Crafty_Dog on May 15, 2014, 10:44:15 PM
http://www.ancient-origins.net/human-origins-science/human-skull-challenges-out-africa-theory-001283
Title: Interesting new human genetic info
Post by: Crafty_Dog on November 09, 2014, 05:05:50 PM


http://www.archaeology.org/news/2692-141107-kostenki-european-dna
Title: Human Ancestors Were Consuming Alcohol 10 Million Years Ago
Post by: DougMacG on December 28, 2014, 01:47:08 PM
We haven't changed as much as we think?

Human Ancestors Were Consuming Alcohol 10 Million Years Ago
http://blogs.discovermagazine.com/d-brief/2014/12/01/human-ancestors-were-consuming-alcohol-10-million-years-ago/#.VKB4d_9LIAA
Title: Stratfor: What drives people to the extreme
Post by: Crafty_Dog on January 14, 2015, 10:26:31 AM

Share
What Drives People to the Extreme
Global Affairs
January 14, 2015 | 09:00 GMT Print Text Size

By Dr. Luc De Keyser

When invited to write a column for Stratfor, I volunteered too quickly to write about the phenomenon that is the Islamic State and the apparent copycat groups. The shock and awe of the Islamic State's eruption onto the world stage threw a wrench into my ever-developing model for gauging human behavior. It is easy to admit that this essay comes too soon. It is harder to admit that, most likely, this essay will not be in time for the next atrocity. (Indeed, I sat down to write this as the world was still mourning the deadly attack on Charlie Hebdo.)

Most geopolitical analysis draws from backgrounds in humanities and the social sciences. Stratfor might look at geography, history and economics when trying to understand a group such as the Islamic State. I, on the other hand, draw from a background in the exact and life sciences. This work stems from my preoccupation with the dimension of evolutionary biology that studies the origin of human disease. And within this setting, it is easy to accept that the actions of a group like the Islamic State must be classified as manifest symptoms of a severe form of "dis-ease," as in "not being at ease." This is not to say that the regions in the world where Islamic State fighters are coming from or what they are fighting against are not part of the overarching physiopathological picture of what many consider an upcoming black plague. But in any case, carefully tracing the cause and effect chains from the individual to the next of kin, then to the extended group and up to the nation will prove too brittle still to generate a reliable prognosis and guidelines for preventive and curative measures. So, why bother?

To become a reliable forecaster, it is important to understand that "there are also unknown unknowns — the ones we don't know we don't know," to quote former U.S. Defense Secretary Donald Rumsfeld. Although his statement may have been inspired by Nassim Nicholas Taleb's black swan theory, it does read like the modern sound bite of the warnings of Jesus ben Sirach, inscribed in the Old Testament's Apocrypha: "What is too sublime for you, do not seek; do not reach into things that are hidden from you. What is committed to you, pay heed to; what is hidden is not your concern" (Sirach 3:21-22). Let me try an alternative, post-Darwinian exegesis.
Principles of Evolutionary Psychology

We all know the adage, "If all you have is a hammer, everything looks like a nail." I propose the following variant adage: "If all you have is a human mind, everything looks like a situation in the life of the hunter-gatherer we have forgotten we still are." We are generally unaware of this perspective, and more important, it may not even matter if we were aware. 

This new adage is the consequence of a couple of basic principles of evolutionary psychology, outlined most succinctly by University of California, Santa Barbara, professors John Tooby and Leda Cosmides in their 1997 publication, "Evolutionary Psychology: A Primer." One is that "our neural circuits were designed by natural selection to solve problems that our ancestors faced during our species' evolutionary history." The other is that "Our modern skulls house a stone age mind." From those I deduce that every modern-day problem can, at best, be reduced to one or more problems of a complexity we humans are "naturally" wired to solve. Thus, there is hardly a guarantee that the solutions found will fit the problem at hand, no matter how hard we try.
But Aren't Humans 'Sapiens'?

There are several objections to this conclusion: 1) The human brain has an almost infinite capacity to learn and over time will attain the capacity to deal with problems of ever-increasing complexity; 2) evolution has not stopped and will progressively cause new wirings in the human brain to whatever is needed to overcome any problems that may arise; and 3) every problem can certainly be decomposed into more simple problems until they reach a level that even a Stone Age mind can handle.

Let me address these objections in reverse order, starting with the one that is least controversial:

3) It does not take much thought to accept that most relevant problems cannot be fully and faithfully decomposed into a consistent logical tree of underlying problems. The accuracy of such decomposition is limited by the specialized logic of our thinking machinery as it is dedicated to a problem set typical of an ancestral lifestyle only. The scope of this natural logic covers only a small part of the domain that mathematical logic encompasses, which in itself is not even sufficient to embrace the complexity of most issues pertinent to our modern times. 

Moreover, even without these logical and neurophysiologic limitations, it is not very likely that the broken-down problems can be solved within a relevant time frame. Doing so would mean that humans could, within a couple of decades, revert to the conditions of existence under which the ancestral solutions worked, even though it took millions of years for humans to evolve. Imagine we could decompose the sudden surge of the Islamic State to problems at the individual level, such as deprivation of attachment as an infant, lack of examples of trustworthy parent figures as toddlers, underdeveloped confidence to master life enough to build a future as an adolescent, etc. Even if these are not the problems at cause, they illustrate how desperately hard each of them would be to solve.

2) Of course evolution has not stopped. We know that since the relatively recent Neolithic age, man has evolved to a lighter skin color with dwellings at higher latitudes, to preserve an active lactose digestion enzyme when growing up as a pastoralist, to deform red blood cells to resist malaria in swampy areas, and so on. Not only has the human genome evolved, but so, too, has the complex microbiome that has co-evolved with our species — that is, the fauna and flora of microbes, viruses and fungi that have called the body of Homo sapiens, inside and out, their home. And yes, the brain was also the site of numerous mutations since. But these persisting mutations may not necessarily have upgraded the brain in a direction we would interpret today as beneficial. For example, the average weight of the human brain has dropped up to double-digit percentages since the dawn of systematic agriculture. This does not help to defend the argument that the brain has become smarter since then. In addition, there are statistical trails that suggest that during some periods since the Neolithic Revolution, regional selective pressures have even increased the rate of evolution. Despite all these considerations, what is fundamental is that the time frame in which evolution has an impact is much, much longer than the rate at which Neolithic-style problems that are thrown at humankind demand solutions that have never been tried.

1) And then there is this exaltation with the prowess of the human mind. Biologically, this feature is just one of many in the broad lineup of leaves that make up the edge of progress of the tree of evolution, which covers the more than 10 million extant species. In this context, this cerebrocentric posturing makes as much sense as a giraffe bragging about its long neck. Of course, this comparison is hard to accept when we praise our kind for producing the works of Shakespeare, the construction of the Great Wall, putting man on the moon, the building of great civilizations and so on. Still, evolutionary psychologists would argue that these are mere expressions of extensions of innate abilities within limits set by the earlier natural selection process. Thus, there is nothing wrong with being proud, per se, if it were not that our appreciation seems quite biased and strongly tends to ignore or downplay the various adverse effects associated with or leading up to most of these feats.

As a matter of fact, this lack of objectivity toward man's own accomplishments is probably another good example of those very limits.
The Human Brain's Actual Capacities

Man has also evolved a relatively sophisticated mental model of naive mechanical physics. It is easy to argue that this would come in handy in hunting prey with, for example, bow and arrow. This "talent" easily shines through in modern times as well. For example, if asked to run the 100-meter sprint in 10 seconds flat, it is quite obvious to most that only the top athletes can reach such speed. If asked to run the same distance in 5 seconds, most would readily recognize that this does not seem to be humanly possible. In a similar vein, if asked to learn Sanskrit or to unify the theory of general relativity and the quantum field theory by tomorrow, most will quickly agree that this is not feasible even for the most intelligent among us.

But if asked whether it is within humankind's capacity to assess the risk ramifications of very complex systems — such as the exploitation of nuclear energy, the setup of the worldwide economic and financial system, or the human effect on global warming — most would agree that these topics are, eventually, within the grasp of the human brain, if only given more time and more staff. Considering what the human brain was really programmed to handle and the bewildering intricacies of the systems involved in those examples, this faith can only be a manifestation of über hubris. The proof is relatively straightforward. I am sure most of us remember the statements of confidence before and the statements of sheer surprise during the most recent economic crisis. There is already evidence of a collective memory selective against the causes put forward for these catastrophic events and of a return of optimism toward pre-crisis levels. Many of us are not embarrassed playing the lottery or casino games despite the mathematical certainty of losing, on average. That is our ancestral brain at work. Man's innate mental model for statistics did not require that level of sophistication. But in comparison, this is just a fait divers concerning the seemingly boundless faith we have in the human brain to deal with matters that clearly supersede its intellectual capacity by several orders of magnitude. 
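The "mathematical certainty of losing, on average" mentioned above is just an expected-value calculation that the ancestral brain does not run intuitively. A minimal sketch with purely illustrative numbers (the ticket price, jackpot, and odds below are hypothetical, not any real lottery's figures):

```python
# Expected value of a hypothetical lottery ticket (illustrative numbers only).
TICKET_PRICE = 2.0                    # dollars
JACKPOT = 100_000_000                 # dollars
WIN_PROBABILITY = 1 / 300_000_000     # chance of hitting the jackpot

expected_winnings = JACKPOT * WIN_PROBABILITY      # about $0.33 per ticket
expected_value = expected_winnings - TICKET_PRICE  # negative: a loss on average
print(f"Expected value per ticket: ${expected_value:.2f}")
```

However the hypothetical numbers are varied, any lottery that pays out less than it takes in has a negative expected value for the player, which is exactly the point the author makes about our innate statistical intuitions.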

Let me propose a number of reasons for this phenomenon. First, the brain has evolved to understand the particular world of the hunter-gatherer. It has not developed a capacity to understand a very different world. Such a world will be understood only in terms of patterns the brain recognizes as typical for the world it does understand. The fit, if any, can only be coincidental.

Second, the brain senses its environment according to the model it has evolved to understand. The interpretation of the signals coming from the senses is, so to say, preloaded. The brain, to work at all, therefore cannot withhold judgment of interpretation while sensing. The brain must provide itself an explanation at all times, even in the most artificial and unrealistic situations. For example, the night sky is littered with innumerable stars. Some are brighter than others. The brain cannot refrain from ordering the brightest among them into patterns, drawing imaginary lines to make up Zodiac signs that refer to familiar images. The brain abhors a vacuum of explanation.

Finally, the human organism, like any organism, is driven by an "elan vital," or a "vital force." This is more an interpretation of the biological expression of the laws of thermodynamics that inexorably unfold in the universe than a magical form of energy. This is also not to be confused with the inborn mechanisms for fight or flight to preserve one's life when in danger. These situations are part of the conditions of existence man is readied to deal with. The vital energy, however, is expressed in the innate expectation that man fits his conditions of existence and that man will thrive, at least, in the form of the social aggregate man typically lives in. This means that man's biological and social needs are translated in feelings of "soon to be satisfied." These drive human behavior to fulfill these needs until feelings of sufficient satisfaction are reached. And overall, this fulfillment is within reach, day in and day out, from season to season, from ancestors to descendants. It is cause for a general sense of optimism. The brain, however, has no means to deal adequately with living conditions that hold insufficient promise of a future for generations. A fight-or-flight reaction to danger that would ultimately become impending is likely completely inappropriate for the complexity of the real situation at hand. The brain abhors a vacuum of destiny. Depending on the particular stage of dis-ease, the brain may ignore the vacuum and whistle in the dark; it may fill in the vacuum with its own "wishful thinking"; or it may turn this vacuum to an existential fright.

The same principles are at work in dreams. One of the evolutionary psychological theories on dreams, the activation-synthesis theory, posits that "there is a randomness of dream imagery, and the randomness synthesizes dream-generated images to fit the patterns of internally generated stimulations." In other words, emotions flood the higher neural circuits, and the neocortex scrambles to interpret the myriad pulsating trigger trails. To do that, it captures the most readily available images, related or not, from short- and longer-term memory stores and combines them as quickly as the feelings unfold into a story, any story if it must. No wonder many dreams appear weird when recounted upon waking. But this has important implications for interpreting dreams. Instead of engaging in an interminable wild-goose chase, delving for magical meanings through the most unusual combination of images and correlating them with everyday events in someone's distant past, recent past and — the extrapolation is quickly made — future, it is much more revealing to ask about the predominant emotion during the dream and the progression of that feeling during the unfolding of the made-up story. That is the core of the dream. The story is only chatter, albeit in the foreground.
Disentangling Our Analysis of the Islamic State

When studying the Islamic State phenomenon and its ilk, the same principles — being aware of humans' hunter-gatherer mentality, knowing that humans' environmental sensing is preprogrammed and understanding our species' "elan vital" — can be applied. This works on at least two levels: on the action of the individual fighters themselves but also on the reaction of the world feeling under threat. The current flood of analysis available in the global infosphere contains very erudite explanations and powerful conceptual placeholders to come to rest from the mental exhaustion of navigating the intricacies of the many possible cause-and-effect chains. But in the same vein as with the interpretation of dreams, the "primal" questions are not even close to being treated as extensively as warranted. A couple of obvious ones: To what emotion must a Homo sapiens be brought that it results in triggering one's explosive belt or in shooting in cold blood each one of a row of other Homo sapiens taken prisoner?

Take a paradigm such as the theory of the tectonic plates. It gives a coherent explanation for the particular position of volcanoes over the globe and of regions with a high risk of earthquakes. As useful as it is — for planning communities and evacuation routes, for example — this theory is still insufficient to precisely predict the majority of actual eruptions and tremors. Tracking the emotional magma flows underlying the Islamic State's emergence also remains insufficient to predict the occurrence of the next outbreak of barbaric violence reliably enough to prevent it.

But the analysis of this daytime nightmare proves useful because it separates the chatter from the core and applies it at the level of the individual and of the group. At the individual level, the chatter is made up of the complex of narratives the different stakeholders, perpetrators and victims put forth to make sense of it all, each from the perspective of his or her own culture and subculture. The core consists of the conditions of existence that were so overwhelmingly discordant with those the human genome was prepared for that it triggered this series of dramatic events. The efforts to improve those conditions are much more to the point than efforts to debunk the different narratives. And the conditions that need to be improved in this case are more in the sociological tier than in the economic tier.

At the group level, the analysis remains very grainy. The collection of gut feelings of the group's members percolates up through multiple layers of aggregation and along various sinuous paths. Even with unique mega-events like the "Je suis Charlie" march in Paris of last weekend, it is not clear if present pyroclastic clouds are cloaking the birth of a new supervolcano. Whatever the outcome, the pent-up geopolitical pressures are real and will need more than an impromptu Twitter message to rally people.

In future columns I will discuss a number of those conditions of existence that are specific to the human species and are required for healthy development. These play an important role in the elaboration of moral and social rules and conventions that make up the organizational matrix of civilizations, small and grand. In the stride of the Human Genome Project, it is high time to give these principles the prominence they deserve in the redesign of our social matrix. Recycling the staggering emotional energy released in the aftermath of recently publicized savageries would be a means to mourn the dead in Paris and an excellent endeavor, lest the tragedy and its global response be in vain.

Read more: What Drives People to the Extreme | Stratfor
Title: Human directed evolution
Post by: Crafty_Dog on May 29, 2015, 07:43:48 PM
Building with Biology
Several of my newsletters in the last few weeks have reported on a recent trip to California during which I visited Google, Facebook, and Udacity--remarkable companies undertaking projects with the potential to change the world. But of all the fascinating experiences on the trip, the best might have been the visit to my friend Lynn Rothschild’s lab at NASA's Ames Research Center in Mountain View, California.
 
Lynn and her students are developing projects that blend biology and technology in mind-bending ways. As a synthetic biologist and astrobiologist, Lynn studies the building blocks of life. She thinks both about where life might exist on other planets--the clouds of Venus, for instance--and about new ways to assemble those building blocks here on Earth. The latter effort holds amazing potential for practical applications, discoveries that could change our lives and the materials we encounter every day.
Lynn coaches the Stanford-Brown team in the international iGEM challenge, a competition for students to create "bio-bricks"--useful DNA sequences that can be inserted into cells to give them certain desirable properties, like water resistance or tolerance to high temperatures. The idea is that bio-bricks, like a kind of DNA LEGOs, could be assembled into basic living organisms or materials that could be useful to humans.
 
One example might be engineering a cell that generates cotton fibers. Assemble the right combination of DNA, and there could be a way to produce whole pieces of cloth in a factory setting (rather than growing cotton in a field and weaving it on a loom). Another idea--the team's 2013 entry--is BioWires, which embeds individual atoms of silver into strands of DNA, resulting in nanowires that conduct electricity.
In 2012, Lynn's team took genetic features from a variety of organisms in harsh places on Earth--life surviving in extreme cold, or low oxygen, or with high radiation, or almost no water--and assembled them into one tough bacterium that could potentially survive on Mars. They dubbed it the “Hell Cell.” Those features, in theory, could be paired with still more genetic features--the thread production, for instance--and sent to Mars to replicate and grow ahead of a human mission to the planet.
 
Last year, Lynn challenged the team to solve a problem her NASA colleagues had experienced here on Earth--losing scientific sensing equipment in delicate environments, potentially polluting them. Lynn's suggestion to her students was to build a biodegradable drone. The team, which in 2014 included Spelman College, proved up to the challenge: they used a dried fungus for the body instead of plastic, and added proteins from wasp saliva to make it waterproof. The team believes they'll eventually be able to print the circuitry right onto the body in silver, and then find ways to power biological motors.
The team’s project this year is still a secret, but it’s even more intricate.
 
It was a privilege to see the pioneering work Lynn and her students are doing in the lab, with applications from medicine to materials. It was a great reminder after visiting three of Silicon Valley’s most innovative technology companies that a better future will come not just through breakthroughs in computing and communication, but through advances in biology as well.
Your Friend,
Newt
Title: The New Chimpanzee Review
Post by: Crafty_Dog on March 11, 2018, 04:32:24 PM
‘The New Chimpanzee’ Review: Mysteries of the Chimpanzees
Unusual among nonhuman primates, male chimpanzees are considerably more social than females.
By David Barash
March 9, 2018 4:14 p.m. ET

Ours is not really a planet of the apes. Rather, it is a planet overwhelmingly populated by one ape species: us. The other “great apes” include chimpanzees, bonobos, gorillas and orangutans, none of which are abundant.

There are many reasons to be interested in these creatures, not least that they are fascinating members of life’s panoply, worth knowing, observing and preserving for their own sakes. Long before biology’s evolution revolution, people recognized kinship with them—and with chimpanzees in particular. Regrettably, all of the great apes are now at risk of extinction, us included. It would not be in our interest to let the chimps fall where they may.
The New Chimpanzee

By Craig Stanford

Harvard, 274 pages, $35

There is something undeniably human-like about chimps, and chimp-like about humans, all of which is to be expected given that we share nearly 99% of our nuclear DNA with them (and with bonobos). Moreover, all three species—humans, chimps and bonobos—are more closely related to one another than to gorillas or orangutans. This fact has led Jared Diamond, of the University of California, Los Angeles, to label Homo sapiens the third chimpanzee. It has also led biologists, even before DNA sequencing was routine, to spend a great deal of time studying chimps.

The pioneer researchers in the field include Jane Goodall and three Japanese scientists little-known in the West but renowned among primatologists for their work primarily in the 1960s and ’70s: Junichiro Itani, Kinji Imanishi and Toshisada Nishida. Since this early work, our knowledge of chimpanzees has continued to expand thanks to an array of doughty field workers. Among the most productive has been Craig Stanford, whose book, “The New Chimpanzee,” is suitably subtitled “A Twenty-First-Century Portrait of Our Closest Kin.” Mr. Stanford began studying chimpanzees at Ms. Goodall’s now-famous Gombe Stream National Park in Tanzania more than three decades ago.

Mr. Stanford, a professor of biological sciences and anthropology at the University of Southern California, is a talented and fluent writer as well as an accomplished researcher. “My hope,” he writes, “is that readers will appreciate chimpanzees for what they are—not underevolved humans or caricatures of ourselves, but perhaps the most interesting of all the species of nonhuman animals with which we share our planet. The gift of the chimpanzee is the vista we are offered of ourselves. It is a gift at risk of disappearing as we destroy the chimpanzees’ natural world and drive them toward extinction.” I would add that the most valuable component of that vista is the glimpse we get not of ourselves but of those chimps for their own sake.

Researchers have unearthed remarkable cognitive abilities among chimpanzees, but such discoveries have been made using captive animals, either in labs or zoos. The findings of Mr. Stanford and his colleagues involve studying these animals in their natural environments, which is the only situation in which they can reveal the diversity and depth of their behavioral repertoire, notably as it reflects the impact of ecological cues (especially the location of fruiting trees) as well as the presence of competing social groups.

Ms. Goodall discovered that chimps use simple tools (including sticks for “fishing” termites out of their mounds) and occasionally hunt, ritually sharing the meat thereby obtained; they also engage in a form of intergroup aggression sometimes called (misleadingly, since it is altogether different from the human phenomenon) warfare. Mr. Stanford’s book expands upon what we have learned in the four decades since Ms. Goodall first began her field research. His chapter titles provide an outline.

In “Fission, Fusion, and Food,” we learn that the earlier conception that chimps live in chaotic, ever-changing social groups is not valid. Rather, they occupy “communities” whose constituents sometimes combine, sometimes split up, and are always influenced by the availability of food and estrus females. Unusual among nonhuman primates, males are considerably more social than females. “We now think,” Mr. Stanford writes, “that male cooperation is based mainly on the shared benefits of working together, with kin selection playing some role as well.” Elsewhere, he writes that alliances “tip the balance away from more powerful, lone actors in favor of lower-ranking males who team up briefly. In my own field studies, there was always a single alpha male, but his power at a given moment was highly dependent on those around him.”

In “Politics Is War Without Bloodshed,” a version of Clausewitz’s maxim that war is politics by other means, the reader is granted insight into the ways in which chimps—especially those highly social but no less scheming males—achieve dominance and, with it, reproductive success: “Some males are inveterate social climbers, cleverly serving their own ends by ingratiating themselves with high-ranking males and females. Others rely more on brute intimidation, which does not necessarily carry the day. And then there are males who seem to care little about their social status and are content to live out their lives on the edges of the struggle.” Of the nearly trite concept of alpha males, Mr. Stanford writes that “the most famous of all alphas in recorded chimpanzee history,” a chimp named Mahale alpha Ntologi, who was observed in Tanzania, “shared meat liberally as he rose in rank. But . . . once he had achieved alpha status, his generosity dropped, and he began sharing meat mainly with those whose political support he still needed most.” To his credit, the author refrains from pointing out human parallels.

The chapter “War for Peace” is a riveting discussion of intergroup aggression, in which males band together to ambush and occasionally raid neighboring groups, often with gruesome and lethal results. Chimps are the only primates, other than humans, that routinely kill members of the same species over access to resources. Adult females as well as males sometimes commit infanticide. As Frans de Waal demonstrated in impressive detail, chimps also engage in ritualized postconflict reconciliation—at least in captivity.

When it comes to “Sex and Reproduction,” things are comparably contradictory, with a degree of sexual free-for-all combined with exclusive consortships, in which females may cycle rapidly between apparent promiscuity and genuine sexual choosiness. We also learn about hunting, a cooperative endeavor whose goal (at least for males) appears to be enhanced mating opportunities as well as coalition-building. “We know that males use meat for a variety of political purposes. . . . One aspect of male manipulation of others was the use of meat to entice females to mate with them.” Also notable is the cultural transmission of certain behaviors, especially the use of tools to obtain food.

Despite its relative brevity, “The New Chimpanzee” is a remarkably thorough account of our current knowledge about free-living chimpanzees. Although it is tempting to try to use this knowledge to better understand human evolution and human nature, in many respects--notably, their inclinations toward violence--chimps are quite different from us. They may fight over females and territory using their hands and teeth, but we fight for many additional reasons and with weapons, from clubs and knives to nuclear arms.

My own inclination, when considering chimpanzees or any other animal, is to follow the advice of the early-20th-century naturalist Henry Beston: “The animal shall not be measured by man. In a world older and more complete than ours they move finished and complete, gifted with extensions of the senses we have lost or never attained, living by voices we shall never hear. They are not brethren, they are not underlings; they are other nations, caught with ourselves in the net of life and time, fellow prisoners of the splendor and travail of the earth.”

—Mr. Barash is an emeritus professor at the University of Washington. His next book is “Through a Glass Brightly: Using Science to See Our Species as We Really Are.”
Title: The evolutionary biology of sex
Post by: Crafty_Dog on September 11, 2018, 09:15:59 AM


https://www.youtube.com/watch?v=iOd0hnmFR3c
Title: Popular Science: Things evolving around humans
Post by: Crafty_Dog on October 13, 2018, 11:37:09 AM
https://www.popsci.com/things-evolving-around-humans?CMPID=ene101318#page-13
Title: Same sex mice reproduction
Post by: Crafty_Dog on October 13, 2018, 11:40:06 AM
second post

https://www.popsci.com/same-sex-mice-reproduction?CMPID=ene101318
Title: Re: Evolutionary biology/psychology
Post by: ccp on October 13, 2018, 03:37:18 PM


https://www.popsci.com/same-sex-mice-reproduction?CMPID=ene101318

how beautiful
Title: Re: Evolutionary biology/psychology
Post by: Crafty_Dog on October 14, 2018, 11:23:31 AM
Did you notice the previous post?

 :-D
Title: Endocrine Disruption
Post by: Crafty_Dog on November 11, 2018, 12:39:29 PM
Let's make this the thread for this very important subject:

Pulling together previous references here on this forum we have:

=================

from 2006:

I FOUND THIS ON www.mikemahler.com. check it out.

A Conversation with Dr. William Wong on Training, Testosterone, Growth Hormone, Acting like A Man, and Rites Of Passage

, , ,

MM: Testosterone is a big topic these days, and low testosterone seems to be more and more prevalent in men. What are some of the common factors contributing to low testosterone levels?

DW: The low testosterone levels we see in guys these days are due to a few overlapping factors, all relating back to one thing - estrogen. Many baby boys born since 1973 have had soy formula with its phytoestrogen mucking up their works. A boy aged 6 months to 3 years has the testosterone level of an 18-year-old man! A bottle of soy formula has the equivalent by weight of 5 to 8 estrogen birth control pills in it! Multiply that by the number of bottles fed a day and the estrogen load is enormous! What happens when all the E suppresses a boy's T? It is this T that tells the anterior pituitary to develop a tiny part of itself, and it is that part of the body that tells a boy that he's a guy! In autopsies done on over 3,000 gay men who died of HIV, I believe it was Dr. Lendon Smith, the famous pediatrician, who reported that homosexual men did not have this portion of the anterior pituitary! Since that part develops in early childhood from the combination of testosterone, salt and calcium, and since homosexual men are high in estrogen and DHT but low in testosterone and generally as a group have low serum calcium and low serum sodium, we can see where the problem arises! Transgender support groups have discovered where their dysfunction has arisen from, and now they are in the lead in warning about the dangers of soy and environmental estrogens on the development of children.

MM: Wow that is some pretty scary stuff. Where else are we being bombarded with estrogen?

DW: Next we have all the pesticides, fertilizers, soy in all food, flax, etc. With one form of estrogen atop another acting as endocrine disruptors, we now have two generations of men since the '70s with: smaller penis size, both flaccid and erect, than previous generations; lower testosterone levels; higher estrogen levels; dreadfully lower sperm counts; higher incidences of sexual mental dimorphism (not being sure what sex they are); and, I fear, andropause arriving around 35 or 40 instead of 45 to 50 - all because of the estrogen in their food and environment.

On sperm count: in the 1960s a man was considered fertile only if he had over 100,000 sperm per ml of semen. Things have gotten so bad that now a guy is considered fertile if he can make a measly 20,000 sperm per ml! Dr. Doris Rapp, MD, the world's leading environmental doc and pediatric allergist, has looked at the data and predicts that by 2045 only 21% of the men on the entire planet will be fertile. In all of Africa, Europe, Japan and many other countries, deaths exceed births. This will have a devastating effect on world economies as pensioners drastically outnumber those paying into pension plans! I call this the Zardoz effect, after the old Sean Connery movie where he was the last fertile man on earth.

MM: I have observed a scary trend of men being more effeminate. I wonder how much attitude and confidence has to do with T levels? I often hear men talk about how they need permission from their wives or girlfriends just to spend money they have earned or go out. Would you say that men who are dominated by their wives or girlfriends have lower T levels?

DW: I'm going out on a limb here, and I'll likely get hate mail for what I'm about to say, but here goes. Guy-gal relationships and marriage are a 50-50 proposition. When men are dominated by their wives, they have allowed this to happen, and it may show low T levels - or they are just "pussy whipped". If the gal is very beautiful and desirable, the type of goddess men would kill for, then being "pussy whipped" is understandable. (I'm married to one of those types of gals!) Most women who denigrate their men are nowhere near that good looking! Not even close! There are far too few of those women around to account for all the "nebbish" men on the planet! Since T levels are going lower and lower at earlier ages, I expect we'll see even more nebbish men on the planet soon. Andropause (male menopause) is now hitting at 35!

MM: What are the reasons for women bossing their men around? Is it just due to the fact that these men have low T levels, or are they simply whipped?

DW: I have noticed that women boss their men around when they lose respect for them. As providers, as pillars of strength, as builders, as the "hunks" these gals first married - the gleam is gone, the warts are showing, the dreams have deflated, and the reality of whatever failings the guy has is apparent. This is when the bitterness of a woman's disappointment shows in her attitude, by taking on his role as head of household. Having seen this many times over my 5-plus decades, I can positively make that statement.

One thing that really gets to a woman is when she is not the center of her man's world. Any guy who still prefers to hang out with his friends, go drinking, watch sports, or has not grown up is going to lose a woman's respect right quick. In teaching martial arts, med school and exercise, I've told many a "boy man" to grow the f--k up and act responsibly. To gals I offer this advice: marry an ex-serviceman or a fellow who's had a really hard life and has had to work to make it, as these guys are more in touch with being responsible, disciplined and productive than the "bad boys" who still act like they're in school. Women unfortunately love and are highly attracted to the "bad boys," and their astonishment that the bad boys continue to be bad boys after the vows are said and the rings go on astonishes me. What did they expect!

MM: Interesting points. A friend once noted that there is no rite of passage for boys into men in the modern world. How does that play into things?

DW: As the poet Robert Bly pointed out in his book "Iron John", men change from boys to men by a process of initiation. Hard experiences, long suffering, deep teaching from wise elders, military boot camp - if they have taught persistence, discipline and responsibility, all qualify as an initiation. American Indian boys went into the field to hunt, fend for themselves, gain deep understanding of what they could or must accomplish, survive, and hopefully gain a spiritual insight. This changed them deeply and made them men, worthy to stand with the warriors and builders. We live in a matriarchal society where fathers and grandfathers no longer guide their sons down the path to manhood. Men are no longer ritually initiated into manhood. The harshness of survival, grand effort and spiritual awakening is looked down upon as primitive, and so we have a world filled with irresponsible "boy men".

MM: What about women?

DW: We now have a generation of daughters of the "Liberated Women" of the '70s. These gals don't have a clue what it means to make a household, tend to a family, make a meal or raise children. They know corporate politics, restaurant reservations, and day care from birth. These women have grown up in dysfunctional families where mom was a dominating closet lesbian, and their images of male/female relationships are extremely skewed. These daughters have not been initiated into womanhood.

I don't have a clue as to how to fix that, except to tell men to choose wives from women who have had good moms, because even though most women hate hearing this fact, it is very, very true: by their late 30s and into their 40s, all gals turn into their mothers!

But back to an earlier point: increasing a man's T level, increasing his assertiveness, being the pillar of strength, being a good provider, noticing the gal, giving the gal better orgasms - all can help improve a gal's impression of her man in some relationships. Other relationships, which are toxic, just need to be walked away from before they drive the guy crazy. By the way, "Iron John" is a MUST READ for every MAN. Boy men need not read it. Nuff said.

MM: Let's get into specifics. What can be done to increase T levels?

DW: Testosterone levels in men and women decline from 27 onward, and seriously decline from 35 onward, until by 40-45 most men are estrogen dominant and have more estrogen floating around their bodies than their wives do! Since all of our drive - mental, physical and sexual - is derived from testosterone, and since the spark that keeps us interested in life and enjoying it is derived from testosterone, it behooves us not to succumb to nature's planned obsolescence and let ourselves get E dominant and T deficient!

A few things that can be done to naturally raise one's own testosterone levels are:

Libido Lift herbal capsules: 4 caps, 3 to 4 times daily.

Doctors Testosterone Gel (it has no real testosterone, but has herbs and homeopathics that stimulate our own production): two to three applications of the gel daily, especially before bed and in the early afternoon (since we make T twice daily, between 2 and 4 AM and 2 and 4 PM). It can also be applied some 20 to 30 minutes before training or sex.

Maca powder: This South American root is kin to a turnip but tastes like butterscotch. It has plant sterols that are precursors to both testosterone and progesterone, the good hormones, and has diindolylmethane (DIM) to block estrogen from tissues. Three to six teaspoons of the stuff a day should be the minimum. The capsules of this stuff won't work, as they don't contain enough maca to make a difference, regardless of how "extracted and concentrated" they claim to be.

These three supplements in combination work very well to elevate T levels in those whose pituitaries and testicles still function to make hormones.  All of these supplements are available at www.docsprefer.com

MM: What about dietary advice for increasing testosterone levels?

DW: The only dietary advice I can think of offhand to increase T levels is: don't let your cholesterol get below 180. The body stops making hormones then. In India, where most are vegetarian Hindus, milk and eggs are dietary staples to increase the intake of animal fats, which are some of the best sources of cholesterol from which to make hormones.

Eat a lot of maca. This Andean butterscotch-tasting turnip has the plant sterols that are immediate precursors to testosterone and progesterone, and it also has diindolylmethane to block estrogen use by the tissues. In Peru it is used to increase fertility and libido, which are both functions of testosterone. By the way, men do need progesterone; it blocks the conversion of testosterone to estrogen, and blocks both the T and E from becoming dihydrotestosterone, the hair-loss and swollen-prostate hormone. Consume at least three to six teaspoons of maca every day. In South America maca is put into baked goods - cookies, breads and cakes - into stews, and taken plain. I drop a teaspoon of the powder in my mouth and drink water to chase it down.

MM: How important is growth hormone for health and well-being?

DW: Funny you should bring that up. I just heard today about a study showing that an increase of 12% in IGF-1 levels is equal to adding 10 years to your life!

MM: Wow! Besides decreasing lifespan, what else happens when IGF-1 gets low?

DW: IGF-1 is a wonderful anti-aging, muscle-sustaining hormone that gets low with high stress levels. There is some controversy about using IGF-1 or HGH (which releases IGF-1). The geriatric docs say IGF-1 can cause cancer. The anti-aging MDs say hogwash. The final word comes from the oncologists, who use IGF-1 to fight cancer. It used to be that we would expect to see lowered IGF-1 in a person 35 to 40+. These days there are 20-somethings with very low IGF-1! IGF-1 gives not only muscle and bone mass but also increased immunity and greater mental power, and maintains brain and internal organ size (which shrink and become fibrotic with age; read my article "Fibrosis: The Enemy of Life" at www.drwong.us to find out why and how). Having IGF-1 levels go south in one's 20s is nothing but bad and will likely take decades off the lives of the X'er and Y generations unless changed. Already I've seen my boomer generation come down with things like strokes and heart attacks in our 40s that should not have happened till our 60s. So what will happen to the X'ers and Y's in their 40s if the trends for the good hormones (i.e., testosterone, progesterone, IGF-1, oxytocin) continue downward?

MM: Do you recommend GH injections?

DW: The much-touted HGH injections so prized by anti-aging docs are a way of causing the body to release IGF-1, but that's a long and expensive way round the barn - $11,000 to $12,000 a year expensive, to be precise. IGF-1 is abundant in the velvet that covers deer antlers. Male deer shed their antlers every year; the velvet from these can be collected and the IGF-1 extracted. There is a deer farm in New Zealand that has the largest herd of Chinese red deer in the world, and this is where all of the IGF-1 sublingual spray products are made, regardless of who puts their label on it. It's all from the same source; www.nowfoods.com sells their IGF-1 sublingual spray for about $25, a full fifty to sixty dollars less than most of the other folks who carry the product.

MM: Blood pressure seems to be on the rise at a rapid pace. What advice do you have for lowering blood pressure?

DW: On lowering blood pressure I have a two-pronged approach:

a) Take systemic enzymes to lyse away the fibrin clogs that plug up the microcirculation and reduce full circulation to the extremities (peripheral vascular resistance). PVR is like having high pressure at the kitchen tap when all the other water taps in the house are closed.

b) Do strength training and build miles and miles of new blood vessels. This better feeds tissue as well as reducing peripheral vascular resistance further.

Between the two, it's like opening all the water taps in the house: pressure at the kitchen tap goes down. It must be said that there are two causes of high blood pressure: peripheral vascular resistance and kidney damage. When this technique does not work, we know the patient has a good bit of kidney damage and that is the cause of their higher BP.

MM: What about taking CoQ10?

DW: CoQ10 is essential for heart health, as is vitamin E. For CoQ10, the dose should be equal to the person's age in decades, or 150 to 300 mg daily if there is heart pathology. On vitamin E, there has been much junk medical science made by drug companies to disprove the effectiveness of vitamins so they can sell you their expensive drugs instead. 1,200 to 1,600 IU of E are needed daily, as well as the heart's favorite mineral, magnesium. Without it you'll not only have constipation, night cramps, muscle spasms and a buildup of calcium in arterial plaque, but in extreme magnesium deficiency you'll get irregular heartbeat (arrhythmia). Of magnesium we need 1,000 to 2,000 mg daily. In some folks this can cause the runs, so they can use magnesium glycinate, the only form of the mineral that does not cause loose stools.

MM: Recently you came out with a book on sexual health. How is your book different from other books on the market?

DW: Most books on men's sexual performance are written by non-experts, guys like "Big Joe From Brooklyn", and the text covers nothing scientific or medical but reads like porn. Other books on sexual performance are so full of fluff and needless, useless prattle that out of 100+ pages the real advice or techniques come in the last 5 pages of the work. My men's pro-sexual book "The Care and Feeding of a Penis" has no porn or nude male pictures, is filled with immediately useful information in every chapter, and - from penis size to sperm count, from Peyronie's to erectile dysfunction - there is something to benefit every man from 27 to 97! It is the user's manual we should have come with! To make it accessible worldwide without having to pay the VAT (taxes) usually imposed on book imports, we have offered the book as a downloadable e-book. It's available from www.drwongsbooks.com

MM: Well this is certainly going to be a controversial interview to say the least. Thank you for taking the time to do the interview and keep up the great work.

DW: You are very welcome and I look forward to talking to you again. I would like to invite your readers to check out my website: www.drwong.us.

===========================================


https://thenutritionwatchdog.com/what-is-your-plastic-footprint/

===========================================


This may offer leads

https://endocrinedisruption.org/interactive-tools/endocrine-basics



Title: Economist: Polygamy breeds Civil War
Post by: Crafty_Dog on November 16, 2018, 06:42:28 PM
https://www.economist.com/the-economist-explains/2018/03/19/why-polygamy-breeds-civil-war?fsrc=scn%2Ftw%2Fte%2Fbl%2Fed%2F%3Ffsrc%3Dscn%2Ffb%2Fte%2Fbl%2Fed%2Fwhypolygamybreedscivilwartheeconomistexplains&fbclid=IwAR1luYBPbV1oDH2p7dnBBe3HTEAq7s8LuNG9o91L5gG9_m0U8LRWSlMP6cI
Title: Re: Economist: Polygamy breeds Civil War
Post by: G M on November 16, 2018, 06:51:38 PM
https://www.economist.com/the-economist-explains/2018/03/19/why-polygamy-breeds-civil-war?fsrc=scn%2Ftw%2Fte%2Fbl%2Fed%2F%3Ffsrc%3Dscn%2Ffb%2Fte%2Fbl%2Fed%2Fwhypolygamybreedscivilwartheeconomistexplains&fbclid=IwAR1luYBPbV1oDH2p7dnBBe3HTEAq7s8LuNG9o91L5gG9_m0U8LRWSlMP6cI

Funny how islamic marriage and shariah law are avoided in the above article.
Title: Big Bird eats Neanderthal Child
Post by: Crafty_Dog on November 23, 2018, 12:30:53 PM
https://www.thevintagenews.com/2018/11/23/neanderthal-child/?fbclid=IwAR0-0yeLCEuANJa-4aoO6DJZFq2pMBnUjXGe4huPkFSUa4bK9K4Zh7zQnwo
Title: Re: Big Bird eats Neanderthal Child
Post by: G M on November 23, 2018, 03:45:24 PM
https://www.thevintagenews.com/2018/11/23/neanderthal-child/?fbclid=IwAR0-0yeLCEuANJa-4aoO6DJZFq2pMBnUjXGe4huPkFSUa4bK9K4Zh7zQnwo

This explains why "Sesame Cave" never really became popular.
Title: all things muscle
Post by: bigdog on January 09, 2019, 02:17:33 PM
https://www.popsci.com/build-muscle-faq-exercise-experts#page-4
Title: The Goodness Paradox
Post by: Crafty_Dog on January 27, 2019, 12:06:52 PM


‘The Goodness Paradox’ Review: The Benefits of Good Breeding
Humans are peaceful compared to some of our closest primate relatives. Did we domesticate ourselves?
Two Eastern chimpanzees in Mahale Mountains National Park, Tanzania. Auscape/Getty Images
By John Hawks
Jan. 25, 2019 9:45 a.m. ET

An anthropologist at Harvard University, Richard Wrangham is no stranger to wild animals. His long fieldwork with wild chimpanzees in the Kibale Forest of Uganda, and other African field sites, has done much to help scientists see the role of aggression and violence in our close relatives.

Two decades ago, Mr. Wrangham examined chimpanzee violence in his provocative book “Demonic Males: Apes and the Origins of Human Violence” (co-written with Dale Peterson). That book recounts how field primatologists, including Mr. Wrangham at Kibale, began to understand that coalitions of male chimpanzees work together to kill chimpanzees in neighboring groups. Humans, the authors suggested, have a violent heritage, one that is still manifested today.
The Goodness Paradox

By Richard Wrangham
Pantheon, 377 pages, $28.95

The scene among anthropologists debating the thesis of “Demonic Males” was not so different from a face-off between rival chimpanzees. Any suggestion that humans have an intrinsically violent and aggressive nature tends to get people riled. Besides, violence seems to go against so many aspects of human nature. Humans are highly prosocial, cooperative and altruistic. We are kind. If we are so good, how can we be so bad?

In “The Goodness Paradox: The Strange Relationship Between Virtue and Violence in Human Evolution,” Mr. Wrangham probes the deep evolutionary history of human aggression. “We must agree with Frederick the Great: ‘Every man has a wild beast within him,’ ” he writes. “The question is what releases the beast.” “The Goodness Paradox” has a different emphasis from “Demonic Males,” however. Our violent lineage is still the main problem, but here Mr. Wrangham explores the reasons that our species may have partly overcome it.

In pursuing this question, Mr. Wrangham is far from alone. The Harvard psychologist Steven Pinker is the best-known exponent of the idea that human existence has become progressively less violent over time. That concept may seem counterintuitive. Wars, genocides and social conflicts of the 20th century killed millions of combatants and civilians, while people in small-scale human societies seem to live comparatively peaceful lives. But anthropologists studying even the smallest villages and hunter-gatherer groups have recorded murder, revenge killing and warfare. If extrapolated to large industrialized societies comprising millions, even a few cases in a small group would translate to a frightful rate of violent deaths. Mr. Pinker puts such observations together with archaeological evidence of violence in past societies. These data, he argues, say life is getting better—at least for humans.

Mr. Pinker’s ideas have set off almost as much debate among anthropologists as Mr. Wrangham’s did 20 years before. But even assuming that Mr. Pinker is correct about the past 10,000 years of human history, his idea leaves open some very interesting questions. If all humans today live in social systems that greatly restrict our expression of violence and aggression, how did this come about?

Mr. Wrangham follows the Duke University social psychologist Kenneth Dodge and many others in separating human aggression into two types. “Reactive aggression” is the stuff of bar fights, when individuals provoked by taunts start throwing punches or pulling knives. “Proactive aggression” is premeditated, planned, the stuff of careful tactical strikes. The “paradox” of Mr. Wrangham’s title is the distinction between these different aspects of human violence.

To understand why this distinction matters, Mr. Wrangham asks readers to think of an airplane full of people. Hundreds of humans can sit quietly for hours crammed into tiny uncomfortable seats. Hundreds of chimpanzees in such a space would quickly be ripping one another limb from limb. The difference is reactive aggression—extremely high in chimpanzees, low in humans. But humans must implement elaborate security arrangements to prevent a single person from bringing down the plane in a well-planned plot. That’s the threat of proactive aggression, and it’s uniquely developed in humans.

Humans are not alone in being much more peaceful on the whole than chimpanzees. Chimpanzees themselves have a close sister species, the bonobos, which lives in female-dominated social groups. Whereas chimpanzee groups are regularly racked by aggressive interactions, bonobos resolve tension with affiliative behaviors, often sexual interactions. Chimpanzee males intimidate and beat females; bonobo males do not. As the primatologist Frans de Waal has put it, bonobos make love, not war.

Mr. Wrangham describes a “ball game” sometimes played by both male chimpanzees and bonobos, in which two males chase each other around a tree trunk trying to grab each other’s testes. Bonobos have such trust that they sometimes play the game with males from other communities. Chimpanzees have such instant hostility that play between communities would be unthinkable.

The differences between chimpanzees and bonobos have long been a quandary for anthropologists. Neither of them is closer to humans than the other; they are both our closest relatives. As Mr. Wrangham asks, “Why should two species that look so much alike be so different in their intensity of aggression?”

His answer is that bonobos have domesticated themselves. Domesticated animals, like dogs and horses, exhibit huge decreases in aggression compared with their wild ancestors. Humans have induced those changes in our domesticated animals by selecting strongly against reactive aggression. Wolves that could not tolerate the presence of humans didn’t become part of the ancestral dog gene pool. Cattle that attacked their minders were not bred.

The most well-known experiment on domestication was begun in the late 1950s by the Russian researcher Dmitri Belyaev, who selected against aggressive responses to humans in silver foxes, minks and rats. Within a few dozen generations these species exhibited many of the behavioral traits of domesticated species like dogs. They also began to show physical changes. Fox ears became floppy, and spots of color began to appear. These and other changes are part of a “domestication syndrome” shared across many species.

In Mr. Wrangham’s description, bonobos display many aspects of this syndrome. Their brains are smaller than chimpanzees’, a shift also seen in many domesticates. Their skulls are “juvenilized,” with smaller faces and brow ridges. Maybe, he suggests, these changes are tied to their massive reduction in aggression. Such changes, Belyaev and others have claimed, make individuals look less threatening. Then again, adult bonobos are remarkable in their willingness to play. Maybe the juvenilized skull form is a genetic side effect of retaining such juvenile-like behavior.

Bonobos hint that self-domestication might be possible. Could it be that we ourselves are the most extreme example of our propensity to domesticate the creatures around us?

The idea of human self-domestication is older than the theory of evolution. Mr. Wrangham traces the scientific record of human self-domestication to Johann Blumenbach, a German physician and naturalist at the turn of the 19th century. Blumenbach is known for devising a theory of the origin of races by a process he called “degeneration.” It’s a terrible name for a concept that contained the seeds of evolution. Blumenbach claimed that human races had a common origin in the Caucasus region of Asia and then changed when they encountered different environments such as sun, heat or cold.

Whereas Blumenbach tied races to his theory of change, he viewed domestication as part of the nature of all humans, describing us as the “one domestic animal . . . that surpasses all others.” But if humans are domesticated, someone must have domesticated them. For Blumenbach, the only answer was divine intervention.

Since the notion of self-domestication preceded Charles Darwin’s work on evolution, Darwin had a good chance to think it over. He didn’t like the idea. Not many evolutionary biologists have. In his 1962 work on human origins, the geneticist Theodosius Dobzhansky wrote: “The concept of human domestication is too vague an idea at this time to be scientifically productive.”

But times have changed, and a broad array of anthropologists and geneticists have become interested in the idea of human self-domestication again. Some have focused on possible hormonal changes, trying to understand why humans today have skulls that lack the large brow ridges and thicker bone of our fossil ancestors. Others have worked on developmental timing, trying to show that adult humans retain some of the traits of ancient children.
A silver fox descended from those domesticated by Dmitri Belyaev. Photo: Artyom Geodakyan/TASS via Getty Images

These scientists face the same problem as Blumenbach. Domestication may seem to be in our nature, but there’s little solid idea of how it could have happened.

Mr. Wrangham’s hypothesis is that humans have been shaped by a history of coalitionary proactive aggression. People work together to enforce social rules and punish wrongdoing. This may sound like common sense, but it raises deep questions. Some chimpanzees and bonobos exhibit a sense of fairness in experiments on sharing food, which suggests that they share a basic moral sense with us. But humans have a developed morality that goes beyond a mere sense of injustice, with elaborate codes detailing right and wrong behavior. How did this evolve?

Chimpanzees cooperate during tasks like hunting. But when humans began to communicate using language, a much broader array of cooperation became possible. Humans can plan their cooperation in advance, with full knowledge of the reasons why they are pursuing a course of action. In Mr. Wrangham’s account, language places proactive aggression on steroids. One outraged person can quickly become a mob.

Morality sometimes leads to outcomes that would be perverse if individuals acted to maximize their own fitness. A chapter titled “The Evolution of Right and Wrong” begins with the tale of an Inuit mother who strangles her arrogant son rather than have him bring shame to the family. Killing an offspring is the last thing that pure fitness-maximizing evolution would promote. For this reason, scientists have debated the evolutionary foundations of morality. Many have proposed that the value of moral behavior in encouraging group cohesion must have outweighed the reproductive interests of individuals. Such “cultural group selection” might have emerged if war and competition between groups was very important to humans in the past.

But as chimpanzees clearly demonstrate, war and competition between groups are not sufficient to give rise to humanlike morality. Mr. Wrangham instead follows the work of the anthropologist Christopher Boehm, who has suggested that morals emerged as an individual defense against coalitional violence. The way to stay safe in a small human group is to be relentlessly egalitarian. Each person has an incentive to follow a shared moral code because violations give would-be punishers a cause to recruit allies against him.

It is indeed a paradox: as good as humans are, they are sometimes driven to seemingly outrageous acts. By comparing human violence and aggression in detail with that of our close primate relatives, Mr. Wrangham has offered a possible explanation for how our species might have domesticated itself. That makes this book essential reading as geneticists start to unwrap the package of genes that responded to domestication, which may give hints about our own evolutionary history.

—Mr. Hawks is a professor of anthropology at the University of Wisconsin-Madison.
Title: Re: Evolutionary biology/psychology
Post by: G M on January 27, 2019, 12:33:39 PM
We live in a time of prosperity. Take that away and see how quickly things turn violent.
Title: An expert on human blind spots gives advice on how to think
Post by: bigdog on February 04, 2019, 07:17:21 PM
https://www.vox.com/science-and-health/2019/1/31/18200497/dunning-kruger-effect-explained-trump

(I post this as the phenomenon, not about the president.)
Title: Re: An expert on human blind spots gives advice on how to think
Post by: G M on February 05, 2019, 07:03:51 AM
https://www.vox.com/science-and-health/2019/1/31/18200497/dunning-kruger-effect-explained-trump

(I post this as the phenomenon, not about the president.)

Maybe young-adult infotainment site Vox isn't the best source for this, or any other, topic.
Title: Re: Evolutionary biology/psychology
Post by: Crafty_Dog on February 06, 2019, 03:19:12 PM
Be that as it may, is it relevant to the essence of the summary of the point being made?
Title: NR: Evolutionary biology, society, and politics
Post by: Crafty_Dog on March 25, 2019, 01:17:05 PM

https://www.nationalreview.com/2019/03/david-wilson-this-view-of-life-evolutionary-biology-society-politics/?utm_source=Sailthru&utm_medium=email&utm_campaign=NR%20Daily%20Saturday%202019-03-23&utm_term=NRDaily-Smart
Title: Re: An expert on human blind spots gives advice on how to think
Post by: G M on March 25, 2019, 08:46:26 PM
https://www.vox.com/science-and-health/2019/1/31/18200497/dunning-kruger-effect-explained-trump

(I post this as the phenomenon, not about the president.)

So, let's explore the Dunning-Kruger effect and all the "smart people" who fell hook, line and sinker for "Russian Collusion".

Title: Eggs may decide which sperm to fertilize them
Post by: Crafty_Dog on April 19, 2019, 02:42:38 PM
https://bigthink.com/design-for-good/eggs-may-get-to-decide-which-sperm-to-fertilize-them?utm_medium=Social&utm_source=Facebook&fbclid=IwAR2RQBg1CxeBRozxtxvZ5iMXV3rk-FML8U2Fvjk8JcJiA2ZrJNApPkTYKPI#Echobox=1555624783
Title: An ethological approach to reason
Post by: Crafty_Dog on May 13, 2019, 11:42:21 AM


http://nautil.us/blog/the-problem-with-the-way-scientists-study-reason?fbclid=IwAR0I4_cnBrzARrCapxdsOvQlnwX4wPmFKMbcJ7LguWB4QTILidC6t3ezeOg
Title: Orangutan spear fishing
Post by: Crafty_Dog on July 07, 2019, 09:44:55 PM


https://primatology.net/2008/04/29/orangutan-photographed-using-tool-as-spear-to-fish/?fbclid=IwAR10ehQkesWaV9v5cioMkYbYMMLvNy4JNWDNoyo9RXuTD3fQzAHbvQY-54k
Title: Mathematical Challenges to Darwin
Post by: Crafty_Dog on August 08, 2019, 05:45:28 PM


https://www.youtube.com/watch?v=noj4phMT9OE
Title: Men today are weaker?
Post by: Crafty_Dog on August 19, 2019, 09:51:04 AM
https://www.intellectualtakeout.org/blog/study-men-are-getting-weaker?fbclid=IwAR1MKa4bxolkyouXTGcdkthGkDadTdR4fqlAlAt08_cmn3O1TEjrzwV9NEg
Title: Bacteria to Beethoven
Post by: Crafty_Dog on October 25, 2019, 11:46:39 AM
https://www.prageru.com/video/evolution-bacteria-to-beethoven/
Title: The Magic Mushroom theory of human evolution
Post by: Crafty_Dog on December 19, 2019, 08:05:54 PM


https://truththeory.com/2019/12/19/stoned-ape-theory-magic-mushrooms-caused-complex-thinking-in-human-evolution/
Title: 90,000 year old human hybrid remains discovered
Post by: Crafty_Dog on March 09, 2020, 11:21:36 AM
https://bigthink.com/scotty-hendricks/90000-year-old-neanderthal-denisovan-human-hybrid-found-in-ancient-cave?utm_medium=Social&utm_source=Facebook#Echobox=1583640569
Title: Extinct Bird evolves itself back into existence
Post by: Crafty_Dog on May 20, 2020, 09:55:50 PM


https://www.esquireme.com/content/46133-an-extinct-bird-just-evolved-itself-back-into-existence
Title: The Primal Struggle for Dominance
Post by: Crafty_Dog on December 29, 2020, 03:45:06 AM
https://www.city-journal.org/primal-struggle-for-dominance?fbclid=IwAR2VGM7y-EwBZr6-dICej8vC6MJGVlhVy7bVLRcmZNHMw_d0zh4_6wHTlos
Title: ET: Role of Internet and Violent Video Games
Post by: Crafty_Dog on July 17, 2022, 08:57:50 AM


What Is the Role of Internet and Violent Video Games in a Generation of Aggressive and Violent Youths
By Health 1+1 and Marina Zhang | July 16, 2022

On July 4, 2022, a man in Illinois opened fire from a rooftop on Independence Day parade participants, killing five people, with two more later dying of their wounds and many more injured.

The man, soon identified by police, was Robert Crimo III, aged 21.

The mass shooting devastated the close-knit community of Highland Park. Most people knew Crimo III as the son of Robert Crimo Jr., a well-known business owner and mayoral candidate.

The agonizing aftermath has raised endless questions regarding the reasons and motives that led to his devastating action. Motives, however, are always complex issues interlaced with upbringing, environment, and a person’s own unique circumstances.

What is clear is the rising trend of violent crimes committed by younger perpetrators.


The median age of those committing gun-related crimes has been decreasing, falling to 35 in 2018 from 39.5 for handgun shootings. The median age is even lower for assault rifle shootings, at 31 years, and just 21 years for school shootings.

From Salvador Ramos’s school shooting in Uvalde, to Robert Crimo III in Highland Park, to Payton Gendron and the many other young adult and teenage perpetrators of such crimes, we find commonalities among the backgrounds of each individual.

In addition to possible behavioral problems and fatherless homes or sparse parental involvement, another similarity is the obsession with and excessive hours spent on the internet or video games, especially those containing violence.

This curious commonality raises the question of whether violent video games and internet activity could play a role in precipitating such horrific acts of violence.

Violent Media Increases Interest in Firearms
Some researchers suggest that violent media breeds an interest in violence and weaponry.

Dr. Brad Bushman once did an experiment that suggested “violent media was a risk factor for dangerous behavior around real guns.”

In the study, he split 242 children into three groups and had each group watch a video for 20 minutes.

The first group watched a Minecraft video that was violent with guns, the second watched a Minecraft video that was violent with swords, and the third group watched a non-violent video.

The children were then asked to play with toys and games in another room. The room contained two disabled handguns.

Most of the children who watched the video with guns would touch a gun, pull the trigger, and point it at one another. The group of children who watched the video with swords touched the guns less frequently while those who watched a non-violent video touched the guns the least.

From this study, Bushman suggested that “exposure to violent video games can increase a child’s interest in firearms, including shooting a handgun at themselves or others.”

However, such brief gameplay alone is not going to turn a person into a mass shooter or a perpetrator of violent crime. It is a series of complex issues that pushes a person down the path to committing violence.

Before violence comes aggression.

Learning Aggression From Violent Media

Renowned psychologist Dr. Douglas Gentile of Iowa State University has studied violent media and aggression for over 30 years. He likens the relationship between violent media and aggressive behavior to that between smoking and cancer.

“It’s not a simple mechanistic thing, not like you watch a violent movie and then you go do something violent. That’s not the way this works; it’s much more subtle than that.”

“Although there are short term effects [of aggression]…the short term effects usually dissipate after about 20…30 minutes. Just like smoking…that one cigarette does not give you cancer, and you know the effects of it do wear off after an hour or whatever, but if you continually consume it, then each one is increasing the odds of a more extreme result.”

Gentile was not too keen on linking violent media and acts of violence but acknowledged that violence is a rare endpoint of aggression.

Though some people can become aggressive, very few of them will later develop violent outbursts. It is a complex issue.

Nonetheless, aggression can be a learned behavior, trained through exposure to violent media, especially through violent video games.

Gentile defined aggression as the intent to harm, explaining that it can be physical, verbal, or cyber, among other forms, while violence is only physical and is typically more extreme.

His years of research have since demonstrated that long-term exposure to violent media creates very subtle changes in a person’s cognitive response to aggression and violence, training a person to become more aggressive in their behavior and their thinking.

Gentile listed four main effects of violent media that are well established within the psychology research community.

“There’s first of all, what’s called an ‘aggressor effect,’ that the more entertained violence you watch, the more willing you become to behave aggressively when provoked,” Gentile said.

“The second is the ‘victim effect,’ that the more entertainment and violence you see in the media, the more you start seeing…the world as a much more dangerous and scary place.”

The third effect is the ‘bystander effect,’ meaning that we can become more desensitized and possibly even callous to the violence that has been experienced by someone else.

The final is the ‘appetite effect,’ which means that the more violent media we see the more we will want to see it. These changes in our perception also affect our cognition, on how we may respond to aggression and provocation, Gentile explained.

These effects, however, are not one-size-fits-all: different people will experience different effects depending on the content, the amount consumed, and various individual complexities.

Between the two genders, males tend to be more affected by the aggressor, bystander, and appetite effects, whereas females are more influenced by the victim effect.

To illustrate how violent media, especially video games, can train a person to become more aggressive, and in extreme and complex situations to become violent, Gentile described a previous research study he led with students from Singapore.

More than 3,000 students were surveyed over three years on their video game use, the level of violence, the length of gameplay, and how they would respond in aggressive or provoking situations.

It was assumed that responses would reflect the underlying aggressive behavior and cognition of the students.

Aggressive cognition in the study was separated into three aspects: aggressive fantasy (how much one thinks about harming others), hostile attribution bias (the bias to interpret situations as hostile rather than benign), and normative beliefs about aggression (the degree of aggression in a response that a person considers acceptable).

Gentile found that those students who played violent video games, mostly children in primary school, exhibited greater aggressive behavior as well as all three aspects of aggressive cognition.

Gentile explained that violent video games train the bias for hostility, as players are waiting for violence and “are practicing being hyper-vigilant for aggression.”

Video games also reward gamers when they respond to violence with violence, consolidating this aggressive learning. Gentile argued that being exposed to other violent media also amplifies this reward process.

“Of course, the whole time you are consuming violent media, you are rehearsing along with it, aggressive fantasies, so all three of these aggressive cognitions increase among the kids who played more violent video games, and by the end of the study, those kids were being more physically aggressive.”

He gave a hypothetical scenario where a student who plays violent games gets bumped in the school hallway and how this may escalate to a fight because of the learning he or she did through gaming.

Gentile said that the hours of video game training to be vigilant for aggression can make a person interpret such an event as one of aggression or provocation rather than a simple accident.

“That tiny change in perception changes everything downstream from it.”

In games, once a player encounters an aggressive stimulus, the immediate reaction is to turn to the stimulus and respond aggressively.

“Well, the thing that humans do, especially when they’re under stress, is the thing that comes to mind first, well, the thing that comes to mind first is the thing you’ve practiced the most,” Gentile said.

The student’s immediate response may be aggression, such as returning the shove or saying something unkind; however, “that’s not enough to get the kids to do it.”

“There’s kind of a high bar to doing it, because once you do, the odds that this could turn into a real fight skyrocket. But because you’ve been rewarded [in the games], and you’ve enjoyed consuming all this media violence, that bar has been lowered quite a bit,” Gentile explained, highlighting the steps that take virtual aggression to physical aggression.

However, Gentile argued that when aggressive cognition spills into daily life as aggressive behaviors, no one in the moment would connect that to violent games or violent media as the cause of such behaviors.

“Kids aren’t copying [the actions in the games]. That’s not how it works.”

“[Violent media] changes the way we perceive the world and the way we think, and we take the way we perceive the world and the way we think with us everywhere.”

Brain Differences in Obsessive Video Game and Internet Users
Studies examining the benefits of gaming have found that video gamers who play in moderation have better visuospatial skills, which are trained by popular games such as Tetris. Additionally, some people can improve their decision-making and social skills through action games that require teamwork and a fast overall response rate in order to win.

However, research shows that individuals who play video games or use the internet obsessively have reduced brain volume compared with those who do not.

Video game players, and those who spend long periods online, have reduced grey matter (neurons) in the prefrontal areas as well as in many other areas of the brain. The prefrontal area is in charge of complex thinking, decision making, self-control, and impulse control. Loss of grey matter may indicate poorer impulse control, poorer decision making, and impaired thinking.

In the short-term, violent video games have also been shown to reduce brain activation in regions responsible for emotional processes, indicating reduced empathy.

A 2006 study (pdf) of 14 adolescents demonstrated this. The researchers split the adolescents into two groups. For 30 minutes, one group played a violent game and the other group played a car racing game.

Then, the teenagers were asked to match geometric shapes and assign an emotion to photographs of people with different facial expressions.

The researchers observed that their overall reaction times and accuracy were similar but brain scans for children who played the violent game showed reduced emotional processing when interpreting fearful and angry facial expressions.

The group playing the racing game showed robust processing patterns, activating the areas responsible for fear and risk, including areas that exert control over appropriate behavior, such as the anterior cingulate cortex, which is responsible for empathy and impulse control, as well as the areas responsible for facial expression recognition and the visual cortex.

MRI scans comparing emotional processing between children who had just played a violent game and those who had played a non-violent game. (Radiological Society of North America)
However, in the group playing the violent game this processing was reduced, and the regions responsible for empathy, impulse control, and some control of fear processing were deactivated.

Long-term gaming and prolonged internet use are unhealthy and cause long-term changes in brain matter density.

A study comparing men who gamed with men who did not found that gamers had a reduction of grey matter in the right posterior cingulate gyrus (motivation, top-down control of visual attention), the left pre- and post-central gyri, and the right thalamus, among other regions.

Gamers also had reduced white matter in the left and right cingulum, a structure that helps regulate emotions and pain that is also involved in predicting and avoiding negative consequences.

Children of today are at higher risk of becoming internet and video game obsessed as they are being raised in the digital age where screen entertainment is pervasive. Over 90 percent of American children play video games.

The COVID-19 pandemic also saw increases in both video game use and internet addiction, as people were forced to stay at home, to work and attend school online, and turned to gaming and the internet to keep themselves occupied.

From adolescence to adulthood, the brain’s socio-emotional regions that manage feelings and emotions mature at a faster rate than cognitive control, meaning that mental processes such as attention, decision making, and learning are affected by how exciting or social the situation is.

Studies now indicate that a person’s brain may not reach full maturation well into their 20s, with some researchers suggesting maturation does not occur until 30 years of age.

This means that, prior to full brain maturation, individuals are more susceptible to obsessive gaming.

Generally, the younger a child is when exposed to screen media, the more susceptible they are to the negative aspects of both the internet and video games. Additionally, screen time usually increases as a child gets older.

At the individual level, depending on how sensitive one is to the reward mechanism and the addictive nature of games and the internet, one’s dependence on these activities will vary.

Both internet use and video game obsession draw in individuals who have harm-avoidant, anxious, and detached personalities (pdf) while simultaneously influencing users to become both antisocial and withdrawn.

How to Reduce the Risks?
Screen media permeates every aspect of our lives.

Violence and aggression, in the form of sarcasm, expletives, action, and crime, among many others, are also highly prevalent in all forms of screen media.

Compared to the time when television was the only form of screen media, it has become increasingly difficult for teenagers and young adults to exert self control over their screen media consumption and for parents to control their children’s screen time and media content.

A cohesive family with strong parental involvement and social support is associated with reduced obsession, while mental health disorders and poor academic performance increase the risk.

Apart from restricting screen time and removing screen media from children’s bedrooms, Gentile encourages parents to engage in screen entertainment together with their children.

“There are four types of parental monitoring identified by the research, first is co-viewing (parents sit with their children and can comment on the media if they like)…the second is setting limits on amount…the third is setting limits on content…and then the fourth is…active mediation.”

Co-viewing is the most common and easiest option, and “is the thing that most parents sometimes do, at least,” said Gentile.

Co-viewing, however, is “actually the bad one,” as it increases the negative exposure from media that contains violence.

Instead he encourages active mediation by getting parents to ask questions such as “in real life would it really work like that?…What would be the best way to handle a situation like this?”

Engaging with children to think critically about what they are seeing has been shown to seemingly “mitigate almost all of the negatives of the media,” including the violence.

Since the majority of parents only engage in passive co-viewing, this “enhances the negatives, because then you’re giving tacit approval to whatever carnage is being seen on the screen.”

To break screen entertainment habits, parents can eliminate the cues that prompt them.

Participants take part in an event where they play the games of Netflix smash hit “Squid Game” at the Korean Cultural Centre in Abu Dhabi, on Oct. 12, 2021. (Giuseppe Cacace/AFP via Getty Images)
If a child’s cue is to view screen media the moment they come home, then implementing other activities at this time or in the same environment can help to break this habit.

However, it should be noted that how well an individual responds to parental engagement, and how much of it they need to maintain self-control over screen media usage, will vary from person to person. Flexible adjustment and guidance are the keys to compliance.

The Benefits of Controlling Media Content
Gentile led another study that measured the effect of parental control on the media usage of their children.

“We asked both the kids and the parents how much their parents set what they could watch, the content limitation or when they could watch or how long they could watch what.”

At the end of the school year, Gentile and his team found that “the parents that set more limits on the amount of content…[their] kids were getting better sleep by the end of the school year, which in turn related to lower weight gain, so less risk of obesity. Those kids were getting better grades in school, were more pro-social in their behaviors as rated by teachers.”

He was fascinated by the results, especially since the three outcomes were otherwise unrelated variables.

“Physical health, school performance, and social wellness; those three very different types of outcomes don’t usually co-occur. But there’s one simple thing of setting limits on the amount of content, influenced all of them.”

“It is a protective factor; a ripple that extends out wide across time.”

While parents may have faced daily fights over the rules at home without seeing the fruits of their labor, the real outcomes of their endeavors were improvements in the health, academic performance, and behavior of their children.

“You can’t know that your child is less aggressive than how he would have been; you only know what your kid is. You don’t know if your child is getting better grades than [he or] she would have…more pro-social….parents…they can’t actually see the benefits.”

“So this study shows, and others [show]…that parents are in a much more powerful position than they realize.”

Views expressed in this article are the opinions of the author and do not necessarily reflect the views of The Epoch Times. Epoch Health welcomes professional discussion and friendly debate. To submit an opinion piece, please follow these guidelines and submit through our form here.
Title: evolution may not be so random afterall
Post by: ccp on January 01, 2023, 10:24:17 AM
https://www.inverse.com/science/crabs-keep-evolving-but-why
Title: Re: Evolutionary biology/psychology
Post by: Crafty_Dog on January 02, 2023, 06:27:35 AM
Thanks for this  8-)
Title: Re: Evolutionary biology/psychology
Post by: Crafty_Dog on February 01, 2023, 08:01:28 AM
https://www.scientificamerican.com/article/tech-company-invests-150m-to-bring-back-the-dodo/?fbclid=IwAR0I1RHcXwWnexWCLmEkGRJ-z9vtyEo8P1ojow-2B76mtKtBZG3LE3mBTrw

A 'De-Extinction' Company Wants to Bring Back the Dodo
The de-extinction company known for its plans to resurrect the mammoth and Tasmanian tiger announces it will also bring back the dodo

By Christine Kenneally on January 31, 2023
A 'De-Extinction' Company Wants to Bring Back the Dodo

Colossal Biosciences, the headline-grabbing, venture-capital-funded juggernaut of de-extinction science, announced plans on January 31 to bring back the dodo. Whether “bringing back” a semblance of the extinct flightless bird is feasible is a matter of debate.

Founded in 2021 by tech entrepreneur Ben Lamm and Harvard University geneticist George Church, the company first said it would re-create the mammoth. And a year later it announced such an effort for the thylacine, aka the Tasmanian tiger. Now, with the launch of a new Avian Genomics Group and a reported $150 million of additional investment, the long-gone dodo joins the lineup.

In the world of extinct animals, the dodo carries some heavy symbolic weight. Native to Mauritius in the Indian Ocean, it went extinct in the mid- to late 17th century, after humans arrived on the island. The ungainly bird, which stood around one meter tall and weighed about 15 to 20 kilograms, represents a particular kind of evolutionary misfortune: It should have been afraid of humans, but it wasn’t. The birds blithely walked up to sailors, or so the received history goes, and didn’t flinch as their peers were killed around them. The dodoes, which reproduced by laying a single egg on the ground, were also preyed upon by other species, such as monkeys and rats, which humans brought with them. Now the creature represents extinction itself—you can’t get deader than a dodo.

“This announcement is really just the start of this project,” says Beth Shapiro, lead paleogeneticist and a scientific advisory board member at Colossal Biosciences. Shapiro, also a professor of ecology and evolutionary biology at the University of California, Santa Cruz, has studied the dodo since the science of paleogenetics was in its infancy. In 2002 she published research in Science describing how her team had extracted a tiny piece of the bird’s mitochondrial DNA (mtDNA)—the DNA inside little organelles called mitochondria that gets passed down from mother to offspring. That snippet of mtDNA showed the dodo’s closest living relative was the Nicobar pigeon. Then, in 2022, Shapiro announced that her team at U.C. Santa Cruz had reconstructed the dodo’s entire genome.

Though the journey from mtDNA to genome took decades, the path from genome to a living, breathing animal is even more formidable, involving an enormous, interacting set of extraordinarily complex problems. Technically, a species could be resurrected by cloning DNA from a remnant cell. In reality, this has been impossible to achieve, mostly because viable DNA cannot be found. Most de-extinction programs aim to re-create a proxy of an extinct animal by genetic engineering, editing the genome of a closely related living species to replicate the target species’ genome. The edited genome would then be implanted into an egg cell of that related species to develop. The process must ensure that development proceeds correctly, that the animal is born successfully, that suitable surrogate parents nurture the creature, that it is administered a nutritious diet and that it is raised in an appropriate environment.

Colossal Biosciences is trying to solve all these problems at once. “Even though we’re nowhere near ready to start implanting embryos into surrogates,” Lamm says, the company currently has a team working on the cloning methodology necessary for that process. It also has multiple teams working in parallel on problems of computational biology, cellular engineering, stem cell reprogramming, embryology, protein engineering and animal husbandry, among other focuses.

One of the biggest challenges in the reconstruction of the dodo is a problem for all avian genomics. With mammals, the process is like that used in the creation of Dolly the sheep, the world’s first animal to be cloned successfully from an adult cell of an adult mammal. But, Shapiro says, “we can’t clone birds.” Cloning requires access to an egg cell that is ready for fertilization but not yet fertilized. “There is no access to a bird egg cell at the same developmental time as there is for a mammal,” she explains. Colossal Biosciences is exploring a process to extract avian primordial germ cells (PGCs) from bird eggs. If the process works, PGCs from pigeons would be manipulated to eventually develop into a dodolike bird. Ultimately, Shapiro says, “the final version of dodo will emerge from a pigeon that has been engineered to be the size of a dodo. So the size of eggs will be consistent.”

Although the first stage of genome editing is harder with birds, the next stage should be easier. With mammals, scientists don’t yet know how the modified embryo of an extinct species will interact with the intrauterine environment of the host species. That stage will be simpler in birds, Shapiro explains, “because everything happens in an egg.”

Once a re-created animal is born, more questions arise. Most animals have a mix of instinctive behavior, which arises from their genetic programming, and social behavior, which is learned from their parents and, in the case of social animals, their pack or group. But there is no way to re-create the unique natural history that shaped the social behavior of the dodo or other extinct animals—or even, in many cases, to know what it was. Mikkel Sinding, a postdoctoral researcher in paleogenomics at the University of Copenhagen, says, “There is nobody around to teach the dodo how to be a dodo.” In this sense, the word de-extinction is a misnomer. It’s not possible to bring back the dodo, even if it becomes possible to build a bird with a dodo genome.

Beyond behavior, the dodo proxy must survive in a world that is significantly different from that of more than 300 years ago, when the dodo went extinct. Yet not much is known about how dodoes functioned in their ecosystem. The birds lived only in forests on Mauritius. They had no large predators. They were slow to reproduce, laying one egg per year. And it’s believed from ancient sailors’ reports that there were once thousands of them. Another challenge for de-extinction is ensuring the well-being of the genetically engineered dodoes.

“A goal here is to create an animal that can be physically and psychologically well in the environment in which it lives,” Shapiro says. “If we are going to bring back something that's functionally equivalent to a dodo, then we will have to find, identify or create habitats in which they’re able to survive.” Shapiro points to environmental restoration on Mauritius and surrounding islands. There is hope that work focused on dodo habitat restoration could have knock-on benefits for other endemic plants and animals and even that the reintroduced bird may directly contribute to restoring its own ecosystem. Giant tortoises introduced to an island near Mauritius to replace an extinct species have helped revive native ebony trees by eating their fruit and distributing their seeds around the landscape.

Sinding, who has extracted ancient DNA from Pleistocene wolves, woolly rhinoceroses and aurochs, was surprised and excited to hear that Colossal Biosciences planned to re-create the dodo. He thinks the company is more likely to find success sooner with the bird than the mammoth or thylacine. He adds that this will depend on one’s definition of success, however. “You can genome edit the hell out of something and say you have remade a species,” Sinding says. “But is it really the species?”

“The dodo is a good choice because the fetus development happens in a short time span inside an egg and not in a surrogate mother, unlike a mammoth, which would have to be gestated by an elephant for nearly two years,” Sinding says. “It would be slightly easier to work with a chick than with a thylacine cub.” The ethical question with the dodo, he adds, is “whether the money is well spent or if we should spend that money trying to preserve some other living pigeons that are almost extinct.”

Tom Gilbert, director of the Danish National Research Foundation Center for Evolutionary Hologenomics, recently joined Colossal Biosciences’ scientific advisory board. In 2022, before he was on the board, Gilbert told Technology Networks that he loved “the idea and technology behind rewilding with extinct species.” But he wondered about the influence of human morality on the choice of species. As the article put it, “Why stop at the good things?” Gilbert added, “What about the bad things? The pathogens now eradicated?”

The de-extinction of the dodo is “not a solution to the extinction crisis,” Shapiro says. “Extinction is forever.” But by pursuing the problem of dodo de-extinction, she explains, Colossal Biosciences is also developing critically needed tools for avian genomics, including for the genetic rescue of currently threatened species, such as editing genetic diversity back into a shrunken, threatened bird population. In this way, a 21st-century dodo may assist all avian conservation.

The dodo is only one of many lost birds: 161 avian species have been classified as extinct since 1500, according to a 2022 report from Bird Life International. But Colossal Biosciences is relying on the creature’s significance to inspire scientists and the general public to engage with all the problems of extinction. “We could have picked lots of different birds,” says Shapiro, raising her right arm to reveal a dodo tattoo. “I happen to really love the dodo.”
Title: Antagonistic Narcissism and Psychopathic tendencies predict Leftie aggression
Post by: Crafty_Dog on May 27, 2023, 08:23:03 AM
https://www.psypost.org/2023/05/antagonistic-narcissism-and-psychopathic-tendencies-predict-left-wing-antihierarchical-aggression-study-finds-163497?fbclid=IwAR2w8ze0vrh5CV-wWm-5YXb_HMIvvhiQ4oVZQ2onNQ_8U4suHb6C1tuPoT4
Title: Technology vs Family Formation
Post by: Crafty_Dog on May 27, 2023, 10:29:15 AM
second

https://americanmind.org/salvo/the-role-of-technology-in-family-formation/?utm_campaign=American%20Mind%20Email%20Warm%20Up&utm_medium=email&_hsmi=259836942&_hsenc=p2ANqtz-9fPx_Z6qRtjzxDoIyXDB_PJ7dBpMpfmps77dWABbx-aglhPDB1-7xy1D0ck01YW14R7rVndE8RpNT7Skq61K1RwYEgNA&utm_content=259836942&utm_source=hs_email
Title: Re: Technology vs Family Formation
Post by: G M on May 27, 2023, 10:45:11 AM
second

https://americanmind.org/salvo/the-role-of-technology-in-family-formation/?utm_campaign=American%20Mind%20Email%20Warm%20Up&utm_medium=email&_hsmi=259836942&_hsenc=p2ANqtz-9fPx_Z6qRtjzxDoIyXDB_PJ7dBpMpfmps77dWABbx-aglhPDB1-7xy1D0ck01YW14R7rVndE8RpNT7Skq61K1RwYEgNA&utm_content=259836942&utm_source=hs_email

With religion on the decline, so are teachings that set boundaries for human behavior, and technologies can capture those who lack moral firmness. 

Title: Re: Technology vs Family Formation
Post by: G M on May 27, 2023, 10:50:22 AM

https://www.youtube.com/watch?v=wJ6knaienVE
Title: Re: Evolutionary biology/psychology
Post by: ccp on May 27, 2023, 11:01:41 AM
narcissism and psychopathology

interesting
 a bit of psycho babble
 and the usual end study conclusion - more research is needed

but this part caught my highlighting left wing "researcher" bias :

“By[but] many researchers, the notion of left-wing authoritarianism (LWA) is even been met with skepticism. "

my response : have they ever heard of Mao Marx Stalin ?

think only Hitler KKK etc   :wink:

I read psychologist Hare's book on psychopaths (in the late 90s or very early 2000s, when we were being robbed over and over again by practically everyone in sight) and still like the theories
though I read there is some debate or disagreement about them:

https://en.wikipedia.org/wiki/Psychopathy_Checklist

The thrill of money, I found, can certainly turn people to act like cold, heartless psychopaths.
Title: Genetic underpinnings of bisexuality
Post by: Crafty_Dog on January 04, 2024, 02:22:51 PM
https://www.msn.com/en-us/health/medical/scientists-discover-genetic-underpinnings-of-bisexuality/ar-AA1mqbt6?rc=1&ocid=winp1taskbar&cvid=c177263553b94dacd6c07e5ea41348e7&ei=27
Title: genetic underpinnings of bisexual behavior
Post by: ccp on January 05, 2024, 09:01:24 AM
Thus a CRISPR "cure" for this in the future?

Title: Lorenz's "Collective Militant Enthusiasm"
Post by: Crafty_Dog on January 22, 2024, 11:36:17 AM
https://avoiceformen.com/featured/the-militant-enthusiasm-syndrome/
Title: Evolutionary Mechanisms Causing Species to Grow or Shrink in Size
Post by: Body-by-Guinness on January 24, 2024, 10:45:11 PM
Piece examines how/why evolutionary pressures cause species to grow in size, or do the reverse:

https://theness.com/neurologicablog/why-do-species-evolve-to-get-bigger-or-smaller/
Title: Re: Evolutionary biology/psychology
Post by: ccp on January 25, 2024, 06:12:31 AM
an example of island gigantism:

https://en.wikipedia.org/wiki/Kodiak_bear
Title: No Need for Porno Panic?
Post by: Body-by-Guinness on January 25, 2024, 09:19:32 AM
Piece states claims of sundry psych impacts of porno are unfounded alarmism.

Can't escape the notion this is the wrong place to post this, but searched for "psych" and this was the least unlikely place for this post to live. But hey, we could always start a pornography thread....  :evil:

https://thehill.com/opinion/technology/4424581-is-pornography-really-warping-our-brains-or-is-it-a-moral-panic/
Title: Evolutionary Benefit of Human Dreaming?
Post by: Body-by-Guinness on January 25, 2024, 02:05:02 PM
2nd post:

The difference between hunter/gatherer and Western dreams is pretty interesting, though not a lot of conclusions to be found here:

https://singularityhub.com/2024/01/24/dreams-may-have-played-a-crucial-role-in-our-evolutionary-success-as-a-species/

FWIW, my wife says I can “guide” my dreams, and states she is envious of the ability. Not sure if I indeed can, but confess that whenever I have a family in danger dream I manage to grab my dream state Scattergun Technologies enhanced Remington model 870 in 12 gauge loaded with Federal Law Enforcement 8 ball 00 and dispatch the imaginary threat with extreme prejudice….
Title: Re: Evolutionary biology/psychology
Post by: Crafty_Dog on January 25, 2024, 08:08:41 PM
Yes this was the correct thread for the porn piece :-D
Title: Evolutionary Biology on Surplus Chinese Males
Post by: Crafty_Dog on February 04, 2024, 01:45:14 PM
https://www.instagram.com/reel/C26CkIVJcuE/?igsh=Zm01bjNsc2NvdHd1
Title: Two life forms merge for first time in one billion years
Post by: Crafty_Dog on April 22, 2024, 03:47:21 PM
https://www.msn.com/en-us/news/technology/two-lifeforms-merge-into-one-organism-for-first-time-in-a-billion-years/ar-AA1nstjc?ocid=msedgntp&pc=DCTS&cvid=0721d1b22fd944b8ae1393aa64d530ce&ei=7
Title: Re: Evolutionary biology/psychology
Post by: ccp on April 23, 2024, 06:46:41 AM
The algae then incorporates the bacterium as an internal organ called an organelle, which becomes vital to the host’s ability to function.

 :-o  Wow  8-)
Title: Anti-Lorenz
Post by: Crafty_Dog on April 26, 2024, 01:48:00 PM
Even though this is but a teaser in search of money to see the whole thing, there are several interesting footnotes:

https://journals.sagepub.com/doi/10.1177/002234337601300401
Title: Heroic Doubling
Post by: Crafty_Dog on April 29, 2024, 08:31:22 AM
https://amgreatness.com/2024/04/27/heroic-doubling-and-supporting-hamas/


This follows quite closely with a recent discussion on my FB page of Konrad Lorenz's concept of Collective Militant Enthusiasm.

CME can be, and often is in the current moment, aroused to terrible and unreasoning purpose.   

It also serves for good and necessary reasons too-- a point I think this very good piece misses with regard to its passing reference to Kyle Rittenhouse.

Regardless, on the whole there is some nuanced intelligent discussion here.
========================