Author Topic: Evolutionary biology/psychology  (Read 139520 times)

tim nelson

  • Newbie
  • *
  • Posts: 23
    • View Profile
Re: Evolutionary biology/psychology
« Reply #100 on: April 15, 2011, 03:18:40 PM »
I highly recommend books by Paul Shepard: Nature and Madness, The Tender Carnivore, and Coming Home to the Pleistocene. He was way ahead of his time, in my opinion.

He is a proponent of the hunter-gatherer lifestyle being the healthiest. He spends a lot of time comparing our development with that of other primates socially, physically, nutritionally, etc., and with other large predators, both social and less social. I liked his idea of how we developed to be quite a mix: digestive systems like those of true omnivores such as raccoons, bears, and boars, but a social structure and culture that was more wolf-like. Wherever and whenever modern humans lived, males especially spent considerable time hunting large game, so the male psyche evolved around those complex processes: problem-solving a moving problem, coordinating together, and those particular types of exercise. And without large game to hunt, we crave and need comparable experiences, such as hunting humans in war, fighting, etc.

Anyway, I liked most of his stuff. Lots of convincing evidence, and I liked his biased stance on hunter-gatherer cultures which helps.

Body-by-Guinness

  • Guest
Welcome to the Family Tree
« Reply #101 on: April 22, 2011, 11:19:32 AM »
A New Hominin – A. sediba
Published by Steven Novella under Evolution
Following the branching bush of human evolution is getting increasingly difficult. When I studied human evolution in college, things were much simpler. There were a few Australopithecus species followed by a few Homo species, leading to modern humans. It was recognized at the time that these fossil species probably did not represent a nice clean straight line to Homo sapiens, but it seems the family tree has become much bushier than was imagined at the time.
Here is a recent representation of the hominin family tree. We have added more species of Australopithecus and Homo, plus the new genera Kenyanthropus and Paranthropus (not even including older genera that predate Australopithecus).
Now researchers have announced the discovery of yet another species of early hominin, about 2 million years old – likely a late species of Australopithecus named A. sediba. They discovered four individuals – two adults, a child and an infant – who likely fell into a “death trap” in a cave in what is now Malapa, South Africa.
Each bit of fossil evidence is like a piece to a complex puzzle. As more pieces fit into place, however, the picture becomes more complex and more questions are generated. We are still at the stage where new evidence generates more questions than answers, and we have no idea how complex the final picture that emerges will be.
The new discovery is no exception. A. sediba has a mixture of modern (Homo) and primitive (Australopithecine) traits. It has a small brain like a primitive Australopithecus, but has pelvic structure and hand features that are more modern than other members of the genus.
It should also be noted that the first members of the Homo genus arose about a million years before the age of these specific specimens – so these individuals do not represent a population that is ancestral to our genus.
As always, there are multiple ways to interpret this data. It is possible that A. sediba is the ancestral Australopithecine species that led to Homo – either directly, or closely related to that species (yet to be discovered). In this case, these individuals would be later representatives of that species. Species often persist, even for millions of years, after other species branch off from them. So it is always possible to find representatives of an ancestral species that are more recent than species that evolved from them.
It is also possible that A. sediba is a separate line of Australopithecines that did not lead to Homo, but developed some similar features. In this case the “modern” features in A. sediba would be analogous to (similar to, but not ancestral to) the modern Homo features, rather than homologous to them (related through evolutionary derivation).
Another possibility that was not mentioned in the Science article that I linked to is that these individuals, and possibly A. sediba as a species, or perhaps just one breeding population, represent the results of interbreeding between Homo and Australopithecus species. In this case modern features would literally have mixed with the more primitive features in A. sediba.
This adds a new layer of complexity to our picture of the human family tree (or any family tree). When species divide, the separation is not clean, and later remixing of genes is not only possible but probable. There is genetic evidence, for example, of later mixing of genes between human ancestors and chimpanzee ancestors after the split. So it’s not a stretch to think that hominin populations were at least occasionally interbreeding.
I suspect there are many more hominin species and subspecies to be discovered. The picture that is emerging is fascinating, if it is becoming increasingly difficult to keep track of it all. I’ll just have to muddle through.

http://theness.com/neurologicablog/?p=3139

Body-by-Guinness

  • Guest
Belly Bacteria & the Brain
« Reply #102 on: April 22, 2011, 12:20:57 PM »
Second post.

The Neuroscience of the Gut
Strange but true: the brain is shaped by bacteria in the digestive tract
By Robert Martone | Tuesday, April 19, 2011

Researchers track the gut-brain connection
People may advise you to listen to your gut instincts: now research suggests that your gut may have more impact on your thoughts than you ever realized. Scientists from the Karolinska Institute in Sweden and the Genome Institute of Singapore led by Sven Pettersson recently reported in the Proceedings of the National Academy of Sciences that normal gut flora, the bacteria that inhabit our intestines, have a significant impact on brain development and subsequent adult behavior.

We human beings may think of ourselves as a highly evolved species of conscious individuals, but we are all far less human than most of us appreciate. Scientists have long recognized that the bacterial cells inhabiting our skin and gut outnumber human cells by ten-to-one. Indeed, Princeton University scientist Bonnie Bassler compared the approximately 30,000 human genes found in the average human to the more than 3 million bacterial genes inhabiting us, concluding that we are at most one percent human. We are only beginning to understand the sort of impact our bacterial passengers have on our daily lives.
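The “one percent human” figure follows directly from the two gene counts quoted above. A quick back-of-the-envelope check (using those approximate, rounded figures, not precise measurements):

```python
# Rough check of Bassler's "one percent human" claim, using the
# approximate gene counts quoted in the article above.
human_genes = 30_000
bacterial_genes = 3_000_000  # "more than 3 million" genes across our microbiota

human_fraction = human_genes / (human_genes + bacterial_genes)
print(f"Human share of genes: {human_fraction:.1%}")  # → about 1.0%
```

Since the bacterial count is given as a lower bound ("more than 3 million"), one percent is indeed "at most" the human share by this metric.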

Moreover, these bacteria have been implicated in the development of neurological and behavioral disorders. For example, gut bacteria may have an influence on the body’s use of vitamin B6, which in turn has profound effects on the health of nerve and muscle cells. They modulate immune tolerance and, because of this, they may have an influence on autoimmune diseases, such as multiple sclerosis. They have been shown to influence anxiety-related behavior, although there is controversy regarding whether gut bacteria exacerbate or ameliorate stress-related anxiety responses. In autism and other pervasive developmental disorders, there are reports that the specific bacterial species present in the gut are altered and that gastrointestinal problems exacerbate behavioral symptoms. A newly developed biochemical test for autism is based, in part, upon the end products of bacterial metabolism.

But this new study is the first to extensively evaluate the influence of gut bacteria on the biochemistry and development of the brain. The scientists raised mice lacking normal gut microflora, then compared their behavior, brain chemistry and brain development to mice having normal gut bacteria. The microbe-free animals were more active and, in specific behavioral tests, were less anxious than microbe-colonized mice. In one test of anxiety, animals were given the choice of staying in the relative safety of a dark box, or of venturing into a lighted box. Bacteria-free animals spent significantly more time in the light box than their bacterially colonized littermates. Similarly, in another test of anxiety, animals were given the choice of venturing out on an elevated and unprotected bar to explore their environment, or remaining in the relative safety of a similar bar protected by enclosing walls. Once again, the microbe-free animals proved themselves bolder than their colonized kin.

Pettersson’s team next asked whether the influence of gut microbes on the brain was reversible and, since the gut is colonized by microbes soon after birth, whether there was evidence that gut microbes influenced the development of the brain. They found that colonizing an adult germ-free animal with normal gut bacteria had no effect on its behavior. However, if germ-free animals were colonized early in life, the behavioral differences could be reversed. This suggests that there is a critical period in the development of the brain when the bacteria are influential.

Consistent with these behavioral findings, two genes implicated in anxiety -- nerve growth factor-inducible clone A (NGF1-A) and brain-derived neurotrophic factor (BDNF) -- were found to be down-regulated in multiple brain regions in the germ-free animals. These changes in behavior were also accompanied by changes in the levels of several neurotransmitters, chemicals which are responsible for signal transmission between nerve cells. The neurotransmitters dopamine, serotonin and noradrenaline were elevated in a specific region of the brain, the striatum, which is associated with the planning and coordination of movement and which is activated by novel stimuli, while there were no such effects on neurotransmitters in other brain regions, such as those involved in memory (the hippocampus) or executive function (the frontal cortex).

When Pettersson’s team performed a comprehensive gene expression analysis of five different brain regions, they found nearly 40 genes that were affected by the presence of gut bacteria. Not only were these primitive microbes able to influence signaling between nerve cells while sequestered far away in the gut, they had the astonishing ability to influence whether brain cells turn on or off specific genes.   

How, then, do these single-celled intestinal denizens exert their influence on a complex multicellular organ such as the brain? Although the answer is unclear, there are several possibilities: the vagus nerve, for example, connects the gut to the brain, and it’s known that infection with Salmonella bacteria stimulates the expression of certain genes in the brain, an effect that is blocked when the vagus nerve is severed. This nerve may be stimulated as well by normal gut microbes, and serve as the link between them and the brain. Alternatively, those microbes may modulate the release of chemical signals by the gut into the bloodstream which ultimately reach the brain. These gut microbes, for example, are known to modulate stress hormones which may in turn influence the expression of genes in the brain.

Regardless of how these intestinal “guests” exert their influence, these studies suggest that brain-directed behaviors, which influence the manner in which animals interact with the external world, may be deeply influenced by that animal’s relationship with the microbial organisms living in its gut. And the discovery that gut bacteria exert their influence on the brain within a discrete developmental stage may have important implications for developmental brain disorders.

http://www.scientificamerican.com/article.cfm?id=the-neuroscience-of-gut

Body-by-Guinness

  • Guest
Language is Innate?
« Reply #103 on: April 26, 2011, 08:11:03 PM »
Baby Language
Published by Steven Novella under Neuroscience
Recent studies demonstrate that babies 12-18 months old have similar activity in their brains in response to spoken words as do adults, a fact that tells us a lot about the development of language function.
In the typical adult brain language function is primarily carried out in highly specialized parts of the brain – Wernicke’s area (in the dominant, usually left, superior temporal lobe) processes words into concepts and concepts into words, while Broca’s area (in the dominant posterior-inferior frontal lobe) controls speech output. The two areas are connected by the arcuate fasciculus and are fed by both auditory and visual input. Taken as a whole this part of the brain functions as the language cortex. A stroke or other damage to this area in an adult results in loss of one or more aspects of speech, depending on the extent of damage.
Damage to this part of the brain in babies, however, does not have the same effect. When such children grow up they are able to develop essentially normal language function. There are two prevailing theories to explain this. The first is that language function is more widely distributed in infants than in adults, perhaps also involving the same structures on the non-dominant side of the brain. As the brain matures language function becomes confined to the primary language cortex.
The second theory is that brain plasticity allows non-damaged parts of the brain to take over function for the language cortex. Such plasticity exists even in adult brains, but is vastly more significant in babies, whose brains are still developing and wiring themselves. There is still a lot of raw brain material that has not fully specialized yet that can be coopted for whatever functions are needed.
The new research has implications for this debate. If the former theory is correct, then babies who are just learning language would activate their brains more broadly than adults in response to language. If babies show a similar pattern of activation, that would support the plasticity theory.
This latest research firmly supports plasticity as the answer. Researchers at the University of California used functional MRI scans and magnetoencephalography (MEG) to look at the brain activity of 12-18 month old children in response to spoken words. They found that their primary language cortex lit up in a similar pattern to adults. They further tested to see if the children had any sense of the meaning of the words. They showed pictures of common objects with either a correct or incorrect spoken word. The children showed increased language area activity when the words were incongruous to the picture – and the researchers showed this is the same increase in activity as seen in adults.
What this research implies is that the genetic program for brain design comes into effect very early in brain development. The language cortex is destined to be language cortex right from the beginning, as long as nothing goes wrong with this process.
It should also be noted that this study looked only at the response to individual words. What it says about the 12-18 month old stage of development is that children of this age are already programming their language areas and storing up words and their meanings. This research did not look at other aspects of language, such as grammar – the ability to string multiple words together in a specific way in order to create meaning. It also did not look at the visual processing of written words.
Any parent of young children will likely remember with great detail the functional language development of their own children. At this age, and even younger than 12 months, children do seem to be sponges for language. Once they start learning words, they do so very quickly. Young children also seem to understand far more words than they can say. I don’t think this is mere confirmation bias (although that would tend to exaggerate the appearance of word acquisition), and research bears out that children can understand many more words than they can say. The ability to speak comes a bit later than the ability to assign meaning to specific words.
I remember that I played games with my children when they were about one year old, and still in the babbling stage. They could reliably, for example, retrieve specific toys by name (we were careful to avoid the Clever Hans effect). I remember, in fact, being very surprised at how well they performed – they seemed to understand many more words than I would have thought given the rudimentary nature of their babbling. In this case, careful research confirms subjective experience – children learn to understand words spoken to them before they gain the ability to say them.
This makes sense from the point of view that it is very neurologically difficult to articulate. We take it for granted, but it does require dedicated cortex to pull off this feat. Also, think about how easy it is to become dysarthric – we start slurring our words even when we are just a little sleep deprived, or with a moderate amount of alcohol. It does seem to be a function that goes early when brain function is even slightly compromised, which says something about how demanding it is.
One more tangential point – it also strikes me that we tend to naively judge what is going on in people’s heads by what is coming out of their mouths. This is not unreasonable in most circumstances, but there are many reasons why people may be more mentally sharp than is evidenced by their articulation. Young children are just one example – they may be babbling with their mouths, but there is more linguistically going on in their brains.

http://theness.com/neurologicablog/?p=2711

Crafty_Dog

  • Administrator
  • Power User
  • *****
  • Posts: 72293
    • View Profile
The Dishonest Minority
« Reply #104 on: May 17, 2011, 11:08:55 AM »

      Status Report: "The Dishonest Minority"



Three months ago, I announced that I was writing a book on why security
exists in human societies.  This is basically the book's thesis statement:

     All complex systems contain parasites.  In any system of
     cooperative behavior, an uncooperative strategy will be effective
     -- and the system will tolerate the uncooperatives -- as long as
     they're not too numerous or too effective. Thus, as a species
     evolves cooperative behavior, it also evolves a dishonest minority
     that takes advantage of the honest majority.  If individuals
     within a species have the ability to switch strategies, the
     dishonest minority will never be reduced to zero.  As a result,
     the species simultaneously evolves two things: 1) security systems
     to protect itself from this dishonest minority, and 2) deception
     systems to successfully be parasitic.

     Humans evolved along this path.  The basic mechanism can be
     modeled simply.  It is in our collective group interest for
     everyone to cooperate. It is in any given individual's short-term
     self-interest not to cooperate: to defect, in game theory terms.
     But if everyone defects, society falls apart.  To ensure
     widespread cooperation and minimal defection, we collectively
     implement a variety of societal security systems.

     Two of these systems evolved in prehistory: morals and reputation.
     Two others evolved as our social groups became larger and more
     formal: laws and technical security systems.  What these security
     systems do, effectively, is give individuals incentives to act in
     the group interest.  But none of these systems, with the possible
     exception of some fanciful science-fiction technologies, can ever
     bring that dishonest minority down to zero.

     In complex modern societies, many complications intrude on this
     simple model of societal security. Decisions to cooperate or
     defect are often made by groups of people -- governments,
     corporations, and so on -- and there are important differences
     because of dynamics inside and outside the groups. Much of our
     societal security is delegated -- to the police, for example --
     and becomes institutionalized; the dynamics of this are also
     important.

     Power struggles over who controls the mechanisms of societal
     security are inherent: "group interest" rapidly devolves to "the
     king's interest."  Societal security can become a tool for those
     in power to remain in power, with the definition of "honest
     majority" being simply the people who follow the rules.

     The term "dishonest minority" is not a moral judgment; it simply
     describes the minority who does not follow societal norms.  Since
     many societal norms are in fact immoral, sometimes the dishonest
     minority serves as a catalyst for social change.  Societies
     without a reservoir of people who don't follow the rules lack an
     important mechanism for societal evolution.  Vibrant societies
     need a dishonest minority; if society makes its dishonest minority
     too small, it stifles dissent as well as common crime.
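The cooperate/defect mechanism Schneier describes can be sketched as a toy payoff model, with a "security system" modeled as a probabilistic fine on defectors. Everything below (the benefit, cost, detection probability, and fine values) is an illustrative assumption of mine, not from the book:

```python
# Toy payoff model of the cooperate/defect dynamic described above.
# A "security system" is modeled as a probability of detecting a
# defector and imposing a fine; all numbers are illustrative.

def payoff(defect: bool, others_cooperating: float,
           benefit=10.0, cost=4.0,
           detection_prob=0.0, fine=0.0) -> float:
    """Expected payoff for one individual.

    Everyone receives a share of the public good proportional to how many
    others cooperate; cooperators additionally pay a personal cost, and
    defectors risk a fine from the security system.
    """
    shared = benefit * others_cooperating
    if defect:
        return shared - detection_prob * fine
    return shared - cost

# Without any security system, defecting strictly dominates...
assert payoff(True, 0.9) > payoff(False, 0.9)

# ...but a modest chance of a sufficiently large fine flips the incentive,
# which is what morals, reputation, laws, and technical security all do.
assert (payoff(True, 0.9, detection_prob=0.5, fine=12.0)
        < payoff(False, 0.9, detection_prob=0.5, fine=12.0))
```

Note that the fine only needs to make defection worse *in expectation*; it never drives the payoff from defection to zero, which is consistent with the thesis that the dishonest minority can be contained but never eliminated.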

At this point, I have most of a first draft: 75,000 words.  The
tentative title is still "The Dishonest Minority: Security and its Role
in Modern Society."  I have signed a contract with Wiley to deliver a
final manuscript in November for February 2012 publication.  Writing a
book is a process of exploration for me, and the final book will
certainly be a little different -- and maybe even very different -- from
what I wrote above.  But that's where I am today.

And it's why my other writings -- and the issues of Crypto-Gram --
continue to be sparse.

Lots of comments -- over 200 -- to the blog post.  Please comment there;
I want the feedback.
http://www.schneier.com/blog/archives/2011/02/societal_securi.html

Crafty_Dog

  • Administrator
  • Power User
  • *****
  • Posts: 72293
    • View Profile
Re: Evolutionary biology/psychology
« Reply #105 on: August 19, 2011, 11:19:39 AM »
TTT for the attention of Richard Lighty

prentice crawford

  • Guest
Re: Evolutionary biology/psychology
« Reply #106 on: September 08, 2011, 04:10:09 PM »
 

  Closest Human Ancestor May Rewrite Steps in Our Evolution
By Charles Q. Choi, LiveScience Contributor

  A startling mix of human and primitive traits found in the brains, hips, feet and hands of an extinct species identified last year make a strong case for it being the immediate ancestor to the human lineage, scientists have announced.

These new findings could rewrite long-standing theories about the precise steps human evolution took, they added, including the notion that early human female hips changed shape to accommodate larger-brained offspring. There is also new evidence suggesting that this species had the hands of a toolmaker.

Fossils of the extinct hominid known as Australopithecus sediba were accidentally discovered by the 9-year-old son of a scientist in the remains of a cave in South Africa in 2008, findings detailed by researchers last year. Australopithecus means "southern ape," and is a group that includes the iconic fossil Lucy, while sediba means "wellspring" in the South African language Sotho.

Two key specimens were discovered — a juvenile male as developed as a 10- to 13-year-old human and an adult female maybe in her late 20s or early 30s. The species is both a hominid and a hominin — hominids include humans, chimpanzees, gorillas and their extinct ancestors, while hominins include those species after Homo, the human lineage, split from that of chimpanzees.

To begin to see where Au. sediba might fit on the family tree, researchers pinned down the age of the fossils by dating the calcified sediments surrounding them with advanced uranium-lead dating techniques and a method called paleomagnetic dating, which measures how many times the Earth's magnetic field has reversed. They discovered the fossils were approximately 1.977 million years old, which predates the earliest appearances of traits specific to the human lineage Homo in the fossil record. This places Au. sediba in roughly the same age category as hominids such as Homo habilis and Homo rudolfensis, which were thought to be potential ancestors to Homo erectus, the earliest undisputed predecessor of modern humans.
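As a rough illustration of the radiometric half of that dating work, here is a minimal sketch of the standard uranium-lead age equation for the 238U → 206Pb decay chain. The half-life is the accepted 238U value, but the daughter/parent ratio below is a hypothetical illustration, not a measurement from the Malapa study:

```python
import math

# For the 238U -> 206Pb chain, the daughter/parent ratio grows as
#   Pb206/U238 = exp(lambda * t) - 1,
# so the age is recovered as t = ln(1 + ratio) / lambda.

HALF_LIFE_U238 = 4.468e9                     # years (standard 238U value)
LAMBDA_U238 = math.log(2) / HALF_LIFE_U238   # decay constant, per year

def u_pb_age(pb206_u238_ratio: float) -> float:
    """Age in years implied by a measured 206Pb/238U atomic ratio."""
    return math.log(1.0 + pb206_u238_ratio) / LAMBDA_U238

# A ~2-million-year-old sample has accumulated only a tiny amount of lead,
# which is why high-precision techniques are needed at these young ages.
ratio_at_2myr = math.exp(LAMBDA_U238 * 1.977e6) - 1   # ~3.1e-4
print(u_pb_age(ratio_at_2myr) / 1e6)                  # → 1.977 (million years)
```

The tiny ratio for a 2-million-year-old sample (a few parts in ten thousand) hints at why the team paired uranium-lead dates with paleomagnetic reversals to constrain the age.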

"As the fossil record for early human ancestors increases, the need for more accurate dates is becoming paramount," said researcher Robyn Pickering at the University of Melbourne in Australia.

Small but humanlike brain

Most aspects of Au. sediba display an intriguing mix of both human and more primitive features that hint it might be an intermediary form between Australopithecus and Homo.

"The fossils demonstrate a surprisingly advanced but small brain, a very evolved hand with a long thumb like a human's, a very modern pelvis, but a foot and ankle shape never seen in any hominin species that combines features of both apes and humans in one anatomical package," said researcher Lee Berger, a paleoanthropologist at the University of Witwatersrand in Johannesburg, South Africa. "The many very advanced features found in the brain and body and the earlier date make it possibly the best candidate ancestor for our genus, the genus Homo, more so than previous discoveries such as Homo habilis."

The brain is often thought of as what distinguishes humanity from the rest of the animal kingdom, and the juvenile specimen of Au. sediba had an exceptionally well-preserved skull that could shed light on the pace of brain evolution in early hominins. To find out more, the researchers scanned the space in the skull where its brain would have been using the European Synchrotron Radiation Facility in Grenoble, France; the result is the most accurate scan ever produced for an early human ancestor, with a level of detail of up to 90 microns, or just below the size of a human hair.

The scan revealed Au. sediba had a much smaller brain than seen in human species, with an adult version maybe only as large as a medium-size grapefruit. However, it was humanlike in several ways — for instance, its orbitofrontal region directly behind the eyes apparently expanded in ways that make it more like a human's frontal lobe in shape. This area is linked in humans with higher mental functions such as multitasking, an ability that may contribute to human capacities for long-term planning and innovative behavior.

"We could be seeing the beginnings of those capabilities," researcher Kristian Carlson at the University of Witwatersrand told LiveScience.

These new findings cast doubt on the long-standing theory that brains gradually increased in size and complexity from Australopithecus to Homo. Instead, their findings corroborate an alternative idea — that Australopithecus brains did increase in complexity gradually, becoming more like Homo, and later increased in size relatively quickly.

Modern hips

This mosaic of modern and primitive traits held true with its hips as well. An analysis of the partial pelvis of the female Au. sediba revealed that it had modern, humanlike features.

"It is surprising to discover such an advanced pelvis in such a small-brained creature," said researcher Job Kibii at the University of the Witwatersrand.  "It is short and broad like a human pelvis ... parts of the pelvis are indistinguishable from that of humans."

Scientists had thought the human-like pelvis evolved to accommodate larger-brained offspring. The new findings of humanlike hips in Au. sediba despite small-brained offspring suggest these pelvises may have instead initially evolved to help this hominin better wander across the landscape, perhaps as grasslands began to expand across its habitat.

When it came to walking, investigating the feet and ankles of the fossils revealed surprises about how Au. sediba might have strode across the world. No hominin ankle has ever been described with so many primitive and advanced features.

"If the bones had not been found stuck together, the team may have described them as belonging to different species," said researcher Bernhard Zipfel at the University of the Witwatersrand.

The researchers discovered that its ankle joint is mostly like a human's, with some evidence for a humanlike arch and a well-defined Achilles tendon, but its heel and shin bones appear to be mostly ape-like. This suggested the hominid probably climbed trees yet also walked in a unique way not exactly like that of humans.

Altogether, such anatomical traits would have allowed Au. sediba to walk in perhaps a more energy-efficient way, with tendons storing energy and returning that energy to the next step, said researcher Steve Churchill from Duke University in Durham, N.C. "These are the kinds of things that we see with the genus Homo," he explained.

What nice hands …

Finally, an analysis of Au. sediba's hands suggests it might have been a toolmaker. The fossils — including the most complete hand known in an early hominin, which is missing only a few bones and belonged to the mature female specimen — showed its hand was capable of the strong grasping needed for tree-climbing, but that it also had a long thumb and short fingers. These would have allowed it a precision grip useful for tools, one involving just the thumb and fingers, where the palm does not play an active part.

Altogether, the hand of Au. sediba has more features related to tool-making than that of the first human species thought of as a tool user, the "handy man" Homo habilis, said researcher Tracy Kivell at the Max Planck Institute for Evolutionary Anthropology in Germany. "This suggests to us that sediba may also have been a toolmaker."

Though the scientists haven't excavated the site in search of stone tools, "the hand and brain morphology suggest that Au. sediba may have had the capacity to manufacture and use complex tools," Kivell added.

The researchers do caution that although they suggest that Au. sediba was ancestral to the human lineage, all these apparent resemblances between it and us could just be coincidences, with this extinct species evolving similar traits to our lineages due, perhaps, to similar circumstances.

In fact, it might be just as interesting to imagine that Au. sediba was not directly ancestral to Homo, because it opens up the possibility "of independent evolution of the same sorts of features," Carlson said. "Whether or not it's on the same lineage as leading to Homo, I think there are interesting questions and implications."

The scientists detailed their findings in the Sept. 9 issue of the journal Science.

                                                P.C.

Crafty_Dog

  • Administrator
  • Power User
  • *****
  • Posts: 72293
    • View Profile
WSJ: Ridley: From Phoenicia to the Cloud
« Reply #107 on: September 24, 2011, 09:09:57 AM »
Matt Ridley is an author whom I follow. I have read his "The Red Queen" (on the evolutionary reasons sex exists and the implications thereof) and "Nature via Nurture" (also quite brilliant, and which has triggered a shift in how I think about these things).



By MATT RIDLEY
The crowd-sourced, wikinomic cloud is the new, new thing that all management consultants are now telling their clients to embrace. Yet the cloud is not a new thing at all. It has been the source of human invention all along. Human technological advancement depends not on individual intelligence but on collective idea sharing, and it has done so for tens of thousands of years. Human progress waxes and wanes according to how much people connect and exchange.

When the Mediterranean was socially networked by the trading ships of Phoenicians, Greeks, Arabs or Venetians, culture and prosperity advanced. When the network collapsed because of pirates at the end of the second millennium B.C., or in the Dark Ages, or in the 16th century under the Barbary and Ottoman corsairs, culture and prosperity stagnated. When Ming China, or Shogun Japan, or Nehru's India, or Albania or North Korea turned inward and cut themselves off from the world, the consequence was relative, even absolute decline.

Knowledge is dispersed and shared. Friedrich Hayek was the first to point out, in his famous 1945 essay "The Use of Knowledge in Society," that central planning cannot work because it is trying to substitute an individual all-knowing intelligence for a distributed and fragmented system of localized but connected knowledge.

So dispersed is knowledge that, as Leonard Read famously observed in his 1958 essay "I, Pencil," nobody on the planet knows how to make a pencil. The knowledge is dispersed among many thousands of graphite miners, lumberjacks, assembly line workers, ferrule designers, salesmen and so on. This is true of everything that I use in my everyday life, from my laptop to my shirt to my city. Nobody knows how to make it or to run it. Only the cloud knows.

One of the things I have tried to do in my book "The Rational Optimist" is to take this insight as far back into the past as I can—to try to understand when it first began to be true. When did human beings start to use collective rather than individual intelligence?

In doing so, I find that the entire field of anthropology and archaeology needs Hayek badly. Their debates about what made human beings successful, and what caused the explosive take-off of human culture in the past 100,000 years, simply never include the insight of dispersed knowledge. They are still looking for a miracle gene, or change in brain organization, that explains, like a deus ex machina, the human revolution. They are still looking inside human heads rather than between them.

"I think there was a biological change—a genetic mutation of some kind that promoted the fully modern ability to create and innovate," wrote the anthropologist Richard Klein in a 2003 speech to the American Association for the Advancement of Science. "The sudden expansion of the brain 200,000 years ago was a dramatic spontaneous mutation in the brain . . . a change in a single gene would have been enough," the neuroscientist Colin Blakemore told the Guardian in 2010.

There was no sudden change in brain size 200,000 years ago. We Africans—all human beings are descended chiefly from people who lived exclusively in Africa until about 65,000 years ago—had slightly smaller brains than Neanderthals, yet once outside Africa we rapidly displaced them (bar acquiring 2.5% of our genes from them along the way).

And the reason we won the war against the Neanderthals, if war it was, is staring us in the face, though it remains almost completely unrecognized among anthropologists: We exchanged. At one site in the Caucasus there are Neanderthal and modern remains within a few miles of each other, both from around 30,000 years ago. The Neanderthal tools are all made from local materials. The moderns' tools are made from chert and jasper, some of which originated many miles away. That means trade.

Evidence from recent Australian artifacts shows that long-distance movement of objects is a telltale sign of trade, not migration. We Africans have been doing this since at least 120,000 years ago. That's the date of beads made from marine shells found a hundred miles inland in Algeria. Trade is 10 times as old as agriculture.

At first it was a peculiarity of us Africans. It gave us the edge over Neanderthals in their own continent and their own climate, because good ideas can spread through trade. New weapons, new foods, new crafts, new ornaments, new tools. Suddenly you are no longer relying on the inventiveness of your own tribe or the capacity of your own territory. You are drawing upon ideas that occurred to anybody anywhere anytime within your trading network.

In the same way, today, American consumers do not have to rely only on their own citizens to discover new consumer goods or new medicines or new music: The Chinese, the Indians, the Brazilians are also able to supply them.

That is what trade does. It creates a collective innovating brain as big as the trade network itself. When you cut people off from exchange networks, their innovation rate collapses. Tasmanians, isolated by rising sea levels about 10,000 years ago, not only failed to share in the advances that came after that time—the boomerang, for example—but actually went backwards in terms of technical virtuosity. The anthropologist Joe Henrich of the University of British Columbia argues that in a small island population, good ideas died faster than they could be replaced. Tierra del Fuego's natives, on a similarly inhospitable and small land, but connected by trading canoes across the much narrower Magellan strait, suffered no such technological regress. They had access to a collective brain the size of South America.
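Henrich's point about small, isolated populations can be illustrated with a toy simulation. This is a hypothetical sketch, not Henrich's actual model; the population sizes, copying bias, and noise level below are invented purely for illustration. The idea: each generation everyone imitates the most skilled individual, but copying is noisy and on average falls short of the model, so only a large pool of learners reliably produces copies that match or exceed the original.

```python
import random

def mean_skill_after(pop_size, generations=200, seed=1):
    """Each generation, every learner copies the single most skilled
    individual. Copies fall short of the model on average (bias -1.5)
    with some noise (sd 0.7). In a large population, enough lucky
    copies overshoot the model that skill ratchets upward; in a small
    one, the best copy usually falls short and skill decays."""
    rng = random.Random(seed)
    skills = [10.0] * pop_size
    for _ in range(generations):
        best = max(skills)
        skills = [best + rng.gauss(-1.5, 0.7) for _ in range(pop_size)]
    return sum(skills) / pop_size

small = mean_skill_after(pop_size=20)    # a Tasmania-sized band of learners
large = mean_skill_after(pop_size=2000)  # a continent-wide trading network
print(small < large)  # the small population ends up far less skilled
```

The ratchet only turns when the best of many noisy copies tends to exceed the model, which is why the number of connected learners matters more in this sketch than any individual's inventiveness.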

Which is of course why the Internet is such an exciting development. For the first time humanity has not just some big collective brains, but one truly vast one in which almost everybody can share and in which distance is no obstacle.

The political implications are obvious: that human collaboration is necessary for society to work; that the individual is not—and has not been for 120,000 years—able to support his lifestyle; that trade enables us to work for each other not just for ourselves; that there is nothing so antisocial (or impoverishing) as the pursuit of self-sufficiency; and that authoritarian, top-down rule is not the source of order or progress.

Hayek understood all this. And it's time most archaeologists and anthropologists, as well as some politicians and political scientists, did as well.

Mr. Ridley writes the Journal's weekly Mind & Matter column. He is the author of "The Rational Optimist: How Prosperity Evolves" (Harper, 2010). This op-ed is adapted from his Hayek Prize lecture, given under the auspices of the Manhattan Institute, to be delivered on Sept. 26.


Crafty_Dog

  • Administrator
  • Power User
  • *****
  • Posts: 72293
    • View Profile
Snowboarding crow
« Reply #108 on: January 13, 2012, 06:31:36 AM »
This footage seems to me to be quite extraordinary. Apparently the crow has observed humans snowboarding and has taken up the sport himself.

Thus we see:

a) cross-species learning
b) the use of a tool
c) play

http://www.youtube.com/watch?v=3dWw9GLcOeA&feature=share

Crafty_Dog

  • Administrator
  • Power User
  • *****
  • Posts: 72293
    • View Profile
WSJ: Lionel Tiger on Facebook
« Reply #109 on: February 06, 2012, 05:38:47 AM »

By LIONEL TIGER

When the first phone line linked two New England towns, the inevitable arrogant scold asked if the people of town X had anything to say to the folks of town Y. His implication was "no." Why have more to do with (implicitly fallen) fellow humans than absolutely necessary? Why should technology abet friendliness?

Mr. Scold was wrong. One of the most successful magazine launches of the last decades was People, carefully and endlessly just about that, week in and week out, year after year. Europe boasts a strange menagerie of similar publications that ceaselessly chronicle the libidinous events in the lives of minor Scandinavian royalty and the housing buys and sells of soccer stars before and after their divorces. Magazines pay the price of a used fighter plane for the first photo of the baby of certified stars.

People want to know about this town and that other town too. It's their nature.

Primates always want to know what is going on. If it's over the hill where you can't see for sure what's up, that's even more stimulating and important to secure long-range survival. Primates are intensely interested in each other and other groups. It was pointed out in the 1960s that in some ground-living species, members of the group glanced at the lead primate every 20 or 30 seconds. Think Louis Quatorze or Mick Jagger. Look, look, look—people are always on the lookout.

The human who has most adroitly—if at first innocently, and in the next weeks most profitably—capitalized on this is Facebook founder Mark Zuckerberg.

"Facebook." Get it? Not FootBook or ElbowBook. The face. It gets you a driver's license and stars send it out to fans. We know that many users' first and classical impulse was acquiring convivial acquaintance with young women. Facebook married that ancient Darwinian urgency to a cheap, brilliantly lucid, and endlessly replicable technology.

The result has been virtually incalculable and not only for Mr. Zuckerberg's lunch money. Nearly one-sixth of Homo sapiens is on Facebook. Half of Americans over age 12 are on it. It is world-wide and has been joined by other tools of conviviality such as Twitter. Nearly 15% of Americans already belong to that new tribe. There are others.

Mr. Zuckerberg has re-primatized a group of humans of unprecedented number, diffusion and intensity. His product costs him virtually nothing to produce—it is simply us. We enter his shop, display ourselves as attractively or interestingly as we can, replenish ourselves hourly or daily or by the minute, and do it for nothing. Doesn't cost him a nickel.

And why? Just because we're primates with endlessly deep interest in each other, with a knack and need to groom each other—either physically, as monkeys do, or with "What a nice hairdo/dress/divorce/promotion!" as Facebookworms do. There is much to transmit between towns and between people.

Mr. Zuckerberg bestrides vast business numbers once dreamt of only by toothpaste and soft-drink makers. This reflects a new commercial demography in which the consumer is not someone who wants something necessary, but rather one who seeks to assert simply what he is. And the tool he uses in order to become nothing more or less than an efficient, interesting and socially prosperous primate is the Facebook page.

The technology is new but the passion for connection isn't. In Paris a hundred years ago, pneumatic tubes ran all through the parts of town that could afford them, so messages could be written and sent as if by courier. When I was a student in London, there were mail deliveries twice a day and in some environs three. Homo sapiens wants to know, to exchange, to show its face.

And when the counting houses work triple-time recording the riches from all this, it will be sweet comedy to remember that Mr. Zuckerberg became the richest primatologist in the world because he gave his customers nothing new, except the chance to be their old ape selves.

Mr. Tiger, an emeritus professor of anthropology at Rutgers, is the author of "The Decline of Males" (St. Martin's, 2000) and, with Michael McGuire, of "God's Brain" (Prometheus Books, 2010).

bigdog

  • Power User
  • ***
  • Posts: 2321
    • View Profile
When the good do bad
« Reply #110 on: March 21, 2012, 05:58:15 PM »
http://www.nytimes.com/2012/03/20/opinion/brooks-when-the-good-do-bad.html?_r=1&ref=davidbrooks


"According to this view, most people are naturally good, because nature is good. The monstrosities of the world are caused by the few people (like Hitler or Idi Amin) who are fundamentally warped and evil.

This worldview gives us an easy conscience, because we don’t have to contemplate the evil in ourselves. But when somebody who seems mostly good does something completely awful, we’re rendered mute or confused.

But of course it happens all the time. That’s because even people who contain reservoirs of compassion and neighborliness also possess a latent potential to commit murder."

Crafty_Dog

  • Administrator
  • Power User
  • *****
  • Posts: 72293
    • View Profile
Re: Evolutionary biology/psychology
« Reply #111 on: March 22, 2012, 04:12:49 AM »
Indeed BD.  Coincidentally enough I am re-reading right now a rather thick book that is a collection of essays on the concept, Jungian and otherwise, of the shadow.

DougMacG

  • Power User
  • ***
  • Posts: 19447
    • View Profile
Evolutionary psychology: America’s false autism epidemic
« Reply #112 on: April 26, 2012, 12:32:48 PM »
America’s false autism epidemic, by Dr. Allen Frances, professor emeritus at Duke University’s department of psychiatry

The apparent epidemic of autism is in fact the latest instance of the fads that litter the history of psychiatry.

We have a strong urge to find labels for disturbing behaviors; naming things gives us an (often false) feeling that we control them. So, time and again, an obscure diagnosis suddenly comes out of nowhere to achieve great popularity. It seems temporarily to explain a lot of previously confusing behavior — but then suddenly and mysteriously returns to obscurity.

Not so long ago, autism was the rarest of diagnoses, occurring in fewer than one in 2,000 people. Now the rate has skyrocketed to 1 in 88 in America (and to a remarkable 1 in 38 in Korea). And there is no end in sight.

Increasingly panicked, parents have become understandably vulnerable to quackery and conspiracy theories. The worst result has been a reluctance to vaccinate kids because of the thoroughly disproved and discredited suggestion that the shots can somehow cause autism.

There are also frantic (and probably futile) efforts to find environmental toxins that might be harming developing brains, explaining the sudden explosion of autism.

Anything is possible, but when rates rise this high and this fast, the best bet is always that there has been a change in diagnostic habits, not a real change in people or in the rate of illness.

So what is really going on to cause this “epidemic”?

Perhaps a third of the huge jump in rates can be explained by three factors: the much-increased public and provider awareness of autism, the much-reduced stigma associated with it and the fact that the definition of autism has been loosened to include milder cases.

Sixteen years ago, when we updated the DSM (the official manual of psych diagnoses) for the fourth edition, we expanded the definition of autism to include Asperger's. At the time, we expected this to triple the rate of diagnosed cases; instead, it has climbed 20 times higher.

That unexpected jump has three obvious causes. Most important, the diagnosis has become closely linked with eligibility for special school services.

Having the label can make the difference between being closely attended to in a class of four versus being lost in a class of 40. Kids who need special attention can often get it only if they are labeled autistic.

So the autism tent has been stretched to accommodate a wide variety of difficult learning, behavioral and social problems that certainly deserve help — but aren’t really autism. Probably as many as half of the kids labeled autistic wouldn’t really meet the DSM IV criteria if these were applied carefully.

Freeing autism from its too tight coupling with service provision would bring down its rates and end the “epidemic.” But that doesn’t mean that school services should also be reduced. The mislabeled problems are serious in their own right, and call out for help.

The second driver of the jump in diagnosis has been a remarkably active and successful consumer advocacy on autism, facilitated by the power of the Internet. This has had four big upsides: the identification of previously missed cases, better care and education for the identified cases, greatly expanded research and a huge reduction in stigma.

But there are two unfortunate downsides: Many people with the diagnosis don’t really meet the criteria for it, and the diagnosis has become so heterogeneous that it loses meaning and predictive value. This is why so many kids now outgrow their autism. They were never really autistic in the first place.

A third cause has been overstated claims coming from epidemiological research — studies of autism rates in the general population. For reasons of convenience and cost, the ratings in the studies always have to be done by lay interviewers, who aren’t trained as clinicians and so are unable to judge whether the elicited symptoms are severe and enduring enough to qualify as a mental disorder.

It’s important to understand that the rates reported in these studies are always upper limits, not true rates; they exaggerate the prevalence of autism by including people who’d be excluded by careful clinical interview. (This also explains why rates can change so quickly from year to year.)

So where do we stand, and what should we do? I am for a more careful and restricted diagnosis of autism that isn’t driven by service requirements. I am also for kids getting the school services they need.

The only way to achieve both goals is to reduce the inordinate power of the diagnosis of autism in determining who gets what educational service. Psychiatric diagnosis is devised for use in clinical settings, not educational ones. It may help contribute to educational decisions but should not determine them.

Human nature changes slowly, if at all, but the ways we label it can change fast and tend to follow fleeting fashions.

Dr. Allen Frances, now a professor emeritus at Duke University’s department of psychiatry, chaired the DSM IV task force.

Read more: http://www.nypost.com/p/news/opinion/opedcolumnists/america_false_autism_epidemic_jfI7XORH94IcUB795b6f7L

JDN

  • Power User
  • ***
  • Posts: 2004
    • View Profile
Re: Evolutionary biology/psychology
« Reply #113 on: April 27, 2012, 09:55:38 AM »
On a personal note, an acquaintance of mine at a local 4 year college sought and was diagnosed with Attention Deficit Disorder.  He was given private instruction, private tutors, extra time on tests, exemptions from certain requirements, etc.  At what cost?  And frankly, IMHO his degree is suspect.  My heart bleeds for the truly handicapped; but this has become ridiculous. 

Crafty_Dog

  • Administrator
  • Power User
  • *****
  • Posts: 72293
    • View Profile
Silicon Valley says step away from the device
« Reply #114 on: July 25, 2012, 07:01:53 PM »


Silicon Valley Says Step Away From the Device
Tech firms are uneasy over the effect time online has on relationships.
By MATT RICHTEL
Published: July 23, 2012
 
Stuart Crabb, a director in the executive offices of Facebook, naturally likes to extol the extraordinary benefits of computers and smartphones. But like a growing number of technology leaders, he offers a warning: log off once in a while, and put them down.

The New York Times

In a place where technology is seen as an all-powerful answer, it is increasingly being seen as too powerful, even addictive.

The concern, voiced in conferences and in recent interviews with many top executives of technology companies, is that the lure of constant stimulation — the pervasive demand of pings, rings and updates — is creating a profound physical craving that can hurt productivity and personal interactions.

“If you put a frog in cold water and slowly turn up the heat, it’ll boil to death — it’s a nice analogy,” said Mr. Crabb, who oversees learning and development at Facebook. People “need to notice the effect that time online has on your performance and relationships.”

The insight may not sound revelatory to anyone who has joked about the “crackberry” lifestyle or followed the work of researchers who are exploring whether interactive technology has addictive properties.

But hearing it from leaders at many of Silicon Valley’s most influential companies, who profit from people spending more time online, can sound like auto executives selling muscle cars while warning about the dangers of fast acceleration.

“We’re done with this honeymoon phase and now we’re in this phase that says, ‘Wow, what have we done?’ ” said Soren Gordhamer, who organizes Wisdom 2.0, an annual conference he started in 2010 about the pursuit of balance in the digital age. “It doesn’t mean what we’ve done is bad. There’s no blame. But there is a turning of the page.”

At the Wisdom 2.0 conference in February, founders from Facebook, Twitter, eBay, Zynga and PayPal, and executives and managers from companies like Google, Microsoft, Cisco and others listened to or participated in conversations with experts in yoga and mindfulness. In at least one session, they debated whether technology firms had a responsibility to consider their collective power to lure consumers to games or activities that waste time or distract them.

The actual science of whether such games and apps are addictive is embryonic. But the Diagnostic and Statistical Manual of Mental Disorders, widely viewed as the authority on mental illnesses, plans next year to include “Internet use disorder” in its appendix, an indication researchers believe something is going on but that requires further study to be deemed an official condition.

Some people disagree there is a problem, even if they agree that the online activities tap into deep neurological mechanisms. Eric Schiermeyer, a co-founder of Zynga, an online game company and maker of huge hits like FarmVille, has said he has helped addict millions of people to dopamine, a neurochemical that has been shown to be released by pleasurable activities, including video game playing, but also is understood to play a major role in the cycle of addiction.

But what he said he believed was that people already craved dopamine and that Silicon Valley was no more responsible for creating irresistible technologies than, say, fast-food restaurants were responsible for making food with such wide appeal.

“They’d say: ‘Do we have any responsibility for the fact people are getting fat?’ Most people would say ‘no,’ ” said Mr. Schiermeyer. He added: “Given that we’re human, we already want dopamine.”

Along those lines, Scott Kriens, chairman of Juniper Networks, one of the biggest Internet infrastructure companies, said the powerful lure of devices mostly reflected primitive human longings to connect and interact, but that those desires needed to be managed so they did not overwhelm people’s lives.

“The responsibility we have is to put the most powerful capability into the world,” he said. “We do it with eyes wide open that some harm will be done. Someone might say, ‘Why not do so in a way that causes no harm?’ That’s naïve.”

“The alternative is to put less powerful capability in people’s hands and that’s a bad trade-off,” he added.

Mr. Crabb, the Facebook executive, said his primary concern was that people live balanced lives. At the same time, he acknowledges that the message can run counter to Facebook’s business model, which encourages people to spend more time online. “I see the paradox,” he said.

The emerging conversation reflects a broader effort in the valley to offer counterweights to the fast-paced lifestyle. Many tech firms are teaching meditation and breathing exercises to their staff members to help them slow down and disconnect.

At Cisco, Padmasree Warrior, the chief technology and strategy officer and its former head of engineering, a position where she oversaw 22,000 employees, said she regularly told people to take a break and a deep breath, and did so herself. She meditates every night and takes Saturday to paint and write poetry, turning off her phone or leaving it in the other room.

“It’s almost like a reboot for your brain and your soul,” she said. She added of her Saturday morning digital detox: “It makes me so much calmer when I’m responding to e-mails later.”

Kelly McGonigal, a psychologist who lectures about the science of self-control at the Stanford School of Medicine (and has been invited to lecture at the business school at Stanford), said she regularly talked with leaders at technology companies about these issues. She added that she was impressed that they had been open to discussing a potential downside of their innovations. “The people who are running these companies deeply want their technology and devices to enhance lives,” said Dr. McGonigal. “But they’re becoming aware of people’s inability to disengage.”

She also said she believed that interactive gadgets could create a persistent sense of emergency by setting off stress systems in the brain — a view that she said was becoming more widely accepted.

“It’s this basic cultural recognition that people have a pathological relationship with their devices,” she said. “People feel not just addicted, but trapped.”

Michelle Gale, who recently left her post as the head of learning and development at Twitter, said she regularly coached engineers and executives at the company that their gadgets had addictive properties.

“They said, ‘Wow, I didn’t know that.’ Or, ‘I guess I knew that but I don’t know what to do about it,’ ” recalled Ms. Gale, who regularly organized meditation and improvisation classes at Twitter to encourage people to let their minds wander.

Google has started a “mindfulness” movement at the company to teach employees self-awareness and to improve their ability to focus. Richard Fernandez, an executive coach at Google and one of the leaders of the mindfulness movement, said the risks of being overly engaged with devices were immense.

“It’s nothing less than everything,” he said, adding that if people can find time to occasionally disconnect, “we can have more intimate and authentic relationships with ourselves and those we love in our communities.”

Google, which owns YouTube, earns more ad revenue as people stay online longer. But Mr. Fernandez, echoing others in Silicon Valley, said they were not in business to push people into destructive behavior.

“Consumers need to have an internal compass where they’re able to balance the capabilities that technology offers them for work, for search, with the qualities of the lives they live offline,” he said.

“It’s about creating space, because otherwise we can be swept away by our technologies.”


Crafty_Dog

  • Administrator
  • Power User
  • *****
  • Posts: 72293
    • View Profile
Ontogeny, phylogeny, Lamarck, epigenetics
« Reply #115 on: September 16, 2012, 01:35:11 PM »
Proposition:

"Ontogeny recapitulates phylogeny."

True or false?


In a somewhat related vein:

http://en.wikipedia.org/wiki/Lamarckism   I remember reading a comment many years ago that criticized something Konrad Lorenz had said as being Lamarckian, but this past year I read Matt Ridley's "Nature via Nurture" -- a book which I found quite exciting, though certain passages went right over my head with nary a look back -- which seemed to me to resurrect the question.   In a related vein, there is this: http://en.wikipedia.org/wiki/Epigenetics
« Last Edit: September 16, 2012, 01:53:55 PM by Crafty_Dog »

objectivist1

  • Power User
  • ***
  • Posts: 1059
    • View Profile
Recapitulation theory...
« Reply #116 on: September 16, 2012, 01:45:00 PM »
This is from Wikipedia, which I know is not necessarily the authoritative source, but I've also read articles by modern biologists which state that this theory is not valid:

Haeckel

Ernst Haeckel attempted to synthesize the ideas of Lamarckism and Goethe's Naturphilosophie with Charles Darwin's concepts. While often seen as rejecting Darwin's theory of branching evolution for a more linear Lamarckian "biogenic law" of progressive evolution, this is not accurate: Haeckel used the Lamarckian picture to describe the ontogenic and phylogenic history of the individual species, but agreed with Darwin about the branching nature of all species from one, or a few, original ancestors.[18] Since around the start of the twentieth century, Haeckel's "biogenetic law" has been refuted on many fronts.[7]
Haeckel formulated his theory as "Ontogeny recapitulates phylogeny". The notion later became simply known as the recapitulation theory. Ontogeny is the growth (size change) and development (shape change) of an individual organism; phylogeny is the evolutionary history of a species. Haeckel's recapitulation theory claims that the development of advanced species passes through stages represented by adult organisms of more primitive species.[7] Otherwise put, each successive stage in the development of an individual represents one of the adult forms that appeared in its evolutionary history.
For example, Haeckel proposed that the pharyngeal grooves between the pharyngeal arches in the neck of the human embryo resembled gill slits of fish, thus representing an adult "fishlike" developmental stage as well as signifying a fishlike ancestor. Embryonic pharyngeal slits, which form in many animals when the thin branchial plates separating pharyngeal pouches and pharyngeal grooves perforate, open the pharynx to the outside. Pharyngeal arches appear in all tetrapod embryos: in mammals, the first pharyngeal arch develops into the lower jaw (Meckel's cartilage), the malleus and the stapes. But these embryonic pharyngeal arches, grooves, pouches, and slits in human embryos could not at any stage carry out the same function as the gills of an adult fish.
Haeckel produced several embryo drawings that often overemphasized similarities between embryos of related species. The misinformation was propagated through many biology textbooks, and popular knowledge, even today. Modern biology rejects the literal and universal form of Haeckel's theory.[8]
Haeckel's drawings were disputed by Wilhelm His, who had developed a rival theory of embryology.[19] His developed a "causal-mechanical theory" of human embryonic development.[20]
Darwin's view, that early embryonic stages are similar to the same embryonic stage of related species but not to the adult stages of these species, has been confirmed by modern evolutionary developmental biology.
Modern status

The Haeckelian form of recapitulation theory is now considered defunct.[21] However, embryos do undergo a period where their morphology is strongly shaped by their phylogenetic position, rather than selective pressures.[22]
"You have enemies?  Good.  That means that you have stood up for something, sometime in your life." - Winston Churchill.

Crafty_Dog

  • Administrator
  • Power User
  • *****
  • Posts: 72293
    • View Profile
Pinker: The Decline of Violence
« Reply #117 on: February 21, 2013, 02:30:11 PM »


Crafty_Dog

  • Administrator
  • Power User
  • *****
  • Posts: 72293
    • View Profile
Noah's Ark for DNA
« Reply #119 on: April 22, 2013, 09:05:34 PM »


WSJ
By WILLIAM Y. BROWN

DNA was the topic of U.S. Supreme Court argument on April 15. Can a gene be patented if it occurs in nature—which is generally grounds for exclusion—but has been identified by an individual scientist or company and removed from the cells in which it occurs? Lower courts are split on the matter, and the justices didn't tip their hands.

But whether a gene can be patented will be irrelevant if it disappears before anyone has identified it. That is what's happening now and will continue to happen—at a rate perhaps 100 to 200 times faster than in prehistoric days—due to modern man's outsize influence on nature and encroachment on habitat. Unless we have sequenced a species' DNA, extinction means gone forever and never really known. Preservation of the DNA is the simpler, cheaper route, with sequencing to follow. If the Library of Congress is where every book is stored, the world needs the equivalent for species DNA.

Preserving the DNA of known species would provide genetic libraries for research and commerce and for recovery of species that are endangered—the Amur leopard and the northern right whale, for example. Preservation would also offer the potential to restore species that have gone extinct. We currently lack preserved DNA for most of the 1.9 million species that have been named, but that is fewer than the number of people in Houston. No doubt additional species exist, but their DNA can be preserved as they are named. The job is doable.

Just a small fraction of species are maintained as living organisms in cultivation or captivity or are kept frozen as viable seeds or cells. These are the best, because whole, reproducing organisms can be grown from them by planting or cloning. Botanical gardens and zoos keep the living stuff. The Millennium Seed Bank at Kew Gardens in England is on a course to preserve frozen seeds of all vascular plant species, and the Svalbard Seed Vault in Norway is taking seed duplicates from other facilities. The San Diego "Frozen Zoo" has some 20,000 viable cell cultures representing 1,000 vertebrate species, including "Lonesome George," the last Pinta Island Galapagos tortoise, which expired last year. Its DNA would have disintegrated if the Frozen Zoo hadn't made a heroic mission after the tortoise's death to get a sample.

For a fraction more species, DNA is kept at low temperature in dead cells or in extracted form. The American Museum of Natural History in New York keeps 70,000 samples in liquid nitrogen, the Academy of Natural Sciences in Philadelphia has frozen samples for 4,000 bird species, and the National Museum of Natural History at the Smithsonian has embarked on an ambitious course to freeze species tissues.

Yet the DNA of most species is still not preserved. We need a plan. One might think that preserving the DNA of life on earth would cost a moonshot of money. But a viable cell culture in liquid nitrogen for a species at the Frozen Zoo costs only $200 to $300 to establish and just $1 a year to maintain. Multiplying $250 per species by 1.9 million species comes to $475 million, ignoring what has already been done. The U.S. pays more than twice that daily on the national debt. But let's be real, nobody is throwing new money around, even when the priority is obvious.
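The op-ed's back-of-the-envelope figure is easy to verify; here is a minimal sketch of the arithmetic (figures taken from the article, with the $250 per-species cost assumed as the midpoint of the quoted $200-$300 establishment range):

```python
# Back-of-the-envelope check of the op-ed's preservation-cost estimate.
# All figures come from the article; the $250 per-species cost is an
# assumed midpoint of the quoted $200-$300 establishment range.
species_named = 1_900_000       # named species, ignoring work already done
cost_per_species = 250          # dollars to establish one frozen cell culture
annual_upkeep = 1               # dollars per species per year in liquid nitrogen

setup_total = species_named * cost_per_species
upkeep_total = species_named * annual_upkeep

print(f"One-time establishment: ${setup_total:,}")   # $475,000,000
print(f"Annual maintenance:     ${upkeep_total:,}")  # $1,900,000
```

At roughly $1.9 million a year, ongoing upkeep would be a rounding error next to the one-time establishment cost.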

There is another way that could work, and would be much cheaper. First, we could develop a website to track progress on preservation whose key information is managed directly by contributing facilities. It would be a "wiki" site for DNA repositories, and many keepers would be delighted to share information if they could manage it themselves. They could both update holdings and let people know what species they will take and under what conditions.

Second, we can establish new incentives and mandates for contributing specimens, including grant, publication and permit requirements. Some grant makers and publications already require that DNA information be shared with a genetic information bank kept by the National Institutes of Health. Why not tissue too?

Third, donors who care could help develop and fund "citizen science" projects of museums and nonprofit groups to collect, identify and contribute specimens to repositories. The collections would grow, and so might public connection to nature. At the end of it all, we will preserve what we appreciate. And patent lawyers will be happy too, because they'll have something to fight about.

Mr. Brown, a former president of the Academy of Natural Sciences, is a senior fellow at the Brookings Institution.

Crafty_Dog

  • Administrator
  • Power User
  • *****
  • Posts: 72293
    • View Profile
Wired for Culture: Origins of the Human Social Mind by Mark Pagel
« Reply #122 on: June 20, 2013, 05:07:42 AM »
Hat tip to David Gordon


Wired for Culture: Origins of the Human Social Mind by Mark Pagel
W. W. Norton & Company | 2012 | ISBN: 0393065871, 0393344207 | English | 432 pages

A fascinating, far-reaching study of how our species' innate capacity for culture altered the course of our social and evolutionary history.

A unique trait of the human species is that our personalities, lifestyles, and worldviews are shaped by an accident of birth—namely, the culture into which we are born. It is our cultures and not our genes that determine which foods we eat, which languages we speak, which people we love and marry, and which people we kill in war. But how did our species develop a mind that is hardwired for culture—and why?

Evolutionary biologist Mark Pagel tracks this intriguing question through the last 80,000 years of human evolution, revealing how an innate propensity to contribute and conform to the culture of our birth not only enabled human survival and progress in the past but also continues to influence our behavior today. Shedding light on our species’ defining attributes—from art, morality, and altruism to self-interest, deception, and prejudice—Wired for Culture offers surprising new insights into what it means to be human.


ccp

  • Power User
  • ***
  • Posts: 19763
    • View Profile
Evolution - out of God's hands into ours
« Reply #124 on: October 16, 2013, 07:32:51 AM »
Not long ago many people wondered if we are still "evolving."  How can we be if there is no survival of the fittest?  Even those who are not "fit" still get to survive and reproduce in our society.

Now it is clear.  Not only are we evolving, but evolution will accelerate.  We will soon begin to control our evolution and speed it up: from simply choosing the sex of babies, to excising flawed DNA, to inserting chosen DNA.  Parents will be able to view menus of traits.  You want your son to be tall and athletic?  How about an IQ of 180?  An extrovert?  High energy?

No problem.

Not only will evolution accelerate so that we develop master races of humans; we will be controlling it.


Crafty_Dog

  • Administrator
  • Power User
  • *****
  • Posts: 72293
    • View Profile
Archibald MacLeish
« Reply #125 on: October 16, 2013, 09:12:01 AM »
There is, in truth, a terror in the world.  Under the hum of the miraculous machines and the ceaseless publications of the brilliant physicists a silence waits and listens and is heard.

It is the silence of apprehension.  We do not trust our time, and the reason we do not trust our time is because it is we who have made the time, and we do not trust ourselves.  We have played the hero's part, mastered the monsters, accomplished the labors, become gods-- and we do not trust ourselves as gods.  We know what we are.

In the old days the gods were someone else; the knowledge of what we are did not frighten us.   There were Furies to pursue the Hitlers, and Athenas to restore the Truth.  But now that we are gods ourselves we bear the knowledge for ourselves-- like that old Greek hero who learned when all his labors had been accomplished that it was he himself who had killed his son.


Crafty_Dog

  • Administrator
  • Power User
  • *****
  • Posts: 72293
    • View Profile

DougMacG

  • Power User
  • ***
  • Posts: 19447
    • View Profile
Human Ancestors Were Consuming Alcohol 10 Million Years Ago
« Reply #128 on: December 28, 2014, 01:47:08 PM »
We haven't changed as much as we think?

Human Ancestors Were Consuming Alcohol 10 Million Years Ago
http://blogs.discovermagazine.com/d-brief/2014/12/01/human-ancestors-were-consuming-alcohol-10-million-years-ago/#.VKB4d_9LIAA

Crafty_Dog

  • Administrator
  • Power User
  • *****
  • Posts: 72293
    • View Profile
Stratfor: What drives people to the extreme
« Reply #129 on: January 14, 2015, 10:26:31 AM »

What Drives People to the Extreme
Global Affairs
January 14, 2015 | 09:00 GMT

By Dr. Luc De Keyser

When invited to write a column for Stratfor, I volunteered too quickly to write about the phenomenon that is the Islamic State and the apparent copycat groups. The shock and awe of the Islamic State's eruption onto the world stage threw a wrench into my ever-developing model for gauging human behavior. It is easy to admit that this essay comes too soon. It is harder to admit that, most likely, this essay will not be in time for the next atrocity. (Indeed, I sat down to write this as the world was still mourning the deadly attack on Charlie Hebdo.)

Most geopolitical analysis draws from backgrounds in humanities and the social sciences. Stratfor might look at geography, history and economics when trying to understand a group such as the Islamic State. I, on the other hand, draw from a background in the exact and life sciences. This work stems from my preoccupation with the dimension of evolutionary biology that studies the origin of human disease. And within this setting, it is easy to accept that the actions of a group like the Islamic State must be classified as manifest symptoms of a severe form of "dis-ease," as in "not being at ease." This is not to say that the regions in the world where Islamic State fighters are coming from or what they are fighting against are not part of the overarching physiopathological picture of what many consider an upcoming black plague. But in any case, carefully tracing the cause and effect chains from the individual to the next of kin, then to the extended group and up to the nation will prove too brittle still to generate a reliable prognosis and guidelines for preventive and curative measures. So, why bother?

To become a reliable forecaster, it is important to understand that "there are also unknown unknowns — the ones we don't know we don't know," to quote former U.S. Defense Secretary Donald Rumsfeld. Although his statement may have been inspired by Nassim Nicholas Taleb's black swan theory, it does read like the modern sound bite of the warnings of Jesus ben Sirach, inscribed in the Old Testament's Apocrypha: "What is too sublime for you, do not seek; do not reach into things that are hidden from you. What is committed to you, pay heed to; what is hidden is not your concern" (Sirach 3:21-22). Let me try an alternative, post-Darwinian exegesis.
Principles of Evolutionary Psychology

We all know the adage, "If all you have is a hammer, everything looks like a nail." I propose the following variant: "If all you have is a human mind, everything looks like a situation in the life of the hunter-gatherer we have forgotten we still are." We are generally unaware of this perspective, and more important, it may not even matter whether we are aware.

This new adage is the consequence of a couple of basic principles of evolutionary psychology, outlined most succinctly by University of California, Santa Barbara, professors John Tooby and Leda Cosmides in their 1997 publication, "Evolutionary Psychology: A Primer." One is that "our neural circuits were designed by natural selection to solve problems that our ancestors faced during our species' evolutionary history." The other is that "Our modern skulls house a stone age mind." From those I deduce that every modern-day problem can, at best, be reduced to one or more problems of a complexity we humans are "naturally" wired to solve. Thus, there is hardly a guarantee that the solutions found will fit the problem at hand, no matter how hard we try.
But Aren't Humans 'Sapiens'?

There are several objections to this conclusion: 1) The human brain has an almost infinite capacity to learn and over time will attain the capacity to deal with problems of ever-increasing complexity; 2) evolution has not stopped and will progressively cause new wirings in the human brain to whatever is needed to overcome any problems that may arise; and 3) every problem can certainly be decomposed into more simple problems until they reach a level that even a Stone Age mind can handle.

Let me address these objections in reverse order, starting with the one that is least controversial:

3) It does not take much thought to accept that most relevant problems cannot be fully and faithfully decomposed into a consistent logical tree of underlying problems. The accuracy of such decomposition is limited by the specialized logic of our thinking machinery, which is dedicated only to the problem set typical of an ancestral lifestyle. The scope of this natural logic covers only a small part of the domain that mathematical logic encompasses, which in itself is not sufficient to embrace the complexity of most issues pertinent to our modern times.

Moreover, even without these logical and neurophysiologic limitations, it is not very likely that the broken-down problems can be solved within a relevant time frame. Doing so would mean that humans could, within a couple of decades, revert to the conditions of existence under which the ancestral solutions worked, even though it took millions of years for humans to evolve. Imagine we could decompose the sudden surge of the Islamic State into problems at the individual level, such as deprivation of attachment as an infant, lack of trustworthy parent figures as a toddler, underdeveloped confidence to master life enough to build a future as an adolescent, etc. Even if these are not the actual causes, they illustrate how desperately hard each of them would be to solve.

2) Of course evolution has not stopped. We know that since the relatively recent Neolithic age, man has evolved a lighter skin color when dwelling at higher latitudes, an active lactose digestion enzyme that persists into adulthood among pastoralists, red blood cells that deform to resist malaria in swampy areas, and so on. Not only has the human genome evolved, but so, too, has the complex microbiome that has co-evolved with our species — that is, the fauna and flora of microbes, viruses and fungi that have called the body of Homo sapiens, inside and out, their home. And yes, the brain has also been the site of numerous mutations since then. But these persisting mutations may not necessarily have upgraded the brain in a direction we would interpret today as beneficial. For example, the average weight of the human brain has dropped by up to double-digit percentages since the dawn of systematic agriculture, which does not help the argument that the brain has become smarter since then. In addition, there are statistical trends suggesting that during some periods since the Neolithic Revolution, regional selective pressures have even increased the rate of evolution. Despite all these considerations, what is fundamental is that the time frame over which evolution acts is much, much longer than the time frame within which the post-Neolithic problems thrown at humankind demand solutions that have never been tried.

1) And then there is this exaltation of the prowess of the human mind. Biologically, this feature is just one of many in the broad lineup of leaves that make up the growing edge of the tree of evolution, which covers the more than 10 million extant species. In this context, this cerebrocentric posturing makes as much sense as a giraffe bragging about its long neck. Of course, this comparison is hard to accept when we praise our kind for producing the works of Shakespeare, constructing the Great Wall, putting man on the moon, building great civilizations and so on. Still, evolutionary psychologists would argue that these are mere extensions of innate abilities within limits set by the earlier natural selection process. Thus, there is nothing wrong with being proud, per se, were it not that our appreciation is quite biased and strongly tends to ignore or downplay the various adverse effects associated with, or leading up to, most of these feats.

As a matter of fact, this lack of objectivity toward man's own accomplishments is probably another good example of those very limits.
The Human Brain's Actual Capacities

Man has also evolved a relatively sophisticated mental model of naive mechanical physics. It is easy to argue that this would come in handy in hunting prey with, for example, bow and arrow. This "talent" easily shines through in modern times as well. For example, if asked to run the 100-meter sprint in 10 seconds flat, it is quite obvious to most that only the top athletes can reach such speed. If asked to run the same distance in 5 seconds, most would readily recognize that this does not seem to be humanly possible. In a similar vein, if asked to learn Sanskrit or to unify the theory of general relativity and the quantum field theory by tomorrow, most will quickly agree that this is not feasible even for the most intelligent among us.

But if asked whether it is within humankind's capacity to assess the risk ramifications of very complex systems — such as the exploitation of nuclear energy, the setup of the worldwide economic and financial system, or the human effect on global warming — most would agree that these topics are, eventually, within the grasp of the human brain, if only given more time and more staff. Considering what the human brain was really programmed to handle, and the bewildering intricacies of the systems involved in those examples, this faith can only be a manifestation of über hubris. The proof is relatively straightforward. I am sure most of us remember the statements of confidence before, and the statements of sheer surprise during, the most recent economic crisis. There is already evidence of a collective memory selective against the causes put forward for these catastrophic events and of a return of optimism toward pre-crisis levels. Many of us are not embarrassed to play the lottery or casino games despite the mathematical certainty of losing, on average. That is our ancestral brain at work: man's innate mental model for statistics never required that level of sophistication. And yet this is a mere sideshow compared with the seemingly boundless faith we have in the human brain to deal with matters that clearly exceed its intellectual capacity by several orders of magnitude.

Let me propose a number of reasons for this phenomenon. First, the brain has evolved to understand the particular world of the hunter-gatherer. It has not developed a capacity to understand a very different world. Such a world will be understood only in terms of patterns the brain recognizes as typical for the world it does understand. The fit, if any, can only be coincidental.

Second, the brain senses its environment according to the model it has evolved to understand. The interpretation of the signals coming from the senses is, so to say, preloaded. The brain, to work at all, therefore cannot withhold judgment of interpretation while sensing. The brain must provide itself an explanation at all times, even in the most artificial and unrealistic situations. For example, the night sky is littered with innumerable stars, some brighter than others. The brain cannot refrain from ordering the brightest among them into patterns, drawing imaginary lines to make up Zodiac signs that refer to familiar images. The brain abhors a vacuum of explanation.

Finally, the human organism, like any organism, is driven by an "elan vital," or "vital force." This is more an interpretation of the biological expression of the laws of thermodynamics that inexorably unfold in the universe than a magical form of energy. It is also not to be confused with the inborn fight-or-flight mechanisms that preserve one's life when in danger; those situations are part of the conditions of existence man is readied to deal with. The vital energy, rather, is expressed in the innate expectation that man fits his conditions of existence and that man will thrive, at least in the form of the social aggregate man typically lives in. This means that man's biological and social needs are translated into feelings of "soon to be satisfied." These drive human behavior to fulfill those needs until feelings of sufficient satisfaction are reached. And overall, this fulfillment is within reach, day in and day out, from season to season, from ancestors to descendants. It is cause for a general sense of optimism. The brain, however, has no means to deal adequately with living conditions that hold insufficient promise of a future for generations. A fight-or-flight reaction to a danger that only slowly becomes impending is likely completely inappropriate to the complexity of the real situation at hand. The brain abhors a vacuum of destiny. Depending on the particular stage of dis-ease, the brain may ignore the vacuum and whistle in the dark; it may fill in the vacuum with its own "wishful thinking"; or it may turn this vacuum into an existential fright.

The same principles are at work in dreams. One of the evolutionary psychological theories on dreams, the activation-synthesis theory, posits that "there is a randomness of dream imagery, and the randomness synthesizes dream-generated images to fit the patterns of internally generated stimulations." In other words, emotions flood the higher neural circuits, and the neocortex scrambles to interpret the myriad pulsating trigger trails. To do that, it captures the most readily available images, related or not, from short- and longer-term memory stores and combines them as quickly as the feelings unfold into a story, any story if it must. No wonder many dreams appear weird when recounted upon waking. But this has important implications for interpreting dreams. Instead of engaging in an interminable wild-goose chase, delving for magical meanings in the most unusual combinations of images and correlating them with events in someone's distant past, recent past and — the extrapolation is quickly made — future, it is much more revealing to ask about the predominant emotion during the dream and the progression of that feeling during the unfolding of the made-up story. That is the core of the dream. The story is only chatter, albeit in the foreground.
Disentangling Our Analysis of the Islamic State

When studying the Islamic State phenomenon and its ilk, the same principles — being aware of humans' hunter-gatherer mentality, knowing that humans' environmental sensing is preprogrammed and understanding our species' "elan vital" — can be applied. This works on at least two levels: the actions of the individual fighters themselves, and the reaction of a world feeling under threat. The current flood of analysis available in the global infosphere contains very erudite explanations and powerful conceptual placeholders in which to rest from the mental exhaustion of navigating the many possible cause-and-effect chains. But just as with the interpretation of dreams, the "primal" questions are not treated nearly as extensively as they warrant. An obvious one: to what emotional state must a Homo sapiens be driven before he triggers his explosive belt, or shoots in cold blood each member of a row of fellow Homo sapiens taken prisoner?

Take a paradigm such as the theory of plate tectonics. It gives a coherent explanation for the particular positions of volcanoes across the globe and of regions with a high risk of earthquakes. As useful as it is — for planning communities and evacuation routes, for example — the theory is still insufficient to precisely predict the majority of actual eruptions and tremors. Tracking the emotional magma flows underlying the Islamic State's emergence likewise remains insufficient to predict the next outbreak of barbaric violence reliably enough to prevent it.

But the analysis of this daytime nightmare proves useful because it separates the chatter from the core and applies it at the level of the individual and of the group. At the individual level, the chatter is made up of the complex of narratives the different stakeholders, perpetrators and victims put forth to make sense of it all, each from the perspective of his or her own culture and subculture. The core consists of the conditions of existence that were so overwhelmingly discordant with those the human genome was prepared for that it triggered this series of dramatic events. The efforts to improve those conditions are much more to the point than efforts to debunk the different narratives. And the conditions that need to be improved in this case are more in the sociological tier than in the economic tier.

At the group level, the analysis remains very grainy. The collective gut feelings of the group's members percolate up through multiple layers of aggregation and along various sinuous paths. Even with unique mega-events like the "Je suis Charlie" march in Paris last weekend, it is not clear whether the present pyroclastic clouds are cloaking the birth of a new supervolcano. Whatever the outcome, the pent-up geopolitical pressures are real and will need more than an impromptu Twitter message to rally people.

In future columns I will discuss a number of those conditions of existence that are specific to the human species and are required for healthy development. These play an important role in the elaboration of the moral and social rules and conventions that make up the organizational matrix of civilizations, small and grand. In the wake of the Human Genome Project, it is high time to give these principles the prominence they deserve in the redesign of our social matrix. Recycling the staggering emotional energy released in the aftermath of recently publicized savageries would be a fitting way to mourn the dead in Paris, lest the tragedy and its global response be in vain.

Read more: What Drives People to the Extreme | Stratfor

Crafty_Dog

  • Administrator
  • Power User
  • *****
  • Posts: 72293
    • View Profile
Human directed evolution
« Reply #130 on: May 29, 2015, 07:43:48 PM »
Building with Biology
Several of my newsletters in the last few weeks have reported on a recent trip to California during which I visited Google, Facebook, and Udacity--remarkable companies undertaking projects with the potential to change the world. But of all the fascinating experiences on the trip, the best might have been the visit to my friend Lynn Rothschild’s lab at NASA's Ames Research Center in Mountain View, California.
 
Lynn and her students are developing projects that blend biology and technology in mind-bending ways. As a synthetic biologist and astrobiologist, Lynn studies the building blocks of life. She thinks both about where life might exist on other planets--the clouds of Venus, for instance--and about new ways to assemble those building blocks here on Earth. The latter effort holds amazing potential for practical applications, discoveries that could change our lives and the materials we encounter every day.
Lynn coaches the Stanford-Brown team in the international iGEM challenge, a competition for students to create "bio-bricks"--useful DNA sequences that can be inserted into cells to give them certain desirable properties, like water resistance or tolerance to high temperatures. The idea is that bio-bricks, like a kind of DNA LEGOs, could be assembled into basic living organisms or materials that could be useful to humans.
 
One example might be engineering a cell that generates cotton fibers. Assemble the right combination of DNA, and there could be a way to produce whole pieces of cloth in a factory setting (rather than growing cotton in a field and weaving it on a loom). Another idea--the team's 2013 entry--is BioWires, which embeds individual atoms of silver into strands of DNA, resulting in nanowires that conduct electricity.
In 2012, Lynn's team took genetic features from a variety of organisms in harsh places on Earth--life surviving in extreme cold, or low oxygen, or high radiation, or almost no water--and assembled them into one tough bacterium that could potentially survive on Mars. They dubbed it the “Hell Cell”. Those features, in theory, could be paired with still more genetic features--the thread production, for instance--and sent to Mars to replicate and grow ahead of a human mission to the planet.
 
Last year, Lynn challenged the team to solve a problem her NASA colleagues had experienced here on Earth--losing scientific sensing equipment in delicate environments, potentially polluting them. Lynn's suggestion to her students was to build a biodegradable drone. The team, which in 2014 included Spelman College, proved up to the challenge: they used a dried fungus for the body instead of plastic, and added proteins from wasp saliva to make it waterproof. The team believes they'll eventually be able to print the circuitry right onto the body in silver, and then find ways to power biological motors.
The team’s project this year is still a secret, but it’s even more intricate.
 
It was a privilege to see the pioneering work Lynn and her students are doing in the lab, with applications from medicine to materials. It was a great reminder after visiting three of Silicon Valley’s most innovative technology companies that a better future will come not just through breakthroughs in computing and communication, but through advances in biology as well.
Your Friend,
Newt

Crafty_Dog

  • Administrator
  • Power User
  • *****
  • Posts: 72293
    • View Profile
The New Chimpanzee Review
« Reply #131 on: March 11, 2018, 04:32:24 PM »
‘The New Chimpanzee’ Review: Mysteries of the Chimpanzees
Unusual among nonhuman primates, male chimpanzees are considerably more social than females.
By David Barash
March 9, 2018 4:14 p.m. ET

Ours is not really a planet of the apes. Rather, it is a planet overwhelmingly populated by one ape species: us. The other “great apes” include chimpanzees, bonobos, gorillas and orangutans, none of which are abundant.

There are many reasons to be interested in these creatures, not least that they are fascinating members of life’s panoply, worth knowing, observing and preserving for their own sakes. Long before biology’s evolution revolution, people recognized kinship with them—and with chimpanzees in particular. Regrettably, all of the great apes are now at risk of extinction, us included. It would not be in our interest to let the chimps fall where they may.
The New Chimpanzee

By Craig Stanford

Harvard, 274 pages, $35

There is something undeniably human-like about chimps, and chimp-like about humans, all of which is to be expected given that we share nearly 99% of our nuclear DNA with them (and with bonobos). Moreover, all three species—humans, chimps and bonobos—are more closely related to one another than to gorillas or orangutans. This fact has led Jared Diamond, of the University of California, Los Angeles, to label Homo sapiens the third chimpanzee. It has also led biologists, even before DNA sequencing was routine, to spend a great deal of time studying chimps.

The pioneer researchers in the field include Jane Goodall and three Japanese scientists little-known in the West but renowned among primatologists for their work primarily in the 1960s and ’70s: Junichiro Itani, Kinji Imanishi and Toshisada Nishida. Since this early work, our knowledge of chimpanzees has continued to expand thanks to an array of doughty field workers. Among the most productive has been Craig Stanford, whose book, “The New Chimpanzee,” is suitably subtitled “A Twenty-First-Century Portrait of Our Closest Kin.” Mr. Stanford began studying chimpanzees at Ms. Goodall’s now-famous Gombe Stream National Park in Tanzania more than three decades ago.

Mr. Stanford, a professor of biological sciences and anthropology at the University of Southern California, is a talented and fluent writer as well as an accomplished researcher. “My hope,” he writes, “is that readers will appreciate chimpanzees for what they are—not underevolved humans or caricatures of ourselves, but perhaps the most interesting of all the species of nonhuman animals with which we share our planet. The gift of the chimpanzee is the vista we are offered of ourselves. It is a gift at risk of disappearing as we destroy the chimpanzees’ natural world and drive them toward extinction.” I would add that the most valuable component of that vista is the glimpse we get not of ourselves but of those chimps for their own sake.

Researchers have unearthed remarkable cognitive abilities among chimpanzees, but such discoveries have been made using captive animals, either in labs or zoos. The findings of Mr. Stanford and his colleagues involve studying these animals in their natural environments, which is the only situation in which they can reveal the diversity and depth of their behavioral repertoire, notably as it reflects the impact of ecological cues (especially the location of fruiting trees) as well as the presence of competing social groups.

Ms. Goodall discovered that chimps use simple tools (including sticks for “fishing” termites out of their mounds) and occasionally hunt, ritually sharing the meat thereby obtained; they also engage in a form of intergroup aggression sometimes called (misleadingly, since it is altogether different from the human phenomenon) warfare. Mr. Stanford’s book expands upon what we have learned in the four decades since Ms. Goodall first began her field research. His chapter titles provide an outline.

In “Fission, Fusion, and Food,” we learn that the earlier conception that chimps live in chaotic, ever-changing social groups is not valid. Rather, they occupy “communities” whose constituents sometimes combine, sometimes split up, and are always influenced by the availability of food and estrus females. Unusual among nonhuman primates, males are considerably more social than females. “We now think,” Mr. Stanford writes, “that male cooperation is based mainly on the shared benefits of working together, with kin selection playing some role as well.” Elsewhere, he writes that alliances “tip the balance away from more powerful, lone actors in favor of lower-ranking males who team up briefly. In my own field studies, there was always a single alpha male, but his power at a given moment was highly dependent on those around him.”

In “Politics Is War Without Bloodshed,” a version of Clausewitz’s maxim that war is politics by other means, the reader is granted insight into the ways in which chimps—especially those highly social but no less scheming males—achieve dominance and, with it, reproductive success: “Some males are inveterate social climbers, cleverly serving their own ends by ingratiating themselves with high-ranking males and females. Others rely more on brute intimidation, which does not necessarily carry the day. And then there are males who seem to care little about their social status and are content to live out their lives on the edges of the struggle.” Of the nearly trite concept of alpha males, Mr. Stanford writes that “the most famous of all alphas in recorded chimpanzee history,” a chimp named Mahale alpha Ntologi, who was observed in Tanzania, “shared meat liberally as he rose in rank. But . . . once he had achieved alpha status, his generosity dropped, and he began sharing meat mainly with those whose political support he still needed most.” To his credit, the author refrains from pointing out human parallels.

The chapter “War for Peace” is a riveting discussion of intergroup aggression, in which males band together to ambush and occasionally raid neighboring groups, often with gruesome and lethal results. Chimps are the only primates, other than humans, that routinely kill members of the same species over access to resources. Adult females as well as males sometimes commit infanticide. As Frans de Waal demonstrated in impressive detail, chimps also engage in ritualized postconflict reconciliation—at least in captivity.

When it comes to “Sex and Reproduction,” things are comparably contradictory, with a degree of sexual free-for-all combined with exclusive consortships, in which females may cycle rapidly between apparent promiscuity and genuine sexual choosiness. We also learn about hunting, a cooperative endeavor whose goal (at least for males) appears to be enhanced mating opportunities as well as coalition-building. “We know that males use meat for a variety of political purposes. . . . One aspect of male manipulation of others was the use of meat to entice females to mate with them.” Also notable is the cultural transmission of certain behaviors, especially the use of tools to obtain food.

Despite its relative brevity, “The New Chimpanzee” is a remarkably thorough account of our current knowledge about free-living chimpanzees. Although it is tempting to try to use this knowledge to better understand human evolution and human nature, in many respects—notably, their inclinations toward violence—chimps are quite different from us. They may fight over females and territory using their hands and teeth, but we will fight for many additional reasons with the use of weapons, from clubs and knives to nuclear weapons.

My own inclination, when considering chimpanzees or any other animal, is to follow the advice of the early-20th-century naturalist Henry Beston: “The animal shall not be measured by man. In a world older and more complete than ours they move finished and complete, gifted with extensions of the senses we have lost or never attained, living by voices we shall never hear. They are not brethren, they are not underlings; they are other nations, caught with ourselves in the net of life and time, fellow prisoners of the splendor and travail of the earth.”

—Mr. Barash is an emeritus professor at the University of Washington. His next book is “Through a Glass Brightly: Using Science to See Our Species as We Really Are.”

Crafty_Dog

  • Administrator
  • Power User
  • *****
  • Posts: 72293
    • View Profile
The evolutionary biology of sex
« Reply #132 on: September 11, 2018, 09:15:59 AM »

Crafty_Dog

  • Administrator
  • Power User
  • *****
  • Posts: 72293
    • View Profile
Same sex mice reproduction
« Reply #134 on: October 13, 2018, 11:40:06 AM »

ccp

  • Power User
  • ***
  • Posts: 19763
    • View Profile
Re: Evolutionary biology/psychology
« Reply #135 on: October 13, 2018, 03:37:18 PM »

Crafty_Dog

  • Administrator
  • Power User
  • *****
  • Posts: 72293
    • View Profile
Re: Evolutionary biology/psychology
« Reply #136 on: October 14, 2018, 11:23:31 AM »
Did you notice the previous post?

 :-D

Crafty_Dog

  • Administrator
  • Power User
  • *****
  • Posts: 72293
    • View Profile
Endocrine Disruption
« Reply #137 on: November 11, 2018, 12:39:29 PM »
Let's make this the thread for this very important subject:

Pulling together previous references here on this forum we have:

=================

from 2006:

I FOUND THIS ON www.mikemahler.com. Check it out.

A Conversation with Dr. William Wong on Training, Testosterone, Growth Hormone, Acting like A Man, and Rites Of Passage

. . .

MM: Testosterone is a big topic these days and low testosterone seems to be more and more prevalent in men. What are some of the common factors contributing to low Testosterone levels?

DW: The low testosterone levels we see in guys these days are due to a few overlapping factors all relating back to one thing - estrogen.  Many baby boys born since 1973 have had soy formula with its phytoestrogen mucking up their works.  A boy age 6 months to 3 years has the testosterone level of an 18-year-old man!  A bottle of soy formula has the equivalent by weight of 5 to 8 estrogen birth control pills in it!  Multiply that by the number of bottles fed a day and the estrogen load is enormous!  What happens when all the E suppresses a boy's T?  It is this T that tells the anterior pituitary to develop a tiny part of itself, and it is that part of the body that tells a boy that he's a guy!  In autopsies done on over 3,000 gay men who died of HIV, I believe it was Dr. Lendon Smith, the famous pediatrician, who reported that homosexual men did not have this portion of the anterior pituitary!  Since that part develops in early childhood from the combination of testosterone, salt and calcium, and since homosexual men are high in estrogen and DHT but low in testosterone and generally as a group have low serum calcium and low serum sodium, we can see where the problem arises!  Transgender support groups have discovered where their dysfunction has arisen from, and now they are in the lead in warning about the dangers of soy and environmental estrogens on the development of children.

MM: Wow that is some pretty scary stuff. Where else are we being bombarded with estrogen?

DW: Next we have all the pesticides, fertilizers, soy in all food, flax, etc.  With one form of estrogen atop another acting as endocrine disruptors, we now have two generations of men since the '70s with: smaller penis size, both flaccid and erect, than previous generations; lower testosterone levels; higher estrogen levels; dreadfully lower sperm counts; higher incidences of sexual mental dimorphism (not being sure what sex they are); and, I fear, reaching andropause around 35 or 40 instead of 45 to 50, all because of the estrogen in their food and environment.

On sperm count, in the 1960s a man was considered fertile only if he had over 100,000 sperm per ml of semen. Things have gotten so bad that now a guy is considered fertile if he can make a measly 20,000 sperm per ml!  Dr. Doris Rapp, MD, the world's leading environmental doc and pediatric allergist, has looked at the data and predicts that by 2045 only 21% of the men on the entire planet will be fertile.  In all of Africa, Europe, Japan and many other countries, deaths exceed births.  This will have a devastating effect on world economies, as pensioners will drastically outnumber those paying into pension plans!  I call this the Zardoz effect, after the old Sean Connery movie where he was the last fertile man on earth.

MM: I have observed a scary trend of men being more effeminate. I wonder how much attitude and confidence have to do with T levels? I often hear men talk about how they need permission from their wives or girlfriends just to spend money they have earned or go out. Would you say that men who are dominated by their wives or girlfriends have lower T levels?

DW: I'm going out on a limb here and I'll likely get hate mail for what I'm about to say, but here goes.  Guy-gal relationships and marriage are a 50-50 proposition.  When men are dominated by their wives they have allowed this to happen, and it may show low T levels, or they are just "pussy whipped".  If the gal is very beautiful and desirable, the type of goddess men would kill for, then being "pussy whipped" is understandable.  (I'm married to one of those type gals!)   Most women who denigrate their men are nowhere near that good looking!  Not even close!   There are very few of those women around to account for all the "nebbish" men on the planet!   Since T levels are going lower and lower at earlier ages, I expect we'll see even more nebbish men on the planet soon.  Andropause (male menopause) is now hitting at 35!

MM: What are the reasons for women bossing their men around? Is it just due to the fact that these men have low T levels, or are they simply whipped?

DW: I have noticed that women boss their men around when they lose respect for them.  As providers, as pillars of strength, as builders, as the "hunks" these gals first married: the gleam is gone, the warts are showing, the dreams have deflated, and whatever failings the guy has are apparent.  This is when the bitterness of a woman's disappointment shows in her attitude, by taking on his role as head of household.   Having seen this many times over my five-plus decades, I can positively make that statement.

One thing that really gets to a woman is when she is not the center of her man's world.  Any guy who still prefers to hang out with his friends, go drinking, watch sports, or has not grown up is going to lose a woman's respect right quick.   In teaching martial arts, med school and exercise, I've told many a "boy man" to grow the f--k up and act responsibly.  To gals I offer this advice: marry an ex-serviceman or a fellow who's had a really hard life and has had to work to make it, as these guys are more in touch with being responsible, disciplined and productive than the "bad boys" who still act like they're in school.  Women unfortunately love and are highly attracted to the "bad boys," and their astonishment that the bad boys continue to be bad boys after the vows are said and the rings go on astonishes me.   What did they expect!

MM: Interesting points. A friend once noted that there is no rite of passage for boys into men in the modern world. How does that play into things?

DW: As the poet Robert Bly pointed out in his book "Iron John," men change from boys to men by a process of initiation.  Hard experiences, long suffering, deep teaching from wise elders, military boot camp: all qualify as an initiation if they have taught persistence, discipline and responsibility.   American Indian boys went into the field to hunt, fend for themselves, gain a deep understanding of what they could or must accomplish, survive, and hopefully gain a spiritual insight.  This changed them deeply and made them men, worthy to stand with the warriors and builders.  We live in a matriarchal society where fathers and grandfathers no longer guide their sons down the path to manhood.  Men are no longer ritually initiated into manhood.  The harshness of survival, grand effort and spiritual awakening is looked down upon as being primitive, and so we have the world filled with irresponsible "boy men".

MM: What about women?

DW: We now have a generation of daughters of the "liberated women" of the '70s.  These gals don't have a clue of what it means to make a household, tend to a family, make a meal or raise children.  They know corporate politics, restaurant reservations, and day care from birth.  These women have grown up in dysfunctional families where mom was a dominating closet lesbian, and their images of male/female relationships are extremely skewed.  These daughters have not been initiated into womanhood.

I don't have a clue as to how to fix that, except to tell men to find wives among women who have had good moms, because even though most women hate hearing this fact, it is very, very true: by their late 30s and into their 40s, all gals turn into their mothers!

But back to an earlier point: increasing a man's T level, increasing his assertiveness, being the pillar of strength, being a good provider, noticing the gal, and giving the gal better orgasms can help improve a gal's impression of her man in some relationships.  Other relationships, which are toxic, just need to be walked away from before they drive the guy crazy.   By the way, "Iron John" is a MUST READ for every MAN.  Boy men need not read it.  Nuff said.

MM: Let's get into specifics. What can be done to increase T levels?

DW: Testosterone levels in men and women decline from 27 onward, and seriously decline from 35 onward, until by 40-45 most men are estrogen dominant and have more estrogen floating around their bodies than their wives do!  Since all of our drive, mental, physical and sexual, is derived from testosterone, and since the spark that keeps us interested in life and enjoying it is derived from testosterone, it behooves us not to succumb to nature's planned obsolescence and let ourselves get E dominant and T deficient!

A few things that can be done to naturally raise one's own testosterone levels are:

Libido Lift herbal capsules: 4 caps, 3 to 4 times daily.

Doctors Testosterone Gel (has no real testosterone, but has herbs and homeopathics that stimulate our own production): two to three applications of the gel daily, especially before bed and early afternoon (since we make T twice daily, between 2 and 4 AM and 2 and 4 PM).  It can also be applied some 20 to 30 min. before training or sex.

Maca powder: This South American root is kin to a turnip but tastes like butterscotch, has plant sterols that are precursors to both testosterone and progesterone (the good hormones), and has diindolylmethane (DIM) to block estrogen from tissues.  Three to six teaspoons of the stuff a day should be the minimum.  The capsules of this stuff won't work, as they don't contain enough maca to make a difference regardless of how "extracted and concentrated" they claim to be.

These three supplements in combination work very well to elevate T levels in those whose pituitaries and testicles still function to make hormones.  All of these supplements are available at www.docsprefer.com

MM: What about dietary advice for increasing testosterone levels?

DW: The only dietary advice I can think of offhand to increase T levels is: don't let your cholesterol get below 180, as the body stops making hormones then.  In India, where most are vegetarian Hindus, milk and eggs are a dietary staple to increase the intake of animal fats, which are some of the best sources of cholesterol from which to make hormones.

Eat a lot of maca.  This Andean butterscotch-tasting turnip has the plant sterols that are immediate precursors to testosterone and progesterone, and it also has diindolylmethane to block estrogen use by the tissues.  In Peru it is used to increase fertility and libido, which are both functions of testosterone.  By the way, men do need progesterone; it blocks the conversion of testosterone to estrogen, and blocks both the T and E from becoming dihydrotestosterone, the hair-loss and swollen-prostate hormone.   Consume at least three to six teaspoons of maca every day.  In South America maca is put into baked goods (cookies, breads and cakes) and into stews, and taken plain.  I drop a teaspoon of the powder in my mouth and drink water to chase it down.

MM: How important is growth hormone for health and well-being?

DW: Funny you bring that up. Just heard today about a study that showed that an increase of 12% in IGF-1 levels is equal to adding 10 years to your life!

MM: Wow! Besides decreasing life span, what else happens when IGF-1 gets low?

DW: IGF-1 is a wonderful anti-aging, muscle-sustaining hormone that gets low with high stress levels.  There is some controversy about using IGF-1 or HGH (which releases IGF-1).  The geriatric docs say the IGF-1 can cause cancer.  The anti-aging MDs say hogwash.  The final word comes from the oncologists, who use IGF-1 to fight cancer.  It used to be that we would expect to see lowered IGF-1 in a person 35 to 40+.   These days there are 20-somethings with very low IGF-1!    IGF-1 gives not only muscle and bone mass but also increased immunity and greater mental power, and maintains brain and internal organ size (which shrinks and becomes fibrotic with age; read my article "Fibrosis: The Enemy Of Life" at www.drwong.us to find out why and how).   Having IGF-1 levels go south in one's 20s is nothing but bad, and will likely take decades off the lives of the X'er and Y generations unless changed.  Already I've seen my boomer generation come down with things like strokes and heart attacks in our 40s that should not have happened till our 60s.  So what will happen to the X'ers and Y's in their 40s if the trends for the good hormones (i.e. testosterone, progesterone, IGF-1, oxytocin) continue downward?

MM: Do you recommend GH injections?

DW: The much-touted HGH injections so prized by anti-aging docs are a way of causing the body to release IGF-1, but that's a long and expensive way around the barn; $11,000 to $12,000 a year expensive, to be precise.  IGF-1 is abundant in the velvet that covers deer antlers.  Male deer shed their antlers every year; the velvet from these can be collected and the IGF-1 extracted.   There is a deer farm in New Zealand that has the largest herd of Chinese red deer in the world, and this is where all of the IGF-1 sublingual spray products are made, regardless of who puts their label on it.  It's all from the same source; www.nowfoods.com sells their IGF-1 sublingual spray for about $25, a full fifty to sixty dollars less than most of the other folks who carry the product do.

MM: Blood pressure seems to be on the rise at a rapid pace. What advice do you have for lowering blood pressure?

DW: On lowering blood pressure I have a two-pronged approach:

a) Taking systemic enzymes to lyse away the fibrin clogs that plug up the microcirculation and reduce full circulation to the extremities (peripheral vascular resistance).  PVR is equal to having high pressure at the kitchen tap when all the other water taps in the house are closed.

b) Doing strength training to build miles and miles of new blood vessels. This better feeds tissue as well as reducing peripheral vascular resistance further.

Between the two it's like opening all the water taps in the house: pressure at the kitchen tap goes down. It must be said that there are two reasons for high blood pressure: peripheral vascular resistance and kidney damage.  When this technique does not work, then we know the patient has a good bit of kidney damage and that is the cause of their higher BP.

MM: What about taking CoQ10?

DW: CoQ10 is essential for heart health, as is vitamin E.  On the CoQ10, the dose should be equal to the person's age in decades, or, if there is heart pathology, then 150 to 300 mg daily.  On the vitamin E, there has been much junk medical science made by drug companies to disprove the effectiveness of vitamins so they can sell you their expensive drugs instead.  1200 to 1600 IU of E are needed daily, as well as the heart's favorite mineral, magnesium.  Without it you'll not only have constipation, night cramps, muscle spasms and a buildup of calcium in arterial plaque, but in the extreme of mag deficiency you'll get irregular heartbeat (arrhythmia).  Of mag we need 1000 to 2000 mg daily. In some folks this can cause the runs, so they can use magnesium glycinate, the only form of the mineral that does not cause loose stools.

MM: Recently you came out with a book on sexual health. How is your book different from other books on the market?

DW: Most books on men's sexual performance are written by non-experts, guys like "Big Joe From Brooklyn," and the text covers nothing scientific or medical but reads like porn.  Other books on sexual performance are so full of fluff and needless, useless prattle that, out of 100+ pages, the real advice or techniques come in the last 5 pages of the work.  My men's pro-sexual book "The Care And Feeding Of A Penis" has no porn or nude male pictures and is filled with immediately useful information in every chapter; from penis size to sperm count, from Peyronie's to erectile dysfunction, there is something to benefit every man from 27 to 97!   It is the user's manual we should have come with!  To make it accessible worldwide without having to pay the VAT (taxes) usually imposed on book imports, we have offered the book as a downloadable e-book.  It's available from www.drwongsbooks.com

MM: Well this is certainly going to be a controversial interview to say the least. Thank you for taking the time to do the interview and keep up the great work.

DW: You are very welcome and I look forward to talking to you again. I would like to invite your readers to check out my website: www.drwong.us.

===========================================


https://thenutritionwatchdog.com/what-is-your-plastic-footprint/

===========================================


This may offer leads

https://endocrinedisruption.org/interactive-tools/endocrine-basics



« Last Edit: November 11, 2018, 12:42:34 PM by Crafty_Dog »





bigdog

  • Power User
  • ***
  • Posts: 2321
    • View Profile
all things muscle
« Reply #142 on: January 09, 2019, 02:17:33 PM »

Crafty_Dog

  • Administrator
  • Power User
  • *****
  • Posts: 72293
    • View Profile
The Goodness Paradox
« Reply #143 on: January 27, 2019, 12:06:52 PM »


‘The Goodness Paradox’ Review: The Benefits of Good Breeding
Humans are peaceful compared to some of our closest primate relatives. Did we domesticate ourselves?
Two Eastern chimpanzees in Mahale Mountains National Park, Tanzania. Auscape/Getty Images
By John Hawks
Jan. 25, 2019 9:45 a.m. ET

An anthropologist at Harvard University, Richard Wrangham is no stranger to wild animals. His long fieldwork with wild chimpanzees in the Kibale Forest of Uganda, and other African field sites, has done much to help scientists see the role of aggression and violence in our close relatives.

Two decades ago, Mr. Wrangham examined chimpanzee violence in his provocative book “Demonic Males: Apes and the Origins of Human Violence” (co-written with Dale Peterson). That book recounts how field primatologists, including Mr. Wrangham at Kibale, began to understand that coalitions of male chimpanzees work together to kill chimpanzees in neighboring groups. Humans, the authors suggested, have a violent heritage, one that is still manifested today.
The Goodness Paradox

By Richard Wrangham
Pantheon, 377 pages, $28.95

The scene among anthropologists debating the thesis of “Demonic Males” was not so different from a face-off between rival chimpanzees. Any suggestion that humans have an intrinsically violent and aggressive nature tends to get people riled. Besides, violence seems to go against so many aspects of human nature. Humans are highly prosocial, cooperative and altruistic. We are kind. If we are so good, how can we be so bad?

In “The Goodness Paradox: The Strange Relationship Between Virtue and Violence in Human Evolution,” Mr. Wrangham probes the deep evolutionary history of human aggression. “We must agree with Frederick the Great: ‘Every man has a wild beast within him,’ ” he writes. “The question is what releases the beast.” “The Goodness Paradox” has a different emphasis from “Demonic Males,” however. Our violent lineage is still the main problem, but here Mr. Wrangham explores the reasons that our species may have partly overcome it.

In pursuing this question, Mr. Wrangham is far from alone. The Harvard psychologist Steven Pinker is the best-known exponent of the idea that human existence has become progressively less violent over time. That concept may seem counterintuitive. Wars, genocides and social conflicts of the 20th century killed millions of combatants and civilians, while people in small-scale human societies seem to live comparatively peaceful lives. But anthropologists studying even the smallest villages and hunter-gatherer groups have recorded murder, revenge killing and warfare. If extrapolated to large industrialized societies comprising millions, even a few cases in a small group would translate to a frightful rate of violent deaths. Mr. Pinker puts such observations together with archaeological evidence of violence in past societies. These data, he argues, say life is getting better—at least for humans.

Mr. Pinker’s ideas have set off almost as much debate among anthropologists as Mr. Wrangham’s did 20 years before. But even assuming that Mr. Pinker is correct about the past 10,000 years of human history, his idea leaves open some very interesting questions. If all humans today live in social systems that greatly restrict our expression of violence and aggression, how did this come about?

Mr. Wrangham follows the Duke University social psychologist Kenneth Dodge and many others in separating human aggression into two types. “Reactive aggression” is the stuff of bar fights, when individuals provoked by taunts start throwing punches or pulling knives. “Proactive aggression” is premeditated, planned, the stuff of careful tactical strikes. The “paradox” of Mr. Wrangham’s title is the distinction between these different aspects of human violence.

To understand why this distinction matters, Mr. Wrangham asks readers to think of an airplane full of people. Hundreds of humans can sit quietly for hours crammed into tiny uncomfortable seats. Hundreds of chimpanzees in such a space would quickly be ripping one another limb from limb. The difference is reactive aggression—extremely high in chimpanzees, low in humans. But humans must implement elaborate security arrangements to prevent a single person from bringing down the plane in a well-planned plot. That’s the threat of proactive aggression, and it’s uniquely developed in humans.

Humans are not alone in being much more peaceful on the whole than chimpanzees. Chimpanzees themselves have a close sister species, the bonobos, which lives in female-dominated social groups. Whereas chimpanzee groups are regularly racked by aggressive interactions, bonobos resolve tension with affiliative behaviors, often sexual interactions. Chimpanzee males intimidate and beat females; bonobo males do not. As the primatologist Frans de Waal has put it, bonobos make love, not war.

Mr. Wrangham describes a “ball game” sometimes played by both male chimpanzees and bonobos, in which two males chase each other around a tree trunk trying to grab each other’s testes. Bonobos have such trust that they sometimes play the game with males from other communities. Chimpanzees have such instant hostility that play between communities would be unthinkable.

The differences between chimpanzees and bonobos have long been a quandary for anthropologists. Neither of them is closer to humans than the other; they are both our closest relatives. As Mr. Wrangham asks, “Why should two species that look so much alike be so different in their intensity of aggression?”

His answer is that bonobos have domesticated themselves. Domesticated animals, like dogs and horses, exhibit huge decreases in aggression compared with their wild ancestors. Humans have induced those changes in our domesticated animals by selecting strongly against reactive aggression. Wolves that could not tolerate the presence of humans didn’t become part of the ancestral dog gene pool. Cattle that attacked their minders were not bred.

The most well-known experiment on domestication was begun in the late 1950s by the Russian researcher Dmitri Belyaev, who selected against aggressive responses to humans in silver foxes, minks and rats. Within a few dozen generations these species exhibited many of the behavioral traits of domesticated species like dogs. They also began to show physical changes. Fox ears became floppy, and spots of color began to appear. These and other changes are part of a “domestication syndrome” shared across many species.

In Mr. Wrangham’s description, bonobos display many aspects of this syndrome. Their brains are smaller than chimpanzees’, a shift also seen in many domesticates. Their skulls are “juvenilized,” with smaller faces and brow ridges. Maybe, he suggests, these changes are tied to their massive reduction in aggression. Such changes, Belyaev and others have claimed, make individuals look less threatening. Then again, adult bonobos are remarkable in their willingness to play. Maybe the juvenilized skull form is a genetic side effect of retaining such juvenile-like behavior.

Bonobos hint that self-domestication might be possible. Could it be that we ourselves are the most extreme example of our propensity to domesticate the creatures around us?

The idea of human self-domestication is older than the theory of evolution. Mr. Wrangham traces the scientific record of human self-domestication to Johann Blumenbach, a German physician and naturalist at the turn of the 19th century. Blumenbach is known for devising a theory of the origin of races by a process he called “degeneration.” It’s a terrible name for a concept that contained the seeds of evolution. Blumenbach claimed that human races had a common origin in the Caucasus region of Asia and then changed when they encountered different environments such as sun, heat or cold.

Whereas Blumenbach tied races to his theory of change, he viewed domestication as part of the nature of all humans, describing us as the “one domestic animal . . . that surpasses all others.” But if humans are domesticated, someone must have domesticated them. For Blumenbach, the only answer was divine intervention.

Since the notion of self-domestication preceded Charles Darwin’s work on evolution, Darwin had a good chance to think it over. He didn’t like the idea. Not many evolutionary biologists have. In his 1962 work on human origins, the geneticist Theodosius Dobzhansky wrote: “The concept of human domestication is too vague an idea at this time to be scientifically productive.”

But times have changed, and a broad array of anthropologists and geneticists have become interested in the idea of human self-domestication again. Some have focused on possible hormonal changes, trying to understand why humans today have skulls that lack the large brow ridges and thicker bone of our fossil ancestors. Others have worked on developmental timing, trying to show that adult humans retain some of the traits of ancient children.
A silver fox descended from those domesticated by Dmitri Belyaev.

These scientists face the same problem as Blumenbach. Domestication may seem to be in our nature, but there is no solid account of how it could have happened.

Mr. Wrangham’s hypothesis is that humans have been shaped by a history of coalitionary proactive aggression. People work together to enforce social rules and punish wrongdoing. This may sound like common sense, but it raises deep questions. Some chimpanzees and bonobos exhibit a sense of fairness in experiments on sharing food, which suggests that they share a basic moral sense with us. But humans have a developed morality that goes beyond a mere sense of injustice, with elaborate codes detailing right and wrong behavior. How did this evolve?

Chimpanzees cooperate during tasks like hunting. But when humans began to communicate using language, a much broader array of cooperation became possible. Humans can plan their cooperation in advance, with full knowledge of the reasons why they are pursuing a course of action. In Mr. Wrangham's account, language puts proactive aggression on steroids. One outraged person can quickly become a mob.

Morality sometimes leads to outcomes that would be perverse if individuals acted to maximize their own fitness. A chapter titled “The Evolution of Right and Wrong” begins with the tale of an Inuit mother who strangles her arrogant son rather than have him bring shame to the family. Killing an offspring is the last thing that pure fitness-maximizing evolution would promote. For this reason, scientists have debated the evolutionary foundations of morality. Many have proposed that the value of moral behavior in encouraging group cohesion must have outweighed the reproductive interests of individuals. Such “cultural group selection” might have emerged if war and competition between groups were very important to humans in the past.

But as chimpanzees clearly demonstrate, war and competition between groups are not sufficient to give rise to humanlike morality. Mr. Wrangham instead follows the work of the anthropologist Christopher Boehm, who has suggested that morals emerged as an individual defense against coalitional violence. The way to stay safe in a small human group is to be relentlessly egalitarian. Each person has an incentive to follow a shared moral code because violations give would-be punishers a cause to recruit allies against him.

It is indeed a paradox—as good as humans are, they are sometimes driven to seemingly outrageous acts. By giving a detailed comparison of human violence and aggression with that of our close primate relatives, Mr. Wrangham has given a possible explanation for how our species might have domesticated itself. That makes this book essential reading as geneticists start to unwrap the package of genes that responded to domestication, which may give hints about our own evolutionary history.

—Mr. Hawks is a professor of anthropology at the University of Wisconsin-Madison.

G M

  • Power User
  • ***
  • Posts: 26643
    • View Profile
Re: Evolutionary biology/psychology
« Reply #144 on: January 27, 2019, 12:33:39 PM »
We live in a time of prosperity. Take that away and see how quickly things turn violent.

bigdog

  • Power User
  • ***
  • Posts: 2321
    • View Profile
An expert on human blind spots gives advice on how to think
« Reply #145 on: February 04, 2019, 07:17:21 PM »

G M

  • Power User
  • ***
  • Posts: 26643
    • View Profile
Re: An expert on human blind spots gives advice on how to think
« Reply #146 on: February 05, 2019, 07:03:51 AM »
https://www.vox.com/science-and-health/2019/1/31/18200497/dunning-kruger-effect-explained-trump

(I post this as the phenomenon, not about the president.)

Maybe Young-adult infotainment site Vox isn't the best source for this, or any other topic.

Crafty_Dog

  • Administrator
  • Power User
  • *****
  • Posts: 72293
    • View Profile
Re: Evolutionary biology/psychology
« Reply #147 on: February 06, 2019, 03:19:12 PM »
Be that as it may, is it relevant to the essence of the summary of the point being made?


G M

  • Power User
  • ***
  • Posts: 26643
    • View Profile
Re: An expert on human blind spots gives advice on how to think
« Reply #149 on: March 25, 2019, 08:46:26 PM »
https://www.vox.com/science-and-health/2019/1/31/18200497/dunning-kruger-effect-explained-trump

(I post this as the phenomenon, not about the president.)

So, let's explore the Dunning-Kruger effect and all the "smart people" who fell hook, line and sinker for "Russian Collusion".