Show Posts

This section allows you to view all posts made by this member. Note that you can only see posts made in areas you currently have access to.


Topics - Crafty_Dog

301
Science, Culture, & Humanities / Privacy
« on: June 13, 2007, 07:11:28 AM »
All:

The notion of Privacy is under serious attack.  In my opinion, Privacy is a Constitutional right, found in the Ninth Amendment of our Constitution.  This, not the scurrilous attacks upon his person, was the basis of my opposition to the nomination of brilliant legal mind Robert Bork to the Supreme Court-- he denied the existence of a Constitutional right to Privacy.

In the name of the inane and insane War on Drugs, the government has intruded into people's lives in ways that once upon a time would have been considered fascistic.

And now the march of technology creates its own demons.

This thread is for the discussion of these matters.   Tis a rare event, but I begin with a NY Times editorial.

Marc
==========================================

Editorial
NY Times
Published: June 13, 2007

Internet users are abuzz over Google’s new Street View feature, which displays ground-level photos of urban blocks that in some cases even look through the windows of homes. If that feels like Big Brother, consider the reams of private information that Google collects on its users every day through the search terms they enter on its site.

Privacy International, a London-based group, has just given Google its lowest grade, below Yahoo and Microsoft, for “comprehensive consumer surveillance and entrenched hostility to privacy.”

There are welcome signs that this Wild West era of online privacy invasion could be coming to an end. Data protection chiefs from the 27 countries of the European Union sent Google a letter recently questioning the company’s policy for retaining consumer information. Here at home, the Federal Trade Commission is looking into the antitrust ramifications of Google’s $3.1 billion acquisition of DoubleClick, an online advertising company.

The F.T.C. should also examine the privacy ramifications of the deal. And Congress needs to act on proposals to prevent the warehousing of such personal data.

Google keeps track of the words users type into its popular site, while DoubleClick tracks surfing behavior across different client Web sites. The combination could give Google an unprecedented ability to profile Web users and their preferences. That knowledge means big bucks from companies trying to target their advertisements. But it also means Google could track more sensitive information — like what diseases users have, or what political causes they support.

Google has announced that rather than keeping information indefinitely, it would only keep it for 18 months before making it anonymous. That is a good step, but not enough since it’s not clear what anonymous means. Last year AOL released records of searches by 657,000 unidentified users. Reporters from The Times were able to trace the queries back to “anonymous” users.

Google is the focus of privacy advocates right now, but it is hardly the only concern. Competitors like Yahoo and Microsoft have the same set of incentives. Privacy is too important to leave up to the companies that benefit financially from collecting and retaining data. The F.T.C. should ask tough questions as it considers the DoubleClick acquisition, and Congress and the European Union need to establish clear rules on the collection and storage of personal information by all Internet companies.


302
Politics & Religion / Memorial Day
« on: May 28, 2007, 07:44:03 AM »


America's Honor
The stories behind Memorial Day.

BY PETER COLLIER
Monday, May 28, 2007 12:01 a.m. EDT

Once we knew who and what to honor on Memorial Day: those who had given all their tomorrows, as was said of the men who stormed the beaches of Normandy, for our todays. But in a world saturated with selfhood, where every death is by definition a death in vain, the notion of sacrifice today provokes puzzlement more often than admiration. We support the troops, of course, but we also believe that war, being hell, can easily touch them with an evil no cause for engagement can wash away. And in any case we are more comfortable supporting them as victims than as warriors.

Former football star Pat Tillman and Marine Cpl. Jason Dunham were killed on the same day: April 22, 2004. But as details of his death fitfully emerged from Afghanistan, Tillman has become a metaphor for the current conflict--a victim of fratricide, disillusionment, coverup and possibly conspiracy. By comparison, Dunham, who saved several of his comrades in Iraq by falling on an insurgent's grenade, is the unknown soldier. The New York Times, which featured Abu Ghraib on its front page for 32 consecutive days, put the story of Dunham's Medal of Honor on the third page of section B.

Not long ago I was asked to write the biographical sketches for a book featuring formal photographs of all our living Medal of Honor recipients. As I talked with them, I was, of course, chilled by the primal power of their stories. But I also felt pathos: They had become strangers--honored strangers, but strangers nonetheless--in our midst.

In my own boyhood, figures such as Jimmy Doolittle, Audie Murphy and John Basilone were household names. And it was assumed that what they had done defined us as well as them, telling us what kind of nation we were. But the 110 Medal recipients alive today are virtually unknown except for a niche audience of warfare buffs. Their heroism has become the military equivalent of genre painting. There's something wrong with that.

What they did in battle was extraordinary. Jose Lopez, a diminutive Mexican-American from the barrio of San Antonio, was in the Ardennes forest when the Germans began the counteroffensive that became the Battle of the Bulge. As 10 enemy soldiers approached his position, he grabbed a machine gun and opened fire, killing them all. He killed two dozen more who rushed him. Knocked down by the concussion of German shells, he picked himself up, packed his weapon on his back and ran toward a group of Americans about to be surrounded. He began firing and didn't stop until all his ammunition and all that he could scrounge from other guns was gone. By then he had killed over 100 of the enemy and bought his comrades time to establish a defensive line.

Yet their stories were not only about killing. Several Medal of Honor recipients told me that the first thing they did after the battle was to find a church or some other secluded spot where they could pray, not only for those comrades they'd lost but also the enemy they'd killed.

Desmond Doss, for instance, was a conscientious objector who entered the army in 1942 and became a medic. Because of his religious convictions and refusal to carry a weapon, the men in his unit intimidated and threatened him, trying to get him to transfer out. He refused and they grudgingly accepted him. Late in 1945 he was with them in Okinawa when they got cut to pieces assaulting a Japanese stronghold.

Everyone but Mr. Doss retreated from the rocky plateau where dozens of wounded remained. Under fire, he treated them and then began moving them one by one to a steep escarpment where he roped them down to safety. Each time he succeeded, he prayed, "Dear God, please let me get just one more man." By the end of the day, he had single-handedly saved 75 GIs.

Why did they do it? Some talked of entering a zone of slow-motion invulnerability, where they were spectators at their own heroism. But for most, the answer was simpler and more straightforward: They couldn't let their buddies down.

Big for his age at 14, Jack Lucas begged his mother to help him enlist after Pearl Harbor. She collaborated in lying about his age in return for his promise to someday finish school. After training at Parris Island, he was sent to Honolulu. When his unit boarded a troop ship for Iwo Jima, Mr. Lucas was ordered to remain behind for guard duty. He stowed away to be with his friends and, discovered two days out at sea, convinced his commanding officer to put him in a combat unit rather than the brig. He had just turned 17 when he hit the beach, and a day later he was fighting in a Japanese trench when he saw two grenades land near his comrades.

He threw himself onto the grenades and absorbed the explosion. Later a medic, assuming he was dead, was about to take his dog tag when he saw Mr. Lucas's finger twitch. After months of treatment and recovery, he returned to school as he'd promised his mother, a ninth-grader wearing a Medal of Honor around his neck.

The men in World War II always knew, although news coverage was sometimes scant, that they were in some sense performing for the people at home. The audience dwindled during Korea. By the Vietnam War, the journalists were omnipresent, but the men were performing primarily for each other. One story that expresses this isolation and comradeship involves a SEAL team ambushed on a beach after an aborted mission near North Vietnam's Cua Viet river base.

After a five-hour gunfight, Cmdr. Tom Norris, already a legend thanks to his part in a harrowing rescue mission for a downed pilot (later dramatized in the film BAT-21), stayed behind to provide covering fire while the three others headed to rendezvous with the boat sent to extract them. At the water's edge, one of the men, Mike Thornton, looked back and saw Tom Norris get hit. As the enemy moved in, he ran back through heavy fire and killed two North Vietnamese standing over Norris's body. He lifted the officer, barely alive with a shattered skull, carried him to the water and then swam out to sea, where they were picked up two hours later.

The two men have been inseparable in the 30 years since.

The POWs of Vietnam configured a mini-America in prison that upheld the values beginning to wilt at home as a result of protest and dissension. John McCain tells of Lance Sijan, an airman who ejected over North Vietnam and survived for six weeks crawling (because of his wounds) through the jungle before being captured.

Close to death when he reached Hanoi, Sijan told his captors that he would give them no information because it was against the code of conduct. When not delirious, he quizzed his cellmates about camp security and made plans to escape. The North Vietnamese were obsessed with breaking him, but never did. When he died after long sessions of torture Sijan was, in Sen. McCain's words, "a free man from a free country."

Leo Thorsness was also at the Hanoi Hilton. The Air Force pilot had taken on four MiGs trying to strafe his wingman who had parachuted out of his damaged aircraft; Mr. Thorsness destroyed two and drove off the other two. He was shot down himself soon after this engagement and found out by tap code that his name had been submitted for the Medal.

One of Mr. Thorsness's most vivid memories from seven years of imprisonment involved a fellow prisoner named Mike Christian, who one day found a grimy piece of cloth, perhaps a former handkerchief, during a visit to the nasty concrete tank where the POWs were occasionally allowed a quick sponge bath. Christian picked up the scrap of fabric and hid it.

Back in his cell he convinced prisoners to give him precious crumbs of soap so he could clean the cloth. He stole a small piece of roof tile which he laboriously ground into a powder, mixed with a bit of water and used to make horizontal stripes. He used one of the blue pills of unknown provenance the prisoners were given for all ailments to color a square in the upper left of the cloth. With a needle made from bamboo wood and thread unraveled from the cell's one blanket, Christian stitched little stars on the blue field.

"It took Mike a couple weeks to finish, working at night under his mosquito net so the guards couldn't see him," Mr. Thorsness told me. "Early one morning, he got up before the guards were active and held up the little flag, waving it as if in a breeze. We turned to him and saw it coming to attention and automatically saluted, some of us with tears running down our cheeks. Of course, the Vietnamese found it during a strip search, took Mike to the torture cell and beat him unmercifully. Sometime after midnight they pushed him into our cell, so bad off that even his voice was gone. But when he recovered in a couple weeks he immediately started looking for another piece of cloth."

We impoverish ourselves by shunting these heroes and their experiences to the back pages of our national consciousness. Their stories are not just boys' adventure tales writ large. They are a kind of moral instruction. They remind us of something we've heard many times before but that is worth repeating on a wartime Memorial Day when we're uncertain about what we celebrate. We're the land of the free for one reason only: We're also the home of the brave.

Mr. Collier wrote the text for "Medal of Honor: Portraits of Valor Beyond the Call of Duty" (Workman, 2006).

303
Science, Culture, & Humanities / Energy issues
« on: May 26, 2007, 07:08:10 AM »
stratfor.com

Global Market Brief: Fear, War, Smog, Storms and the Price of Summer Vacation
Every summer, gasoline prices in the United States go up. This is not because oil tycoons get frisky and realize they can squeeze a little bit more from the people driving to the nearest park with bicycles strapped to the tops of their sport utility vehicles; it is the sum of a variety of mostly structural factors within the U.S. system that are susceptible to natural disasters, along with the risk factors that vary every summer and make the oil market susceptible to unrest, wars and rumors of war.

The good news is that this summer, a few of the key risk factors that inflate crude oil prices with panic premiums could subside -- such as violence in Nigeria, which should wane in the wake of national elections, and tensions between the United States and Iran over Iraq's future, which could be settled in talks soon. If all the stars align, there could even be a rare downward step adjustment in crude prices. The bad news -- aside from the unlikelihood of the stars aligning -- is that a world without strife would still have hurricanes.

Before looking at the specifics of this summer, it is worth reviewing why prices tend to pick up in March and spike around Memorial Day each year, remaining high until they begin to fall in November. Besides the obvious uptick in gasoline demand (first in the spring when farmers hit planting season and then for pleasure driving and vacations as days become longer and sunnier), one culprit for a spike in U.S. prices at the pump is smog -- or rather, how our federal and local governments react to it.

In winter, the standard gasoline is one of about three blends. In the summer, to reduce smog, a crisscross of federal and local government standards mandate special blends. These requirements are not in harmony; myriad blends are mandated and sometimes differ from one part of a state to another (as in California and Texas), depending in part on a location's temperature, altitude and urban density -- that is, the extent to which volatile organic compounds in fuel are likely to evaporate, and the extent to which the air in that place is already unhealthy. Even areas that have similar characteristics request different summer blends.

This variety of requirements results in the inefficient production of boutique blends -- and refineries initially tend to err on the side of caution, producing enough to meet the low end of estimated demand or adding additives to each blend as the trucks are filled rather than ending up with too much of a blend that no one else in the country will buy. Summer additives also tend to be more expensive than winter blend components. (The Environmental Protection Agency and the Department of Energy will release a "Fuel System Requirements Harmonization Study" in 2008. States probably will not want to give up their individual powers to regulate, however -- and new legislative authority would be needed at the federal level to overcome the boutique fuels phenomenon.)

Thus, as the switch is made from winter to summer blends, prices go up. Then, as the summer driving season begins, demand surges and prices stay high. The U.S. system is equipped to handle the boutique blends, so they do not pose the threat of shortages or worse price spikes -- unless there is an unexpected disruption in the supply chain by, say, an immense storm that hits the majority of U.S. refineries in the Gulf of Mexico. The government has demonstrated it can be flexible when a real disaster strikes; after Hurricane Katrina, the Bush administration temporarily waived the air quality standards requiring the variety of blends, which helped mitigate price spikes.

Another factor that can affect summer gasoline prices is oil and gasoline inventories. The Energy Department released its inventory report May 23, and the numbers were not as grim as feared. Although gasoline inventories are still 7 percent below their five-year average for this time of year, they have been climbing rapidly since April, following a three-month period of unexpected refinery fires and other problems on top of regular spring maintenance. Crude oil inventories are actually 7.6 percent above their five-year average, so there is plenty to draw from as refineries play catch-up. There are relatively few giant refineries in the United States, however, so each time one goes offline it is a significant concern. And contrary to the stories of conspiracy theorists, who claim oil companies choose not to build more refineries because they want to keep prices up, the actual reason is the difficulty of overcoming "not-in-my-backyard" campaigns bolstered by environmentalists whenever a new refinery is proposed.

This summer's bad news is that experts expect the hurricane season to be worse than average. Then again, in 2006 these same experts predicted a repeat of 2005 and, instead, El Nino caused a very mild storm season. A direct hit on refining infrastructure still recuperating from Katrina in 2005 is not very likely. However, the possibility remains and makes those who trade on risks jittery -- which brings us to the price of crude.

The price of Nymex crude is hovering around $65 per barrel. The average person can rattle off the reasons for this high price: their names are Iraq, Iran, Nigeria, Venezuela, Russia and Saudi Arabia. Almost every country that produces oil in large quantities is either nationalizing its energy sector (which tends to limit production) or is a political mess (or at risk of quickly becoming one). Then factor in the U.S.-jihadist war, hurricanes, pirates (yes, pirates -- though mostly around Africa and Southeast Asia, not in the Caribbean). And while these concerns about reliable supplies run rampant, world demand is increasing, driven by growing economies worldwide -- particularly China and India, the voracious newcomers to the global resource buffet.

It generally costs less than $32.50 to produce and transport a barrel of oil; the price of oil is floating on a cushion of fear-driven speculation. Even though there has not been an oil supply crisis for more than three decades, when buyers order for future delivery, they are willing to pay top dollar now on the chance that, if they wait, some catastrophe will drive prices far higher.

The circumstances behind anxiety-based oil prices are not likely to get a whole lot worse this year -- and, in some ways, they are getting better. Nigeria is over the worst of the election-driven attacks against oil infrastructure that reduced its output by one-third this year, and that production is beginning to come back on line. After a period of post-election calm, militant attacks are likely to increase later in the summer, but chances are that things will not get quite as bad as they were. In addition, Iran and the United States appear to be finally ready to sit down together and hammer out a deal on Iraq. The first direct and public bilateral talks are scheduled to take place May 28. If this process succeeds -- and, of course, many things could disrupt it -- it still remains to be seen whether the violence in Iraq can be tamed. However, the oil flow from Iraq mostly depends not on peace in the Sunni triangle but on revenue-sharing arrangements among Iraq's various interest groups and regions, which a deal with Iran could help solidify. And, of course, a deal with Iran would decrease the already slight likelihood of a U.S. airstrike against Iran or -- the nightmare scenario -- of conflict in the Persian Gulf leading to an obstruction of the Strait of Hormuz.

Oil traders do not tend to lower prices incrementally as things get gradually better -- only to raise them in fits as their fears are played upon. This means that, from time to time, there is a significant correction -- a sharp drop in oil prices. While we are not prepared to forecast such an adjustment this year, it seems to be more likely than the fruition of the worst fears propping up the current price.

One other thing to note: The Organization of the Petroleum Exporting Countries (OPEC) is back, in a light kind of way. That is, OPEC countries have actually begun pumping below capacity again -- something that has not happened for years. The flip side to this is that OPEC no longer controls nearly as much of total global production as it did in the 1970s. Furthermore, Saudi Arabia does not really want to curtail its production and Venezuela cannot afford to. So, while it is something to watch, OPEC is no longer the main issue.

Overall, while gasoline prices will not be kind this summer, they probably will not behave erratically. The main variables that would disrupt this equation are a very nasty hurricane or relative peace in the Middle East. One of those sounds a little more plausible than the other.

CHINA: China's new State Investment Co. surprised global markets May 20 by announcing a planned purchase of a 9.9 percent stake in U.S. private equity player the Blackstone Group. This move proved China's ability to outsmart the markets (as far as the management of its $1.2 trillion of foreign exchange reserves) and its ability to carry out internal economic reforms while mitigating adverse global market effects. Blackstone is the first foreign equity purchase made with Chinese state foreign reserves, but it will not likely be the last. Watch out for new Chinese foreign exchange reserve-funded purchases in other foreign financial intermediaries next.

RUSSIA: Russian nickel company Norilsk Nickel raised its offer for Canadian mining company LionOre Mining International Ltd. to $6.3 billion May 23, trumping a bid by rival Swiss company Xstrata of $5.7 billion. Norilsk Nickel's bid comes with the blessing of the Kremlin, which is expected eventually to solidify its control over the company and thus ensure Norilsk Nickel has access to whatever funding it needs to expand abroad. Norilsk Nickel already holds around an 18 percent stake in the global market for nickel production. By the time the Kremlin consolidates control over the company, it could find itself with an even larger and richer prize.

FRANCE: France will eventually sell its 15 percent stake in the European Aeronautic Defense and Space Co. (EADS), the parent company of aircraft maker Airbus, French President Nicolas Sarkozy said May 18. Though Airbus has experienced a bout of major setbacks, France's political desire to have a European aerospace champion has almost guaranteed its continued existence, and the company has been subsidized with almost $15 billion worth of EU funds. However, the new French government has promised to reform many of the problems weighing France down. Sarkozy's statement that the French government might pull out of EADS altogether suggests that Airbus' key government support is waning -- and that its lifetime could be limited.

AFRICA: The Common Market for Eastern and Southern Africa (COMESA) approved a common external tariff system May 23 at a meeting in Kenya. The agreement lowers tariffs for COMESA countries to 10 percent for intermediate products and 25 percent for finished goods, and eliminates tariffs on capital goods and raw materials. The agreement brings COMESA closer to implementing a customs union in 2008 that would allow the 20-state bloc to operate commercially like the European Union. Seven COMESA states have yet to join the free trade area launched in 2000, citing revenue losses and competition from more advanced states. The common tariff system will make trade among member states more efficient, and a customs union would improve COMESA's ability to compete with larger economies.

AUSTRALIA: Australian Prime Minister John Howard announced May 22 that Australia will transfer monopoly control of wheat exports from the scandal-engulfed AWB Ltd. (formerly known as the Australian Wheat Board) to a grower-owned company by mid-2008. An independent task force is investigating a claim that AWB paid $224 million in bribes between 1999 and 2003 to former Iraqi President Saddam Hussein's government. Though the move will benefit farm groups by transferring ownership back to the growers, the continuation of the single-desk structure likely will anger the U.S. farm lobby, which has long opposed the system. The move will benefit Howard domestically by strengthening his coalition and bolstering support from farmers in an election year. The group most negatively affected by the new deal will be nongrower investors in the AWB, who will have no stake in the new company.

IRAN: Gasoline prices in Iran increased by 25 percent May 22. Iranian state news agency IRNA reported that Interior Minister Mostafa Pour-Mohammadi said rationing will begin around June 5. The increase -- which follows a May 20 announcement that the government would not raise fuel prices -- is part of Tehran's efforts to reduce state subsidies for gasoline and discourage smugglers who have been buying fuel at Iran's relatively low price and sneaking it out of the country to sell. The pragmatic conservative establishment, led by Expediency Council head Ali Akbar Hashemi Rafsanjani, likely designed the move to create problems for Iranian President Mahmoud Ahmadinejad's administration as part of an effort to weaken his faction's influence in the government.

IRAQ/U.S./UAE: Halliburton is considering $80 billion in projects around the globe as it rethinks its exit from Iraq, Halliburton CEO Dave Lesar said May 22. Lesar forecasts Halliburton investments in the Eastern Hemisphere -- including the Middle East, Russia, Africa, East Asia and the North Sea -- to hover around 70 percent of total capital investment over the next five years. Halliburton also has shown a willingness to sign deals with certain state actors or companies in the Middle East and Russia that the international community frowns upon. Lesar's hint that the company will reconsider its exit from Iraq indicates Halliburton is expecting a political settlement in Iraq that will allow energy majors to re-enter the reconstruction process.

MERCOSUR: Mercosur members' foreign affairs and economy ministers announced some details about the proposed Banco del Sur on May 22. Most important is that the development bank will have equal representation and capital share from its seven members, with the initial capital likely totaling between $2 billion and $3 billion. At least initially, the bank will be capable of development lending, but not of bailing out countries in the event of a serious economic shock. This is a blow to the vision of Venezuelan President Hugo Chavez, who -- with support from Argentina, Ecuador and Bolivia -- has for months proposed Banco del Sur as an alternative to the International Monetary Fund, World Bank and Inter-American Development Bank. Brazil's involvement in Banco del Sur has created the terms to keep the bank tame.

BOLIVIA/BRAZIL: Bolivia said May 23 it will compensate Brazilian state oil firm Petroleo Brasileiro $112 million for the nationalization of two refineries by June 10. Brazil indicated May 21 that it would accept natural gas instead of cash as payment, but then said unless the first payment is made by June 11, the matter will be tabled. Talks over the compensation were troubled; Brazil threatened to suspend investment in Bolivia if fair compensation was not offered, while Bolivia threatened to expropriate the facilities if its offers were rejected. The compensation agreement is important to both countries, but more so to Bolivia: Brazil is a key investor in Bolivia and purchases about 25 million cubic meters of natural gas daily -- nearly two-thirds of Bolivian output.

304
Science, Culture, & Humanities / Apocaholics Anonymous
« on: May 26, 2007, 06:55:33 AM »
http://www.godward.org/commentary/Out%20of%20the%20Box/Apocaholics%20Anonymous.htm

Welcome to “Apocaholics Anonymous” –
Join Me in a Crusade for Panic-Free Living
Updated for the Atlanta Investment Conference
20th Anniversary Reunion, 8:45 am, April 20, 2007
By Gary Alexander, Recovering Apocaholic


Hi, I’m Gary and I’m a recovering Apocaholic.  I am currently Apocalypse free for nearly 18 years.  I left the church of the Religious Apocalypse in 1976, over 30 years ago, and I resigned from the secular church of the Financial Apocalypse in 1989.  Yes, I still feel the urge to proclaim the end of all things, from time to time, but I white-knuckle my way to a history book for a little perspective, and then I breathe easier. If you wish to join AA, the only requirement is that you give up the adrenaline rush of media-fed fantasies.

Since I spoke to you last on this subject, in 1994, we have survived “Bankruptcy 1995” (the original epidemic of Hockey Stick charts), the Big Bang in Hong Kong, years of Y2K scare stories, a SARS epidemic, Mad Cow disease, Bird Flu, a real threat on 9/11, Triple Deficits (Budget, Trade and Balance of Payments), wars in Serbia/Kosovo, Iraq and Afghanistan, Deflation in 2003, Inflation since then, The Perfect Storms of 2005 (Katrina, Rita and Wilma, the 3 Witches of the Bermuda Triangle), and today’s reigning fears of Global Warming, $200 Oil and the Sub-prime Housing Loan Crisis Implosion.

But before we go from today’s Sub-prime to the ridiculous claims of imminent collapse, let me introduce the depths of my past addiction to the Apocalypse.  I was born in July 1945, the day the first atomic bomb exploded in Alamogordo, New Mexico.  That mushroom cloud has haunted our lives ever since. As a teenager, I became convinced the world would end before I was 30.  Too soon old…too late smart, I was very, very wrong:


50 Years Ago (1957) – The “Duck and Cover” Generation

My apocalyptic addiction began 50 years ago, in the Year of Sputnik, when all of us Seattle-area 7th graders – mostly the offspring of Boeing engineers – were told that we must now learn more science and math, to close the missile gap with the Soviet Union.  

Back in 1957, the U.S. was the proud owner of 100,000 kilograms of U-235, in what was termed “45 times overkill” of the Soviets.  But the Soviets had more missiles than we did. In that same year, 1957, the first underground nuclear explosion was set off near Las Vegas.  In junior high, I soon became addicted to dystopian novels, like On the Beach, by Nevil Shute, a Briton who had moved to Australia in order to be among the last on earth to be fried by the inevitable radiation cloud following nuclear Armageddon.  The novel was adapted for the screen in 1959, directed by Stanley Kramer and starring Gregory Peck as Captain Dwight Lionel Towers of the USS Sawfish. The story was set in the near future, 1963 in the book (1964 in the movie), in the months following World War III.  Nuclear fallout killed ALL life, with hot air currents killing off Australia last.

The characters made their best effort to enjoy what remained of their lives before dying from radiation poisoning.  The film was shot in Melbourne, with a chilling ending of wind-swept but empty city streets there. That image has haunted me to this day. I am convinced that this hopelessness sowed the seeds for the senseless rush to immediate gratification in the 1960s.  With a world about to die, hedonism soon reigned supreme.


In high school, I read Aldous Huxley’s “Brave New World” and the scathing exposés and novels of Philip Wylie (1902-1971), son of a Presbyterian minister father and a novelist mother (who died when he was five).  Wylie wrote apocalyptic nuclear war novels like “Tomorrow” (1954), about the atomic bombing of two fictional Midwest cities adjacent to each other in the mid-1950s.  One had an effective civil defense program, and the other did not.  Later, I read his novel “Triumph” (1963), another graphic nuclear-war story involving a worst-case USA/USSR “spasm war,” in which both sides emptied their arsenals into each other, with extensive use of “dirty” bombs to maximize casualties, leaving the main characters (in a very deep bomb shelter) as the sole survivors in the northern hemisphere, the new Adam and Eve of a new creation.

In the financial realm, I was also becoming convinced that America’s economy was doomed, especially after reading John Kenneth Galbraith’s “The Affluent Society” (1958), which said the rich get richer and the poor get poorer, while advertising creates artificial demand in the West. The same theme was echoed in Vance Packard’s “The Hidden Persuaders” (1957).  He followed up with “The Status Seekers” (1959) and “The Waste Makers” (1960).  Also popular was a book we young cynics all read, “The Ugly American” (1958), by William Lederer and Eugene Burdick.  America was supposedly incredibly shallow and bigoted in the 1950s, soon to be rescued by the Liberated 1960s.

P.S. The world is still a dangerous and violent place, but the most chilling example of violent death now is in Africa, with machetes.  We’ve now gone over 61 years without using nuclear bombs against humans – thank God.  Back in the early 1960s, Herman Kahn wrote “On Thermonuclear War” and “Thinking About the Unthinkable,” in which he argued that we could survive a nuclear holocaust, but that didn’t seem likely in 1962:

 

45 Years Ago (1962): The Cuban Missile Crisis and “Silent Spring”

The closest we came to a nuclear exchange was in October 1962, during the Cuban Missile Crisis, in my high school senior year.  That was one of the events that caused me to throw away a National Merit Scholarship and decide to attend a small church college that seemed to make sense of these global threats.  Another impetus was the collapse of the global ecology, as foretold in another best-selling book that I read in 1962:
 
Rachel Carson (1907-1964) published “Silent Spring” in 1962, based on a compilation of articles she had written for The New Yorker.  Her book is credited with launching the environmental movement that culminated in Earth Day (1970) and, eventually, a worldwide ban on the main villain in her book, DDT.  Silent Spring was a Book of the Month Club main selection, spending several weeks on the New York Times best-seller list.  It was actively endorsed by one of my heroes at the time, Washington State native and Supreme Court Justice William O. Douglas, as well as many other nature advocates in my school.

As a result of that book and further research, I wrote an extended scientific article for a national magazine in 1970, linking the chemicals in DDT to many of the pest sprays commonly used in homes.  I wrote other articles supporting the ban on DDT, which, I am ashamed to say, has caused the deaths of millions of Asians and Africans since then.  Many insect-borne diseases were on the verge of eradication in 1970, when the U.S. tied foreign aid to poor nations to their “voluntary” banning of DDT, to our great shame.

Knock, knock!
Who’s There?
Armageddon!
Armageddon Who?
Armageddon outa here!

 
In 1963, I threw away my future to apply to Ambassador College and join the Worldwide Church of God, in effect saying “Armageddon Outa Here.”  The book that motivated me the most was Herbert Armstrong’s “1975 in Prophecy,” in which he showed from several perspectives that the world couldn’t make it past 1975.  After four years of their college indoctrination, I became a leading writer, editor and researcher for a decade (1966-76) for their publications, turning secular trends into Apocalyptic rhetoric in magazines and on the air, writing radio and TV scripts for the voice of “The World Tomorrow,” the late Garner Ted Armstrong.  I didn’t have long to wait for ammunition:

 
40 Years Ago: “The Population Bomb!” and “Famine 1975”

Upon graduation from college, my job of predicting the End of the World by 1975 was made incredibly easier by a wave of new books proclaiming the inevitable end, based on the centuries-old (and easily discredited) theories of Thomas Robert Malthus, who wrote in 1798 that population grew geometrically, but food production could only grow in small (arithmetic) increments.  In 1967, the brothers William and Paul Paddock wrote a book called “Famine 1975,” in which they said it was impossible for food production to keep up with population growth. Their first chapter was titled “The Population-Food Collision Is Inevitable; It Is Foredoomed.”  The Paddocks believed the Malthusian collision was unavoidable, and that all we could do was starve a little less than others.
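Malthus’s two growth laws are easy to put side by side in code. The sketch below is a toy model only; the starting values and rates are illustrative, not anything Malthus or the Paddocks published:

```python
# Toy Malthusian model: population multiplies by a fixed ratio each
# period (geometric growth) while the food supply rises by a fixed
# amount (arithmetic growth). All numbers here are made up.

population = 100.0     # arbitrary units of people
food = 200.0           # arbitrary units; feeds 200 at the start
GROWTH_RATIO = 1.25    # population grows 25% per period
FOOD_INCREMENT = 30.0  # food rises by a fixed amount per period

period = 0
while population <= food:
    population *= GROWTH_RATIO
    food += FOOD_INCREMENT
    period += 1

print(f"population passes the food supply in period {period}")
# However generous the fixed increment, a geometric series eventually
# overtakes it -- which is the whole of Malthus's argument. What the
# argument misses is that the "increments" themselves can grow.
```

With these particular numbers the crossover comes in period 6; changing the rates moves the date but never prevents the crossing, which is why the model’s critics attack its premises rather than its arithmetic.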

Then came Paul Ehrlich’s “The Population Bomb” (1968), in which he opened famously by saying, “The battle to feed humanity is over.  In the 1970s and 1980s, hundreds of millions of people will starve to death, in spite of any crash programs embarked upon now.”  Writing in Ramparts magazine, he went even further: “Hundreds of millions of people will soon perish in smog disasters in New York and Los Angeles…the oceans will die of DDT poisoning by 1979…the U.S. life expectancy will drop to 42 years by 1980, due to cancer epidemics.”  Hepatitis and dysentery would sweep America by 1980 and nearly all of us would wear gas masks.  Over 65 million Americans would starve in the 1980s, leaving only 22.6 million starving Americans alive in 1990.  In 1990, incredibly, he still insisted his claims had been right – a trait common to Doomsday prophets. *

 “The individual will frequently emerge not only unshaken but even more convinced of the truth of his beliefs than ever before.  Indeed, he may even show a new fervor about convincing and converting other people to his view.” – Leon Festinger, “When Prophecy Fails.”

In the meantime, Dr. Norman Borlaug was launching the Green Revolution, which has managed to feed billions more people on moderately more arable soil than in the 1960s.  Instead of starving against our will, millions of us are trying to starve voluntarily – by dieting.  Food is far cheaper, relative to the overall cost of living, than in the 1960s.  From 1977 to 1994, food costs fell 77% in real terms.  Grain is in surplus, despite 46 million idle arable acres of U.S. farmland and 11 million idle acres in Europe.

In the first 15 years after “Earth Day,” we made great progress against pollution.  The amount of particulates spewed into the air fell by 64%, carbon monoxide emissions fell 38%, ocean dumping of industrial wastes was cut by 94%, and the number of rivers unfit for swimming dropped 44%.  By 1990, cars emitted 78% fewer pollutants.  Yet Lester Brown’s annual “State of the World” keeps saying the opposite, that pollution is growing.

And for anyone who still believes in Dr. Malthus, I have one word to share with you: Chickens!  Are they food, or are they population?  Do they grow arithmetically, or geometrically?  On the Delmarva Peninsula alone, 90 million cluckers live their nasty, brutish, crowded and short lives on the way to the chopping block and your local KFC.

The famine/population fear is older than Malthus.  Confucius thought the earth was full, 2500 years ago. Romans thought they had “worn out the earth.”  St. Jerome said “the world is already full, and the population too large for the soil.” Tertullian wailed about “teeming populations of Carthage” with “numbers burdensome to the world.”  He saw death from famine, war and disease as “the means of pruning the luxuriance of the human race.”  In truth, Rome was rich when it was crowded, and a wasteland when it was empty.

35 Years Ago – The Club of Rome and “The Limits to Growth”

In the early 1970s, Garner Ted Armstrong pulled me aside and gave me a challenging new project, which might take years to finish.  He said that all the globe’s trends are getting worse, and that if we could only “feed all these trends into a computer,” we could predict the precise time of the end.  Maybe it’s 1975, as we all still thought at the time, or maybe it’s a little later than that.  After all, we can count the hairs we lose each day and predict when we will go bald.  So we could do the same with all other trends – depleting resources, increased crime, nuclear overkill, chemical and environmental pollutants, etc.

Ambassador College had a new IBM 370 computer and a huge programming team at my disposal, so I set out on this impossible project full of hope.  Two years later, I gave up, but a bunch of secular statisticians in Cambridge, Massachusetts did not give up.  They fed the same kind of data into MIT’s massive mainframe and came out with their magnum opus, “The Limits to Growth,” modeling the future consequences of growing world population and finite resources.  The study was commissioned by the world’s aristocracy, gathered into a group they called the Club of Rome.  The Limits to Growth was written by Dennis and Donella Meadows, among many others.  The book used computer simulation to project a rolling Doomsday.  (All this made me feel like less of a religious nut….)

In short, the report’s authors projected that, at the exponential growth rates they expected to continue, all the known world supplies of zinc, gold, tin, copper, oil, and natural gas would be completely exhausted in 1992.  They set specific dates for each commodity.   President Carter later bought into this idea and published his gloomy Global 2000 report.
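The exhaustion dates came from what the report called an “exponential index”: how long a fixed reserve lasts when annual consumption grows exponentially instead of staying flat. A minimal reconstruction of that calculation, with illustrative numbers rather than the report’s actual data:

```python
import math

def exponential_index(reserves, annual_use, growth_rate):
    """Years until a fixed reserve is exhausted when annual consumption
    grows exponentially (continuous compounding) at growth_rate.

    Solves (annual_use / r) * (e**(r*T) - 1) = reserves for T.
    """
    r = growth_rate
    return math.log(r * reserves / annual_use + 1.0) / r

# Illustrative only: a "100-year" supply at today's consumption
# shrinks dramatically once consumption grows 4% a year.
static_years = 100.0
dynamic_years = exponential_index(reserves=100.0, annual_use=1.0,
                                  growth_rate=0.04)
print(round(dynamic_years, 1))  # roughly 40 years, not 100
```

The formula is sound; what proved wrong, as the next paragraphs show, was the assumption that “known reserves” are fixed, when in practice higher prices expand them through exploration and substitution.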

Then, along came Dr. Julian Simon, who bet Dr. Paul Ehrlich $1,000 that the price of commodities would FALL, not rise, implying an expansion of resources, rather than a contraction of supplies during the decade in which they were all to disappear – the 1980s.  

By 1985, instead of the world running out of oil, an oil glut had pushed the price down from $40 to $10 a barrel.  Shortages beget higher prices and more exploration, not depletion of resources.  In extreme cases, shortages create new technologies.  A wood shortage in England in the early 17th century led to the use of coal and the birth of the industrial revolution. A shortage of whales led to the discovery and use of petroleum, and to electrical lighting.  The stench of horse manure in urban streets led to the invention of the horseless carriage.

30 Years Ago – Global Cooling and “The Next Ice Age”

My final TV script for Garner Ted Armstrong came in 1975, when I was about to leave the cocoon of the Church of the Apocalypse for a more mundane job at the University of Southern California.  He wanted a program on Global Cooling, or the Coming Ice Age.  In 1975, there were several covers in major news magazines about the Coming Ice Age.  

One example was Newsweek, for the week of April 28, 1975.  It said that leading climate scientists were “almost unanimous” (sound familiar?) in their predictions of global cooling.  Time Magazine had “The Coming Ice Age” on its cover, and the November 1976 issue of National Geographic had a lead article on the problem of global cooling.

Later on, physicists combined the threat of natural cooling with nuclear war to predict a “Nuclear Winter.”  Our future was clearly frigid.  The trend from the Dust Bowl of the 1930s through 1975 was one of gradually cooling temperatures. (Most record-high state temperatures, to this day, were set in the 1930s, not in the 1990s, Mr. Gore.)

One day in the control studio, Garner Ted Armstrong showed me a news clipping that pointed to the potential threat of carbon-dioxide emissions contributing to future global warming – a threat that currently assaults us in the daily media.  He looked me in the eye, as his trusted researcher, and asked point blank, “Which is it – warming or cooling?”

 

“With any luck, sir,” I quipped, “we’ll get both, and then they will offset each other.”

He was not amused.  But I was on my way out and no longer cared what he thought. I was happy that a peaceful new job awaited me at a less Apocalyptic California college. But that did not stop me from reading a series of best-sellers and coming back to the Doomsday business three years later.  In the 1970s alone, all of this was “Coming…”

305
Politics & Religion / The Coming Anarchy by Robert Kaplan
« on: May 20, 2007, 05:48:49 AM »
THE COMING ANARCHY
by Robert D. Kaplan
How scarcity, crime, overpopulation, tribalism, and disease are rapidly destroying the social fabric of our planet
The Atlantic Monthly, February 1994
http://www.TheAtlantic.com/atlantic/election/connection/foreign/anarcf.htm

The Minister's eyes were like egg yolks, an aftereffect of some of the many illnesses, malaria especially, endemic in his country. There was also an irrefutable sadness in his eyes. He spoke in a slow and creaking voice, the voice of hope about to expire. Flame trees, coconut palms, and a ballpoint-blue Atlantic composed the background. None of it seemed beautiful, though. "In forty-five years I have never seen things so bad. We did not manage ourselves well after the British departed. But what we have now is something worse--the revenge of the poor, of the social failures, of the people least able to bring up children in a modern society." Then he referred to the recent coup in the West African country Sierra Leone. "The boys who took power in Sierra Leone come from houses like this." The Minister jabbed his finger at a corrugated metal shack teeming with children. "In three months these boys confiscated all the official Mercedes, Volvos, and BMWs and willfully wrecked them on the road." The Minister mentioned one of the coup's leaders, Solomon Anthony Joseph Musa, who shot the people who had paid for his schooling, "in order to erase the humiliation and mitigate the power his middle-class sponsors held over him."

Tyranny is nothing new in Sierra Leone or in the rest of West Africa. But it is now part and parcel of an increasing lawlessness that is far more significant than any coup, rebel incursion, or episodic experiment in democracy. Crime was what my friend--a top-ranking African official whose life would be threatened were I to identify him more precisely--really wanted to talk about. Crime is what makes West Africa a natural point of departure for my report on what the political character of our planet is likely to be in the twenty-first century.

The cities of West Africa at night are some of the unsafest places in the world. Streets are unlit; the police often lack gasoline for their vehicles; armed burglars, carjackers, and muggers proliferate. "The government in Sierra Leone has no writ after dark," says a foreign resident, shrugging. When I was in the capital, Freetown, last September, eight men armed with AK-47s broke into the house of an American man. They tied him up and stole everything of value. Forget Miami: direct flights between the United States and the Murtala Muhammed Airport, in neighboring Nigeria's largest city, Lagos, have been suspended by order of the U.S. Secretary of Transportation because of ineffective security at the terminal and its environs. A State Department report cited the airport for "extortion by law-enforcement and immigration officials." This is one of the few times that the U.S. government has embargoed a foreign airport for reasons that are linked purely to crime. In Abidjan, effectively the capital of the Cote d'Ivoire, or Ivory Coast, restaurants have stick- and gun-wielding guards who walk you the fifteen feet or so between your car and the entrance, giving you an eerie taste of what American cities might be like in the future. An Italian ambassador was killed by gunfire when robbers invaded an Abidjan restaurant. The family of the Nigerian ambassador was tied up and robbed at gunpoint in the ambassador's residence. After university students in the Ivory Coast caught bandits who had been plaguing their dorms, they executed them by hanging tires around their necks and setting the tires on fire. In one instance Ivorian policemen stood by and watched the "necklacings," afraid to intervene. Each time I went to the Abidjan bus terminal, groups of young men with restless, scanning eyes surrounded my taxi, putting their hands all over the windows, demanding "tips" for carrying my luggage even though I had only a rucksack. 
In cities in six West African countries I saw similar young men everywhere--hordes of them. They were like loose molecules in a very unstable social fluid, a fluid that was clearly on the verge of igniting.

"You see," my friend the Minister told me, "in the villages of Africa it is perfectly natural to feed at any table and lodge in any hut. But in the cities this communal existence no longer holds. You must pay for lodging and be invited for food. When young men find out that their relations cannot put them up, they become lost. They join other migrants and slip gradually into the criminal process."

"In the poor quarters of Arab North Africa," he continued, "there is much less crime, because Islam provides a social anchor: of education and indoctrination. Here in West Africa we have a lot of superficial Islam and superficial Christianity. Western religion is undermined by animist beliefs not suitable to a moral society, because they are based on irrational spirit power. Here spirits are used to wreak vengeance by one person against another, or one group against another." Many of the atrocities in the Liberian civil war have been tied to belief in juju spirits, and the BBC has reported, in its magazine Focus on Africa, that in the civil fighting in adjacent Sierra Leone, rebels were said to have "a young woman with them who would go to the front naked, always walking backwards and looking in a mirror to see where she was going. This made her invisible, so that she could cross to the army's positions and there bury charms . . . to improve the rebels' chances of success."

Finally my friend the Minister mentioned polygamy. Designed for a pastoral way of life, polygamy continues to thrive in sub-Saharan Africa even though it is increasingly uncommon in Arab North Africa. Most youths I met on the road in West Africa told me that they were from "extended" families, with a mother in one place and a father in another. Translated to an urban environment, loose family structures are largely responsible for the world's highest birth rates and the explosion of the HIV virus on the continent. Like the communalism and animism, they provide a weak shield against the corrosive social effects of life in cities. In those cities African culture is being redefined while desertification and deforestation--also tied to overpopulation--drive more and more African peasants out of the countryside.

A Premonition of the Future
West Africa is becoming the symbol of worldwide demographic, environmental, and societal stress, in which criminal anarchy emerges as the real "strategic" danger. Disease, overpopulation, unprovoked crime, scarcity of resources, refugee migrations, the increasing erosion of nation-states and international borders, and the empowerment of private armies, security firms, and international drug cartels are now most tellingly demonstrated through a West African prism. West Africa provides an appropriate introduction to the issues, often extremely unpleasant to discuss, that will soon confront our civilization. To remap the political earth the way it will be a few decades hence--as I intend to do in this article--I find I must begin with West Africa.

There is no other place on the planet where political maps are so deceptive--where, in fact, they tell such lies--as in West Africa. Start with Sierra Leone. According to the map, it is a nation-state of defined borders, with a government in control of its territory. In truth the Sierra Leonian government, run by a twenty-seven-year-old army captain, Valentine Strasser, controls Freetown by day and by day also controls part of the rural interior. In the government's territory the national army is an unruly rabble threatening drivers and passengers at most checkpoints. In the other part of the country units of two separate armies from the war in Liberia have taken up residence, as has an army of Sierra Leonian rebels. The government force fighting the rebels is full of renegade commanders who have aligned themselves with disaffected village chiefs. A pre-modern formlessness governs the battlefield, evoking the wars in medieval Europe prior to the 1648 Peace of Westphalia, which ushered in the era of organized nation-states.

As a consequence, roughly 400,000 Sierra Leonians are internally displaced, 280,000 more have fled to neighboring Guinea, and another 100,000 have fled to Liberia, even as 400,000 Liberians have fled to Sierra Leone. The third largest city in Sierra Leone, Gondama, is a displaced-persons camp. With an additional 600,000 Liberians in Guinea and 250,000 in the Ivory Coast, the borders dividing these four countries have become largely meaningless. Even in quiet zones none of the governments except the Ivory Coast's maintains the schools, bridges, roads, and police forces in a manner necessary for functional sovereignty. The Koranko ethnic group in northeastern Sierra Leone does all its trading in Guinea. Sierra Leonian diamonds are more likely to be sold in Liberia than in Freetown. In the eastern provinces of Sierra Leone you can buy Liberian beer but not the local brand.

In Sierra Leone, as in Guinea, as in the Ivory Coast, as in Ghana, most of the primary rain forest and the secondary bush is being destroyed at an alarming rate. I saw convoys of trucks bearing majestic hardwood trunks to coastal ports. When Sierra Leone achieved its independence, in 1961, as much as 60 percent of the country was primary rain forest. Now six percent is. In the Ivory Coast the proportion has fallen from 38 percent to eight percent. The deforestation has led to soil erosion, which has led to more flooding and more mosquitoes. Virtually everyone in the West African interior has some form of malaria.

Sierra Leone is a microcosm of what is occurring, albeit in a more tempered and gradual manner, throughout West Africa and much of the underdeveloped world: the withering away of central governments, the rise of tribal and regional domains, the unchecked spread of disease, and the growing pervasiveness of war. West Africa is reverting to the Africa of the Victorian atlas. It consists now of a series of coastal trading posts, such as Freetown and Conakry, and an interior that, owing to violence, volatility, and disease, is again becoming, as Graham Greene once observed, "blank" and "unexplored." However, whereas Greene's vision implies a certain romance, as in the somnolent and charmingly seedy Freetown of his celebrated novel The Heart of the Matter, it is Thomas Malthus, the philosopher of demographic doomsday, who is now the prophet of West Africa's future. And West Africa's future, eventually, will also be that of most of the rest of the world.

Consider "Chicago." I refer not to Chicago, Illinois, but to a slum district of Abidjan, which the young toughs in the area have named after the American city. ("Washington" is another poor section of Abidjan.) Although Sierra Leone is widely regarded as beyond salvage, the Ivory Coast has been considered an African success story, and Abidjan has been called "the Paris of West Africa." Success, however, was built on two artificial factors: the high price of cocoa, of which the Ivory Coast is the world's leading producer, and the talents of a French expatriate community, whose members have helped run the government and the private sector. The expanding cocoa economy made the Ivory Coast a magnet for migrant workers from all over West Africa: between a third and a half of the country's population is now non-Ivorian, and the figure could be as high as 75 percent in Abidjan. During the 1980s cocoa prices fell and the French began to leave. The skyscrapers of the Paris of West Africa are a facade. Perhaps 15 percent of Abidjan's population of three million people live in shantytowns like Chicago and Washington, and the vast majority live in places that are not much better. Not all of these places appear on any of the readily available maps. This is another indication of how political maps are the products of tired conventional wisdom and, in the Ivory Coast's case, of an elite that will ultimately be forced to relinquish power.

Chicago, like more and more of Abidjan, is a slum in the bush: a checkerwork of corrugated zinc roofs and walls made of cardboard and black plastic wrap. It is located in a gully teeming with coconut palms and oil palms, and is ravaged by flooding. Few residents have easy access to electricity, a sewage system, or a clean water supply. The crumbly red laterite earth crawls with foot-long lizards both inside and outside the shacks. Children defecate in a stream filled with garbage and pigs, droning with malarial mosquitoes. In this stream women do the washing. Young unemployed men spend their time drinking beer, palm wine, and gin while gambling on pinball games constructed out of rotting wood and rusty nails. These are the same youths who rob houses in more prosperous Ivorian neighborhoods at night. One man I met, Damba Tesele, came to Chicago from Burkina Faso in 1963. A cook by profession, he has four wives and thirty-two children, not one of whom has made it to high school. He has seen his shanty community destroyed by municipal authorities seven times since coming to the area. Each time he and his neighbors rebuild. Chicago is the latest incarnation.

Fifty-five percent of the Ivory Coast's population is urban, and the proportion is expected to reach 62 percent by 2000. The yearly net population growth is 3.6 percent. This means that the Ivory Coast's 13.5 million people will become 39 million by 2025, when much of the population will consist of urbanized peasants like those of Chicago. But don't count on the Ivory Coast's still existing then. Chicago, which is more indicative of Africa's and the Third World's demographic present--and even more of the future--than any idyllic junglescape of women balancing earthen jugs on their heads, illustrates why the Ivory Coast, once a model of Third World success, is becoming a case study in Third World catastrophe.

President Felix Houphouet-Boigny, who died last December at the age of about ninety, left behind a weak cluster of political parties and a leaden bureaucracy that discourages foreign investment. Because the military is small and the non-Ivorian population large, there is neither an obvious force to maintain order nor a sense of nationhood that would lessen the need for such enforcement. The economy has been shrinking since the mid-1980s. Though the French are working assiduously to preserve stability, the Ivory Coast faces a possibility worse than a coup: an anarchic implosion of criminal violence--an urbanized version of what has already happened in Somalia. Or it may become an African Yugoslavia, but one without mini-states to replace the whole.

Because the demographic reality of West Africa is a countryside draining into dense slums by the coast, ultimately the region's rulers will come to reflect the values of these shanty-towns. There are signs of this already in Sierra Leone--and in Togo, where the dictator Etienne Eyadema, in power since 1967, was nearly toppled in 1991, not by democrats but by thousands of youths whom the London-based magazine West Africa described as "Soweto-like stone-throwing adolescents." Their behavior may herald a regime more brutal than Eyadema's repressive one.

The fragility of these West African "countries" impressed itself on me when I took a series of bush taxis along the Gulf of Guinea, from the Togolese capital of Lome, across Ghana, to Abidjan. The 400-mile journey required two full days of driving, because of stops at two border crossings and an additional eleven customs stations, at each of which my fellow passengers had their bags searched. I had to change money twice and repeatedly fill in currency-declaration forms. I had to bribe a Togolese immigration official with the equivalent of eighteen dollars before he would agree to put an exit stamp on my passport. Nevertheless, smuggling across these borders is rampant. The London Observer has reported that in 1992 the equivalent of $856 million left West Africa for Europe in the form of "hot cash" assumed to be laundered drug money. International cartels have discovered the utility of weak, financially strapped West African regimes.

The more fictitious the actual sovereignty, the more severe border authorities seem to be in trying to prove otherwise. Getting visas for these states can be as hard as crossing their borders. The Washington embassies of Sierra Leone and Guinea--the two poorest nations on earth, according to a 1993 United Nations report on "human development"--asked for letters from my bank (in lieu of prepaid round-trip tickets) and also personal references, in order to prove that I had sufficient means to sustain myself during my visits. I was reminded of my visa and currency hassles while traveling to the communist states of Eastern Europe, particularly East Germany and Czechoslovakia, before those states collapsed.

Ali A. Mazrui, the director of the Institute of Global Cultural Studies at the State University of New York at Binghamton, predicts that West Africa--indeed, the whole continent--is on the verge of large-scale border upheaval. Mazrui writes, "In the 21st century France will be withdrawing from West Africa as she gets increasingly involved in the affairs [of Europe]. France's West African sphere of influence will be filled by Nigeria--a more natural hegemonic power. . . . It will be under those circumstances that Nigeria's own boundaries are likely to expand to incorporate the Republic of Niger (the Hausa link), the Republic of Benin (the Yoruba link) and conceivably Cameroon."


306
Science, Culture, & Humanities / Solar Flashlight
« on: May 20, 2007, 03:36:44 AM »
I saw this in today's NY Times and thought it cool.
======

Solar Flashlight Lets Africa’s Sun Deliver the Luxury of Light to the Poorest Villages
By WILL CONNORS and RALPH BLUMENTHAL
Published: May 20, 2007
FUGNIDO, Ethiopia — At 10 p.m. in a sweltering refugee camp here in western Ethiopia, a group of foreigners was making its way past thatch-roofed huts when a tall, rail-thin man approached a silver-haired American and took hold of his hands.

The man, a Sudanese refugee, announced that his wife had just given birth, and the boy would be honored with the visitor’s name. After several awkward translation attempts of “Mark Bent,” it was settled. “Mar,” he said, will grow up hearing stories of his namesake, the man who handed out flashlights powered by the sun.

Since August 2005, when visits to an Eritrean village prompted him to research global access to artificial light, Mr. Bent, 49, a former foreign service officer and Houston oilman, has spent $250,000 to develop and manufacture a solar-powered flashlight.

His invention gives up to seven hours of light on a daily solar recharge and can last nearly three years between replacements of three AA batteries costing 80 cents.

Over the last year, he said, he and corporate benefactors like Exxon Mobil have donated 10,500 flashlights to United Nations refugee camps and African aid charities.

Another 10,000 have been provided through a sales program, and 10,000 more have just arrived in Houston awaiting distribution by his company, SunNight Solar.

“I find it hard sometimes to explain the scope of the problems in these camps with no light,” Mr. Bent said. “If you’re an environmentalist you think about it in terms of discarded batteries and coal and wood burning and kerosene smoke; if you’re a feminist you think of it in terms of security for women and preventing sexual abuse and violence; if you’re an educator you think about it in terms of helping children and adults study at night.”

Here at Fugnido, at one of six camps housing more than 21,000 refugees 550 miles west of Addis Ababa, the Ethiopian capital, Peter Gatkuoth, a Sudanese refugee, wrote on “the importance of Solor.”

“In case of thief, we open our solor and the thief ran away,” he wrote. “If there is a sick person at night we will took him with the solor to health center.”

A shurta, or guard, who called himself just John, said, “I used the light to scare away wild animals.” Others said lights were hung above school desks for children and adults to study after the day’s work.

Mr. Bent’s efforts have drawn praise from the United Nations, Africare, Rice University and others.

Kevin G. Lowther, Southern Africa director for Africare, the largest American aid group for Africa, said his staff was sending 5,000 of his lights, purchased by Exxon Mobil at $10 each, to rural Angola.

Dave Gardner, a spokesman for Exxon Mobil, said the company’s $50,000 donation in November grew out of an earlier grant it made to Save the Children to build six public schools in Kibala, Angola, a remote area of Kwanza Sul Province.

“At a dedication ceremony for the first four schools in June 2006,” Mr. Gardner said in an e-mail message, “we noticed that a lot of the children had upper respiratory problems, part of which is likely due to the use of wood, charcoal, candles and kero for lighting in the small homes they have in Kibala.”

The Awty International School, a large prep school in Houston, has sent hundreds of the flashlights to schools it sponsors in Haiti, Cameroon and Ethiopia, said Chantal Duke, executive assistant to the head of school.

“In places where there is absolutely no electricity or running water, having light at night is a luxury many families don’t have and never did and which we take for granted in developed countries,” Ms. Duke said by e-mail. Mr. Bent, a former Marine and Navy pilot, served under diplomatic titles in volatile countries like Angola, Bosnia, Nigeria and Somalia in the early 1990s.

In 2001 he went to work as the general manager of an oil exploration team off the coast of the Red Sea in Eritrea, for a company later acquired by the French oil giant Perenco. But the oil business, he said, “didn’t satisfy my soul.”

The inspiration for the flashlight hit him, he said, while working for Perenco in Asmara, Eritrea. One Sunday he visited a local dump to watch scavenging by baboons and birds of prey, and came upon a group of homeless boys who had adopted the dump as their home.

They took him home to a rural village where he noticed that many people had nothing to light their homes, schools and clinics at night.

With a little research, he discovered that close to two billion people around the world go without affordable access to light.

He worked with researchers, engineers and manufacturers, he said, at the Department of Energy, several American universities, and even NASA before finding a factory in China to produce a durable, cost-effective solar-powered flashlight whose shape was inspired by his wife’s shampoo bottle.

The light, or sun torch, has a narrow solar panel on one side that charges the batteries, which can last between 750 and 1,000 nights, and uses the more efficient light-emitting diodes, or L.E.D.s, to cast its light. “L.E.D.s used to be very expensive,” Mr. Bent said. “But in the last 18 months they’ve become cheaper, so distributing them on a widespread scale is possible.”

The flashlights usually sell for about $19.95 in American stores, but he has established a BoGo — for Buy One, Give One — program on his Web site, BoGoLight.com, where if you buy one flashlight for $25, he will buy and ship another one to Africa, and donate $1 to one of the aid groups he works with.

Mr. Bent, who is now an oil consultant, lives in Houston with his wife and four young children. When he is not in the air flying his own plane, he is often on the road.

Traveling early this month in Ethiopia’s border area with Sudan, Mr. Bent stopped in each town’s market to methodically check the prices and quality of flashlights and batteries imported from China.

He unscrewed the flashlights one by one, inspecting the batteries, pronouncing them “terrible — they won’t last two nights.”

On his last day along the border, Mr. Bent visited Rapan Sadeeq, 21, a Sudanese refugee who is something of a celebrity in his camp, Bonga, for his rudimentary self-made radios, walkie-talkies and periscopes.

The two men huddled in the hut, discussing what parts would be needed to power the radio with solar panels instead of clunky C batteries. “Oh, I can definitely send you some parts,” Mr. Bent said. “You can be my field engineer in Ethiopia.”

Related clip can be found at http://bogolight.com/

307
All:

The McCain-Feingold law makes me seething angry.  I am utterly baffled that the Supreme Court could have affirmed its constitutionality.

The camel has gotten his nose in the tent and now he seeks to stick his head in.

Marc
====================
WSJ
Cutting the Grass
Congressional Democrats prepare another assault on the First Amendment.

Monday, May 14, 2007 12:01 a.m. EDT

A recent Wall Street Journal/NBC News poll shows 6 in 10 Americans think the Democratic Congress "hasn't brought much change." Eager to change this impression, the Democrats are frantically trying to pass legislation before Memorial Day. First on the agenda is a bill restricting lobbying, which is heading for the House floor with lightning speed. The House Judiciary Committee is expected to pass it tomorrow, sending it to the full House for a final vote next Tuesday or Wednesday.

When a bill moves that quickly, you can bet someone will try to make some last-minute mischief. Hardly anyone objects to the legislation's requirement that former lawmakers wait two years instead of one before lobbying Congress. Ditto with bans on lobbying by congressional spouses and restrictions on sitting members of Congress negotiating contracts with private entities for future employment.

But the legislation may be amended on the floor to restrict grassroots groups that encourage citizens to contact members of Congress. The amendment, pushed by Rep. Marty Meehan of Massachusetts, would require groups that organize such grassroots campaigns to register as "lobbyists" and file detailed quarterly reports on their donors and activities. The law would apply to any group that took in at least $100,000 in any given quarter for "paid communications campaigns" aimed at mobilizing the public.

The same groups that backed the McCain-Feingold law, limiting political speech in advance of an election, are behind this latest effort to curb political speech. Common Cause and Democracy 21 say special-interest entities hide behind current law to conceal the identities of their donors, whom they would have to reveal if they were lobbying Congress directly. "These Astroturf campaigns are just direct lobbying by another name," says Rep. Meehan, who is resigning from the House this summer and views his bill as his last hurrah in Congress.
But the First Amendment specifically prohibits Congress from abridging "the right of the people . . . to petition the Government for redress of grievances." The Supreme Court twice ruled in the 1950s that grassroots communication isn't "lobbying activity," and is fully protected by the First Amendment. Among the groups that believe the Meehan proposal would trample on the First Amendment are the National Right to Life Committee and the American Civil Liberties Union. The idea goes too far even for Sen. John McCain, who voted to strip a similar provision from a Senate lobbying reform bill last January.

The possible outcomes are disturbing. For example, Oprah Winfrey operates a website dedicated to urging people to contact Congress to demand intervention in Darfur. If her Web master took in over $100,000 in revenue from Ms. Winfrey and similar clients in a single quarter, he might be forced to make disclosures under the law.

"It's huge," Jay Sekulow of the conservative American Center for Law and Justice, told The Hill newspaper. "It's the most significant restriction on grassroots activity in recent history. I'd put it up there with the 2002 Bipartisan Campaign Reform Act"--the formal name for McCain-Feingold.

McCain-Feingold itself is riddled with loopholes, producing a slew of unintended consequences. Its provisions allowing candidates facing wealthy, self-funding opponents to accept larger-than-normal legal contributions inexplicably don't apply to the race for president. That means Mitt Romney and John Edwards, both of whom are independently wealthy, have a clear advantage should they run low on cash and need to inject funds into their campaigns quickly.

"Judged by the most visible results on promises like getting big money out of politics or cleaning up politics, campaign finance reform has been, to put it mildly, a disappointment," admits Mark Schmitt, a supporter of such reforms who has written a thoughtful essay in the journal Democracy. He urges reformers to now focus on expanding the "range of choices and voices in the system" and to take seriously the worries of those who fear that McCain-Feingold's restrictions on "election communication" have the potential to squelch important political speech. The Supreme Court is set to rule next month on a case addressing precisely that issue, and Justice Samuel Alito may be more inclined to view McCain-Feingold skeptically than was Sandra Day O'Connor, who was part of a 5-4 majority upholding the law.
Given the checkered history of campaign finance reform, its frequent use by one side of a political debate to hobble opponents, and the prospect that courts may yet find portions of McCain-Feingold unconstitutional, it would be a travesty for a Congress desperate for a quick-fix legislative accomplishment to circumscribe the First Amendment with little debate and even less understanding of what the consequences will be.

308
Politics & Religion / Communicating with the Muslim World
« on: May 11, 2007, 10:31:40 AM »
All:

I will be posting more about this point, but it seems to me that we need to have a thread dedicated to how to communicate with the Muslim world.

I begin with a piece from the WSJ.

Marc
==================

Boos for Al-Hurra
May 11, 2007; Page A10
We've been watching the debate over Al-Hurra, the U.S.-funded Middle East TV channel that has lately developed a reputation as a friendly forum for terrorists and Islamic radicals. A bipartisan group of Congressmen has called for Al-Hurra's news director, former CNN producer Larry Register, to resign -- and it's time he and his supervisors gave taxpayers some answers.

With an annual budget over $70 million, Al-Hurra is part of the long arm of America's public diplomacy in the Middle East. The network was established to provide a credible source of information in the region, in a market dominated by Al-Jazeera and Al-Arabiya. The goal was to help start a discussion about freedom and democracy. Instead, the network seems to have aligned itself with everyone else in pandering to the so-called Arab street.

The shift began when Mr. Register took over last November. As journalist Joel Mowbray has detailed in these pages, Al-Hurra has made a practice in Mr. Register's tenure of friendly coverage of camera-ready extremists from al Qaeda, Hamas and other terrorist groups. Most famously, the network gave more than 60 minutes of airtime to Hezbollah leader Hasan Nasrallah, who informed viewers that Hezbollah was "facing a strategic and historic victory." Under Mr. Register, Holocaust denial panels became "Holocaust existence panels." People like al Qaeda operative Muhammed Hanja received airtime to celebrate America's "defeat" on September 11.

Mr. Register's defense has been, in essence, that if Al-Hurra doesn't run anti-American content, no one will watch. He seems to have misunderstood his assignment: Al-Hurra is not meant to compete with Al-Jazeera but to offer an alternative view of the Middle East from those of either its dictators or jihadis.

But Al-Hurra is not alone in its failures. VOA and Radio Farda in Iran also stray into broadcasts that wax critical of U.S. policies. Here's betting those outlets see more scrutiny in coming days, and there are plenty of people besides Mr. Register to question -- starting with the Broadcasting Board of Governors that is charged with the network's oversight.

The BBG failed to investigate Mr. Register's change of journalism-marketing strategy when criticism began to emerge. After Mr. Mowbray's original article in March, Joaquin Blaya, chairman of the Board's Middle East Committee, wrote us a letter dismissing the criticism. Mr. Blaya was more accommodating in a second letter that we ran May 9 -- perhaps because he's feeling heat from Capitol Hill.

He certainly hasn't felt any heat from Undersecretary of State Karen Hughes, who sits on the board and has also preferred to see no evil. Ms. Hughes has conceded that the Nasrallah interview was "a violation of our policy." But in a speech to the Board of Governors and Freedom House last week, she missed an opportunity to clarify what is expected of Al-Hurra in return for taxpayer support. On Wednesday, State Department spokesman Sean McCormack said Mr. Register is doing "a very good job."

Al-Hurra (which means "the free one") can be a useful tool in the battle of ideas that is crucial to the war on Islamic extremism. But if it and its sister broadcasts are merely going to provide one more outlet for anti-U.S. propaganda, who needs them? Dissidents in Soviet Russia and its satellites once looked to Radio Free Europe and VOA as sources of truth they weren't getting from local media. Nobody thinks the Cold War would have ended sooner if they had offered more airtime to the Kremlin.

309
Politics & Religion / Assessing Blame for Iraq front of WW3:
« on: May 05, 2007, 10:46:13 AM »

http://www.armedforcesjournal.com/2007/05/2635198
A failure in generalship
By Lt. Col. Paul Yingling
"You officers amuse yourselves with God knows what buffooneries and never dream in the least of serious service. This is a source of stupidity which would become most dangerous in case of a serious conflict."
- Frederick the Great


For the second time in a generation, the United States faces the prospect of defeat at the hands of an insurgency. In April 1975, the U.S. fled the Republic of Vietnam, abandoning our allies to their fate at the hands of North Vietnamese communists. In 2007, Iraq's grave and deteriorating condition offers diminishing hope for an American victory and portends risk of an even wider and more destructive regional war.

These debacles are not attributable to individual failures, but rather to a crisis in an entire institution: America's general officer corps. America's generals have failed to prepare our armed forces for war and advise civilian authorities on the application of force to achieve the aims of policy. The argument that follows consists of three elements. First, generals have a responsibility to society to provide policymakers with a correct estimate of strategic probabilities. Second, America's generals in Vietnam and Iraq failed to perform this responsibility. Third, remedying the crisis in American generalship requires the intervention of Congress.

 
The Responsibilities of Generalship

Armies do not fight wars; nations fight wars. War is not a military activity conducted by soldiers, but rather a social activity that involves entire nations. Prussian military theorist Carl von Clausewitz noted that passion, probability and policy each play their role in war. Any understanding of war that ignores one of these elements is fundamentally flawed.

The passion of the people is necessary to endure the sacrifices inherent in war. Regardless of the system of government, the people supply the blood and treasure required to prosecute war. The statesman must stir these passions to a level commensurate with the popular sacrifices required. When the ends of policy are small, the statesman can prosecute a conflict without asking the public for great sacrifice. Global conflicts such as World War II require the full mobilization of entire societies to provide the men and materiel necessary for the successful prosecution of war. The greatest error the statesman can make is to commit his nation to a great conflict without mobilizing popular passions to a level commensurate with the stakes of the conflict.

Popular passions are necessary for the successful prosecution of war, but cannot be sufficient. To prevail, generals must provide policymakers and the public with a correct estimation of strategic probabilities. The general is responsible for estimating the likelihood of success in applying force to achieve the aims of policy. The general describes both the means necessary for the successful prosecution of war and the ways in which the nation will employ those means. If the policymaker desires ends for which the means he provides are insufficient, the general is responsible for advising the statesman of this incongruence. The statesman must then scale back the ends of policy or mobilize popular passions to provide greater means. If the general remains silent while the statesman commits a nation to war with insufficient means, he shares culpability for the results.

However much it is influenced by passion and probability, war is ultimately an instrument of policy and its conduct is the responsibility of policymakers. War is a social activity undertaken on behalf of the nation; Augustine counsels us that the only purpose of war is to achieve a better peace. The choice of making war to achieve a better peace is inherently a value judgment in which the statesman must decide those interests and beliefs worth killing and dying for. The military man is no better qualified than the common citizen to make such judgments. He must therefore confine his input to his area of expertise — the estimation of strategic probabilities.

The correct estimation of strategic possibilities can be further subdivided into the preparation for war and the conduct of war. Preparation for war consists in the raising, arming, equipping and training of forces. The conduct of war consists of both planning for the use of those forces and directing those forces in operations.

To prepare forces for war, the general must visualize the conditions of future combat. To raise military forces properly, the general must visualize the quality and quantity of forces needed in the next war. To arm and equip military forces properly, the general must visualize the materiel requirements of future engagements. To train military forces properly, the general must visualize the human demands on future battlefields, and replicate those conditions in peacetime exercises. Of course, not even the most skilled general can visualize precisely how future wars will be fought. According to British military historian and soldier Sir Michael Howard, "In structuring and preparing an army for war, you can be clear that you will not get it precisely right, but the important thing is not to be too far wrong, so that you can put it right quickly."

The most tragic error a general can make is to assume without much reflection that wars of the future will look much like wars of the past. Following World War I, French generals committed this error, assuming that the next war would involve static battles dominated by firepower and fixed fortifications. Throughout the interwar years, French generals raised, equipped, armed and trained the French military to fight the last war. In stark contrast, German generals spent the interwar years attempting to break the stalemate created by firepower and fortifications. They developed a new form of war — the blitzkrieg — that integrated mobility, firepower and decentralized tactics. The German Army did not get this new form of warfare precisely right. After the 1939 conquest of Poland, the German Army undertook a critical self-examination of its operations. However, German generals did not get it too far wrong either, and in less than a year had adapted their tactics for the invasion of France.

After visualizing the conditions of future combat, the general is responsible for explaining to civilian policymakers the demands of future combat and the risks entailed in failing to meet those demands. Civilian policymakers have neither the expertise nor the inclination to think deeply about strategic probabilities in the distant future. Policymakers, especially elected representatives, face powerful incentives to focus on near-term challenges that are of immediate concern to the public. Generating military capability is the labor of decades. If the general waits until the public and its elected representatives are immediately concerned with national security threats before finding his voice, he has waited too long. The general who speaks too loudly of preparing for war while the nation is at peace places at risk his position and status. However, the general who speaks too softly places at risk the security of his country.

Failing to visualize future battlefields represents a lapse in professional competence, but seeing those fields clearly and saying nothing is an even more serious lapse in professional character. Moral courage is often inversely proportional to popularity, and this observation is nowhere more true than in the profession of arms. The history of military innovation is littered with the truncated careers of reformers who saw gathering threats clearly and advocated change boldly. A military professional must possess both the physical courage to face the hazards of battle and the moral courage to withstand the barbs of public scorn. On and off the battlefield, courage is the first characteristic of generalship.

Failures of Generalship in Vietnam

America's defeat in Vietnam is the most egregious failure in the history of American arms. America's general officer corps refused to prepare the Army to fight unconventional wars, despite ample indications that such preparations were in order. Having failed to prepare for such wars, America's generals sent our forces into battle without a coherent plan for victory. Unprepared for war and lacking a coherent strategy, America lost the war and the lives of more than 58,000 service members.

Following World War II, there were ample indicators that America's enemies would turn to insurgency to negate our advantages in firepower and mobility. The French experiences in Indochina and Algeria offered object lessons to Western armies facing unconventional foes. These lessons were not lost on the more astute members of America's political class. In 1961, President Kennedy warned of "another type of war, new in its intensity, ancient in its origin — war by guerrillas, subversives, insurgents, assassins, war by ambush instead of by combat, by infiltration instead of aggression, seeking victory by evading and exhausting the enemy instead of engaging him." In response to these threats, Kennedy undertook a comprehensive program to prepare America's armed forces for counterinsurgency.

Despite the experience of their allies and the urging of their president, America's generals failed to prepare their forces for counterinsurgency. Army Chief of Staff Gen. George Decker assured his young president, "Any good soldier can handle guerrillas." Despite Kennedy's guidance to the contrary, the Army viewed the conflict in Vietnam in conventional terms. As late as 1964, Gen. Earle Wheeler, chairman of the Joint Chiefs of Staff, stated flatly that "the essence of the problem in Vietnam is military." While the Army made minor organizational adjustments at the urging of the president, the generals clung to what Andrew Krepinevich has called "the Army concept," a vision of warfare focused on the destruction of the enemy's forces.

Having failed to visualize accurately the conditions of combat in Vietnam, America's generals prosecuted the war in conventional terms. The U.S. military embarked on a graduated attrition strategy intended to compel North Vietnam to accept a negotiated peace. The U.S. undertook modest efforts at innovation in Vietnam. Civil Operations and Revolutionary Development Support (CORDS), spearheaded by the State Department's "Blowtorch" Bob Komer, was a serious effort to address the political and economic causes of the insurgency. The Marine Corps' Combined Action Program (CAP) was an innovative approach to population security. However, these efforts are best described as too little, too late. Innovations such as CORDS and CAP never received the resources necessary to make a large-scale difference. The U.S. military grudgingly accepted these innovations late in the war, after the American public's commitment to the conflict began to wane.

America's generals not only failed to develop a strategy for victory in Vietnam, but also remained largely silent while the strategy developed by civilian politicians led to defeat. As H.R. McMaster noted in "Dereliction of Duty," the Joint Chiefs of Staff were divided by service parochialism and failed to develop a unified and coherent recommendation to the president for prosecuting the war to a successful conclusion. Army Chief of Staff Harold K. Johnson estimated in 1965 that victory would require as many as 700,000 troops for up to five years. Commandant of the Marine Corps Wallace Greene made a similar estimate on troop levels. As President Johnson incrementally escalated the war, neither man made his views known to the president or Congress. President Johnson made a concerted effort to conceal the costs and consequences of Vietnam from the public, but such duplicity required the passive consent of America's generals.

Having participated in the deception of the American people during the war, the Army chose after the war to deceive itself. In "Learning to Eat Soup With a Knife," John Nagl argued that instead of learning from defeat, the Army after Vietnam focused its energies on the kind of wars it knew how to win — high-technology conventional wars. An essential contribution to this strategy of denial was the publication of "On Strategy: A Critical Analysis of the Vietnam War," by Col. Harry Summers. Summers, a faculty member of the U.S. Army War College, argued that the Army had erred by not focusing enough on conventional warfare in Vietnam, a lesson the Army was happy to hear. Despite having been recently defeated by an insurgency, the Army slashed training and resources devoted to counterinsurgency.

By the early 1990s, the Army's focus on conventional war-fighting appeared to have been vindicated. During the 1980s, the U.S. military benefited from the largest peacetime military buildup in the nation's history. High-technology equipment dramatically increased the mobility and lethality of our ground forces. The Army's National Training Center honed the Army's conventional war-fighting skills to a razor's edge. The fall of the Berlin Wall in 1989 signaled the demise of the Soviet Union and the futility of direct confrontation with the U.S. Despite the fact the U.S. supported insurgencies in Afghanistan, Nicaragua and Angola to hasten the Soviet Union's demise, the U.S. military gave little thought to counterinsurgency throughout the 1990s. America's generals assumed without much reflection that the wars of the future would look much like the wars of the past — state-on-state conflicts against conventional forces. America's swift defeat of the Iraqi Army, the world's fourth-largest, in 1991 seemed to confirm the wisdom of the U.S. military's post-Vietnam reforms. But the military learned the wrong lessons from Operation Desert Storm. It continued to prepare for the last war, while its future enemies prepared for a new kind of war.

310
Politics & Religion / The US Congress; Congressional races
« on: May 05, 2007, 03:45:16 AM »
The following article helped me realize we need a thread specifically for the doings/shenanigans of our elected representatives

==============================================

http://www.military.com/NewsContent/0,13319,133428,00.html?ESRC=dod.nl

Air Force Might Cut Pay for Surge
Military.com  |  By Christian Lowe  |  April 25, 2007
The Air Force’s top officer said Wednesday that if nearly $1 billion in personnel funds taken from the service to pay for combat in Iraq and Afghanistan isn’t restored by the end of the summer, Airmen and civilian employees might not get their pay.

Due to a congressional delay in approving a wartime supplemental funding bill this year, the Pentagon pulled about $880 million from the Air Force’s personnel accounts to make up for a shortfall it warned lawmakers would come in mid-April.

Air Force Chief of Staff Gen. Michael “Buzz” Moseley said at a breakfast meeting with reporters today that the money is coming out of the military personnel account earmarked for the last four months of the year.

“Somebody’s going to have to pay us back,” Moseley said. “You have to pay people every day when they come to work.”

“A: it’s the right thing to do, and B: it’s kind of the law,” he added.

The shortfall could delay permanent change of station moves, temporary duty expenses and other pays that “take care of people,” he said.

On April 15, the Army announced it would have to cut training, depot repair, and maintenance of non war-related gear because funding for the surge in Iraq, combat operations in Afghanistan and other Global War on Terrorism costs was running dry.

The Army also requested that about $1.6 billion be diverted from the Air Force and Navy personnel accounts to help put the wartime funding tab in the black.

With Congress locked in a political battle with the Bush administration over withdrawal deadlines and troop rotation schedules, the $100 billion wartime spending bill to pay for operations through the end of the fiscal year has yet to be signed into law.

Though both the Senate and House have submitted the supplemental bill to the floor for a vote this week, President Bush has vowed a veto over withdrawal deadlines inserted into the law.

Chairman of the Joint Chiefs of Staff Gen. Peter Pace has said if the wartime funds aren’t in place by mid-May, even more drastic cuts will have to be made, including reductions in training for forces on their way to Iraq, which will force the Pentagon to extend the deployments of units already there.

“The comptroller now has a check that they’re going to have to give us back to pay for [personnel] as we get closer to the end of the summer,” Moseley explained, putting the screws to Pentagon and administration budgeteers to recoup the loss.

“I don’t want to have any concerns about getting that money back,” he said. “It would be a breach of faith to take mil-pers money out of a service and then fast forward a couple of quarters and then just say ‘eat it.’”

Moseley said he’ll resist providing Airmen to man jobs the Army and Marine Corps can’t fill due to high operational tempo and increased demand, insisting his service is “drawing some red lines” to deny ground commanders’ requests.

About 20,000 Air Force personnel have filled shortfalls in the ground services’ manning – dubbed “in lieu of taskings” – including convoy and base security operations and even detainee handling jobs. As early as 2005, Air Force security personnel began augmenting Army detainee-handling troops at Camp Bucca prison near Baghdad and have continued to man prison jobs in Iraq.

“We don’t guard prisoners, we don’t even have a prison,” Moseley said. “To take out people and train them to be a detainee-guarding entity requires time away from their normal job.”

Some U.S.-based Air Force commands have as many as 25 percent of their personnel deployed to Iraq and are still executing their home station duties. For example, the San Angelo, Texas-based 17th Training Wing has its crash, fire, and rescue teams and security force units deployed “and we’re still operating the wing,” Moseley said.  

Moseley said he’s happy to provide personnel with job skills the Air Force has in abundance, including drivers and information technology specialists. But “I am less supportive of things outside of our competencies,” he said.

“We’ve drawn some red lines on some of the ‘in lieu of’ taskings to get away from the tasking of our folks that is incredibly outside the competencies.”



311
Science, Culture, & Humanities / FDA vs. medical freedom
« on: May 04, 2007, 06:52:43 AM »
Drug Czars
By STEVEN WALKER
May 4, 2007; Page A15

The Food and Drug Administration recently argued in the D.C. Court of Appeals that it has the power to ban meat and vegetables without violating anyone's fundamental rights. The agency chose this bizarre position in an attempt to counter arguments made by patients and their advocates in Abigail Alliance v. von Eschenbach. This groundbreaking case challenges the agency's refusal to grant access to investigational drugs, even as a last resort for terminally ill patients.

Last year, a three-judge panel decided that the FDA is violating the due-process rights of terminally ill patients by denying them access to promising investigational drugs. In response, the FDA moved for a rehearing by the full court, hoping to prevent a lower court-supervised examination of whether its draconian policies actually serve a narrowly tailored compelling governmental interest. In layman's terms, this means the FDA would have to show its policies toward terminal patients are so critical to the well-being of society that they supersede (in broad and highly imperfect fashion) the fundamental right of an individual to pursue life free of undue government interference. The FDA knows its policies will not survive this test and doesn't want the question asked.

Consider the FDA's handling of Genasense, a new drug for melanoma and chronic lymphocytic leukemia (CLL), two often terminal forms of cancer. The drug is being developed by Genta, a small, innovative company with only one approved drug and limited financial resources. Despite compelling evidence that Genasense is making progress in fighting both diseases, the FDA appears determined to kill the drug.

 
In the case of the melanoma application, instead of reviewing the clinical-trial data in accordance with usual methods (which showed positive results), the FDA chose a nonstandard statistical approach aimed at discrediting the results. The agency used this analysis in its briefing to its advisory committee, claiming that the drug might not be effective. The committee then relied on that information to vote against approval.

Now, Genta has found a serious mathematical error in the FDA's analysis, rendering its results meaningless. Genta is filing a complaint under the Federal Data Quality Act to correct the record. But in the meantime, the drug remains unapproved and melanoma patients continue to wait.

Genasense was also shown in a well-run, randomized clinical trial (the FDA's gold standard) to cause a complete disappearance of disease in 17% of patients with advanced CLL when combined with two older drugs. Just 7% of patients in a control group who received only the older drugs experienced similar benefit. The responders to Genasense have seen their relief last an average of 36 months, while those using other drugs saw their cancer return, on average, in 22 months.

Following these results, the Director of the FDA's cancer division, Dr. Richard Pazdur, again convened a public meeting of his advisory committee. After an agency presentation designed to elicit a negative outcome, the panel voted 7 to 3 against approval, triggering an immediate reaction of surprise and dismay among many CLL experts.

But the committee vote is less surprising if one knows that the FDA appointed several voting consultants to the committee (none of them CLL experts), and recused from the meeting the only sitting member of the committee who is an expert in CLL. Perhaps even more troubling, two of the voting committee members worked behind the scenes as undisclosed consultants for the FDA on Genasense, then voted in the open meeting without disclosing that relationship.

A shocked Genta quickly requested a meeting with the FDA to seek clarity on the agency's position, and to present additional information from patient follow-up. On the referral of an eminent leukemia expert, Genta asked if we would attend the meeting as witnesses in our capacity as patient advocates. No compensation was offered, requested or received.

Most of the meeting was consumed by getting the FDA to admit the obvious: The long-lasting, complete disappearance of CLL and its symptoms constituted "clinical benefit." Making these arguments were two cancer-medicine professors at M.D. Anderson Cancer Center, the recused ODAC member and an immediate past president of the American Society of Hematology -- all experts in CLL. None were employees of Genta and collectively represented a far more qualified advisory committee than the one that the FDA had convened.

The FDA's inane answer to the CLL experts was that the long-lasting disappearance of disease in patients taking Genasense was a "theoretical construct" and not grounds for approval.

The experts explained to the FDA that complete responses in advanced CLL patients are the medical equivalent of the Holy Grail. The FDA finally agreed, but was unimpressed with emerging data showing responders to Genasense living longer than responders in the control group.

The experts were unanimous in advising that Genasense should be approved, but the FDA was unmoved. The agency's Dr. Pazdur suggested that Genta could make the drug available as an unapproved treatment through an expanded access program -- this from a regulator fond of stating that the best way to get a drug to patients in need is through approval! In this case the agency was saying to Genta: We are not going to approve your drug, but any patient who needs it can have it so long as you give it away.

Genta responded that nonapproval would be a denial of patient access to Genasense because they could not afford to give it away in an expanded access program. Twice, Dr. Pazdur referred to that logic as a "business decision."

Less than 48 hours later, the FDA rejected Genasense. Within days Genta made a "business decision," laying off a third of its staff in a cost-cutting move aimed at keeping the doors open long enough to appeal the FDA's decision. The appeal was filed in early April. Genta's announcement of the filing included a statement from one of the expert physicians: "It is puzzling that they would deny approval to a drug that met its primary and key secondary endpoint, especially since these findings were observed in the only randomized controlled trial that has ever been conducted in patients with relapsed CLL."

The FDA's handling of Genasense lays bare the all too common, aggressive incompetence of the FDA's cancer-drug division and should lead to an immediate examination of its policies and leadership, followed by swift corrective action.

As for the FDA's belief that their power to control us and even deny us the pursuit of life itself is unlimited under the Constitution, we can only hope the appeals court disagrees. An agency that blocks progress against deadly diseases -- while arguing that its power to do so is above challenge -- is in dire need of a court supervised review.

Mr. Walker is co-founder and chief adviser for the Abigail Alliance for Better Access to Developmental Drugs. He receives no compensation for his work as an advocate, nor has he ever received compensation from any private or public-sector entity involved in drug development, approval or marketing.

312

http://money.cnn.com/magazines/fortune/fortune_archive/2007/05/14/100008848/index.htm?source=yahoo_quote

 
The smartest (or the nuttiest) futurist on Earth
Ray Kurzweil is a legendary inventor with a history of mind-blowing ideas. Now he's onto something even bigger. If he's right, the future will be a lot weirder and brighter than you think.
By Brian O'Keefe, Fortune senior editor
May 2 2007: 11:08 AM EDT


(Fortune Magazine) -- If you went around saying that in a couple of decades we'll have cell-sized, brain-enhancing robots circulating through our bloodstream or that we'll be able to upload a person's consciousness into a computer, people would probably question your sanity. But if you say things like that and you're Ray Kurzweil, you get invited to dinner at Bill Gates' house - twice - so he can pick your brain for insights on the future of technology. The Microsoft chairman calls him a "visionary thinker and futurist."

Kurzweil is an inventor whose work in artificial intelligence has dazzled technological sophisticates for four decades. He invented the flatbed scanner, the first true electric piano, and large-vocabulary speech-recognition software; he's launched ten companies and sold five, and has written five books; he has a BS in computer science from MIT and 13 honorary doctorates (but no real one); he's been inducted into the Inventor's Hall of Fame and charges $25,000 every time he gives a speech - 40 times last year.

 
[Photo: Still life with innovator: Kurzweil at his home near Boston.]

[Photo: Everybody loves Raymond: Kurzweil is a major tech conference draw, commanding $25,000 a speech.]

[Pull quote: "The power of technology will keep growing exponentially. By 2050, you'll be able to buy a device with the computational capacity of all mankind for the price of a nice refrigerator."]

[Photo: A ROM machine: a $14,600 contraption that provides a workout so intense that it takes just four minutes.]

And now, if anything, he's gaining momentum as a cultural force: He has not one but two movies in the works - one a documentary about his career and ideas and the other an adaptation of his recent bestseller, The Singularity Is Near, which he's writing and co-producing (he's talking about a distribution deal with the people who brought you "The Day After Tomorrow").

When Kurzweil isn't giving keynote addresses or reading obscure peer-review journals, he's raising money for his new hedge fund, FatKat (Financial Accelerating Transactions from Kurzweil Adaptive Technologies). He's already attracted a roster of blue-ribbon investors that includes venture capitalist Vinod Khosla, former Microsoft CFO Mike Brown, and former Flextronics-CEO-turned-KKR-partner Michael Marks.

Being a hedge fund manager may seem an odd pursuit for an expert in AI, but to Kurzweil it's perfectly natural. The magic that has enabled all his innovations has been the science of pattern recognition - and what is the financial market, he postulates, but a series of patterns?

Kurzweil, however, has something bigger on his mind than just making money - after half a lifetime studying trends in technological change, he believes he's found a pattern that allows him to see into the future with a high degree of accuracy.

The secret is something he calls the Law of Accelerating Returns, and the basic idea is that the power of technology is expanding at an exponential rate. Mankind is on the cusp of a radically accelerating era of change unlike anything we have ever seen, he says, and almost more extreme than we can imagine.

What does that mean? By the time a child born today graduates from college, Kurzweil believes, poverty, disease, and reliance on fossil fuels should be a thing of the past. Speaking of which, don't get him started on global-warming hype.

"These slides that Gore puts up are ludicrous," says the man who once delivered a tech conference presentation as a singing computer avatar named Ramona. (That stunt was the inspiration for the 2002 Al Pacino movie "Simone.") "They don't account for anything like the technological progress we're going to experience."

He has plenty more ideas that may seem Woody Allen - wacky in a "Sleeper" kind of way (virtual sex as good as or better than the real thing) and occasionally downright disturbing à la "2001: A Space Odyssey" (computers will achieve consciousness in about 20 years). But a number of his predictions have had a funny way of coming true.

Back in the 1980s he predicted that a computer would beat the world chess champion in 1998 (it happened in 1997) and that some kind of worldwide computer network would arise and facilitate communication and entertainment (still happening). His current vision goes way, way past the web, of course. But at least give the guy a hearing. "We are the species that goes beyond our potential," he says. "Merging with our technology is the next stage in our evolution."

In mid-April, Kurzweil traveled to the Island hotel in Newport Beach, Calif., as one of the featured speakers at a two-day World Innovation Forum. The roster of luminaries included Harvard Business School professor Clayton Christensen and Vint Cerf, one of the fathers of the Internet, now at Google (Charts, Fortune 500). But Kurzweil was the only one followed around by a team of documentary-film makers.

He took the stage wearing a brown houndstooth sports coat and navy checked tie and began toggling through his PowerPoint slides. He's about 5-foot-7, and in regular conversation he tends to speak in a monotone. But he comes alive onstage, mixing in reliable one-liners with his bigger point: Don't underestimate the power of technological change. "Information technologies are doubling in power every year right now," he tells the crowd of 400 or so attendees. "Doubling every year is multiplying by 1,000 in ten years. It's remarkable how scientists miss this basic trend."

Kurzweil's crusade, if you will, is to get across that most of us (even scientists) fail to see the world changing exponentially because we are "stuck in the intuitive linear view." To hammer home his point, Kurzweil packs his presentations with charts that show, for instance, supercomputer power doubling consistently over time.

He explains that Moore's Law - the number of transistors on a chip will double every two years - is but one excellent example of the Law of Accelerating Returns. One of Kurzweil's favorite illustrations of exponential growth is the Human Genome Project. "It was scheduled to be a 15-year project," he says. "After seven years only 1% of it was done, and the critics said it would be impossible. But if you double from 1% every year over seven years, you get 100%. It was right on schedule."
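The doubling arithmetic behind these claims is easy to check for yourself. Here is a minimal Python sketch of the two figures quoted above; the `doublings` helper is purely illustrative, not anything from Kurzweil or Fortune:

```python
def doublings(start, years):
    """Value after the given number of annual doublings from `start`."""
    return start * 2 ** years

# "Doubling every year is multiplying by 1,000 in ten years":
# 2^10 = 1024, i.e. roughly a thousandfold.
print(doublings(1, 10))  # 1024

# Human Genome Project: 1% complete after seven years, then seven more
# annual doublings overshoot 100% (0.01 * 2^7 = 1.28, i.e. 128%).
print(doublings(0.01, 7))
```

Note that the "right on schedule" claim rests on doubling continuing for the back half of the project; the arithmetic only shows that seven more doublings from 1% would more than suffice.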

He believes humanity is near that 1% moment in technological growth. By 2027, he predicts, computers will surpass humans in intelligence; by 2045 or so, we will reach the Singularity, a moment when technology is advancing so rapidly that "strictly biological" humans will be unable to comprehend it.

Everything will be subject to his Law of Accelerating Returns, Kurzweil says, because "everything is ultimately becoming information technology." As we are able to reverse-engineer and decode our own DNA, for instance, medical technology can be converted to bits and bytes and zoom along at the same fantastic rate. That will enable overlapping revolutions in genetics, nanotechnology, and robotics. Which is how you end up with nanobots living in your brain.

Kurzweil, 59, declared his career as an inventor at age 5. He grew up in Queens, New York, one of two children (he has a younger sister named Enid) of Fredric and Hannah Kurzweil, Viennese Jews who fled the Nazis in 1938. His parents encouraged their son's ambition. "Ideas were the religion of our household," he says. "They saw science and technology as the way of the future and a way to make money and not struggle the way they did." Fredric, a composer and conductor, died of heart disease at 58, an event that would have a lasting impact on his son.

Kurzweil discovered computers at age 12, and quickly demonstrated an amazing facility with technology. At 14 he wangled a job as the computer programmer at the research department of Head Start, the federal government's early-childhood-development program. While there he wrote software that was later distributed by IBM (Charts, Fortune 500) with its mainframes.

At 17 he won an international science contest by building a computer that analyzed the works of Chopin and Beethoven to compose music; that trick landed him on the TV show "I've Got a Secret," hosted by Steve Allen. At MIT he started a company that used a computer to crunch numbers and match high school students with the best college choice; he sold it for $100,000 plus royalties.

After graduating from MIT, he founded Kurzweil Computer Products in 1974, and his initial breakthrough came later that year when he created the first optical-character-recognition program capable of reading any font. After he happened to sit next to a blind man on a plane, he decided to apply the technology to building a reading machine for the sight-impaired. To make it work he invented the flatbed scanner and the text-to-speech synthesizer, and introduced a reader in 1976.

When his first reader customer - Stevie Wonder - later complained about the limitations of electronic keyboards, Kurzweil used pattern-recognition science to invent the first keyboard that could realistically reproduce the sound of pianos and other orchestra instruments. Thus was born Kurzweil Music Systems. (When his name is recognized today, it's still often as "that keyboard guy.")

Kurzweil never left the Boston area after college. He and his wife, Sonya, live in a suburb about 20 minutes west of the city in a house they bought 25 years ago. Both of his children are grown and out of the house - Ethan, 28, is at Harvard Business School and Amy, 20, is at Stanford - so it's just the two of them and 300 or so cat figurines. (Kurzweil says he likes the way cats always seem to be "calmly thinking through their options.")

Kurzweil won't say how much he's worth, but he's never had the kind of payday that made so many of his peers centimillionaires or better. He sold Kurzweil Computer Products to Xerox (Charts, Fortune 500) in 1980 for $6.25 million. Kurzweil Music Systems was in bankruptcy when Korean piano maker Young Chang bought it in 1990 for $12 million.

Kurzweil Applied Intelligence introduced a series of speech-recognition products and went public in 1993, but was tarnished by an accounting-fraud scandal in 1995. Kurzweil, who was co-CEO, was not implicated. "I was focusing on the technology," he says. "There was this small conspiracy, which was deeply shocking." KAI was sold in 1997 for $53 million.

If Kurzweil hasn't made the big score, he's done well enough to keep funding his new ventures. Former Microsoft (Charts, Fortune 500) CFO Brown has invested in a few of Kurzweil's businesses and says he's impressed. "There's a certain smart kind of person who can get all the way from the big picture down to the little kernel and back," he says. "He's extremely adaptive that way. His businesses in my experience have always been well run and successful. He's grown them until they get to be a certain size and typically sold them to somebody who has a bigger distribution network."

These days Kurzweil organizes his business interests - including FatKat and Ray & Terry's Longevity Products, which sells supplements - under the umbrella of Kurzweil Technologies. The company takes up all of one floor and half of another in a nondescript office-park building in Wellesley Hills, Mass. In the reception area on the second floor is an antique Ediphone, one of Thomas Edison's dictation machines.

On a table filled with plaques noting Kurzweil's achievements is a photo of him receiving the National Medal of Technology from President Clinton. There's a pipe-smoking mannequin with a ribbon that reads I AM AN INVENTOR on its chest. In the basement is a supercomputer processing millions of bits of market-related data.

Kurzweil is hoping that FatKat will prove to be as spectacular an achievement as his early inventions, only a lot more lucrative. When describing his approach, he refers to the success of fellow MIT board member and hedge fund manager James Simons of Renaissance Technologies, whose $6 billion fund Medallion has averaged 36% returns annually after fees since 1988 and who, according to the hedge fund trade magazine Alpha, was the highest-paid hedgie last year, with a take-home of $1.7 billion.

Kurzweil says he is applying Simons-like quantitative analysis to take advantage of market inefficiencies. And he's confident that, just as he trained computers to recognize patterns in human speech or the sound of a violin, he can do the same with currency fluctuations and stock-ownership trends. The ultimate goal is to create the first fully artificially intelligent quant fund - a black box that can learn to monitor itself and adjust. Although he started the company back in 1999, the fund has only been trading for about a year.

How's he doing? Kurzweil won't say, citing SEC rules, nor will his investors. "I view Ray as one of the best pattern-recognition people in the world," says Khosla, when asked why he put money into FatKat. "I am a happy investor in Ray's company. A very happy investor."

As respected as Kurzweil is, to some of his peers his ideas have a persistent whiff of the too-good-to-be-true. One intellectual equal who takes exception to Kurzweil's views is Mitch Kapor, the co-founder and former CEO of Lotus Development. In 2002, Kapor made a much publicized $20,000 bet with Kurzweil that a computer would not be able to demonstrate consciousness at a human level by 2029.

But his quibbles with Kurzweil run much deeper than that debate. He rejects Kurzweil's theories about the implications of accelerating technology as pseudo-evangelistic bunk. "It's intelligent design for the IQ 140 people," he says. "This proposition that we're heading to this point at which everything is going to be just unimaginably different - it's fundamentally, in my view, driven by a religious impulse. And all of the frantic arm-waving can't obscure that fact for me, no matter what numbers he marshals in favor of it. He's very good at having a lot of curves that point up to the right."

Even technologists who take Kurzweil seriously don't necessarily echo his optimism. It was after a conversation with him that Bill Joy wrote an apocalyptic cover story for Wired magazine in 2000 about nanotechnology run amok.

Kurzweil, who's always careful to acknowledge the possibility that everything could go haywire, says his outlook is about math, not religion. And he's not planning to go anywhere until he bears witness to humankind's ultimate destiny, even if it takes him forever.

Note that by "forever" we mean "forever": The man literally intends not to die. With an acute memory of his father's early death, he's been getting weekly blood tests and intravenous treatments. He also takes pills - lots of pills, more than 200 vitamins, antioxidants, and other supplements every day. It's all part of his effort to "reprogram" his body chemistry and stop growing old. "I've slowed down aging to a crawl," he claims. "By most measures my biological age is about 40, and I have some hormone and nutrient levels of a person in his 30s."

Tuesday night in Newport Beach, after his talk at the Innovation Forum, Kurzweil is having dinner at an upscale seafood restaurant with one of his true believers, Peter Diamandis. The 45-year-old Diamandis is best known as the creator of the X Prize, a $10 million bounty for the first privately built, manned rocket launched into space. (Microsoft co-founder Paul Allen's team won in 2004.)

He's developing a new X Prize for a 100-mile-a-gallon car, and considering others in cancer research and, with Kurzweil's help, AI. Diamandis says he buys completely into Kurzweil's Law of Accelerating Returns and everything that it implies. "The Singularity, for anyone who stops and thinks about it, is completely obvious," he says.

Diamandis, who has an MD, has also been profoundly affected by Kurzweil's 2004 book Fantastic Voyage: Live Long Enough to Live Forever and has adopted Kurzweil's dietary guidelines. Diamandis pulls out a plastic bag of supplement pills and explains he's up to about 30 a day. Kurzweil reaches into his jacket for some of his own supplements. "His pills are bigger than my pills!" says Diamandis.

Then, more seriously, he asks Kurzweil if he ever gets nosebleeds from the supplement regimen. Kurzweil doesn't. "I think it might be the memory pills," says Diamandis. The conversation morphs into a debate on why earthlings have been unable to detect extraterrestrial civilizations, because with the billions of star systems out there, surely the Law of Accelerating Returns must have taken root somewhere...

It's easy to ridicule a scene like this, and perhaps people will when the movie comes out. (The documentary crew was there.) It's currently unfashionable to be so positive in one's open-mindedness. But remember, Kurzweil has been right before. And frankly, he's delighted we haven't heard from anyone else in the universe yet - it just means we're further up the technology curve than the aliens. "I think it's exciting that we're in the lead," he says, fiddling with his half-eaten ahi tuna. "There's a lot ahead of us."

Reporter associates Doris Burke and Telis Demos contributed to this article.

From the May 14, 2007 issue

313
Science, Culture, & Humanities / Remember the Alamo
« on: April 30, 2007, 09:54:12 PM »
Internet friend Ed Rothstein writes for the NY Times.  As always I am impressed by the depth and breadth of his writing.  Here is his most recent column:
==========

Remembering the Alamo Is Easier When You Know Its Many-Sided History
By EDWARD ROTHSTEIN

Published: April 30, 2007

SAN ANTONIO — With apologies to all Texians — as they were once called — before visiting San Antonio, I really didn’t remember the Alamo. I retained a vague impression from youth in which heroism, independence and Davy Crockett were major elements, and Mexicans were the bad guys, but that was about it. It was like a childhood fairy tale, barely recalled.

[Photo: Tourists talk with an Alamo Ranger outside the mission. Michael Stravato for The New York Times]

That’s fine for myths: they are not really meant to survive with photographic realism. That is one way they have such a broad effect on the mind and culture, creating impressions, molding perceptions, shaping expectations. That’s also why every demythologizing movement has an element of aggressive triumph over myth’s power, as if a mesmeric trance were being overturned.

But when it comes to the Alamo — particularly here in this Texas city where this old Spanish mission turned fort attracts nearly three million visitors a year — the history and its mythical meanings have been wrestled over almost as much as the blood-soaked terrain was in preceding centuries. “Remember the Alamo!” was the old battle cry; in recent decades the fight was over just what was being remembered.

Even now, the Alamo is often looked at by local Latinos as a relic of Anglo imperialism, with Mexico losing Texas in a land grab. For its advocates, though, the Alamo reflects a stubborn Texan drive for independence won from Mexico in 1836, just as that nation was losing its way in the mire of coups and tyranny. In this view, the Alamo is a tragic counterpart to Lexington and Concord, leading to the Republic of Texas — and ultimately bringing the entire Southwest into the orbit of the United States.

This also puts the Alamo at the center of a larger drama in which American history itself is the contested arena, a drama now shaping how American museums present the past. Dare we celebrate our past if it turns out, when seen in the harsh light of American middle age, that it was not as golden as we once imagined? (Jamestown, Va., in commemorating its forthcoming 400th anniversary, apparently thinks not.) But dare we mourn our past if it turns out that things were not as bad as they were elsewhere and held the promise of something far better? The Alamo’s current incarnation — its central exhibition was mounted by its curator, Richard Bruce Winders, in 2005 — may provide some perspective on the opposing traps of sanitized idealism and cynical self-disgust.

The mythic power of the place is plain in the bare outlines of Texas history. Before the 1820s, Texas had been a lightly populated province of Spanish-owned Mexico. Incentives and cheap land lured many settlers from the United States. Then, once Mexico won independence from Spain in 1821, creating a republic based on the federal system of the United States, the Texan region was joined with its stronger neighbor, Coahuila, to form a single Mexican state.

But the Mexican government proved less than stable, with 13 presidents in 15 years. Mexico increased tariffs and Texans began to feel poorly represented. Finally, in 1835, Gen. Antonio López de Santa Anna suspended the constitution, declared himself president and made it clear that Texan yearnings deserved no more consideration than those of the Zacatecas rebels, who were first subdued and then massacred.

Some Texans sought accommodation, some hoped for Mexican statehood, but after General Santa Anna’s maneuvers, many sought independence. As the general’s army marched to Texas to crush resistance, fewer than 200 rebels armed themselves in the crumbling fort of the Alamo, where Mexicans had, not long before, suffered a temporary defeat. This time the Texans also happened to have at least two legends of the American West with them: Jim Bowie and Crockett.

The rebels hoped in vain for reinforcements as several thousand Mexican soldiers surrounded the Alamo. The Texans’ certain defeat took 13 days — with only a few scattered survivors — but their fight was so fierce that the Mexican army was significantly weakened. Within weeks, that army was beaten by Sam Houston leading the Texans. General Santa Anna was captured, and in ransoming his freedom, he granted Texas its independence.

That is the background to the heroic tale told in John Wayne’s 1960 film “The Alamo.” Similar valor is displayed in the 1988 Imax film “Alamo: The Price of Freedom,” shown continuously about a block from the fort. The focus on heroism has always been prevalent at the fort as well, which displays a lock of Crockett’s hair, along with Houston’s sword.

But after the 1970s, as James E. Crisp recounts in his fascinating 2004 book, “Sleuthing the Alamo,” “a new and radicalized generation of historians saw the origins of the conflict in the prejudices of Anglo-American bigots.” Race, for some historians, became the central issue in the revolution. Texan immigrants from the Southern United States relied on slavery, which was forbidden in Mexico, creating a major incentive for Texas independence and the application of a selective idea of liberty. Even Bowie was not just a war hero, expert with a hunting knife: he had made his fortune as a slave trader and shady land speculator. And D. W. Griffith’s 1915 silent film “Martyrs of the Alamo” may have gotten at more of the historical truth in Griffith’s racial condescension to the Mexicans than in his depiction of the battles fought.

As Mr. Crisp writes, “We should never allow even the most revered of our society’s ‘sacred narratives’ to be accepted as simple truths, nor to be mistaken for legitimate history.”

But these characterizations are simple truths: a set of opposing mythologies with their own assertions of moral superiority and injured outrage. As Mr. Crisp points out, more complicated truths require rejecting race as the primary issue in the Texan revolution. He also suggests that as far as Mexicans were concerned, real discrimination fully came into its own only at the beginning of the 20th century, when, among other indignities, Texas public schools segregated “white” pupils from citizens of Mexican descent. At the time of the revolution, relations between the groups were far different.

They were also never simple. Some wariness of outsiders came from the Mexicans themselves. As H. W. Brands’s history of Texas, “Lone Star Nation,” shows, one reason for Mexican nervousness about Texas’s future as a Mexican province was the crucial cultural differences between the North American colonists and the Mexican colonists. In an 1828 survey of the region for the Mexican government, Manuel de Mier y Terán outlined important distinctions in attitudes toward individualism that he believed would make it increasingly difficult for Mexico to control North American Texans.

Within the Alamo, Mr. Winders’s intelligent exhibition now treats those 13 days of battle as part of an extended civil war in Mexico over the ideas of liberty and federalism. But establishing a context for understanding history beyond the myth doesn’t diminish the myth’s power or its importance. “What is a shrine?” his exhibition asks. “A shrine is a place hallowed by its associations.”

And the Alamo is such a shrine, one that, for all the flaws and eccentricities of its inhabitants and its era, left a heroic mark on the sluggish human trudge toward liberty. It still commands remembrance.

Connections, a critic’s perspective on arts and ideas, appears every other Monday.

314
From today's Political Journal of the WSJ

-- John Fund
A Peach of a Tax Plan

Just maybe, the model for a fundamental tax overhaul nation-wide has percolated up in the State of Georgia. On Wednesday, Glenn Richardson, speaker of Georgia's House of Representatives, filed a bill that would junk the state's existing tax code and replace it with a much simpler one.

Under the plan, all state and local property taxes would be eliminated. So would the estate tax, unemployment insurance and worker's compensation taxes, business and occupational fees, intangible taxes and insurance taxes. The entire structure would be replaced with a flat rate income tax of 5.75% and a flat 5.75% sales tax. The state's income tax is currently 6% and the sales tax is 4.5%.
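The rate changes above are easy to sanity-check with a little arithmetic. Here is a minimal sketch comparing the income-plus-sales-tax bill under the current 6%/4.5% rates and the proposed flat 5.75%/5.75% rates; the income and spending figures are illustrative assumptions, not from the article, and the sketch deliberately ignores the property and other taxes the plan would abolish:

```python
def tax_burden(income, taxable_spending, income_rate, sales_rate):
    """Total income tax plus sales tax at the given flat rates."""
    return income * income_rate + taxable_spending * sales_rate

# Hypothetical taxpayer (assumed figures, for illustration only)
income = 50_000       # annual income
spending = 20_000     # spending subject to sales tax

# Current Georgia rates quoted in the article: 6% income, 4.5% sales
current = tax_burden(income, spending, 0.06, 0.045)      # 3,000 + 900

# Proposed flat rates: 5.75% income, 5.75% sales
proposed = tax_burden(income, spending, 0.0575, 0.0575)  # 2,875 + 1,150

print(f"current: ${current:,.0f}, proposed: ${proposed:,.0f}")
```

On these assumed figures the income-and-sales portion alone rises slightly under the plan, which illustrates why its appeal rests on eliminating property taxes and the other levies rather than on the headline rates.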

The architect of the plan is the famous Reagan economist Arthur Laffer. "This would bring the focus of the entire country on Georgia," Mr. Laffer said in an interview. "States compete; they're like puppies bouncing around in a box at a pet store to get noticed. This is a way for Georgia to get noticed and set itself apart from all the rest of the states when it tries to sell itself to businesses and families."

House Speaker Richardson has been an ardent champion of tax reform in Georgia, which has become one of the reddest states in the nation. Georgia has a Republican legislature and, in Sonny Perdue, a Republican governor. "We must change the burdensome and antiquated tax system we currently have," Mr. Richardson says. He concedes that many business groups are likely to oppose the plan because it eliminates all the special favors, handouts and loopholes in the current Georgia code.

This plan would have to be approved by both houses of the legislature and then placed on the November 2008 ballot to be approved by voters. Mr. Laffer says the economic and jobs impact would be significantly positive because it increases "after-tax incentives to work, invest, produce and live in Georgia." Mr. Richardson adds: "I believe the House tax reform plan will be the talk of the nation." Who knows, the flat tax may finally get legs across America -- maybe even in Washington.

-- Stephen Moore

315
Politics & Religion / The Case for Grenades
« on: April 30, 2007, 01:16:16 PM »
Woof All:

With the author's blessing, I share the following piece here-- including its catchy title!

TAC,
Marc
=============================

Another thread reminded me of a question that I have long pondered and been unable to answer. I view the National Firearms Act as clearly unconstitutional and a partial abridgment of the right to bear arms. It reduces the right to bear arms to the right to bear some arms. But the distinction between permitted and prohibited arms is seemingly arbitrary and without a connection to any real objective standard. How does a 14”-barreled shotgun differ from an 18”-barreled shotgun? If manufacturing a firearm that is compact and maneuverable is unlawful, then why can we still have handguns? If the power factor is the difference, then why can we have bullpups?

And yet there is something within the NFA that somehow seems reasonable to me. The Second Amendment was written as an absolute right, whereas the Fourth Amendment was written as a restriction against unreasonable searches and seizures. But—perhaps as the result of a faulty assumption within my worldview—I still wish to accept reasonable limits on the right to bear arms. Before ya’ll start looking for a length of rope, allow me to give some examples. I believe that it is reasonable to prohibit the civilian ownership of biological weapons such as ricin or anthrax. I believe it is reasonable to prohibit the civilian ownership of nuclear missiles. And yet these are arms, and we have a right to arms. If you view defense against a tyrannical government as a reason for possessing arms, then support for a right to own RPGs and nuclear submarines may be supportable. If you focus exclusively upon defense against street criminals, then the need for such weaponry falters.

Personally, I fear my government much more than street thugs. I have heard several liberal gun-grabbers bolster their argument for banning small arms by asserting the “futility of defense” theory, i.e. that our government is now so powerful that we can’t reasonably expect to defend ourselves with small arms anyway, so there is no real defense-against-tyranny support for the RKBA. I cannot completely dismiss this argument. But, rather than accepting their desire to confiscate everything, it just makes me think that we need more and bigger guns.

Somewhere between the pellet rifle and the Patriot missile, we must draw the line. What arms are reasonable? I first came to this question because I realized that I want hand grenades. Everyone scoffs at that. The most ardent defenders of the RKBA throw back their heads in laughter. It is viewed as an insane position. But why? What makes hand grenades different?

Discrimination. I have been told that it is because hand grenades are not discriminatory. That is incorrect—used properly, hand grenades are discriminatory. Soldiers don’t simply drop grenades everywhere they go. Grenades are chosen because the enemy is behind cover and otherwise unkillable. They are used only when they can be used to eliminate the enemy and simultaneously not present a threat to friendly troops. Used improperly, rifles are not discriminatory.

Training. I have been told that it is because it takes too much training to use a grenade. That is absurd. You first make sure that you are in a safe position from which to throw a grenade; that you have good cover so you don’t catch any of the shrapnel. Then you pull the pin and throw. That’s it. You have to know how far you can throw a grenade. How do you figure that out? Simple—grab a dummy and practice. Training is a red herring; just as much so for grenades as for CCW.

Danger. I have been told that grenades are just too dangerous. So what? Sharp sticks are dangerous. Guns are dangerous. A negligent discharge with a rifle can kill an innocent well over a mile away. A moron with a grenade presents virtually no danger to anyone one hundred yards away. And what is the danger in doing nothing? What is more dangerous: lobbing a grenade or allowing the bad guy to continue shooting at me from around a corner?

The entire population of the free world seems to agree that my position on this issue is crazy. So please help me. What is the difference? What are the guiding principles that distinguish a grenade from a rifle? Why is the grenade not a legitimate tool of defense against both criminals and tyrants?




Historical Perspectives

To help formulate a position on this issue, I have looked at a variety of stances on the RKBA that have predominated throughout the history of our nation. This is my own work, and I have created these categories based upon the inferences and presuppositions that can be read into the works of various statutes, court documents, judicial decisions and scholarly articles covering the subject. If you are unable to locate other authors who have treated the subject in similar fashion, it is because I may be the first to have done so.

I have divided the dominant schools on the RKBA into six categories: 1) Civilized Warfare, 2) the 19th Century, 3) Miller, 4) the Police Model, 5) the NRA Model (also called the Modern View), and 6) Halbrook. The Civilized Warfare model recognizes a RKBA for the purpose of calling forth an armed militia to wage war upon a battlefield. The 19th Century Model is complete laissez faire; the RKBA exists for the militia, for defense against bandits or tyrants, for sport, for pleasure or any other lawful purpose (although in practice it was denied to minority groups). The Miller model is based upon a strict reading of the judicial decision handed down in U.S. v. Miller. The Police Model regards the RKBA from the position of personal defense from non-governmental criminals. The NRA model is merely representative of the “popular” or “modern” perspective upon the RKBA, and is based solely upon the majority approval it receives and not upon any principled set of criteria. The Halbrook model is based upon the support for the RKBA that is presented by a significant set of constitutional scholars, including the notable Stephen Halbrook. The ideology that supports each model is reflected in the types of arms that are protected under each model.

I also divided the subject arms into various categories, some of which will seem rather benign to you and some of which may disturb your conscience. The nine categories of arms that I considered are: 1) crew-served machineguns, 2) single infantryman machineguns, 3) assault weapons, 4) hunting rifles, 5) full-size handguns, 6) small or inexpensive handguns, 7) sporting shotguns, 8) .22 rifles and pistols, 9) grenades, bombs, mines and artillery. This list is not comprehensive and my definitions are very broad. Some groups are representative of guns that would not ordinarily fall within the strict categorization. As an example, a short-barreled shotgun would be viewed in like fashion with an assault weapon by all models except the NRA model (which is the only model which reflects popular opinion rather than demonstrable criteria).

The simplest and most permissive model is the 19th Century view. All of the categories of arms are allowed in this model. Anything goes. Men are responsible for their actions, but there are no prior restraints upon the ownership of any firearm or weapon.

The Civilized Warfare view (similar to the theory put forward by Saul Cornell) provides a RKBA only for arms that have a place in war. Small, inexpensive handguns have no place on a modern battlefield, and therefore can be regulated or prohibited at will by the legislature. The same applies to sporting shotguns. No one would choose to go to war armed with a Browning Citori and a Beretta Jetfire, therefore the 2nd Amendment wasn’t intended to protect these arms.

(Note: Unprotected arms may still be allowed by the Congress. The fact that the 2nd Amendment does not provide a right to possess an arm does not preclude the Congress from choosing to extend a privilege to own such arms. Additionally, all of these models presume a positivist position of the law—the idea that the 2nd Amendment creates a RKBA rather than merely recognizing a pre-existing and inalienable right.)

The Miller model is the most surprising to those with an introductory knowledge of Second Amendment jurisprudence. The Miller case is probably the most often mis-cited and misrepresented case in the Supreme Court’s history. It is lauded as the case which affirmed the federal government’s authority to regulate firearms. But the holding of the case is much narrower than that which is normally cited. The majority decision, issued by Justice McReynolds, held only that the government was free to regulate those firearms which were “not part of any ordinary military equipment” and “could not contribute to the common defense.” U.S. v. Miller, 307 U.S. 174, 177 (1939). The firearm in question was a short-barreled shotgun. At the District Court, the judges took it under judicial notice that short-barreled shotguns had a military application. Those judges were veterans of the Great War, and had seen shotguns used to great effect in the trenches. The Supreme Court, however, was not composed of men of valor. Justice McReynolds unilluminatingly states that there was “an absence of any evidence” that such an arm had an application to the military. (Id.) What the record fails to disclose is that no attorney represented the defendants in oral argument or in brief to the Supreme Court! There was no evidence in support of the defense because only the State put on any evidence. This is damning enough in and of itself, but even in this kangaroo court Justice McReynolds specifically limited the authority of the federal government to regulate those arms which had no relation to the military. This means that under Miller, as under the Civilized Warfare model, the government would be free to tax, regulate or (possibly) prohibit engraved double guns and Olympic target pistols, but that the people retained the right to any guns which had a place in warfare.
This is still good precedent in the U.S., and logically should support ownership of M-16s, M-249s, M-2s, SAMs, grenades and any other item in the full complement of arms that our soldiers have to choose from. The Miller model may or may not protect the right to possess .22 rifles and pistols. Because these firearms are sometimes used to train troops, they may be protected.

The Police model embraces protection for those arms that can be effectively used to prevent crime. An individual infantryman’s machinegun would probably be protected (many agencies issue the M-16), but not a crew-served machinegun because it isn’t useful in a civilian police role (they are too cumbersome to be used for reactive defense). Unlike Miller or the Civilized Warfare models, the Police model recognizes a right to possess small or inexpensive handguns. But it still does not recognize a right to sporting shotguns or grenades/bombs/mines/artillery. The police model may require further refinement of the definitions, because it would actually support the ownership of flash-bang grenades, but would not protect fragmentation grenades. However, current federal law does not distinguish between the two.

The NRA model simply reflects the current assumptions. Machineguns—both individual and crew-served—are not protected. Grenades, et al, are not protected. Assault weapons must be further defined in this model. Guns that are merely cosmetically different from sporting rifles, but which have been modified so as to limit themselves to a reduced (semi-auto) rate of fire, are protected. But true military rifles which retain the ability to fire fully-automatic are not protected. Additionally, this model accepts the arbitrary barrel length determination of the NFA. A 16” barreled rifle is fine, while a 16” barreled shotgun is morally culpable. Unlike previous models, the NRA model or Modern view places great importance on the right to possess sporting arms which have no utility in war or defense. Firearms are lauded for their beauty and are protected for their ability to entertain us rather than to defend us.

The Halbrook model is closest to the NRA model, but reaches somewhat further in that it still looks to a military/defense use for arms, limiting this role though to an individual infantryman’s arms. The right to a machinegun is recognized, but not a crew-served machinegun. Additionally, unlike the NRA model, the protection of assault weapons would likely include short-barreled rifles and shotguns. Despite a connection being drawn to the individual infantryman, the Halbrook model still dismisses a right to grenades and explosives, despite the fact that these are standard kit for an infantryman.

There are currently no serious constitutional scholars supporting an inalienable right to bear arms that is inherent in man and which cannot be erased by legislative or judicial decree. Only a few of us Christian lunatics cling to this idea.

It may be that none of the models listed here are representative of your outlook. But you must be aware of your worldview and prior assumptions when you are addressing this question. Have you simply adopted the logically-inconsistent NRA view? If you base your support for the RKBA upon the Civilized Warfare or 19th Century model, then how can you not support a right to own grenades? If you believe that you have a right to own that high-polished, over-under quail gun, do you base that in a 19th Century view or the NRA view? In a country whose food problem is in having too much of it, can we distinguish the value of a sporting shotgun from that of a bicycle or racecar—aren’t they all just toys that we amuse ourselves with rather than tools which are necessary for sustenance or freedom?
__________________
Virtute et Armis,
J. Bradley

316
Science, Culture, & Humanities / Water
« on: April 28, 2007, 06:55:46 AM »
David Gordon http://eutrapelia.blogspot.com/ has brought to my attention the simple but important observation that water scarcity increasingly is going to be a real problem around the world.  Here's an article on point which he just sent me from The Economist



================
Australia's water shortage

The big dry

Apr 26th 2007 | MURRAY MOUTH, SOUTH AUSTRALIA
From The Economist print edition


Australia is struggling to cope with the consequences of a devastating drought. As the world warms up, other countries should pay heed

THE mouth of the Murray-Darling river sets an idyllic scene. Anglers in wide-brimmed sunhats wade waist-deep into the azure water. Pleasure boats cruise languidly around the sandbanks that dot the narrow channel leading to the Southern Ocean. Pensioners stroll along the beach. But over the cries of the seagulls and the rush of the waves, there is another sound: the mechanical drone from a dredging vessel. It never stops and must run around the clock to prevent the river mouth from silting up. Although the Murray-Darling is Australia's longest river system, draining a basin the size of France and Spain combined, it no longer carries enough water to carve its own path to the sea.

John Howard, Australia's prime minister, arrived here in February and urged the four states through which the Murray-Darling flows to hand their authority over the river to the federal government. After seven years of drought, and many more years of over-exploitation and pollution, he argued that the only hope of restoring the river to health lies in a complete overhaul of how it is managed. As the states weigh the merits of Mr Howard's scheme, the river is degenerating further. Every month hydrologists announce that its flow has fallen to a new record low (see chart). In April Mr Howard warned that farmers would not be allowed to irrigate their crops at all next year without unexpectedly heavy rain in the next few months. A region that accounts for 40% of Australia's agriculture, and 85% of its irrigation, is on the verge of ruin.

The drought knocked one percentage point off Australia's growth rate last year, by the government's reckoning. It is paying out A$2m ($1.7m) a day in drought-relief to farmers. If mature vines and fruit trees die in the coming months through the lack of water, the economic fallout will be more serious and lasting. Most alarming of all, the Murray-Darling's troubles are likely to worsen. As Australia's population continues to grow so does demand for water in the cities and for the crops that grow in the river basin. Meanwhile, global warming appears to be heating the basin up and drying it out. Although few scientists are confident that they can ascribe any individual event—including today's drought—to global warming, most agree that droughts like the present one will become more common.

Many of the world's rivers, including the Colorado in America, China's Yellow river and the Tagus, which flows through Spain and Portugal, are suffering a similar plight. As the world warms up, hundreds of millions of people will face the same ecological crisis as the residents of the Murray-Darling basin. As water levels dwindle, rows about how supplies should be used are turning farmers against city-dwellers and pitching environmentalists against politicians. Australia has a strong economy, a well-funded bureaucracy and robust political institutions. If it is struggling to respond to this crisis, imagine how drought will tear apart other, less prepared parts of the world.

Droughts have long plagued the Murray-Darling. The region is afflicted by a periodic weather pattern known as El Niño. At irregular intervals of two to seven years, the waters of the central Pacific warm up, heralding inclement weather throughout the southern hemisphere. Torrential rains flood the coast of Peru, while south-eastern Australia wilts in drought. The duration of these episodes is as unpredictable as their arrival. They can range from a few months to several years. As a result, the flow of the Darling, the longest tributary of the Murray, varies wildly, from as little as 0.04% of the long-term average to as much as 911%. Although the most recent El Niño ended earlier this year, it has left the soils in the basin so dry and the groundwater so depleted that the Murray-Darling's flow continues to fall, despite normal levels of rainfall over the past few months.

Protracted droughts are a part of Australian folklore. Schoolchildren learn a hackneyed Victorian poem in praise of "a sunburnt country...of droughts and flooding rains". Dorothea Mackellar wrote those lines just after the "Federation drought" of the late 1890s and early 1900s. The recession that accompanied it was so severe that it helped nudge Australia's six states, at the time separate British colonies, into uniting as a federation, or commonwealth, as Australians tend to call it.



Water politics
Negotiations over the federal constitution almost foundered on the subject of the Murray-Darling. South Australia, at the mouth of the river, wanted it kept open for navigation to the hinterland, allowing the state to become a trading hub. Its capital, Adelaide, also depended on water piped from the Murray to keep its taps running—as it still does. Further upstream, Victoria and New South Wales wanted to build dams to encourage agriculture. Queensland played little part in the row, since its stretch of the Darling was sparsely populated at the time. In the end, Victoria and New South Wales agreed to ensure a minimum flow to South Australia and to divide the remaining water equally between themselves. Like their counterparts elsewhere in the world, Australian engineers gaily pockmarked the basin with dams, weirs and locks, with little thought for what that would do downstream.

By the 1990s the drawbacks were evident. For one thing, states were allowing irrigators to use too much water. By 1994 human activity was consuming 77% of the river's average annual flow, even though the actual flow falls far below the average in dry years. The mouth of the river was beginning to silt up—a powerful symbol of over-exploitation. Thanks to a combination of reduced flow and increased run-off from saline soils churned up by agriculture, the water was becoming unhealthily salty, especially in its lower reaches. The tap water in Adelaide, which draws 40% of its municipal supplies from the river and up to 90% when other reserves dry up, was beginning to taste saline. The number of indigenous fish was falling, since the floods that induce them to spawn were becoming rarer. Toxic algae flourished in the warmer, more sluggish waters. In 1991 a hideous bloom choked a 1,000km (625 mile) stretch of the Darling.

Such horrors stirred indignation among urban Australians. The bad publicity put tourists off river cruises, fishing trips and visits to the basin's various lakes and wetlands. Many small businesses got hurt in the process. The citizens of Adelaide, which contains several marginal parliamentary seats, began to worry that the taps would run dry. Farmers were also starting to fear for the security and quality of their water supplies.


So Australia embarked on a series of reforms that in many ways serve as a model for the management of big, heavily exploited rivers. New South Wales, Victoria and South Australia agreed to cap the amount of water they took from the river and to keep clear, public records of water-use rights. They also made plans to reduce salinity and increase "environmental flows". The commonwealth agreed to encourage this by allocating buckets of cash to compliant states. All these initiatives were to be managed by a body, called the Murray-Darling Basin Commission, in which the commonwealth and the various riparian states, including Queensland and the tiny Australian Capital Territory (ACT), had equal representation and where decisions were taken by consensus.

Moreover, Australia's politicians also agreed to a set of principles by which water should be managed throughout the country. There should be no more subsidies for irrigation. Farmers should pay for the maintenance of channels and dams. For each river and tributary, scientists would calculate the maximum sustainable allocations of water and states would make sure that extractions did not exceed that figure. To ensure that such a scarce resource was used as efficiently as possible, water should be tradable, both within and between states. And the minimum environmental flows necessary to keep the river in good health should be accorded just as high a status as water put to commercial uses.

Guided by these principles, the states and the commonwealth have made much progress. By 1999 the average salinity of the river in South Australia had fallen by over 20%. In the late 1990s salinity levels were falling within the prescribed limit over 90% of the time, compared with roughly 60% in the 1970s and 1980s. The construction of fish ladders around dams and weirs, and the release of extra water into important breeding grounds, has spawned a recovery in native species. The commission is spending A$650m to boost environmental flows, mainly by stemming losses from irrigation, and hence leaving more water in the river.

The trade in water has taken off. There are two basic sorts of transaction: sales of part of a farmer's water allocation for the year or a permanent transfer. Temporary exchanges between farmers in the same state topped 1,000 gigalitres (220 billion gallons) in 2003, or around a tenth of all water used for agriculture. That roughly matches the cumulative amount of water that has changed hands permanently within the same state.
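The article's parenthetical conversion above (1,000 gigalitres ≈ 220 billion gallons) checks out for imperial gallons, which fits the Australian context. A quick sketch of the arithmetic; the function name is mine, not from any source:

```python
# Verify the article's conversion: 1,000 GL ≈ 220 billion gallons.
# Assumes imperial gallons (4.54609 L each); US gallons (3.785411784 L)
# would instead give roughly 264 billion.

LITRES_PER_GIGALITRE = 1e9
LITRES_PER_IMPERIAL_GALLON = 4.54609

def gigalitres_to_imperial_gallons(gl):
    """Convert gigalitres to imperial gallons."""
    return gl * LITRES_PER_GIGALITRE / LITRES_PER_IMPERIAL_GALLON

billions = gigalitres_to_imperial_gallons(1_000) / 1e9
print(f"{billions:.1f} billion imperial gallons")  # close to the article's 220
```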

Meanwhile, the commission has codified rules for trading water between users in different states. The volumes are much smaller, but the system is working as economists had hoped. In general, water is flowing from regions with salty soil to more fertile ones; from farms that are profligate with water to ones that are more efficient; and from low-value crops to more profitable ones. In particular, struggling dairy and rice farmers in New South Wales and Victoria have sold water to the booming orchards and vineyards of South Australia. A government assessment of a pilot scheme for interstate trade determined that such shifts prompted A$767m of extra investment in irrigation and food-processing between 1997 and 2001. Another study found that water trading helped to reduce the damage wrought by droughts.

But there are lots of problems. For one thing, the reforms concern only water that has already reached the river. Farmers in certain states can still drill wells to suck up groundwater, and tree plantations absorb a lot of rainwater that would otherwise find its way into the river. Little dams on farms, which block small streams or trap run-off from rain or flooding, are an even bigger worry. Little is known about how many there are or how fast their numbers are growing. In theory, most states are trying to regulate them, but the rules are full of loopholes and enforcement is difficult. Hydrologists fear that the severity of the drought has encouraged farmers to build more dams.

Some states are keener on the reforms than others. In 1995, when New South Wales, South Australia and Victoria agreed to cap the amount of water they took from the river, Queensland refused to join them on the grounds that it uses only a tiny share of the basin's water. The state government felt it had a right to promote irrigation along its stretch of the Darling to bring Queensland to the same level of agricultural development as the other states. It has since agreed to negotiate a cap. But earlier this year, despite the ongoing drought, it awarded new water-use rights to farmers on the Warrego, one of the tributaries of the Darling.

New South Wales, meanwhile, frequently exceeds its cap. Its farmers plant mainly annual crops, such as rice and wheat, instead of perennials like fruit trees or grape vines. If there is not enough water to go round, its farmers may suffer for a season, but their earnings are not permanently diminished. So the state tends to be less cautious in its allocation of water than Victoria or South Australia. However, the commission has no power to ensure that states stick to their caps. It can only denounce offenders publicly, in the forlorn hope that the shame will induce them to behave better.

Climate change is likely to exacerbate all these disputes. The Commonwealth Scientific and Industrial Research Organisation (CSIRO), a government agency, estimates that it could reduce the Murray's flow by as much as 5% in 20 years and 15% in 50 years. But other projections are much more cataclysmic. CSIRO cites a worst case of 20% less water in 20 years and 50% in 50 years. Peter Cullen, an academic and member of the government's National Water Commission, points out that inflows to the Murray have fallen to less than half of their long-term average over the past six years. He thinks it would be prudent to manage water on the assumption that low flows are here to stay.

Mr Howard argues that the Murray-Darling Basin Commission moves too slowly to cope with all the upheaval. He wants the states to surrender their powers over the basin to the commonwealth. That will allow his government, he says, to work out exactly how much water is being siphoned off through wells and dams, and to use that information to set a new, sustainable cap on water use.

The government would also help farmers meet the new restrictions by investing in more efficient irrigation or by buying up their water rights—all without any of the typical bickering and foot-dragging that have held up collective action in the past. To entice the states to agree, he is offering to spend A$10 billion of the commonwealth's money on the various schemes. But the advantage of adopting policies by consensus, presumably, is that they may prove more durable than anything imposed from Canberra. National governments, even in Australia, are not immune to inefficiency and bias. They are often at loggerheads with the states.

Moreover, not all Australians want to move as quickly as Mr Howard does. He faces an election later this year in which his environmental record—and particularly his lack of action on global warming—will be a big issue. Nor does the federal government have any experience of managing rivers. In a recent book, "Water Politics in the Murray-Darling Basin", Daniel Connell argues that any institutional arrangement that fails to give enough weight to regional concerns will not last.



Running a river
Several state governments have their doubts about Mr Howard's plan. South Australia wants the administration of the river put in the hands of a panel of independent experts. Victoria, the only state to reject the prime minister's scheme outright, says that he could achieve the same goals without any extra powers by simply withholding money from recalcitrant states. Its government has also complained that the scheme would reward the most wasteful irrigators for their inefficiency, by helping to pay for improvements to their infrastructure and then allowing them to use much of the water saved. So the extravagant irrigators of New South Wales will end up with extra water, while their parsimonious counterparts in Victoria will benefit less.

Moreover, many Australians are uncomfortable with the idea of water trading, says Blair Nancarrow, the head of the Australian Research Centre for Water in Society, a division of CSIRO. People living in less fertile areas fear that local farmers will gradually sell all their water rights, eroding employment and commerce and killing off the area's towns. Concerned politicians have insisted on limits to the amount of water that can be traded out of regions and states each year and have refused to allow the commission to buy water directly from farmers for environmental flows. The National Party, the junior partner in Australia's coalition government, draws much of its support from the countryside and is particularly reluctant to give free rein to the water market.

In the eyes of Mr Cullen, however, many of the changes Australians fear are inevitable. As it is, he notes, the amount of money farms make for every million litres of water they use varies dramatically between states, from roughly A$300 in New South Wales to A$600 in Victoria and A$1,000 in South Australia. He believes that investment and water will continue to gravitate towards the bigger, more professionally managed farms. In the long run, the irrigation of pasture for livestock, which currently consumes about half of the basin's agricultural water, will not make sense. The number of small, family-owned farms will shrink.

Ian Zadow owns just such a farm, near Murray Bridge in South Australia, which has been in the family since 1905. He is also head of the local irrigators' association. His son used to work on the farm with him. But farming cannot support two families, so the younger man has taken a job tending graveyards instead. "If you can pay all your bills and get three meals on the table," says Mr Zadow, "that's about as good as it is going to get."

At the moment, however, things are nowhere near that good. Last year, he saw his allocation of water slashed first by 20%, then by 30% and finally by 40%. Next season, unless much more rain falls, he stands to get no allocation at all. He feels that city-dwellers should do their bit to help farmers by conserving more water. When push comes to shove, he says, politicians will always give priority to the cities over the countryside, since they are home to more voters. He also thinks irrigators in New South Wales and Victoria should be trying harder to save water. Before too long Mr Zadow's complaints may be echoed by millions of farmers around the world.

If the Australian drought continues, the thousands who depend on irrigation water for a living will be in deep trouble. Many are already in debt and struggling to make ends meet. When asked what will happen if there is no water for them this year, Mr Zadow hesitates for a moment before replying, "Christ knows."

317
Texts That Run Rings Around Everyday Linear Logic

By EDWARD ROTHSTEIN
NY Times
Published: March 26, 2007

The feeling is familiar. You are listening to a piece of music, and nothing links one moment with the next. Sounds seem to emerge without purpose from some unmapped realm, neither connecting to what came before nor anticipating anything after. The same thing can happen while reading. Passages accumulate like tedious entries in an exercise book. Chaos, disorder, clumsiness, disarray: these must be the marks of poor construction or, perhaps, of deliberate provocation.

In a strange way, though, the very same sensations might also be marks of our own perceptual failures. Perhaps the order behind the sounds is simply not being heard; perhaps the logic of the argument is not being understood. Paying attention to anything alien can be like listening to a foreign language. There may be logic latent in the sounds, but it is not evident to untrained ears.

This is one reason we so persist in trying to find order, even when it is not first apparent. It is almost a faith in science, psychology, religion and art: an unshakable conviction that some pattern will be found. And often it is. Now, a brief book by the British anthropologist Mary Douglas, “Thinking in Circles: An Essay on Ring Composition” (Yale University Press), provides another glimpse, cursory but suggestive, of this quest for pattern.

Over the course of her career Ms. Douglas has become a master at discerning order in unexpected forms and surprising places. In an unassuming way, without pretense or revolutionary claims, she reveals the logic behind the varied customs of a society. One of the arguments made in her classic book “Purity and Danger” was that herein lies the very work of a culture: to shape a rigorous order that can hold threatening outside forces at bay. Societies divide the world into the clean and the unclean, the permitted and the forbidden, the pure and the polluted, imposing their categories on the continuities of nature, creating order while disclosing it.

This order is also preserved and passed on through literary and religious texts, which must themselves communicate a culture’s way of understanding the world. Why, though, Ms. Douglas asks, are so many of these texts so disorganized, so clumsily written — at least according to generations of readers? The biblical Book of Numbers, she points out, has been dismissed as an unstructured miscellany; one important scholar, Julius Wellhausen, looked at it, she writes, as if it were a “kind of attic used for storing biblical materials that did not fit,” almost a “junk room for the rest of the Pentateuch.”

Over the centuries many Chinese novels have also been attacked for lack of structure, repetition and episodic incoherence. So have Persian and Zoroastrian poetry. Even the Iliad has come in for its share of criticism. Ms. Douglas adds, “The terms disarray and chaotic, together with disordered, clumsy, and other pejoratives” crop up very often in descriptions of the texts that interest her. She herself reacts like an anthropologist surveying a society’s strange customs. “Whenever I read criticism of dire editorial confusion,” she writes, “my pulse quickens; I scent a hidden structure.”

In many cases she finds one. “Writings that used to baffle and dismay unprepared readers, when read correctly, turn out to be marvelously controlled and complex compositions,” she writes. Many epic works of non-Western cultures, she explains, have a distinctive shape: they are constructed in the form of rings.

Here is how the ring works. First there is an introductory section, a prologue that presents the theme and context. The story then proceeds toward its crucial center: the turning point and climax. Once there, the beginning is invoked again and the tale reverses direction. The second half of the story rigorously echoes the first, using verbal markers — like repetition or changes in style — but proceeding as a mirror image, as if the writer is walking backward through the plot. The ending is a return to the beginning. The ring structure also resembles an unrolling thread that is then pulled back onto its spool.

This pattern, Ms. Douglas and other writers have suggested, appears again and again in world literature. She argues, for example, that the biblical story of Abraham’s near sacrifice of Isaac is laid out in ring form. It begins with God’s call to Abraham; the turning point comes when the angel calls to Abraham before he strikes Isaac, their interchange echoing the words at the beginning. Then, step by step, the story reverses itself, repeating at each step language used earlier.

In her brilliant analysis of the biblical book Numbers (fully explored in another of her volumes, “In the Wilderness”), Ms. Douglas has found that the entire text is constructed in a circling and mirroring form, in which bands of narrative alternate with layers of legal writ. A work that might seem a structural hodgepodge takes on, in her analysis, a rigorous logic; the parallels established by the ring form assume important meanings that are crucial for understanding the biblical book’s preoccupation with the priesthood and authority.

Ms. Douglas explores the ring structure in more recent literature as well (including Laurence Sterne’s “Tristram Shandy”), but for the most part, she writes, the pattern has become lost to Western perception. Narratives rigorously written in ring form have come to seem chaotic and clumsy. This is not, she insists, because they are esoteric codes but because today we look elsewhere for order, distrusting the ring form’s rigorous demands. The ring can seem to overturn linear logic and expectation; we prefer open-ended explorations and mistake order for chaos.

I’m not sure that that is the full explanation. And this book is too limited a survey to do the theme justice; it suggests more than it proves. But there is a compelling reason why the ring pattern that Ms. Douglas outlines works so well: It maps out the ways in which human beings make sense of things.

At first one event follows another. We may not be entirely sure where it is going. Is there a point at all? Then, with declarative emphasis comes the turning, where, with a shock, we hear a first echo. We connect these different moments; a pattern begins to take shape. Then, step by step, other similarities are heard — they too take on meaning — moving backward from the most recent to the earliest in time, until we return to where we began. This kind of narrative needs to be heard again, for it is only in the retelling that the full nature of its order is revealed.

The ring form thus seems to presume repetition and re-interpretation to be understood; it almost takes on the aspect of ritual. It also seems to presume a community that will share in accumulated understanding. Is this perhaps what makes the ring form so alien to contemporary life? Right now, disorder seems much more realistic.

Connections, a critic’s perspective on arts and ideas, appears every other Monday.

===========
A friend responds:

I recall a dim memory of a book I read some years ago called the Gift of the Jews which argued that until the emergence of a more linear, future oriented thinking pattern, thoughts and societies and world views were circular, “rings,” based on the repetitive pattern of the days and seasons.  

 

After all, the entirety of life in those days was focused on the daily and seasonal cycles that always closed on themselves, so why would one have any other world view?  Your ancestors and your successors could expect precisely the same life.  It was always so, and would always be so.

 

Then the human capacity for thinking forward, beyond the circular view of everyday life, began to take hold and people thought forward longer term, developed plans, built buildings, then cities, and the current linear thinking model evolved and became dominant as we separated from the natural cycle of the earth.  Storage of surpluses and complexities of trade necessitated a change of viewpoint.

 

This thesis seems consistent with your points, Ed. Today everyday logic is linear.  Four thousand years ago everyday logic was circular, like the passage of time.  Round and round.

 

Maybe that is why the older texts are constructed as indicated in your column.

 

Fred

318
Science, Culture, & Humanities / Jewish Resistance to the Nazis
« on: April 24, 2007, 09:36:06 AM »

Exhibition Review

Resisting the Nazis Despite the Odds


By EDWARD ROTHSTEIN

Published: April 16, 2007

The discipline and determination are half-brilliant, half-mad: in 1940, in Warsaw, the Polish-Jewish historian Emanuel Ringelblum decided that the entire experience of Jewry under Nazi rule should be thoroughly documented. The internment of Jews within the Warsaw ghetto, he wrote (with chilly irony), “provided even greater opportunity for development of the archive.”

(Photo: Shulamith Posner-Mansbach/United States Holocaust Memorial Museum. The show at the Museum of Jewish Heritage includes this 1932 photo taken in Kiel, Germany.)

A competition was established to select writers, teachers and intellectuals; they would study topics like community life, education, crime, youth, art and religion, while helping to smuggle information into the ghetto. Comprehensiveness and objectivity were meant to eclipse surrounding horrors, documenting them for the future. The secret project was called, in heavily sardonic code, Oyneg Shabbes, using the Yiddish words for a celebration welcoming the Sabbath.

“To our great regret, however, only part of the plan was carried out,” Mr. Ringelblum writes, explaining with hyperbolic understatement: “We lacked the necessary tranquillity for a plan of such scope and volume.” Writers were executed; some were exiled for slave labor; and, in 1942, hundreds of thousands of ghetto residents were deported to death camps. Before the ghetto was consumed in the final conflagrations of an armed rebellion, Mr. Ringelblum’s archive was buried in tin boxes and milk cans that were only partly rediscovered after the war.

This epic is briefly alluded to in the important exhibition “Daring to Resist: Jewish Defiance in the Holocaust,” opening today at the Museum of Jewish Heritage in association with the Ghetto Fighters’ House in Israel. Mr. Ringelblum is mentioned here, and facsimiles of the buried documents (now housed in Warsaw) are shown, but they serve primarily to demonstrate that in extreme times resistance to tyranny takes many forms. One is the enterprise of Oyneg Shabbes: documentation.

Other forms of resistance are reflected in objects that in ordinary times have no distinctiveness: a ritual slaughterer’s knife used at great risk to butcher kosher chickens in Denmark so they could be smuggled into Germany in the 1930s; a blue-and-white wrestling sash from 1934 awarded to Jewish contestants no longer permitted to compete with their fellow Germans; a girl’s 1938 report card from a school founded by Jews in Berlin after Jewish children were banned from public schools.

And reflecting later years are artifacts from even darker times, including false documents used by Jewish women who were couriers secretly bearing information from beyond the walls of ghettos and camps. Also on view are a violin, a stage set, school notebooks: all relics of a resilient Jewish life nurtured at the brink of extinction. (“When the children will come out of the cage,” one survivor recalls being told, “they should be able to fly.”)

There is even a pillowcase given to a Lithuanian woman by Rivka Gotz, who defied the Nazi ban on Jewish childbirth and smuggled her newborn son, Ben, out of the Shavli ghetto in a suitcase, placing him under the woman’s secret care. The pillowcase now comes from Ben Gotz’s collection.

Such is the evidence of resistance of one kind or another: creating institutions in the face of oppression; following religious observances that were the object of Nazi repugnance; continuing cultural life with defiant pride; risking life to bring new life into being. It is not until late in the exhibition that visitors see the first guns used by Jewish partisans or can read the first accounts of their sabotage as they darted from forests like gnats in the face of the German war machine.

The exhibition’s curator, Yitzchak Mais, former director of the Yad Vashem museum in Jerusalem and a curator of the planned Illinois Holocaust Museum in Skokie, explains in a valuable companion volume to the show (which also includes many difficult-to-find firsthand accounts) that his intention was to address the kinds of accusatory questions that the writer Primo Levi said he often heard as a survivor: “Why did you not escape? Why did you not rebel?”

Mr. Mais’s answer is that Jews did, again and again. There were more than 90 Jewish fighting organizations in European ghettos and three rebellions at the hellish centers of the Nazi death-kingdom: at Sobibor, Treblinka and Auschwitz-Birkenau. But also, Mr. Mais suggests, “visitors to our exhibition will be challenged to re-evaluate their understanding of what constitutes resistance.”

This is the show’s greatest strength, and also its greatest weakness. It is a strength because to demonstrate how all of this involved resistance, the exhibition must convey just how extraordinary the circumstances were: the gradually tightening grip that held European Jews; the impressions that couldn’t fully foreshadow what was to come; the human impulse toward hope being slowly stifled. “How does one respond,” an introductory film asks, “when the future is unknown?”

“Who can you turn to?” asks the label text. “Who will speak for you when your government turns enemy and neighbors turn away?” “Is it better to lie low or stand tall?” And another question: “To stay or to go?”

When the scale of the Nazi ambition starts to become clear, it is beyond comprehension. The show includes numerous fragments of interviews with survivors (which unfortunately are too brief and miscellaneous) that capture those impressions. One woman recalls the postcards arriving from relatives whom the Nazis had just relocated “East”; they are full of carefully phrased optimism and artificially cheery description. But after the Nazi-supervised pap, one card ominously adds: “Very soon we are going to visit Uncle Mavet.” Mavet, in Hebrew, means death.

But the exhibition’s polemical focus is also a weakness, for it ends up turning resistance into a catchall concept that applies to any refusal to submit completely. There is an element of truth here, but also a needless desire to encompass every act of pride and survival within the idea of resistance. The result is almost too reassuring: Jews, the label text tells us, “recognized that their most precious resource was hope,” and, “They acted imaginatively to shield their communities from despair and promote the will to resist.”

It is as if the exhibition were shying away from too much complication. Almost unmentioned, for example, are the moral quandaries faced by Jewish leaders who even at best had to weigh the communal benefits of cooperation with the communal costs of resistance. In one of the show’s short videos, a survivor recalls being called before community leaders when they learn of her plan to escape. They cite the massacres that would follow. She is asked, “Who gave you the right to buy your freedom at the price of others?”

That dilemma is unexplored. That would mean examining the idea of resistance more intensively; making more distinctions, not fewer. Why, for example, did it take so much time for Jewish resistance to erupt into outright refusal and rebellion? In the show’s companion book, the historian David Engel suggests that at first Jews saw the Nazi phenomenon as a recurrence of earlier traumas, as part of the cycle of Jewish historical experience. Jews, after all, had received full German citizenship only in 1871, so if they were deprived of benefits in 1933, it was more a regression than a cataclysm.

The sense of repetitive cycles was reinforced by the literal medievalism of German oppression: the ghettos, the yellow stars, the governing Jewish councils. These historical echoes, Mr. Engel suggests, made Jews less likely to see clearly what was happening and made resistance less likely.

Those who did see, like the partisan Abba Kovner, took very different actions. In 1941, at 23, he said that the German goal was the “absolute, total annihilation” of the Jews. This put the entire situation in a new context. Unfortunately in this show one doesn’t fully grasp how drastically interpretation shaped response; the partisans were a turning point as much as a continuation. Here, though, their acts almost become a supplement to broadly defined resistance, and the fighters lack individuality.

In a 2001 PBS documentary, “Resistance: Untold Stories of Jewish Partisans,” Kenneth M. Mandel and Daniel B. Polin tell the story through interviews with 11 partisans who become recognizable individuals recounting an astonishing past. Some of those same figures appear in this exhibition’s videos, but they are stripped of context and speaking in snippets. We don’t learn enough about them to fully understand their achievement.

This makes the exhibition less powerful than it might have been. But at a time when Nazism has become a denatured metaphor for any political system deemed unpleasantly powerful, and when the concept of resistance has been perverted into meaninglessness by terrorist groups boasting exterminationist goals, this show begins to re-establish the sense of scale that once made Nazism so horrific and resistance so difficult.


319
Science, Culture, & Humanities / The Power Curve
« on: April 23, 2007, 10:36:24 PM »
From yesterday's WSJ:

Shattering the Bell Curve
The power law rules.

BY DAVID A. SHAYWITZ
Tuesday, April 24, 2007 12:01 a.m. EDT

Life isn't fair. Many of the most coveted spoils--wealth, fame, links on the Web--are concentrated among the few. If such a distribution doesn't sound like the familiar bell-shaped curve, you're right.

Along the hilly slopes of the bell curve, most values--the data points that track whatever is being measured--are clustered around the middle. The average value is also the most common value. The points along the far extremes of the curve contribute very little statistically. If 100 random people gather in a room and the world's tallest man walks in, the average height doesn't change much. But if Bill Gates walks in, the average net worth rises dramatically. Height follows the bell curve in its distribution. Wealth does not: It follows an asymmetric, L-shaped pattern known as a "power law," where most values are below average and a few far above. In the realm of the power law, rare and extreme events dominate the action.
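The review's thought experiment is easy to make concrete. The short simulation below (an illustrative sketch, not from the review; the specific figures, such as a $50 billion net worth for Mr. Gates and a 251 cm tallest man, are assumptions chosen for the example) draws heights from a bell curve and net worths from a Pareto power law, then checks how much one extreme newcomer moves each average:

```python
import random

random.seed(0)

# Heights (cm): roughly bell-curve distributed around 175 with sd 7.
heights = [random.gauss(175, 7) for _ in range(100)]

# Net worths ($): power-law (Pareto) distributed -- most values modest,
# a few enormous. A tail exponent near 1.16 is often quoted for wealth,
# but the exact parameters here are assumptions for illustration.
worths = [20_000 * random.paretovariate(1.16) for _ in range(100)]

def mean(xs):
    return sum(xs) / len(xs)

before_h = mean(heights)
before_w = mean(worths)

# The world's tallest man (~251 cm) walks in: the average barely moves.
after_h = mean(heights + [251])

# A $50 billion fortune walks in: the average explodes.
after_w = mean(worths + [50e9])

print(f"height mean: {before_h:.1f} cm -> {after_h:.1f} cm")
print(f"worth mean:  ${before_w:,.0f} -> ${after_w:,.0f}")
```

Run it and the height average shifts by well under a centimeter, while the wealth average jumps by several orders of magnitude: the bell curve shrugs off its extremes, the power law is dominated by them.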

For Nassim Taleb, irrepressible quant-jock and the author of "Fooled by Randomness" (2001), the contrast between the two distributions is not an amusing statistical exercise but something more profound: It highlights the fundamental difference between life as we imagine it and life as it really is. In "The Black Swan"--a kind of cri de coeur--Mr. Taleb struggles to free us from our misguided allegiance to the bell-curve mindset and awaken us to the dominance of the power law.

The attractiveness of the bell curve resides in its democratic distribution and its mathematical accessibility. Collect enough data and the pattern reveals itself, allowing both robust predictions of future data points (such as the height of the next five people to enter the room) and accurate estimations of the size and frequency of extreme values (anticipating the occasional giant or dwarf).

The power-law distribution, by contrast, would seem to have little to recommend it. Not only does it disproportionately reward the few, but it also turns out to be notoriously difficult to derive with precision. The most important events may occur so rarely that existing data points can never truly assure us that the future won't look very different from the present. We can be fairly certain that we will never meet anyone 14 feet tall, but it is entirely possible that, over time, we will hear of a man twice as rich as Bill Gates or witness a market crash twice as devastating as that of October 1987.

The problem, insists Mr. Taleb, is that most of the time we are in the land of the power law and don't know it. Our strategies for managing risk, for instance--including Modern Portfolio Theory and the Black-Scholes formula for pricing options--are likely to fail at the worst possible time, Mr. Taleb argues, because they are generally (and mistakenly) based on bell-curve assumptions. He gleefully cites the example of Long Term Capital Management (LTCM), an early hedge fund that blew up after its Nobel laureate founders "allowed themselves to take a monstrous amount of risk" because "their models ruled out the possibility of large deviations."





Mr. Taleb is fascinated by the rare but pivotal events that characterize life in the power-law world. He calls them Black Swans, after the philosopher Karl Popper's observation that only a single black swan is required to falsify the theory that "all swans are white" even when there are thousands of white swans in evidence. Provocatively, Mr. Taleb defines Black Swans as events (such as the rise of the Internet or the fall of LTCM) that are not only rare and consequential but also predictable only in retrospect. We never see them coming, but we have no trouble concocting post hoc explanations for why they should have been obvious. Surely, Mr. Taleb taunts, we won't get fooled again. But of course we will.

Writing in a style that owes as much to Stephen Colbert as it does to Michel de Montaigne, Mr. Taleb divides the world into those who "get it" and everyone else, a world partitioned into heroes (Popper, Hayek, Yogi Berra), those on notice (Harold Bloom, necktie wearers, personal-finance advisers) and entities that are dead to him (the bell curve, newspapers, the Nobel Prize in Economics).

A humanist at heart, Mr. Taleb ponders not only the effect of Black Swans but also the reason we have so much trouble acknowledging their existence. And this is where he hits his stride. We eagerly romp with him through the follies of confirmation bias (our tendency to reaffirm our beliefs rather than contradict them), narrative fallacy (our weakness for compelling stories), silent evidence (our failure to account for what we don't see), ludic fallacy (our willingness to oversimplify and take games or models too seriously), and epistemic arrogance (our habit of overestimating our knowledge and underestimating our ignorance).

For anyone who has been compelled to give a long-term vision or read a marketing forecast for the next decade, Mr. Taleb's chapter excoriating "The Scandal of Prediction" will ring painfully true. "What is surprising is not the magnitude of our forecast errors," observes Mr. Taleb, "but our absence of awareness of it." We tend to fail--miserably--at predicting the future, but such failure is little noted nor long remembered. It seems to be of remarkably little professional consequence.

I suspect that part of the explanation for this inconsistency may be found in a study of stock analysts that Mr. Taleb cites. Their predictions, while badly inaccurate, were not random but rather highly correlated with each other. The lesson, evidently, is that it's better to be wrong than alone.

If we accept Mr. Taleb's premise about power-law ascendancy, we are left with a troubling question: How do you function in a world where accurate prediction is rarely possible, where history isn't a reliable guide to the future and where the most important events cannot be anticipated?

Mr. Taleb presents a range of answers--be prepared for various outcomes, he says, and don't rush for buses--but it's clear that he remains slightly vexed by the world he describes so vividly. Then again, beatific serenity may not be the goal here. As Mr. Taleb warns, certitude is likely to be found only in a fool's (bell-curve) paradise, where we choose the comfort of the "precisely wrong" over the challenge of the "broadly correct." Beneath Mr. Taleb's blustery rhetoric lives a surprisingly humble soul who has chosen to follow a demanding and somewhat lonely path.

I wonder how many of us will have the courage to join him. Very few, I predict--unless, of course, something unexpected happens.

Dr. Shaywitz is a physician-scientist in New Jersey. You can buy "The Black Swan" from the OpinionJournal bookstore.


320
http://www.bulatlat.com/news/7-11/7-11-batak.htm

The Vanishing Batak Tribe

The end of the Batak had come and gone. Their culture was already gone. The language was all that remained. Do you doom yourself and your children to lives of abject poverty, ridden with disease and living with hunger on a daily basis just to preserve a language?

By Antonio Graceffo
Posted by Bulatlat

 Lorenzo Batak stands about five feet tall, and wears the traditional loin cloth, made from bark. At fifty-four years of age he is one of the most respected tribal elders. His face is lined. His curly black hair has gone completely gray, and his teeth are disappearing, making him look much older than he really is. Of late, he has been plagued by a constant cough and shortness of breath. Lung infections are rampant among the tribal people, living in their jungle community. The homes are lean-tos composed of leaves and bamboo, centered around a fire pit. The makeshift dwellings are suitable for the Batak, a nomadic people, accustomed to abandoning their village, and relocating. In the past, their relocations were conducted in a rhythm with the natural ecosystem. They would move, so as not to deplete the forest resources, which have sustained their people for centuries. Lately, most of their relocations have been a reaction to forced incursions by lowlanders.

The Batak tribe of Palawan is in danger of disappearing. It is losing its identity, with only its language remaining.

Today, the entire community has turned out to greet the outreach mission from Tag Balay, an NGO, led by Marifi Nitor-Pablico of the Tag Balay Foundation. Lorenzo recognizes me from a previous visit to another Batak village and he smiles broadly, slapping me on the chest. The tribe is much more excited to see Marifi and her team of volunteers who are bringing food and medicine. Perhaps the most important member of the team is Dr. Richard LaGuardia, an American Filipino doctor living in Puerto Princesa, who donated his time and medical assistance. The young students from Palawan State University follow behind, carrying crates of donated medicines.

Batak women, wearing sarongs, bare-breasted squat in a line, at the long tribal drums, made from hollowed out tree trunks. They pound out a joyful rhythm with heavy club-like drum sticks.

The Batak, believed to be the oldest inhabitants of the Philippines, are one of three principal tribes located on Palawan Island. In the far south of the island is the Palawan tribe, who still live as cave dwellers, hunting in the forest with blowguns. Inside the limits of Puerto Princesa City are the Batak and Tagbanua. The Tagbanua are by far the largest of the Palawan tribes. Population estimates range from 15,000 to 25,000 persons. The Tagbanua are largely integrated, living in communities, raising rice crops, and sending their children to church and school, much as their Filipino neighbors. (Note: all tribes in the Philippines are more or less indigenous and are entitled to Philippine citizenship. The term Filipino here refers to the modern, non-tribal majority of Filipinos.) The Batak still live largely as they have for centuries, as semi-nomadic hunter-gatherers. They are by far the smallest tribe, both in stature and in numbers. The average Batak man barely stands five feet tall. The tribal population is estimated at 360 members.

The Batak are a Negrito people, with kinky (curly) hair and dark skin. Their mother tongue, called Binatak, is related to other regional languages of Malayic origin. While the Palawan and Tagbanua tribes developed a unique alphabet, the Batak have never had a writing system. Anthropologists believe the Batak to be related to the Aeta people found in other parts of the Philippines. The Batak also bear a resemblance to the Semang and Sakai tribes of the Malay Peninsula. As the Batak do not have a written history, much of the explanation of their origin is based on guesswork. Dr. Carlos Fernandez, a retired professor of anthropology in Puerto Princesa and a leading authority on the Palawan tribes, explained that a commonly held theory is that Borneo was once connected to Palawan by a land bridge. The Batak and other tribes are believed to have migrated from Sabah, in Malaysian Borneo, centuries ago. The theory goes on to suggest that the ultimate origin of these tribes may be Madagascar.

In her book on the tribe, Bakas (an ethnographic documentation of the Batak indigenous people in Sitio Kayasan, Barangay Tagabenit, Puerto Princesa City, Palawan, Philippines), Marifi Nitor-Pablico recounts the legend the Batak use to explain their own origin.

Long ago, while a mother was sleeping, her four sons came into the house. The eldest son lifted her skirt and laughed at his mother's nakedness. The second son also laughed, but not as much. The third son did not laugh at all.

The fourth son covered his mother with cloth. The father stepped into the room and told the children that this had been a test, and that each had earned a reward. To the oldest son he gave a stick used to beat bark for making cloth. To the second son he gave a piece of torn cloth. To the third son he gave a piece of new cloth. And to the youngest he gave a piece of iron. From the oldest son came the Batak people. From the second, the Tagbanua. From the third, the Moro (rich Muslim traders). And from the fourth came the Spaniards.

Batak language

Binatak, the language of the Batak, is classified as an Austronesian, Malayo-Polynesian, Meso-Philippine, Palawano language. Due to contact with outsiders, Batak has absorbed many loan words from Tagbanua, Tagalog/Filipino, Spanish, and English. Although illiteracy is extremely high, nearly 100 percent of the Batak speak Filipino, the lingua franca of the Philippines. The distance to the nearest primary school is identified as the primary reason why illiteracy can't be combated among the Batak.

"Violence is not part of their code of ethos," explained Dr. Fernandez. "They deal with conflict by running away. They avoided contact with foreigners. Historically, their only means of defense was moving deeper into the forest."

Aside from the fact that it was historically easy for lowlanders to steal Batak land simply by driving them into the jungle, Marifi explained that as the Batak pushed deeper and deeper into inaccessible jungle, they moved further and further away from schools and medical aid stations. Even if they lived closer to a school, however, Batak families are extremely poor and would be unable to pay tuition fees.

Unlike tribal people in some other countries, the Batak enjoy full rights of citizenship, including land ownership. Under the Ancestral Domain Sustainable Development and Protection Plan (ADSDPP), the Batak are gaining land rights. But they are still extremely shy about dealing with outsiders and run from confrontation. As a result, efforts to send them medical supplies, teach them agriculture, or give them land rights are nearly ineffective in helping to preserve this vanishing people.

Lack of access to doctors adds to their staggering rate of infant mortality. Several Batak women confirmed that the average number of babies born per family was eight, but normally only two would live.

The Batak are hunter-gatherers, so their diet consisted largely of forest products and meat. In the last thirty years, the old-growth forest cover of the Philippines has decreased from 70 percent to 3 percent. Thanks to the efforts of the environmentally minded Mayor Edward Hagedorn, Puerto Princesa City, with 49 percent old-growth forest coverage, is referred to as "the cleanest and greenest" city in the Philippines, and possibly in the world. Even with these protective measures, the environment of the Batak is shrinking. Today, there is very little large game left on Palawan Island. The largest animal the Batak could hope to kill in the forest is a wild pig, and even the pigs are now becoming rare.

The Batak have made some changes to their diet, adopting rice to supplement the diminishing forest products. They buy additional foods from lowlanders when they have money. This has forced them into a market economy of which they have very little understanding. The Batak are often cheated by middlemen, whether Muslim, Chinese, or Filipino. They sell their products to local buyers at a fraction of their fair market value, because they have no direct access to the end-user markets in the city.

First contact

My first contact with the Batak was at Kalakwasa Village, a one-hour walk from the paved road. When I met Lorenzo, an elder, I assumed he would be the headman and my point of contact. Instead, however, I was introduced to a much younger man, Elicio, age 42, who claimed he was the headman. Elicio claimed the village had been in its present location for 32 years. Nomads don't normally stay in one place for 32 years, and I had trouble believing this and many of his other answers. "Before, we moved a lot. But now, we have settled here because no one came to help us when we lived deep in the forest." The Batak were living in houses with woven walls, raised up on stilts. Elicio explained that these were not traditional Batak houses. "Before, our houses were made of natural materials. Now, we use wooden prefab materials provided by the government." The new, permanent houses meant the tribe could no longer move.

Noticing that one of the buildings had a cross on the roof, I asked if it was a church. "Yes, we converted to Christianity (not Catholicism) ten years ago."

That single statement explained the Disney-like look of the village. Stilt houses with woven walls are typical for Filipinos, but not for nomads. The fact that a young man was the leader also made no sense. But then Elicio explained.

"I worked with the missionaries. They taught me to speak Tagalog and to read. So, now I am the leader."

It would later turn out that not only was Elicio not the headman, he was not even a Batak. He was a Tagbanua who had set himself up in business as a guide and interpreter for foreign visitors to the tribe.

Dr. Fernandez explained that historically, the main outside influence on the Batak was the Muslim merchants with whom they traded while living in coastal regions. For the most part, however, the Batak were and are xenophobic, which is why the Spanish language and Catholicism never caught on. Traditionally, the Batak followed an animist religion, believing in spirits that lived in the forest, trees, rivers, and animals. Their value system was based on this belief system.

Recently, however, foreign missionaries, generally from Protestant sects, have been successfully converting villages. Once a village converts, every aspect of tribal identity disappears. When asked further questions about tribal customs and beliefs, Elicio either didn't know, didn't want to say, or simply lied, providing the standard Christian answers, no different than if we had stayed in town and interviewed a Filipino working in a bank in Puerto Princesa.

Example: "What is the average marriage age of the tribe?" "Eighteen," answered Elicio.

This answer is a clear fabrication. Rural Filipinos don't even wait till eighteen to marry. For tribal people, the answer should be closer to twelve. Dr. Fernandez would later confirm that the onset of puberty is the signal that the child is ready for marriage.

"How many children do most families have?" "Two."

This was a near-lie. The correct answer, as I would learn later from Marifi and Dr. Fernandez, was that the average family had eight children, but on average only two would survive.

"How many wives do the tribal people have?" "Only one," answered Elicio, dutifully lying.

Polygamy

The Batak traditionally allowed polygamy, but it didn't come up very often because the man had to be wealthy enough to support the additional wives and children. After Christian conversion, this practice became taboo.

Tribal people nearly everywhere live in harmony with nature. Their existence is one of delicate balance: if any element is taken from the equation, if any change is made to the ecosystem, they could go extinct. When researched and studied deeply, every aspect of their cultural belief system is normally found to have practical and positive applications. Said another way, all that they do, they do in order that the tribe may continue to exist.

In choosing a mate, a woman will choose the man who is the best provider. If asked, she will explain that this increases the chances of survival of her children. But modern researchers will also see a kind of social Darwinism in this practice. The best provider will probably be the biggest, strongest, healthiest, or cleverest man; when such men marry and father children, those desirable genes are perpetuated, and the tribe as a whole becomes stronger. If the feeblest men married the feeblest women, they would produce feeble children who would not survive. Polygamy could really only be practiced by men who were super providers, which implies that they carried genes for unusually desirable traits; polygamy gave them the opportunity to produce as many offspring as possible.

Another important function of polygamy rests on the tribal understanding that siblings shouldn't marry. Most tribes also discourage first cousins from marrying, but if there are no other spouses available, even first cousins will marry. Polygamy increased the marriage pool, so that men who were already married weren't off the list of potential husbands.

Once the tribe converted to Christianity, they stopped practicing polygamy. The marriage pool decreased in size and women were often forced to marry "undesirable" men.

"Do cousins marry?" "Never," said Elicio, "We go to the other village to find a wife if none is available here."

This was again a near-lie. Cousins did marry, because of the ever-shrinking gene pool: if 30 families live in a village and each has only two children, it doesn't take long for everyone to be related. As for finding a wife in another village, Marifi explained that this often meant marrying a Tagbanua. Because of so many intermarriages, the Batak are slowly being bred out of existence.

Dr. Fernandez said that as a result of poor diet and disease, Batak men have become very small. "In Asia," he said, "women can marry up or they can marry at the same level, but they cannot marry down. Batak men are becoming undesirable candidates for marriage, so many of the Batak women are marrying Tagbanua."

The Tagbanua simply looked healthier and stronger than the Batak men. They were also richer: a large percentage of them farmed rice and lived in or near the city, and some even had regular jobs.

Marifi confirmed, "It is getting harder and harder for Batak men to marry."

"What do you do with your dead?" I asked Elicio. "We bury them in a coffin."

Superstitions and rituals

Once again, the Christian answer was given. In reality, tribal people usually have a number of superstitions and rituals associated with death. Some tribes actually relocate the entire village if one person dies. According to Dr. Fernandez, the Batak would burn the house where the dead person had lived, and no one would live in that house again. This superstition had the practical function of preventing the spread of communicable diseases. Now that the Batak live in prefab houses bought in the city, I wondered how quick they would be to burn them. And would not burning the home of the deceased result in more deaths?

The part of his story that was believable was that the pastor hadn't been to the village in ages. This is a common and frustrating pattern among tribal people: missionaries convert them, destroy the culture, and then leave.

Elicio told me that the church also served as a school for the Batak children. The teacher came only on Mondays and Tuesdays and taught first and second grade. As a result, although the church/school had been there for ten years, nearly everyone was still illiterate.

As in most tribes, Batak babies are delivered at home by midwives. In many tribes it is customary to cut the umbilical cord with bamboo, a practice which leads to infection and threatens the lives of mother and infant. When I asked Elicio about this, he answered:

"The midwife uses scissors and she boils them for thirty minutes to sterilize them first."

This was one more answer that had been programmed into him by the missionaries, and of course it turned out to be untrue. Questioning Batak women in another village, I found that they use bamboo to cut the umbilical cord.

According to Elicio, there were 33 families, 140 people, living in the village. Dr. Fernandez explained that the political organization of the Batak is very loose, much simpler than that of, say, the Native Americans. Native Americans had chiefs and councils, political units and sub-units, and were able to organize thousands and even tens of thousands of members in their nations. The Batak have no chief, just a village headman, who is consulted and whose opinion weighs more than that of the others, but who is not the boss. This type of structure can only work for about 90 people; when that limit is reached, the Batak split off and form a new village.

By this measure, Elicio's village was long overdue for a split. Once again, this was putting unusual pressure on the forest resources to sustain an unnaturally large group of people.

Elicio was wearing basketball shorts and a T-shirt. Only the very old men seemed to be wearing loincloths. Many of the adolescents, and even adults into their thirties, were wearing jeans. I asked if the missionaries had introduced the wearing of clothes, but Elicio answered, "No, we want to look like city people." Whether this was the case or not, the tribal culture was clearly dying out.

"Do you still hunt in the jungle with bows and arrows?" I asked. Elicio assured me that they did.
Always interested in primitive weaponry, I asked to see them.

Elicio turned to Lorenzo and, ostensibly, asked in the Batak language for the bows.

"Our bows are already at the museum," answered Lorenzo.

A diet of tubers

Elicio said the tribe ate a diet of fruits, vegetables, and meat they hunted. The lack of bows suggested they weren't doing any hunting, and fruits and vegetables don't grow so readily in the wild; even if they did, they would be depleted by the tribe's lack of mobility. I would later find out that the Batak ate a diet consisting almost exclusively of a tuber called kudot. It looks like a white root so tough it should be inedible, but the Batak pound it and boil it for hours, until it has the consistency of mashed potatoes mixed with sawdust. The resulting glue is absolutely tasteless, which is probably a good thing. If kudot has any nutritional value at all, it is most likely as a source of carbohydrates and nothing else.

321
Science, Culture, & Humanities / Jewish Genius
« on: April 12, 2007, 01:13:42 PM »
A piece with scary implications , , ,  :-o



 
From issue: April 2007
Jewish Genius
By Charles Murray
Since its first issue in 1945, COMMENTARY has published hundreds of articles about Jews and Judaism. As one would expect, they cover just about every important aspect of the topic. But there is a lacuna, and not one involving some obscure bit of Judaica. COMMENTARY has never published a systematic discussion of one of the most obvious topics of all: the extravagant overrepresentation of Jews, relative to their numbers, in the top ranks of the arts, sciences, law, medicine, finance, entrepreneurship, and the media.
I have personal experience with the reluctance of Jews to talk about Jewish accomplishment—my co-author, the late Richard Herrnstein, gently resisted the paragraphs on Jewish IQ that I insisted on putting in The Bell Curve (1994). Both history and the contemporary revival of anti-Semitism in Europe make it easy to understand the reasons for that reluctance. But Jewish accomplishment constitutes a fascinating and important story. Recent scholarship is expanding our understanding of its origins.

And so this Scots-Irish Gentile from Iowa hereby undertakes to tell the story. I cover three topics: the timing and nature of Jewish accomplishment, focusing on the arts and sciences; elevated Jewish IQ as an explanation for that accomplishment; and current theories about how the Jews acquired their elevated IQ.

_____________



From 800 B.C.E. through the first millennium of the Common Era, we have just two examples of great Jewish accomplishment, and neither falls strictly within the realms of the arts or sciences. But what a pair they are. The first is the fully realized conceptualization of monotheism, expressed through one of the literary treasures of the world, the Hebrew Bible. It not only laid the foundation for three great religions but, as Thomas Cahill describes in The Gifts of the Jews (1998), introduced a way of looking at the meaning of human life and the nature of history that defines core elements of the modern sensibility. The second achievement is not often treated as a Jewish one but clearly is: Christian theology expressed through the New Testament, an accomplishment that has spilled into every aspect of Western civilization.

But religious literature is the exception. The Jews do not appear in the annals of philosophy, drama, visual art, mathematics, or the natural sciences during the eighteen centuries from the time of Homer through the first millennium C.E., when so much was happening in Greece, China, and South Asia. It is unclear to what extent this reflects a lack of activity or the lack of a readily available record. For example, only a handful of the scientists of the Middle Ages are mentioned in most histories of science, and none was a Jew. But when George Sarton put a high-powered lens to the Middle Ages in his monumental Introduction to the History of Science (1927-48), he found that 95 of the 626 known scientists working everywhere in the world from 1150 to 1300 were Jews—15 percent of the total, far out of proportion to the Jewish population.

As it happens, that same period overlaps with the life of the most famous Jewish philosopher of medieval times, Maimonides (1135–1204), and of others less well known, not to mention the Jewish poets, grammarians, religious thinkers, scholars, physicians, and courtiers of Spain in the "Golden Age," or the brilliant exegetes and rabbinical legislators of northern France and Germany. But this only exemplifies the difficulty of assessing Jewish intellectual activity in that period. Aside from Maimonides and a few others, these thinkers and artists did not perceptibly influence history or culture outside the confines of the Jewish world.

Generally speaking, this remained the case well into the Renaissance and beyond. When writing a book called Human Accomplishment (2003), I compiled inventories of "significant figures" in the arts and sciences, defined as people who are mentioned in at least half of the major histories of their respective fields. From 1200 to 1800, only seven Jews are among those significant figures, and only two were important enough to have names that are still widely recognized: Spinoza and Montaigne (whose mother was Jewish).

_____________



The sparse representation of Jews during the flowering of the European arts and sciences is not hard to explain. They were systematically excluded, both by legal restrictions on the occupations they could enter and by savage social discrimination. Then came legal emancipation, beginning in the late 1700's in a few countries and completed in Western Europe by the 1870's, and with it one of the most extraordinary stories of any ethnic group at any point in human history.

As soon as Jewish children born under legal emancipation had time to grow to adulthood, they started appearing in the first ranks of the arts and sciences. During the four decades from 1830 to 1870, when the first Jews to live under emancipation reached their forties, 16 significant Jewish figures appear. In the next four decades, from 1870 to 1910, the number jumps to 40. During the next four decades, 1910–1950, despite the contemporaneous devastation of European Jewry, the number of significant figures almost triples, to 114.

To get a sense of the density of accomplishment these numbers represent, I will focus on 1870 onward, after legal emancipation had been achieved throughout Central and Western Europe. How does the actual number of significant figures compare to what would be expected given the Jewish proportion of the European and North American population? From 1870 to 1950, Jewish representation in literature was four times the number one would expect. In music, five times. In the visual arts, five times. In biology, eight times. In chemistry, six times. In physics, nine times. In mathematics, twelve times. In philosophy, fourteen times.

Disproportionate Jewish accomplishment in the arts and sciences continues to this day. My inventories end with 1950, but many other measures are available, of which the best known is the Nobel Prize. In the first half of the 20th century, despite pervasive and continuing social discrimination against Jews throughout the Western world, despite the retraction of legal rights, and despite the Holocaust, Jews won 14 percent of Nobel Prizes in literature, chemistry, physics, and medicine/physiology. In the second half of the 20th century, when Nobel Prizes began to be awarded to people from all over the world, that figure rose to 29 percent. So far, in the 21st century, it has been 32 percent. Jews constitute about two-tenths of one percent of the world's population. You do the math.
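Taking up the article's invitation to "do the math": the degree of overrepresentation implied by the quoted figures can be worked out in a few lines. This is only an illustrative back-of-the-envelope calculation; the 0.2 percent world-population share and the prize shares are the article's own numbers.

```python
# Overrepresentation implied by the article's figures: the ratio of the
# share of Nobel Prizes won to the share of world population.
world_share = 0.002  # roughly two-tenths of one percent of the world's population

for period, prize_share in [
    ("first half of the 20th century", 0.14),
    ("second half of the 20th century", 0.29),
    ("21st century so far", 0.32),
]:
    ratio = prize_share / world_share
    print(f"{period}: about {ratio:.0f} times the proportional share")
```

Run as written, this puts the overrepresentation at roughly 70-fold, 145-fold, and 160-fold for the three periods respectively.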

_____________



What accounts for this remarkable record? A full answer must call on many characteristics of Jewish culture, but intelligence has to be at the center of the answer. Jews have been found to have an unusually high mean intelligence as measured by IQ tests since the first Jewish samples were tested. (The widely repeated story that Jewish immigrants to this country in the early 20th century tested low on IQ is a canard.) Exactly how high has been difficult to pin down, because Jewish sub-samples in the available surveys are seldom perfectly representative. But it is currently accepted that the mean is somewhere in the range of 107 to 115, with 110 being a plausible compromise.

The IQ mean for the American population is "normed" to be 100, with a standard deviation of 15. If the Jewish mean is 110, then the mathematics of the normal distribution says that the average Jew is at the 75th percentile. Underlying that mean in overall IQ is a consistent pattern on IQ subtests: Jews are only about average on the subtests measuring visuo-spatial skills, but extremely high on subtests that measure verbal and reasoning skills.

A group's mean intelligence is important in explaining outcomes such as mean educational attainment or mean income. The key indicator for predicting exceptional accomplishment (like winning a Nobel Prize) is the incidence of exceptional intelligence. Consider an IQ score of 140 or higher, denoting the level of intelligence that can permit people to excel in fields like theoretical physics and pure mathematics. If the mean Jewish IQ is 110 and the standard deviation is 15, then the proportion of Jews with IQ's of 140 or higher is somewhere around six times the proportion of everyone else.
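The two numerical claims above — that a mean of 110 places the average member of the group at about the 75th percentile of a mean-100 population, and that it implies roughly six times the incidence of IQs of 140 or higher — can be checked directly against the normal distribution. A minimal sketch, using the article's assumed means and standard deviation of 15:

```python
from math import erfc, sqrt

def tail(x, mean, sd=15.0):
    """P(IQ >= x) under a normal distribution with the given mean and sd."""
    return 0.5 * erfc((x - mean) / (sd * sqrt(2)))

# Percentile of a 110-IQ individual within a mean-100 population:
percentile = 100 * (1 - tail(110, 100))  # just under the 75th percentile

# Incidence of IQ >= 140 in a mean-110 group vs. a mean-100 population:
ratio = tail(140, 110) / tail(140, 100)  # close to six times
```

Both of the article's figures check out: the percentile comes to about 74.8, and the tail ratio to about 5.9.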

The imbalance continues to increase for still higher IQ's. New York City's public-school system used to administer a pencil-and-paper IQ test to its entire school population. In 1954, a psychologist used those test results to identify all 28 children in the New York public-school system with measured IQ's of 170 or higher. Of those 28, 24 were Jews.

Exceptional intelligence is not enough to explain exceptional accomplishment. Qualities such as imagination, ambition, perseverance, and curiosity are decisive in separating the merely smart from the highly productive. The role of intelligence is nicely expressed in an analogy suggested to me years ago by the sociologist Steven Goldberg: intelligence plays the same role in an intellectually demanding task that weight plays in the performance of NFL offensive tackles. The heaviest offensive tackle is not necessarily the best. Indeed, the correlation between weight and performance among NFL offensive tackles is probably quite low. But they all weigh more than 300 pounds.

So with intelligence. The other things count, but you must be very smart to have even a chance of achieving great work. A randomly selected Jew has a higher probability of possessing that level of intelligence than a randomly selected member of any other ethnic or national group, by far.

_____________



Nothing that I have presented up to this point is scientifically controversial. The profile of disproportionately high Jewish accomplishment in the arts and sciences since the 18th century, the reality of elevated Jewish IQ, and the connection between the two are not to be denied by means of data. And so we come to the great question: how and when did this elevated Jewish IQ come about? Here, the discussion must become speculative. Geneticists and historians are still assembling the pieces of the explanation, and there is much room for disagreement.

I begin with the assumption that elevated Jewish intelligence is grounded in genetics. It is no longer seriously disputed that intelligence in Homo sapiens is substantially heritable. In the last two decades, it has also been established that obvious environmental factors such as high income, books in the house, and parental reading to children are not as potent as one might expect. A "good enough" environment is important for the nurture of intellectual potential, but the requirements for "good enough" are not high. Even the very best home environments add only a few points, if that, to a merely okay environment. It is also known that children adopted at birth do not achieve the IQ's predicted by their parents' IQ.

To put it another way, we have good reason to think that Gentile children raised in Jewish families do not acquire Jewish intelligence. Hence my view that something in the genes explains elevated Jewish IQ. That conclusion is not logically necessary but, given what we know about heritability and environmental effects on intelligence in humans as a species, it is extremely plausible.

Two potential explanations for a Jewish gene pool favoring high intelligence are so obvious that many people assume they must be true: winnowing by persecution (only the smartest Jews either survived or remained Jews) and marrying for brains (scholars and children of scholars were socially desirable spouses). I too think that both of these must have played some role, but how much of a role is open to question.

In the case of winnowing through persecution, the logic cuts both ways. Yes, those who remained faithful during the many persecutions of the Jews were self-selected for commitment to Judaism, and the role of scholarship in that commitment probably means that intelligence was one of the factors in self-selection. The foresight that goes with intelligence might also have had some survival value (as in anticipating pogroms), though it is not obvious that its effect would be large enough to explain much.

But once the Cossacks are sweeping through town, the kind of intelligence that leads to business success or rabbinical acumen is no help at all. On the contrary, the most successful people could easily have become the most likely to be killed, by virtue of being more visible and the targets of greater envy. Furthermore, other groups, such as the Gypsies, have been persecuted for centuries without developing elevated intelligence. Considered closely, the winnowing-by-persecution logic is not as compelling as it may first appear.

What of the marrying-for-brains theory? "A man should sell all he possesses in order to marry the daughter of a scholar, as well as to marry his daughter to a scholar," advises the Talmud (Pesahim 49a), and scholarship did in fact have social cachet within many Jewish communities before (and after) emancipation. The combination could have been potent: by marrying the children of scholars to the children of successful merchants, Jews were in effect joining those selected for abstract reasoning ability with those selected for practical intelligence.

Once again, however, it is difficult to be more specific about how much effect this might have had. Arguments have been advanced that rich merchants were in fact often reluctant to entrust their daughters to penniless and unworldly scholars. Nor is it clear that the fertility rate of scholars, or their numbers, were high enough to account for a major effect on intelligence. The attractiveness of brains in prospective marriage partners surely played some role but, once again, the data for assessing how much have not been assembled.

_____________



Against this backdrop of uncertainty, a data-driven theory for explaining elevated Jewish IQ appeared in 2006 in the Journal of Biosocial Science. In an article entitled "Natural History of Ashkenazi Intelligence," Gregory Cochran (a physicist) and Jason Hardy and Henry Harpending (anthropologists) contend that elevated Jewish IQ is confined to the Ashkenazi Jews of northern and central Europe, and developed from the Middle Ages onward, primarily from 800 to 1600 C.E.

In the analysis of these authors, the key factor explaining elevated Jewish intelligence is occupational selection. From the time Jews became established north of the Pyrenees-Balkans line, around 800 C.E., they were in most places and at most times restricted to occupations involving sales, finance, and trade. Economic success in all of these occupations is far more highly selected for intelligence than success in the chief occupation of non-Jews: namely, farming. Economic success is in turn related to reproductive success, because higher income means lower infant mortality, better nutrition, and, more generally, reproductive "fitness." Over time, increased fitness among the successful leads to strong selection for the cognitive and psychological traits that produce that fitness, intensified when there is a low inward gene flow from other populations—as was the case with Ashkenazim.

Sephardi and Oriental Jews—i.e., those from the Iberian peninsula, the Mediterranean littoral, and the Islamic East—were also engaged in urban occupations during the same centuries. But the authors cite evidence that, as a rule, they were less concentrated in occupations that selected for IQ and instead more commonly worked in craft trades. Thus, elevated intelligence did not develop among Sephardi and Oriental Jews—as manifested by contemporary test results in Israel that show the IQ's of non-European Jews to be roughly similar to the IQ's of Gentiles.

The three authors conclude this part of their argument with an elegant corollary that matches the known test profiles of today's Ashkenazim with the historical experience of their ancestors:

The suggested selective process explains the pattern of mental abilities in Ashkenazi Jews: high verbal and mathematical ability but relatively low spatio-visual ability. Verbal and mathematical talent helped medieval businessmen succeed, while spatio-visual abilities were irrelevant.
The rest of their presentation is a lengthy and technical discussion of the genetics of selection for IQ, indirect evidence linking elevated Jewish IQ with a variety of genetically based diseases found among Ashkenazim, and evidence that most of these selection effects have occurred within the last 1,200 years.
_____________



No one has yet presented an alternative to the Cochran-Hardy-Harpending theory that can match it for documentation. But, as someone who suspects that elevated Jewish intelligence was (a) not confined to Ashkenazim and (b) antedates the Middle Ages, I will outline the strands of an alternative explanation that should be explored.

It begins with evidence that Jews who remained in the Islamic world exhibited unusually high levels of accomplishment as of the beginning of the second millennium. The hardest evidence is Sarton's enumeration of scientists mentioned earlier, of whom 15 percent were Jews. These were not Ashkenazim in northern Europe, where Jews were still largely excluded from the world of scientific scholarship, but Sephardim in the Iberian peninsula, in Baghdad, and in other Islamic centers of learning. I have also mentioned the more diffuse cultural evidence from Spain, where, under both Muslim and Christian rule, Jews attained eminent positions in the professions, commerce, and government as well as in elite literary and intellectual circles.

After being expelled from Spain at the end of the 15th century, Sephardi Jews rose to distinction in many of the countries where they settled. Some economic historians have traced the decline of Spain after 1500, and the subsequent rise of the Netherlands, in part to the Sephardi commercial talent that was transferred from the one to the other. Centuries later, in England, one could point to such Sephardi eminences as Benjamin Disraeli and the economist David Ricardo.

In sum, I propose that a strong case could be assembled that Jews everywhere had unusually high intellectual resources that manifested themselves outside of Ashkenaz and well before the period when non-rabbinic Ashkenazi accomplishment manifested itself.

322
Science, Culture, & Humanities / The Prisoner's Dilemma; Game Theory
« on: April 11, 2007, 05:46:25 PM »
Although this subject could easily be part of the Evolutionary Psychology thread, I give it its own thread because I think it worthy.
============

First, a description of the PD:  

http://en.wikipedia.org/wiki/Prisoner's_dilemma
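For reference, the one-shot game behind that link can be captured in a few lines. The payoff values are the conventional textbook ones, which is an assumption on my part (any values with temptation > reward > punishment > sucker's payoff produce the same dilemma):

```python
# One-shot Prisoner's Dilemma with conventional textbook payoffs
# (T=5, R=3, P=1, S=0). Each entry is (row player, column player).
PAYOFFS = {
    ("C", "C"): (3, 3),  # mutual cooperation: reward
    ("C", "D"): (0, 5),  # sucker's payoff vs. temptation
    ("D", "C"): (5, 0),
    ("D", "D"): (1, 1),  # mutual defection: punishment
}

def best_reply(opponent_move):
    """Return the move that maximizes the row player's payoff
    against a fixed opponent move."""
    return max(("C", "D"), key=lambda m: PAYOFFS[(m, opponent_move)][0])

# Defection is a dominant strategy: it is the best reply whether the
# opponent cooperates or defects -- yet (D, D) pays each player less
# than (C, C) would. That gap is the dilemma.
print(best_reply("C"), best_reply("D"))  # -> D D
```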

-------------

Next, an article and a high IQ friend's comments:

Human Nature Redux

By DAVID BROOKS
Published: February 18, 2007
Sometimes a big idea fades so imperceptibly from public consciousness you don't even notice until it has almost disappeared. Such is the fate of the belief in natural human goodness.

This belief, most often associated with Jean-Jacques Rousseau, begins with the notion that "everything is good as it leaves the hands of the Author of things; everything degenerates in the hands of man." Human beings are virtuous and free in their natural state. It is only corrupt institutions that make them venal. They are happy in their simplicity, but social conventions make them unwell.

This belief had gigantic ramifications over the years. It led, first of all, to the belief that bourgeois social conventions are repressive and soul-destroying. It contributed to romantic revolts against tradition and etiquette. Whether it was 19th-century Parisian bohemians or 20th-century beatniks and hippies, Western culture has seen a string of antiestablishment rebellions led by people who wanted to shuck off convention and reawaken more natural modes of awareness.

It led people to hit the road, do drugs, form communes and explore free love in order to unleash their authentic selves.

In education, it led to progressive reforms, in which children were liberated to follow their natural instincts. Politically, it led to radical social engineering efforts, because if institutions were the source of sin, then all you had to do was reshape institutions in order to create a New Man.

Therapeutically, it led to an emphasis on feelings over reason, self-esteem over self-discipline. In the realm of foreign policy, it led to a sort of global doctrine of the noble savage - the belief that societies in the colonial world were fundamentally innocent, and once the chains of their oppression were lifted something wonderful would flower.

Over the past 30 years or so, however, this belief in natural goodness has been discarded. It began to lose favor because of the failure of just about every social program that was inspired by it, from the communes to progressive education on up. But the big blow came at the hands of science.

From the content of our genes, the nature of our neurons and the lessons of evolutionary biology, it has become clear that nature is filled with competition and conflicts of interest. Humanity did not come before status contests. Status contests came before humanity, and are embedded deep in human relations. People in hunter-gatherer societies were deadly warriors, not sexually liberated pacifists. As Steven Pinker has put it, Hobbes was more right than Rousseau.

Moreover, human beings are not as pliable as the social engineers imagined. Human beings operate according to preset epigenetic rules, which dispose people to act in certain ways. We strive for dominance and undermine radical egalitarian dreams. We're tribal and divide the world into in-groups and out-groups.

This darker if more realistic view of human nature has led to a rediscovery of different moral codes and different political assumptions. Most people today share what Thomas Sowell calls the Constrained Vision, what Pinker calls the Tragic Vision and what E. O. Wilson calls Existential Conservatism. This is based on the idea that there is a universal human nature; that it has nasty, competitive elements; that we don't understand much about it; and that the conventions and institutions that have evolved to keep us from slitting each other's throats are valuable and are altered at great peril.

Today, parents don't seek to liberate their children; they supervise, coach and instruct every element of their lives. Today, there really is no antinomian counterculture - even the artists and rock stars are bourgeois strivers. Today, communes and utopian schemes are out of favor. People are mostly skeptical of social engineering efforts and jaundiced about revolutionaries who promise to herald a new dawn. Iraq has revealed what human beings do without a strong order-imposing state.

This is a big pivot in intellectual history. The thinkers most associated with the Tragic Vision are Isaiah Berlin, Adam Smith, Edmund Burke, Alexander Hamilton, James Madison, Friedrich Hayek and Hobbes. Many of them are conservative.

And here's another perversity of human nature. Many conservatives resist the theory of evolution even though it confirms many of conservatism's deepest truths.

------------
Interesting article.  I think that using evolutionary psychology to justify the Statist leviathan contains a number of very serious problems.  Politicians should not be seen as altruistic Platonic Guardians; rather, they would simply be ambitious, probably ruthless human beings seeking to maximize their own genetically-mandated fitness criteria, just like everyone else.  The best system would be one that harnessed our self-interested behavior towards value creation at the societal level.  Of course, Adam Smith discussed this a long time ago.

Stephen Quartz, who I believe is still at CalTech, has performed a number of interesting experiments with people placed in a game situation with the following rules: Person A starts the game with $5. He can decide how much of this to share with Person B. Whatever he decides to share, that amount will double before it gets to Person B. Person B can then decide if he wants to give any money back to Person A. The game can be played with an unknown number of iterative rounds, but it is truly fascinating when both players are told in advance that the game will have, say, 10 rounds of play. (By the way, the players are complete strangers to one another and do not communicate with each other during the game.)

If one knows how many rounds the game will last, it is easy to "defect" in the final round and keep all of the winnings to oneself. Knowing this, the other player will defect a round earlier. And so on and so on... a regression to the first round takes place, and the game-theoretical solution would end up with Person A simply pocketing the $5 and walking away, operating under the assumption that any money given to Person B will never be seen again. However, virtually no one actually plays this way. A typical game begins with Person A making an initial offer of $2.50 to the other player. If the other player gives the same amount (now $5) back, or something close to it, a tentative "trust potential" has been formed. MRI scans performed on the brains of the players have revealed that blood flow to pleasure centers is enhanced when a cycle of trust has been completed. We really seem to enjoy cooperative, win-win arrangements when we can find them. The game generally continues through the full number of rounds, with Person B sharing 50% of the final pot with his new "partner".
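The gap between the backward-induction prediction and actual play can be sketched in a toy simulation. The rules below are simplifying assumptions drawn from the description above (a fresh $5 stake each round, fixed sharing fractions, no adaptation), not the exact protocol of Quartz's experiments:

```python
# Toy simulation of a finitely repeated trust game. Each round,
# Person A may send part of a $5 stake; the transfer doubles in
# transit; Person B may return a share. Rules and fractions are
# simplifying assumptions for illustration.

def play(rounds, send_frac, return_frac):
    """Total payoffs for A and B when both play fixed fractions."""
    a_total, b_total = 0.0, 0.0
    for _ in range(rounds):
        stake = 5.0
        sent = stake * send_frac           # A's transfer
        received = 2.0 * sent              # doubles in transit
        returned = received * return_frac  # B's repayment
        a_total += (stake - sent) + returned
        b_total += received - returned
    return a_total, b_total

# Backward-induction play: A sends nothing, B returns nothing.
defect = play(rounds=10, send_frac=0.0, return_frac=0.0)
# Trusting play: A sends half each round, B returns half of receipts.
trust = play(rounds=10, send_frac=0.5, return_frac=0.5)
print(defect, trust)  # -> (50.0, 0.0) (50.0, 25.0)
```

Under these assumed fractions, full defection yields ($50, $0) over ten rounds while mutual trust yields ($50, $25): trust costs Person A nothing and enlarges the total pot, which is exactly the win-win surplus the players' pleasure centers seem to register.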

The Hobbesian gimmick used by some Socialists to justify their social engineering programs does not reflect the true nature of the human animal, a social primate equipped with an intuitive sense of cooperation and a finely-honed ability to keep track of favors given and received.


323
Science, Culture, & Humanities / Men & Women; male and female
« on: March 28, 2007, 09:33:49 AM »
Ex-wife becomes a man; ex-husband seeks end to alimony

CLEARWATER, Florida (AP) -- Lawrence Roach agreed to pay alimony to the woman he divorced, not the man she became after a sex change, his lawyers argued in a Florida court Tuesday in an effort to end the payments.

But the ex-wife's attorneys said the operation does not alter the agreement.

The lawyers and Circuit Judge Jack St. Arnold agreed the case delves into relatively uncharted legal territory. They found only a 2004 Ohio case that addressed whether or not a transsexual could still collect alimony after a sex change.

"There is not a lot out there to help us," St. Arnold said.

Roach and his wife, Julia, divorced in 2004 after 18 years of marriage. The 48-year-old utility worker agreed to pay her $1,250 a month in alimony. Since then, Julia Roach, 55, has had a sex change and legally changed her name to Julio Roberto Silverwolf.

"It's illegal for a man to marry a man, and it should likewise be illegal for a man to pay alimony to a man," said Roach's attorney, John McGuire. "When she changed to a man, I believe she terminated that alimony."

Silverwolf did not appear in court Tuesday and has declined to talk about the divorce. His lawyer, Gregory Nevins, said the language of the divorce decree is clear and firm -- Roach agreed to pay alimony until his ex-wife dies or remarries.

"Those two things haven't happened," said Nevins, a senior staff attorney with the national gay rights group Lambda Legal.

St. Arnold is considering the arguments. But lawyers on both sides agreed Tuesday that Roach will probably have to keep paying alimony to Silverwolf.

The judge poked holes in several of Roach's legal arguments and noted that appeals courts have declined to legally recognize a sex change in Florida when it comes to marriage. The appellate court "is telling us you are what you are when you are born," St. Arnold said.

In the Ohio case, an appeals court ruled in September 2004 that a Montgomery County man must continue to pay $750 a month in alimony to his transsexual ex-wife because her sex change was not reason enough to violate the agreement.

Roach's other attorney, John Smitten, said the case falls into a legal void.

"It's probably something that has to be addressed by the Legislature," Smitten said. "There is one other case in the entire United States. It really needs to be addressed either for or against the concept of eliminating alimony for that reason."

Roach, who has since remarried, said he has been unable to convince state and federal lawmakers to tackle the issue. He said he will continue to fight.

"This is definitely wrong. I have a right to move forward with my life. I wish no harm and hardship to that person," Roach said of his ex-wife. "They can be the person they want to be, to find happiness and peace within themselves. I have the right to do the same. But I can't rest because I'm paying a lot of money every month."

The legal fight is the second transsexual rights showdown in Pinellas County in less than a week. On Friday, transsexual activists from around the United States packed a City Commission meeting in neighboring Largo to oppose the firing of City Manager Steve Stanton after he announced he was seeking a sex-change operation.

Despite the support, commissioners voted 5-2 to fire Stanton.

Copyright 2007 The Associated Press.

324
All:

IMHO this subject bears very careful scrutiny.  Certainly there are areas where coordination with our neighbors, Mexico and Canada, is to the benefit of all three of us.  The very real danger, though, is that there are forces in our government which seek to transcend the limitations of our Constitution (e.g. neutering gun rights via the UN) and our sovereignty (e.g. as the bureaucrats of the EU in Brussels seek to do throughout Europe -- trying to tell the Irish that they should raise taxes to the higher levels found elsewhere, or imposing the unlimited entry of undesired groups of people).

Marc
============================


'Working groups led by DHS should now [be] driven by a single agenda: the SPP'
© 2007 WorldNetDaily.com

A memo signed by Department of Homeland Security Secretary Michael Chertoff implements a controversial program condemned by critics as a precursor to a European Union-style partnership with Mexico and Canada.

The document shows the Security and Prosperity Partnership, or SPP, is being directed at the highest level of the Bush administration, says the public interest group Judicial Watch, which obtained it and other documents through a Freedom of Information Act request.

The Sept. 22, 2005, memo describes the agencies within the Department of Homeland Security responsible for executing the security agenda of the SPP.

Titled "Implementation Memorandum for the (SPP)," the document says the SPP "has, in addition to identifying a number of new action items, comprehensively rolled up most of our existing homeland security-related policy initiatives with Canada and Mexico, and ongoing action and reporting in the various U.S.-Canada and U.S.-Mexico working groups led by DHS should now be driven by a single agenda: the SPP."

"These new records prove the Security and Prosperity Partnership is being directed by officials at the very highest levels of the United States government," said Judicial Watch President Tom Fitton.

Fitton said Americans "should know that the SPP is a core policy initiative for many agencies in our government, including the Department of Homeland Security."

The records obtained by Judicial Watch also contain an information paper describing 10 "Prosperity Pillar Working Groups" and the organization of the "U.S.-Mexico Critical Infrastructure Protection Work Group."

Judicial Watch said that unlike previous records produced by other federal agencies, the DHS records are heavily redacted, blocking out names of the U.S., Mexican and Canadian government officials carrying out the partnership's agenda across all three countries.

The DHS also released a 10-page chart listing 36 "SPP Security High-Level Working Groups" that include the "Mexico-U.S. Repatriation Technical WG," the "Mexico-U.S. Intelligence and Information Sharing WG," and the "Canada-U.S. Cross Border Crime Forum."

In October, as WND reported, about 1,000 documents obtained in a FOIA request to the SPP showed bureaucrats from agencies throughout the Bush administration meeting regularly with their counterparts in the Canadian and Mexican governments to engage in a broad rewriting of U.S. administrative law and regulations.

WND first reported the SPP activity last summer, showing the Bush administration had launched extensive working-group activity to implement a trilateral agreement with Mexico and Canada.

The groups, working under the North American Free Trade Agreement office in the Department of Commerce, are to implement an agreement signed by President Bush, then-Mexican President Vicente Fox and then-Canadian Prime Minister Paul Martin in Waco, Texas, March 23, 2005.

The trilateral agreement, signed as a joint declaration not submitted to Congress for review, led to the creation of the SPP.

An SPP report to the heads of state of the U.S., Mexico and Canada -- released June 27, 2005 -- lists some 20 different working groups spanning a wide variety of issues ranging from e-commerce, to aviation policy, to borders and immigration, involving the activity of multiple U.S. government agencies.

The working groups have produced a number of memorandums of understanding and trilateral declarations of agreement.

325
Politics & Religion / Islam in Asia & Africa
« on: March 14, 2007, 09:14:36 AM »
THAILAND: Muslim separatists ambushed a bus in Thailand's southern province of Yala, shooting dead nine people at point-blank range. Only the driver and one critically injured Buddhist passenger survived. The militants detonated a small bomb about 1,640 feet from the bus, reportedly to slow police trying to arrive at the scene.

Stratfor.com

326
Science, Culture, & Humanities / Animal farming practices
« on: March 14, 2007, 06:02:48 AM »
I am sympathetic to the points this piece makes and draw attention to the point about antibiotics.  Not only does this raise questions about breeding drug-resistant bacteria in order to maximize farmer profits, but it also doses our bodies when we eat these animals, thus reducing beneficial intestinal flora and fomenting bad flora.

=================


Pig Out
By NICOLETTE HAHN NIMAN
Published: March 14, 2007
BOLINAS, Calif.

Jonathon Rosen
WITH some fanfare, the world’s largest pork producer, Smithfield Foods, recently announced that it intended to phase out certain cages for its breeding females. Called gestation crates, the cages virtually immobilize pigs during their pregnancies in metal stalls so narrow they are unable to turn around.

Numerous studies have documented crated sows exhibiting behavior characteristic of humans with severe depression and mental illness. Getting rid of gestation crates (already on their way out in the European Union) is welcome and long overdue, but more action is needed to end inhumane conditions at America’s hog farms.

Of the 60 million pigs in the United States, over 95 percent are continuously confined in metal buildings, including the almost five million sows in crates. In such setups, feed is automatically delivered to animals who are forced to urinate and defecate where they eat and sleep. Their waste festers in large pits a few feet below their hooves. Intense ammonia and hydrogen sulfide fumes from these pits fill pigs’ lungs and sensitive nostrils. No straw is provided to the animals because that would gum up the works (as it would if you tossed straw into your toilet).

In my work as an environmental lawyer, I’ve toured a dozen hog confinement operations and seen hundreds from the outside. My task was to evaluate their polluting potential, which was considerable. But what haunted me was the miserable creatures inside.

They were crowded into pens and cages, never allowed outdoors, and never even provided a soft place to lie down. Their tails had been cut off without anesthetic. Regardless of how well the operations are managed, the pigs subsist in inherently hostile settings. (Disclosure: my husband founded a network of farms that raise pigs using traditional, non-confinement methods.)

The stress, crowding and contamination inside confinement buildings foster disease, especially respiratory illnesses. In addition to toxic fumes, bacteria, yeast and molds have been recorded in swine buildings at a level more than 1,000 times higher than in normal air. To prevent disease outbreaks (and to stimulate faster growth), the hog industry adds more than 10 million pounds of antibiotics to its feed, the Union of Concerned Scientists estimates. This mountain of drugs — a staggering three times more than all antibiotics used to treat human illnesses — is a grim yardstick of the wretchedness of these facilities.

There are other reasons that merely phasing out gestation crates does not go nearly far enough. Keeping animals in such barren environments is a serious deprivation. Pigs in nature are active, curious creatures that typically spend 10 hours a day foraging, rooting and roaming.

Veterinarians consider pigs as smart as dogs. Imagine keeping a dog in a tight cage or crowded pen day after day with absolutely nothing to chew on, play with or otherwise occupy its mind. Americans would universally denounce that as inhumane. Extreme boredom is considered the main reason pigs in confinement are prone to biting one another’s tails and engaging in other aggressive behavior.

Finally, even if the gestation crate is abandoned, pork producers will still keep a sow in a narrow metal cage once she gives birth to her piglets. This slightly larger cage, called a farrowing crate, severely restricts a sow’s movements and makes normal interactions between mother and piglets impossible.

Because confinement buildings are far from cities and lack windows, all of this is shielded from public view. But such treatment of pigs contrasts sharply with what people say they want for farm animals. Surveys consistently find that Americans believe all animals, including those raised for food, deserve humane treatment. A 2004 survey by Ohio State University found that 81 percent of respondents felt that the well-being of livestock is as important as that of pets.

Such sentiment was behind the widely supported Humane Slaughter Act of 1958, which sought to improve treatment of cattle and hogs at slaughterhouses. But it’s clear that Americans expect more — they want animals to be humanely treated throughout their lives, not just at slaughter. To ensure this, Congress should ban gestation crates altogether and mandate that animal anti-cruelty laws be applied to farm animals.

As a cattle rancher, I am comfortable raising animals for human consumption, but they should not be made to suffer. Because we ask the ultimate sacrifice of these creatures, it is incumbent on us to ensure that they have decent lives. Let us view the elimination of gestation crates as just a small first step in the right direction.


Nicolette Hahn Niman, a lawyer and cattle rancher, is writing a book about the meat industry.

327
http://news.yahoo.com/s/ap/20070311/ap_en_mo/film_manufacturing_dissent

By CHRISTY LEMIRE, AP Movie Writer
Sun Mar 11, 6:02 PM ET
 


AUSTIN, Texas - As documentary filmmakers, Debbie Melnyk and Rick Caine looked up to Michael Moore.

Then they tried to do a documentary of their own about him — and ran into the same sort of resistance Moore himself famously faces in his own films.

The result is "Manufacturing Dissent," which turns the camera on the confrontational documentarian and examines some of his methods. Among their revelations in the movie, which had its world premiere Saturday night at the South by Southwest film festival: That Moore actually did speak with then-General Motors chairman Roger Smith, the evasive subject of his 1989 debut "Roger & Me," but chose to withhold that footage from the final cut.

The husband-and-wife directors spent over two years making the movie, which follows Moore on his college tour promoting 2004's "Fahrenheit 9/11." The film shows Melnyk repeatedly approaching Moore for an interview and being rejected; members of Moore's team also kick the couple out of the audience at one of his speeches, saying they weren't allowed to be shooting there.

At their own premiere Saturday night, the Toronto-based filmmakers expected pro-Moore plants in the audience heckling or trying to otherwise sabotage the screening, but it turned out to be a tame affair.

"It went really well," Melnyk said. "People really liked the film and laughed at the right spots and got the movie and we're really happy about it."

Moore hasn't commented publicly on "Manufacturing Dissent" and Melnyk thinks he never will. He also hasn't responded to several calls and e-mails from The Associated Press.

"There's no point for Michael to respond to the film because then it gives it publicity," she said.

"(President) Bush didn't respond to `Fahrenheit 9/11,' and there's a reason for that," Caine added.

The two were and still are fans of all his movies — including the polarizing "Fahrenheit 9/11," which grossed over $119 million and won the Palme d'Or at the Cannes Film Festival — and initially wanted to do a biography on him. They traveled to his childhood home of Davison, Mich., visited his high school and traced his early days in politics and journalism.

"The fact that he made documentaries entertaining was extremely influential and got all kinds of people out to see them," said Melnyk, whose previous films with Caine include 1998's "Junket Whore." "Let's face it, he made documentaries popular and that is great for all documentary filmmakers."

"All of these films — `Super Size Me,' `An Inconvenient Truth' — we've all been riding in his wake," said Caine. "There's a nonfiction film revolution going on and we're all beneficiaries of that. For that point alone, he's worth celebrating."

But after four months of unsuccessfully trying to sit down with Moore for an on-camera interview, they realized they needed to approach the subject from a different angle. They began looking at the process Moore employs in his films, and the deeper they dug, the more they began to question him.

The fact that Moore spoke with Smith, including a lengthy question-and-answer exchange during a May 1987 GM shareholders meeting, first was reported in a Premiere magazine article three years later. Transcripts of the discussion had been leaked to the magazine, and a clip of the meeting appeared in "Manufacturing Dissent." Moore also reportedly interviewed Smith on camera in January 1988 at the Waldorf Astoria hotel in New York.

Since then, in the years since "Roger & Me" put Moore on the map, those details seem to have been suppressed and forgotten.

"It was shocking, because to me that was the whole premise of `Roger & Me,'" Melnyk said.

She and Caine also had trouble finding people to talk on camera about Moore, partly because potential interview subjects assumed they were creating a right-wing attack piece; as self-proclaimed left-wingers, they weren't.

Despite what they've learned, the directors still appreciate Moore.

"We're a bit disappointed and disillusioned with Michael," Melnyk said, "but we are still very grateful to him for putting documentaries out there in a major way that people can go to a DVD store and they're right up there alongside dramatic features."

328
Science, Culture, & Humanities / Physics & Mathematics
« on: March 11, 2007, 07:27:20 AM »
Dark Energy
NY Times

Three days after learning that he won the 2006 Nobel Prize in Physics, George Smoot was talking about the universe. Sitting across from him in his office at the University of California, Berkeley, was Saul Perlmutter, a fellow cosmologist and a probable future Nobelist in Physics himself. Bearded, booming, eyes pinwheeling from adrenaline and lack of sleep, Smoot leaned back in his chair. Perlmutter, onetime acolyte, longtime colleague, now heir apparent, leaned forward in his.

“Time and time again,” Smoot shouted, “the universe has turned out to be really simple.”

Perlmutter nodded eagerly. “It’s like, why are we able to understand the universe at our level?”

“Right. Exactly. It’s a universe for beginners! ‘The Universe for Dummies’!”

But as Smoot and Perlmutter know, it is also inarguably a universe for Nobelists, and one that in the past decade has become exponentially more complicated. Since the invention of the telescope four centuries ago, astronomers have been able to figure out the workings of the universe simply by observing the heavens and applying some math, and vice versa. Take the discovery of moons, planets, stars and galaxies, apply Newton’s laws and you have a universe that runs like clockwork. Take Einstein’s modifications of Newton, apply the discovery of an expanding universe and you get the big bang. “It’s a ridiculously simple, intentionally cartoonish picture,” Perlmutter said. “We’re just incredibly lucky that that first try has matched so well.”

But is our luck about to run out? Smoot’s and Perlmutter’s work is part of a revolution that has forced their colleagues to confront a universe wholly unlike any they have ever known, one that is made of only 4 percent of the kind of matter we have always assumed it to be — the material that makes up you and me and this magazine and all the planets and stars in our galaxy and in all 125 billion galaxies beyond. The rest — 96 percent of the universe — is ... who knows?

“Dark,” cosmologists call it, in what could go down in history as the ultimate semantic surrender. This is not “dark” as in distant or invisible. This is “dark” as in unknown for now, and possibly forever.

If so, such a development would presumably not be without philosophical consequences of the civilization-altering variety. Cosmologists often refer to this possibility as “the ultimate Copernican revolution”: not only are we not at the center of anything; we’re not even made of the same stuff as most of the rest of everything. “We’re just a bit of pollution,” Lawrence M. Krauss, a theorist at Case Western Reserve, said not long ago at a public panel on cosmology in Chicago. “If you got rid of us, and all the stars and all the galaxies and all the planets and all the aliens and everybody, then the universe would be largely the same. We’re completely irrelevant.”

All well and good. Science is full of homo sapiens-humbling insights. But the trade-off for these lessons in insignificance has always been that at least now we would have a deeper — simpler — understanding of the universe. That the more we could observe, the more we would know. But what about the less we could observe? What happens to new knowledge then? It’s a question cosmologists have been asking themselves lately, and it might well be a question we’ll all be asking ourselves soon, because if they’re right, then the time has come to rethink a fundamental assumption: When we look up at the night sky, we’re seeing the universe.

Not so. Not even close.

In 1963, two scientists at Bell Labs in New Jersey discovered a microwave signal that came from every direction of the heavens. Theorists at nearby Princeton University soon realized that this signal might be the echo from the beginning of the universe, as predicted by the big-bang hypothesis. Take the idea of a cosmos born in a primordial fireball and cooling down ever since, apply the discovery of a microwave signal with a temperature that corresponded precisely to the one that was predicted by theorists — 2.7 degrees above absolute zero — and you have the universe as we know it. Not Newton’s universe, with its stately, eternal procession of benign objects, but Einstein’s universe, violent, evolving, full of births and deaths, with the grandest birth and, maybe, death belonging to the cosmos itself.

But then, in the 1970s, astronomers began noticing something that didn’t seem to fit with the laws of physics. They found that spiral galaxies like our own Milky Way were spinning at such a rate that they should have long ago wobbled out of control, shredding apart, shedding stars in every direction. Yet clearly they had done no such thing. They were living fast but not dying young. This seeming paradox led theorists to wonder if a halo of a hypothetical something else might be cocooning each galaxy, dwarfing each flat spiral disk of stars and gas at just the right mass ratio to keep it gravitationally intact. Borrowing a term from the astronomer Fritz Zwicky, who detected the same problem with the motions of a whole cluster of galaxies back in the 1930s, decades before anyone else took the situation seriously, astronomers called this mystery mass “dark matter.”

So there was more to the universe than meets the eye. But how much more? This was the question Saul Perlmutter’s team at Lawrence Berkeley National Laboratory set out to answer in the late 1980s. Actually, they wanted to settle an issue that had been nagging astronomers ever since Edwin Hubble discovered in 1929 that the universe seems to be expanding. Gravity, astronomers figured, would be slowing the expansion, and the more matter the greater the gravitational effect. But was the amount of matter in the universe enough to slow the expansion until it eventually stopped, reversed course and collapsed in a backward big bang? Or was the amount of matter not quite enough to do this, in which case the universe would just go on expanding forever? Just how much was the expansion of the universe slowing down?

The tool the team would be using was a specific type of exploding star, or supernova, that reaches a roughly uniform brightness and so can serve as what astronomers call a standard candle. By comparing how bright supernovae appear and how much the expansion of the universe has shifted their light, cosmologists sought to determine the rate of the expansion. “I was trying to tell everybody that this is the measurement that everybody should be doing,” Perlmutter says. “I was trying to convince them that this is going to be the tool of the future.” Perlmutter talks like a microcassette on fast-forward, and he possesses the kind of psychological dexterity that allows him to walk into a room and instantly inhabit each person’s point of view. He can be as persuasive as any force of nature. “The next thing I know,” he says, “we’ve convinced people, and now they’re competing with us!”
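The standard-candle logic described above can be sketched numerically. A minimal illustration, assuming the textbook distance modulus m − M = 5 log₁₀(d / 10 pc) and a rough Type Ia peak absolute magnitude of about −19.3; the observation plugged in at the end is made up for illustration, not Perlmutter's data:

```python
C_KM_S = 299_792.458   # speed of light, km/s
M_TYPE_IA = -19.3      # approximate peak absolute magnitude of a Type Ia supernova

def luminosity_distance_mpc(apparent_mag, absolute_mag=M_TYPE_IA):
    """Invert the distance modulus m - M = 5*log10(d / 10 pc) to get distance."""
    d_parsecs = 10 ** ((apparent_mag - absolute_mag + 5) / 5)
    return d_parsecs / 1e6  # parsecs -> megaparsecs

def expansion_rate(apparent_mag, redshift):
    """Naive low-redshift expansion-rate estimate: H = cz / d (km/s per Mpc)."""
    return C_KM_S * redshift / luminosity_distance_mpc(apparent_mag)

# Hypothetical supernova: peak apparent magnitude 16.25, redshift 0.03.
# A dimmer-than-expected magnitude at the same redshift would imply a
# larger distance, and hence a faster expansion than gravity alone allows.
print(expansion_rate(16.25, 0.03))  # roughly 70 km/s/Mpc, the ballpark of the measured Hubble constant
```

This is only the low-redshift limit; the actual 1998 result came from comparing many supernovae across a range of redshifts against competing expansion histories.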

By 1997, Perlmutter’s Supernova Cosmology Project and a rival team had amassed data from more than 50 supernovae between them — data that would reveal yet another oddity in the cosmos. Perlmutter noticed that the supernovae weren’t brighter than expected but dimmer. He wondered if he had made a mistake in his observations. A few months later, Adam Riess, a member of a rival international team, noticed the same general drift in his math and wondered the same thing. “I’m a postdoc,” he told himself. “I’m sure I’ve messed up in at least 10 different ways.” But Perlmutter double-checked for intergalactic dust that might have skewed his readings, and Riess cross-checked his math, calculation by calculation, with his team leader, Brian Schmidt. Early in 1998, the two teams announced that they had each independently reached the same conclusion, and it was the opposite of what either of them expected. The rate of the expansion of the universe was not slowing down. Instead, it seemed to be speeding up.

That same year, Michael Turner, the prominent University of Chicago theorist, delivered a paper in which he called this antigravitational force “dark energy.” The purpose of calling it “dark,” he explained recently, was to highlight the similarity to dark matter. The purpose of “energy” was to make a distinction. “It really is very different from dark matter,” Turner said. “It’s more energylike.”

More energylike how, exactly?

Turner raised his eyebrows. “I’m not embarrassed to say it’s the most profound mystery in all of science.”

“Extraordinary claims,” Carl Sagan once said, “require extraordinary evidence.” Astronomers love that saying; they quote it all the time. In this case the claim could hardly have been more extraordinary: a new universe was dawning.

It wouldn’t be the first time. We once thought the night sky consisted of the several thousand objects we could see with the naked eye. But the invention of the telescope revealed that it didn’t, and that the farther we saw, the more we saw: planets, stars, galaxies. After that we thought the night sky consisted of only the objects the eye could see with the assistance of telescopes that reached all the way back to the first stars blinking to life. But the discovery of wavelengths beyond the optical revealed that it didn’t, and that the more we saw in the radio or infrared or X-ray parts of the electromagnetic spectrum, the more we discovered: evidence for black holes, the big bang and the distances of supernovae, for starters.

The difference with “dark,” however, is that it lies not only outside the visible but also beyond the entire electromagnetic spectrum. By all indications, it consists of data that our five senses can’t detect other than indirectly. The motions of galaxies don’t make sense unless we infer the existence of dark matter. The brightness of supernovae doesn’t make sense unless we infer the existence of dark energy. It’s not that inference can’t be a powerful tool: an apple falls to the ground, and we infer gravity. But it can also be an incomplete tool: gravity is ... ?

Dark matter is ... ? In the three decades since most astronomers decisively, if reluctantly, accepted the existence of dark matter, observers have eliminated the obvious answer: that dark matter is made of normal matter that is so far away or so dim that it can’t be seen from earth. To account for the dark-matter deficit, this material would have to be so massive and so numerous that we couldn’t possibly miss it.

Which leaves abnormal matter, or what physicists call nonbaryonic matter, meaning that it doesn’t consist of the protons and neutrons of “normal” matter. What’s more (or, perhaps more accurately, less), it doesn’t interact at all with electricity or magnetism, which is why we wouldn’t be able to see it, and it can rarely interact even with protons and neutrons, which is why trillions of these particles might be passing through you every second without your knowing it. Theorists have narrowed the search for dark-matter particles to two hypothetical candidates: the axion and the neutralino. But so far efforts to create one of these ghostly particles in accelerators, which mimic the high levels of energy in the first fraction of a second after the birth of the universe, have come up empty. So have efforts to catch one in ultrasensitive detectors, which number in the dozens around the world.

For now, dark-matter physicists are hanging their hopes on the Large Hadron Collider, the latest-generation subatomic-particle accelerator, which goes online later this year at the European Center for Nuclear Research on the Franco-Swiss border. Many cosmologists think that the L.H.C. has made the creation of a dark-matter particle — as George Smoot said, holding up two fingers — “this close.” But one of the pioneer astronomers investigating dark matter in the 1970s, Vera Rubin, says that she has lived through plenty of this kind of optimism; she herself predicted in 1980 that dark matter would be identified within a decade. “I hope he’s right,” she says of Smoot’s assertion. “But I think it’s more a wish than a belief.” As one particle physicist commented at a “Dark Universe” symposium at the Space Telescope Science Institute in Baltimore a few years ago, “If we fail to see anything in the L.H.C., then I’m off to do something else,” adding, “Unfortunately, I’ll be off to do something else at the same time as hundreds of other physicists.”

Juan Collar might be among them. “I know I speak for a generation of people who have been looking for dark-matter particles since they were grad students,” he said one wintry afternoon in his University of Chicago office. “I doubt how many of us will remain in the field if the L.H.C. brings home bad news. I have been looking for dark-matter particles for more than 15 years. I’m 42. So most of my colleagues, my age, we are kind of going through a midlife crisis.” He laughed. “When we get together and we drink enough beer, we start howling at the moon.”

Although many scientists say that the existence of the axion will be proved or disproved within the next 10 years — as a result of work at Lawrence Livermore National Laboratory — the detection of a neutralino one way or the other is much less certain. A negative result from an experiment might mean only that theorists haven’t thought hard enough or that observers haven’t looked deep enough. “It could very well be that Mother Nature has decided that the neutralino is way down there,” Collar said, pointing not to a graph that he taped up in his office but to a point below the sheet of paper itself, at the blank wall. “If that is the case,” he went on to say, “we should retreat and worship Mother Nature. These particles maybe exist, but we will not see them, our sons will not see them and their sons won’t see them.”


329
Politics & Religion / Nuclear War, Germ War, Bio War, Chem War, WMD
« on: March 08, 2007, 10:14:38 AM »
This article raises some very important and very scary questions.  Comments?
================

The Words None Dare Say: Nuclear War
By George Lakoff www.informationclearinghouse.info/article17220.htm


"The elimination of Natanz would be a major setback for Iran's nuclear ambitions, but the conventional weapons in the American arsenal could not insure the destruction of facilities under seventy-five feet of earth and rock, especially if they are reinforced with concrete. "-Seymour Hersh, The New Yorker, April 17, 2006
"The second concern is that if an underground laboratory is deeply buried, that can also confound conventional weapons. But the depth of the Natanz facility - reports place the ceiling roughly 30 feet underground - is not prohibitive. The American GBU-28 weapon - the so-called bunker buster - can pierce about 23 feet of concrete and 100 feet of soil. Unless the cover over the Natanz lab is almost entirely rock, bunker busters should be able to reach it. That said, some chance remains that a single strike would fail. " - Michael Levi, New York Times, April 18, 2006
 

03/01/07 “ICH” — A familiar means of denying a reality is to refuse to use the words that describe that reality. A common form of propaganda is to keep reality from being described.

In such circumstances, silence and euphemism are forms of complicity both in propaganda and in the denial of reality. And the media, as well as the major presidential candidates, are now complicit.

The stories in the major media suggest that an attack against Iran is a real possibility and that the Natanz nuclear development site is the number one target. As the above quotes from two of our best sources note, military experts say that conventional "bunker-busters" such as the GBU-28 might be able to destroy the Natanz facility, especially with repeated bombings. On the other hand, they also say such iterated use of conventional weapons might not work, e.g., if the rock and earth above the facility becomes liquefied. On that supposition, a "low yield" "tactical" nuclear weapon, say, the B61-11, might be needed.

If the Bush administration, for example, were to insist on a sure "success," then the "attack" would constitute nuclear war. The words in boldface are nuclear war, that's right, nuclear war - a first strike nuclear war.

We don't know what exactly is being planned - conventional GBU-28s or nuclear B61-11s. And that is the point. Discussion needs to be open. Nuclear war is not a minor matter.

The Euphemism

As early as August 13, 2005, Bush, in Jerusalem, was asked what would happen if diplomacy failed to persuade Iran to halt its nuclear program. Bush replied, "All options are on the table." On April 18, the day after the appearance of Seymour Hersh's New Yorker report on the administration's preparations for a nuclear war against Iran, President Bush held a news conference. He was asked,

"Sir, when you talk about Iran, and you talk about how you have diplomatic efforts, you also say all options are on the table. Does that include the possibility of a nuclear strike? Is that something that your administration will plan for?"

He replied,

"All options are on the table."

The President never actually said the forbidden words "nuclear war," but he appeared to tacitly acknowledge the preparations - without further discussion.

Vice-President Dick Cheney, speaking in Australia last week, backed up the President.

"We worked with the European community and the United Nations to put together a set of policies to persuade the Iranians to give up their aspirations and resolve the matter peacefully, and that is still our preference. But I've also made the point, and the president has made the point, that all options are on the table."

Republican Presidential candidate John McCain, on FOX News, August 14, 2005, said the same.

"For us to say that the Iranians can do whatever they want to do and we won't under any circumstances exercise a military option would be for them to have a license to do whatever they want to do ... So I think the president's comment that we won't take anything off the table was entirely appropriate."

But it's not just Republicans. Democratic Presidential candidate John Edwards, in a speech in Herzliyah, Israel, echoed Bush.

"To ensure that Iran never gets nuclear weapons, we need to keep ALL options on the table. Let me reiterate - ALL options must remain on the table."

Although Edwards has said, when asked about this statement, that he prefers peaceful solutions and direct negotiations with Iran, he has nonetheless repeated the "all options on the table" position - making clear that he would consider starting a preventive nuclear war, but without using the fateful words.

Hillary Clinton, at an AIPAC dinner in New York, said,

"We cannot, we should not, we must not, permit Iran to build or acquire nuclear weapons, and in dealing with this threat, as I have said for a very long time, no option can be taken off the table."

Translation: Nuclear weapons can be used to prevent the spread of nuclear weapons.

Barack Obama, asked on 60 Minutes about using military force to prevent Iran from developing nuclear weapons, began a discussion of his preference for diplomacy by responding, "I think we should keep all options on the table."

Bush, Cheney, McCain, Edwards, Clinton, and Obama all say indirectly that they seriously consider starting a preventive nuclear war, but will not engage in a public discussion of what that would mean. That contributes to a general denial, and the press is going along with it by a corresponding refusal to use the words.

If the consequences of nuclear war are not discussed openly, the war may happen without an appreciation of the consequences and without the public having a chance to stop it. Our job is to open that discussion.

Of course, there is a rationale for the euphemism: To scare our adversaries by making them think that we are crazy enough to do what we hint at, while not raising a public outcry. That is what happened in the lead up to the Iraq War, and the disaster of that war tells us why we must have such a discussion about Iran. Presidential candidates go along, not wanting to be thought of as interfering in on-going indirect diplomacy. That may be the conventional wisdom for candidates, but an informed, concerned public must say what candidates are advised not to say.

More Euphemisms

The euphemisms used include "tactical," "small," "mini-," and "low yield" nuclear weapons. "Tactical" contrasts with "strategic"; it refers to tactics, relatively low-level choices made in carrying out an overall strategy, but which don't affect the grand strategy. But the use of any nuclear weapons would be anything but "tactical." It would be a major world event - in Vladimir Putin's words, "lowering the threshold for the use of nuclear weapons," making the use of more powerful nuclear weapons more likely and setting off a new arms race. The use of the word "tactical" operates to lessen their importance, to distract from the fact that their very use would constitute a nuclear war.

What is "low yield"? Perhaps the "smallest" tactical nuclear weapon we have is the B61-11, which has a dial-a-yield feature: it can yield "only" 0.3 kilotons, but can be set to yield up to 170 kilotons. The power of the Hiroshima bomb was 15 kilotons. That is, a "small" bomb can yield more than 10 times the explosive power of the Hiroshima bomb. The B61-11 dropped from 40,000 feet would dig a hole 20 feet deep and then explode, send shock waves downward, leave a huge crater, and spread radiation widely. The idea that it would explode underground and be harmless to those above ground is false - and, anyway, an underground release of radiation would threaten ground water and aquifers for a long time and over a wide distance.
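The yield arithmetic above is easy to verify. A minimal check using only the figures quoted in this essay (kilotons of TNT equivalent):

```python
# Figures quoted above, in kilotons of TNT equivalent
B61_11_MIN_KT = 0.3    # lowest dial-a-yield setting of the B61-11
B61_11_MAX_KT = 170.0  # highest setting
HIROSHIMA_KT = 15.0    # approximate yield of the Hiroshima bomb

ratio = B61_11_MAX_KT / HIROSHIMA_KT
# ~11.3: the "small" B61-11, dialed to its maximum, exceeds
# ten times the explosive power of the Hiroshima bomb.
print(ratio)
```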

To use words such as "low yield" or "small" or "mini-" nuclear weapon is like speaking of being a little bit pregnant. Nuclear war is nuclear war! It crosses the moral line.

Any discussion of roadside canister bombs made in Iran justifying an attack on Iran should be put in perspective: Little canister bombs (EFPs - explosively formed projectiles) that shoot a small hot metal ball at a humvee or tank versus nuclear war.

Incidentally, the administration may be focusing on the canister bombs because it seeks to claim that the Authorization for Use of Military Force Against Iraq Resolution of 2002 permits the use of military force against Iran based on its interference in Iraq. In that case, no further authorization by Congress would be needed for an attack on Iran.

The journalistic point is clear. Journalists and political leaders should not talk about an "attack." They should use the words that describe what is really at stake: nuclear war - in boldface.

Then there is the scale of the proposed attack. Military reports leaking out suggest a huge (mostly or entirely non-nuclear) airstrike on as many as 10,000 targets - a "shock and awe" attack that would destroy Iran's infrastructure the way the U.S. bombing destroyed Iraq's infrastructure. The targets would not just be "military targets." As Dan Plesch reports in the New Statesman, February 19, 2007, such an attack would wipe out Iran's military, business, and political infrastructure. Not just nuclear installations, missile launching sites, tanks, and ammunition dumps, but also airports, rail lines, highways, bridges, ports, communications centers, power grids, industrial centers, hospitals, public buildings, and even the homes of political leaders. That is what was attacked in Iraq: the "critical infrastructure." It is not just military in the traditional sense. It leaves a nation in rubble, and leads to death, maiming, disease, joblessness, impoverishment, starvation, mass refugees, lawlessness, rape, and incalculable pain and suffering. That is what the options appear to be "on the table." Is nation destruction what the American people have in mind when they acquiesce without discussion to an "attack"? Is nuclear war what the American people have in mind? An informed public must ask and the media must ask. The words must be used.

Even if the attack were limited to nuclear installations, starting a nuclear war with Iran would have terrible consequences - and not just for Iranians. First, it would strengthen the hand of the Islamic fundamentalists - exactly the opposite of the effect U.S. planners would want. It would be viewed as yet another major attack on Islam. Fundamentalist Islam is a revenge culture. If you want to recruit fundamentalist Islamists all over the world to become violent jihadists, this is the best way to do it. America would become a world pariah. Any idea of the U.S. as a peaceful nation would be destroyed. Moreover, you don't work against the spread of nuclear weapons by using those weapons. That will just make countries all over the world want nuclear weaponry all the more. Trying to stop nuclear proliferation through nuclear war is self-defeating.

As Einstein said, "You cannot simultaneously prevent and prepare for war."

Why would the Bush administration do it? Here is what conservative strategist William Kristol wrote last summer during Israel's war with Hezbollah.

"For while Syria and Iran are enemies of Israel, they are also enemies of the United States. We have done a poor job of standing up to them and weakening them. They are now testing us more boldly than one would have thought possible a few years ago. Weakness is provocative. We have been too weak, and have allowed ourselves to be perceived as weak.

The right response is renewed strength -- in supporting the governments of Iraq and Afghanistan, in standing with Israel, and in pursuing regime change in Syria and Iran. For that matter, we might consider countering this act of Iranian aggression with a military strike against Iranian nuclear facilities. Why wait? Does anyone think a nuclear Iran can be contained? That the current regime will negotiate in good faith? It would be easier to act sooner rather than later. Yes, there would be repercussions -- and they would be healthy ones, showing a strong America that has rejected further appeasement."

- William Kristol, Weekly Standard, 7/24/06

"Renewed strength" is just the Bush strategy in Iraq. At a time when the Iraqi people want us to leave, when our national elections show that most Americans want our troops out, when 60% of Iraqis think it all right to kill Americans, Bush wants to escalate. Why? Because he is weak in America. Because he needs to show more "strength." Because if he knocks out the Iranian nuclear facilities, he can claim at least one "victory." Starting a nuclear war with Iran would really put us in a worldwide war with fundamentalist Islam. It would make real the terrorist threat he has been claiming since 9/11. It would create more fear - real fear - in America. And he believes, with much reason, that fear tends to make Americans vote for saber-rattling conservatives.

Kristol's neoconservative view that "weakness is provocative" is echoed in Iran, but by the other side. Mahmoud Ahmadinejad was quoted in The New York Times of February 24, 2007 as having "vowed anew to continue enriching uranium, saying, 'If we show weakness in front of the enemies, they will increase their expectations.'" If both sides refuse to back off for fear of showing weakness, then prospects for conflict are real, despite the repeated analyses, like that of The Economist that the use of nuclear weapons against Iran would be politically and morally impossible. As one unnamed administration official has said (The New York Times, February 24, 2007), "No one has defined where the red line is that we cannot let the Iranians step over."

What we are seeing now is the conservative message machine preparing the country to accept the ideas of a nuclear war and nation destruction against Iran. The technique used is the "slippery slope." It is done by degrees. Like the proverbial frog in the pot of water - if the heat is turned up slowly the frog gets used to the heat and eventually boils to death - the American public is getting gradually acclimated to the idea of war with Iran.

* First, describe Iran as evil - part of the axis of evil. An inherently evil person will inevitably do evil things and can't be negotiated with. An entire evil nation is a threat to other nations.
* Second, describe Iran's leader as a "Hitler" who is inherently "evil" and cannot be reasoned with. Refuse to negotiate with him.
* Then repeat the lie that Iran is on the verge of having nuclear weapons - weapons of mass destruction. IAEA Director General Mohamed ElBaradei says they are at best many years away.
* Call nuclear development "an existential threat" - a threat to our very existence.
* Then suggest a single "surgical" "attack" on Natanz and make it seem acceptable.
* Then find a reason to call the attack "self-defense" - or better protection for our troops from the EFPs, or single-shot canister bombs.
* Claim, without proof and without anyone even taking responsibility for the claim, that the Iranian government at its highest level is supplying deadly weapons to Shiite militias attacking our troops, while not mentioning the fact that Saudi Arabia is helping Sunni insurgents attacking our troops.
* Give "protecting our troops" as a reason for attacking Iran without getting new authorization from Congress. Claim that the old authorization for attacking Iraq implied doing "whatever is necessary to protect our troops" from Iranian intervention in Iraq.
* Argue that de-escalation in Iraq would "bleed" our troops, "weaken" America, and lead to defeat. This sets up escalation as a winning policy, if not in Iraq then in Iran.
* Get the press to go along with each step.
* Never mention the words "preventive nuclear war" or "national destruction." When asked, say, "All options are on the table." Keep the issue of nuclear war and its consequences from being seriously discussed by the national media.
* Intimidate Democratic presidential candidates into agreeing, without using the words, that nuclear war should be "on the table." This makes nuclear war and nation destruction bipartisan and even more acceptable.

Progressives managed to blunt the "surge" idea by telling the truth about "escalation." Nuclear war against Iran and nation destruction constitute the ultimate escalation.

The time has come to stop the attempt to make a nuclear war against Iran palatable to the American public. We do not believe that most Americans want to start a nuclear war or to impose nation destruction on the people of Iran. They might, though, be willing to support a tit-for-tat "surgical" "attack" on Natanz in retaliation for small canister bombs and to end Iran's early nuclear capacity.

It is time for America's journalists and political leaders to put two and two together, and ask the fateful question: Is the Bush administration seriously preparing for nuclear war and nation destruction? If the conventional GBU-28s will do the job, then why not take nuclear war off the table in the name of controlling the spread of nuclear weapons? If GBU-28s won't do the job, then it is all the more important to have that discussion.

This should not be a distraction from Iraq. The general issue is escalation as a policy, both in Iraq and in Iran. They are linked issues, not separate issues. We have learned from Iraq what lack of public scrutiny does.

George Lakoff is a Senior Fellow at the Rockridge Institute. Lakoff is Professor of Linguistics at the University of California, Berkeley.

330
March 7, 2007



By RANDAL C. ARCHIBOLD
TOHONO O'ODHAM NATION, Ariz. - A fresh footprint in the dirt, fibers in the mesquite. Harold Thompson reads the signs like a map.

They point to drug smugglers, 10 or 11, crossing from Mexico. The deep impressions and spacing are a giveaway to the heavy loads on their backs. With no insect tracks or paw prints of nocturnal creatures marking the steps, Mr. Thompson determines the smugglers probably crossed a few hours ago.

"These guys are not far ahead; we'll get them," said Mr. Thompson, 50, a strapping Navajo who follows the trail like a bloodhound.

At a time when all manner of high technology is arriving to help beef up security at the Mexican border - infrared cameras, sensors, unmanned drones - there is a growing appreciation among the federal authorities for the American Indian art of tracking, honed over generations by ancestors hunting animals.

Mr. Thompson belongs to the Shadow Wolves, a federal law enforcement unit of Indian officers that has operated since the early 1970s on this vast Indian nation straddling the Mexican border.

Tracking skills are in such demand that the Departments of State and Defense have arranged for the Shadow Wolves to train border guards in other countries, including some central to the fight against terrorism. Several officers are going to train border police in Tajikistan and Uzbekistan, which border Afghanistan, and in several other countries.

In the renewed push to secure the border with Mexico, the curbing of narcotics trafficking often gets less public attention than the capturing of illegal immigrants.

But the 15-member Shadow Wolves unit, part of Immigration and Customs Enforcement, is recruiting members to reach the congressionally authorized complement of 21. And the immigration agency is considering forming a sister unit to patrol part of the Canadian border at the Blackfeet reservation in Montana, where concern about drug trafficking is growing.

"Detecting is one thing, and apprehending is something entirely different," said Rodney Irby, a special agent in Tucson for the immigration agency who helps supervise the Shadow Wolves. "I applaud the technology; it will only make the border more secure. But there are still going to be groups of people who penetrate the most modern technology, and we need a cadre of agents and officers to apprehend them."

The Shadow Wolves have seized nearly 30,000 pounds of illegal drugs since October, putting them on pace to meet or exceed previous annual seizure amounts. They routinely seize some 100,000 pounds of illegal drugs a year, Mr. Irby said.

They home in on drug smugglers, who use less-traveled cattle tracks, old wagon-wheel trails and barely formed footpaths to ferry their loads to roads and highways about 40 miles from the border.

The Tohono land, which is the size of Connecticut and the third-largest reservation in area in the country, has long vexed law enforcement. Scores of people die crossing here every year in the searing, dry heat of summer or the frigid cold of winter. And its 76-mile-long border with Mexico, marked in most places with a three- or four-strand barbed-wire fence that is easy to breach, is a major transshipment point for marijuana, Mexico's largest illicit crop.

Adding to the challenge is that drug smugglers have enlisted tribal members or forced them into cooperation, sometimes stashing their loads in the ramshackle houses dotting the landscape or paying the young to act as guides. Several tribal members live on the Mexican side, and those on the American side have long freely crossed the border, which they usually do through a few informal entry points that drug traffickers, too, have picked up on.

How much the Shadow Wolves disrupt the criminal organizations is debated. Officials said they believed the group's work at least complicated drug smuggling operations - the Shadow Wolves have received death threats over the years - but they said they could not estimate the amount of drugs making it through.

Marvin Eleando, a Tohono who retired from the unit in 2004, said he believed the Shadow Wolves got just a small fraction of the drugs moving through the Tohono lands. Mr. Eleando estimated it would take about 100 Shadow Wolves to truly foil the smugglers, who employ spotters on mountaintops who watch for officers and then shift routes accordingly.

Still, he said, the unit must keep up the effort because the drugs, and the gun violence often associated with trafficking, imperil tribal members.

"The kids get mixed up in this and then don't want to work anymore," Mr. Eleando said.

Lately, according to the Border Patrol and Immigration and Customs Enforcement, drug seizures in Arizona, and especially around the reservation and the Tucson area, have surged, and the size of the loads found has increased.

Officials said it was too soon to tell whether the uptick signaled a long-term pattern. But they believed it could be partly explained by the additional staffing on the border. Law enforcement officials said that there also appeared to be a bumper crop of marijuana in Mexico and that smugglers seemed to be trying to ship tons of it ahead of government crackdowns there.

"We never know how much is being pushed in our direction," said David V. Aguilar, the chief of the Border Patrol, though he added that it seemed the amount was "higher at this point."

Alonzo Peña, the agent in charge of Immigration and Customs Enforcement in Arizona, said investigators had many theories but little concrete information to explain the increase in trafficking.

"Is this marijuana that has been sitting in warehouses, and they are trying to get rid of it now that there is a strong hand in Mexico?" Mr. Peña said. "We just don't know other than that we are seeing more loads and bigger loads in many areas."

The Shadow Wolves, established with a handful of officers in 1972 as part of what was then the United States Customs Service, were the first federal law enforcement officers allowed on Tohono land.

The federal government agreed to the Tohono O'odham Nation's demand that the officers have American Indian ancestry, a requirement still in place. Members are at least one-quarter Indian, and the current group represents seven tribes, including the Tohono.

While other law enforcement agencies, including the Border Patrol, use tracking, the Shadow Wolves believe that their experience and their Indian ancestry give them an edge, particularly here.

"I speak the language, so when we are dealing with elderly members in particular I can make them more comfortable," said Gary Ortega, a Tohono who has been in the Shadow Wolves for nine years. "They are willing to tell us things they know or see that they may not tell another federal agent or
officer."
There is also, of course, the thrill of the hunt.
On a recent day, Mr. Thompson picked up the track around 3 a.m. and, with
Mr. Ortega, stayed on it for nearly 12 hours through thorny thickets and
wide-open desert. As the terrain grew craggy, Mr. Thompson kept a brisk
pace, with Mr. Ortega and other officers leapfrogging ahead to help find the
trail.
"Every chase is just a little different," Mr. Ortega said, barely pausing as
he followed the prints in the sand.
It grew easier as the sun rose and the smugglers kept bumping into thorny
bushes and stopping to rest, leaving their food wrappers behind and coat
fibers in the cat-claw brush. By midafternoon, Mr. Ortega and Mr. Thompson
were tiring, too. But the scent of the men's burlap sacks perked up Mr.
Ortega, and he quickened his pace, finally catching sight of the smugglers
and prompting them to bolt from their resting spot.
Left behind were 10 bales of marijuana, 630 pounds in total, a fairly
typical bust, with a street value of more than $315,000.
With the weight off their backs, the smugglers showed new speed dashing to
hiding places and easily outmatched their pursuers. Other Shadow Wolves
drove out to pick up the load, finding their colleagues resting on the bales
and grinning in satisfaction.
"When we get the dope or the guys," Mr. Thompson said, "that's when it ends."

331
Politics & Religion / Cyber Jihad
« on: March 02, 2007, 10:01:52 PM »
Cyberspace as a combat zone: The phenomenon of Electronic Jihad

E. ALSHECH , THE JERUSALEM POST Feb. 28, 2007
Alongside military jihad, which has been gaining momentum and extracting an ever-growing price from many countries around the globe, Islamists have been developing a new form of warfare, termed "electronic jihad," which is waged on the Internet. This new form of jihad was launched in recent years and is still in its early stages of development. However, as this paper will show, Islamists are fully aware of its destructive potential, and persistently strive to realize this potential.

Electronic jihad is a phenomenon whereby mujahideen use the Internet to wage economic and ideological warfare against their enemies. Unlike other hackers, those engaged in electronic jihad are united by a common strategy and ideology which are still in a process of formation.

This paper aims to present the phenomenon of electronic jihad and to characterize some of its more recent developments. It lays out the basic ideology and motivations of its perpetrators, describes, as far as possible, its various operational strategies, and assesses the short and long-term dangers posed by this relatively new phenomenon. The paper focuses on electronic jihad waged by organized Islamist groups that mobilize large numbers of hackers around the world to attack servers and Web sites owned by those whom they regard as their enemies.

Organized Electronic Jihad

In the past few years Islamist Web sites have provided ample evidence that Islamist hackers do not operate as isolated individuals, but carry out coordinated attacks against Web sites belonging to those whom they regard as their enemies. As evident from numerous postings on the Islamist Web sites, many of these coordinated attacks are organized by groups devoted to electronic jihad. Six prominent groups of this sort have emerged on the Internet over the past few years: Hackboy, Ansar Al-Jihad Lil-Jihad Al-Electroni, Munazamat Fursan Al-Jihad Al-Electroni, Majmu'at Al-Jihad Al-Electroni, Majma' Al-Haker Al-Muslim, and Inhiyar Al-Dolar. All these groups, with the exception of Munazamat Fursan Al-Jihad and Inhiyar Al-Dolar, have Web sites of their own through which they recruit volunteers to take part in electronic attacks, maintain contacts with others who engage in electronic jihad, coordinate their attacks, and enable their members to chat with one another anonymously.

The Majmu'at Al-Jihad Al-Electroni Web site, for example, includes the following sections: a document explaining the nature of electronic jihad, a section devoted to electronic jihad strategy, a technical section on software used for electronic attacks, a section describing previous attacks and their results, and various appeals to Muslims, mujahideen, and hackers worldwide.

A more recent indication of the increasingly organized nature of electronic jihad is an initiative launched January 3, 2007 on Islamist Web sites: mujahideen operating on the Internet (and in the media in general) were invited to sign a special pact called "Hilf Al-Muhajirin" (Pact of the Immigrants). In it, they agree "to stand united under the banner of the Muhajirun Brigades in order to promote [cyber-warfare]," and "to pledge allegiance to the leader [of the Muhajirun Brigades]." They vow to "obey [the leader] in [all tasks], pleasant or unpleasant, not to contest [his] leadership, to exert every conceivable effort in [waging] media jihad...[and to persist] in attacking those websites which do harm to Islam and to the Muslims..."

This initiative clearly indicates that the Islamist hackers no longer regard themselves as loosely connected individual activists, but as dedicated soldiers who are bound by a pact and committed to a joint ideological mission.

The Ideology and Ethical Boundaries of Electronic Jihad

Mission statements posted on the Web sites of electronic jihad groups reveal that just like the mujahideen on the military front, the mujahideen operating on the Internet are motivated by profound ideological conviction.

They despise hackers who "engage in purposeless and meaningless sabotage" or are motivated by desire for publicity or by any other worldly objective. They perceive themselves as jihad-fighters who assist Islam and promote monotheism via the Internet.

More importantly, they view cyberspace as a virtual battlefield in which the mujahideen can effectively defeat the West.

That the mujahideen operating in cyberspace are motivated by ideology, in contrast to many hackers, is illustrated by the following example. Recently, a participant on an Islamist forum posted instructions for breaking into a UK-based commercial Web site and stealing the customers' credit card information in order to inflict financial damage on the "unbelievers" (i.e., on the non-Muslim customers and retailers). His initiative sparked a fierce debate among the forum participants, the dominant opinion being that this initiative falls outside the boundaries of legitimate cyberjihad. One forum participant wrote: "Oh brother, we do not steal... We attack racist, American and Shi'ite [websites] and all corrupt websites." Another participant reminded the forum members that stealing from unbelievers is forbidden.

One objective of electronic jihad which is frequently evoked by the mujahideen is assisting Islam by attacking Web sites that slander Islam or launch attacks against Islamic Web sites, or by attacking websites that interfere with the goal of rendering Islam supreme (e.g. Christian Web sites). More recently, however, the mujahideen have begun to cite additional objectives: avenging the death of Muslim martyrs and the suffering of Muslims worldwide (including imprisoned jihad fighters); inflicting damage on Western economy; affecting the morale of the West; and even bringing about the total collapse of the West.

The following excerpts from Arabic messages posted by Islamist hackers exemplify each of these objectives.

Eliminating Websites That Harm Islam

"The administration wishes to inform you of the following so that you understand our operational methods and our jihad strategy. My brothers, our operational methods are not only to assault... and target any website that stands in the way of our victory... We are indeed victorious when we disable such [harmful] websites, but the matter is not so simple. We target...websites that wage intensive war [against us]... We target them because they are the foremost enemies of jihad in cyberspace; their existence threatens Islamic and religious websites throughout the Internet..."

Avenging the Death of Martyrs and the Suffering of Muslims and Imprisoned Mujahideen Worldwide

"We shall say to the Crusaders and their followers: We take an oath to avenge the martyrs' blood and the weeping of Muslim mothers and children. The Worshipers of the Cross and their followers have already been warned that their websites may be broken into and destroyed. We must not forget our leaders, our mujahideen, our people and our children who were martyred in Palestine, Iraq, Afghanistan, Chechnya and in other places. We shall take revenge upon you, O' Zionists and Worshipers of the Cross. We shall never rest or forget what you did to us. [There are only two options] in electronic jihad for the sake of Allah: Victory or death.

We dedicate these [operations of] hacking [into enemy websites] to the martyr and jihad-fighter sheikh Abu Mus'ab Al-Zarqawi, to the jihad-fighter Sheikh Osama bin Laden, to the imprisoned fighter of electronic jihad Irhabi 007, to the fighter of electronic jihad Muhibb Al-Shaykhan and to all the mujahideen for the sake of Allah..."

Inflicting Economic Damage on the West and Damaging its Morale

"Allah has commanded us in various Koranic verses to wage war against the unbelievers... Electronic jihad utilizes methods and means which inflict great material damage on the enemy and [which also] lower his morale and his spirits via the Internet. The methods of [hacking] have been revealed [to us] by expert [hackers] on the Internet and networks... many of whom engage in purposeless and meaningless sabotage. These lethal methods will be harnessed [for use] against our enemies, so as to inflict the greatest [possible] financial damage [upon them] - which can amount to millions - and [in order] to damage [their] morale, so that [they] will be afraid of the Muslims wherever they go and even when they are surfing the Web."

Bringing About the Total Collapse of the West

"I have examined most of the material [available] in hacking manuals but have not found articles which discuss... how to disable all the [electronic] networks around the world. I found various articles which discuss how to attack websites, e-mails, servers, etc., but I have not read anything about harming or blocking the networks around the world, even though this is one of the most important topics for a hacker and for anyone who engages in electronic jihad. Such [an attack] will cripple the West completely. I am not talking about attacking websites or [even] the Internet [as a whole], but [about attacking] all the [computer] networks around the world including military networks, and [networks] which control radars, missiles and communications around the world... If all these networks stop [functioning even] for a single day... it will bring about the total collapse of the West... while affecting our interests only slightly. The collapse of the West will bring about the breakdown of world economy and of the stock markets, which depend on [electronic] communication [for] their activities, [e.g.] transfers of assets and shares. [Such an attack] will cause the capitalist West to collapse."

Actual Attacks and Their Effects

Reports on Islamist Web sites indicate that most of the hacking operations carried out by mujahideen have been aimed at three types of Web sites: a) Ideological Web sites which promote beliefs, doctrines and ideologies which the mujahideen perceive as incompatible with Sunni Islam, such as Christianity, Shi'ism and Zionism. b) Web sites which the mujahideen perceive as defamatory or harmful to Islam. Many of these are private blogs, news blogs and non-Islamic forums (e.g., http://answering-islam.org.uk ). c) Web sites which promote behavior that is contrary to the mujahideen's religious worldview (e.g., http://www.nscrush.org/news/journal, a Web site associated with a girls' sports team).

As for Web sites associated with governments, defense systems, and Western economic interests - Islamist Web sites present little or no evidence that mujahideen have actually attacked them. There is, however, sufficient evidence to suggest that such sensitive targets continue to be of intense interest to the mujahideen. For example, an Islamist forum recently conducted a survey among its participants regarding the targets they would like to attack. Among the targets suggested were Western financial Web sites and Web sites associated with the FBI and CIA. Moreover, in September 2006, an Islamic Web site posted a long list of IP addresses allegedly associated with key governmental defense institutions in the West, including "the Army Ballistics Research Laboratory," "the Army Armament Research Development and Engineering Center," "the Navy Computers and Telecommunications Station," "the National Space Development Agency of Japan," and others. The title of the message indicates that the list is meant for use in electronic attacks.

Another message, posted on an Islamist Web site on December 5, 2006, stated that Islamist hackers had cancelled a planned attack, nicknamed "The Electronic Guantanamo Raid," against American banks. The posting explained that the attack had been cancelled because the banks had been warned about the attack by American media and government agencies. It stated further that the panic in the media shows how important it is "to focus on attacking sensitive economic American websites [instead of] other [websites, like those that offend Islam]..." The writer added: "If [we] attack websites associated with the stock[market] and with banks, disabling them for a few days or even for a few hours, it will cause millions of dollars' worth of damage... I [therefore] call upon all members [of this forum] to focus on these websites and to urge all Muslims who are able to participate in this [type of] Islamic Intifada to attack websites associated with the American stock[market] and banks..."

332
Politics & Religion / Thailand
« on: February 24, 2007, 06:22:14 PM »
http://www.smh.com.au/news/world/thais-sense-of-self-th...ullpage#contentSwap1

Thais' sense of self threatened by insurgency

Tom Allard National Security Editor
February 24, 2007

Brutality … a man carries a bar girl injured in a bombing in Yala province last Sunday.
Photo: AFP

BEHEADINGS, mutilated Buddhist monks, assassinations of secular teachers, mass-casualty attacks - the Islamist insurgency raging in Thailand's south is getting more barbaric and effective with each passing month.

That is the assessment of terrorism analysts and Thai Government advisers after a spate of co-ordinated and deadly bombings this week, and warnings of more to come, including in Bangkok.

Even more worrying is the possibility of attacks on tourist resorts where Westerners, including thousands of Australians, flock.

"The brutality is amazing," said Zachary Abuza, a US terrorism expert who specialises in a conflict that has simmered for decades. "For the previous generation, these acts would have been considered unseemly. No one would have done things like hacking apart monks, blowing them up when they are collecting their alms, targeting women and children."

Thailand's Islamic minority, centred on four provinces abutting Malaysia, has long complained of mistreatment. But the ferocity of the insurgency has stunned the Government, with more than 2000 people killed since 2004.

There have been about 30 beheadings, and 60 more botched attempts. More than 60 teachers have died, along with hundreds of bystanders, police and soldiers.

"This is new to the Thai people," said Panitan Wattanayagorn, an academic who advises the Prime Minister, Surayud Chulanont. "It's been quite a shock. Thais are learning about cultural differences. They assumed everyone was Thai, had the Thai national identity. Apparently, not so."

The bombs are becoming larger and more sophisticated, and the ideology underpinning the attacks more virulent.

Dr Abuza said: "It's more Islamist than it's ever been … [but] they want separate communities, from private Islamic schools to their own courts. They are convincing women not to go to hospital to give birth."

The insurgency has received scant attention in the West, which is puzzling given the scale of the violence and Thailand's popularity as a tourist destination.

For Thailand's military-backed government, deposing Thaksin Shinawatra in September was justified, at least in part, by his inability to come to grips with the insurgency. It was Mr Thaksin who reacted with indifference when mosques were attacked and when 78 unarmed protesters died of asphyxiation in the back of army trucks. This infuriated Thai Muslims and prompted a surge of recruits.

However, a public apology, the dropping of charges against protesters, even a willingness to introduce a degree of Islamic law in the region have failed to gain the Government any kudos.

The Thai authorities do not even know who their enemy is, Dr Abuza says. The insurgents operate in largely autonomous cells, never stating their goals or accepting responsibility for attacks.

Jemaah Islamiah and other al-Qaeda-affiliated groups have been in contact with the insurgency - the JI leader Hambali was arrested in Thailand - but the consensus is that it remains self-directed.

Nevertheless, it has adopted many techniques of the global jihadist movement, from simultaneous bomb attacks to the emphasis on civilian targets. Like Jemaah Islamiah, it also abhors the West, in particular the nightclubs, bars and other "dens of sin" that are so common in Bangkok and the tourist towns.

Tourism operators, who are enjoying a revival in business following the 2004 Boxing Day tsunami, are terrified, Professor Panitan said. Muslims in Phuket "watch anyone who comes up from the south very closely. It's worked to date, but how long will it hold?" Dr Abuza says information from Phuket's Muslim minority led to the arrests of a group of suspected insurgents in November.

For now, Dr Abuza believes the insurgents will stay away from tourist centres. "It would be easy enough [to attack tourists]. But I don't think they have to yet, because they are winning. The change of strategy comes when you are losing.

"If they were backed into a corner, I don't think they would hesitate for a second."

333
Science, Culture, & Humanities / Survival issues outside the home
« on: February 21, 2007, 03:33:31 PM »
All:

This thread is intended to be a companion thread to the Survival-- hunkering down in the home thread.  The idea is to enable the conversation in each to be more focused.

I'll begin this thread with:

In the eco-system in which I find myself, many plausible disaster scenarios could lead to the fellow members of my species also seeking gasoline, making it very difficult and/or time-consuming to get.

Therefore I am interested in the advantages and disadvantages of buying a diesel pickup truck (my current truck is 17 years old and smells of 17 years of sweat  :-P ) and putting it through a conversion for about $800 that would enable it to ALSO run on

a) bio-diesel, which is eco-friendly and already available at some stations here in CA, and, more importantly

b) bio fuel, e.g. soybean oil or the like.

My thought with the latter is that I could safely store quite a bit of soybean oil at the house and in the event of sustained non-availability of gasoline be good to go for quite some time.

Similarly, in an "Escape from LA" kind of scenario, I could throw 100 gallons (or whatever) in the back of the truck and be good to go for quite a ways-- again, without the safety issues of storing gasoline.
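For what it's worth, here is a back-of-the-envelope sketch of what that stored fuel buys in range. Both numbers are assumptions for illustration (fuel economy varies a lot with truck, load and terrain), not specs for any particular vehicle or conversion:

```python
# Rough range estimate for the stored-fuel scenario above.
# assumed_mpg is a guess for a loaded diesel pickup, not a
# figure for any specific truck or vegetable-oil conversion.
def range_miles(gallons, mpg):
    """Driving range available from a given fuel store."""
    return gallons * mpg

stored_gallons = 100   # soybean oil thrown in the truck bed
assumed_mpg = 17       # hypothetical loaded-pickup economy

print(range_miles(stored_gallons, assumed_mpg))  # 1700
```

At those (assumed) numbers, 100 gallons works out to something on the order of 1,700 miles-- plenty for getting well clear of the LA basin, which is presumably the point.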

Does anyone know anything about this?

Marc

PS:  This also keeps money out of the pockets of the mad mullahs et al of Iran and the rest of the mideast as well as makes for a cleaner environment.


334
Science, Culture, & Humanities / Survivalist, Prepper/prepping issues
« on: February 21, 2007, 09:43:15 AM »
All:

On the nearby "Epidemics" thread, SB Mig has a good post today that, inter alia, brings up the survivalist issues that could arise during an epidemic, such as having to hunker down in one's home for an extended stretch.

As an Angeleno, such questions have been on my radar screen for a while due to the plethora of possibilities for pandemonium in the greater Los Angeles region-- earthquake, brushfire, terrorism, mass breakdown of social order (think of the Rodney King riots), shut off of water to LA etc.   An epidemic is simply one more SHTF scenario for us.  Being snowed in might be one for other parts of the country.

Anyway, this thread is for asking questions and sharing tips about being able to hunker down at home for an extended stretch.

I'll kick things off:

We have a large generator.  At a hardware store it probably would have cost over $800 but at Costco we were able to get it for under $400.  We have 15 gallons of gasoline.  We start up the generator about twice a year to make sure all is well.

I am looking into solar packs for cell phones and laptop computers.

We have somewhere between 25 and 50 gallons of water.

I am not clear on how many days' worth of food that does not require cooking we have-- but I should be.

The house has suitable levels of firepower for social disorder as well.
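The inventory above invites a simple sanity check on how long the supplies would last. A minimal sketch, with the consumption rates as assumptions (the 1 gallon of water per person per day figure is the commonly cited FEMA planning minimum; generator burn rate varies widely by model and load):

```python
# Hedged supply-duration estimates; the rates here are planning
# assumptions, not measurements of any particular gear.
def days_of_water(gallons_stored, people, gal_per_person_day=1.0):
    """Days of drinking water at a per-person daily ration."""
    return gallons_stored / (people * gal_per_person_day)

def generator_hours(gallons_fuel, gal_per_hour):
    """Hours of generator runtime at an assumed burn rate."""
    return gallons_fuel / gal_per_hour

# 25-50 gallons of water for a hypothetical household of four:
print(days_of_water(25, 4))        # 6.25
print(days_of_water(50, 4))        # 12.5

# 15 gallons of gasoline at an assumed 0.75 gal/hr (part load):
print(generator_hours(15, 0.75))   # 20.0
```

Run intermittently (fridge top-ups rather than continuous power), those 20-odd generator hours can stretch across several days; at these assumed rates the water, not the fuel, is the tighter constraint.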

TAC,
Marc

PS:  I see SB Mig has just added this to his post:  http://www.slate.com/id/2148772/sidebar/2149226/ent/2148778/
It looks very useful-- thanks SB!

335
Science, Culture, & Humanities / Inspirational thoughts
« on: February 21, 2007, 09:29:41 AM »
Woof All:

Sometimes we run across something that helps us center ourselves.  This thread is for sharing such things-- and yes, feel free to run the risk of being a bit corny.

Marc
=============================

 
God's Coffee

A group of alumni, highly established in their careers, got together to
visit their old university professor. Conversation soon turned into
complaints about stress in work and life.

Offering his guests coffee, the professor went to the kitchen and
returned with a large pot of coffee and an assortment of cups - porcelain,
plastic, glass, crystal, some plain looking, some expensive, some exquisite ---
telling them to help themselves to the coffee.

When all the students had a cup of coffee in hand, the professor said:
"If you noticed, all the nice looking expensive cups were taken up,
leaving behind the plain and cheap ones. While it is normal for you to
want only the best for yourselves, that is the source of your problems
and stress.

Be assured that the cup itself adds no quality to the coffee. In most
cases it is just more expensive and in some cases even hides what we
drink.

What all of you really wanted was coffee, not the cup, but you
consciously went for the best cups... And then you began eyeing each other's cups.

Now consider this: Life is the coffee; the jobs, money and position in
society are the cups. They are just tools to hold and contain Life, and
the type of cup we have does not define, nor change the quality of Life we
live.

Sometimes, by concentrating only on the cup, we fail to enjoy the
coffee God has provided us."

God brews the coffee, not the cups... Enjoy your coffee!

"The happiest people don't have the best of everything. They just make
the best of everything they have."

Live simply. Love generously. Care deeply. Speak kindly. Leave the rest
to God.

336
Politics & Religion / Islam in North Africa, Mali, the Magreb, the Sahel
« on: February 20, 2007, 07:27:27 AM »
It's the NY Slimes, so read with care-- but there are several interesting things in this piece.
=======================================================

North Africa Feared as Staging Ground for Terror
By CRAIG S. SMITH
NY Times
Published: February 20, 2007
TUNIS — The plan, hatched for months in the arid mountains of North Africa, was to attack the American and British Embassies here. It ended in a series of gun battles in January that killed a dozen militants and left two Tunisian security officers dead.

But the most disturbing aspect of the violence in this normally placid, tourist-friendly nation is that it came from across the border in Algeria, where an Islamic terrorist organization has vowed to unite radical Islamic groups across North Africa.

Counterterrorism officials on three continents say the trouble in Tunisia is the latest evidence that a brutal Algerian group with a long history of violence is acting on its promise: to organize extremists across North Africa and join the remnants of Al Qaeda into a new international force for jihad.

[Last week, the group claimed responsibility for seven nearly simultaneous bombings that destroyed police stations in towns east of Algiers, the Algerian capital, killing six people.]

This article was prepared from interviews with American government and military officials, French counterterrorism officials, Italian counterterrorism prosecutors, Algerian terrorism experts, Tunisian government officials and a Tunisian attorney working with Islamists charged with terrorist activities.

They say North Africa, with its vast, thinly governed stretches of mountain and desert, could become an Afghanistan-like terrorist hinterland within easy striking distance of Europe. That is all the more alarming because of the deep roots that North African communities have in Europe and the ease of travel between the regions. For the United States, the threat is also real because of visa-free travel to American cities for most European passport holders.

The violent Algerian group the Salafist Group for Preaching and Combat, known by its French initials G.S.P.C., has for several years been under American watch.

“The G.S.P.C. has become a regional terrorist organization, recruiting and operating in all of your countries — and beyond,” Henry A. Crumpton, then the United States ambassador at large for counterterrorism, said at a counterterrorism conference in Algiers last year. “It is forging links with terrorist groups in Morocco, Nigeria, Mauritania, Tunisia and elsewhere.”

Officials say the group is funneling North African fighters to Iraq, but is also turning militants back toward their home countries.

The ambitions of the group are particularly troubling to counterterrorism officials on the watch for the re-emergence of networks that were largely interrupted in the wake of the Sept. 11, 2001, attacks. While most estimates put the current membership of the group in the hundreds, it has survived more than a decade of Algerian government attempts to eradicate it. It is now the best-organized and -financed terrorist group in the region.

Last year, on the fifth anniversary of the Sept. 11 attacks, Al Qaeda chose the G.S.P.C. as its representative in North Africa. In January, the group reciprocated by switching its name to Al Qaeda of the Islamic Maghreb, claiming that the Qaeda leader, Osama bin Laden, had ordered the change.

“Al Qaeda’s aim is for the G.S.P.C. to become a regional force, not solely an Algerian one,” said the French counterterrorism magistrate, Jean-Louis Bruguière, in Paris. He calls the Algerian group the biggest terrorist threat facing France today.

“We know from cases that we’re working on that the G.S.P.C.’s mission is now to recruit people in Morocco and Tunisia, train them and send them back to their countries of origin or Europe to mount attacks,” he said.

The G.S.P.C. was created in 1998 as an offshoot of the Armed Islamic Group, which along with other Islamist guerrilla forces fought a brutal decade-long civil war after the Algerian military canceled elections in early 1992 because an Islamist party was poised to win.

In 2003, a G.S.P.C. leader in southern Algeria kidnapped 32 European tourists, some of whom were released for a ransom of 5 million euros (about $6.5 million at current exchange rates), paid by Germany.

Officials say the leader, Amari Saifi, bought weapons and recruited fighters before the United States military helped corner and catch him in 2004. He is now serving a life sentence in Algeria.

Change of Leadership

Since then, an even more radical leader, Abdelmalek Droukdel, has taken over the group. The Algerian military says he cut his teeth in the 1990s as a member of the Armed Islamic Group’s feared Ahoual or “horror” company, blamed for some of the most gruesome massacres of Algeria’s civil war.

He announced his arrival with a truck bomb at the country’s most important electrical production facility in June 2004, and focused on associating the group with Al Qaeda.

Links to the G.S.P.C. soon began appearing in terrorism cases elsewhere in North Africa and in Europe.

In 2005, Moroccan authorities arrested a man named Anour Majrar, and told Italy and France that he and two other militants had visited G.S.P.C. leaders in Algeria earlier that year.

His interrogation led to arrests in Algeria, Italy and France, where Mr. Majrar’s associates were quickly linked to an attempted robbery of 5 million euros at an armored car depot in Beauvais, north of Paris. A hole had been blown in a wall at the depot with military-grade C4 plastic explosives, but it was not big enough for the men to get through.

A later investigation turned up Kalashnikov assault rifles, French Famas military assault rifles, rocket-propelled grenades, TNT and two more pounds of C4. French counterterrorism officials say the group was planning attacks on the Paris Metro, the city’s Orly Airport, and the headquarters of the Direction de la Surveillance du Territoire, France’s domestic intelligence agency.

Italian prosecutors say a related cell in Milan was planning attacks on the city's police headquarters and on the Basilica of San Petronio in Bologna, whose 15th-century fresco depicts the Prophet Muhammad in hell.

The G.S.P.C. or its members in Algeria appear to have become a touchstone for groups suspected of being terror cells across the region, in much the way that Qaeda representatives in London were a decade ago.

Wiretaps, interrogation of terrorism suspects and recovered documents suggest that the network has associates in France, Italy, Turkey and even Greece, which is favored as an entry point to Europe because of its relatively lax immigration controls, counterterrorism officials say.

There had been hints that the North African groups were planning more formal cooperation as far back as 2005, when Moroccan intelligence authorities found messages sent by Islamic militants to Osama bin Laden, according to European counterintelligence officials.

Evidence of an Alliance

Indications that a cross-border alliance was under way came in June 2005, when the G.S.P.C. attacked a military outpost in Mauritania, killing 15 soldiers. The attackers fled into Mali, according to the United States military.

Moroccan police officers raiding suspected Islamic militant cells last summer also found documents discussing a union between the G.S.P.C. and the Islamic Combatant Group in Morocco, the Islamic Fighting Group in Libya and several smaller Tunisian groups, intelligence officials say.

In September, Al Qaeda’s second in command, Ayman al-Zawahri, released a videotape in which he said that his global terrorist network had joined forces with the G.S.P.C.

The video was followed by an unsettling increase in terrorist attacks across the region, including one against Halliburton employees in Algeria in December that left one Algerian dead and nine people wounded.

But the strongest evidence yet of the G.S.P.C.’s North African cross-border cooperation came in January when Tunisia announced that it had killed 12 Islamic extremists and captured 15 of them. Officials said that six of the extremists had crossed into the country from Algeria.

Their 36-year-old leader, Lassad Sassi, was a former Tunisian policeman who ran a terrorist cell in Milan until May 2001 before fleeing to Algeria, according to an Italian prosecutor, Armando Spataro.

Mr. Sassi, now dead, is still listed as a defendant in a current terrorism trial in Milan, which began before he died. He was charged in absentia with providing military clothing and money to the G.S.P.C. while financing and planning suicide bomb attacks in Italy.

Tunisian officials say that Mr. Sassi and five other men — four Tunisians and one Mauritanian — crossed the rugged border from Algeria into Tunisia months ago.

They set up a base in the mountains of Djebel Terif, where Mr. Sassi trained 20 other Tunisian men in the use of automatic weapons and explosives.

A Trail of Violence

The decision to move against the group began when the police in the Tunis suburb of Hammam Lif detained a young woman in December who led them to a house where a gun battle left two suspected terrorists dead, two officers wounded and two other men in custody, a police officer involved said. His account of the events could not be independently verified.

Another arrest led the police into the hills toward the training camp.

Three of the militants and a Tunisian Army captain were killed during a chase through the mountains. Tunisian security forces mounted a search in which 13 more men were arrested and Mr. Sassi was killed.

The remnants of the group fled and members were later tracked down and killed in another gun battle.

Tunisian officials have sought to play down the G.S.P.C. link, and have said the recently dismantled group’s target was the West.

In fact, according to Samir Ben Amor, a Tunisian attorney who defends many young Tunisian Islamists, more than 600 young Tunisian Islamists have been arrested in the past two years — more than 100 in the past two months — trying to make their way to Iraq to fight the United States.

“It’s the same thing that we saw in Bosnia, Kosovo and above all Afghanistan,” said Mr. Bruguière, the French magistrate. “Al Qaeda’s objective is to create an operational link between the groups in Iraq and the G.S.P.C.”

Tunisia is among the most vulnerable of the North African countries, because its rigid repression of Islam has created a well of resentment among religious youth, and its popularity as a tourist destination for Europeans makes it a target.

Tunisian security forces found Google Earth satellite images of the American and British Embassies as well as the names of diplomats who worked in both buildings. But according to the police officer involved in the case and journalists in Tunisia, the targets also included hotels and nightclubs.

An attack on those sites would have dealt a heavy blow to Tunisia’s tourist industry, one of the country’s most important sources of foreign exchange. An April 2002 bombing of a synagogue on the Tunisian tourist island of Djerba, for which the G.S.P.C. claimed responsibility, helped sink the country’s economic growth that year to its slowest rate in a decade.

337
Politics & Religion / The War on Drugs
« on: February 15, 2007, 02:02:30 AM »
All:

IMHO the WOD is a tremendous foolishness that is both counter-productive and counter to basic American values of live and let live. 

We begin this thread with a piece whose title captures a certain something , , ,

TAC,
Marc
================

DEA: More marijuana needed for studies
Judge rules federal supply is inadequate
By Michael Doyle - McClatchy Newspapers
WASHINGTON -- Medical researchers need more marijuana sources because government supplies aren't meeting scientific demand, a federal judge has ruled.

In an emphatic but nonbinding opinion, the Drug Enforcement Administration's own judge is recommending that a University of Massachusetts professor be allowed to grow a legal pot crop.  The real winners could be those suffering from painful and wasting diseases, proponents say.

"The existing supply of marijuana is not adequate," Administrative Law Judge Mary Ellen Bittner ruled.

The federal government's 12-acre marijuana plot at the University of Mississippi provides neither the quantity nor quality scientists need, researchers contend.  While Bittner didn't embrace those criticisms, she agreed that the system for producing and distributing research marijuana is flawed.

"Competition in the manufacture of marijuana for research purposes is inadequate," Bittner determined.  Bittner further concluded that there is "minimal risk of diversion" from a new marijuana source.  Making additional supplies available, she stated, "would be in the public interest."

The DEA isn't required to follow Bittner's 88-page opinion, and the Bush administration's anti-drug stance may make it unlikely that the grass-growing rules will loosen.  Both sides can now file further information before DEA administrators make their ruling.

"We could still be months away from a final decision," DEA spokesman Garrison Courtney said Tuesday, adding that "obviously, we're going to take the judge's opinion into consideration."

Still, the ruling is resonating in labs and with civil libertarians.

"(The) ruling is an important step toward allowing medical marijuana patients to get their medicine from a pharmacy just like everyone else," said Allen Hopper, an attorney with the American Civil Liberties Union.

Based in the California seaside town of Santa Cruz, the ACLU's Drug Law Reform Project has been representing University of Massachusetts scientist Lyle Craker.  Since 2001, Craker has been confronting numerous bureaucratic and legal obstacles in his request for permission to grow research-grade marijuana.  An agronomist who received a doctorate from the University of Minnesota, Craker was asked to grow bulk marijuana by a five-member group called the Multidisciplinary Association for Psychedelic Studies. The psychedelic studies group wants to research such areas as developing vaporizers that can efficiently deliver pot smoke.

"This ruling is a victory for science, medicine and the public good," Craker said.

"I hope that the DEA abides by the decision and grants me the opportunity to do my job unimpeded by drug war politics."

The latest research made public this week indicated that marijuana provided more pain relief for AIDS patients than prescription drugs did. The Bush administration quickly dismissed those findings as a "smokescreen," and it has remained hostile to Craker's research efforts.  During the trial, for instance, DEA attorneys secured an admission from Multidisciplinary Association for Psychedelic Studies head Richard Doblin that he has smoked marijuana regularly since 1971.

"Can you tell us the source of this marijuana?" DEA attorney Brian Bayly asked Doblin, before withdrawing the question under objections.

The DEA originally claimed that it lost Craker's research application. Then the agency said that his photocopied follow-up lacked a necessary original signature. After a year, Craker tried again. He then had to wait another year before the DEA started processing the application, in which he proposed to grow about 25 pounds of marijuana in the first year.

Craker sued after the agency rejected his application. That brought his case before Bittner.

She oversaw the trial, which featured witnesses such as former California legislator John Vasconcellos.

"People have a right to know more about what might help them in their suffering and pain or illness, whatever it might be," Vasconcellos testified, in words repeated by Bittner. "The more research, the better."

The University of Mississippi has monopolized government-grade marijuana since 1968. The university also contracts with North Carolina's Research Triangle Institute, which runs a machine that can roll up to 1,000 finished marijuana cigarettes in an hour.

The government-grown pot is too "harsh" and filled with stems and seeds, researchers testified.

"The material was of such poor quality, we did not deem it to be representative of medical cannabis," researcher Dr. Ethan Russo said.

(e-mail: mdoyle@mcclatchydc.com)

02-13-07

338
Politics & Religion / Stratfor: Russia's Great Power Strategy
« on: February 13, 2007, 11:39:47 PM »
All:

Once again, Stratfor's George Friedman lays down some deep thinking.  Comments?

Marc

PS:  I am a lifetime subscriber to Stratfor.  What you see here is only a fraction of what they produce.

=========================================

Russia's Great-Power Strategy
By George Friedman

Most speeches at diplomatic gatherings aren't worth the time it takes to listen to them. On rare occasion, a speech is delivered that needs to be listened to carefully. Russian President Vladimir Putin gave such a speech over the weekend in Munich, at a meeting on international security. The speech did not break new ground; it repeated things that the Russians have been saying for quite a while. But the venue in which it was given and the confidence with which it was asserted signify a new point in Russian history. The Cold War has not returned, but Russia is now officially asserting itself as a great power, and behaving accordingly.

At Munich, Putin launched a systematic attack on the role the United States is playing in the world. He said: "One state, the United States, has overstepped its national borders in every way ... This is nourishing an arms race with the desire of countries to get nuclear weapons." In other words, the United States has gone beyond its legitimate reach and is therefore responsible for attempts by other countries -- an obvious reference to Iran -- to acquire nuclear weapons.

Russia for some time has been in confrontation with the United States over U.S. actions in the former Soviet Union (FSU). What the Russians perceive as an American attempt to create a pro-U.S. regime in Ukraine triggered the confrontation. But now, the issue goes beyond U.S. actions in the FSU. The Russians are arguing that the unipolar world -- meaning that the United States is the only global power and is surrounded by lesser, regional powers -- is itself unacceptable. In other words, the United States sees itself as the solution when it is, actually, the problem.

In his speech, Putin reached out to European states -- particularly Germany, pointing out that it has close, but blunt, relations with Russia. The Central Europeans showed themselves to be extremely wary about Putin's speech, recognizing it for what it was -- a new level of assertiveness from an historical enemy. Some German leaders appeared more understanding, however: Foreign Minister Frank-Walter Steinmeier made no mention of Putin's speech in his own presentation to the conference, while Ruprecht Polenz, chairman of the Bundestag Foreign Affairs Committee, praised Putin's stance on Iran. He also noted that the U.S. plan to deploy an anti-missile shield in Poland and the Czech Republic was cause for concern -- and not only to Russia.

Putin now clearly wants to escalate the confrontations with the United States and likely wants to build a coalition to limit American power. The gross imbalance of global power in the current system makes such coalition-building inevitable -- and it makes sense that the Russians should be taking the lead. The Europeans are risk-averse, and the Chinese do not have much at risk in their dealings with the United States at the moment. The Russians, however, have everything at risk. The United States is intruding in the FSU, and an ideological success for the Americans in Ukraine would leave the Russians permanently on the defensive.

The Russians need allies but are not likely to find them among other great-power states. Fortunately for Moscow, the U.S. obsession with Iraq creates alternative opportunities. First, the focus on Iraq prevents the Americans from countering Russia elsewhere. Second, it gives the Russians serious leverage against the United States -- for example, by shipping weapons to key players in the region. Finally, there are Middle Eastern states that seek great-power patronage. It is therefore no accident that Putin's next stop, following the Munich conference, was in Saudi Arabia. Having stabilized the situation in the former Soviet region, the Russians now are constructing their follow-on strategy, and that concerns the Middle East.

The Russian Interests

The Middle East is the pressure point to which the United States is most sensitive. Its military commitment in Iraq, the confrontation with Iran, the Israeli-Palestinian conflict and oil in the Arabian Peninsula create a situation such that pain in the region affects the United States intensely. Therefore, it makes sense for the Russians to use all available means of pressure in the Middle East in efforts to control U.S. behavior elsewhere, particularly in the former Soviet Union.

Like the Americans, the Russians also have direct interests in the Middle East. Energy is a primary one: Russia is not only a major exporter of energy supplies, it is currently the world's top oil producer. The Russians have a need to maintain robust energy prices, and working with the Iranians and Saudis in some way to achieve this is directly in line with Moscow's interest. To be more specific, the Russians do not want the Saudis increasing oil production.

There are strategic interests in the Middle East as well. For example, the Russians are still bogged down in Chechnya. It is Moscow's belief that if Chechnya were to secede from the Russian Federation, a precedent would be set that could lead to the dissolution of the Federation. Moscow will not allow this. The Russians consistently have claimed that the Chechen rebellion has been funded by "Wahhabis," by which they mean Saudis. Reaching an accommodation with the Saudis, therefore, would have not only economic, but also strategic, implications for the Russians.

On a broader level, the Russians retain important interests in the Caucasus and in Central Asia. In both cases, their needs intersect with forces originating in the Muslim world and trace, to some extent, back to the Middle East. If the Russian strategy is to reassert a sphere of influence in the former Soviet region, it follows that these regions must be secured. That, in turn, inevitably involves the Russians in the Middle East.

Therefore, even if Russia is not in a position to pursue some of the strategic goals that date back to the Soviet era and before -- such as control of the Bosporus and projection of naval power into the Mediterranean -- it nevertheless has a basic, ongoing interest in the region. Russia has a need both to limit American power and to achieve direct goals of its own. So it makes perfect sense for Putin to leave Munich and embark on a tour of Saudi Arabia and other Persian Gulf countries.

The Complexities

But the Russians also have a problem. The strategic interests of Middle Eastern states diverge, to say the least. The two main Islamic powers between the Levant and the Hindu Kush are Saudi Arabia and Iran. The Russians have things they want from each, but the Saudis and Iranians have dramatically different interests. Saudi Arabia -- an Arab and primarily Sunni kingdom -- is rich but militarily weak. The government's reliance on outside help for national defense generates intense opposition within the kingdom. Desert Storm, which established a basing arrangement for Western troops within Saudi Arabia, was one of the driving forces behind the creation of al Qaeda. Iran -- a predominantly Persian and Shiite power -- is not nearly as rich as Saudi Arabia but militarily much more powerful. Iran seeks to become the dominant power in the Persian Gulf, both out of a need to defend itself against aggression and out of a desire to control and exploit the oil wealth of the region.

Putting the split between Sunni and Shiite aside for the moment, there is tremendous geopolitical asymmetry between Saudi Arabia and Iran. Saudi Arabia wants to limit Iranian power, while keeping its own dependence on foreign powers at a minimum. That means that, though keeping energy prices high might make financial sense for the kingdom, the fact that high energy prices also strengthen the Iranians actually can be a more important consideration, depending on circumstances. There is some evidence that recent declines in oil prices are linked to decisions in Riyadh that are aimed at increasing production, reducing prices and hurting the Iranians.

This creates a problem for Russia. While Moscow has substantial room for maneuver, the fact is that lowered oil prices impact energy prices overall, and therefore hurt the Russians. The Saudis, moreover, need the Iranians blocked -- but without going so far as to permit foreign troops to be based in Saudi Arabia itself. In other words, they want to see the United States remain in Iraq, since the Americans serve as the perfect shield against the Iranians so long as they remain there. Putin's criticisms of the United States, as delivered in Munich, would have been applauded by Saudi Arabia prior to the 2003 invasion of Iraq. But in 2007, the results of that invasion are exactly what the Saudis feared -- a collapsed Iraq and a relatively powerful Iran. The Saudis now need the Americans to stay put in the region.

The interests of Russia and Iran align more closely, but there are points of divergence there as well. Both benefit from having the United States tied up, militarily and politically, in wars, but Tehran would be delighted to see a U.S. withdrawal from Iraq that leaves a power vacuum for Iran to fill. The Russians would rather not see this outcome. First, they are quite happy to have the United States bogged down in Iraq and would prefer that to having the U.S. military freed for operations elsewhere. Second, they are interested in a relationship with Iran but are not eager to drive the United States and Saudi Arabia into closer relations. Third, the Russians do not want to see Iran become the dominant power in the region. They want to use Iran, but within certain manageable limits.

Russia has been supplying Iran with weapons. Of particular significance is the supply of surface-to-air missiles that would raise the cost of U.S. air operations against Iran. It is not clear whether the advanced S300PMU surface-to-air missile has yet been delivered, although there has been some discussion of this lately. If it were delivered, this would present significant challenges for U.S. air operations over Iran. The Russians would find this particularly advantageous, as the Iranians would absorb U.S. attentions and, as in Vietnam, the Russians would benefit from extended, fruitless commitments of U.S. military forces in regions not vital to Russia.

Meanwhile, there are energy matters: The Russians, as we have said, are interested in working with Iran to manage world oil prices. But at the same time, they would not be averse to a U.S. attack that takes Iran's oil off the market, spikes prices and enriches Russia.

Finally, it must be remembered that behind this complex relationship with Iran, there historically has been animosity and rivalry between the two countries. The Caucasus has been their battleground. For the moment, with the collapse of the Soviet Union, there is a buffer there, but it is a buffer in which Russians and Iranians are already dueling. So long as both states are relatively weak, the buffer will maintain itself. But as they get stronger, the Caucasus will become a battleground again. When Russian and Iranian territories border each other, the two powers are rarely at peace. Indeed, Iran frequently needs outside help to contain the Russians.

A Complicated Strategy

In sum, the Russian position in the Middle East is at least as complex as the American one. Or perhaps even more so, since the Americans can leave and the Russians always will live on the doorstep of the Middle East. Historically, once the Russians start fishing in Middle Eastern waters, they find themselves in a greater trap than the Americans. The opening moves are easy. The duel between Saudi Arabia and Iran seems manageable. But as time goes on, as Putin's Soviet predecessors learned, the Middle East is a graveyard of ambitions -- and not just American ambitions.

Russia wants to contain U.S. power, and manipulating the situation in the Middle East certainly will cause the Americans substantial pain. But whatever short-term advantages the Russians may be able to find and exploit in the region, there is an order of complexity in Putin's maneuver that might transcend any advantage they gain from boxing the Americans in.

In returning to "great power" status, Russia is using an obvious opening gambit. But being obvious does not make it optimal.

339
Science, Culture, & Humanities / Life and Death
« on: February 08, 2007, 11:43:45 AM »
The WEEKLY STANDARD
Kid Turns 70… And nobody cares
by Joseph Epstein

Seventy. Odd thing to happen to a five-year-old boy who, only the other day, sang "Any Bonds Today," whose mother's friends said he would be a heartbreaker for sure (he wasn't), who was popular but otherwise undistinguished in high school, who went on to the University of Chicago but long ago forgot the dates of the rule of the Thirty Tyrants in Athens and the eight reasons for the Renaissance, who has married twice and written several books, who somewhere along the way became the grandfather of three, life is but a dream, sha-boom sha-boom, 70, me, go on, whaddya, kiddin' me?

A funny age to turn, 70, and despite misgivings I have gone ahead and done it, yet with more complex thoughts than any previous birthday has brought. Birthdays have never been particularly grand events for me; my own neither please nor alarm me. I note them chiefly with gratitude for having got through another year. I have never been in any way part of the cult of youth, delighted to be taken for younger than I am, or proud that I can do lots of physical things that men my age are no longer supposed to be able to do: 26 chin-ups with gila monsters biting both my ankles. I have always thought I looked--and, as mothers used to instruct, always tried to act--my age. But now, with 70 having arrived, I notice that for the first time I am beginning to fudge, to hedge, to fib slightly, about my age. In conversation, in public appearances, I described myself as "in my late 60s," hoping, I suppose, to be taken for 67. To admit to 70 is to put oneself into a different category: to seem uncomfortably close to, not to put too fine a point on it, Old Age.

At 70 middle age is definitely--and definitively--done. A wonderful period, middle age, so nondescript and imprecise, extending perhaps from one's late 30s to one's late 60s, it allows a person to think him- or herself simultaneously still youthful, though no longer a kid. Forty-eight, 57, 61, those middle-aged numbers suggest miles to go before one sleeps, miles filled with potential accomplishments, happy turnabouts in one's destiny, midlife crises (if one's tastes run to such extravaganzas), surprises of all kinds.

Many ski lifts at Vail and Aspen, I have been told, no longer allow senior-citizen discounts at 60, now that so many people continue skiing well into their 60s. With increased longevity, it's now thought a touch disappointing if a person dies before 85. Sixty, the style sections of the newspapers inform us, is the new 40. Perhaps. But 70--70, to ring a change on the punchline of the joke about the difference between a virgin and a German Jew--70 remains 70. One can look young for 70, one can be fit for 70, but in the end there one is, 70.

W.H. Auden, who pegged out at 66, said that while praying we ought quickly to get over the begging part and get on to the gratitude part. "Let all your thinks," he wrote, "be thanks." One can either look upon life as a gift or as a burden, and I myself happen to be a gift man. I didn't ask to be born, true enough; but really, how disappointing not to have been. I had the additional good luck of arriving in 1937, in what was soon to become the most interesting country in the world and to have lived through a time of largely unrelieved prosperity in which my particular generation danced between the raindrops of wars: a child during World War II, too young for Korea, too old for Vietnam, but old enough for the draft, which sent me for 22 months (useful as they now in retrospect seem) off to Missouri, Texas, and Arkansas. My thinks really are chiefly thanks.

As for my decay, what the French call my décomposition générale, it proceeds roughly on schedule, yet for the moment at a less than alarming rate. I have had a heart bypass operation. Five or so years ago, I was found to have auto-immune hepatitis, which caused me no pain, and which side-effectless drugs have long since put in remission. I am paunchless, have a respectable if not abundant amount of hair atop my head (most of it now gray, some of it turning white), retain most of my teeth (with the aid of expensive dentistry). I have so far steered clear of heart attack, dodged the altogether too various menacing cancers whirling about, and missed the wretched roll of the dice known as aneurysms. (Pause while I touch wood.) My memory for unimportant things has begun to fade, with results that thus far have been no more than mildly inconvenient. (I set aside 10 minutes or so a day to find my glasses and fountain pen.)

I have not yet acquired one of those funny walks--variants of the prostate shuffle, as I think of them--common to men in their late 60s and 70s. I am, though, due for cataract surgery. I'm beginning to find it difficult to hear women with high-pitched voices, especially in restaurants and other noisy places. And I take a sufficient number of pills--anti-this and supplement-that--to have made it necessary to acquire one of those plastic by-the-day-of-the-week pill sorters.

Suddenly, I find myself worrying in a way I never used to do about things out of the routine in my life: having to traverse major freeways and tollways to get to a speaking or social engagement. I take fewer chances, both as a driver and once intrepid jaywalker. I find myself sometimes stumbling over small bumps in the sidewalk, and in recent years have taken a couple of falls, where once I would do an entrechat and a simple pirouette--a Nureyev of the pavement--and move along smartly. I walk more slowly up and down stairs, gripping the railing going downstairs. I have, in sum, become more cautious, begun to feel, physically, more fragile, a bit vulnerable.

Sleep has become erratic. Someone not long ago asked me if I watched Charlie Rose, to which I replied that I am usually getting up for the first time when Charlie Rose goes on the air. I fall off to sleep readily enough, but two or three hours later I usually wake, often to invent impressively labyrinthine anxieties for myself to dwell upon for an hour or two before falling back into aesthetically unsatisfying dreams until six or so in the morning. Very little distinction in this, I have discovered by talking to contemporaries, especially men, who all seem to sleep poorly. But this little Iliad of woes is pretty much par for the course, if such a cliché metaphor may be permitted from a nongolfer. That I have arrived at 70 without ever having golfed is one of the facts of my biography to date of which I am most proud.

"Bodily decrepitude," says Yeats, "is wisdom." I seem to have accrued more of the former than the latter. Of wisdom generally, I haven't all that much to declare. I find myself more impressed by the mysteries of life and more certain that most of the interesting questions it poses have no persuasive answers, or at least none likely to arrive before I depart the planet. I haven't even settled the question of whether I believe in God. I try to act as if God exists--that is, the prospects of guilt and shame and the moral endorphins that good conduct brings still motivate me to act as decently as I'm able. I suffer, then, some of the fear of religion without any of the enjoyment of the hope it brings. I don't, meanwhile, have a clue about why there is suffering in the world, whether there is an afterlife, or how to explain acts of truly grand altruism or unprofitable evil. You live and you learn, the proverb has it; but in my case, You live and you yearn seems closer to it.

But then, I must report that at 70 even my yearnings are well down. I have no interest in acquiring power of any kind and fame beyond such as I now pathetically possess holds little interest for me. My writing has won no big prizes, nor do I expect it ever to do so. ("Tell them," the normally gentle and genteel 90-year-old William Maxwell said to Alec Wilkinson and another friend on the day before his death, "their f--ing honors mean nothing to me.") I am ready to settle for being known as a good writer by thoughtful people.

I would like to have enough money so that I don't have to worry, or even think, about money, but it begins to look as if I shan't achieve this, either. Rousseau spoke of feeling himself "delivered from the anxiety of hope, certain of gradually losing the anxiety of desire . . . " I've not yet lost all my desire, and suspect that to do so probably is a sign of resigning from life. Although I'm not keen on the idea of oblivion, which seems the most likely of the prospects that await, I like to think that I have become a bit less fearful of death. One of the most efficient ways to decrease this fear, I've found, is to welcome death, at least a little, and this growing older can cause one to do--or at least it has me, sometimes.

Seventy poses the problem of how to live out one's days. To reach 70 and not recognize that one is no longer living (as if one ever were) on an unlimited temporal budget is beyond allowable stupidity. The first unanswerable question at 70 is how many days, roughly, are left in what one does best to think of as one's reprieve. Unless one is under the sentence of a terminal cancer or another wasting disease, no one can know, of course; but I like the notion of the French philosopher Alain that, no matter what age one is, one should look forward to living for another decade, but no more. My mother lived to 82 and my father to 91, so I'm playing, I suppose, with decent genetic cards. Yet I do not count on them. A year or so ago, my dentist told me that I would have to spend a few thousand dollars to replace some dental work, and I told him that I would get back to him on this once I had the results of a forthcoming physical. If I had been found to have cancer, I thought, at least I could let the dentistry, even the flossing, go. Turning 70 one has such thoughts.

At 70 one encounters the standard physical diminutions. I am less than certain how old I actually look, but in a checkout line, I can now say to a young woman, "You have beautiful eyes," without her thinking I'm hitting on her. If my dashing youthful looks are gone, my intellectual and cultural stamina are also beginning to deplete. I have lost most of my interest in travel, and feel, as did Philip Larkin, that I should very much like to visit China, but only on the condition that I could return home that night.

Another diminution I begin to notice is in the realm of tact. I have less of it. I feel readier than ever before to express my perturbation, impatience, boredom. Why, with less time remaining, hold back? "I wonder," I find myself wanting to say to a fairly large number of people, "if you haven't greatly overestimated your charm?" Perhaps, though, I do better to hold off on this until I reach 80, as I hope to be able to do; it will give me something to live for.

A younger friend in California writes to me that, in a restaurant in Bel Air, Robin Williams, Emma Thompson, and Pete Townsend (of The Who, he is courteous enough to explain) walked by his table. I write back to tell him that I would have been much more impressed if Fred Astaire, Ingrid Bergman, and Igor Stravinsky had done so. My longing to meet Robin Williams, Emma Thompson, and Pete Townsend is roughly the same, I should guess, as their longing to meet me.

I don't much mind being mildly out of it, just as I don't finally mind growing older. George Santayana, perhaps the most detached man the world has known outside of certain Trappist monasteries, claimed to prefer old age to all others. "I heartily agree that old age is, or may be as in my case, far happier than youth," he wrote to his contemporary William Lyon Phelps. "I was never more entertained or less troubled than I am now." Something to this, if one isn't filled with regret for the years that have gone before, and I am not, having had a very lucky run thus far in my life. At 70 it is natural to begin to view the world from the sidelines, a glass of wine in hand, watching younger people do the dances of ambition, competition, lust, and the rest of it.

Schopenhauer holds that the chief element in old age is disillusionment. According to this dourest of all philosophers, at 70 we have, if we are at all sentient, realized "that there is very little behind most of the things desired and most of the pleasures hoped for; and we have gradually gained an insight into the great poverty and hollowness of our whole existence. Only when we are seventy do we thoroughly understand the first verse of Ecclesiastes." And yet, even for those of us who like to think ourselves close to illusionless, happiness keeps breaking through, fresh illusions arrive to replace defunct ones, and the game goes on.

If the game is to be decently played, at 70 one must harken back as little as possible to the (inevitably golden) days of one's youth, no matter how truly golden they may seem. The temptation to do so, and with some regularity, sets in sometime in one's 60s. As a first symptom, one discovers the word "nowadays" turning up in lots of one's sentences, always with the assumption that nowadays are vastly inferior to thenadays, when one was young and the world green and beautiful. Ah, thenadays--so close to "them were the days"--when there was no crime, divorce was unheard of, people knew how to spell, everyone had good handwriting, propriety and decorum ruled, and so on and on into the long boring night of nostalgia.

Start talking about thenadays and one soon finds one's intellectual motor has shifted into full crank, with everything about nowadays dreary, third-rate, and decline-and-fallish. A big mistake. The reason old people think that the world is going to hell, Santayana says, is that they believe that, without them in it (which will soon enough be the case), it cannot really be much good.

Seventy brings prominently to the fore the question of Big D, and I don't mean Dallas. From 70 on, one's death can no longer be viewed as a surprise; a disappointment, yes, but not a surprise. Three score and ten, after all, is the number of years of life set out in the Bible; anything beyond that is, or ought to be, considered gravy, which is likely to be high in cholesterol, so be careful. Henry James, on his deathbed, in a delirium, said of death, "So here it is at last, the distinguished thing." Wonder why? Few things are less distinguished than death, that most democratic of events and oldest of jokes that comes to each of us afresh.

At 70 one more clearly than ever before hears footsteps, as they say wide-receivers in the NFL do who are about to be smashed by oncoming pass-defenders while awaiting the arrival of a pass thrown to them in the middle of the field. The footsteps first show up in the obituary pages, which I consult with greater interest than any other section of the newspaper. Not too many days pass when someone I know, or someone whom someone else I know knows, does not show up there. Late last year the anthropologist Clifford Geertz and the novelist William Styron conked out; neither was a close friend, though as fellow members of an editorial board I spent a fair amount of time with them. Then the tennis player Ham Richardson appeared on the obit page. I was a ballboy for an exhibition he and Billy Talbert put on with two members of the Mexican Davis Cup team at the Saddle & Cycle Club in the 1950s in Chicago. I was surprised to learn that Richardson was only three years older than I. I am fairly frequently surprised to discover that the newly deceased are only a little older than I.

Along with footsteps, I also hear clocks. Unlike baseball, life is a game played with a clock. At 70, a relentlessly insistent ticking is going off in the background. I have decided to read, and often reread, books I've missed or those I've loved and want to reread one more time. I recently reread War and Peace, my second reading of this greatest of all novels, and I ended it in sadness, not only because I didn't wish to part from Pierre, Natasha, Nicolai, and the others left alive at the novel's end, but because I know it is unlikely I shall return for another rereading.


340
All:

This thread is for discussion and articles treating the question of "Can't we all just get along?"  

I open with one from the investment newsletter of Richard Russell.

TAC,
CD
=====================

To my surprise, I received a slew of e-mails over the weekend all centered on whether quarterback Rex Grossman is Jewish or not. Along these lines, I have one interesting story. It concerns the great Jewish boxer, Bennie Leonard, considered by many the best lightweight boxer of all time. Bennie had lightning hands -- he scored 69 KOs in his 157 fights, which is amazing for a lightweight. In his career during the 20s he was defeated only 11 times. Ring Magazine lists Bennie at number 8 in its list of the 80 best fighters of the last 80 years.

Back in the 40s there were a lot of Irish bars on 8th Avenue in New York. One chain was called the Blarney Stone. The Blarney was famous for having all sorts of free food at the bar, and many times I would drop in to the Blarney Stone for a ten-cent beer and a handful of meatballs. The Blarney was a tough place, and bar fights were commonplace.

At any rate, there's this famous story about Bennie Leonard. One day Bennie stopped in at an Irish bar on 8th Avenue. Bennie was drinking a beer when a fierce-looking Irishman stalked out to the middle of the bar, raised a fist and shouted, "Is there a Jew in the house?" There was a dead silence, and then Bennie walked up to the big guy and said, "Yeah, I'm a Jew." Whereupon the big guy extended his hand and said, "I've always wanted to meet you, Mr. Leonard. This is a real pleasure. May I buy you a beer?" And that concludes my racial/religious stories, at least for a while.

341
Politics & Religion / Know Thy Enemy
« on: February 05, 2007, 06:00:05 PM »
All:

Because I would like for this piece to get specific attention, I give it its very own thread.    I am hoping more for our own personal comments, than posting of additional articles.

Marc
==========================


KNOWING THE ENEMY
by GEORGE PACKER
Can social scientists redefine the “war on terror”?
Issue of 2006-12-18
Posted 2006-12-11

 

In 1993, a young captain in the Australian Army named David Kilcullen was living among villagers in West Java, as part of an immersion program in the Indonesian language. One day, he visited a local military museum that contained a display about Indonesia’s war, during the nineteen-fifties and sixties, against a separatist Muslim insurgency movement called Darul Islam. “I had never heard of this conflict,” Kilcullen told me recently. “It’s hardly known in the West. The Indonesian government won, hands down. And I was fascinated by how it managed to pull off such a successful counterinsurgency campaign.”

Kilcullen, the son of two left-leaning academics, had studied counterinsurgency as a cadet at Duntroon, the Australian West Point, and he decided to pursue a doctorate in political anthropology at the University of New South Wales. He chose as his dissertation subject the Darul Islam conflict, conducting research over tea with former guerrillas while continuing to serve in the Australian Army. The rebel movement, he said, was bigger than the Malayan Emergency—the twelve-year Communist revolt against British rule, which was finally put down in 1960, and which has become a major point of reference in the military doctrine of counterinsurgency. During the years that Kilcullen worked on his dissertation, two events in Indonesia deeply affected his thinking. The first was the rise—in the same region that had given birth to Darul Islam, and among some of the same families—of a more extreme Islamist movement called Jemaah Islamiya, which became a Southeast Asian affiliate of Al Qaeda. The second was East Timor’s successful struggle for independence from Indonesia. Kilcullen witnessed the former as he was carrying out his field work; he participated in the latter as an infantry-company commander in a United Nations intervention force. The experiences shaped the conclusions about counter-insurgency in his dissertation, which he finished in 2001, just as a new war was about to begin.

“I saw extremely similar behavior and extremely similar problems in an Islamic insurgency in West Java and a Christian-separatist insurgency in East Timor,” he said. “After 9/11, when a lot of people were saying, ‘The problem is Islam,’ I was thinking, It’s something deeper than that. It’s about human social networks and the way that they operate.” In West Java, elements of the failed Darul Islam insurgency—a local separatist movement with mystical leanings—had resumed fighting as Jemaah Islamiya, whose outlook was Salafist and global. Kilcullen said, “What that told me about Jemaah Islamiya is that it’s not about theology.” He went on, “There are elements in human psychological and social makeup that drive what’s happening. The Islamic bit is secondary. This is human behavior in an Islamic setting. This is not ‘Islamic behavior.’ ” Paraphrasing the American political scientist Roger D. Petersen, he said, “People don’t get pushed into rebellion by their ideology. They get pulled in by their social networks.” He noted that all fifteen Saudi hijackers in the September 11th plot had trouble with their fathers. Although radical ideas prepare the way for disaffected young men to become violent jihadists, the reasons they convert, Kilcullen said, are more mundane and familiar: family, friends, associates.

Indonesia’s failure to replicate in East Timor its victory in West Java later influenced Kilcullen’s views about what the Bush Administration calls the “global war on terror.” In both instances, the Indonesian military used the same harsh techniques, including forced population movements, coercion of locals into security forces, stringent curfews, and even lethal pressure on civilians to take the government side. The reason that the effort in East Timor failed, Kilcullen concluded, was globalization. In the late nineties, a Timorese international propaganda campaign and ubiquitous media coverage prompted international intervention, thus ending the use of tactics that, in the obscure jungles of West Java in the fifties, outsiders had known nothing about. “The globalized information environment makes counterinsurgency even more difficult now,” Kilcullen said.

Just before the 2004 American elections, Kilcullen was doing intelligence work for the Australian government, sifting through Osama bin Laden’s public statements, including transcripts of a video that offered a list of grievances against America: Palestine, Saudi Arabia, Afghanistan, global warming. The last item brought Kilcullen up short. “I thought, Hang on! What kind of jihadist are you?” he recalled. The odd inclusion of environmentalist rhetoric, he said, made clear that “this wasn’t a list of genuine grievances. This was an Al Qaeda information strategy.” Ron Suskind, in his book “The One Percent Doctrine,” claims that analysts at the C.I.A. watched a similar video, released in 2004, and concluded that “bin Laden’s message was clearly designed to assist the President’s reëlection.” Bin Laden shrewdly created an implicit association between Al Qaeda and the Democratic Party, for he had come to feel that Bush’s strategy in the war on terror was sustaining his own global importance. Indeed, in the years after September 11th Al Qaeda’s core leadership had become a propaganda hub. “If bin Laden didn’t have access to global media, satellite communications, and the Internet, he’d just be a cranky guy in a cave,” Kilcullen said.

In 2004, Kilcullen’s writings and lectures brought him to the attention of an official working for Paul Wolfowitz, then the Deputy Secretary of Defense. Wolfowitz asked him to help write the section on “irregular warfare” in the Pentagon’s “Quadrennial Defense Review,” a statement of department policy and priorities, which was published earlier this year. Under the leadership of Donald Rumsfeld, who resigned in November, the Pentagon had embraced a narrow “shock-and-awe” approach to war-fighting, emphasizing technology, long-range firepower, and spectacular displays of force. The new document declared that activities such as “long-duration unconventional warfare, counterterrorism, counterinsurgency, and military support for stabilization and reconstruction efforts” needed to become a more important component of the war on terror. Kilcullen was partly responsible for the inclusion of the phrase “the long war,” which has become the preferred term among many military officers to describe the current conflict. In the end, the Rumsfeld Pentagon was unwilling to make the cuts in expensive weapons systems that would have allowed it to create new combat units and other resources necessary for a proper counterinsurgency strategy.

In July, 2005, Kilcullen, as a result of his work on the Pentagon document, received an invitation to attend a conference on defense policy, in Vermont. There he met Henry Crumpton, a highly regarded official who had supervised the C.I.A.’s covert activities in Afghanistan during the 2001 military campaign that overthrew the Taliban. The two men spent much of the conference talking privately, and learned, among other things, that they saw the war on terror in the same way. Soon afterward, Condoleezza Rice, the Secretary of State, hired Crumpton as the department’s coördinator for counterterrorism, and Crumpton, in turn, offered Kilcullen a job. For the past year, Kilcullen has occupied an office on the State Department’s second floor, as Crumpton’s chief strategist. In some senses, Kilcullen has arrived too late: this year, the insurgency in Iraq has been transformed into a calamitous civil war between Sunnis and Shiites, and his ideas about counterinsurgency are unlikely to reverse the country’s disintegration. Yet radical Islamist movements now extend across the globe, from Somalia to Afghanistan and Indonesia, and Kilcullen—an Australian anthropologist and lieutenant colonel, who is “on loan” to the U.S. government—offers a new way to understand and fight a war that seems to grow less intelligible the longer it goes on.



Kilcullen is thirty-nine years old, and has a wide pink face, a fondness for desert boots, and an Australian’s good-natured bluntness. He has a talent for making everything sound like common sense by turning disturbing explanations into brisk, cheerful questions: “America is very, very good at big, short conventional wars? It’s not very good at small, long wars? But it’s even worse at big, long wars? And that’s what we’ve got.” Kilcullen’s heroes are soldier-intellectuals, both real (T. E. Lawrence) and fictional (Robert Jordan, the flinty, self-reliant schoolteacher turned guerrilla who is the protagonist of Hemingway’s “For Whom the Bell Tolls”). On his bookshelves, alongside monographs by social scientists such as Max Gluckman and E. E. Evans-Pritchard, is a knife that he took from a militiaman he had just ambushed in East Timor. “If I were a Muslim, I’d probably be a jihadist,” Kilcullen said as we sat in his office. “The thing that drives these guys—a sense of adventure, wanting to be part of the moment, wanting to be in the big movement of history that’s happening now—that’s the same thing that drives me, you know?”

More than three years into the Iraq war and five into the conflict in Afghanistan, many members of the American military—especially those with combat experience—have begun to accept the need to learn the kind of counterinsurgency tactics that their services tried to leave behind in Vietnam. On December 15th, the Army and the Marine Corps will release an ambitious new counterinsurgency field manual—the first in more than two decades—that will shape military doctrine for many years. The introduction to the field manual says, “Effective insurgents rapidly adapt to changing circumstances. They cleverly use the tools of the global information revolution to magnify the effects of their actions. . . . However, by focusing on efforts to secure the safety and support of the local populace, and through a concerted effort to truly function as learning organizations, the Army and Marine Corps can defeat their insurgent enemies.”

One night earlier this year, Kilcullen sat down with a bottle of single-malt Scotch and wrote out a series of tips for company commanders about to be deployed to Iraq and Afghanistan. He is an energetic writer who avoids military and social-science jargon, and he addressed himself intimately to young captains who have had to become familiar with exotica such as “The Battle of Algiers,” the 1966 film documenting the insurgency against French colonists. “What does all the theory mean, at the company level?” he asked. “How do the principles translate into action—at night, with the G.P.S. down, the media criticizing you, the locals complaining in a language you don’t understand, and an unseen enemy killing your people by ones and twos? How does counterinsurgency actually happen? There are no universal answers, and insurgents are among the most adaptive opponents you will ever face. Countering them will demand every ounce of your intellect.” The first tip is “Know Your Turf”: “Know the people, the topography, economy, history, religion and culture. Know every village, road, field, population group, tribal leader, and ancient grievance. Your task is to become the world expert on your district.” “Twenty-eight Articles: Fundamentals of Company-Level Counterinsurgency”—the title riffs on a T. E. Lawrence insurgency manual from the First World War—was disseminated via e-mail to junior officers in the field, and was avidly read.

Last year, in an influential article in the Journal of Strategic Studies, Kilcullen redefined the war on terror as a “global counterinsurgency.” The change in terminology has large implications. A terrorist is “a kook in a room,” Kilcullen told me, and beyond persuasion; an insurgent has a mass base whose support can be won or lost through politics. The notion of a “war on terror” has led the U.S. government to focus overwhelmingly on military responses. In a counterinsurgency, according to the classical doctrine, which was first laid out by the British general Sir Gerald Templar during the Malayan Emergency, armed force is only a quarter of the effort; political, economic, and informational operations are also required. A war on terror suggests an undifferentiated enemy. Kilcullen speaks of the need to “disaggregate” insurgencies: finding ways to address local grievances in Pakistan’s tribal areas or along the Thai-Malay border so that they aren’t mapped onto the ambitions of the global jihad. Kilcullen writes, “Just as the Containment strategy was central to the Cold War, likewise a Disaggregation strategy would provide a unifying strategic conception for the war—something that has been lacking to date.” As an example of disaggregation, Kilcullen cited the Indonesian province of Aceh, where, after the 2004 tsunami, a radical Islamist organization tried to set up an office and convert a local separatist movement to its ideological agenda. Resentment toward the outsiders, combined with the swift humanitarian action of American and Australian warships, helped to prevent the Acehnese rebellion from becoming part of the global jihad. As for America, this success had more to do with luck than with strategy. Crumpton, Kilcullen’s boss, told me that American foreign policy traditionally operates on two levels, the global and the national; today, however, the battlefields are also regional and local, where the U.S. 
government has less knowledge and where it is not institutionally organized to act. In half a dozen critical regions, Crumpton has organized meetings among American diplomats, intelligence officials, and combat commanders, so that information about cross-border terrorist threats is shared. “It’s really important that we define the enemy in narrow terms,” Crumpton said. “The thing we should not do is let our fears grow and then inflate the threat. The threat is big enough without us having to exaggerate it.”

By speaking of Saddam Hussein, the Sunni insurgency in Iraq, the Taliban, the Iranian government, Hezbollah, and Al Qaeda in terms of one big war, Administration officials and ideologues have made Osama bin Laden’s job much easier. “You don’t play to the enemy’s global information strategy of making it all one fight,” Kilcullen said. He pointedly avoided describing this as the Administration’s approach. “You say, ‘Actually, there are sixty different groups in sixty different countries who all have different objectives. Let’s not talk about bin Laden’s objectives—let’s talk about your objectives. How do we solve that problem?’ ” In other words, the global ambitions of the enemy don’t automatically demand a monolithic response.



The more Kilcullen travels to the various theatres of war, the less he thinks that the lessons of Malaya and Vietnam are useful guides in the current conflict. “Classical counterinsurgency is designed to defeat insurgency in one country,” he writes in his Strategic Studies article. “We need a new paradigm, capable of addressing globalised insurgency.” After a recent trip to Afghanistan, where Taliban forces have begun to mount large operations in the Pashto-speaking south of the country, he told me, “This ain’t your granddaddy’s counterinsurgency.” Many American units there, he said, are executing the new field manual’s tactics brilliantly. For example, before conducting operations in a given area, soldiers sit down over bread and tea with tribal leaders and find out what they need—Korans, cold-weather gear, a hydroelectric dynamo. In exchange for promises of local support, the Americans gather the supplies and then, within hours of the end of fighting, produce them, to show what can be gained from coöperating.

But the Taliban seem to be waging a different war, driven entirely by information operations. “They’re essentially armed propaganda organizations,” Kilcullen said. “They switch between guerrilla activity and terrorist activity as they need to, in order to maintain the political momentum, and it’s all about an information operation that generates the perception of an unstoppable, growing insurgency.” After travelling through southern Afghanistan, Kilcullen e-mailed me:

342
Science, Culture, & Humanities / Nature
« on: February 02, 2007, 05:13:40 PM »
Britain's Top Woman Paraglider Attacked by Eagles Mid-Flight
Friday, February 02, 2007

By Bernard Lagan


Britain’s top woman paraglider told today how she cheated death after two huge Australian eagles attacked her 8,200 feet above the Outback.

Nicky Moss, 38, said she thought “Why me?” when the eagles came screeching out of the sky and began shredding the wing of her paraglider over New South Wales this week. She spun out of control and into a terrifying 1,600-foot freefall when one of the eagles became entangled in the lines suspending her beneath the glider’s wing, causing the wing to collapse and sending them diving toward earth before the bird managed to free itself.

The wedge-tailed eagle is Australia’s largest bird of prey and among the world’s biggest eagles. The birds swoop on sheep and have wingspans of more than 7 1/2 feet.

Ms Moss, who was training with the British team for the World Paragliding Championships, to be held in Australia, said the first she knew of the eagles was when she heard a screeching sound from behind. Until then she had been soaring on air currents above the border area of New South Wales and Queensland.

“I looked around and couldn’t see anything, and then the next moment the top surface of my wing deformed as an eagle flew straight into the top of me. It quite possibly ripped the canopy and then wheeled around and continued to have other goes for quite a long period of time,” Ms Moss said.

“Another eagle actually came in and joined it. It was a pair of them. I was getting bombarded by wedge-tailed eagles. They tore my glider. There were quite big rips in it from their talons. At one point one of them dived at me from behind and actually hit me on the back of the head and flew through the lines of my glider and got all tangled up.  It collapsed the glider completely. So we were plummeting for 500 meters (1600 feet), probably something like that, before the eagle got himself out,” she said.

Ms Moss said that she considered deploying her emergency parachute but decided that the eagles might also attack and damage that, leaving her with no back-up.  She regained control of her paraglider after the eagle escaped from her control lines. But the birds continued to attack her.

“I screamed at the eagles quite a bit. I just had to carry on flying. I got out of the skies as quickly as I could by doing some maneuvers, and about 300 feet from the ground the eagles left me alone. I landed quite easily and safely in a paddock.”

Her dramatic brush with the eagles was seen from the ground by other competitors.

343
Politics & Religion / Support our troops
« on: February 02, 2007, 01:05:25 PM »
Woof All:

We highly recommend the courageous front-line work of Michael Yon in Iraq. If you search for Truth, he should be a regular source for you. We give money to support his work, and hope that you will too. To learn who he is, we recommend a thorough surf through his website.

This thread is to connect you with what he is doing.

Michael, green light to post whatever you want.

The Adventure continues,
Marc "Crafty Dog" Denny
============================================================

Greetings from Iraq:

 

When I arrived in Mosul a few weeks ago, they were getting about one car bomb per week. Now it’s up to about one per day. The fighting is intensifying here, and that’s the bad news. There is some good news, however: Iraqi Security Forces, though taking losses, are thoroughly punishing the enemy here. Just a few days ago, the enemy launched a large and well-planned attack on a police station. In late 2004 or 2005, such an attack would have been devastating – and was. But this time, when the enemy demanded that the Iraqi police surrender, the police responded with gunfire. Lots of it. After several hours of fighting, the enemy fled, leaving blood trails behind.

I'll be writing about that incident in an upcoming dispatch. Meanwhile, a new dispatch is available at: “The Hands of God.”
http://www.michaelyon-online.com/wp/the-hands-of-god.htm

In addition to an audio file of a conversation between American soldiers and Iraqi villagers after a homicide-bomber attack, the dispatch includes links to several previous pieces. For better context on the new reports out of Mosul, readers might want to read Gates of Fire and Battle for Mosul IV.

 

Thank you for supporting this site. Many have asked for a mailing address to use as an alternative to the paypal system. It is included below my signature.

 

Respectfully,
Michael   

Michael Yon
P O Box 416
Westport Pt MA 02791

344
Science, Culture, & Humanities / Internet and related technology
« on: February 02, 2007, 06:58:29 AM »
The Coming Exaflood
By BRET SWANSON
January 20, 2007; Page A11

Today there is much praise for YouTube, MySpace, blogs and all the other democratic digital technologies that are allowing you and me to transform media and commerce. But these infant Internet applications are at risk, thanks to the regulatory implications of "network neutrality." Proponents of this concept -- including Democratic Reps. John Dingell and John Conyers, and Sen. Daniel Inouye, who have ascended to key committee chairs -- are obsessed with divvying up the existing network, but oblivious to the need to build more capacity.

To understand, let's take a step back. In 1999, Yahoo acquired Broadcast.com for $5 billion. Broadcast.com had little revenue, and although its intent was to stream sports and entertainment video to consumers over the Internet, two-thirds of its sales at the time came from hosting corporate video conferences. Yahoo absorbed the start-up -- and little more was heard of Broadcast.com or Yahoo's video ambitions.

Seven years later, Google acquired YouTube for $1.65 billion. Like Broadcast.com, YouTube so far has not enjoyed large revenues. But it is streaming massive amounts of video to all corners of the globe. The difference: Broadcast.com failed because there were almost no broadband connections to homes and businesses. Today, we have hundreds of millions of links world-wide capable of transmitting passable video clips.

Why did that come about? At the Telecosm conference last October, Stanford professor Larry Lessig asserted that the previous federal Internet policy of open access neutrality was the chief enabler of success on the net. "Because of that neutrality," Mr. Lessig insisted, "the explosion of innovation at the applications and content layer happened. Now . . . the legal basis supporting net neutrality has been erased by the FCC."

In fact, Mr. Lessig has it backward. Broadcast.com failed precisely because the FCC's "neutral" telecom price controls and sharing mandates effectively prohibited investments in broadband networks and crashed thousands of Silicon Valley business plans and dot-com dreams. Hoping to create "competition" out of thin air, the Clinton-Gore FCC forced telecom providers to lease their wires and switches at below-market rates. By guaranteeing a negative rate of return on infrastructure investments, the FCC destroyed incentives to build new broadband networks -- the kind that might have allowed Broadcast.com to flourish.

By 2000, the U.S. had fewer than five million consumer "broadband" links, averaging 500 kilobits per second. Over the past two years, the reverse has been true. As the FCC has relaxed or eliminated regulations, broadband investment and download speeds have surged -- we now enjoy almost 50 million broadband links, averaging some three megabits per second. Internet video succeeded in the form of YouTube. But that "explosion of innovation" at the "applications and content layer" was not feasible without tens of billions of dollars of optics, chips and disks deployed around the world. YouTube at the edge cannot happen without bandwidth in the core.

Messrs. Lessig, Dingell and Conyers, and Google, now want to repeat all the investment-killing mistakes of the late 1990s, in the form of new legislation and FCC regulation to ensure "net neutrality." This ignores the experience of the recent past -- and worse, the needs of the future.

Think of this. Each year the original content on the world's radio, cable and broadcast television channels adds up to about 75 petabytes of data (a petabyte is 10 to the 15th power bytes). If current estimates are correct, the two-year-old YouTube streams that much data in about three months. But a shift to high-definition video clips by YouTube users would flood the Internet with enough data to more than double the traffic of the entire cybersphere. And YouTube is just one company with one application that is itself only in its infancy. Given the growth of video cameras around the world, we could soon produce five exabytes of amateur video annually. Upgrades to high-definition will in time increase that number by another order of magnitude to some 50 exabytes or more, or 10 times the Internet's current yearly traffic.
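The data-volume claims in that paragraph reduce to simple unit arithmetic. This sketch restates the op-ed's own estimates in bytes to confirm they are mutually consistent (all figures are the article's, not measured data):

```python
PB = 10**15   # petabyte, in bytes
EB = 10**18   # exabyte, in bytes

tv_content_per_year = 75 * PB               # all original radio/cable/broadcast content, per year
youtube_per_year = 4 * tv_content_per_year  # YouTube streams that much "in about three months"

amateur_video = 5 * EB                      # projected amateur video per year
hd_amateur_video = 10 * amateur_video       # high definition: one order of magnitude more

# "10 times the Internet's current yearly traffic" implies ~5 EB/year of total traffic
implied_internet_traffic = hd_amateur_video / 10
print(youtube_per_year / PB)          # → 300.0 (petabytes per year)
print(implied_internet_traffic / EB)  # → 5.0 (exabytes per year)
```

So on the article's own numbers, YouTube alone was already moving four times the world's annual broadcast output, and an HD shift in amateur video would dwarf the entire network's then-current load.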

We will increasingly share these videos with the world. And even if we do not share them, we will back them up at remote data storage facilities. I just began using a service called Mozy that each night at 3 a.m. automatically scans and backs up the gigabytes worth of documents and photos on my PCs. My home computers are now mirrored at a data center in Utah. One way or another, these videos will thus traverse the net at least once, and possibly, in the case of a YouTube hit, hundreds of thousands of times.

There's more. Advances in digital medical imaging will soon slice your brain 1,024 ways with resolution of less than half a millimeter and produce multigigabyte files. A technician puts your anatomy on a DVD and you send your body onto the Internet for analysis by a radiologist in Mumbai. You skip doctor visits, stay home and have him come to you with a remote video diagnosis. Add another 10 exabytes or more of Internet data traffic. Then there's what George Gilder calls the "global sensorium," the coming network of digital surveillance cameras, RFID tags and other sensors, sprawling across every home, highway, hybrid, high-rise, high-school, etc. All this data will be collected, analyzed and transmitted. Oh, and how about video conferencing? Each year we generate some 20 exabytes of data via telephone. As these audio conversations gradually shift to video, putting further severe strains on the network, we could multiply the 20 exabytes by a factor of 100 or more.

Today's networks are not remotely prepared to handle this exaflood.

Wall Street will finance new telco and cable fiber optic projects, but only with some reasonable hope of a profit. And that is what net neutrality could squelch. Google, for example, has guaranteed $900 million in advertising revenue to MySpace and paid Dell $1 billion to install Google search boxes on its computers; YouTube partnered with Verizon Wireless; MySpace signed its own content deal with Cingular. But these kinds of preferential partnerships, where content and conduit are integrated to varying degrees -- and which are ubiquitous in almost every industry -- could be outlawed under net neutrality.

Ironically, the condition that net neutrality seeks to ban -- discrimination or favoritism of content on the Internet -- is only necessary in narrowband networks. When resources are scarce, the highest bidder can exclude the others. But with real broadband networks, capacity is abundant and discrimination unnecessary. Net neutrality's rules, price controls and litigation would prevent broadband networks from being built, limit the amount of available bandwidth and thus encourage the zero-sum discrimination supposedly deplored.

Without many tens of billions of dollars worth of new fiber optic networks, thousands of new business plans in communications, medicine, education, security, remote sensing, computing, the military and every mundane task that could soon move to the Internet will be frustrated. All the innovations on the edge will die. Only an explosion of risky network investment and new network technology can accommodate these millions of ideas.

Mr. Swanson is a senior fellow at the Discovery Institute, and contributing editor at the Gilder Technology Report.


345
Science, Culture, & Humanities / Football concussions, a story
« on: February 02, 2007, 05:55:40 AM »
NY Times
 By ALAN SCHWARZ
Published: February 2, 2007
Ted Johnson helped the New England Patriots win three of the past five Super Bowls before retiring in 2005. Now, he says, he forgets people’s names, misses appointments and, because of an addiction to amphetamines, can become so terrified of the outside world that he locks himself alone inside his Boston apartment in bed with the blinds drawn for days at a time.

“There’s something wrong with me,” said Mr. Johnson, 34, who spent 10 years in the National Football League as the Patriots’ middle linebacker. “There’s something wrong with my brain. And I know when it started.”

Mr. Johnson’s decline began, he said, in August 2002, with a concussion he sustained in a preseason game against the New York Giants. He sustained another four days later during a practice, after Patriots Coach Bill Belichick went against the recommendation of the team’s trainer, Johnson said, and submitted him to regular on-field contact.

Mr. Belichick and the Patriots’ head trainer at the time, Jim Whalen — each of whom remains in those positions — declined to comment on Mr. Johnson’s medical experience with the team or his allegations regarding their actions.

Following his two concussions in August 2002, Mr. Johnson sat out the next two preseason games on the recommendation of a neurologist. After returning to play, Mr. Johnson sustained more concussions of varying severity over the next three seasons, each of them exacerbating the next, according to Mr. Johnson’s current neurologist, Dr. Robert Cantu.

Dr. Cantu said that he was convinced Mr. Johnson’s cognitive impairment and depression “are related to his previous head injuries, as they are all rather classic postconcussion symptoms.” He added, “They are most likely permanent.”

Asked for a prognosis of Mr. Johnson’s future, Dr. Cantu, the chief of neurosurgery and director of sports medicine at Emerson Hospital in Concord, Mass., said: “Ted already shows the mild cognitive impairment that is characteristic of early Alzheimer’s disease. The majority of those symptoms relentlessly progress over time. It could be that at the time he’s in his 50s, he could have severe Alzheimer’s symptoms.”

Mr. Johnson is among a growing number of former players and their relatives who are questioning whether their serious health issues are related to injuries they sustained and the treatment they received as players. Mr. Johnson said he decided to go public with his story after reading in The New York Times two weeks ago about Andre Waters, the former Philadelphia Eagles player who committed suicide last November and was later determined to have had significant brain damage caused by football-related concussions.

Mr. Johnson said he was not suicidal, but that the depression and cognitive problems he had developed since 2002 had worsened to the point that he now takes Adderall, a prescription amphetamine, at two to three times the dosage authorized by his doctors, who have been unaware of this abuse.

When he runs out of these pills, Mr. Johnson said, he shuts himself inside his downtown apartment for days and communicates with no one until a new prescription becomes available. He said he was coming forward with his story so that his friends and family might better understand his situation, and also so that the National Football League might improve its handling of concussions.

While the league’s guidelines regarding head injuries have been strengthened over the past decade, the N.F.L.’s record of allowing half of players who sustain concussions to return to the same game remains a subject of medical debate.

“I am afraid of somebody else being the next Andre Waters,” said Mr. Johnson, who spent two weeks last February at a psychiatric hospital outside Boston with, he said, no appreciable results. “People are going to question me: ‘Are you a whistleblower, what are you doing this for?’ You can call it whatever you want about what happened to me. I didn’t know the long-term ramifications. You can say that my coach didn’t know the long-term, or else he wouldn’t have done it. It is going to be hard for me to believe that my trainer didn’t know the long-term ramifications, but I am doing this to protect the players from themselves.”

The N.F.L. spokesman Greg Aiello said that the league had no knowledge of Johnson’s specific situation. Regarding the subject of player concussions in general, he said, “We are very concerned about the issue of concussions, and we are going to continue to look hard at it and do everything possible to protect the health of our players.”

At a news conference yesterday in Miami, where the Super Bowl will be held Sunday, Gene Upshaw, the executive director of the National Football League Players Association, spoke in general terms about concussions in the N.F.L. “If a coach or anyone else is saying, ‘You don’t have a concussion, you get back in there,’ you don’t have to go, and you shouldn’t go,” Upshaw said, not speaking about the Johnson case specifically. “You know how you feel. That’s what we tried to do throughout the years, is take the coach out of the decision-making. It’s the medical people that have to decide.”

Mr. Johnson, who has a 2-year-old daughter and a 1-year-old son, is currently in divorce proceedings with his wife, Jackie, a situation that he admitted was compounding his depression.
==========

He was arrested in July on domestic assault-and-battery charges, which were later dropped because his wife declined to testify. Mr. Johnson said that his concussive symptoms and drug addiction not only precipitated his marriage’s decline but began several years before it, specifically that preseason of 2002.


According to Patriots medical records that Mr. Johnson shared with The Times, the only notable concussion in his career to that point happened when he played at the University of Colorado in 1993. Against the Giants on Aug. 10, 2002, those records indicate, he sustained a “head injury” — the word concussion was not used — and despite the clearing of symptoms after several minutes on the sideline, he did not return to the game.

Mr. Johnson said that four days later, when full-contact practice resumed, Mr. Whalen issued him a red jersey, the standard signal to all other players that he was not supposed to be hit in any way. About an hour into the practice, Mr. Johnson said, before a set of high-impact running drills, an assistant trainer came out on the field with a standard blue jersey. When he asked for an explanation, Mr. Johnson said, the assistant told him that he was following Mr. Whalen’s instructions.

Mr. Johnson, whose relationship with Mr. Belichick had already been strained by a contract dispute, said he interpreted the scene as Mr. Belichick’s testing his desire to play, and that he might be cut and lose his $1.1 million salary — N.F.L. contracts are not guaranteed — if he did not follow orders.

“I’m sitting there going, ‘God, do I put this thing on?’ ” Mr. Johnson said. “I put the blue on. I was scared for my job.”

Regarding the intimidation he felt at that moment, Mr. Johnson added, “This kind of thing happens all the time in football. That day it was Bill Belichick and Ted Johnson. But it happens all the time.”

Several Patriots teammates said they did not recall this incident but invariably testified to the believability of Mr. Johnson, the team captain in 1998 and 2003. Said one former teammate, who insisted on anonymity because he still plays with the Patriots under Mr. Belichick, “If Ted tells you something’s going on, something’s going on.”

Mr. Johnson said that the first play called after he put the blue jersey on, known as “ace-ice,” called for one act from him, the middle linebacker: to sprint four yards headlong into the onrushing blocking back. After that collision, Mr. Johnson said, a warm sensation overtook his body, he saw stars, and he felt disoriented as the other players appeared to be moving in slow motion. He never lost consciousness, though, and after several seconds regained his composure and continued to practice “in a bit of a fog” while trying to avoid contact. He said he did not mention anything to anyone until after practice, when he angrily approached Mr. Whalen, the head trainer.

“I said, ‘Just so you know, I got another concussion,’ ” Johnson said. “You could see the blood, like, leave his face. And he was like, ‘All right, all right, well, we’re going to get you in to see a neurologist.’ ”

Dr. Lee H. Schwamm, the neurologist at Massachusetts General Hospital who examined Mr. Johnson, concluded in a memo on Aug. 19, 2002, that Mr. Johnson had sustained a second concussion in that practice. Dr. Schwamm also wrote, after speaking with Mr. Whalen, that Mr. Whalen “was on the sidelines when he sustained the concussion during the game and assessed him frequently at the sideline,” and that “he has kept Mr. Johnson out of contact since that time.”

Mr. Johnson said that the next day he spoke with Mr. Belichick about the incident but that they only glossed over it.

“He was vaguely acknowledging that he was aware of what happened,” Mr. Johnson said, “and he wanted to just kind of let me know that he knew.”

Mr. Johnson missed the next two preseason games, played in the final one, and then, believing he was still going to be left off the active roster for the opening game against Pittsburgh, angrily left camp for two days before returning and meeting with Mr. Belichick and confronting him privately about the blue-jersey incident.

“It’s as clear as a bell — ‘I had to see if you could play,’ ” Mr. Johnson recalled Mr. Belichick saying. Minutes later, Mr. Johnson said, Mr. Belichick admitted he had made a mistake by making him wear the blue jersey. “It was a real kind of admittance, but it was only him and I in the room,” Mr. Johnson said.

==================

Mr. Johnson sat out the season opener but played the following Sunday against the New York Jets, a game in which Mr. Johnson said he could not remember line formations and was caught out of position because he could not concentrate. After sitting out the next game against Kansas City, Mr. Johnson played against San Diego and had the same problem.

He learned how to manage the disorientation and played the rest of the season but said that, “from that point on, I was getting a lot of these, what I call mini-concussions.”

Mr. Johnson added that he did not report these to his trainer or coaches for fear he would be seen as weak.

This continued through the 2003 season, Mr. Johnson said, as he noticed himself feeling increasingly unfocused, irritable and depressed. Teammates noticed as well, said Willie McGinest, a fellow linebacker who now plays for the Cleveland Browns.

“He was always an upbeat, positive guy,” Mr. McGinest said. “After the last few concussions, you could tell he was off at times.”

Playing poorly, Mr. Johnson lost his starting job.

In the week before the 2004 Super Bowl, Mr. Johnson said, a friend who supplied amphetamines to several major league baseball pitchers gave him some Adderall pills to cure his lethargy and increase his concentration. “It was the best I had felt in the longest time,” Mr. Johnson said. “The old Ted was back.”

After playing only sparingly in that Super Bowl, Mr. Johnson began taking larger and larger doses before and throughout the 2004 season, when he regained his starting position at middle linebacker and helped the Patriots win their second consecutive Super Bowl.

The better mood did not last long, he said. The minor concussions — euphemized as “dings” in N.F.L. lingo — that he regularly sustained in practice and in games hurt more than the Adderall could help. The thought of violently tackling a player, he said, “made me physically ill.”

“For the first time in my life,” he said, “I was scared of going out there and putting my head in there.”

Mr. Johnson retired before the 2005 season and briefly worked as a football analyst for WBZ-TV in Boston. But he said his malaise and cognitive problems were only getting worse, and in his attempt to regain some sort of balance, he wound up taking large amounts of antidepressants along with increasing amounts of Adderall, creating a dangerous up-and-down cycle that he realized required professional attention. Last February, he spent two weeks at McLean Hospital, a psychiatric institution in suburban Belmont, Mass.

Mr. Johnson said he felt no better after that experience, and he quickly resumed the Adderall abuse that continues today. He has moved out of his former house during his divorce proceedings and lives in a two-bedroom apartment downtown, which after three months contains dozens of half-open moving boxes.

“Welcome to the glamorous life of a former N.F.L. player,” he said. A half-hour later, he stepped into his Range Rover and drove to his local CVS to pick up another bottle of Adderall. The 72 pills of 30 milligrams each are supposed to last nine days, but he knows he will blow through them in four or five.

One of his most maddening frustrations, Mr. Johnson said, is that no tests — from M.R.I.’s to other scans of his brain — have confirmed his condition, causing some people in his life to suspect that he is wallowing in retirement blues. “That’s ridiculous,” he said, “because I always treated football as a steppingstone for the rest of my life. I used to have incredible drive and ambition. I want to get my M.B.A. But I can’t even let myself have a job right now. I don’t trust myself.”

Dr. Cantu, his neurosurgeon, said he was convinced that Mr. Johnson’s condition was primarily caused by successive concussions sustained over short periods of time. He said that M.R.I.’s of Mr. Johnson’s brain were clear, but that “the vast majority of individuals with postconcussion syndrome, including depression, cognitive impairment, all the symptoms that Ted has, have normal M.R.I.’s.”

The most conclusive method to assess this type of brain damage, Dr. Cantu said, was to examine parts of the brain microscopically for tears and tangles, but such a test is done almost exclusively post-mortem. It was this type of examination that was conducted by a neuropathologist at the University of Pittsburgh, Dr. Bennet Omalu, on the brain of Mr. Waters after his suicide, revealing a condition that Dr. Omalu described as that of an 85-year-old with Alzheimer’s disease.

“The type of changes that Andre Waters reportedly had most likely Ted has as well,” Dr. Cantu said.

Experts in the field of athletic head trauma have grown increasingly confident through studies and anecdotal evidence that repeated concussions, particularly those sustained only days apart, are particularly dangerous. Dr. David Hovda, a professor of neurosurgery and director of the Brain Injury Research Center at U.C.L.A., said, “Repeated concussions — it doesn’t matter the severity — have effects that are more than additive, and that last longer.”

Sitting in his apartment this week, Mr. Johnson said that he had not considered a lawsuit against Mr. Belichick, any Patriots personnel or the N.F.L. He said that his sole motivation was to raise awareness of the dangers that football players can face regarding concussions.

Asked who was to blame for his condition — Mr. Belichick, Mr. Whalen, himself or the entire culture of the N.F.L. — Mr. Johnson thought for 30 seconds and said he could not decide.

Several hours later, he was riding in an elevator up to a consultation with Dr. Cantu. As the door opened on the seventh floor, a middle-aged man walked out and smiled warmly at Mr. Johnson. “We missed you this year,” he said.

“Thanks, man,” Mr. Johnson said with a grin and a nod. Later, Mr. Johnson said something else went through his troubled mind at that moment.

“I miss me, too,” he said.


346
Science, Culture, & Humanities / Amazing Artwork
« on: January 31, 2007, 09:17:44 PM »
Best I've ever seen of this sort of thing!

http://www.thepuzzlefactory.com:80/2006_chalk.cfm

347
Science, Culture, & Humanities / Diet
« on: January 28, 2007, 06:44:48 AM »
With this thread, no longer is diet a subset of the Health thread-- diet gets its own thread.

I begin with a long article from today's NY Times Magazine which I think makes a profound point quite similar to the one I have been making for many years now in a more humorous manner-- the secret to diet is to eat so that you defecate well.  Like I tell my children, "Eat real food!  Have you seen it grow out of the ground, from a bush or a tree?  Have you seen a hunter hunt it?  A fisherman fish it?  If not, it's not real food!"

==============

Eat food. Not too much. Mostly plants.

That, more or less, is the short answer to the supposedly incredibly complicated and confusing question of what we humans should eat in order to be maximally healthy. I hate to give away the game right here at the beginning of a long essay, and I confess that I’m tempted to complicate matters in the interest of keeping things going for a few thousand more words. I’ll try to resist but will go ahead and add a couple more details to flesh out the advice. Like: A little meat won’t kill you, though it’s better approached as a side dish than as a main. And you’re much better off eating whole fresh foods than processed food products. That’s what I mean by the recommendation to eat “food.” Once, food was all you could eat, but today there are lots of other edible foodlike substances in the supermarket. These novel products of food science often come in packages festooned with health claims, which brings me to a related rule of thumb: if you’re concerned about your health, you should probably avoid food products that make health claims. Why? Because a health claim on a food product is a good indication that it’s not really food, and food is what you want to eat.

Uh-oh. Things are suddenly sounding a little more complicated, aren’t they? Sorry. But that’s how it goes as soon as you try to get to the bottom of the whole vexing question of food and health. Before long, a dense cloud bank of confusion moves in. Sooner or later, everything solid you thought you knew about the links between diet and health gets blown away in the gust of the latest study.

Last winter came the news that a low-fat diet, long believed to protect against breast cancer, may do no such thing — this from the monumental, federally financed Women’s Health Initiative, which has also found no link between a low-fat diet and rates of coronary disease. The year before we learned that dietary fiber might not, as we had been confidently told, help prevent colon cancer. Just last fall two prestigious studies on omega-3 fats published at the same time presented us with strikingly different conclusions. While the Institute of Medicine stated that “it is uncertain how much these omega-3s contribute to improving health” (and they might do the opposite if you get them from mercury-contaminated fish), a Harvard study declared that simply by eating a couple of servings of fish each week (or by downing enough fish oil), you could cut your risk of dying from a heart attack by more than a third — a stunningly hopeful piece of news. It’s no wonder that omega-3 fatty acids are poised to become the oat bran of 2007, as food scientists micro-encapsulate fish oil and algae oil and blast them into such formerly all-terrestrial foods as bread and tortillas, milk and yogurt and cheese, all of which will soon, you can be sure, sprout fishy new health claims. (Remember the rule?)

By now you’re probably registering the cognitive dissonance of the supermarket shopper or science-section reader, as well as some nostalgia for the simplicity and solidity of the first few sentences of this essay. Which I’m still prepared to defend against the shifting winds of nutritional science and food-industry marketing. But before I do that, it might be useful to figure out how we arrived at our present state of nutritional confusion and anxiety.

The story of how the most basic questions about what to eat ever got so complicated reveals a great deal about the institutional imperatives of the food industry, nutritional science and — ahem — journalism, three parties that stand to gain much from widespread confusion surrounding what is, after all, the most elemental question an omnivore confronts. Humans deciding what to eat without expert help — something they have been doing with notable success since coming down out of the trees — is seriously unprofitable if you’re a food company, distinctly risky if you’re a nutritionist and just plain boring if you’re a newspaper editor or journalist. (Or, for that matter, an eater. Who wants to hear, yet again, “Eat more fruits and vegetables”?) And so, like a large gray fog, a great Conspiracy of Confusion has gathered around the simplest questions of nutrition — much to the advantage of everybody involved. Except perhaps the ostensible beneficiary of all this nutritional expertise and advice: us, and our health and happiness as eaters.

FROM FOODS TO NUTRIENTS

It was in the 1980s that food began disappearing from the American supermarket, gradually to be replaced by “nutrients,” which are not the same thing. Where once the familiar names of recognizable comestibles — things like eggs or breakfast cereal or cookies — claimed pride of place on the brightly colored packages crowding the aisles, now new terms like “fiber” and “cholesterol” and “saturated fat” rose to large-type prominence. More important than mere foods, the presence or absence of these invisible substances was now generally believed to confer health benefits on their eaters. Foods by comparison were coarse, old-fashioned and decidedly unscientific things — who could say what was in them, really? But nutrients — those chemical compounds and minerals in foods that nutritionists have deemed important to health — gleamed with the promise of scientific certainty; eat more of the right ones, fewer of the wrong, and you would live longer and avoid chronic diseases.



============



Unhappy Meals
Published: January 28, 2007
Nutrients themselves had been around, as a concept, since the early 19th century, when the English doctor and chemist William Prout identified what came to be called the “macronutrients”: protein, fat and carbohydrates. It was thought that that was pretty much all there was going on in food, until doctors noticed that an adequate supply of the big three did not necessarily keep people nourished. At the end of the 19th century, British doctors were puzzled by the fact that Chinese laborers in the Malay states were dying of a disease called beriberi, which didn’t seem to afflict Tamils or native Malays. The mystery was solved when someone pointed out that the Chinese ate “polished,” or white, rice, while the others ate rice that hadn’t been mechanically milled. A few years later, Casimir Funk, a Polish chemist, discovered the “essential nutrient” in rice husks that protected against beriberi and called it a “vitamine,” the first micronutrient. Vitamins brought a kind of glamour to the science of nutrition, and though certain sectors of the population began to eat by its expert lights, it really wasn’t until late in the 20th century that nutrients managed to push food aside in the popular imagination of what it means to eat.

No single event marked the shift from eating food to eating nutrients, though in retrospect a little-noticed political dust-up in Washington in 1977 seems to have helped propel American food culture down this dimly lighted path. Responding to an alarming increase in chronic diseases linked to diet — including heart disease, cancer and diabetes — a Senate Select Committee on Nutrition, headed by George McGovern, held hearings on the problem and prepared what by all rights should have been an uncontroversial document called “Dietary Goals for the United States.” The committee learned that while rates of coronary heart disease had soared in America since World War II, other cultures that consumed traditional diets based largely on plants had strikingly low rates of chronic disease. Epidemiologists also had observed that in America during the war years, when meat and dairy products were strictly rationed, the rate of heart disease temporarily plummeted.

Naïvely putting two and two together, the committee drafted a straightforward set of dietary guidelines calling on Americans to cut down on red meat and dairy products. Within weeks a firestorm, emanating from the red-meat and dairy industries, engulfed the committee, and Senator McGovern (who had a great many cattle ranchers among his South Dakota constituents) was forced to beat a retreat. The committee’s recommendations were hastily rewritten. Plain talk about food — the committee had advised Americans to actually “reduce consumption of meat” — was replaced by artful compromise: “Choose meats, poultry and fish that will reduce saturated-fat intake.”

A subtle change in emphasis, you might say, but a world of difference just the same. First, the stark message to “eat less” of a particular food has been deep-sixed; don’t look for it ever again in any official U.S. dietary pronouncement. Second, notice how distinctions between entities as different as fish and beef and chicken have collapsed; those three venerable foods, each representing an entirely different taxonomic class, are now lumped together as delivery systems for a single nutrient. Notice too how the new language exonerates the foods themselves; now the culprit is an obscure, invisible, tasteless — and politically unconnected — substance that may or may not lurk in them called “saturated fat.”

The linguistic capitulation did nothing to rescue McGovern from his blunder; the very next election, in 1980, the beef lobby helped rusticate the three-term senator, sending an unmistakable warning to anyone who would challenge the American diet, and in particular the big chunk of animal protein sitting in the middle of its plate. Henceforth, government dietary guidelines would shun plain talk about whole foods, each of which has its trade association on Capitol Hill, and would instead arrive clothed in scientific euphemism and speaking of nutrients, entities that few Americans really understood but that lack powerful lobbies in Washington. This was precisely the tack taken by the National Academy of Sciences when it issued its landmark report on diet and cancer in 1982. Organized nutrient by nutrient in a way guaranteed to offend no food group, it codified the official new dietary language. Industry and media followed suit, and terms like polyunsaturated, cholesterol, monounsaturated, carbohydrate, fiber, polyphenols, amino acids and carotenes soon colonized much of the cultural space previously occupied by the tangible substance formerly known as food. The Age of Nutritionism had arrived.

THE RISE OF NUTRITIONISM

The first thing to understand about nutritionism — I first encountered the term in the work of an Australian sociologist of science named Gyorgy Scrinis — is that it is not quite the same as nutrition. As the “ism” suggests, it is not a scientific subject but an ideology. Ideologies are ways of organizing large swaths of life and experience under a set of shared but unexamined assumptions. This quality makes an ideology particularly hard to see, at least while it’s exerting its hold on your culture. A reigning ideology is a little like the weather, all pervasive and virtually inescapable. Still, we can try.

In the case of nutritionism, the widely shared but unexamined assumption is that the key to understanding food is indeed the nutrient. From this basic premise flow several others. Since nutrients, as compared with foods, are invisible and therefore slightly mysterious, it falls to the scientists (and to the journalists through whom the scientists speak) to explain the hidden reality of foods to us. To enter a world in which you dine on unseen nutrients, you need lots of expert help.

But expert help to do what, exactly? This brings us to another unexamined assumption: that the whole point of eating is to maintain and promote bodily health. Hippocrates’s famous injunction to “let food be thy medicine” is ritually invoked to support this notion. I’ll leave the premise alone for now, except to point out that it is not shared by all cultures and that the experience of these other cultures suggests that, paradoxically, viewing food as being about things other than bodily health — like pleasure, say, or socializing — makes people no less healthy; indeed, there’s some reason to believe that it may make them more healthy. This is what we usually have in mind when we speak of the “French paradox” — the fact that a population that eats all sorts of unhealthful nutrients is in many ways healthier than we Americans are. So there is at least a question as to whether nutritionism is actually any good for you.

Another potentially serious weakness of nutritionist ideology is that it has trouble discerning qualitative distinctions between foods. So fish, beef and chicken through the nutritionists’ lens become mere delivery systems for varying quantities of fats and proteins and whatever other nutrients are on their scope. Similarly, any qualitative distinctions between processed foods and whole foods disappear when your focus is on quantifying the nutrients they contain (or, more precisely, the known nutrients).

This is a great boon for manufacturers of processed food, and it helps explain why they have been so happy to get with the nutritionism program. In the years following McGovern’s capitulation and the 1982 National Academy report, the food industry set about re-engineering thousands of popular food products to contain more of the nutrients that science and government had deemed the good ones and less of the bad, and by the late ’80s a golden era of food science was upon us. The Year of Eating Oat Bran — also known as 1988 — served as a kind of coming-out party for the food scientists, who succeeded in getting the material into nearly every processed food sold in America. Oat bran’s moment on the dietary stage didn’t last long, but the pattern had been established, and every few years since then a new oat bran has taken its turn under the marketing lights. (Here comes omega-3!)

By comparison, the typical real food has more trouble competing under the rules of nutritionism, if only because something like a banana or an avocado can’t easily change its nutritional stripes (though rest assured the genetic engineers are hard at work on the problem). So far, at least, you can’t put oat bran in a banana. So depending on the reigning nutritional orthodoxy, the avocado might be either a high-fat food to be avoided (Old Think) or a food high in monounsaturated fat to be embraced (New Think). The fate of each whole food rises and falls with every change in the nutritional weather, while the processed foods are simply reformulated. That’s why when the Atkins mania hit the food industry, bread and pasta were given a quick redesign (dialing back the carbs; boosting the protein), while the poor unreconstructed potatoes and carrots were left out in the cold.

Of course it’s also a lot easier to slap a health claim on a box of sugary cereal than on a potato or carrot, with the perverse result that the most healthful foods in the supermarket sit there quietly in the produce section, silent as stroke victims, while a few aisles over, the Cocoa Puffs and Lucky Charms are screaming about their newfound whole-grain goodness.

EAT RIGHT, GET FATTER

So nutritionism is good for business. But is it good for us? You might think that a national fixation on nutrients would lead to measurable improvements in the public health. But for that to happen, the underlying nutritional science, as well as the policy recommendations (and the journalism) based on that science, would have to be sound. This has seldom been the case.

Consider what happened immediately after the 1977 “Dietary Goals” — McGovern’s masterpiece of politico-nutritionist compromise. In the wake of the panel’s recommendation that we cut down on saturated fat, a recommendation seconded by the 1982 National Academy report on cancer, Americans did indeed change their diets, endeavoring for a quarter-century to do what they had been told. Well, kind of. The industrial food supply was promptly reformulated to reflect the official advice, giving us low-fat pork, low-fat Snackwell’s and all the low-fat pasta and high-fructose (yet low-fat!) corn syrup we could consume. Which turned out to be quite a lot. Oddly, America got really fat on its new low-fat diet — indeed, many date the current obesity and diabetes epidemic to the late 1970s, when Americans began binging on carbohydrates, ostensibly as a way to avoid the evils of fat.

This story has been told before, notably in these pages (“What if It’s All Been a Big Fat Lie?” by Gary Taubes, July 7, 2002), but it’s a little more complicated than the official version suggests. In that version, which inspired the most recent Atkins craze, we were told that America got fat when, responding to bad scientific advice, it shifted its diet from fats to carbs, suggesting that a re-evaluation of the two nutrients is in order: fat doesn’t make you fat; carbs do. (Why this should have come as news is a mystery: as long as people have been raising animals for food, they have fattened them on carbs.)

But there are a couple of problems with this revisionist picture. First, while it is true that Americans post-1977 did begin binging on carbs, and that fat as a percentage of total calories in the American diet declined, we never did in fact cut down on our consumption of fat. Meat consumption actually climbed. We just heaped a bunch more carbs onto our plates, obscuring perhaps, but not replacing, the expanding chunk of animal protein squatting in the center.

How did that happen? I would submit that the ideology of nutritionism deserves as much of the blame as the carbohydrates themselves do — that and human nature. By framing dietary advice in terms of good and bad nutrients, and by burying the recommendation that we should eat less of any particular food, it was easy for the take-home message of the 1977 and 1982 dietary guidelines to be simplified as follows: Eat more low-fat foods. And that is what we did. We’re always happy to receive a dispensation to eat more of something (with the possible exception of oat bran), and one of the things nutritionism reliably gives us is some such dispensation: low-fat cookies then, low-carb beer now. It’s hard to imagine the low-fat craze taking off as it did if McGovern’s original food-based recommendations had stood: eat fewer meat and dairy products. For how do you get from that stark counsel to the idea that another case of Snackwell’s is just what the doctor ordered?

BAD SCIENCE

But if nutritionism leads to a kind of false consciousness in the mind of the eater, the ideology can just as easily mislead the scientist. Most nutritional science involves studying one nutrient at a time, an approach that even nutritionists who do it will tell you is deeply flawed. “The problem with nutrient-by-nutrient nutrition science,” points out Marion Nestle, the New York University nutritionist, “is that it takes the nutrient out of the context of food, the food out of the context of diet and the diet out of the context of lifestyle.”



================

348


 

Have we been conned about cholesterol?
by MALCOLM KENDRICK

Conventional medical wisdom about cholesterol and the role of statins is now being challenged by a small, but growing number of health professionals. Among them is Dr Malcolm Kendrick. A GP for 25 years, he has also worked with the European Society of Cardiology, and writes for leading medical magazines:

When it comes to heart disease, we have been sold a pup. A rather large pup.

Actually, it's more of a full-grown blue whale. We've all been conned.

If you've got a raised risk of heart disease, the standard medical advice is to take a cholesterol-lowering statin drug to cut your chances of having a heart attack because, as we all know, cholesterol is a killer.

Indeed, many of you already believe that you should take statins for the rest of your natural lifespan.

Nearly everybody is in agreement about the need to lower your cholesterol level. The NHS spends nearly £1 billion a year on prescriptions for statins and possibly the same amount administering the cholesterol tests, surgery visits and the rest.

But is it all worth it? According to an article being published in the medical journal The Lancet this week, the answer is probably no.

A leading researcher at Harvard Medical School has found that women don't benefit from taking statins at all, nor do men over 69 who haven't already had a heart attack.

There is a very faint benefit if you are a younger man who also hasn't had a heart attack - out of 50 such men who take the drug for five years, only one will benefit.
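The "one in 50" figure is what epidemiologists call the number needed to treat (NNT). A minimal sketch of the arithmetic in Python, using hypothetical five-year event rates chosen only so that the numbers come out as the article states (the real trial rates are not given here):

```python
# Number needed to treat (NNT): how many patients must take a drug
# for one of them to benefit. NNT = 1 / absolute risk reduction.
# The event rates below are HYPOTHETICAL, picked only to reproduce
# the "1 in 50 over five years" figure quoted in the article.

def nnt(control_event_rate, treated_event_rate):
    arr = control_event_rate - treated_event_rate  # absolute risk reduction
    return 1.0 / arr

# e.g. a 6% five-year heart-attack risk untreated vs 4% on a statin
print(round(nnt(0.06, 0.04)))  # -> 50: treat 50 men for one to benefit
```

Note that the same drug can look impressive as a relative risk reduction (here, a third fewer heart attacks) while the NNT reveals how few individuals actually benefit.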

Nor is this the first study to suggest that fighting cholesterol with statins is bunk. Indeed, there are hundreds of doctors and researchers who agree that the cholesterol hypothesis itself is nonsense.

What their work shows, and what your doctor should be saying, is the following:

• A high-fat diet, saturated or otherwise, does not affect blood cholesterol levels.

• High cholesterol levels don't cause heart disease.

• Statins do not protect against heart disease by lowering cholesterol - when they do work, they do so in another way.

• The protection provided by statins is so small as to be not worth bothering about for most people (and all women). The reality is that the benefits have been hyped beyond belief.

• Statins have many more unpleasant side effects than has been admitted, while experts in this area should be treated with healthy scepticism because they are almost universally paid large sums by statin manufacturers to sing loudly from their hymn sheet.

So how can I say saturated fat doesn't matter when everyone knows it is a killer? Could all those millions who have been putting skinless chicken and one per cent fat yoghurts into their trolleys really have been wasting their time?

The experts are so busy urging you to consume less fat and more statins that you are never warned about the contradictions and lack of evidence behind the cholesterol con.

In fact, what many major studies show is that as far as protecting your heart goes, cutting back on saturated fats makes no difference and, in fact, is more likely to do harm.

So how did fat and cholesterol get such a bad name? It all began about 100 years ago, when a researcher found that feeding rabbits (herbivores) a high-cholesterol carnivore diet blocked their arteries with plaque.

But it took off in the Fifties with the Seven Countries study by Ancel Keys, which showed that the higher the saturated fat intake in a country, the higher the cholesterol levels and the higher the rate of heart disease.

The countries he chose included Italy, Greece, the USA and the Netherlands. But why these particular ones?

Recently I did my own 14 countries study using figures from the World Health Organisation, and found the opposite.

Countries with the highest saturated fat consumption (Austria, France, Finland and Belgium) had the lowest rate of deaths from heart disease, while those with the lowest consumption (Georgia, Ukraine and Croatia) had the highest mortality rate from heart disease.

Added to this, the biggest ever trial on dietary modification put 50 million people on a low saturated fat diet for 14 years.

Sausages, eggs, cheese, bacon and milk were restricted. Fruit and fish, however, were freely available. I'm talking about rationing in Britain during and after World War Two. In that time, deaths from heart disease more than doubled.

Even more damning is what happened in 1988. The Surgeon General's office in the US decided to gather all evidence linking saturated fat to heart disease, silencing any nay-sayers for ever.

Eleven years later, however, the project was killed. The letter announcing this stated that the office "did not anticipate fully the magnitude of the additional expertise and staff resources that would be needed".

After eleven years, they needed additional expertise and staff resources? What had they been doing? If they'd found a scrap of evidence, you would never have heard the last of it.

Major trials since have been no more successful. One involved nearly 30,000 middle-aged men and women in Sweden, followed for six years.

The conclusion? "Saturated fat showed no relationship with cardiovascular disease in men. Among the women, cardiovascular mortality showed a downward trend with increasing saturated fat intake." (In other words, the more saturated fat, the less chance of dying from heart disease).

Even stronger evidence of the benefits of increased fat and cholesterol in the diet comes from Japan. Between 1958 and 1999, the Japanese doubled their protein intake, ate 400 per cent more fat and their cholesterol levels went up by 20 per cent.

Did they drop like flies? No. Their stroke rate, which had been the highest in the world, fell to a seventh of its former level, while deaths from heart attacks, already low, fell by 50 per cent.

It's a bit of a paradox, isn't it? That's one of the features of the dietary hypothesis - it involves a lot of paradoxes.

The most famous is the French Paradox. They eat more saturated fat than we do in Britain; they smoke more, take less exercise, have the same cholesterol/LDL levels, they also have the same average blood pressure and the same rate of obesity.

And you know what? They have one quarter the rate of heart disease we do.

The official explanation is that the French are protected from heart disease by drinking red wine, eating lightly cooked vegetables and eating garlic.

But there is no evidence that any of these three factors are actually protective. None. By evidence, I mean a randomised, controlled clinical study.

Every time a population is found that doesn't fit the saturated fat/cholesterol hypothesis - the Masai living on blood and milk with no heart disease, the Inuit living on blubber with low heart disease - something is always found to explain it.

One research paper published more than 20 years ago found 246 factors that could protect against heart disease or promote it. By now there must be more than a thousand.

The closer you look, the more you find that the cholesterol hypothesis is an amazing beast. It is in a process of constant adaptation in order to encompass all contradictory data without keeling over and expiring.

But you don't need to look at foreign countries to find paradoxes - the biggest one is right here at home. Women are only about a third as likely as men to suffer heart disease, even though on average they have higher cholesterol levels.

For years there was an ad hoc hypothesis to explain this apparent contradiction - women were protected by female sex hormones.

In fact, there has never been a study showing that these hormones protect against heart disease in humans.

But by the Nineties, millions of women were being prescribed HRT to stave off heart disease.

Then came the HERS trial to test the notion. It found HRT increased the risk of heart disease.

So what to do? Put them on statins; bring their cholesterol level down - below 5.0 mmol is the official advice.

But, as The Lancet article emphasises, women do not benefit from statins. The phrase "Statins do not save lives in women" should be hung in every doctor's surgery.

But it's not just hugely wasteful handing out statins to women and men who are never going to benefit; it also exposes them to the risk of totally unnecessary side effects.

These include muscle weakness (myopathy) and mental and neurological problems such as severe irritability and memory loss.

How common are they? Very rare, say experts, but one trial found that 90 per cent of those on statins complained of side effects, half of them serious.

Only last week, a study reported a link between low LDL cholesterol and developing Parkinson's disease.

Statins are designed to lower LDL. In the face of anticholesterol propaganda, it is easy to forget cholesterol is vital for our bodies to function.

Why do you think an egg yolk is full of cholesterol? Because it takes a lot of cholesterol to build a healthy chicken.

It also takes a hell of a lot to build and maintain a healthy human being.

In fact, cholesterol is so vital that almost all cells can manufacture cholesterol; one of the key functions of the liver is to synthesise cholesterol.

It's vital for the proper functioning of the brain, and it's the building block for most sex hormones.

So it should not be such a surprise to learn that lowering cholesterol can increase death rates.

Women with a cholesterol level of five or even six have a lower risk of dying than those with a level below four.

The Lancet reported that statins didn't benefit anyone over 69, not even men; in fact, there's good evidence that they may hasten your death.

The Framingham study in the US found that people whose cholesterol levels fell were at a 14 per cent increased risk of death from heart disease for every 1mg/dl their cholesterol dropped.

Set up in 1948, the study screened the whole population of Framingham near Boston for factors that might be involved in heart disease and then followed them to see what happened to them.

It is still going today, making it the longest running and most often quoted study in heart-disease research.

A massive long-term study that looked specifically at cholesterol levels and mortality in older people in Honolulu, published in The Lancet, found that having low cholesterol concentration for a long time increases the risk of death.

This may be because cholesterol is needed to fight off infections, or there may be other reasons - but many other studies have found exactly the same thing.

Low cholesterol levels greatly increase your risk of dying younger. So the cholesterol hypothesis looks something like this:

There is no evidence that saturated fat is bad - and there are lots of 'paradoxes' where countries with a high cholesterol intake don't have a higher death rate from heart disease.

But there is an even more fundamental problem. The theory claims fat and cholesterol do things in the body that just don't make sense.

To begin with, saturated fat and cholesterol are talked of as if they are strongly connected. A low-fat diet lowers cholesterol; a high-fat diet raises it.

What is never explained is how this works. This isn't surprising because saturated fat doesn't raise cholesterol. There is no biochemical connection between the two substances, which may explain all those negative findings.

It's true that foods containing cholesterol also tend to contain saturated fats because both usually come from animals.

It's also true that neither dissolve in water, so in order to travel along the bloodstream they have to be transported in a type of molecule known as a lipoprotein - such as LDLs (low-density lipoproteins) and HDLs (high-density lipoproteins).

But being travelling companions is as close as fats and cholesterol get. Once in the body, most fat from our diet is transported to the fat cells in a lipoprotein called a chylomicron.

Meanwhile, cholesterol is produced in the liver by way of an incredibly complicated 13-step process; the one that statins interfere with.

No biochemist has been able to explain to me why eating saturated fat should have any impact on this cholesterol production line in the liver.

On the other hand, the liver does make fat - lots of it. All the excess carbohydrate that we eat is turned first into glucose and then into fat in the liver.

And what sort of fat does the liver make? Saturated fat; obviously the body doesn't regard it as harmful at all.

Recently, attention has been shifting from the dangers of saturated fat and LDL "bad" cholesterol to the benefits of HDL "good" cholesterol, and new drugs that are going to boost it.

But the idea that more HDLs are going to fight off heart disease is built on equally shaky foundations.

These lipoproteins seem to be cholesterol "scavengers", sucking up the cholesterol that is released when a cell dies and then passing it on to other lipoproteins, which return it to the liver.

Interestingly, the "bad" LDL lipoproteins are involved in the relay. The idea seems to be that HDLs can also get the cholesterol out of the plaques that are blocking arteries.

However, there is a huge difference between absorbing free-floating cholesterol and sucking it out of an atherosclerotic plaque which is covered by an impermeable cap.

• Extracted from The Great Cholesterol Con by Malcolm Kendrick, published by John Blake on January 29 at £9.99.

349
 
http://www.mises.org/story/2451
Making Kids Worthless: Social Security's Contribution to the Fertility Crisis


By Oskari Juurikkala

Posted on 1/24/2007

 
"Kinder haben die Leute immer — People will always have children," assured Konrad Adenauer, the German Chancellor, in 1957. He was convinced that the future of the brave new pay-as-you-go social security system would not be undermined by demographic changes.

Adenauer could hardly have been more wrong. Social security schemes around the developed world are facing a major crisis due to greater longevity, declining retirement ages and — lo and behold — below-replacement fertility rates.

What the good statesman did not realize is how the new system would affect the incentives of individuals to work, to save, and to have children. Labor force participation rates among older workers have declined dramatically since the 1960s throughout the Western world. The rules of social security benefits in most countries mean that working just does not pay off. In this way, pay-as-you-go social security schemes contribute to their own bankruptcy. [1]

But the disincentives to work are not the only problem with government social security schemes. Demographic change too is a result of those systems, because compulsory social security penalizes parenthood and childbearing. Unfortunately, low fertility rates do not merely hasten the insolvency of public pay-as-you-go schemes, but lack of offspring also implies the decline of centuries-old nations.

The decline of fertility in the 20th century is a dismal reality. Fertility rates were higher than 5 in both Europe and the United States just a hundred years ago, but by the year 2000 they had plummeted to around 1.5 in Europe and 2.0 in the United States. Many European nations have fertility rates far below replacement level: Spain, Italy and Greece dip as low as 1.3. Germany — where according to Adenauer people were always going to have children — reaches an equally bleak figure of 1.4.[2] According to some estimates, Italy's population will halve over the next 50 years.

Families, Children, and Old-Age Security

What then has social security got to do with fertility rates? Actually, a lot.

In the absence of public social security systems, families function as a type of private, informal pay-as-you-go insurance mechanism, in which parents look after their children, and children care for their parents in sickness and old age in return. This is the common pattern still found in all traditional societies — just as it was in the West a hundred years ago.

Of course, some individuals cannot have children of their own, or their children may fall ill and die. The natural solution to these risks is to pool them in the informal social insurance market. This is why the norm in traditional societies is not the nuclear family but the extended family.

In addition to man's innate affection for offspring, the main reason why people used to have large families was that it was economically sound. Sociologists and demographers call it "the old age security motive for fertility." [3] In traditional societies, family values and mutual altruism are deeply held values, which are nurtured by both upbringing and material needs.

Family Socialism

Enter public social security. Instead of caring for their own parents and close relatives, those of working age are compelled by force of law and gun to pay for the retirement of everyone else. To put it plainly, social security replaces children and the family as the main support in old age by literally socializing the traditional duties of the family. Why have children when the state will take care of you in your old age?


The effect of social security on fertility is seen clearly in empirical data. The figure below shows cross-sectional data from over 100 countries in 1997. [4] In this data, all countries with large pension systems have fertility rates below the replacement level. No country with pension payments above 4 per cent of GDP has a fertility rate above 3.



Historical data is even more revealing. The following figure depicts time-series data from eight European countries from 1960 to the present. The growth of social security payments (X-axis) is associated with the decline of fertility (Y-axis) almost one-to-one. [5]
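The association the two figures describe is, at bottom, a correlation between pension spending and fertility. A minimal Python sketch of how such a cross-sectional association would be computed — the country tuples below are invented for illustration, not the WHO/ILO figures the article draws on:

```python
# ILLUSTRATIVE data only: (country, pension spending as % of GDP,
# total fertility rate). Invented values, not the article's sources.
data = [
    ("A", 1.0, 5.5), ("B", 2.0, 4.8), ("C", 4.0, 3.1),
    ("D", 7.0, 2.0), ("E", 10.0, 1.6), ("F", 13.0, 1.3),
]

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

spending = [d[1] for d in data]
fertility = [d[2] for d in data]
print(round(pearson_r(spending, fertility), 2))  # strongly negative
```

A strongly negative coefficient is what the article's figures suggest, though of course a correlation like this cannot by itself establish that pension systems cause the fertility decline rather than merely accompanying it.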



Inefficient Formalism

Every kind of socialism creates perverse incentives, and socialism directed to the family perverts the family. Because everyone has to pay for the retirement of everyone else, it does not pay to have children. Of course, people will still continue to have some children, simply because they want to have children as ends in themselves. However, as far as economic incentives are concerned, it has become economically more "rational" to free ride on the children of others. No surprise that masses of people embrace present-oriented lifestyles and refuse to commit themselves to real marriage with children.

Those who defend governmental social security argue that a formal, nation-wide system is surely more efficient in providing for needs in old age. In reality, the opposite is true. The extended family is an effective mechanism for solving informational and monitoring problems, which is why one finds practically no free riding and moral hazard in traditional settings. Under public social security, everyone seeks to benefit at the expense of everyone else — hence the declining labor force participation rates.

Democratic decision-making processes make things even worse, because voters can free ride at the expense of future generations — arguably the main reason why social security ever was popular. In contrast to government social security programs, elderly people in extended families do not retire early, and cannot claim false disability benefits. Even when they have a minor disability, they continue to contribute in other ways such as cooking, looking after younger children, etc.

Looking for Alternatives

Many people nowadays find it hard to see why anyone would have children for the sake of old-age security. Surely, they think, people have children just because they like it. Still, they often hear people say they would like to have more children, but they cannot afford it. Moreover, people in less developed countries seem to afford large families, even though their real incomes barely reach subsistence levels.

What can account for these seemingly conflicting observations? The fact that in the absence of social security, the extended family is an informal social insurance mechanism that renders childbearing economically beneficial. But in countries with large social security systems, people no longer have an old-age security motive for fertility, precisely because social security has made fertility economically unwise.

Of course, social security is not the only reason for declining fertility rates. For one thing, the welfare state undermines the family in many other ways too, such as compulsory public education that seeks to replace family loyalty with allegiance to the state. Moreover, the old-age security motive for fertility should become weaker when other ways of providing for old age become available. Well-defined property rights, legal certainty and freedom of contract are some of the key institutions that foster the development of savings institutions and financial markets, which in turn offer suitable savings and insurance vehicles.

However, the emergence of alternative savings media as such cannot undermine the family in the way compulsory social security does. The reason is that people do not have children just for the sake of old-age security; they have children because they like it, and in the absence of the welfare state, having children is also economically sound. It continues to be so with the advent of formal savings institutions.

Explanatory Power

Looking at some concrete examples reveals the striking ability of social security to explain differences in fertility rates. The most obvious case is the one between developed and less developed countries: all developed countries have total fertility rates far below 3, whereas African and Middle-Eastern countries, where social security systems are practically non-existent, reach rates between 4 and 7. In contrast, past communist countries, where the family was humiliated and disgraced to the utmost, go to the very bottom: some have fertility rates as low as 1.17 (Ukraine), 1.20 (Lithuania) and 1.21 (Czech Republic).[6]

One can also look at differences among the developed Western countries. Among these countries, there are practically no differences in infant mortality rates, female labor force participation rates, and other standard explanations of the fertility decline. Yet total fertility rates differ widely — and exactly in the way predicted by the size of social security systems. The United States has a fertility rate of 2.09, whereas the European Union has an average of 1.47.

Also within Europe, where social security benefits are dangerously generous, there are differences among countries. Some of the most generous schemes are found in Germany, France, and the Mediterranean countries — as are the lowest fertility rates in the region. On the surface, it is surprising to find this in countries that used to be family-oriented and fervently Catholic. However, economic incentives shape behavior, and behavior shapes culture.

Beware Fake Solutions

Old-age security is actually not a problem in a free society. The only problem is government activity that undermines natural communities and perverts economic incentives. Indeed, the very term "social security" is a misnomer: it is anti-social, and it does not provide real security. Social security benefits are nothing but political promises that can be changed — and will be changed — unilaterally by governments that find it convenient to do so. [7]

The presence of the state in retirement security creates calculational confusion, resource misallocation and mismanagement, and harmful free riding. The real cost of the current mess will be borne by future taxpayers, who are paying for the pensions of the present retirees but can expect to receive little in return.

The solution is not to create quasi-market solutions like individual retirement accounts managed by specialist companies, as was done in the Chilean model. That would not be good enough. [8] One problem with compulsory savings schemes is that they misallocate scarce resources to uses that are not optimal for many people. They also tend to be heavily regulated by the government and hence utterly inefficient. The chief motivation for compulsory savings schemes is that they promise great wealth to those who get to sit on the cash piles.

Retirement accounts are also inconsistent with freedom, because the very concept of "retirement" is a creation of the state. Before the establishment of government social security, few people would have contemplated a long period of idle leisure while waiting to die. In a free society, elderly people would continue to engage in various professional and non-professional activities throughout their lives.

Given current economic affluence, many people would of course leave their normal job when they grow older. But what is needed is to de-institutionalize retirement and let people decide for themselves. Let individuals and families make rational and responsible decisions that enable them to provide for their old-age needs. Maybe some will work like mad when young, and retreat into the country in their forties to live off fiction writing. Many others will choose to be housewives, looking after their children and receiving care and affection in return when they grow old.

Given the efficiency benefits of scrapping social security taxation, there would also be many more voluntary charities and mutual help societies to assist those who, through bad luck or some fault of their own, have no one else to look after them. These organizations would do a better job than any government agency will ever do, because they would be managed with entrepreneurial talent and run by people who really care.

The best solution is also the simplest: get the state out of the way.


--------------------------------------------------------------------------------

Oskari Juurikkala is a Research Fellow at the Institute of Economic Affairs in London. He was an O.P. Alford Fellow at the Mises Institute in 2002.

This article was written as part of a larger project on pension and social security reform, entitled Empowerment through Savings and funded by the Templeton Foundation.

Notes

[1] The proportion of males aged 60–64 who are still working has fallen from over 80% in the 1960s to around 50% in many countries. In France, Belgium, and Holland, fewer than 20% of 60–64 year-olds were still working in the mid-1990s. See Jonathan Gruber and David A. Wise (eds.), Social Security and Retirement Around the World, Chicago: University of Chicago Press, 1999.

[2] See Wikipedia's list of countries and territories by fertility rate.

[3] J.B. Nugent, "The old age security motive for fertility," Population and Development Review, 1985, Vol. 11, pp. 75–98.

[4] Total fertility rates (Y-axis) and social security taxes as percentage of GDP (X-axis) in 104 countries in 1997. Source: Michele Boldrin, Mariacristina De Nardi and Larry E. Jones, "Fertility and Social Security," NBER Working Paper No. 11146, 2005.

[5] Total fertility rates (Y-axis) and pension payments as percentage of GDP (X-axis) in eight European countries from 1960 to the present. Source: Ibid.

[6] See Wikipedia's list of countries and territories by fertility rate.

[7] See Philip Booth, "The Transition from Social Insecurity," Economic Affairs, 1998, Vol. 18, No. 1, pp. 2–12.

[8] See Dale Steinreich, "Social Security Reform: True and False," The Free Market, October 1996, Vol. 14, No. 10.

350
Science, Culture, & Humanities / Comet in southern hemisphere
« on: January 24, 2007, 05:21:40 AM »
Giant comet lights up skies
Catherine Boyle in Australia

A comet discovered by a Scottish astronomer has transformed southern hemisphere skies this week.

Thousands of people have gathered in Australia, New Zealand, Chile, Argentina and South Africa to watch Comet McNaught, the brightest comet seen from Earth in more than 40 years. It is one of very few comets that can be seen by the naked eye in daylight and is around 140 million kilometres (87 million miles) from the Earth.

The comet consists of a head bigger than Mount Everest and a tail that stretches 30 million kilometres into space.

The man who spotted the comet, Robert McNaught, 50, originally from Prestwick, Ayrshire, was working at Siding Spring Observatory in New South Wales when he first saw it last August.

It is so bright that some people in Auckland contacted the emergency services fearing that a plane had fallen out of the sky.