
Education

Crafty_Dog:
WSJ:

The Culture Gap
By BRINK LINDSEY
July 9, 2007; Page A15

Cut through all the statistical squid ink surrounding the issue of economic inequality, and you'll find a phenomenon that genuinely deserves public concern.

Over the past quarter-century or so, the return on human capital has risen significantly. Or to put it another way, the opportunity cost of failing to develop human capital is now much higher than it used to be. The wage premium associated with a college degree has jumped to around 70% in recent years from around 30% in 1980; the graduate degree premium has soared to over 100% from 50%. Meanwhile, dropping out of high school now all but guarantees socioeconomic failure.

In part this development is cause for celebration. Rising demand for analytical and interpersonal skills has been driving the change, and surely it is good news that economic signals now so strongly encourage the development of human talent. Yet -- and here is the cause for concern -- the supply of skilled people is responding sluggishly to the increased demand.

Despite the strong incentives, the percentage of people with college degrees has been growing only modestly. Between 1995 and 2005, the share of men with college degrees inched up to 29% from 26%. And the number of high school dropouts remains stubbornly high: The ratio of diplomas awarded to 17-year-olds has been stuck around 70% for three decades.

Something is plainly hindering the effectiveness of the market's carrots and sticks. And that something is culture.

Before explaining what I mean, let me go back to the squid ink and clarify what's not worrisome about the inequality statistics. For those who grind their ideological axes on these numbers, the increase in measured inequality since the 1970s is proof that the new, more competitive, more entrepreneurial economy of recent decades (which also happens to be less taxed and less unionized) has somehow failed to provide widespread prosperity. According to left-wing doom-and-gloomers, only an "oligarchy" at the very top is benefiting from the current system.

Hogwash. This argument can be disposed of with a simple thought experiment. First, picture the material standard of living you could have afforded back in 1979 with the median household income then of $16,461. Now picture the mix of goods and services you could buy in 2004 with the median income of $44,389. Which is the better deal? Only the most blinkered ideologue could fail to see the dramatic expansion of comforts, conveniences and opportunities that the contemporary family enjoys.

Much of the increase in measured inequality has nothing to do with the economic system at all. Rather, it is a product of demographic changes. Rising numbers of both single-parent households and affluent dual-earner couples have stretched the income distribution; so, too, has the big influx of low-skilled Hispanic immigrants. Meanwhile, in a 2006 paper published in the American Economic Review, economist Thomas Lemieux calculated that roughly three-quarters of the rise in wage inequality among workers with similar skills is due simply to the fact that the population is both older and better educated today than it was in the 1970s.

It is true that superstars in sports, entertainment and business now earn stratospheric incomes. But what is that to you and me? If the egalitarian left has been reduced to complaining that people in the 99th income percentile in a given year (and they're not the same people from year to year) are leaving behind those in the 90th percentile, it has truly arrived at the most farcical of intellectual dead ends.

Which brings us back to the real issue: the human capital gap, and the culture gap that impedes its closure. The most obvious and heartrending cultural deficits are those that produce and perpetuate the inner-city underclass. Consider this arresting fact: While the poverty rate nationwide is 13%, only 3% of adults with full-time, year-round jobs fall below the poverty line. Poverty in America today is thus largely about failing to get and hold a job, any job.

The problem is not lack of opportunity. If it were, the country wouldn't be a magnet for illegal immigrants. The problem is a lack of elementary self-discipline: failing to stay in school, failing to live within the law, failing to get and stay married to the mother or father of your children. The prevalence of all these pathologies reflects a dysfunctional culture that fails to invest in human capital.

Other, less acute deficits distinguish working-class culture from that of the middle and upper classes. According to sociologist Annette Lareau, working-class parents continue to follow the traditional, laissez-faire child-rearing philosophy that she calls "the accomplishment of natural growth." But at the upper end of the socioeconomic scale, parents now engage in what she refers to as "concerted cultivation" -- intensively overseeing kids' schoolwork and stuffing their after-school hours and weekends with organized enrichment activities.

This new kind of family life is often hectic and stressful, but it inculcates in children the intellectual, organizational and networking skills needed to thrive in today's knowledge-based economy. In other words, it makes unprecedented, heavy investments in developing children's human capital.

Consider these data from the National Education Longitudinal Study, an in-depth survey of educational achievement. Among students who received high scores in eighth grade mathematics (and thus showed academic promise), 74% of kids from the highest quartile of socioeconomic status (measured as a composite of parental education, occupations and family income) eventually earned a college degree. By contrast, the college graduation rate fell to 47% for kids from the middle two quartiles, and 29% for those in the bottom quartile. Perhaps more generous financial aid might affect those numbers at the margins, but at the core of these big differentials are differences in the values, skills and habits taught in the home.

Contrary to the warnings of the alarmist left, the increase in economic inequality does not mean the economic system isn't working properly. On the contrary, the system is delivering more opportunities for comfortable, challenging lives than our culture enables us to take advantage of. Far from underperforming, our productive capacity has now outstripped our cultural capacity.

Alas, there is no silver bullet for closing the culture gap. But the public institutions most directly responsible for human capital formation are the nation's schools, and it seems beyond serious dispute that in many cases they are failing to discharge their responsibilities adequately. Those interested in reducing meaningful economic inequality would thus be well advised to focus on education reform. And forget about adding new layers of bureaucracy and top-down controls. Real improvements will come from challenging the moribund state-school monopoly with greater competition.

Mr. Lindsey is vice president for research at the Cato Institute and author of the just-published book, "The Age of Abundance: How Prosperity Transformed America's Politics and Culture" (Collins, 2007).

Crafty_Dog:

Second post of the day:

"The eight Democratic presidential candidates assembled in Washington recently for another of their debates and talked, among other things, about public education. They all essentially agreed that it was underfunded -- one system 'for the wealthy, one for everybody else,' as John Edwards put it. Then they all got into cars and drove through a city where teachers are relatively well paid, per-pupil spending is through the roof and -- pay attention here -- the schools are among the very worst in the nation. When it comes to education, Democrats are ineducable.... [N]ot a one of them even whispered a word of outrage about a public school system that spends $13,000 per child -- third-highest among big-city school systems -- and produces pupils who score among the lowest in just about any category you can name. The only area in which the Washington school system is No. 1 is in money spent on administration. The litany of more and more when it comes to money often has little to do with what, in the military, are called facts on the ground: kids and parents. It does have a lot to do with teachers unions, which are strong supporters of the Democratic Party. Not a single candidate offered anything close to a call for real reform" -- Washington Post columnist Richard Cohen.

Crafty_Dog:
Harvard for Free
Higher education is about to change as elite universities decide what to do with their huge endowments.

BY FAY VINCENT
Thursday, December 13, 2007 12:01 a.m. EST

On Monday Harvard said that next year it will substantially increase its financial aid to middle-class students, bringing its actual tuition costs down to or even below those of some state universities. This is possible because of Harvard's--and other universities'--growing financial success, and it is a signal of far-reaching changes that will ripple throughout higher education.

Superb investment returns have been generated by managers of the endowments of some of the elite private universities, including Harvard and Yale, and even of small liberal arts colleges like Amherst and Williams. The endowments of these four institutions range from $1.7 billion at Amherst to $35 billion at Harvard, and the investment managers are getting annual returns well in excess of 20%. This is more than the alumni of any of those institutions could possibly contribute, and by an enormous margin.

In 1970, when I became a trustee of Williams, the endowment stood at about $35 million. Even using constant dollars, the growth in the endowment since then has been astonishing. As of June 30, 2007, it had reached approximately $1.9 billion.
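
For a rough sense of scale, here is a back-of-the-envelope sketch (purely illustrative, in Python) of the compound annual growth rate implied by those two figures. Note that the result blends investment returns with new gifts to the endowment, so it overstates pure investment performance:

    # Implied compound annual growth rate (CAGR) of the Williams endowment,
    # using the figures above: roughly $35 million in 1970 and roughly
    # $1.9 billion as of mid-2007. Nominal dollars; gifts and investment
    # returns are conflated in this calculation.

    start_value = 35e6          # endowment in 1970
    end_value = 1.9e9           # endowment as of June 30, 2007
    years = 2007 - 1970         # 37 years

    cagr = (end_value / start_value) ** (1 / years) - 1
    print(f"Implied nominal CAGR over {years} years: {cagr:.1%}")  # about 11.4%

Even that double-digit average is well below the 20%-plus annual returns cited above, which is consistent with the point that the dramatic gains came after the shift to diversified investment strategies.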

Much (but not all) of this growth is due to the major diversification in the investment mixture adopted by trustees of these schools, who realized some 30 years ago that sticking with the ancient formulae of stocks and bonds was no longer prudent. The change came about because the Sage of Omaha, Warren Buffett, persuaded Grinnell College in 1976 to invest some $13 million in a local TV station that he had identified as a golden opportunity.

Before then, boards at such places worried that nontraditional investments might raise legal issues, or subject them to criticism from alumni. But when the Buffett suggestion turned into a significant windfall of some $36 million for Grinnell in about five years, the rest of the endowment world got the point. I once asked Warren if he had planned to cause such a major switch in strategy. He assured me he had not. "I just saw it as a good buy," he said.

Now, however, these enormous endowments are beginning to raise some fascinating issues for all of higher education. The most obvious issue is whether these schools can seriously claim to have any further need for donations from alumni and friends.

And if, as seems likely, there is much less need for additional giving, does that not mean the administrations of these institutions can operate without the traditional checks and balances of informed alumni? The boards and administrations of the well-endowed schools can safely and proudly proclaim their independence.

In the past, it would have been impossible to ignore alumni. Perhaps an early indication of what I am raising is the recent tussle at Dartmouth over the number of trustees the alumni will be permitted to elect. There the administration has instituted a bylaw change that increases the number of trustees elected by the board itself, thereby diluting the power of the alumni.

In the present circumstances, the administration and boards of these schools now control the money because the endowment is managed by internally controlled entities. Accordingly, the most important voice at Yale would have to be the estimable and much-respected David Swensen, who has managed the Yale endowment to astonishing annual returns of over 20% for 10 years. Yale's endowment is about $22.5 billion. What does this mean for the future of governance at Yale? I wonder.

Similarly, these powerful investment returns will change tuition pricing and financial aid--and not just at Harvard. A scholar who follows these matters closely recently told me that he anticipates that the elite private colleges and universities will, in the not-too-distant future, stop charging tuition to any student whose annual family income is below the top 5% of all American families--currently around $200,000.

We have already seen competition among these schools of late, with "Free to $30,000" replaced by "Free to $40,000" and now "Free to $60,000." In fact, a recent announcement by Phillips Exeter Academy that it will offer a free boarding-school education to admitted students whose families earn $75,000 or less has raised the stakes for higher education.

If a "Free to $200,000" policy were to be enacted at my alma mater, Williams College, it would cost them only something like $15 million in net tuition revenue out of an operating budget of $200 million. At Harvard, the percentage contribution would be even less. Given the endowment performance at places like Williams and Harvard, they could easily adjust to the loss in tuition revenue. But what about all the lesser-endowed schools that are much more heavily dependant on tuition to maintain their financial stability? How can Fairfield University--where I have served as a trustee--possibly forego tuition to that extent?

What this means is that the educational Mercedes will cost less than the educational Ford. And when Harvard is cheaper than Fairfield, how can Fairfield raise tuition each year, when it will no longer have the umbrella of similar tuition increases being announced by places like Williams and Yale?

I suspect many of us have viewed a four-year college education as a commodity that is priced within a reasonably narrow range. In the past, the Fairfield cost was close to that at Williams. If, as is likely, the big guys drop tuition for all but the richest students, all this will change.

There is another aspect of the financial aid universe that will be affected by these changes in pricing. Currently, there are universities and colleges granting what are known as "merit scholarships." These are financial grants to students who have no demonstrated need.

The Ivies, and many well-endowed institutions, profess only to grant aid based on need. But in the present circumstances, merit grants are being used to tempt talented students away from the Ivies. Some students accept these grants, and decline admission offers at the very elite schools in order to save money for graduate school costs. Thus, Harvard and Williams may be losing attractive students for largely financial reasons. In those cases, the merit offers make money a solid reason to go to a school down the food chain.

If, as is likely, the big guys drop tuition, all this will change, too. And who can blame the elites for using what they have the most of--money and huge endowments?

Because there are so few of these super-rich schools, the effects of their changes in policies will be felt slowly. But like the change in investment strategy Warren Buffett innocently suggested some 30 years ago, the size and growth of their endowments will have significant and not easily anticipated consequences. The ripples of moves made in Cambridge and New Haven will be widely felt.

Mr. Vincent, a former commissioner of Major League Baseball, is the author of "The Only Game in Town: Baseball Stars of the 1930s and 1940s Talk About the Game They Loved" (Simon & Schuster, 2006), the first in a multivolume oral-history project.

WSJ

Crafty_Dog:
Defining Diversity Down
A proposal to make it easier to get into California colleges.

Wednesday, January 9, 2008 12:01 a.m. EST

The world gets more competitive every day, so why would California's education elites want to dumb down their public university admissions standards? The answer is to serve the modern liberal piety known as "diversity" while potentially thwarting the will of the voters.

The University of California Board of Admissions is proposing to lower the minimum grade point average for admission to a UC school to 2.8 from 3.0, a standard that has been in place for 40 years. Students would also no longer be required to take the SAT subject exams, which test knowledge of specific fields such as history and science.

UC Board of Admissions Chairman Mark Rashid says that, under this new system of "comprehensive review," the schools "can make a better and more fair determination of academic merit by looking at all the students' achievements." And it is true that test scores and grades do not take full account of the special talents of certain students. But the current system already leaves slots for students with specific skills, so if you think this change is about admitting more linebackers or piccolo players, you don't understand modern academic politics.

The plan would grant admissions officers more discretion to evade the ban on race and gender preferences imposed by California voters. Those limits became law when voters approved Proposition 209 in 1996, and state officials have been looking for ways around them ever since. "This appears to be a blatant attempt to subvert the law," says Ward Connerly, a former member of the University of California Board of Regents, who led the drive for 209. "Subjective admissions standards allow schools to substitute race and diversity for academic achievement."

One loser here would be the principle of merit-based college admissions. That principle has served the state well over the decades, helping to make some of its universities among the world's finest. Since 209, Asian-American students have done especially well, with the share of UCLA students of Asian ethnicity nearly doubling to 42% from 22%. Immigrants and the children of immigrants now outnumber native-born whites in most UC schools, so being a member of an ethnic minority is clearly not an inherent admissions handicap. Ironically, objective testing criteria were first introduced in many university systems, including California's, precisely to weed out discrimination that favored children of affluent alumni over higher-performing students.

The other big loser would be the overall level of achievement demanded in California's public elementary and high schools. A recent study by the left-leaning Institute for Democracy, Education and Access at UCLA, the "California Educational Opportunity Report 2007," finds that "California lags behind most other states in providing fundamental learning conditions as well as in student outcomes." In 2005 California ranked 48th among states in the percentage of high-school kids who attend college. Only Mississippi and Arizona ranked lower.

The UCLA study documents that the educational achievement gap between black and Latino children and their white and Asian peers is widening in California at a troubling pace. Graduation rates are falling fastest for blacks and Latinos, as many of them are stuck in the state's worst public schools. The way to close that gap is by introducing more accountability and choice to raise achievement standards--admittedly hard work, especially because it means taking on the teachers unions.

Instead, the UC Board of Admissions proposal sounds like a declaration of academic surrender. It's one more depressing signal that liberal elites have all but given up on poor black and Hispanic kids. Because they don't think closing the achievement gap is possible, their alternative is to reduce standards for everyone. Diversity so trumps merit in the hierarchy of modern liberal values that they're willing to dumb down the entire university system to guarantee what they consider a proper mix of skin tones on campus.

A decade ago, California voters made clear that they prefer admissions standards rooted in the American tradition of achievement. In the months ahead, the UC Board of Regents will have to decide which principle to endorse, and its choice will tell us a great deal about the future path of American society.


Crafty_Dog:
How dumb can we get?

It's bad enough that Americans are increasingly ignorant about science, art, history, and geography. What's frightening, says author Susan Jacoby, is that we're proud of it.

"The mind of this country, taught to aim at low objects, eats upon itself." Ralph Waldo Emerson offered that observation in 1837, but his words echo with painful prescience in today's very different United States. Americans are in serious intellectual trouble—in danger of losing our hard-won cultural capital to a virulent mixture of anti-intellectualism, anti-rationalism, and low expectations.

This is the last subject that any candidate would dare raise on the long and winding road to the White House. It is almost impossible to talk about the manner in which public ignorance contributes to grave national problems without being labeled an "elitist," one of the most powerful pejoratives that can be applied to anyone aspiring to high office. Instead, our politicians repeatedly assure Americans that they are just "folks," a patronizing term that you will search for in vain in important presidential speeches before 1980. (Just imagine: "We here highly resolve that these dead shall not have died in vain ... and that government of the folks, by the folks, for the folks, shall not perish from the earth.") Such exaltations of ordinariness are among the distinguishing traits of anti-intellectualism in any era.

The classic work on this subject by Columbia University historian Richard Hofstadter, Anti-Intellectualism in American Life, was published in early 1963, between the anti-communist crusades of the McCarthy era and the social convulsions of the late 1960s. Hofstadter saw American anti-intellectualism as a basically cyclical phenomenon that often manifested itself as the dark side of the country's democratic impulses in religion and education. But today's brand of anti-intellectualism is less a cycle than a flood. If Hofstadter (who died of leukemia in 1970 at age 54) had lived long enough to write a modern-day sequel, he would have found that our era of 24/7 infotainment has outstripped his most apocalyptic predictions about the future of American culture.

Dumbness, to paraphrase the late Sen. Daniel Patrick Moynihan, has been steadily defined downward for several decades, by a combination of heretofore irresistible forces. These include the triumph of video culture over print culture (and by video, I mean every form of digital media, as well as older electronic ones); a disjunction between Americans' rising level of formal education and their shaky grasp of basic geography, science, and history; and the fusion of anti-rationalism with anti-intellectualism.

***

First and foremost among the vectors of the new anti-intellectualism is video. The decline of book, newspaper, and magazine reading is by now an old story. The drop-off is most pronounced among the young, but it continues to accelerate and afflict Americans of all ages and education levels.

Reading has declined not only among the poorly educated, according to a report last year by the National Endowment for the Arts. In 1982, 82 percent of college graduates read novels or poems for pleasure; two decades later, only 67 percent did. And more than 40 percent of Americans under 44 did not read a single book—fiction or nonfiction—over the course of a year. The proportion of 17-year-olds who read nothing (unless required to do so for school) more than doubled between 1984 and 2004. This time period, of course, encompasses the rise of personal computers, Web surfing, and videogames.

Does all this matter? Technophiles pooh-pooh jeremiads about the end of print culture as the navel-gazing of (what else?) elitists. In his book Everything Bad Is Good for You: How Today's Popular Culture Is Actually Making Us Smarter, the science writer Steven Johnson assures us that we have nothing to worry about. Sure, parents may see their "vibrant and active children gazing silently, mouths agape, at the screen." But these zombie-like characteristics "are not signs of mental atrophy. They're signs of focus." Balderdash. The real question is what toddlers are screening out, not what they are focusing on, while they sit mesmerized by videos they have seen dozens of times.

Despite an aggressive marketing campaign aimed at encouraging babies as young as 6 months to watch videos, there is no evidence that focusing on a screen is anything but bad for infants and toddlers. In a study released last August, University of Washington researchers found that babies between 8 and 16 months recognized an average of six to eight fewer words for every hour spent watching videos.

I cannot prove that reading for hours in a treehouse (which is what I was doing when I was 13) creates more informed citizens than hammering away at a Microsoft Xbox or obsessing about Facebook profiles. But the inability to concentrate for long periods of time—as distinct from brief reading hits for information on the Web—seems to me intimately related to the inability of the public to remember even recent news events. It is not surprising, for example, that less has been heard from the presidential candidates about the Iraq war in the later stages of the primary campaign than in the earlier ones, simply because there have been fewer video reports of violence in Iraq. Candidates, like voters, emphasize the latest news, not necessarily the most important news.

No wonder negative political ads work. "With text, it is even easy to keep track of differing levels of authority behind different pieces of information," the cultural critic Caleb Crain noted recently in The New Yorker. "A comparison of two video reports, on the other hand, is cumbersome. Forced to choose between conflicting stories on television, the viewer falls back on hunches, or on what he believed before he started watching."

As video consumers become progressively more impatient with the process of acquiring information through written language, all politicians find themselves under great pressure to deliver their messages as quickly as possible—and quickness today is much quicker than it used to be. Harvard University's Kiku Adatto found that between 1968 and 1988, the average sound bite on the news for a presidential candidate—featuring the candidate's own voice—dropped from 42.3 seconds to 9.8 seconds. By 2000, according to another Harvard study, the daily candidate bite was down to just 7.8 seconds.

***

The shrinking public attention span fostered by video is closely tied to the second important anti-intellectual force in American culture: the erosion of general knowledge.

People accustomed to hearing their president explain complicated policy choices by snapping "I'm the decider" may find it almost impossible to imagine the pains that Franklin D. Roosevelt took, in the grim months after Pearl Harbor, to explain why U.S. armed forces were suffering one defeat after another in the Pacific. In February 1942, Roosevelt urged Americans to spread out a map during his radio "fireside chat" so that they might better understand the geography of battle. In stores throughout the country, maps sold out; about 80 percent of American adults tuned in to hear the president. FDR had told his speechwriters that he was certain that if Americans understood the immensity of the distances over which supplies had to travel to the armed forces, "they can take any kind of bad news right on the chin."

This is a portrait not only of a different presidency and president but also of a different country and citizenry, one that lacked access to satellite-enhanced Google maps but was far more receptive to learning and complexity than today's public. According to a 2006 survey by National Geographic–Roper, nearly half of Americans between ages 18 and 24 do not think it necessary to know the location of other countries in which important news is being made. More than a third consider it "not at all important" to know a foreign language, and only 14 percent consider it "very important."

***

That leads us to the third and final factor behind the new American dumbness: not lack of knowledge per se but arrogance about that lack of knowledge. The problem is not just the things we do not know (consider the one in five American adults who, according to the National Science Foundation, thinks the sun revolves around the Earth); it's the alarming number of Americans who have smugly concluded that they do not need to know such things in the first place. Call this anti-rationalism—a syndrome that is particularly dangerous to our public institutions and discourse. Not knowing a foreign language or the location of an important country is a manifestation of ignorance; denying that such knowledge matters is pure anti-rationalism. The toxic brew of anti-rationalism and ignorance hurts discussions of U.S. public policy on topics from health care to taxation.

***

There is no quick cure for this epidemic of arrogant anti-rationalism and anti-intellectualism; rote efforts to raise standardized test scores by stuffing students with specific answers to specific questions on specific tests will not do the job. Moreover, the people who exemplify the problem are usually oblivious to it. ("Hardly anyone believes himself to be against thought and culture," Hofstadter noted.) It is past time for a serious national discussion about whether, as a nation, we truly value intellect and rationality. If this indeed turns out to be a "change election," the low level of discourse in a country with a mind taught to aim at low objects ought to be the first item on the change agenda.

Susan Jacoby's new book is The Age of American Unreason.
