Fire Hydrant of Freedom


Title: Physics & Mathematics
Post by: Crafty_Dog on March 11, 2007, 07:27:20 AM
Dark Energy
NY Times

Three days after learning that he won the 2006 Nobel Prize in Physics, George Smoot was talking about the universe. Sitting across from him in his office at the University of California, Berkeley, was Saul Perlmutter, a fellow cosmologist and a probable future Nobelist in Physics himself. Bearded, booming, eyes pinwheeling from adrenaline and lack of sleep, Smoot leaned back in his chair. Perlmutter, onetime acolyte, longtime colleague, now heir apparent, leaned forward in his.

“Time and time again,” Smoot shouted, “the universe has turned out to be really simple.”

Perlmutter nodded eagerly. “It’s like, why are we able to understand the universe at our level?”

“Right. Exactly. It’s a universe for beginners! ‘The Universe for Dummies’!”

But as Smoot and Perlmutter know, it is also inarguably a universe for Nobelists, and one that in the past decade has become exponentially more complicated. Since the invention of the telescope four centuries ago, astronomers have been able to figure out the workings of the universe simply by observing the heavens and applying some math, and vice versa. Take the discovery of moons, planets, stars and galaxies, apply Newton’s laws and you have a universe that runs like clockwork. Take Einstein’s modifications of Newton, apply the discovery of an expanding universe and you get the big bang. “It’s a ridiculously simple, intentionally cartoonish picture,” Perlmutter said. “We’re just incredibly lucky that that first try has matched so well.”

But is our luck about to run out? Smoot’s and Perlmutter’s work is part of a revolution that has forced their colleagues to confront a universe wholly unlike any they have ever known, one that is made of only 4 percent of the kind of matter we have always assumed it to be — the material that makes up you and me and this magazine and all the planets and stars in our galaxy and in all 125 billion galaxies beyond. The rest — 96 percent of the universe — is ... who knows?

“Dark,” cosmologists call it, in what could go down in history as the ultimate semantic surrender. This is not “dark” as in distant or invisible. This is “dark” as in unknown for now, and possibly forever.

If so, such a development would presumably not be without philosophical consequences of the civilization-altering variety. Cosmologists often refer to this possibility as “the ultimate Copernican revolution”: not only are we not at the center of anything; we’re not even made of the same stuff as most of the rest of everything. “We’re just a bit of pollution,” Lawrence M. Krauss, a theorist at Case Western Reserve, said not long ago at a public panel on cosmology in Chicago. “If you got rid of us, and all the stars and all the galaxies and all the planets and all the aliens and everybody, then the universe would be largely the same. We’re completely irrelevant.”

All well and good. Science is full of homo sapiens-humbling insights. But the trade-off for these lessons in insignificance has always been that at least now we would have a deeper — simpler — understanding of the universe. That the more we could observe, the more we would know. But what about the less we could observe? What happens to new knowledge then? It’s a question cosmologists have been asking themselves lately, and it might well be a question we’ll all be asking ourselves soon, because if they’re right, then the time has come to rethink a fundamental assumption: When we look up at the night sky, we’re seeing the universe.

Not so. Not even close.

In 1964, two scientists at Bell Labs in New Jersey discovered a microwave signal that came from every direction of the heavens. Theorists at nearby Princeton University soon realized that this signal might be the echo from the beginning of the universe, as predicted by the big-bang hypothesis. Take the idea of a cosmos born in a primordial fireball and cooling down ever since, apply the discovery of a microwave signal with a temperature that corresponded precisely to the one that was predicted by theorists — 2.7 degrees above absolute zero — and you have the universe as we know it. Not Newton’s universe, with its stately, eternal procession of benign objects, but Einstein’s universe, violent, evolving, full of births and deaths, with the grandest birth and, maybe, death belonging to the cosmos itself.

But then, in the 1970s, astronomers began noticing something that didn’t seem to fit with the laws of physics. They found that spiral galaxies like our own Milky Way were spinning at such a rate that they should have long ago wobbled out of control, shredding apart, shedding stars in every direction. Yet clearly they had done no such thing. They were living fast but not dying young. This seeming paradox led theorists to wonder if a halo of a hypothetical something else might be cocooning each galaxy, dwarfing each flat spiral disk of stars and gas at just the right mass ratio to keep it gravitationally intact. Borrowing a term from the astronomer Fritz Zwicky, who detected the same problem with the motions of a whole cluster of galaxies back in the 1930s, decades before anyone else took the situation seriously, astronomers called this mystery mass “dark matter.”

So there was more to the universe than meets the eye. But how much more? This was the question Saul Perlmutter’s team at Lawrence Berkeley National Laboratory set out to answer in the late 1980s. Actually, they wanted to settle an issue that had been nagging astronomers ever since Edwin Hubble discovered in 1929 that the universe seems to be expanding. Gravity, astronomers figured, would be slowing the expansion, and the more matter the greater the gravitational effect. But was the amount of matter in the universe enough to slow the expansion until it eventually stopped, reversed course and collapsed in a backward big bang? Or was the amount of matter not quite enough to do this, in which case the universe would just go on expanding forever? Just how much was the expansion of the universe slowing down?

The tool the team would be using was a specific type of exploding star, or supernova, that reaches a roughly uniform brightness and so can serve as what astronomers call a standard candle. By comparing how bright supernovae appear and how much the expansion of the universe has shifted their light, cosmologists sought to determine the rate of the expansion. “I was trying to tell everybody that this is the measurement that everybody should be doing,” Perlmutter says. “I was trying to convince them that this is going to be the tool of the future.” Perlmutter talks like a microcassette on fast-forward, and he possesses the kind of psychological dexterity that allows him to walk into a room and instantly inhabit each person’s point of view. He can be as persuasive as any force of nature. “The next thing I know,” he says, “we’ve convinced people, and now they’re competing with us!”
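
The standard-candle arithmetic is just the inverse-square law run backward. A minimal sketch in Python, assuming the textbook peak brightness for a Type Ia supernova and an illustrative apparent magnitude (neither figure is either team's actual data):

M_ABS = -19.3  # assumed peak absolute magnitude of a Type Ia supernova

def luminosity_distance_mpc(apparent_mag):
    # Invert the distance modulus: m - M = 5*log10(d_pc) - 5
    d_pc = 10 ** ((apparent_mag - M_ABS + 5) / 5)
    return d_pc / 1.0e6  # parsecs to megaparsecs

# A supernova that peaks at apparent magnitude 24 lies roughly:
print(round(luminosity_distance_mpc(24.0)), "Mpc")  # about 4,600 Mpc
# Pair that distance with the redshift of the host galaxy and you have one
# data point on the universe's expansion history.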

By 1997, Perlmutter’s Supernova Cosmology Project and a rival team had amassed data from more than 50 supernovae between them — data that would reveal yet another oddity in the cosmos. Perlmutter noticed that the supernovae weren’t brighter than expected but dimmer. He wondered if he had made a mistake in his observations. A few months later, Adam Riess, a member of a rival international team, noticed the same general drift in his math and wondered the same thing. “I’m a postdoc,” he told himself. “I’m sure I’ve messed up in at least 10 different ways.” But Perlmutter double-checked for intergalactic dust that might have skewed his readings, and Riess cross-checked his math, calculation by calculation, with his team leader, Brian Schmidt. Early in 1998, the two teams announced that they had each independently reached the same conclusion, and it was the opposite of what either of them expected. The rate of the expansion of the universe was not slowing down. Instead, it seemed to be speeding up.

That same year, Michael Turner, the prominent University of Chicago theorist, delivered a paper in which he called this antigravitational force “dark energy.” The purpose of calling it “dark,” he explained recently, was to highlight the similarity to dark matter. The purpose of “energy” was to make a distinction. “It really is very different from dark matter,” Turner said. “It’s more energylike.”

More energylike how, exactly?

Turner raised his eyebrows. “I’m not embarrassed to say it’s the most profound mystery in all of science.”

“Extraordinary claims,” Carl Sagan once said, “require extraordinary evidence.” Astronomers love that saying; they quote it all the time. In this case the claim could have hardly been more extraordinary: a new universe was dawning.

It wouldn’t be the first time. We once thought the night sky consisted of the several thousand objects we could see with the naked eye. But the invention of the telescope revealed that it didn’t, and that the farther we saw, the more we saw: planets, stars, galaxies. After that we thought the night sky consisted of only the objects the eye could see with the assistance of telescopes that reached all the way back to the first stars blinking to life. But the discovery of wavelengths beyond the optical revealed that it didn’t, and that the more we saw in the radio or infrared or X-ray parts of the electromagnetic spectrum, the more we discovered: evidence for black holes, the big bang and the distances of supernovae, for starters.

The difference with “dark,” however, is that it lies not only outside the visible but also beyond the entire electromagnetic spectrum. By all indications, it consists of data that our five senses can’t detect other than indirectly. The motions of galaxies don’t make sense unless we infer the existence of dark matter. The brightness of supernovae doesn’t make sense unless we infer the existence of dark energy. It’s not that inference can’t be a powerful tool: an apple falls to the ground, and we infer gravity. But it can also be an incomplete tool: gravity is ... ?

Dark matter is ... ? In the three decades since most astronomers decisively, if reluctantly, accepted the existence of dark matter, observers have eliminated the obvious answer: that dark matter is made of normal matter that is so far away or so dim that it can’t be seen from earth. To account for the dark-matter deficit, this material would have to be so massive and so numerous that we couldn’t possibly miss it.

Which leaves abnormal matter, or what physicists call nonbaryonic matter, meaning that it doesn’t consist of the protons and neutrons of “normal” matter. What’s more (or, perhaps more accurately, less), it doesn’t interact at all with electricity or magnetism, which is why we wouldn’t be able to see it, and it can rarely interact even with protons and neutrons, which is why trillions of these particles might be passing through you every second without your knowing it. Theorists have narrowed the search for dark-matter particles to two hypothetical candidates: the axion and the neutralino. But so far efforts to create one of these ghostly particles in accelerators, which mimic the high levels of energy in the first fraction of a second after the birth of the universe, have come up empty. So have efforts to catch one in ultrasensitive detectors, which number in the dozens around the world.

For now, dark-matter physicists are hanging their hopes on the Large Hadron Collider, the latest-generation subatomic-particle accelerator, which goes online later this year at the European Center for Nuclear Research on the Franco-Swiss border. Many cosmologists think that the L.H.C. has made the creation of a dark-matter particle — as George Smoot said, holding up two fingers — “this close.” But one of the pioneer astronomers investigating dark matter in the 1970s, Vera Rubin, says that she has lived through plenty of this kind of optimism; she herself predicted in 1980 that dark matter would be identified within a decade. “I hope he’s right,” she says of Smoot’s assertion. “But I think it’s more a wish than a belief.” As one particle physicist commented at a “Dark Universe” symposium at the Space Telescope Science Institute in Baltimore a few years ago, “If we fail to see anything in the L.H.C., then I’m off to do something else,” adding, “Unfortunately, I’ll be off to do something else at the same time as hundreds of other physicists.”

Juan Collar might be among them. “I know I speak for a generation of people who have been looking for dark-matter particles since they were grad students,” he said one wintry afternoon in his University of Chicago office. “I doubt how many of us will remain in the field if the L.H.C. brings home bad news. I have been looking for dark-matter particles for more than 15 years. I’m 42. So most of my colleagues, my age, we are kind of going through a midlife crisis.” He laughed. “When we get together and we drink enough beer, we start howling at the moon.”

Although many scientists say that the existence of the axion will be proved or disproved within the next 10 years — as a result of work at Lawrence Livermore National Laboratory — the detection of a neutralino one way or the other is much less certain. A negative result from an experiment might mean only that theorists haven’t thought hard enough or that observers haven’t looked deep enough. “It could very well be that Mother Nature has decided that the neutralino is way down there,” Collar said, pointing not to a graph that he taped up in his office but to a point below the sheet of paper itself, at the blank wall. “If that is the case,” he went on to say, “we should retreat and worship Mother Nature. These particles maybe exist, but we will not see them, our sons will not see them and their sons won’t see them.”

Title: Dark Energy Part Two
Post by: Crafty_Dog on March 11, 2007, 07:28:05 AM

The challenge with dark energy, as opposed to dark matter, is even more difficult. Dark energy is whatever it is that’s making the expansion of the universe accelerate, but, for instance, does it change over time and space? If so, then cosmologists have a name for it: quintessence. Does it not change? In that case, they’ll call it the cosmological constant, a version of the mathematical fudge factor that Einstein originally inserted into the equations for relativity to explain why the universe had neither expanded nor contracted itself out of existence.

After the discovery of dark energy, Perlmutter concluded that the next generation of dark-energy telescopes would have to include a space-based observatory. But the search for financing for such an ambitious project can require as much forbearance as the search for dark energy itself. “I don’t think I’ve ever seen as much of Washington as I have in the last few years,” he says, sighing. Even if his Supernova Acceleration Probe didn’t now face competition from several other proposals for federal financing (including, perhaps inevitably, one involving his old rival Riess), delays have prevented it from being ready to launch until at least the middle of the next decade. “Ten years from now,” says Josh Frieman of the University of Chicago, “when we’re talking about spending on the order of a billion dollars to put something up in space — which I think we should do — you’re getting into that class where you’re spending real money.”

Even some cosmologists have begun to express reservations. At a conference at Durham University in England last summer, a “whither cosmology?” panel featuring some of the field’s most prominent names questioned the wisdom of concentrating so much money and manpower on one problem. They pointed to what happened when the government-sponsored Dark Energy Task Force solicited proposals for experiments a couple of years ago. The task force was expecting a dozen, according to one member. They got three dozen. Cosmology was choosing a “risky and not very cost-effective way of moving forward,” one Durham panelist told me later, summarizing the sentiment he heard there.

But even if somebody were to figure out whether or not dark energy changes across time and space, astronomers still wouldn’t know what dark energy itself is. “The term doesn’t mean anything,” said David Schlegel of Lawrence Berkeley National Laboratory this past fall. “It might not be dark. It might not be energy. The whole name is a placeholder. It’s a placeholder for the description that there’s something funny that was discovered eight years ago now that we don’t understand.” Not that theorists haven’t been trying. “It’s just nonstop,” Perlmutter told me. “There’s article after article after article.” He likes to begin public talks with a PowerPoint illustration: papers on dark energy piling up, one on top of the next, until the on-screen stack ascends into the dozens. All the more reason not to put all of cosmology’s eggs into one research basket, argued the Durham panelists. As one summarized the situation, “We don’t even have a hypothesis to test.”

Michael Turner won’t hear of it. “This is one of these godsend problems!” he says. “If you’re a scientist, you’d like to be around when there’s a great problem to work on and solve. The solution is not obvious, and you could imagine it being solved tomorrow, you could imagine it taking another 10 years or you could imagine it taking another 200 years.”

But you could also imagine it taking forever.

“Time to get serious.” The PowerPoint slide, teal letters popping off a black background, stared back at a hotel ballroom full of cosmologists. They gathered in Chicago last winter for a “New Views of the Universe” conference, and Sean Carroll, then at the University of Chicago, had taken it upon himself to give his theorist colleagues their marching orders.

“There was a heyday for talking out all sorts of crazy ideas,” Carroll, now at Caltech, recently explained. That heyday would have been the heady, post-1998 period when Michael Turner might stand up at a conference and turn to anyone voicing caution and say, “Can’t we be exuberant for a while?” But now has come the metaphorical morning after, and with it a sobering realization: Maybe the universe isn’t simple enough for dummies like us humans. Maybe it’s not just our powers of perception that aren’t up to the task but also our powers of conception. Extraordinary claims like the dawn of a new universe might require extraordinary evidence, but what if that evidence has to be literally beyond the ordinary? Astronomers now realize that dark matter probably involves matter that is nonbaryonic. And whatever it is that dark energy involves, we know it’s not “normal,” either. In that case, maybe this next round of evidence will have to be not only beyond anything we know but also beyond anything we know how to know.

That possibility always gnaws at scientists — what Perlmutter calls “that sense of tentativeness, that we have gotten so far based on so little.” Cosmologists in particular have had to confront that possibility throughout the birth of their science. “At various times in the past 20 years it could have gotten to the point where there was no opportunity for advance,” Frieman says. What if, for instance, researchers couldn’t repeat the 1964 Bell Labs detection of the supposed echo from the big bang? Smoot and John C. Mather of NASA (who shared the Nobel in Physics with Smoot) designed the Cosmic Background Explorer satellite telescope to do just that. COBE looked for extremely subtle differences in temperature throughout all of space that carry the imprint of the universe when it was less than a second old. And in 1992, COBE found them: in effect, the quantum fluctuations that 13.7 billion years later would coalesce into a universe that is 22 percent dark matter, 74 percent dark energy and 4 percent the stuff of us.

And if the right ripples hadn’t shown up? As Frieman puts it: “You just would have thrown up your hands and said, ‘My God, we’ve got to go back to the drawing board!’ What’s remarkable to me is that so far that hasn’t happened.”

Yet in a way it has. In the observation-and-theory, call-and-response system of investigating nature that scientists have refined over the past 400 years, the dark side of the universe represents a disruption. General relativity helped explain the observations of the expanding universe, which led to the idea of the big bang, which anticipated the observations of the cosmic-microwave background, which led to the revival of Einstein’s cosmological constant, which anticipated the observations of supernovae, which led to dark energy. And dark energy is ... ?

The difficulty in answering that question has led some cosmologists to ask an even deeper question: Does dark energy even exist? Or is it perhaps an inference too far? Cosmologists have another saying they like to cite: “You get to invoke the tooth fairy only once,” meaning dark matter, “but now we have to invoke the tooth fairy twice,” meaning dark energy.

One of the most compelling arguments that cosmologists have for the existence of dark energy (whatever it is) is that unlike earlier inferences that physicists eventually had to abandon — the ether that 19th-century physicists thought pervaded space, for instance — this inference makes mathematical sense. Take Perlmutter’s and Riess’s observations of supernovae, apply one cornerstone of 20th-century physics, general relativity, and you have a universe that does indeed consist of .26 matter, dark or otherwise, and .74 something that accelerates the expansion. Yet in another way, dark energy doesn’t add up. Take the observations of supernovae, apply the other cornerstone of 20th-century physics, quantum theory, and you get gibberish — you get an answer 120 orders of magnitude larger than .74.
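
Where does a number that big come from? In rough terms, quantum theory with its natural Planck-scale cutoff predicts a vacuum energy density near the Planck density, while the supernova data imply something vastly smaller. A back-of-envelope sketch, order-of-magnitude values only (the "120" is the conventional rounding):

$$\frac{\rho_\Lambda^{\text{predicted}}}{\rho_\Lambda^{\text{observed}}} \sim \frac{10^{93}\ \text{g/cm}^3}{10^{-29}\ \text{g/cm}^3} \sim 10^{122}$$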

Which doesn’t mean that dark energy is the ether of our age. But it does mean that its implications extend beyond cosmology to a problem Einstein spent the last 30 years of his life trying to reconcile: how to unify his new physics of the very large (general relativity) with the new physics of the very small (quantum mechanics). What makes the two incompatible — where the physics breaks down — is gravity.

In physics, gravity is the ur-inference. Even Newton admitted that he was making it up as he went along. That a force of attraction might exist between two distant objects, he once wrote in a letter, is “so great an Absurdity that I believe no Man who has in philosophical Matters a competent Faculty of thinking can ever fall into it.” Yet fall into it we all do on a daily basis, and physicists are no exception. “I don’t think we really understand what gravity is,” Vera Rubin says. “So in some sense we’re doing an awful lot on something we don’t know much about.”

It hasn’t escaped the notice of astronomers that both dark matter and dark energy involve gravity. Early this year 50 physicists gathered for a “Rethinking Gravity” conference at the University of Arizona to discuss variations on general relativity. “So far, Einstein is coming through with flying colors,” says Sean Carroll, who was one of the gravity-defying participants. “He’s always smarter than you think he was.”

But he’s not necessarily inviolate. “We’ve never tested gravity across the whole universe before,” Riess pointed out during a news conference last year. “It may be that there’s not really dark energy, that that’s a figment of our misperception about gravity, that gravity actually changes the way it operates on long ranges.”

The only way out, cosmologists and particle physicists agree, would be a “new physics” — a reconciliation of general relativity and quantum mechanics. “Understanding dark energy,” Riess says, “seems to really require understanding and using both of those theories at the same time.”

“It’s been so hard that we’re even willing to consider listening to string theorists,” Perlmutter says, referring to work that posits numerous dimensions beyond the traditional (one of time and three of space). “They’re at least providing a language in which you can talk about both things at the same time.”

According to quantum theory, particles can pop into and out of existence. In that case, maybe the universe itself was born in one such quantum pop. And if one universe can pop into existence, then why not many universes? String theorists say that number could be 10 raised to the power of 500. That is a 1 followed by 500 zeros’ worth of universes, give or take. In which case, our universe would just happen to be the one with an energy density of .74, a condition suitable for the existence of creatures that can contemplate their hyper-Copernican existence.

And this is just one of a number of theories that have been popping into existence, quantum-particle-like, in the past few years: parallel universes, intersecting universes or, in the case of Stephen Hawking and Thomas Hertog just last summer, a superposition of universes. But what evidence — extraordinary or otherwise — can anyone offer for such claims? The challenge is to devise an experiment that would do for a new physics what COBE did for the big bang. Predictions in string theory, as in the 10-to-the-power-of-500-universes hypothesis, depend on the existence of extra dimensions, a stipulation that just might put the burden back on particle physics — specifically, the hope that evidence of extra dimensions will emerge in the Large Hadron Collider, or perhaps in its proposed successor, the International Linear Collider, which might come online sometime around 2020, or maybe in the supercollider after that, if the industrial nations of 2030 decide they can afford it.

“You want your mind to be boggled,” Perlmutter says. “That is a pleasure in and of itself. And it’s more a pleasure if it’s boggled by something that you can then demonstrate is really, really true.”

And if you can’t demonstrate that it’s really, really true?

“If the brilliant idea doesn’t come along,” Riess says, “then we will say dark energy has exactly these properties, it acts exactly like this. And then” — a shrug — “we will put it in a box.” And there it will remain, residing perhaps not far from the box labeled “Dark Matter,” and the two of them bookending the biggest box of them all, “Gravity,” to await a future Newton or Einstein to open — or not.
Title: massive black holes
Post by: ccp on April 21, 2007, 07:01:57 PM
I was watching the
"Science" channel on cable the other night.  They had a show on supermassive black holes.  I didn't realize that present theory holds that there is a black hole in every galaxy and is in some way related to the clustering of the stars in that galaxy.  It is also theorized that quasars are also related to supermassive black holes.

I remember in my astronomy classes in the 70's (ugh!) that quasars were the farthest objects in the universe and there was absolutely no explanation as to what they were.  A lot of discovery has happened since then.  A lot of theories have been formulated.

Yet every time I read about space I am left with this empty feeling.   I feel like we will never be able to understand "where it all began".   It seems unanswerable.  It seems incomprehensible.  Should this thread be headed under religion or God?   But to me the concept of God doesn't really answer the great questions since the beginning of man.   But it is more comforting.

This link is not to the particular show but to another space site which came up today on a news link:

http://www.space.com/bestimg/index.php?guid=4499b3474b769&cat=strangest

Title: The Shadow goes
Post by: Crafty_Dog on June 20, 2007, 07:01:37 AM
The Shadow Goes
By MARGARET WERTHEIM
Published: June 20, 2007
NY Times


On Thursday, on the summer solstice, the Sun will celebrate the year’s lazy months by resting on the horizon. The word solstice derives from the Latin “sol” (sun) and “sistere” (to stand still). The day marks the sun’s highest point in the sky, the moment when our shadows shrink to their shortest length of the year. How strange to think that these mundane friends, our ever-present familiars, can actually go faster than the sun’s rays.

I remarked on this recently to my husband as we sat on the porch with our shadows pooling by our chairs. Nothing can go faster than light, he insisted, expressing what is surely the most widely known law of physics, ingrained into us by a thousand “Nova” programs.

That is the point, I explained: Nothing can go faster than light. A shadow isn’t a thing. It’s a non-thing. It’s the absence of light.

Special relativity dictates that we cannot move anything more quickly than the particles of light known as photons, but no law says you can’t do nothing faster than light. Physicists have known this for a long time, even if they generally do not mention it on PBS documentaries.

My husband looked troubled, as did my sister and some friends I regaled with the story that evening. Like the warp drive on “Star Trek,” faster-than-light travel is supposed to be a science-fiction fantasy. Isn’t it?

They are right about the travel: According to relativity, no physical substance can exceed the speed of light because it would take infinite energy to accelerate anything to such a velocity.

Yet the laws of physics pertain only to that which is. That which isn’t is not bound by relativity’s restraint. From the point of view of relativity, a shadow (having no mass) is a non-thing, an existential void.

It’s quite easy to conjure up a faster-than-light shadow, at least in theory. Build a great klieg light, a superstrong version of the ones set up at the Academy Awards. Now paste a piece of black paper onto the klieg’s glass so there is a shadow in the middle of the beam, like the signal used to summon Batman. And we are going to mount our light in space and broadcast the Bat-call to the cosmos.

The key to our trick is to rotate the klieg. As the light turns, the bat shadow sweeps across the sky. Round and round it goes, projecting into the void. Just as the rim of a bicycle wheel moves faster than its hub, so too, away from the source, our bat shadow will fly faster and faster, a consequence of the geometry that guarantees the rim of a really big wheel moves faster than a co-rotating small wheel.

At a great enough distance from the source, our shadow bat will go so fast it will exceed the speed of light. This does not violate relativity because a shadow carries no energy. Literally nothing is transferred. Our shadow bat can go 10 times the speed of light or 100 times faster without breaking any of physics’ sacred rules.
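
The arithmetic behind the sweep is a one-liner. A sketch, assuming for illustration that the klieg light makes one full turn per second:

import math

C = 299_792_458.0     # speed of light, m/s
OMEGA = 2 * math.pi   # assumed rotation rate: one turn per second, in rad/s

# The shadow's sweep speed at distance r is OMEGA * r, so it matches
# the speed of light at:
r_critical = C / OMEGA
print(round(r_critical / 1000), "km")  # about 47,700 km; farther out, the shadow outruns light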

My sister leapt to the heart of this apparent paradox: Why isn’t the light itself traveling faster than the speed of light? Isn’t it also rotating in space? Actually, no. The bulbs that produce the light are spinning, but the light particles leave the source at 186,000 miles a second, the vaunted “speed of light.” Once emitted, the photons continue to travel at this speed directly away from the source. Only the shadow revolves around the great circle. The critical point is that no object, no substance, defies light.

My husband was right to object that you’d need one spectacular klieg to produce a detectable shadow thousands of miles out in space. Still, the theory is sound.

The anthropologist Mary Douglas noted that all systems of categorizing break down somewhere, unable to incorporate certain forms. By standing beyond relativity’s injunction, shadows suggest the limits of all classification schemes, a tension that even modern science cannot completely resolve.

In the terms recognized by relativity, shadows are non-things. Yet before the invention of clocks, shadows were the most important means for telling time. Weightless and without energy, shadows can nonetheless convey information — though they cannot, despite our giant klieg, be used for faster-than-light communication. That’s because the shadow’s location cannot be detected until the light, moving at its ponderous relativistic pace, arrives.

“Here there be monsters,” said the medieval maps, signaling the limits of reason’s reach. As a map of being, physics is flanked by the monsters of non-being whose outlines we glimpse in the paradoxes of quantum mechanics and in the zooming arc of a shadow bat going faster than light.

In Christian theology we are told, “God is that which nothing is greater than.” The scientific corollary might be, “Light is that which nothing is faster than” — a statement true both in spirit and fact.

Margaret Wertheim, the director of the Institute for Figuring, a science and mathematics education organization, is writing a book on physics and the imagination.

Title: Super Collider May Cause a Black Hole that will Kill us All, Oh My!
Post by: Body-by-Guinness on August 29, 2007, 07:02:06 AM
For those needing a new unreasoned fear to latch on to. . . .

Aug 3, 2007
Fears over factoids

Recent TV programmes have claimed that the Earth could be destroyed by black holes created in particle accelerators and that helium-3 from the Moon could be used for fusion energy. Frank Close warns that these "factoids" must be stamped out before they become accepted as facts

Did you know that when the Large Hadron Collider (LHC) comes online at CERN next spring, it could end up creating mini black holes that destroy the Earth? This is not something from a Dan Brown novel, but from a TV documentary broadcast as part of the BBC's Horizon series in the UK on 1 May – a programme that has been running for 40 years and is supposedly the flagship of TV science in the country. Although the documentary itself was fairly measured, the producers began the programme with the black-hole claim and used it in their publicity for the show.


Unnecessary drama
Physicists who recall superb Horizon documentaries of the past – for example, on the discovery of the W and Z bosons – will have been disappointed that such a marvellous project as the LHC should have been sensationalized in this way. It was disheartening that the programme makers felt the need to rehash these unnecessary concerns over black holes being produced in particle accelerators, which physicists had already dismissed before the Relativistic Heavy Ion Collider (RHIC) came online at the Brookhaven National Laboratory in 2000 (Physics World July 2000 pp19–20, print edition only).

Meanwhile, another Horizon documentary, broadcast on 10 April, claimed that one reason for sending humans to the Moon is so that we can mine it for helium-3 as a fuel for fusion power back on Earth. The need to bring helium-3 back from the Moon has even been briefly referred to in Physics World (May 2007 pp12–13, print edition only) and, more worryingly, has been presented to US congressional committees, including the Science and Technology Committee of the House of Representatives in 2004.

As a particle physicist, I am of course interested in the LHC; and as the chair of a working group set up by the British National Space Centre to look into the future of UK space science – including the possibility of humans returning to the Moon – I am also intrigued by the helium-3 story. Both of the claims bother me and, on investigation, each is revealed as an example of what I call "factoid science" – myths of dubious provenance that propagate, become received wisdom and could even influence policy. So what is the reality and what can physicists do to correct such misinformation?

Strangelet statistics
The story of the LHC as an Armageddon machine would be laughable were it not so serious. Aficionados of Dan Brown – whose novel Angels and Demons was set partly at CERN – might believe that the Geneva lab produces antimatter capable of making weapons of mass destruction. But I did not expect to find similarly outlandish statements used to promote Horizon. As the programme's website puts it: "Some scientists argue that during a 10-year spell of operation there is a 1 in 50 million chance that experiments like the LHC could cause a catastrophe of epic proportions." The site then invites the public to take part in a poll on whether the LHC should be turned on or not, based on this "probability".

While the LHC will create the most energetic collisions ever seen on Earth, cosmic rays at these and even higher energies have been bombarding our and other planets for billions of years without mishap. When I asked the producers of Horizon where they had obtained the 1-in-50-million statistic, I was told it had been taken from a "reliable source": Our Final Century by Cambridge University cosmologist Martin Rees. But when I read his book, it became clear that the programme's research had sadly been incomplete. On page 124, Rees discusses a paper published in 1999 by CERN theorists Arnon Dar, Alvaro de Rújula and Ulrich Heinz that uses the fact that the Earth and the cosmos have survived for several billion years to estimate the probability of colliders producing hypothetical particles called "strangelets" that might destroy our planet (1999 Phys. Lett. B 470 142).

Rees fairly describes their conclusions as follows: "If the experiment were run for 10 years, the risk of catastrophe was no more than 1 in 50 million." In other words, the chance of disaster is one in at least 50 million (as no disaster has occurred); this is rather different from saying, as Horizon does, that there is a "1 in 50 million" probability of a catastrophe happening from the moment the LHC switches on.
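
The distinction matters, and it is easy to state precisely. A sketch of the logic (the framing below is mine, not the paper's):

# Dar et al.'s figure is a survival bound, not a measured risk: if nature
# has already run the equivalent of N ten-year "experiments" without
# catastrophe, the data can only cap the per-experiment probability.
N_EQUIVALENT = 50_000_000          # the N implied by the 1-in-50-million figure
p_upper_bound = 1.0 / N_EQUIVALENT
print(p_upper_bound)               # 2e-08, an upper limit; the true risk may be zero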

Moreover, when Dar and colleagues wrote their 1999 paper, a committee of eminent physicists appointed by the Brookhaven lab was also investigating if RHIC could produce strangelets (arXiv:hep-ph/9910333v3). That study used not just information from cosmology but also data from collisions between heavy ions (albeit at lower energies than RHIC would obtain) to show that the chances of catastrophe are less than one part in 10^19.

Furthermore, these figures refer specifically to strangelets being produced at RHIC, as Rees makes clear, and have nothing to do with the question of whether we should risk creating black holes. Indeed, why does Horizon talk about black holes at all? The only reason can be that a theory does exist that posits that mini black holes could be produced in a collider. But if one mentions this theory, then one must include the whole of it, which clearly states that mini black holes pose no hazard whatsoever because they do not grow but evaporate and die.

As if any more evidence was needed that colliders are safe, CERN also set up an "LHC safety-study group" to see if its new collider could create black holes or strangelets. It concluded – in an official CERN report published in 2003 (CERN-2003-001) – that there is "no basis for any conceivable threat" of either eventuality, which is as near as science can get to saying zero. Unfortunately, the Horizon programme made no mention of these serious and time-consuming enquiries even though CERN's press office gave the programme's researchers a copy of the lab's 2003 report. Instead, the public has been led to believe that scientists are prepared to embark on experiments that could spell the end of the planet.

Helium errors
Let me now turn to the helium-3 factoid. At most fusion experiments, such as the Joint European Torus (JET) in the UK, a fuel of deuterium and tritium nuclei is converted in a tokamak into helium-4 and a neutron, thereby releasing energy in the process. No helium-3 is involved, so where does the myth come from? Enter "helium-3 fusion" into Google and you will find numerous websites pointing out that the neutron produced in deuterium–tritium fusion makes the walls of the tokamak radioactive, but that fusion could be "clean" if only we reacted deuterium with helium-3 to produce helium-4 and a proton.

Given that the amount of helium-3 available on Earth is trifling, it has been proposed that we should go to the Moon to mine the isotope, which is produced in the Sun and might be blown onto the lunar surface via the solar wind. Apart from not even knowing for certain if there is any helium-3 on the Moon, there are two main problems with this idea – one obvious and one intriguingly subtle. The first problem is that, in a tokamak, deuterium reacts up to 100 times more slowly with helium-3 than it does with tritium. This is because fusion has to overcome the electrical repulsion between the protons in the fuel, which is much higher for deuterium–helium-3 reactions (the nuclei have one and two protons, respectively) than it is for deuterium–tritium reactions (one proton each).

Clearly, deuterium–helium-3 is a poor fusion process, but the irony is much greater as I shall now reveal. A tokamak is not like a particle accelerator where counter-rotating beams of deuterium and helium-3 collide and fuse. Instead, all of the nuclei in the fuel mingle together, which means that two deuterium nuclei can rapidly fuse to give a tritium nucleus and proton. The tritium can now fuse with the deuterium – again much faster than the deuterium can with helium-3 – to yield helium-4 and a neutron.
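
For reference, the reactions at issue, with their standard textbook energy releases (the D + D line shows only the branch described above):

D + T    -> He-4 + n + 17.6 MeV   (the usual tokamak fuel; the neutron activates the walls)
D + He-3 -> He-4 + p + 18.3 MeV   (the hoped-for "clean" reaction, but far slower)
D + D    -> T + p  + 4.0 MeV      (the side reaction that puts tritium right back into the fuel)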

So by bringing helium-3 from the Moon, all we will end up doing is creating a deuterium–tritium fusion machine, which is the very thing the helium aficionados wanted to avoid! Undeterred, some of these people even suggest that two helium-3 nuclei could be made to fuse with each other to produce deuterium, an alpha particle and energy. Unfortunately, this reaction occurs even more slowly than deuterium–tritium fusion and the fuel would have to be heated to impractically high temperatures that would be beyond the reach of a tokamak. And as not even the upcoming International Thermonuclear Experimental Reactor (ITER) will be able to generate electricity from the latter reaction, the lunar-helium-3 story – like the LHC as an Armageddon machine – is, to my mind, moonshine.

Rising pressure
Does any of this matter beyond raising the blood pressure of some physicists? All publicity is good publicity, some might say. But I believe we should all be concerned. The LHC factoid has now been repeated in the New Yorker and in various reviews of the Horizon documentary. Even some nonphysics colleagues are asking me to explain what it is all about. If Horizon claims to be the flagship TV science series on which the public rely to form their opinions, I would hope that their researchers do their research, and that the editors then take due account of it.

The factoids about mining the Moon for fusion fuel and of the LHC Armageddon make a cautionary tale. A decade from now it is possible that committees of well-informed scientists and rather less-well-informed politicians, with public opinion weighing on their minds, will be deciding on our involvement in mega-projects such as the next huge accelerator, human space exploration, or even a post-ITER commercial fusion plant.

Decision making driven by public opinion that is influenced by factoids already has a dire history in the bio-medical arena: the controversy over whether to give children a combined immunization against measles, mumps and rubella (MMR) being the most recent example. My advice is that if you see an error in the media, speak out, write to the editors and try to get corrections made. It is an opportunity to get good science in the news.

About the author
Frank Close is a theoretical physicist at the University of Oxford, UK
Title: Re: Physics
Post by: Crafty_Dog on December 19, 2007, 08:26:01 AM
Laws of Nature, Source Unknown
NYTimes
By DENNIS OVERBYE
Published: December 18, 2007


“Gravity,” goes the slogan on posters and bumper stickers. “It isn’t just a good idea. It’s the law.”

And what a law. Unlike, say, traffic or drug laws, you don’t have a choice about obeying gravity or any of the other laws of physics. Jump and you will come back down. Faith or good intentions have nothing to do with it.

Existence didn’t have to be that way, as Einstein reminded us when he said, “The most incomprehensible thing about the universe is that it is comprehensible.” Against all the odds, we can send e-mail to Sri Lanka, thread spacecraft through the rings of Saturn, take a pill to chase the inky tendrils of depression, bake a turkey or a soufflé and bury a jump shot from the corner.

Yes, it’s a lawful universe. But what kind of laws are these, anyway, that might be inscribed on a T-shirt but apparently not on any stone tablet that we have ever been able to find?

Are they merely fancy bookkeeping, a way of organizing facts about the world? Do they govern nature or just describe it? And does it matter that we don’t know and that most scientists don’t seem to know or care where they come from?

Apparently it does matter, judging from the reaction to a recent article by Paul Davies, a cosmologist at Arizona State University and author of popular science books, on the Op-Ed page of The New York Times.

Dr. Davies asserted in the article that science, not unlike religion, rested on faith, not in God but in the idea of an orderly universe. Without that presumption a scientist could not function. His argument provoked an avalanche of blog commentary, articles on Edge.org and letters to The Times, pointing out that the order we perceive in nature has been explored and tested for more than 2,000 years by observation and experimentation. That order is precisely the hypothesis that the scientific enterprise is engaged in testing.

David J. Gross, director of the Kavli Institute for Theoretical Physics in Santa Barbara, Calif., and co-winner of the Nobel Prize in physics, told me in an e-mail message, “I have more confidence in the methods of science, based on the amazing record of science and its ability over the centuries to answer unanswerable questions, than I do in the methods of faith (what are they?).”

Reached by e-mail, Dr. Davies acknowledged that his mailbox was “overflowing with vitriol,” but said he had been misunderstood. What he had wanted to challenge, he said, was not the existence of laws, but the conventional thinking about their source.

There is in fact a kind of chicken-and-egg problem with the universe and its laws. Which “came” first — the laws or the universe?

If the laws of physics are to have any sticking power at all, to be real laws, one could argue, they have to be good anywhere and at any time, including the Big Bang, the putative Creation. Which gives them a kind of transcendent status outside of space and time.

On the other hand, many thinkers — all the way back to Augustine — suspect that space and time, being attributes of this existence, came into being along with the universe — in the Big Bang, in modern vernacular. So why not the laws themselves?

Dr. Davies complains that the traditional view of transcendent laws is just 17th-century monotheism without God. “Then God got killed off and the laws just free-floated in a conceptual vacuum but retained their theological properties,” he said in his e-mail message.

But the idea of rationality in the cosmos has long existed without monotheism. As far back as the fifth century B.C. the Greek mathematician and philosopher Pythagoras and his followers proclaimed that nature was numbers. Plato envisioned a higher realm of ideal forms, of perfect chairs, circles or galaxies, of which the phenomena of the sensible world were just flawed reflections. Plato set a transcendent tone that has been popular, especially with mathematicians and theoretical physicists, ever since.

Steven Weinberg, a Nobel laureate from the University of Texas, Austin, described himself in an e-mail message as “pretty Platonist,” saying he thinks the laws of nature are as real as “the rocks in the field.” The laws seem to persist, he wrote, “whatever the circumstance of how I look at them, and they are things about which it is possible to be wrong, as when I stub my toe on a rock I had not noticed.”

The ultimate Platonist these days is Max Tegmark, a cosmologist at the Massachusetts Institute of Technology. In talks and papers recently he has speculated that mathematics does not describe the universe — it is the universe.

Dr. Tegmark maintains that we are part of a mathematical structure, albeit one gorgeously more complicated than a hexagon, a multiplication table or even the multidimensional symmetries that describe modern particle physics. Other mathematical structures, he predicts, exist as their own universes in a sort of cosmic Pythagorean democracy, although not all of them would necessarily prove to be as rich as our own.

“Everything in our world is purely mathematical — including you,” he wrote in New Scientist.

This would explain why math works so well in describing the cosmos. It also suggests an answer to the question that Stephen Hawking, the English cosmologist, asked in his book, “A Brief History of Time”: “What is it that breathes fire into the equations and makes a universe for them to describe?” Mathematics itself is on fire.

Not every physicist pledges allegiance to Plato. Pressed, these scientists will describe the laws more pragmatically as a kind of shorthand for nature’s regularity. Sean Carroll, a cosmologist at the California Institute of Technology, put it this way: “A law of physics is a pattern that nature obeys without exception.”

Plato and the whole idea of an independent reality, moreover, took a shot to the mouth in the 1920s with the advent of quantum mechanics. According to that weird theory, which, among other things, explains why our computers turn on every morning, there is an irreducible randomness at the microscopic heart of reality that leaves an elementary particle, an electron, say, in a sort of fog of being everywhere or anywhere, or being a wave or a particle, until some measurement fixes it in place.

In that case, according to the standard interpretation of the subject, physics is not about the world at all, but about only the outcomes of experiments, of our clumsy interactions with that world. But 75 years later, those are still fighting words. Einstein grumbled about God not playing dice.

Steven Weinstein, a philosopher of science at the University of Waterloo, in Ontario, described the phrase “law of nature” as “a kind of honorific” bestowed on principles that seem suitably general, useful and deep. How general and deep the laws really are, he said, is partly up to nature and partly up to us, since we are the ones who have to use them.

But perhaps, as Dr. Davies complains, Plato is really dead and there are no timeless laws or truths. A handful of poet-physicists hankering for more contingent nonabsolutist laws not engraved in stone have tried to come up with prescriptions for what John Wheeler, a physicist from Princeton and the University of Texas in Austin, called “law without law.”

As one example, Lee Smolin, a physicist at the Perimeter Institute for Theoretical Physics, has invented a theory in which the laws of nature change with time. It envisions universes nested like Russian dolls inside black holes, which are spawned with slightly different characteristics each time around. But his theory lacks a meta law that would prescribe how and why the laws change from generation to generation.

Holger Bech Nielsen, a Danish physicist at the Niels Bohr Institute in Copenhagen, and one of the early pioneers of string theory, has for a long time pursued a project he calls Random Dynamics, which tries to show how the laws of physics could evolve naturally from a more general notion he calls “world machinery.”

On his Web site, Random Dynamics, he writes, “The ambition of Random Dynamics is to ‘derive’ all the known physical laws as an almost unavoidable consequence of a random fundamental ‘world machinery.’”

Dr. Wheeler has suggested that the laws of nature could emerge “higgledy-piggledy” from primordial chaos, perhaps as a result of quantum uncertainty. It’s a notion known as “it from bit.” Following that logic, some physicists have suggested we should be looking not so much for the ultimate law as for the ultimate program.

Anton Zeilinger, a physicist and quantum trickster at the University of Vienna, and a fan of Dr. Wheeler’s idea, has speculated that reality is ultimately composed of information. He said recently that he suspected the universe was fundamentally unpredictable.

I love this idea of intrinsic randomness much for the same reason that I love the idea of natural selection in biology, because it and only it ensures that every possibility will be tried, every circumstance tested, every niche inhabited, every escape hatch explored. It’s a prescription for novelty, and what more could you ask for if you want to hatch a fecund universe?

But too much fecundity can be a problem. Einstein hoped that the universe was unique: given a few deep principles, there would be only one consistent theory. So far Einstein’s dream has not been fulfilled. Cosmologists and physicists have recently found themselves confronted by the idea of the multiverse, with zillions of universes, each with different laws, occupying a vast realm known in the trade as the landscape.

In this case there is a meta law — one law or equation, perhaps printable on a T-shirt — to rule them all. This prospective lord of the laws would be string theory, the alleged theory of everything, which apparently has 10^500 solutions. Call it Einstein’s nightmare.

But it is too soon for any Einsteinian to throw in his or her hand. Since cosmologists don’t know how the universe came into being, or even have a convincing theory, they have no way of addressing the conundrum of where the laws of nature come from or whether those laws are unique and inevitable or flaky as a leaf in the wind.

These kinds of speculation are fun, but they are not science, yet. “Philosophy of science is about as useful to scientists as ornithology is to birds,” goes the saying attributed to Richard Feynman, the late Caltech Nobelist, and repeated by Dr. Weinberg.

Maybe both alternatives — Plato’s eternal stone tablet and Dr. Wheeler’s higgledy-piggledy process — will somehow turn out to be true. The dichotomy between forever and emergent might turn out to be as false eventually as the dichotomy between waves and particles as a description of light. Who knows?

The law of no law, of course, is still a law.

When I was young and still had all my brain cells I was a bridge fan, and one hand I once read about in the newspaper bridge column has stuck with me as a good metaphor for the plight of the scientist, or of the citizen cosmologist. The winning bidder had overbid his hand. When the dummy cards were laid, he realized that his only chance of making his contract was if his opponents’ cards were distributed just so.

He could have played defensively, to minimize his losses. Instead he played as if the cards were where they had to be. And he won.

We don’t know, and might never know, if science has overbid its hand. When in doubt, confronted with the complexities of the world, scientists have no choice but to play their cards as if they can win, as if the universe is indeed comprehensible. That is what they have been doing for more than 2,000 years, and they are still winning.

Title: Plane vs. Conveyer Belt
Post by: Crafty_Dog on February 04, 2008, 10:08:54 AM
Plane vs. Conveyer Belt: Hell Yeah the Plane Takes Off
by Higgins - January 31, 2008 - 4:20 PM

Last night the Discovery show Mythbusters settled a longstanding debate: whether an airplane on a conveyer belt (running at the same speed as the plane, but in the opposite direction) can take off. The short answer, as liveblogged by Jason Kottke:

HELL YEAH THE PLANE TAKES OFF

It’s a curious problem. As a thought experiment, it seems (at least to me) like the plane shouldn’t take off, since it’s not gaining takeoff velocity relative to the ground. But according to, you know, SCIENCE, the plane doesn’t need to reach takeoff velocity relative to the ground — it just needs to generate an appropriate amount of lift. It’s the velocity of the air relative to the wings that counts, which is generated by the action of the engines.
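
In equation form: lift scales with airspeed over the wings, and the belt never enters into it. A minimal sketch, with made-up numbers for the wing area and lift coefficient:

RHO = 1.225   # sea-level air density, kg/m^3
S = 15.0      # assumed wing area, m^2
C_L = 1.2     # assumed lift coefficient near takeoff

def lift_newtons(airspeed_ms):
    # Standard lift equation: L = 1/2 * rho * v^2 * S * C_L
    return 0.5 * RHO * airspeed_ms ** 2 * S * C_L

# The belt spins the wheels, but the engines still push the plane through
# the air, so airspeed (the only velocity in the equation) builds normally:
print(round(lift_newtons(30.0)), "N")  # about 9,900 N at 30 m/s airspeed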


Despite explanations of this sort from physicists, the issue wasn’t really settled until last night’s Mythbusters episode — they replicated the experiment on a small scale, then with a real airplane (albeit an ultralight), using a huge tarp dragged by a truck as the “conveyer belt.” Even the plane’s pilot thought the plane wouldn’t take off. When Jason Kottke first blogged about the issue last February, his comment thread was hot with controversy. So Kottke tuned in to Mythbusters last night and liveblogged the event, with results visible above. His exuberance over the plane’s liftoff has resulted in a “HELL YEAH THE PLANE TAKES OFF” tee-shirt available starting at $18. Wow.

 

Watch the Mythbusters clip in question below…. (Note: if this clip is pulled down, I’ll try to dig up another.)

 

See <http://www.mentalfloss.com/blogs/archives/11750> for the Mythbusters clip...
Title: Electron filmed
Post by: Crafty_Dog on February 25, 2008, 11:26:35 AM
Electron Filmed for First Time - Yahoo! News

http://news.yahoo.com/s/livescience/20080225/sc_livescience/electronfilmedforfirsttime
Title: Re: Physics
Post by: Crafty_Dog on March 29, 2008, 08:14:23 AM
Asking a Judge to Save the World, and Maybe a Whole Lot More



More fighting in Iraq. Somalia in chaos. People in this country can’t afford their mortgages and in some places now they can’t even afford rice.


None of this nor the rest of the grimness on the front page today will matter a bit, though, if two men pursuing a lawsuit in federal court in Hawaii turn out to be right. They think a giant particle accelerator that will begin smashing protons together outside Geneva this summer might produce a black hole or something else that will spell the end of the Earth — and maybe the universe.

Scientists say that is very unlikely — though they have done some checking just to make sure.

The world’s physicists have spent 14 years and $8 billion building the Large Hadron Collider, in which the colliding protons will recreate energies and conditions last seen a trillionth of a second after the Big Bang. Researchers will sift the debris from these primordial recreations for clues to the nature of mass and new forces and symmetries of nature.

But Walter L. Wagner and Luis Sancho contend that scientists at the European Center for Nuclear Research, or CERN, have played down the chances that the collider could produce, among other horrors, a tiny black hole, which, they say, could eat the Earth. Or it could spit out something called a “strangelet” that would convert our planet to a shrunken dense dead lump of something called “strange matter.” Their suit also says CERN has failed to provide an environmental impact statement as required under the National Environmental Policy Act.

Although it sounds bizarre, the case touches on a serious issue that has bothered scholars and scientists in recent years — namely how to estimate the risk of new groundbreaking experiments and who gets to decide whether or not to go ahead.

The lawsuit, filed March 21 in Federal District Court, in Honolulu, seeks a temporary restraining order prohibiting CERN from proceeding with the accelerator until it has produced a safety report and an environmental assessment. It names the federal Department of Energy, the Fermi National Accelerator Laboratory, the National Science Foundation and CERN as defendants.

According to a spokesman for the Justice Department, which is representing the Department of Energy, a scheduling meeting has been set for June 16.

Why should CERN, an organization of European nations based in Switzerland, even show up in a Hawaiian courtroom?

In an interview, Mr. Wagner said, “I don’t know if they’re going to show up.” CERN would have to voluntarily submit to the court’s jurisdiction, he said, adding that he and Mr. Sancho could have sued in France or Switzerland, but to save expenses they had added CERN to the docket here. He claimed that a restraining order on Fermilab and the Energy Department, which helps to supply and maintain the accelerator’s massive superconducting magnets, would shut down the project anyway.

James Gillies, head of communications at CERN, said the laboratory had as yet no comment on the suit. “It’s hard to see how a district court in Hawaii has jurisdiction over an intergovernmental organization in Europe,” Mr. Gillies said.

“There is nothing new to suggest that the L.H.C. is unsafe,” he said, adding that its safety had been confirmed by two reports, with a third on the way, and would be the subject of a discussion during an open house at the lab on April 6.

“Scientifically, we’re not hiding away,” he said.

But Mr. Wagner is not mollified. “They’ve got a lot of propaganda saying it’s safe,” he said in an interview, “but basically it’s propaganda.”

In an e-mail message, Mr. Wagner called the CERN safety review “fundamentally flawed” and said it had been initiated too late. The review process violates the European Commission’s standards for adhering to the “Precautionary Principle,” he wrote, “and has not been done by ‘arms length’ scientists.”

Physicists in and out of CERN say a variety of studies, including an official CERN report in 2003, have concluded there is no problem. But just to be sure, last year the anonymous Safety Assessment Group was set up to do the review again.

“The possibility that a black hole eats up the Earth is too serious a threat to leave it as a matter of argument among crackpots,” said Michelangelo Mangano, a CERN theorist who said he was part of the group. The others prefer to remain anonymous, Mr. Mangano said, for various reasons. Their report was due in January.

This is not the first time around for Mr. Wagner. He filed similar suits in 1999 and 2000 to prevent the Brookhaven National Laboratory from operating the Relativistic Heavy Ion Collider. That suit was dismissed in 2001. The collider, which smashes together gold ions in the hopes of creating what is called a “quark-gluon plasma,” has been operating without incident since 2000.

=========



Mr. Wagner, who lives on the Big Island of Hawaii, studied physics and did cosmic ray research at the University of California, Berkeley, and received a doctorate in law from what is now known as the University of Northern California in Sacramento. He subsequently worked as a radiation safety officer for the Veterans Administration.


Mr. Sancho, who describes himself as an author and researcher on time theory, lives in Spain, probably in Barcelona, Mr. Wagner said.

Doomsday fears have a long, if not distinguished, pedigree in the history of physics. At Los Alamos before the first nuclear bomb was tested, Emil Konopinski was given the job of calculating whether or not the explosion would set the atmosphere on fire.

The Large Hadron Collider is designed to fire up protons to energies of seven trillion electron volts before banging them together. Nothing, indeed, will happen in the CERN collider that does not happen 100,000 times a day from cosmic rays in the atmosphere, said Nima Arkani-Hamed, a particle theorist at the Institute for Advanced Study in Princeton.

What is different, physicists admit, is that the fragments from cosmic rays will go shooting harmlessly through the Earth at nearly the speed of light, but anything created when the beams meet head-on in the collider will be born at rest relative to the laboratory and so will stick around and thus could create havoc.
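
A quick sketch of the arithmetic behind that cosmic-ray comparison: for a proton striking a stationary proton, the center-of-mass energy is E_cm = sqrt(2 * E_lab * m_p * c^2), so matching the LHC's design energy of 14 TeV requires a cosmic ray of about 10^17 eV — an energy the atmosphere absorbs routinely. The 14 TeV figure is the LHC design energy; the rest are standard constants.

Code:
# Minimal sketch: why cosmic rays already dwarf the LHC's collision energy.
# A fixed-target collision "wastes" most of its energy on forward motion:
# E_cm = sqrt(2 * E_lab * m_p * c^2) for a proton hitting a proton at rest.

M_P = 0.938          # proton rest energy, GeV
LHC_E_CM = 14_000.0  # LHC design center-of-mass energy, GeV (7 TeV per beam)

# Lab energy a cosmic-ray proton needs to match the LHC's E_cm when it
# hits a stationary atmospheric proton:
e_lab = LHC_E_CM**2 / (2 * M_P)
print(f"Required cosmic-ray energy: {e_lab:.2e} GeV (~1e17 eV)")

# Cosmic rays above 1e17 eV are routinely observed (the measured spectrum
# extends past 1e20 eV), so nature has been running this experiment for eons.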

The new worries are about black holes, which, according to some variants of string theory, could appear at the collider. That possibility, though a long shot, has been widely ballyhooed in many papers and popular articles in the last few years. But would such black holes be dangerous?

According to a paper by the cosmologist Stephen Hawking in 1974, they would rapidly evaporate in a poof of radiation and elementary particles, and thus pose no threat. No one, though, has seen a black hole evaporate.

As a result, Mr. Wagner and Mr. Sancho contend in their complaint, black holes could really be stable, and a micro black hole created by the collider could grow, eventually swallowing the Earth.

But William Unruh, of the University of British Columbia, whose paper exploring the limits of Dr. Hawking’s radiation process was referenced on Mr. Wagner’s Web site, said they had missed his point. “Maybe physics really is so weird as to not have black holes evaporate,” he said. “But it would really, really have to be weird.”

Lisa Randall, a Harvard physicist whose work helped fuel the speculation about black holes at the collider, pointed out in a paper last year that black holes would probably not be produced at the collider after all, although other effects of so-called quantum gravity might appear.

As part of the safety assessment report, Dr. Mangano and Steve Giddings of the University of California, Santa Barbara, have been working intensely for the last few months on a paper exploring all the possibilities of these fearsome black holes. They think there are no problems but are reluctant to talk about their findings until they have been peer reviewed, Dr. Mangano said.

Dr. Arkani-Hamed said concerning worries about the death of the Earth or universe, “Neither has any merit.” He pointed out that because of the dice-throwing nature of quantum physics, there was some probability of almost anything happening. There is some minuscule probability, he said, “the Large Hadron Collider might make dragons that might eat us up.”
Title: Multiversed
Post by: Body-by-Guinness on July 14, 2009, 09:13:09 PM
How to map the multiverse

04 May 2009 by Anil Ananthaswamy

BRIAN GREENE spent a good part of the last decade extolling the virtues of string theory. He dreamed that one day it would provide physicists with a theory of everything that would describe our universe - ours and ours alone. His bestselling book The Elegant Universe eloquently captured the quest for this ultimate theory.

"But the fly in the ointment was that string theory allowed for, in principle, many universes," says Greene, who is a theoretical physicist at Columbia University in New York. In other words, string theory seems equally capable of describing universes very different from ours. Greene hoped that something in the theory would eventually rule out most of the possibilities and single out one of these universes as the real one: ours.

So far, it hasn't - though not for any lack of trying. As a result, string theorists are beginning to accept that their ambitions for the theory may have been misguided. Perhaps our universe is not the only one after all. Maybe string theory has been right all along.

Greene, certainly, has had a change of heart. "You walk along a number of pathways in physics far enough and you bang into the possibility that we are one universe of many," he says. "So what do you do? You smack yourself in the head and say, 'Ah, maybe the universe is trying to tell me something.' I have personally undergone a sort of transformation, where I am very warm to this possibility of there being many universes, and that we are in the one where we can survive."

Greene's transformation is emblematic of a profound change among the majority of physicists. Until recently, many were reluctant to accept this idea of the "multiverse", or were even belligerent towards it. However, recent progress in both cosmology and string theory is bringing about a major shift in thinking. Gone is the grudging acceptance or outright loathing of the multiverse. Instead, physicists are starting to look at ways of working with it, and maybe even trying to prove its existence.

If such ventures succeed, our universe will go the way of Earth - from seeming to be the centre of everything to being exposed as just a backwater in a far vaster cosmos. And just as we are unable to deduce certain aspects of Earth from first principles - such as its radius or distance from the sun - we will have to accept that some things about our universe are a random accident, inexplicable except in the context of the multiverse.

One of the first to argue for a multiverse was Russian physicist Andrei Linde, now at Stanford University in California. In the 1980s, Linde extended and improved upon an idea called inflation, which suggests that the universe underwent a period of exponential expansion in the first fractions of a second after the big bang. Inflation successfully explains why the universe looks pretty much the same in all directions, and why space-time is "flat", despite Einstein showing that it can just as easily be curved.

Linde realised that inflation could be ongoing or "eternal", in the sense that once space-time starts inflating, it can stop in some parts (such as ours) yet take off with renewed vigour elsewhere. This process continues ad infinitum, giving rise to a patchwork of regions of space, each with different properties. When and how inflation ceases in a particular patch dictates the exact nature and types of fundamental particles there and the laws of physics that govern their behaviour. Over time, eternal inflation gives rise to just about every possible type of universe predicted by string theory. Our universe, argues Linde, is a part of this multiverse.

It wasn't until 1998, however, that the multiverse gained any traction, when astronomers studying distant supernovae announced that the expansion of the universe is accelerating. They put this down to the vacuum of space having a small energy density, which exerts a repulsive force to counteract gravity as the universe ages. This became known as dark energy, or the cosmological constant.

Its discovery was a huge blow. Up till then, physicists had hoped that some ultimate theory would deduce the values of fundamental constants of nature from first principles, including the cosmological constant, and explain why the laws of physics are as they are, just right for the formation of stars and galaxies and possibly the emergence of life. This seems not to be the case. Nothing in string theory, or indeed any other theory in physics, can predict the observed value of the cosmological constant.

However, if our universe is part of a multiverse then we can ascribe the value of the cosmological constant to an accident. The same goes for other aspects of our universe, such as the mass of the electron. The idea is simply that each universe's laws of physics and fundamental constants are randomly determined, and we just happen to live in one where these are suited for life. "If not for the multiverse, you would have these unsolved problems at every corner," says Linde.

The other compelling argument for a multiverse comes from string theory. This maintains that all fundamental particles of matter and forces of nature arise from the vibration of tiny strings in 10 dimensions. For us not to notice the extra six dimensions of space, they must be curled up, or compacted, so small as to be undetectable. For decades, mathematicians toiled over what different forms this compaction could take, and they found myriad ways of scrunching up space-time - a staggering 10^500 or more.
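
Where does a number like 10^500 come from? A common back-of-the-envelope (not an actual string-theory calculation): if each of a few hundred independent topological "cycles" in the compactified space can carry one of several discrete flux values, the possibilities multiply. A sketch, with assumed illustrative counts:

Code:
# Minimal sketch of where numbers like 10^500 come from: each of many
# independent cycles can carry one of several discrete flux values, so
# the vacua multiply combinatorially. Both counts are assumed.

flux_choices_per_cycle = 10   # assumed, for illustration
n_cycles = 500                # assumed, for illustration

n_vacua = flux_choices_per_cycle ** n_cycles
print(f"10^{len(str(n_vacua)) - 1} distinct vacua")   # 10^500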

Each form gives rise to a different vacuum of space-time, and hence a different universe - with its own vacuum energy, fundamental particles and laws of physics. The hope, nurtured by Greene and others, was that there was some kind of uniqueness principle that would pick out the particular form of space-time that produces our universe.

That hope has since receded dramatically. In 2004, Michael Douglas of the State University of New York in Stony Brook, and Leonard Susskind of Stanford University surveyed the developments in string theory to date and concluded that all these theoretical varieties of space-time should be taken seriously as physical realities - that is, they point to a multiverse. Susskind coined the term "the landscape of string theory" to describe the 10^500 or more different universes. Nothing in string theory suggests that any one of these universes is preferred over others. Rather, it appears all are equally likely.

Together, dark energy and string theory are making physicists see the multiverse anew. "Just about everybody is convinced that the idea of uniqueness has gone down the drain," says Susskind. So what are we to do? Throw up our hands and admit that we will never be able to explain why our universe is the way it is?

Exploring the landscape

Not a bit of it. Susskind argues that we can still ask meaningful questions within the context of the multiverse, just not the ones we'd ask if ours were the only universe. Questions such as: can we identify the exact point in the landscape that corresponds to our universe, or at least the parts of the landscape that most closely resemble our universe? Is it possible to tell which of our universe's properties can be derived from first principles and which ones are random?

Also, can we find parts of the landscape with the right conditions for eternal inflation to take place? After all, the landscape and eternal inflation are independent concepts. Confirming that they are compatible would lend more credence to the multiverse idea.

These are not trivial questions to answer, but string theorists are rising to the challenge by feverishly exploring the landscape. Investigating a collection of 10^500 universes is not a matter of enumerating the properties of each of them, however. "We just can't make a list of 10^500 things," says Nobel laureate Steven Weinberg of the University of Texas at Austin. "That's more than the number of atoms in the observable universe."

The first line of attack has been to develop mathematical models of the landscape. These describe the landscape as a terrain of hills and valleys, where each valley represents a place with its own parameters (such as the mass of the electron) and fields (such as gravity).

How does a universe develop according to this scenario, and what can it tell us about ours? Imagine the universe as it starts off as a speck of space-time. This baby universe is filled with fields, whose properties change due to quantum fluctuations. If the conditions are ripe for inflation, the speck will grow and this will alter its nature. Depending on the changing environment inside the emerging universe, the inflationary process could grind to a halt, continue apace or even spawn other specks of space-time.

According to the landscape picture, the baby universe starts off in one valley. Quantum fluctuations can then cause the entire universe to "tunnel" through an adjoining hill, eventually ending up in another valley with different properties. This process continues, with the universe tunnelling from valley to valley, until it reaches a place stable enough for inflation to run its full course.

Given this scenario, one of the most important tasks is reconciling eternal inflation with the landscape. "The whole picture can be boiled down to one issue: is there eternal inflation in the landscape?" says Henry Tye of Cornell University in Ithaca, New York. In Linde's model of eternal inflation, the speck of space-time starts off with high energy density. The energy density slowly falls as space-time inflates. The quest is to find configurations of space-time among the 10^500 that match Linde's requirements for eternal inflation.

Until recently, this had seemed impossible. Then, last year, Eva Silverstein and Alexander Westphal of Stanford University identified two places within the landscape for Linde's version of eternal inflation to take place (Physical Review D, vol 78, p 106003).

It's a promising start, but Tye argues that eternal inflation within string theory is not a done deal. Physicists could just as well start with string theory models of the universe with entirely different initial conditions that would lead to inflation, though not eternal inflation.

Experiments are the key to answering such concerns, by testing the predictions of the various alternative theories. For instance, the energy density in the model proposed by Silverstein is high enough to create strong gravitational waves, ripples in space-time generated by the rapid expansion of the universe. Such waves could have polarised the photons of the cosmic microwave background, the radiation left over from the big bang, and such an imprint would still be detectable today. The European Space Agency's Planck satellite, due to launch soon, will look for any polarisation.

If Planck sees it, then it will lend support to Silverstein's models and eternal inflation. But even if experiments like Planck do lend support for eternal inflation, theorists will need independent confirmation for the ideas of string theory. Unfortunately no specific predictions of string theory are yet within experimental reach, but there is one key general property that could be confirmed soon. String theory requires that the universe has a property known as supersymmetry, which posits that every particle known to physicists has a heavier and as yet unseen superpartner. Physicists will be looking for some of these superpartners at the Large Hadron Collider, the new particle accelerator at CERN, near Geneva, Switzerland.

The scenario of a universe tunnelling through the landscape also makes a unique prediction. If our universe emerged after tunnelling in this way, then the theory predicts that space-time today will be ever so slightly curved. That's because in this scenario, inflation does not last long enough to make the universe totally flat.

Today's measurements show the universe to be flat, but the uncertainty in those measurements still leaves room for space-time to be slightly curved - either like a saddle (negatively curved) or like a sphere (positively curved). "If we originated from a tunnelling event from an ancestor vacuum, the bet would be that the universe is negatively curved," says Susskind. "If it turns out to be positively curved, we'd be very confused. That would be a setback for these ideas, no question about it."
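
As a rough sketch of what "slightly curved" means observationally: the tighter the measured total density parameter is pinned to 1, the larger the minimum curvature radius must be. The error budgets below are assumed, illustrative values, not actual survey results.

Code:
import math

# Minimal sketch: how flat is "flat"? If the total density parameter Omega
# is measured to within |Omega - 1| of flatness, the curvature radius is
# bounded below by roughly (c/H0) / sqrt(|Omega - 1|).

HUBBLE_DISTANCE_MPC = 4_300          # c/H0 for H0 ~ 70 km/s/Mpc
for omega_k in (0.1, 0.01, 0.001):   # |Omega - 1|, assumed error budgets
    r_curv = HUBBLE_DISTANCE_MPC / math.sqrt(omega_k)
    print(f"|Omega-1| = {omega_k:5.3f} -> curvature radius > {r_curv:8.0f} Mpc")

# A tunnelling origin predicts slight negative curvature; shrinking the
# error bars on Omega is what would make Susskind's bet testable.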

Until any such setback the smart money will remain with the multiverse and string theory. "It has the best chance of anything we know to be right," Weinberg says of string theory. "There's an old joke about a gambler playing a game of poker," he adds. "His friend says, 'Don't you know this game is crooked, and you are bound to lose?' The gambler says, 'Yes, but what can I do, it's the only game in town.' We don't know if we are bound to lose, but even if we suspect we may, it is the only game in town."

Anil Ananthaswamy is a consulting editor for New Scientist

http://www.newscientist.com/article/mg20227061.200-how-to-map-the-multiverse.html?full=true&print=true
Title: Levitating mice?!?
Post by: Crafty_Dog on September 10, 2009, 08:47:18 AM


http://news.yahoo.com/s/livescience/20090909/sc_livescience/micelevitatedinlab
Title: Evaporating, Extra Dimensional Black Holes?
Post by: Body-by-Guinness on September 16, 2009, 12:54:08 PM
HUNTING HIDDEN DIMENSIONS
Black holes, giant and tiny, may reveal new realms of space
By Diana Steele, September 26th, 2009; Vol. 176 #7 (p. 22)

[Image: Black hole blast — The creation of black holes in the Large Hadron Collider, which will smash protons together at nearly the speed of light, would indicate the existence of extra dimensions. A simulation of one possible fingerprint of a black hole in the collider's Compact Muon Solenoid detector shows colored cones representing different particle types; bar lengths indicate particles' energy. Credit: The CMS Collaboration]

In many ways, black holes are science’s answer to science fiction. As strange as anything from a novelist’s imagination, black holes warp the fabric of spacetime and imprison light and matter in a gravitational death grip. Their bizarre properties make black holes ideal candidates for fictional villainy. But now black holes are up for a different role: heroes helping physicists assess the real-world existence of another science fiction favorite — hidden extra dimensions of space.

Astrophysical giants several times the mass of the sun and midget black holes smaller than a subatomic particle could provide glimpses of an extra-dimensional existence.

Out in space, astrophysicists are looking hard to see if large black holes are shrinking on a time scale that might be detected by modern telescopes. If so, it might mean the black holes are evaporating into extra dimensions.

In the laboratory, black holes far smaller than anything that could be seen with a microscope might be produced in Europe’s Large Hadron Collider after it starts running again in November (SN: 7/19/08, p. 16). The detection of such a black hole, which would evaporate in a hail of subatomic particles in a tiny fraction of a second, would provide evidence that unseen dimensions of space exist.

What makes either of these ideas even plausible is a bold theory put forth just over 10 years ago that purports to explain the weakness of gravity by supposing that some of it is leaking out into extra dimensions.

Gravity feels strong to humans because it makes climbing hills hard. But one of the fundamental paradoxes about gravity is demonstrated by the fact that an ordinary refrigerator magnet can pick up a paperclip — counteracting the entire mass of the Earth pulling down on the clip.

Physicists call this the “hierarchy problem,” referring to the fact that all the other forces of nature are more than 30 orders of magnitude stronger than gravity.

“It’s hard to explain such a huge number from any mathematical postulate or any physical principle,” says Greg Landsberg, a theoretical physicist at Brown University in Providence, R.I. “It’s a bit of an embarrassment for our field, because what it really means is, we don’t seem to understand gravity.”
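
The size of that embarrassment is easy to compute. Here is a sketch comparing the electric and gravitational attraction between two protons, using standard constants; the ratio is independent of separation, since both forces fall off as 1/r^2.

Code:
# Minimal sketch of the hierarchy problem's "huge number": the ratio of
# electrostatic to gravitational attraction between two protons.

K_E = 8.988e9      # Coulomb constant, N m^2 / C^2
G   = 6.674e-11    # Newton's constant, N m^2 / kg^2
Q_P = 1.602e-19    # proton charge, C
M_P = 1.673e-27    # proton mass, kg

ratio = (K_E * Q_P**2) / (G * M_P**2)
print(f"F_electric / F_gravity ~ {ratio:.1e}")   # ~1e36: 36 orders of magnitude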

Measuring extra dimensions

Isaac Newton declared in the 17th century that gravity gets weaker by the square of the distance between two objects. If the moon were twice as far from Earth, it would feel one-quarter the gravity.

But in 1998, theoretical physicists Nima Arkani-Hamed, Savas Dimopoulos and Gia Dvali pointed out that gravity had never been measured below a distance of about a millimeter. Suppose, they suggested, that gravity differs from Newtonian expectations at distances smaller than a millimeter.

That could happen if there are extra dimensions of space that gravity leaks into. These hidden dimensions might be shaped, for example, like the circumference of a hose. From a distance, the hose looks like a one-dimensional line, but seen up close, it has a curled-up second dimension. Arkani-Hamed, Dimopoulos and Dvali — whose model is known as ADD, short for their names — suggest that there could be extra dimensions as large as a millimeter in diameter.

“In principle, the extra dimensions can be so small, like trillions and trillions of times smaller than a millimeter, and that’s what string theory predicts,” says theoretical astrophysicist Dimitrios Psaltis of the University of Arizona in Tucson. But “if you introduce those large extra dimensions, then gravity can get diluted in some way.”

Gravity may spread into the extra dimensions while the other known forces and particles are confined to the three familiar spatial dimensions. So gravity could be just as strong as the other forces — but only felt strongly at short distances.
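
A toy sketch of that idea: below the compactification scale R, gravity spreads into n extra dimensions and falls off as 1/r^(2+n), recovering Newton's 1/r^2 above R. Constants are set to 1 and R is an assumed ADD-style 0.1 mm — illustrative only, not a fit to any experiment.

Code:
# Minimal sketch of the ADD picture: gravity dilutes into n extra dimensions
# below the compactification scale R, and looks Newtonian above it.

def gravity_strength(r, n_extra, R):
    """Toy force law: 1/r^(2+n) for r < R, matched onto 1/r^2 at r = R."""
    if r >= R:
        return 1.0 / r**2
    # Match the two regimes at r = R so the force law is continuous.
    return (1.0 / R**2) * (R / r)**(2 + n_extra)

R = 1e-4  # compactification scale, metres (0.1 mm, an assumed ADD-style value)
for r in (1e-2, 1e-4, 1e-6):
    boost = gravity_strength(r, n_extra=2, R=R) / (1.0 / r**2)
    print(f"r = {r:.0e} m -> gravity enhanced {boost:.0e}x over Newton")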

[Image: The universe as a flatland — The known universe could be very thin in an extra dimension other than the familiar three dimensions of space. Credit: J. Korenblat; NASA]

Tiny curled extra dimensions aren’t the only possibility. In 1999, theoretical physicists Lisa Randall and Raman Sundrum proposed that one extra dimension might stretch out to infinity. If either theory is true, it would also mean that at very small distances, gravity would be much stronger than Newton’s prediction.

The idea of “large” extra dimensions sent experimental physicists scrambling.

So far, physicists using sensitive small-scale experiments have measured the force of gravity at distances just under 50 micrometers and haven’t found any deviation from Newton’s law yet. But they keep looking.

Shrinking black holes

Black holes, as the most gravitationally dense objects in the universe, might provide another way of testing the extra-dimension hypotheses. Black holes know a thing or two about gravity; the trick is getting them to reveal their secrets.

In the 1970s, theoretical physicist Stephen Hawking calculated that black holes actually lose mass. That mass vanishes over time in the form of what’s now called Hawking radiation. “Over time” generally means over billions of years, like the age of the universe. The larger the black hole is, the more slowly it shrinks. But as it gets smaller, the evaporation rate accelerates.
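
For a sense of those timescales, here is a sketch of the standard four-dimensional evaporation-time formula, t ≈ 5120·pi·G^2·M^3 / (hbar·c^4) — the baseline that the extra-dimension models discussed below would shorten.

Code:
import math

# Minimal sketch: standard (four-dimensional) Hawking evaporation time.
# Extra-dimension models shorten this dramatically, which is exactly what
# the astrophysical searches described here try to exploit.

G, HBAR, C = 6.674e-11, 1.055e-34, 2.998e8
YEAR = 3.156e7          # seconds
M_SUN = 1.989e30        # kg

def evaporation_time_years(mass_kg):
    return 5120 * math.pi * G**2 * mass_kg**3 / (HBAR * C**4) / YEAR

print(f"1 solar mass: {evaporation_time_years(M_SUN):.1e} years")  # ~2e67
print(f"2e11 kg     : {evaporation_time_years(2e11):.1e} years")   # ~ age of universe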

And if there are extra dimensions of the Randall-Sundrum type, astrophysical black holes might emit gravity waves into these other dimensions and shrink faster than otherwise expected. So, Psaltis thought, finding a small black hole that’s really old would limit the size of the extra dimensions. “If you notice that a black hole lived, for example, a hundred million years,” Psaltis says, “that means that it couldn’t have evaporated, couldn’t have lost its mass really, really fast.”

But finding out the age and weight of a black hole is about as tricky as discovering that of a vain movie star. So Psaltis tried to find a way to get the black hole to reveal a little bit more about itself.

He found a black hole that looked like it had been kicked out of the plane of the Milky Way galaxy following a violent supernova explosion, like a fastball hit over the wall at Fenway Park. Since the black hole would have been born in the explosion, Psaltis could estimate its age by measuring how fast it and its companion star were zooming away from the galaxy, then backtracking to find out how long ago it had been ejected.

He calculated that this particular black hole, J1118+480, was a minimum of 11 million years old. Using that age and an estimated mass, Psaltis put an upper limit of 80 micrometers on the size of any extra dimensions, as he reported in Physical Review Letters in 2007.

Tim Johanssen, Psaltis’ graduate student, came up with another idea for measuring whether black holes are losing weight, one that doesn’t depend on knowing their ages. Most black holes a few times the mass of the sun have been detected because they orbit a companion star. The masses of the star and the black hole, as well as the distance between them, determine how fast the two rotate around each other, like Olympic pair skaters spinning around each other in a death spiral. If the mass of the black hole is changing, the rate at which it and its companion orbit each other, called the orbital period, changes as well.

Johanssen calculated how quickly a black hole would have to lose mass in order to see a noticeable difference in the orbital period. “Just from normal astrophysical mechanisms, we would expect [the period] to halve or double at a time-scale on the order of the age of the universe, billions of years,” says Psaltis. “If extra dimensions exist, and they are as large as, say, a tenth of a millimeter, then that time scale goes down to about several millions of years. Which means that if you make an observation over a year, you expect a change in the orbital period of a few parts per million. This is tiny, but this is something that modern observations of binary systems can actually do.”
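
The arithmetic behind that observable, as a sketch; both timescales below are assumed, order-of-magnitude examples rather than the team's actual fitted values.

Code:
import math

# Minimal sketch: if the orbital period halves on a timescale tau, the
# fractional change accumulated over one year of monitoring is ~ln(2)/tau.

for tau_years, label in ((1e10, "ordinary astrophysics"),
                         (5e6,  "large extra dimensions")):
    drift = math.log(2) / tau_years
    print(f"{label:23s}: ~{drift:.0e} fractional change per year")

# ~1e-10 per year is hopeless, but ~1e-7 per year (roughly the
# parts-per-million regime Psaltis describes) is within reach of
# decade-long timing of well-studied binaries such as A0620-00.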

Johanssen, Psaltis and astronomer Jeffrey McClintock of the Harvard-Smithsonian Center for Astrophysics looked closely at the best-studied black hole binary, A0620-00, which has been observed for about a decade. So far, they found, there has been no observable change in its orbital period. That let them constrain the size of the extra dimension to less than 161 micrometers. Their results appeared in February 2009 in the Astrophysical Journal.

Another researcher, Oleg Gnedin of the University of Michigan in Ann Arbor, extrapolated from Psaltis’ work. Gnedin learned of a recently discovered black hole in a globular cluster, one of the oldest groups of stars in the universe. Black holes in globular clusters are on the order of 10 billion years old. The mere existence of a black hole this old puts a very tight constraint — less than 3 micrometers — on the size of the Randall-Sundrum extra dimensions, Psaltis says. That work was published online at arXiv.org in June (SN: 8/1/09, p. 7).

Although the globular cluster work sets the tightest constraint on extra-dimension size so far, the researchers admit that it relies on a lot of assumptions.

Psaltis is pinning his hopes on observations of binary systems because, he says, they’re “a measurement of what is happening right now to the black hole that we are seeing. It does not depend on the history.” He says that the fact that researchers haven’t seen any changes in orbital periods so far doesn’t mean the extra dimensions don’t exist, just that they haven’t been found yet. Any change in orbital period, he says, would challenge physicists’ current theory of forces and particles in the universe — called the standard model.

But even if the extra-dimension theories are correct, observers still may never find evidence of such dimensions in the astrophysical black holes. One reason may be that the extra dimensions are of the ADD variety, small and curled up, in which case these tiny dimensions make no difference to the massive black holes in outer space.

The other reason may be that black holes don’t really evaporate faster into other dimensions even if they do exist, says Randall, the Harvard theoretical physicist who coauthored two of the popular extra-dimension models. “People have suggested that the decay rates of black holes might be a way of distinguishing” between the models, she says, “but it’s not fully resolved.”

Micro black holes

It will become pretty clear that large extra dimensions exist if a micro-sized black hole happens to appear in the Large Hadron Collider, or LHC, near Geneva. That’s because if gravity really is much stronger than expected at distances around a few micrometers or so, the LHC may be able to pack enough matter and energy into a small enough space that the system will automatically collapse into a black hole.

But before anyone starts worrying about Geneva disappearing into a black hole, know that this gravitationally dense midget wouldn’t even cross the diameter of an atomic nucleus before disintegrating (SN Online: 6/24/08).

“In this sense, these black holes are completely organic,” says Landsberg. “You could put them in your salad, and you wouldn’t notice that they exist because they immediately evaporate.”

But they might make their presence known to the LHC’s detectors.

That’s the province of a number of theoretical physicists, including Glenn Starkman of Case Western Reserve University in Cleveland. Starkman led a team that developed a computer program, called BlackMax, that tells researchers what subatomic debris a black hole might leave behind as evidence.

Inside the LHC, two beams of protons will stream at speeds close to the speed of light in opposite directions around a circular tunnel. Protons are actually somewhat spread out, says Starkman, and mostly made up of subatomic particles called quarks and gluons. It’s extremely unlikely that any two of these particles will hit each other exactly head-on. But if two quarks or two gluons, or one of each, get close enough to each other as they are flying in opposite directions, there could be enough energy in a small enough space that a black hole would form — if, and only if, gravity is strong enough to start playing a role. “For that to happen,” says Starkman, “there have to be more than three dimensions.”

The black hole would evaporate almost instantaneously, perhaps in a hail of subatomic particles shooting forth in all directions, like a cherry bomb firecracker. Or perhaps researchers would see a signature event in which some of the energy disappears, carried away into other dimensions by gravitons — the invisible gravitational counterpart to the photon.

The good thing, from a theoretical physics point of view, is that if the LHC makes any black holes at all, it will make a lot of them — as many as one per second, or 30 million a year. “Now, 30 million a year may involve optimistic assumptions, but perhaps a million or a hundred thousand or even ten thousand is not impossible,” says Stanford University’s Savas Dimopoulos, the middle “D” of the ADD extra-dimension hypothesis. “Even if you have 10,000 black holes, that is a lot of events to do statistics with and to start testing in detail both the existence of the black hole and the framework of large dimensions.”

Randall, like many, is skeptical. “It’s a cute idea,” she says. But with coauthor Patrick Meade, then at Harvard and now at the Institute for Advanced Study in Princeton, N.J., she argues that the scenario is highly unlikely. Their work was published in May 2008 in the Journal of High Energy Physics.

“It’s virtually impossible that you’re going to make genuine black holes at the LHC because the energy isn’t really high enough,” she says. “You could see some evidence of interesting gravitational effects in higher dimensions in terms of how things would scatter off each other … but it seems very unlikely that you would actually have anything that is really a genuine black hole.”

Still, there are a lot of uncertainties, and until the LHC is up and running, no one will really know.

Dimopoulos, for one, remains optimistic, but he has hedged his bets. In addition to large extra dimensions, he has a stake in two other leading candidates for solving the hierarchy problem. These theories, called technicolor and supersymmetry, don’t rely on extra dimensions — and they both might show their colors at the LHC.

But chances are “that nature may choose a completely different route, and it may be that the solutions to the hierarchy problem will be something that nobody ever thought about,” he says. “And that may be the most exciting scenario for what we will discover.”

Diana Steele is a freelance science writer based in Oberlin, Ohio.

http://www.sciencenews.org/view/feature/id/47187/title/Hunting_Hidden_Dimensions
Title: Cosmic Radiation and Warming
Post by: Body-by-Guinness on December 09, 2009, 12:56:12 PM
There is a flash video I can't embed associated with this post that is well worth an hour of your life. Contrast the style of the lecturer to that of the AGW panic mongers.

Revolt of the Physicists

Climate science seemed settled in the 1990s. The only theory around was that the increase in CO2 and other greenhouse gases was causing the increase in world temperatures. But then physicists got involved. My guess is that the average physicist has an IQ of somewhere between 150 and 200. The progress that they have been making is incredible.

If you have a scientific background and you still believe in man-made global warming, get out a cup of coffee, a cup of tea, or a glass of brandy, whatever helps you think best, and watch the following lecture from CERN, one of Europe's most highly respected centers for scientific research:
 
[Video links: Windows Media | Flash]

This lecture by Jasper Kirkby reviews the recent research that physicists have been conducting into climate change. Physicists have discovered that changes in the rate of cosmic ray inflow cause climate change and that solar activity shields the earth from cosmic rays. They haven't completely worked out the mechanism yet, but they think it has to do with cosmic rays causing cloud formation and clouds reflecting sunlight back into space.

When Kirkby gets to the screen showing Galactic Modulation of Climate over the last 500 million years and the cosmic ray variation that explains it, take a close look at the line that plots CO2 over the same period. Note that that line doesn't correspond at all to the temperature periodicity evident in the temperature data. Also listen when Kirkby points out that CO2 concentrations used to be 10 times higher than they are today.

And don't miss the most chilling (literally) prediction of all, based on a careful study of sunspot intensity. This prediction was originally submitted and rejected for publication in 2005 (Sunspots May Vanish by 2015), but has been coming true ever since. Due to reduced solar activity, the earth appears at present to be headed toward a period of dramatic cooling.

Meanwhile, clueless world leaders will be meeting at a UN Climate Change Conference in Copenhagen December 7-18 in an attempt to reduce carbon emissions in order to slow global warming.

http://seekingalpha.com/article/175641-climategate-revolt-of-the-physicists
Title: WSJ: Dark Matter?
Post by: Crafty_Dog on December 21, 2009, 07:27:01 PM
By LAWRENCE KRAUSS
In early December, the Cold Dark Matter Search (CDMS) experiment located in the deep Soudan Mine in northern Minnesota leaked a tantalizing hint that they may have discovered something remarkable. The experiment is designed to directly detect new elementary particles that might make up the dark matter known to dominate our own Milky Way galaxy, all galaxies, and indeed all mass in the universe—so news of a possible breakthrough was thrilling.

The actual result? Two pulses were detected over the course of almost a year that might have been due to dark matter, CDMS announced on Dec. 17. However, there is a 25% chance that the pulses were actually caused by background radioactivity in and around the detector.

Physicists remain fascinated by the possibility that the events at CDMS, reported on the back pages of the world's newspapers, might nevertheless be real. If they are, they will represent the culmination of one of the most incredible detective stories in the history of science.

Beginning in the 1970s, evidence began to accumulate that there was much more mass out there than meets the eye. Scientists, mostly by observing the speed of rotation of our galaxy, estimated that there was perhaps 10 times as much dark matter as visible material.

At around the same time, independent computer calculations following the possible gravitational formation of galaxies supported this idea. The calculations suggested that only some new type of material that didn't interact as normal matter does could account for the structures we see.

Meanwhile, in the completely separate field of elementary particle physics, my colleagues and I had concluded that in order to understand what we see, it is quite likely that a host of new elementary particles may exist at a scale beyond what accelerators at the time could detect. This is one of the reasons there is such excitement about the new Large Hadron Collider in Geneva, Switzerland. Last month, it finally began to produce collisions, and it might eventually directly produce these new particles.

Theorists who had proposed the existence of such particles realized that they could have been produced during the earliest moments of the fiery Big Bang in numbers that could account for the inferred abundance of dark matter today. Moreover, these new particles would have exactly the properties needed for such material. They would interact so weakly with normal matter that they could go through the Earth without a single interaction.

Emboldened by all of these arguments, a brave set of experimentalists began to devise techniques by which they might observe such particles. This required building detectors deep underground, far from the reach of most cosmic rays that would overwhelm any sensitive detector, and in clean rooms with no radioactivity that could produce a false signal.

So when the physics community heard rumors that one of these experiments had detected something, we all waited with eager anticipation. A convincing observation would vindicate almost half a century of carefully developed, if fragile, arguments suggesting a whole new invisible world waiting to be discovered.

For the theorist working at his desk alone at night, it seems almost unfathomable that nature might actually obey the delicate theories you develop on pieces of paper. This is especially true when the theories involve ideas from so many different areas of science and require leaps of imagination.

Alas, to celebrate would be premature: The reported results are intriguing, but less than convincing. Yet if the two pulses observed last week in Minnesota are followed by more signals as bigger detectors turn on in the coming year or two, it will provide serious vindication of the power of human imagination. Combined with rigorous logical inference and technological wizardry—all the things that make science worth celebrating—scientists' creativity will have uncovered hidden worlds that a century ago could not have been conceived.

If, on the other hand, the events turn out to have been mere background radioactivity, physicists will not give up. It will only force us to be more clever and more energetic as we try to unravel nature's mysteries.

Mr. Krauss is director of the Origins Institute at Arizona State University, and a theoretical physicist who has been involved in the search for dark matter for 30 years. His newest book, "Quantum Man," will appear in 2010.
Title: Mixed Quantum State
Post by: Body-by-Guinness on March 18, 2010, 10:03:43 PM
Scientists supersize quantum mechanics
Largest ever object put into quantum state.

Geoff Brumfiel


A quantum drum has become the first visible object to be put into a superposition of quantum states. (Image: A. Olsen/iStockphoto)
A team of scientists has succeeded in putting an object large enough to be visible to the naked eye into a mixed quantum state of moving and not moving.

Andrew Cleland at the University of California, Santa Barbara, and his team cooled a tiny metal paddle until it reached its quantum mechanical 'ground state' — the lowest-energy state permitted by quantum mechanics. They then used the weird rules of quantum mechanics to simultaneously set the paddle moving while leaving it standing still. The experiment shows that the principles of quantum mechanics can apply to everyday objects as well as to atomic-scale particles.

The work is simultaneously being published online today in Nature and presented today at the American Physical Society's meeting in Portland, Oregon [1].

According to quantum theory, particles act as waves rather than point masses on very small scales. This has dozens of bizarre consequences: it is impossible to know a particle's exact position and velocity through space, yet it is possible for the same particle to be doing two contradictory things simultaneously. Through a phenomenon known as 'superposition' a particle can be moving and stationary at the same time — at least until an outside force acts on it. Then it instantly chooses one of the two contradictory positions.


[Image: The paddle is around 30 micrometres long. Credit: O'Connell, A. D. et al.]
But although the rules of quantum mechanics seem to apply at small scales, nobody has seen evidence of them on a large scale, where outside influences can more easily destroy fragile quantum states. "No one has shown to date that if you take a big object, with trillions of atoms in it, that quantum mechanics applies to its motion," Cleland says.

There is no obvious reason why the rules of quantum mechanics shouldn't apply to large objects. Erwin Schrödinger, one of the fathers of quantum mechanics, was so disturbed by the possibility of quantum weirdness on the large scale that he proposed his famous 'Schrödinger's cat' thought experiment. A cat is placed in a box with a vial of cyanide and a radioactive source. If the source decays, it triggers a device that will break the vial, killing the cat. During the time the box is shut, Schrödinger argued, the cat is in a superposition of alive and dead — an absurdity as far as he was concerned.

Wonderful weirdness

Cleland and his team took a more direct measure of quantum weirdness at the large scale. They began with a tiny mechanical paddle, or 'quantum drum', around 30 micrometres long that vibrates when set in motion at a particular range of frequencies. Next they connected the paddle to a superconducting electrical circuit that obeyed the laws of quantum mechanics. They then cooled the system down to temperatures below one-tenth of a kelvin.

At this temperature, the paddle slipped into its quantum mechanical ground state. Using the quantum circuit, Cleland and his team verified that the paddle had no vibrational energy whatsoever. They then used the circuit to give the paddle a push and saw it wiggle at a very specific energy.
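
Why the deep cold matters: a mechanical mode counts as being in its ground state when its thermal occupation n = 1/(exp(hbar*w/kT) - 1) falls well below one. A sketch with ballpark values of the kind used in such microwave-frequency resonator experiments — the 6 GHz and 25 mK figures are assumptions for illustration, not taken from the paper.

Code:
import math

# Minimal sketch: thermal occupation of a mechanical mode. Ground state
# requires hbar*omega >> k_B*T, i.e. occupation << 1.

HBAR = 1.055e-34
KB = 1.381e-23

def thermal_occupation(freq_hz, temp_k):
    x = HBAR * 2 * math.pi * freq_hz / (KB * temp_k)
    return 1.0 / math.expm1(x)

print(f"6 GHz at 25 mK: n ~ {thermal_occupation(6e9, 0.025):.1e}")  # ~1e-5: ground state
print(f"6 GHz at 4 K  : n ~ {thermal_occupation(6e9, 4.0):.1f}")    # ~13 quanta: too hot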

Next, the researchers put the quantum circuit into a superposition of 'push' and 'don't push', and connected it to the paddle. Through a series of careful measurements, they were able to show that the paddle was both vibrating and not vibrating simultaneously.


"It's wonderful," says Hailin Wang, a physicist at the University of Oregon in Eugene who has been working on a rival technique for putting an oscillator into the ground state. The work shows that the laws of quantum mechanics hold up as expected on a large scale. "It's good for physics for sure," Wang says.

So if trillions of atoms can be put into a quantum state, why don't we see double-decker buses simultaneously stopping and going? Cleland says he believes size does matter: the larger an object, the easier it is for outside forces to disrupt its quantum state.

"The environment is this huge, complex thing," says Cleland. "It's that interaction with this incredibly complex system that makes the quantum coherence vanish."

Still, he says, there are plenty of reasons to keep trying to get large objects into quantum states. Large quantum states could tell researchers more about the relationship between quantum mechanics and gravity — something that is not well understood. And quantum resonators could be useful for something, although Cleland admits he's not entirely sure what. "There might be some interesting application," he says. "But frankly, I don't have one now."

References
1. O'Connell, A. D. et al. Nature doi:10.1038/nature08967 (2010).

http://www.nature.com/news/2010/100317/full/news.2010.130.html
Title: Re: Physics
Post by: Rarick on March 19, 2010, 02:46:49 AM
That is one of the principles that may get us to the stars.  Now we need to "macrosize" it, and figure out how to control the variables that determine the W's of the return to normalcy.
Title: Photonic Teleportation
Post by: Crafty_Dog on June 08, 2010, 10:37:55 AM


http://news.discovery.com/tech/teleportation-quantum-mechanics.html
Title: Higgs Boson Found?
Post by: Body-by-Guinness on July 12, 2010, 05:05:02 PM
Large Hadron Collider rival Tevatron 'has found Higgs boson'
Rumours are emerging from the rival to the Large Hadron Collider that the Higgs boson, or so-called "God particle", has been found.
 
By Tom Chivers
Published: 5:24PM BST 12 Jul 2010


Tommaso Dorigo, a physicist at the University of Padua, has said in his blog that there has been talk coming out of the Fermi National Accelerator Laboratory in Batavia, Illinois, that the Higgs has been discovered.
The Tevatron, the huge particle accelerator at Fermi - the most powerful in the world after the LHC - is expected to be retired when the CERN accelerator becomes fully operational, but may have struck a final blow before it becomes obsolete.
 

If one form of the rumour is to be believed - and Prof Dorigo is extremely circumspect about it - then it is a "three-sigma" signature, meaning the signal stands out at a level that random background fluctuations would mimic only about 0.3 per cent of the time. But, of course, that is only if the rumour is to be believed.
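
For reference, converting "n sigma" into the chance that background alone mimics the signal is a one-line calculation (one-sided Gaussian tail, the particle-physics convention):

Code:
import math

def tail_probability(n_sigma):
    """One-sided Gaussian tail: chance background alone fluctuates this high."""
    return 0.5 * math.erfc(n_sigma / math.sqrt(2))

for n in (3, 5):
    print(f"{n} sigma -> p = {tail_probability(n):.1e}")

# 3 sigma -> ~1.3e-03 ("evidence"); 5 sigma -> ~2.9e-07 (the usual discovery
# threshold). The two-sided convention gives ~2.7e-03 for 3 sigma. Either
# way, "99.7 per cent certain it is correct" overstates what a three-sigma
# signal alone can tell you.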
In the post, titled "Rumors about a light Higgs", Prof Dorigo said: "It reached my ear, from two different, possibly independent sources, that an experiment at the Tevatron is about to release some evidence of a light Higgs boson signal.
"Some say a three-sigma effect, others do not make explicit claims but talk of a unexpected result."
While media attention has been focusing on the LHC, the Tevatron has been quietly plugging away in the search for the Higgs. In the 27 years since it was first completed (it has been regularly upgraded since then) it has discovered the top quark and observed four different baryons. While it has not been able to pinpoint the elusive Higgs, it has narrowed the search, reducing the window of possible masses where it might be found.
Last year, Fermi physicists said they expected to have enough data to find or rule out the Higgs by early next year, and gave themselves a fifty-fifty chance of finding it before the end of 2010.
The Higgs boson is the last of the particles posited by the standard model of particle physics still to be found. It is said to explain why other particles have mass, and its discovery would confirm the standard model. If its existence is ruled out altogether, then other, previously less popular theories will have to be examined.
New Scientist suggests that more may be known this month, when scientists present their findings at the International Conference on High Energy Physics (ICHEP), which opens in Paris on 22 July.

http://www.telegraph.co.uk/science/large-hadron-collider/7885997/Large-Hadron-Collider-rival-Tevatron-has-found-Higgs-boson.html
Title: Reality behind reality?
Post by: prentice crawford on October 20, 2010, 08:58:08 AM
 More news of other dimensions and worlds:

 http://www.news.yahoo.com/s/nm/20101020/sc_nm/us_science_cern
                    
P.C.
Title: Paging Trickydog
Post by: G M on January 24, 2011, 05:01:19 PM
http://www.wired.com/wiredscience/2011/01/timelike-entanglement/

Quantum Entanglement Could Stretch Across Time

Thoughts?
Title: When is a Theory not a Theory?
Post by: Body-by-Guinness on March 30, 2011, 01:05:56 PM
Physicist Fist-Fight: What’s the Deal with Strings?
Date: March 9, 2011 | Author: Brian Trent
Category: Astronomy/Space Science, General Science, Physics/Mechanics
Can one theory explain everything?

This is the oldest objective of science. We can retrace the question to the pre-Socratics, when Thales in 585 BCE first suggested that everything was ultimately made of water, and that the apparent phases of matter were simply different states of this fundamental water, this blood of the Apsu or ichor of the gods.

What Thales was attempting was a universal theory. We have since moved on from water to the discovery of four fundamental forces: the strong nuclear force, the weak nuclear force, gravity, and electro-magnetism. Between this quartet, our understanding of the universe – however incomplete – hangs both powerfully and delicately. Can they ever be combined into a Grand Unification Theory?

Significantly, electro-magnetism was once deemed to be two separate forces, but James Clerk Maxwell, writing towards the end of the nineteenth century, united them for the scientific world.

Today we have come up with two perfectly wonderful, credible theories to tackle the nature of the universe. We have General Relativity to handle gravity, and we have the Standard Model to handle the other three. As best we can tell, both theories are correct and utterly irreconcilable with each other. It was Albert Einstein’s Holy Grail to reconcile them. He didn’t.



And so the quest continues. Yesterday in New York, seven leading physicists attended the American Museum of Natural History for the 11th annual Isaac Asimov Memorial Debate, and they discussed whether or not it was even possible to come up with the proverbial Theory of Everything.

From the article:

Many physicists say our best hope for a theory of everything is superstring theory, based on the idea that subatomic particles are actually teensy tiny loops of vibrating string. When filtered through the lens of string theory, general relativity and quantum mechanics can be made to get along

 

Ah yes, string theory. The radical discipline which emerged, almost out of defiance, against the scientific establishment and purported to contain the secrets to explaining everything. The hip brand of new thinking that seemed to merge science and mysticism (more on that later.)

As Brian Greene, professor of physics and mathematics at Columbia University, says, “There’s been an enormous amount of progress in string theory. There have been issues developed and resolved that I never thought, frankly, we would be able to resolve. The progress over the last 10 years has only solidified my confidence that this is a worthwhile direction to pursue.”



I will freely admit that I am a fan of superstring theory. How can you look at it and not gape at its fascinating aesthetics? It seems very Zen, the cosmological picture of simple elegance. When physicists smash particles together and get a shower of seemingly endless smaller particles, there is something enticing about a theory that states this endlessly diverse shower is nothing more than different vibration pitches of the same underlying string-like structure. I’ve attended Greene’s lectures and he is articulate and convincing — he has the Sagan touch for communicating ideas in a lucid and affecting manner.

There’s just one little problem:

Superstring theory isn’t a theory.

Since science works on theories and the ability to test them, string theory is a little too radical for its own good. It does not lend itself to being tested. It does not make quantifiable predictions. Without these important elements, it is not science by our prevailing definition. It becomes a thought experiment. A philosophy. Perhaps, even, a religion.

From the article:

Neil deGrasse Tyson, director of the museum’s Hayden Planetarium, suggested that string theory seems to have stalled, and contrasted the lack of progress of “legions” of string theorists with the seemingly short 10 years it took one man – Einstein – to transition from special relativity to general relativity.

“Are you chasing a ghost or is the collection of you just too stupid to figure this out?” deGrasse Tyson teased.



In fact, one of the chief features of string theory is that it requires 11 dimensions to work. The instant we involve higher-dimensional planes, we stray into phantasmagoric territory. The addition of dimensions becomes something like spackle. Tyson's offhand reference to ghosts is probably quite calculated and deliberate.

What do we think? If string theory is philosophy and not science, does this suggest that the tools we use are no longer adequate for investigating the cosmos? Or is it simply a matter of time before we develop those tools?

http://theness.com/roguesgallery/?p=2617
Title: Re: Quantum Entanglement Could Stretch Across Time
Post by: trickydog on March 30, 2011, 08:23:49 PM
Ooooooh - thinking about time travel just hurts the head.  And the funny thing is that the equations treat time and space on the same footing - so why is it that we can imagine teleporting through space so easily, but time travel gets complicated so quickly?

Anyone see the low-budget sci-fi film called Primer?  Available on Netflix - a great film considering it was made for $7,000.  A great film regardless.  But the way it makes your head hurt.... that seems so typical of time travel.

That aside - the thing to keep in mind about that article is that it applies to quantum states, not elephants:

Quote
“You can send your quantum state into the future without traversing the middle time,” said quantum physicist S. Jay Olson of Australia’s University of Queensland, lead author of the new study.

In ordinary entanglement, two particles (usually electrons or photons) are so intimately bound that they share one quantum state — spin, momentum and a host of other variables — between them. One particle always “knows” what the other is doing. Make a measurement on one member of an entangled pair, and the other changes immediately.

So you have to limit your imagination to what you know about quantum states - the Star Trek analogy involving Scotty doesn't apply so readily.
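
For anyone who wants to see the "one particle always knows" correlation concretely, here is a minimal simulation sketch in Python/numpy (my own illustration, not from the article) of repeatedly measuring a two-qubit Bell state:

import numpy as np

rng = np.random.default_rng(0)

# Bell state (|00> + |11>)/sqrt(2), written as a state vector over
# the basis |00>, |01>, |10>, |11>.
psi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)

# Born rule: the probability of each joint outcome is |amplitude|^2.
probs = np.abs(psi) ** 2

# Measure many independently prepared entangled pairs.
for outcome in rng.choice(4, size=8, p=probs):
    a, b = divmod(outcome, 2)   # split the joint outcome into two bits
    print(a, b)                 # always (0, 0) or (1, 1)

Each bit on its own is a fair coin flip; only the correlation between them is fixed. That is also why entanglement alone - timelike or not - can't carry a usable signal.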

What does that get anyone?  That's where the imagination really comes in and I must say fails me admirably.  What kind of technology could you build with it?  Undetectable signals (since they aren't present for the "middle time")?  Feedforward loops (mechanisms that track states in the past in order to synchronize themselves with past events)?  Past-time sensors that send data to a future receiver?

Basically they are quantum "echoes from the past" - kind of like a camera recording current events so they can be replayed later.  But you aren't replaying them, you are receiving the signal directly through time.

Maybe.... you could put sensors in an environment where they wouldn't be able to send a signal or record either (for some reason) and have them instead beam the signal into the future.  For example, measuring state within a device that is exploding (and would overwhelm a signal with EM or destroy a stored signal).... now why would you want to do that?

OK - how about back to quantum encryption?  You encode data and send it into the future - it is effectively gone until that later time when it arrives and can be recaptured.  Then it can be retransmitted forward again and the information disappears.   That might be a good way to hide information.

Really, since you are stuck in quantum terms, the only thing worth considering is information - data - communications and computation.

Anyone else have a better concept for using time entanglement?
Title: Anti-Gravity from Anti-Atoms?
Post by: Body-by-Guinness on May 03, 2011, 04:27:52 PM
Scientists could be months away from discovering antigravity
 
Scientists at CERN have announced that they've been able to trap 309 atoms of antihydrogen for over 15 minutes. This is long enough that soon, they'll be able to figure out whether antimatter obeys the law of gravity, or whether it's repelled by normal matter and falls "up" instead. It would be antigravity, for real.

While it's never been tested experimentally due to how difficult it is to create and store the stuff, it's disappointingly likely that antimatter will fall "down" just like regular matter. The thinking behind this is that antimatter (despite the "anti-") carries ordinary positive mass-energy, and even though it has opposite charge, it should still obey the same general rules as matter does. Antimatter falling up would mean a violation of the law of conservation of energy, among other things.

That said, if antimatter were to exhibit antigravity, it would go a long way towards explaining some of the peculiarities of our universe. For example, the universe should have been born with just as much antimatter as matter, but we don't know where the antimatter went. If antimatter and normal matter repelled each other, it could mean that there are entire antimatter galaxies out there. Also, that repulsion would explain why the universe is not just expanding, but speeding up its expansion, something that's tricky to figure out when everything in the universe is always attracted towards everything else.

In either case, the team at CERN should be able to put the debate to rest within a couple of months, when they plan to trap a blob of antihydrogen and then just watch it to see which way it falls. Down, and the laws of physics stay in place. Up, and you might just get that hoverboard you've always wanted.
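
Just to put a number on the "watch which way it falls" idea, here is a back-of-the-envelope Python sketch with invented figures (the drop distance below is hypothetical, not CERN's):

import math

g = 9.81   # m/s^2; if antimatter really has antigravity, flip the sign
d = 0.10   # m; hypothetical drop distance inside the apparatus

# Free fall from rest: d = (1/2) * g * t^2, so t = sqrt(2 * d / g)
t = math.sqrt(2 * d / g)
print(f"fall time over {d * 100:.0f} cm: {t * 1000:.0f} ms")  # ~143 ms

The kinematics is trivial; the hard part is getting anti-atoms cold enough that their thermal jostling doesn't swamp a gravitational drift this slow.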

http://dvice.com/archives/2011/05/does-antimatter.php
Title: Safer Ports through Physics
Post by: Body-by-Guinness on May 05, 2011, 08:31:58 AM
Physics for safer ports: New technology uses nuclear 'fingerprints' to scan cargo ships
April 29th, 2011 in Technology / Engineering   


The Port of Savannah is the fourth largest container port in the United States, importing hundreds of large metal boxes from cargo ships shown here. Credit: Georgia Department of Economic Development.
While 700 million travelers undergo TSA's intrusive scans and pat-downs each year, 11 million cargo containers enter American ports with little screening at all. And those containers, roughly 590 Empire State Buildings' worth of cargo, could hold something even worse than box knives or exploding shoes, namely nuclear weapons.
Two teams of North Carolina physicists are mapping the intricacies of the atomic nucleus, which could provide better security at the ports. The scientists have identified new "fingerprints" of nuclear materials, such as uranium and plutonium. The fingerprints would be used in new cargo scanners to accurately and efficiently identify suspicious materials. The physics might also be used to improve analysis of spent nuclear fuel rods, which are a potential source of bomb-making materials.
The problem starts at ports, where terrorists may try to smuggle an entire dirty bomb or even smaller amounts of plutonium or uranium by hiding it within the mountains of cargo that pass into the country each day. Cargo scanners using the new nuclear fingerprints would be sensitive enough to spot an entire bomb or the smaller parts to build one, according to Mohammad Ahmed, a nuclear physicist at Duke University.
Ahmed and his colleagues are developing the fingerprints for the next-generation detectors with HIGS, the High Intensity Gamma-Ray Source. It is the world's most intense and tunable source of polarized gamma rays and is located on Duke's campus as part of the Triangle Universities Nuclear Laboratory. HIGS produces gamma rays that are guided to collide with target materials, causing a variety of nuclear reactions.
In the reaction Ahmed and his Duke colleagues study, the collision creates a spray of particles, which fly into a group of detectors. The detectors count the number of neutrons knocked from the atomic nuclei of the target material in either a parallel or perpendicular direction, compared to the polarization plane of the gamma-ray beam. Dividing the number of neutrons emitted parallel to the plane by the number emitted perpendicular yields a ratio distinct to each material, giving it a unique fingerprint.
Ahmed said these fingerprints could eventually be used to distinguish special nuclear materials, like weapons-grade uranium, from naturally occurring uranium or ordinary objects such as clothing or granite countertops, distinctions that current port scanners cannot make.
In a separate but related project, nuclear physicists from three North Carolina universities are slamming the HIGS beam into atomic nuclei and observing the energy pattern and distribution of the gamma rays that fluoresce back out of the collision. Each material has a distinct fluorescence pattern based on its nuclear structure, according to physicist Calvin Howell, who leads the Duke group.

New neutron "fingerprints" discovered with polarized gamma-rays at Duke could be the foundation for new port security scanners. Graphic: Ashley Yeager, Duke
Howell and his collaborators are studying the fluorescence patterns of potentially dangerous nuclear materials and non-nuclear contraband such as explosives and drugs. They are also identifying the patterns of steel and lead because terrorists can use the metals to conceal and ship weapon-making materials.
The two anti-terrorism projects were developed with the support of the Department of Homeland Security's Domestic Nuclear Detection Office, or DNDO. The agency awarded Ahmed and his colleagues a $2 million grant, while Howell and his collaborators received grants totaling $2 million. DNDO is funding both projects in response to the SAFE (Security and Accountability For Every) Port Act of 2006, which requires security agents to scan for nuclear materials in all of the containers entering the United States through the nation's 22 busiest ports.
Five years after Congress and the president approved the legislation, the equipment to satisfy this mandate still doesn't exist. Meanwhile, the United States transfers about 20 percent of the world's freight across its borders and has more than 300 maritime ports for sea containers and an additional 300 access points, such as border crossings, where dangerous materials might enter the country.
The Duke scientists say their use of polarized gamma-ray beams could one day help satisfy the SAFE policy, and they are building the fingerprint library to make it happen.
The HIGS data show, for example, that a precisely tuned gamma beam at 6 MeV causes weapons-grade uranium, U-235, to emit one neutron parallel to the polarization plane for each neutron emitted perpendicular to the plane, giving the material a neutron fingerprint of one.
Naturally occurring uranium, U-238, emits three parallel neutrons for every one emitted perpendicular to the polarization plane of the beam, giving it a neutron fingerprint of three.
Beryllium, which can also be a neutron source in nuclear weapons, has a neutron fingerprint of 10. The team is now measuring the neutron fingerprints of plutonium and other fissile materials, Ahmed said.
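
To make the fingerprint idea concrete, here is a hypothetical sketch in Python (the ratios 1, 3 and 10 are the ones reported in the article; the detector counts and matching tolerance are invented for illustration):

# Neutron fingerprints from the article: the ratio of neutrons emitted
# parallel vs. perpendicular to the gamma beam's polarization plane.
FINGERPRINTS = {
    1.0: "U-235 (weapons-grade uranium)",
    3.0: "U-238 (naturally occurring uranium)",
    10.0: "beryllium",
}

def identify(n_parallel, n_perpendicular, tolerance=0.2):
    """Match a measured parallel/perpendicular ratio against the library."""
    ratio = n_parallel / n_perpendicular
    for reference, material in FINGERPRINTS.items():
        if abs(ratio - reference) / reference <= tolerance:
            return material
    return "no match - flag container for inspection"

# Hypothetical counts from scanning one container at the 6 MeV beam setting:
print(identify(n_parallel=1480, n_perpendicular=1510))  # ratio ~1.0 -> U-235

A real scanner would of course need background subtraction and statistical error handling, but the core classification step really is a ratio lookup like this.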
Howell and his collaborators, meanwhile, work at lower energies on HIGS, about 3 MeV. (Surgeons, for comparison, use a "Gamma Knife" at roughly 1 MeV to treat brain tumors.) Their team has already identified the fluorescence patterns of several special nuclear materials and lead.
Both teams will report their results at a meeting with DNDO officials on Thursday, April 28 in Washington D.C. and will store their results in a nuclear identification database.
Ahmed and Howell said that engineers at one private security company and scientists at U.S. national laboratories have already begun using the database to design new port security scanners.
The new detectors will search cargo for the fingerprints using an electron accelerator, possibly coupled to lasers that produce a finely tuned gamma-ray beam, said Craig Wuest of the Global Security Principal Directorate at Lawrence Livermore National Laboratory (LLNL).
The design sounds complex, but in some ways it resembles medical scanning equipment and appears promising to pursue, he said.
Howell's "nuclear resonance fluorescence" approach is interesting because it uses a beam with lower-energy gamma rays and reduces the potential irradiation and contamination of cargo while providing "sufficient detection sensitivity," Wuest, who was not involved in the research, added.
One of Wuest's colleagues at LLNL, nuclear physicist Dennis McNabb, is more intrigued with Ahmed's and Weller's technique. Scientists are only just beginning to measure the fingerprints and background signatures from this neutron-scattering process, and because "the research is in progress, how to best use the data is still an open question," McNabb said.
He also explained that cargo scanners using the data from both teams could be ready for use at ports in about 10 years.
Still, some scientists question whether the emerging science and technology can mature fast enough to meet the real-world threats of terrorists and dirty bombs. For instance, Thomas Cochran, a physicist and senior scientist at the Natural Resources Defense Council, voiced "serious doubts" and said the government should focus instead on eliminating inventories of highly enriched uranium, improving port security, boosting intelligence efforts and training first responders.
Other experts disagree and are urging the government to accelerate research on new science and technologies that could significantly reduce the threat of nuclear weapons smuggling, which seems likely to persist into the next decade. McNabb, a proponent, said "it takes time to develop new technologies" and suggested that the research may accelerate development in other areas of nuclear security.
The new information from HIGS could improve analysis of spent nuclear fuel rods, which are an environmental issue as well as a potential source of bomb materials, according to Duke physicist Anton Tonchev.
He works on the nuclear resonance fluorescence project with Howell and said the technique provides a nondestructive way to measure the quantities of plutonium and other nuclear materials that remain after the rods are removed from a nuclear reactor.
Currently, the spent fuel rods must be opened and tested to assess what materials remain in them. The process is expensive, but critical for the International Atomic Energy Agency to accurately calculate the amount of leftover fissile and nuclear materials. McNabb and Tonchev said that a new technique to distinguish the leftover U-235, U-238 and plutonium in the spent rods without opening them could substantially lower the costs to manage and account for nuclear waste to prevent nuclear proliferation by terrorists.
Regardless of how fast engineers turn the fingerprint data and new approaches into workable scanning and nuclear fuel devices, the Duke scientists said there is immediate value in compiling a robust database of both the neutron and nuclear resonance fluorescence fingerprints. Government officials at the DNDO concur and cite HIGS as the only facility with the ability to produce such a database, according to Ahmed.
Because of the demand, the physicists have recruited graduate and undergraduate students from Duke, University of North Carolina, North Carolina Agricultural and Technical State University, North Carolina Central University, James Madison University and George Washington University to help with the effort. They especially encourage students from historically black colleges and universities to participate, hoping the effort will help broaden the diversity of nuclear physicists working to identify new ways to curb the threat of future terror attacks.
Provided by Duke University

http://www.physorg.com/news/2011-04-physics-safer-ports-technology-nuclear.html
Title: Re: Physics
Post by: Crafty_Dog on May 12, 2011, 06:03:08 AM

"Cargo scanners using the new nuclear fingerprints would be sensitive enough to spot an entire bomb or the smaller parts to build one, according to Mohammad Ahmed, a nuclear physicist at Duke University."

Nice work Mohammad!

"They especially encourage students from historically black colleges and universities to participate, hoping the effort will help broaden the diversity of nuclear physicists working to identify new ways to curb the threat of future terror attacks."

Great  :-P  Affirmative action in physics :-P :roll:

Title: Re: Physics
Post by: prentice crawford on May 17, 2011, 01:16:36 AM
Woof,
 I wonder :roll: would a program like that help short whites bring some diversity to pro basketball teams?
                                          P.C.
Title: reality of the universe beyond human comprehension
Post by: ccp on June 18, 2011, 08:13:28 AM
Good article in Scientific American with an interview of this fellow.  After studying black holes and helping invent string theory, he comes to the conclusion that human minds are totally incapable of truly understanding the universe.  I can't pull up the article, but this is the interesting fellow who is probably more interesting than the celebrated Stephen Hawking.  Then again I am a mere mortal compared to any of these people:

***Leonard Susskind
From Wikipedia, the free encyclopedia

Born: 1940,[1] South Bronx, New York City, New York, USA
Residence: USA
Nationality: USA
Fields: Physics
Institutions: Yeshiva University; University of Tel Aviv; Stanford University; Korea Institute for Advanced Study
Alma mater: City College of New York; Cornell University
Doctoral advisor: Peter A. Carruthers
Known for: Holographic principle; string theory landscape; quark confinement; Hamiltonian lattice gauge theory
Notable awards: American Institute of Physics' Science Writing Award; Sakurai Prize (1998); Boris Pregel Award, New York Academy of Science (1975)[2]
Notes: Atheist[3]
Leonard Susskind (born 1940)[1] is the Felix Bloch Professor of Theoretical Physics at Stanford University. His research interests include string theory, quantum field theory, quantum statistical mechanics and quantum cosmology.[2] He is a member of the National Academy of Sciences,[4] and the American Academy of Arts and Sciences,[5] an associate member of the faculty of Canada's Perimeter Institute for Theoretical Physics,[6] and a distinguished professor of the Korea Institute for Advanced Study.[7] Susskind is widely regarded as one of the fathers of string theory,[8] having, with Yoichiro Nambu and Holger Bech Nielsen, independently introduced the idea that particles could in fact be states of excitation of a relativistic string.[9] He was the first to introduce the idea of the string theory landscape in 2003.[10] In 1998, Susskind was awarded the J.J. Sakurai Prize for his "pioneering contributions to hadronic string models, lattice gauge theories, quantum chromodynamics, and dynamical symmetry breaking." Susskind's hallmark, according to colleagues, has been the application of "brilliant imagination and originality to the theoretical study of the nature of the elementary particles and forces that make up the physical world."[11]


Early life and education
Susskind was born to a poor Jewish family from the South Bronx section of New York City,[12] and now resides in Palo Alto, California. He began working as a plumber at the age of 16, taking over for his father who had become ill.[12] Later, he enrolled in the City College of New York as an engineering student, graduating with a B.S. in physics in 1962.[5] In an interview in the Los Angeles Times, Susskind recalls the moment he discussed with his father this change in career path: "When I told my father I wanted to be a physicist, he said, ‘Hell no, you ain’t going to work in a drug store.’ I said no, not a pharmacist. I said, ‘Like Einstein.’ He poked me in the chest with a piece of plumbing pipe. ‘You ain’t going to be no engineer,’ he said. ‘You’re going to be Einstein.’"[12] Susskind then studied at Cornell University under Peter A. Carruthers where he received his Ph.D. in 1965. He has been married twice, first in 1960,[5] and has four children.

Career
Susskind was an Assistant Professor of Physics, then an Associate Professor, at Yeshiva University (1966–1970), spent a year at the University of Tel Aviv (1971–72), and then returned to Yeshiva as Professor of Physics until 1979. Since 1979 he has been Professor of Physics at Stanford University,[13] and since 2000 has held the Felix Bloch Professorship of Physics.

In 2007, Susskind joined the Faculty of Perimeter Institute for Theoretical Physics in Waterloo, Ontario, Canada, as an Associate Member. He has been elected to the National Academy of Sciences and the American Academy of Arts and Sciences, and was awarded the 1998 Sakurai Prize for theoretical physics. He is also a distinguished professor at Korea Institute for Advanced Study.[14]

Scientific career
Susskind was one of at least three physicists who independently discovered during or around 1970 that the Veneziano dual resonance model of strong interactions could be described by a quantum mechanical model of strings,[15] and was the first to propose the idea of the string theory landscape. Susskind has also made contributions in the following areas of physics:

The independent discovery of the string theory model of particle physics
The theory of quark confinement[16]
The development of Hamiltonian lattice gauge theory[17]
The theory of scaling violations in deep inelastic electroproduction
The theory of symmetry breaking sometimes known as "technicolor theory"[18]
The second, yet independent, theory of cosmological baryogenesis[19] (Sakharov's work was first, but was mostly unknown in the Western hemisphere.)
String theory of black hole entropy[20]
The principle of black hole complementarity[21]
The causal patch hypothesis
The holographic principle[22]
M-theory, including development of the BFSS matrix model [23]
Kogut-Susskind fermions
Introduction of holographic entropy bounds in physical cosmology
The idea of an anthropic string theory landscape[24]
Development of String Theory
The story goes that "In 1970, a young physicist named Leonard Susskind got stuck in an elevator with Murray Gell-Mann, one of physics' top theoreticians, who asked him what he was working on. Susskind said he was working on a theory that represented particles 'as some kind of elastic string, like a rubber band.' Gell-Mann responded with loud, derisive laughter."[25]

Books
Susskind is the author of two popular science books, The Cosmic Landscape: String Theory and the Illusion of Intelligent Design[26] published in 2005, and The Black Hole War: My battle with Stephen Hawking to make the world safe for quantum mechanics[27] published in 2008.

The Cosmic Landscape
Main article: The Cosmic Landscape
The Cosmic Landscape: String Theory and the Illusion of Intelligent Design is Susskind's first popular science book, published by Little, Brown and Company on December 12, 2005.[26] It is Susskind's attempt to bring his idea of the anthropic landscape of string theory to the general public. In the book, Susskind describes how the string theory landscape was an almost inevitable consequence of several factors, one of which was Steven Weinberg's prediction of the cosmological constant in 1987. The question addressed here is why our universe is fine-tuned for our existence. Susskind explains that Weinberg calculated that if the cosmological constant were just a little larger, galaxies, and hence observers like us, could never have formed.

The Black Hole War
Main article: Susskind-Hawking battle
The Black Hole War: My Battle with Stephen Hawking to Make the World Safe for Quantum Mechanics is Susskind's second popular science book, published by Little, Brown, and Company on July 7, 2008.[27] The book is his most famous work and explains what he thinks happens to the information and matter stored in a black hole when it evaporates. The book grew out of a debate that started in 1981, at a meeting where physicists were discussing fundamental puzzles of particle physics. During this discussion Stephen Hawking stated that the information inside a black hole is lost forever as the black hole evaporates. It took Leonard Susskind 28 years to formulate the theory that would prove Hawking wrong, which he then published in this book. Like The Cosmic Landscape, The Black Hole War is aimed at the lay reader. He writes: "The real tools for understanding the quantum universe are abstract mathematics: infinite dimensional Hilbert spaces, projection operators, unitary matrices and a lot of other advanced principles that take a few years to learn. But let's see how we do in just a few pages."

Lectures
A full series of lecture courses by Susskind on the essential theoretical foundations of modern physics is available on the iTunes platform from "Stanford on iTunes" and on YouTube from "StanfordUniversity's Channel". These lectures are intended for the general public as well as students. The following courses are available:

Modern Physics: The Theoretical Minimum
1 Classical Mechanics (Fall 2007) iTunes YouTube
2 Quantum Mechanics (Winter 2008) iTunes YouTube
3 Special Relativity and Classical Field Theory (Spring 2008) iTunes YouTube
4 Einstein's General Theory of Relativity (Fall 2008) iTunes YouTube
5 Cosmology (Winter 2009) iTunes YouTube
6 Statistical Mechanics (Spring 2009) iTunes YouTube
Particle Physics: 1 Basic Concepts (Fall 2009) iTunes YouTube
Particle Physics: 2 Standard Model (Winter 2010) iTunes YouTube
Particle Physics: 3 Supersymmetry, Grand Unification, String Theory (Spring 2010) iTunes
String Theory and M-Theory (Winter 2011) iTunes YouTube
A separate series of lectures on Quantum Mechanics and Special Relativity
Quantum Entanglements Part 1 (Fall 2006) iTunes YouTube
Quantum Entanglements Part 2 (Not available online)
Quantum Entanglements Part 3 (Spring 2007) iTunes YouTube
(Note that some of the lecture names are a little mixed-up: "Quantum Entanglements Part 3" is in fact a lecture series on special relativity, and the order in which the lectures were given is 1, 4, 5, 6, 7, 2&3, 8 and 9 (in terms of the numbers given on the videos); There is no mention of string theory in the series "Supersymmetry, Grand Unification, String Theory.")

Smolin-Susskind Debate
The Smolin-Susskind debate refers to the series of intense postings in 2004 between Lee Smolin and Susskind, concerning Smolin’s argument that the "Anthropic Principle cannot yield any falsifiable predictions, and therefore cannot be a part of science."[28] It began on July 26, 2004, with Smolin's publication of "Scientific alternatives to the anthropic principle". Smolin e-mailed Susskind asking for a comment. Not having had the chance to read the paper, Susskind requested a summary of its arguments. Smolin obliged, and on July 28, 2004, Susskind responded, saying that the logic Smolin followed "can lead to ridiculous conclusions".[28] The next day, Smolin responded, saying that "If a large body of our colleagues feels comfortable believing a theory that cannot be proved wrong, then the progress of science could get stuck, leading to a situation in which false, but unfalsifiable theories dominate the attention of our field." This was followed by another paper by Susskind which made a few comments about Smolin's theory of "cosmic natural selection".[29] The Smolin-Susskind debate finally ended with each of them agreeing to write a final letter which would be posted on Edge, with three conditions attached: (1) No more than one letter each; (2) Neither sees the other's letter in advance; (3) No changes after the fact.

Although the exchanges ended in 2004, the animosity remains. In 2006, Susskind criticized Smolin as a "mid-level theoretical physicist" whose "popular book-writing activities and the related promotional hustling have given him a platform high above that merited by his physics accomplishments."[30]

See also
Superstring theory
Quantum chromodynamics
Supersymmetry
Susskind-Glogower operator
List of theoretical physicists
Kogut-Susskind fermions
Fischler-Susskind mechanism
Boris Pregel
References
1. His 60th birthday was celebrated with a special symposium at Stanford University on May 20–21, 2000.[1]
2. Faculty information sheet, Stanford University, http://www.stanford.edu/dept/physics/people/faculty/susskind_leonard.html, retrieved 2009-09-01
3. Life in a landscape of possibilities
4. 60 New Members Chosen by Academy, National Academy of Sciences (press release), May 2, 2000, http://www8.nationalacademies.org/onpinews/newsitem.aspx?RecordID=05022000, retrieved 2009-09-01
5. Edge.org, Leonard Susskind - A Biography (last accessed August 12, 2007).
6. [2]
7. [3]
8. NYAS Publication, A Walk Across the Landscape
9. [4]
10. [5]
11. [6]
12. "Leonard Susskind discusses duel with Stephen Hawking", LA Times, July 26, 2008
13. http://www.stanford.edu/dept/physics/people/faculty/susskind_leonard.html
14. Welcome To Kias
15. String Theory: The Early Years, John H. Schwarz, 2000
16. L. Susskind, Lattice Models Of Quark Confinement At High Temperature, Phys. Rev. D20 (1979) 2610.
17. J. Kogut and L. Susskind, Phys. Rev. D 11, 395 (1975).
18. Review of Particle Physics (W.-M. Yao et al., J. Phys. G 33, 1 (2006)); the Dynamical Electroweak Symmetry Breaking section cites two 1979 publications, one by Steven Weinberg, the other by L. Susskind, as the earliest models with technicolor and technifermions.[7]
19. Biography at APS J. J. Sakurai Prize website (last accessed August 12, 2007)
20. L. Susskind, RU-93-44, hep-th/9309145.
21. L. Susskind, Phys. Rev. Lett. 71, 2368 (1993). String theory and the principle of black hole complementarity
22. "The insistence on unitarity in the presence of black holes led 't Hooft (1993) and Susskind (1995b) to embrace a more radical, holographic interpretation of ..." - The Holographic Principle, Raphael Bousso, Rev. Mod. Phys. 74 (2002) 825-874. [8]
23. T. Banks, W. Fischler, S. H. Shenker, and L. Susskind, M Theory as a Matrix Model: A Conjecture, Phys. Rev. D55 (1997) 5112–5128, hep-th/9610043.
24. L. Susskind, arXiv:hep-th/0302219
25. [9]
26. L. Susskind (2005), The Cosmic Landscape: String Theory and the Illusion of Intelligent Design, Little, Brown, ISBN 0316155799
27. L. Susskind (2008), The Black Hole War: My Battle with Stephen Hawking to Make the World Safe for Quantum Mechanics, Little, Brown, ISBN 0-316-01640-3 [10]
28. Smolin vs. Susskind: The Anthropic Principle, Edge Institute, August 2004, http://edge.org/3rd_culture/smolin_susskind04/smolin_susskind.html, retrieved 2009-09-01
29. http://cohesion.rice.edu/CampusServices/OWeek/emplibrary/letterfromleonardsusskind.pdf
30. Leonard Susskind (25 August 2006), Hold fire! This epic vessel has only just set sail..., Times Higher Education, http://www.timeshighereducation.co.uk/story.asp?storyCode=204991&sectioncode=26, retrieved 2009-09-01
Further reading
Chown, Marcus, "Our world may be a giant hologram", New Scientist, 15 January 2009, magazine issue 2691. "The holograms you find on credit cards and banknotes are etched on two-dimensional plastic films. When light bounces off them, it recreates the appearance of a 3D image. In the 1990s physicists Leonard Susskind and Nobel prizewinner Gerard 't Hooft suggested that the same principle might apply to the universe as a whole. Our everyday experience might itself be a holographic projection of physical processes that take place on a distant, 2D surface."
External links
Wikiquote has a collection of quotations related to Leonard Susskind
Leonard Susskind's Homepage (Stanford University)
The Edge:
"Interview with Leonard Susskind."
"Smolin vs. Susskind: The Anthropic Principle" Susskind and Lee Smolin debate the Anthropic principle
Radio Interview from This Week in Science March 14, 2006 Broadcast
"Father of String Theory Muses on the Megaverse": Podcast.
Leonard Susskind at the Internet Movie Database
The Cosmic Landscape book discussion at The Commonwealth Club, February 2007
The Black Hole War book discussion at The Commonwealth Club, July 2008
Leonard Susskind: My friend Richard Feynman - A Ted talk***
Title: Physics in one minute
Post by: Crafty_Dog on September 15, 2011, 03:56:30 PM


http://www.theblaze.com/stories/minute-physics-student-explains-tough-science-using-time-lapsed-drawing/
Title: faster than speed of light?
Post by: ccp on September 22, 2011, 12:28:01 PM
 :? :-o

 A fundamental pillar of physics — that nothing can go faster than the speed of light — appears to be smashed by an oddball subatomic particle that has apparently made a giant end run around Albert Einstein's theories.

Scientists at the world's largest physics lab said Thursday they have clocked neutrinos traveling faster than light. That's something that according to Einstein's 1905 special theory of relativity — the famous E=mc2 equation — just doesn't happen.

"The feeling that most people have is this can't be right, this can't be real," said James Gillies, a spokesman for the European Organization for Nuclear Research, or CERN, outside the Swiss city of Geneva.

Gillies told The Associated Press that the readings have so astounded researchers that they are asking others to independently verify the measurements before claiming an actual discovery.

"They are inviting the broader physics community to look at what they've done and really scrutinize it in great detail, and ideally for someone elsewhere in the world to repeat the measurements," he said Thursday.

Scientists at the competing Fermilab in Chicago have promised to start such work immediately.

"It's a shock," said Fermilab head theoretician Stephen Parke, who was not part of the research in Geneva. "It's going to cause us problems, no doubt about that - if it's true."

The Chicago team had similar faster-than-light results in 2007, but those came with a giant margin of error that undercut its scientific significance.

Outside scientists expressed skepticism at CERN's claim that the neutrinos — one of the strangest well-known particles in physics — were observed smashing past the cosmic speed barrier of 186,282 miles per second (299,792 kilometers per second).

University of Maryland physics department chairman Drew Baden called it "a flying carpet," something that was too fantastic to be believable.

CERN says a neutrino beam fired from a particle accelerator near Geneva to a lab 454 miles (730 kilometers) away in Italy arrived 60 nanoseconds sooner than light would have. Scientists calculated the margin of error at just 10 nanoseconds, making the difference statistically significant. But given the enormous implications of the find, they still spent months checking and rechecking their results to make sure there were no flaws in the experiment.
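
The scale of the claim is easy to check with back-of-the-envelope arithmetic. A quick Python sketch using the figures as reported (rounding is mine):

c = 299_792_458.0      # speed of light, m/s
distance = 730_000.0   # CERN to Gran Sasso baseline, m (about 454 miles)
early = 60e-9          # neutrinos arrived 60 nanoseconds early, s

light_time = distance / c       # time light needs for the trip
excess = early / light_time     # fractional speed excess
print(f"light travel time: {light_time * 1e3:.3f} ms")   # ~2.435 ms
print(f"(v - c)/c is roughly {excess:.1e}")              # ~2.5e-05

So the neutrinos were allegedly about 25 parts per million faster than light: a tiny edge in relative terms, but six times the quoted 10-nanosecond margin of error, which is why the result caused such a stir.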

"We have not found any instrumental effect that could explain the result of the measurement," said Antonio Ereditato, a physicist at the University of Bern, Switzerland, who was involved in the experiment known as OPERA.

The CERN researchers are now looking to the United States and Japan to confirm the results.

A similar neutrino experiment at Fermilab near Chicago would be capable of running the tests, said Stavros Katsanevas, the deputy director of France's National Institute for Nuclear and Particle Physics Research. The institute collaborated with Italy's Gran Sasso National Laboratory for the experiment at CERN.

Katsanevas said help could also come from the T2K experiment in Japan, though that is currently on hold after the country's devastating March 11 earthquake and tsunami.

Scientists agree if the results are confirmed, that it would force a fundamental rethink of the laws of nature.

Einstein's special relativity theory that says energy equals mass times the speed of light squared underlies "pretty much everything in modern physics," said John Ellis, a theoretical physicist at CERN who was not involved in the experiment. "It has worked perfectly up until now."

He cautioned that the neutrino researchers would have to explain why similar results weren't detected before, such as when an exploding star — or supernova — was observed in 1987.

"This would be such a sensational discovery if it were true that one has to treat it extremely carefully," said Ellis.

Source: Yahoo News
Title: Re: speed of light broken
Post by: trickydog on September 23, 2011, 07:25:20 AM
The media needs to take a cold bath.

As has been said clearly, the result is reproducible and therefore "puzzling" - and so the group at CERN have offered up their data to other groups to confirm (or deny).  The natural course of science.  The speed of light limit is a long-standing principle (if you call 100 years "long") and represents a significant challenge to the scientific status quo.  But note that nothing will fall apart or become more or less true should this result stand - even if it does turn out that neutrinos are "breaking the law", like so many revelations in scientific discovery it will simply mean developing newer and better models that account for everything we have seen to date plus this.

This is currently in the realm of "science fiction" - or at least "science speculation".

In gravitation, Newton was proved "wrong".  Then Einstein.  And I'm sure one day Hawking will also fall to the latest and best description of gravitation.  We all know that we haven't been able to include gravitation properly into the unified theory.  It may be because there is some subtlety about space-time that we haven't accounted for.  This may help us uncover it.

It's exciting frankly.  I don't know why the media portrays it as threatening.  No scientist thinks that they know the ultimate truth about life the universe and everything.  Science is a series of models - not some stranglehold on truth about reality.  Any scientist who believes otherwise should be shown the door.  In fact, challenging existing canon is a natural consequence of what every critical thinking scientist should be doing - albeit not gratuitously.  It's good science (remember, this is not a religion, despite our tendencies).

In this circumstance, unambiguously, the first position taken is that they must be making systematic errors and they want another group to check it independently.  Good scientific process.  Until that is done, this is like a UFO sighting - possibly something revolutionary but much more likely something quite banal and well understood - a mistake in scientific preparation and/or analysis.  Occam's Razor at work.

The media is like some crazed terrier who goes barking madly up and down the hallway every time it hears a floorboard creak.  You need a new dog.

Update:  From the UK Guardian http://www.guardian.co.uk/science/2011/sep/23/physicists-speed-light-violated :

Quote
Professor Jim Al-Khalili at the University of Surrey said it was most likely that something was skewing the results. "If the neutrinos have broken the speed of light, it would overturn a keystone theory from the last century of physics. That's possible, but it's far more likely that there is an error in the data. So let me put my money where my mouth is: if the Cern experiment proves to be correct and neutrinos have broken the speed of light, I will eat my boxer shorts on live TV."

Update:  XKCD skewers the issue

XKCD http://xkcd.com/
Title: WSJ agrees with Tricky Dog
Post by: Crafty_Dog on September 26, 2011, 05:58:59 PM


By MICHIO KAKU
Einstein wrong? Impossible!

That was the reaction of physicists around the world last week when they heard that experiments in Switzerland indicate that Einstein's theory of relativity might be wrong. Since 1905, when Einstein declared that nothing in the universe could travel faster than light, the theory has been the bedrock of modern physics. Indeed, most of our high-tech wizardry depends on it.

Of course, crackpots have been denouncing Einstein's theory of relativity for years. Like many physicists, I have boxes full of self-published monographs that were mailed to me from people who claim that Einstein was wrong. In the 1930s the Nazi Party criticized Einstein's theory, publishing a book called "100 Authorities Denounce Relativity." Einstein later quipped that you don't need 100 famous intellectuals to disprove his theory. All you need is one simple fact.

Well, that simple fact may be in the form of the latest experiments at the largest particle accelerators in the world, based at CERN, outside Geneva. Physicists fired a beam of neutrinos (exotic, ghost-like particles that can penetrate even the densest of materials) from Switzerland to Italy, over a distance of 454 miles. Much to their amazement, after analyzing 15,000 neutrinos, they found that they traveled faster than the speed of light—60 billionths of a second faster, to be precise. In a billionth of a second, a beam of light travels about one foot. So a difference of 60 feet was quite astonishing.

Cracking the light barrier violated the core of Einstein's theory. According to relativity, as you approach the speed of light, time slows down, you get heavier, and you also get flatter (all of which have been measured in the lab). But if you go faster than light, then the impossible happens. Time goes backward. You are lighter than nothing, and you have negative width. Since this is ridiculous, you cannot go faster than light, said Einstein.
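
The "ridiculous" consequences Kaku lists all flow from one expression: the Lorentz factor, gamma = 1/sqrt(1 - v^2/c^2), which governs time dilation and mass increase, diverges at v = c, and turns imaginary beyond it. A quick illustrative Python sketch (mine, not Kaku's):

import cmath

def gamma(beta):
    """Lorentz factor for speed v = beta * c (complex sqrt on purpose)."""
    return 1 / cmath.sqrt(1 - beta ** 2)

for beta in (0.5, 0.9, 0.99, 0.999):
    print(f"v = {beta}c: time slows by a factor of {gamma(beta).real:.2f}")

# Past the speed of light the factor is no longer a real number at all:
print(gamma(1.001))  # purely imaginary - the "negative width" absurdity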

A part of the OPERA detector experiment to measure neutrinos.
The CERN announcement was electrifying. Some physicists burst out with glee, because it meant that the door was opening to new physics (and more Nobel Prizes). New, daring theories would need to be proposed to explain this result. Others broke out in a cold sweat, realizing that the entire foundation of modern physics might have to be revised. Every textbook would have to be rewritten, every experiment recalibrated.

Cosmology, the very way we think of space, would be forever altered. The distance to the stars and galaxies and the age of the universe (13.7 billion years) would be thrown in doubt. Even the expanding universe theory, the Big Bang theory, and black holes would have to be re-examined.

Moreover, everything we think we understand about nuclear physics would need to be reassessed. Every school kid knows Einstein's famous equation E=mc2, where a small amount of mass m can create a vast amount of energy E, because the speed of light c squared is such a huge number. But if c is off, it means that all nuclear physics has to be recalibrated. Nuclear weapons, nuclear medicine and radioactive dating would be affected because all nuclear reactions are based on Einstein's relation between matter and energy.

(Related video: Michio Kaku, theoretical physics professor at City College of New York, discusses the implications of a recent experiment that undercuts Einstein's theory of relativity.)

If all this wasn't bad enough, it would also mean that the fundamental principles of physics are incorrect. Modern physics is based on two theories, relativity and the quantum theory, so half of modern physics would have to be replaced by a new theory. My own field, string theory, is no exception. Personally, I would have to revise all my theories because relativity is built into string theory from the very beginning.

How will this astonishing result play out? As Carl Sagan once said, remarkable claims require remarkable proof. Laboratories around the world, like Fermilab outside Chicago, will redo the CERN experiments and try to falsify or verify their results.

My gut reaction, however, is that this is a false alarm. Over the decades, there have been numerous challenges to relativity, all of them proven wrong. In the 1960s, for example, physicists were measuring the tiny effect of gravity upon a light beam. In one study, physicists found that the speed of light seemed to oscillate with the time of day. Amazingly, the speed of light rose during the day, and fell at night. Later, it was found that, since the apparatus was outdoors, the sensors were affected by the temperature of daylight.

Reputations may rise and fall. But in the end, this is a victory for science. No theory is carved in stone. Science is merciless when it comes to testing all theories over and over, at any time, in any place. Unlike religion or politics, science is ultimately decided by experiments, done repeatedly in every form. There are no sacred cows. In science, 100 authorities count for nothing. Experiment counts for everything.

Mr. Kaku, a professor of theoretical physics at City College of New York, is the author of "Physics of the Future: How Science Will Shape Human Destiny and Our Daily Lives by the Year 2100" (Doubleday, 2011).

Title: About tachyons (particles that travel faster than c)
Post by: trickydog on September 26, 2011, 06:41:04 PM
A bartender says "Hey!  We don't serve tachyons in this establishment".
Two tachyons walk into a bar.
Title: Re: About tachyons (particles that travel faster than c)
Post by: G M on September 26, 2011, 06:42:27 PM
A bartender says "Hey!  We don't serve tachyons in this establishment".
Two tachyons walk into a bar.

 :-D
Title: Re: Physics
Post by: DougMacG on September 26, 2011, 09:18:25 PM
"one 60-billionth of a second faster"..."My gut reaction, however, is that this is a false alarm. Over the decades, there have been numerous challenges to relativity, all of them proven wrong. "
-----
Interesting stuff.  I recall an experiment 10-12 years ago where they also allegedly made light travel slightly faster than the speed of light for a very short time.  Nothing seemed to come out of that in terms of theories discarded or products commercialized.  The speed of light is already pretty fast.

Links: http://physicsworld.com/cws/article/news/2810
http://www.abc.net.au/science/articles/2000/07/24/154610.htm
Title: Re: About tachyons (particles that travel faster than c)
Post by: Cranewings on September 27, 2011, 10:49:21 PM
A bartender says "Hey!  We don't serve tachyons in this establishment".
Two tachyons walk into a bar.

I can already tell I'm going to start telling this joke constantly.
Title: Re: Physics
Post by: DougMacG on September 28, 2011, 06:44:23 AM
TD, CW,  You run in fast circles if you can tell that joke without some explaining.   :wink:
A bartender says "Hey!  We don't serve tachyons in this establishment".
Two tachyons walk into a bar.
Title: Re: Physics
Post by: Crafty_Dog on September 28, 2011, 07:59:39 AM
Well, I'm not so fast!  Explain it to me please!  :lol:
Title: Re: Physics
Post by: DougMacG on September 28, 2011, 10:29:54 AM
Explaining someone else's joke is dangerous territory, good chance of screwing up.  The joke writers can correct me or build on this. 

In layman's terms, the theory of relativity is about things that happen really fast - messing with the concept of time as we know it - all based on the speed of light, a constant, which is now being challenged. Arrive at your train destination before you departed, that kind of thing... The tachyon is a hypothetical subatomic particle that would travel faster than light, violating the speed-of-light limit and threatening Einstein's great theory.  In comes the old joke line, like the priest and the rabbi or the man and his dog go into a bar and the bartender says..., only this time the events are happening in reverse order, messing with our concept of time.

A bartender says "Hey!  We don't serve tachyons in this establishment".
Two tachyons walk into a bar.

??
Title: Pendulum
Post by: Crafty_Dog on October 24, 2011, 05:47:53 AM


http://www.youtube.com/watch?v=yVkdfJ9PkRQ&feature=share
Title: Teenager solves math riddle posed by Sir Isaac Newton
Post by: Crafty_Dog on May 27, 2012, 08:48:24 AM


http://www.news.com.au/breaking-news/world/german-teen-shouryya-ray-solves-300-year-old-mathematical-riddle-posed-by-sir-isaac-newton/story-e6frfkui-1226368490521
Title: BYU Physics class
Post by: Crafty_Dog on November 22, 2012, 10:58:18 AM


http://www.wimp.com/demonstratescannon/
Title: Pi and other infinities
Post by: Crafty_Dog on January 01, 2013, 11:00:38 AM


http://www.nytimes.com/2013/01/01/science/the-life-of-pi-and-other-infinities.html?nl=todaysheadlines&emc=edit_th_20130101
Title: explaining that which is unexplainable
Post by: ccp on October 12, 2013, 09:30:23 AM




Higgs Boson Gets Nobel Prize, But Physicists Still Don’t Know What It Means

By Adam Mann
10.08.13
3:54 PM

Data from the CMS experiment, one of the main Higgs-searching experiments at the Large Hadron Collider. Image: CERN


More than a year ago, scientists found the Higgs boson. This morning, two physicists who 50 years ago theorized the existence of this particle, which is responsible for conferring mass on the other known elementary particles in the universe, got the Nobel, the highest prize in science.

For all the excitement the award has already generated, finding the Higgs — arguably the most important discovery in more than a generation — has left physicists without a clear roadmap of where to go next. While popular articles often describe how the Higgs might help theorists investigating the weird worlds of string theory, multiple universes, or supersymmetry, the truth is that evidence for these ideas is scant to nonexistent.

No one is sure which of these models, if any, will eventually describe reality. The current picture of the universe, the Standard Model, is supposed to account for all known particles and their interactions. But scientists know that it’s incomplete. Its problems need fixing, and researchers could use some help figuring out how. Some of them look at the data and say that we need to throw out speculative ideas such as supersymmetry and the multiverse, models that look elegant mathematically but are unprovable from an experimental perspective. Others look at the exact same data and come to the opposite conclusion.

“Physics is at a crossroads,” said cosmologist Neil Turok, speaking to a class of young scientists in September at the Perimeter Institute, which he directs. “In a sense we’ve entered a very deep crisis.”



The word “crisis” is a charged one within the physics community, invoking eras such as the early 20th century, when new observations were overturning long-held beliefs about how the universe works. Eventually, a group of young researchers showed that quantum mechanics was the best way to describe reality. Now, as then, many troubling observations leave physicists scratching their heads. Chief among them is the “Hierarchy Problem,” which in its simplest form asks why gravity is approximately 10 quadrillion times weaker than the three other fundamental forces in the universe. Another issue is the existence of dark matter, the unseen, mysterious mass thought to be responsible for strange observations in the rotation of galaxies.
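
To put a number on that "10 quadrillion", here is the standard ballpark arithmetic behind the Hierarchy Problem (the two scales below are textbook figures, not from this article): compare the Planck scale, where gravity becomes as strong as the other forces, with the electroweak scale probed at the LHC.

planck_scale = 1.22e19   # GeV, the Planck energy scale
weak_scale = 246.0       # GeV, the electroweak (Higgs field) scale

ratio = planck_scale / weak_scale
print(f"hierarchy between the scales: ~{ratio:.0e}")   # ~5e+16

Nothing in the Standard Model explains why these two scales sit some sixteen to seventeen orders of magnitude apart; that unexplained gap is the Hierarchy Problem.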

The solution to both these problems might come from the discovery of new particles beyond the Higgs. One theory, supersymmetry, goes beyond the Standard Model to say that every subatomic particle — quarks, electrons, neutrinos, and so on — also has a heavier twin. Some of these new particles might have the right characteristics to account for the influence of dark matter. Engineers built the Large Hadron Collider to see if such new particles exist (and may yet see them once it reaches higher energy in 2014), but so far it hasn’t turned up anything other than the Higgs.

In fact, the Higgs itself has turned out to be part of the issue. The particle was the final piece in the Standard Model puzzle. When scientists discovered it at the LHC, it had a mass of 125 GeV, roughly 130 times heavier than a proton — squarely in the range standard physics expected. That was kind of a buzzkill. Though happy to know the Higgs was there, many scientists had hoped it would turn out to be strange, to defy their predictions in some way and give a hint as to which models beyond the Standard Model were correct. Instead, it's ordinary, perhaps even boring.

All this means that confidence in supersymmetry is dropping like a stone, according to Tommaso Dorigo, a particle physicist at the LHC. In one blog post, he shared a rather pornographic plot showing how the findings of the LHC eliminated part of the evidence for supersymmetry. Later, he wrote that many physicists would have previously bet their reproductive organs on the idea that supersymmetric particles would appear at the LHC. That the accelerator’s experiments have failed to find anything yet “has significantly cooled everybody down,” he wrote.

In fact, when the organizers of a Higgs workshop in Madrid last month asked physicists there if they thought the LHC would eventually find new physics other than the Higgs boson, 41 percent said no. As to how to solve the known problems of the Standard Model, respondents were all over the map. String theory fared the worst, with three-quarters of those polled saying they did not think it is the ultimate answer to a unified physics.

One possibility has been brought up that even physicists don’t like to think about. Maybe the universe is even stranger than they think. Like, so strange that even post-Standard Model models can’t account for it. Some physicists are starting to question whether or not our universe is natural. This cuts to the heart of why our reality has the features that it does: that is, full of quarks and electricity and a particular speed of light.

This problem, the naturalness or unnaturalness of our universe, can be likened to a weird thought experiment. Suppose you walk into a room and find a pencil balanced perfectly vertical on its sharp tip. That would be a fairly unnatural state for the pencil to be in because any small deviation would have caused it to fall down. This is how physicists have found the universe: a bunch of rather well-tuned fundamental constants have been discovered that produce the reality that we see.

A natural explanation would show why the pencil is standing on its end. Perhaps there is a very thin string holding the pencil to the ceiling that you never noticed until you got up close. Supersymmetry is a natural explanation in this regard – it explains the structure of the universe through as-yet-unseen particles.

But suppose that infinite rooms exist with infinite numbers of pencils. While most of the rooms would have pencils that have fallen over, it is almost certain that in at least one room, the pencil would be perfectly balanced. This is the idea behind the multiverse. Our universe is but one of many and it happens to be the one where the laws of physics happen to be in the right state to make stars burn hydrogen, planets form round spheres, and creatures like us evolve on their surface.

The multiverse idea has two strikes against it, though. First, physicists would refer to it as an unnatural explanation because it simply happened by chance. And second, no real evidence for it exists and we have no experiment that could currently test for it.

As of yet, physicists are still in the dark. We can see vague outlines ahead of us but no one knows what form they will take when we reach them. Finding the Higgs has provided the tiniest bit of light. But until more data appears, it won’t be enough.
Title: CERN May Not Have Discovered Higgs Boson After All
Post by: ccp on November 09, 2014, 06:18:09 AM
CERN May Not Have Discovered Higgs Boson After All

November 9, 2014, by Corey Leighton

In July of 2012, researchers at CERN announced that the 40-year hunt for the elusive Higgs boson may have come to an end.  The announcement made headlines around the world, and particle physicists considered it a critical discovery, one of the first of many expected from the lab’s famous Large Hadron Collider.  But scientists at the University of Southern Denmark’s Center for Cosmology and Particle Physics Phenomenology are now casting doubt, saying that the detected particle may not be the elusive Higgs boson after all.

The Higgs boson is one of the key building blocks of the Standard Model of particle physics.  The Standard Model attempts to explain the electromagnetic, weak, and strong nuclear forces, and the Higgs boson is a critical piece of the puzzle.  Its discovery would lead the way to understanding the Higgs field, which, in turn, would explain how everything we see around us has mass.  So, the announcement from CERN that it had been detected was received with much fanfare… excitement which might now be premature.

“The current data is not precise enough to determine exactly what the particle is,” says university researcher Mads Toudal Frandsen. “It could be a number of other known particles.”
 
Frandsen’s team now suggests that the detected particle may not be a Higgs boson at all, but rather a ‘techni-higgs’ particle, which would support a set of theories beyond the Standard Model known as ‘Technicolor.’

“A techni-higgs particle is not an elementary particle. Instead, it consists of so-called techni-quarks, which we believe are elementary,” he says.

“Techni-quarks may bind together in various ways to form for instance techni-higgs particles, while other combinations may form dark matter. We therefore expect to find several different particles at the LHC, all built by techni-quarks.”

The ultimate verdict most likely lies deep in the heart of the LHC, currently silent while CERN scientists work to increase the power of the world’s most powerful particle collider.  CERN hopes to have the LHC back online in early 2015.

Source: Tech Times

Title: The Big Rip
Post by: Body-by-Guinness on July 02, 2015, 09:29:00 PM
New model of cosmic stickiness favors ‘Big Rip’ demise of universe

This is a timeline of the life of the universe that ends in a Big Rip. Credit: Jeremy Teaford, Vanderbilt University

From Vanderbilt University:

The universe can be a very sticky place, but just how sticky is a matter of debate.

That is because for decades cosmologists have had trouble reconciling the classic notion of viscosity based on the laws of thermodynamics with Einstein’s general theory of relativity. However, a team from Vanderbilt University has come up with a fundamentally new mathematical formulation of the problem that appears to bridge this long-standing gap.

The new math has some significant implications for the ultimate fate of the universe. It tends to favor one of the more radical scenarios that cosmologists have come up with known as the “Big Rip.” It may also shed new light on the basic nature of dark energy.

The new approach was developed by Assistant Professor of Mathematics Marcelo Disconzi in collaboration with physics professors Thomas Kephart and Robert Scherrer and is described in a paper published earlier this year in the journal Physical Review D.

“Marcelo has come up with a simpler and more elegant formulation that is mathematically sound and obeys all the applicable physical laws,” said Scherrer.

The type of viscosity that has cosmological relevance is different from the familiar “ketchup” form of viscosity, which is called shear viscosity and is a measure of a fluid’s resistance to flowing through small openings like the neck of a ketchup bottle. Instead, cosmological viscosity is a form of bulk viscosity, which is the measure of a fluid’s resistance to expansion or contraction. The reason we don’t often deal with bulk viscosity in everyday life is because most liquids we encounter cannot be compressed or expanded very much.

Disconzi began by tackling the problem of fluids moving at relativistic speeds. Astronomical objects that produce such fluids include supernovae (exploding stars) and neutron stars (stars that have been crushed down to the size of a city).

Scientists have had considerable success modeling what happens when ideal fluids (those with no viscosity) are boosted to near-light speeds. But almost all real fluids are viscous and, despite decades of effort, no one has managed to come up with a generally accepted way to handle viscous fluids traveling at relativistic velocities. In the past, the models formulated to predict what happens when these more realistic fluids are accelerated to a fraction of the speed of light have been plagued with inconsistencies, the most glaring being that they predict conditions under which these fluids could travel faster than the speed of light.

“This is disastrously wrong,” said Disconzi, “since it is well-proven experimentally that nothing can travel faster than the speed of light.”

These problems inspired the mathematician to re-formulate the equations of relativistic fluid dynamics in a way that does not exhibit the flaw of allowing faster-than-light speeds. He based his approach on one that was advanced in the 1950s by French mathematician André Lichnerowicz.

Next, Disconzi teamed up with Kephart and Scherrer to apply his equations to broader cosmological theory. This produced a number of interesting results, including some potential new insights into the mysterious nature of dark energy.

In the 1990s, the physics community was shocked when astronomical measurements showed that the universe is expanding at an ever-accelerating rate. To explain this unpredicted acceleration, physicists were forced to hypothesize the existence of an unknown form of repulsive energy that is spread throughout the universe. Because they knew so little about it, they labeled it “dark energy.”

Most dark energy theories to date have not taken cosmic viscosity into account, despite the fact that it has a repulsive effect strikingly similar to that of dark energy. “It is possible, but not very likely, that viscosity could account for all the acceleration that has been attributed to dark energy,” said Disconzi. “It is more likely that a significant fraction of the acceleration could be due to this more prosaic cause. As a result, viscosity may act as an important constraint on the properties of dark energy.”

Another interesting result involves the ultimate fate of the universe. Since the discovery of the universe’s run-away expansion, cosmologists have come up with a number of dramatic scenarios of what it could mean for the future.

One scenario, dubbed the “Big Freeze,” predicts that after 100 trillion years or so the universe will have grown so vast that the supplies of gas will become too thin for stars to form. As a result, existing stars will gradually burn out, leaving only black holes which, in turn, slowly evaporate away as space itself gets colder and colder.

An even more radical scenario is the “Big Rip.” It is predicated on a type of “phantom” dark energy that gets stronger over time. In this case, the expansion rate of the universe becomes so great that in 22 billion years or so material objects begin to fall apart and individual atoms disassemble themselves into unbound elementary particles and radiation.

The key value involved in this scenario is the ratio between dark energy’s pressure and density, what is called its equation of state parameter. If this value drops below -1 then the universe will eventually be pulled apart. Cosmologists have called this the “phantom barrier.” In previous models with viscosity the universe could not evolve beyond this limit.
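
For a rough sense of the numbers, the rip time can be estimated from a standard phantom-energy formula, t_rip - t_0 ≈ (2/3) |1+w|^(-1) H0^(-1) (1 - Ω_m)^(-1/2), which comes from earlier work by Caldwell, Kamionkowski and Weinberg rather than from the new paper. A minimal Python sketch, with illustrative parameter values:

import math

# Illustrative parameter values (assumptions, not from the paper)
H0 = 70.0        # Hubble constant, km/s/Mpc
w = -1.5         # phantom equation-of-state parameter (below -1)
omega_m = 0.3    # matter density fraction

# Hubble time 1/H0 in gigayears: 1 Mpc = 3.086e19 km, 1 Gyr = 3.156e16 s
hubble_time_gyr = (3.086e19 / H0) / 3.156e16

t_rip_gyr = (2.0 / 3.0) / abs(1 + w) / math.sqrt(1 - omega_m) * hubble_time_gyr
print(f"time until the Big Rip: ~{t_rip_gyr:.0f} billion years")  # ~22

For w = -1.5 this reproduces the 22-billion-year figure quoted above.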

In the Disconzi-Kephart-Scherrer formulation, however, this barrier does not exist. Instead, it provides a natural way for the equation of state parameter to fall below -1.

“In previous models with viscosity the Big Rip was not possible,” said Scherrer. “In this new model, viscosity actually drives the universe toward this extreme end state.”

According to the scientists, the results of their pen-and-paper analyses of this new formulation for relativistic viscosity are quite promising but a much deeper analysis must be carried out to determine its viability. The only way to do this is to use powerful computers to analyze the complex equations numerically. In this fashion the scientists can make predictions that can be compared with experiment and observation.


The research was supported by National Science Foundation grant 1305705 and Department of Energy grant DE-SC0011981.

http://wattsupwiththat.com/2015/07/02/claim-universe-to-end-with-a-big-rip-where-atoms-are-ripped-apart/
Title: Re: Physics & Mathematics
Post by: Crafty_Dog on July 03, 2015, 09:01:21 AM
A bit over my head, but provokes a sense of wonder nonetheless.
Title: Anti-Matter
Post by: Crafty_Dog on August 19, 2015, 12:39:32 PM


http://www.iflscience.com/physics/cern-symmetry-experiment-confirms-matter-and-antimatter-are-mirrors-each-other
Title: Quantum spookiness confirmed
Post by: Crafty_Dog on August 31, 2015, 06:42:54 PM
http://www.sciencealert.com/quantum-spookiness-has-been-confirmed-by-first-loophole-free-experiment
Title: Controlling the paths of photons?
Post by: ccp on October 25, 2015, 06:37:12 PM
Controlling the paths of photons the same as electrons.  I remember this from the GTR days.  I remember one company called Lumenon that went bust claiming it could bend the course of photons.

This could be for real:

http://qz.com/532580/scientists-have-found-a-way-to-make-light-waves-travel-infinitely-fast/
Title: Physics & Mathematics - The Most Mind-Bending Fact I Learned in Physics
Post by: DougMacG on November 12, 2015, 10:36:44 AM
http://www.realclearscience.com/blog/2015/11/the_most_amazing_fact_i_learned_in_physics_class.html

The Most Mind-Bending Fact I Learned in Physics
Posted by Tom Hartsfield


Physics is built out of philosophically fascinating ideas. Or, at least, ideas that fascinate us as physicists. We are often moved to reverentially proclaim the beauty of various concepts and theories. Sometimes this beauty makes sense to other people (we're made of star stuff) and other times it's opaque (Frobenius manifolds in pseudo-Euclidean spaces).

I have my own personal favorite idea. It arises from the philosophically fantastic (but mathematically moderate) workings of Einstein's relativity theory. The theory of special relativity holds that time and space are not separate entities, each operating on its own; rather they are intimately and inextricably codependent. We are born, live, and die along "world-lines" through a four-dimensional spacetime.


Here's what awes me: we travel through this 4-D spacetime always at a constant speed: c, the speed of light.

No matter what we do in our momentary lives, we are always truly traveling through our universe in time and space together, always at the same rate. Let's consider a few facts that follow from this realization.

A man who sits still uses none of his lightspeed to travel through space. Instead he is travelling in time at the speed of light. He ages--in the view of those around him--at the fastest rate possible: light speed. (How's that for a philosophical argument against sloth?)

As we travel about in our daily lives, we use up a minuscule amount of our allotted light speed to move through the spatial dimensions surrounding us. We borrow that speed from our travel forward in time, and thus we age more slowly than our sedentary neighbors. You've probably never noticed that fact, and there's a clear explanation why: it's only when you travel at unimaginably high speeds that the weirdness of time becomes large enough to notice. The mathematical reason for this is that the effect of time dilation at a particular speed v is only about (v/c)².

Try putting the fastest you've ever traveled into the top of that equation and then dividing it by the 671 million miles per hour that light travels. Then square that tiny number to make it vastly smaller.

Imagine a strange jet-setter who spends an entire 80-year lifespan cruising at 500 mph on a Boeing 747. When his long flight finally touches down, the watch on his wrist, set to match the airport clock at takeoff, will be only one millisecond behind. However, we can watch a subatomic particle live five times longer at 98% of light speed than sitting still.
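
You can check the jet-setter number yourself. Here is a minimal Python sketch (the 500 mph cruise, the 365.25-day year, and the small-speed approximation lag ≈ (v/c)²/2 times the elapsed time are my assumptions for illustration):

c_mph = 670_616_629              # speed of light in miles per hour
v_mph = 500                      # cruising speed of the 747
beta_sq = (v_mph / c_mph) ** 2   # the (v/c)^2 factor from above

lifetime_s = 80 * 365.25 * 24 * 3600   # 80 years in seconds
lag_s = lifetime_s * beta_sq / 2       # lag ~ (v/c)^2 / 2 times elapsed time
print(f"watch lag after 80 years at 500 mph: {lag_s * 1000:.1f} ms")  # ~0.7 ms

The result, about 0.7 milliseconds, is the "only one millisecond behind" of the example.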

Maybe the strangest case of this phenomenon is light itself, the sole thing capable of traveling at c. From our point of view, then, a photon is using the entirety of its spacetime velocity to travel through space. It never ages (from our frame of reference, watching)! That's why photons fly through space in a straight line from one side of the universe to the other for all of eternity without changing in any way unless externally influenced. This imperviousness makes them excellent historical records. And here, the deeper general theory of relativity (also courtesy of Einstein) leads us to something more bizarre.

Many of the photons generated at nearly the beginning of the universe are still travelling through space in their birthday suits. But, over the course of their billions of years in transit to us, the space they inhabit along their path through the stars has grown more than 1000 times bigger since they were born. This expansion of spacetime has stretched the wavelength of the photons along with it, like an enormous slinky being pulled apart. Now they are a thousand times longer but still timeless to us.

Spacetime physics, adhering to relativity as we know it, reveals utterly surreal truths. Many of these are posed as famous puzzles and arguments, such as the twin paradox, the ladder paradox, and the failure of simultaneity. But the mere fact that we always travel through spacetime at the speed of light never fails to stop me in my tracks (metaphorically speaking). I believe it is the most stunning thing I've ever absorbed in a physics class.

Tom Hartsfield is a physicist who just defended his PhD at the University of Texas.
Title: Next Einstein? Sabrina Pasterski
Post by: Crafty_Dog on January 17, 2016, 05:19:38 PM
http://www.ozy.com/rising-stars/this-millennial-might-be-the-new-einstein/65094?utm_source=NH&utm_medium=pp&utm_campaign=pp
Title: Here is her website
Post by: ccp on January 18, 2016, 04:44:05 AM
http://physicsgirl.com/

She is one in a billion (or at least 100 million)

If I had another life this is what I would wish to be:  not a President, not a sports star, not a celebrity, not the richest man, but a genius in physics.  Physics is the closest to truth.   The rest, as Shakespeare said, is just playing a part on a stage.
Title: Looks like Einstein was right about black holes
Post by: Crafty_Dog on February 11, 2016, 08:36:54 AM
http://www.nytimes.com/2016/02/12/science/ligo-gravitational-waves-black-holes-einstein.html?emc=edit_na_20160211&nl=bna&nlid=49641193&te=1&_r=0

also see

https://www.facebook.com/verge/videos/1035654389804237/
Title: Re: Physics & Mathematics
Post by: ccp on February 12, 2016, 07:47:07 AM
So what's the big deal?  Wasn't this obvious?   :-D

"Conveyed by these gravitational waves, power 50 times greater than the output of all the stars in the universe combined vibrated a pair of L-shaped antennas in Washington State and Louisiana known as LIGO on Sept. 14."

Wow.  One tiny step closer to an explanation of what the heck is going on.  8-)

Title: Re: Physics & Mathematics
Post by: ccp on April 26, 2016, 08:21:59 PM
I struggle to get past the Ariana Huffington BS to occasionally find a good article.  Forget about figuring out what "is" is, or "there is classified and there is classified."  This is real food for thought:

http://www.huffingtonpost.com/george-musser/space-time-illusion_b_9703656.html
Title: View of black hole?
Post by: ccp on April 13, 2017, 10:10:02 AM
https://www.yahoo.com/news/scientists-peeered-black-hole-taken-120937630.html
Title: Everything we know , , , for now
Post by: Crafty_Dog on September 08, 2018, 01:25:32 PM


https://bigthink.com/design-for-good/the-physics-of-everything-in-one-neat-map
Title: Time Travel
Post by: Crafty_Dog on December 16, 2018, 02:18:09 AM
http://www.milkywayastronomers.com/2017/09/physicists-send-particles-of-light-into.html?fbclid=IwAR1yEhyxQ6wLk3Yl5DiX6GvEjXMwoVQB3eYlwkjdQtAJ6s8VSDRxv1Iz-ZY
Title: Physicists puzzled by strange numbers that could explain reality
Post by: Crafty_Dog on February 01, 2019, 07:01:28 AM
This went well over my head with nary a look back, , ,


https://bigthink.com/surprising-science/physicists-puzzled-by-strange-numbers-that-could-explain-reality?rebelltitem=6#rebelltitem6
Title: Base 60 Trig 1,000 years before the Greeks
Post by: Crafty_Dog on March 29, 2019, 10:33:26 PM


https://www.theepochtimes.com/3700-year-old-mystery-babylonian-tablet-gets-translated_2808867.html?fbclid=IwAR2jF_tk207eXvPXK36qgxTM9kOHcEI8jT6QVCzKfCsVMb-SDJZyN6dzBG0


Title: Babylonian Math
Post by: Crafty_Dog on April 01, 2019, 12:29:55 AM
Challenging the preceding:

https://blogs.scientificamerican.com/roots-of-unity/dont-fall-for-babylonian-trigonometry-hype/?fbclid=IwAR2n8gykMBh9maxvtxE3rkXVFSOoeH3v-IHaeh5gKyVt3fqW-wCp1t6RCkA

Sumerian/Babylonian Mathematics-- I found this really interesting:
https://www.storyofmathematics.com/sumerian.html?fbclid=IwAR22g8amoim4dl1KVo7QFm3_S0-Ts8mJVFXkDfMaICgYjMw2TFELDDouHEg 

Separately:   "One of my favorite tricks is to have someone pick any two numbers and keep adding the last two together (1+2=3, 2+3=5, 3+5=8, etc.), then divide the last number by the next-to-last number. No matter what numbers you start with, the answer comes out 1.618..."
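
The trick works because the ratio of successive terms of any such Fibonacci-style sequence (with positive starting numbers) converges to the golden ratio, (1 + √5)/2 ≈ 1.6180. A quick Python check (the seed values are arbitrary):

def ratio_after(a, b, steps=30):
    # Repeatedly add the last two numbers, Fibonacci-style,
    # then return the ratio of the final pair.
    for _ in range(steps):
        a, b = b, a + b
    return b / a

print(ratio_after(1, 2))    # 1.6180...
print(ratio_after(7, 113))  # 1.6180... regardless of the starting pair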

Title: Feynman lecture on Law of gravity
Post by: ccp on April 13, 2019, 09:11:09 PM
and the history of its explanation by Richard Feynman:

http://www.cornell.edu/video/richard-feynman-messenger-lecture-1-law-of-gravitation
Title: Traffic Flow and Traffic Jams;
Post by: Crafty_Dog on April 25, 2019, 04:26:25 AM


http://nautil.us/issue/71/flow/why-a-traffic-flow-suddenly-turns-into-a-traffic-jam
Title: Is the Universe a Hologram?
Post by: Crafty_Dog on December 11, 2019, 09:22:10 AM
https://getpocket.com/explore/item/new-evidence-for-the-strange-idea-that-the-universe-is-a-hologram?utm_source=pocket-newtab&fbclid=IwAR0C3fCC7sji1vSCzZkKqof3VD67KluBnX4d2MO6xKcDbR3D-_tYF25saSw
Title: 9 lessons from Feynman
Post by: Crafty_Dog on December 15, 2019, 09:52:43 PM
https://bigthink.com/surprising-science/richard-feynman?rebelltitem=3#rebelltitem3
Title: Why Cats Always Land on Their Feet
Post by: Crafty_Dog on December 27, 2019, 08:17:04 PM


https://arstechnica.com/science/2019/12/the-surprisingly-complicated-physics-of-why-cats-always-land-on-their-feet/?utm_source=pocket-newtab
Title: Conway Knot Problem solved
Post by: Crafty_Dog on May 24, 2020, 04:18:36 PM
https://www.wired.com/story/a-grad-student-solved-the-epic-conway-knot-problem-in-a-week/?mbid=social_twitter&utm_brand=wired&utm_medium=social&utm_social-type=owned&utm_source=twitter
Title: Saving Schroedinger's cat
Post by: Crafty_Dog on June 09, 2020, 07:05:26 PM
https://www.sciencealert.com/physicists-think-they-ve-figured-out-a-way-to-save-schroedinger-s-cat
Title: Time
Post by: Crafty_Dog on August 09, 2020, 08:53:47 PM
https://getpocket.com/explore/item/forget-everything-you-think-you-know-about-time?utm_source=pocket-newtab
Title: Re: Time
Post by: DougMacG on August 10, 2020, 07:56:47 AM
https://getpocket.com/explore/item/forget-everything-you-think-you-know-about-time?utm_source=pocket-newtab

We can't see the stars' current locations by looking at the sky.  We're seeing where they were thousands of years ago.
---

Space travel contemplated 45 years ago:
"so many years have gone though I'm older but a year..."
  - Brian May https://www.songfacts.com/lyrics/queen/39

Title: Spooky Action at a Distance
Post by: Crafty_Dog on December 31, 2021, 03:04:01 AM
https://bigthink.com/13-8/einstein-spooky-action-at-a-distance/?utm_medium=Social&utm_source=Facebook&fbclid=IwAR3xBAwfz87_wm253vQBjO2T8LXEzGc5QmByBTDv_P9c2fJzmnOBb98v0S0#Echobox=1640209457-1
Title: The Math of Nature
Post by: DougMacG on January 05, 2022, 05:13:29 PM
https://www.sciencealert.com/the-exquisite-beauty-of-nature-reveals-a-world-of-math
Title: Re: Physics & Mathematics
Post by: Crafty_Dog on January 07, 2022, 12:12:05 PM
 8-)
Title: the physics behind hitting a tennis ball at rocket speed
Post by: ccp on September 09, 2022, 04:18:28 PM
if you missed the fastest serve ever recorded, 163 mph, you can watch it again - and still miss it:

https://www.youtube.com/watch?v=uKeL-W7xft0

the physics behind this:

https://theconversation.com/fast-serves-dont-make-sense-unless-you-factor-in-physics-106937
Title: physics nobel prize
Post by: ccp on October 05, 2022, 02:16:02 PM
https://www.yahoo.com/finance/news/win-nobel-prize-physics-scientists-210200546.html

If I could come back I would wish I had the mind to understand any of this.....

Title: Re: Physics & Mathematics
Post by: Crafty_Dog on October 05, 2022, 07:03:31 PM
Just reading that article left me feeling stupid. :-D
Title: Re: Physics & Mathematics
Post by: DougMacG on October 05, 2022, 08:30:43 PM
Don't feel bad.  They don't understand it either.

Einstein was shown to be wrong?  But he made his mark showing that Newton was wrong.  How long until we find out these guys are wrong too?

Wrong is kind of a harsh characterization.  The best minds put the best explanations possible on what is known at the time.

https://www.snexplores.org/article/quantum-world-mind-bogglingly-weird
Title: Tesla Mathematics
Post by: Crafty_Dog on November 05, 2022, 02:34:41 PM
https://www.youtube.com/watch?v=6ZrO90AI0c8

Haven't watched this yet, looks intriguing.
Title: Quantum Uncertainty
Post by: Crafty_Dog on December 10, 2022, 08:12:03 PM


https://www.youtube.com/watch?v=qC0UWxgyDD0
Title: Re: Physics & Mathematics
Post by: Crafty_Dog on August 29, 2023, 01:53:06 PM


https://phys.org/news/2023-08-visualizing-mysterious-quantum-entanglement-photons.html?fbclid=IwAR2hIHqdFGbXfkgHajEqdT9lTFvYf3P9ez4dXIDLXIsv6C2vw2tCTTvFT6Y

AUGUST 21, 2023

Visualizing the mysterious dance: Quantum entanglement of photons captured in real-time
by University of Ottawa

Biphoton state holographic reconstruction. (a) Coincidence image of interference between a reference SPDC state and a state obtained from a pump beam shaped like a yin-yang symbol (shown in the inset, at the same scale as the main plot). (b) Reconstructed amplitude and phase structure of the image imprinted on the unknown pump. Credit: Nature Photonics (2023). DOI: 10.1038/s41566-023-01272-3
Researchers at the University of Ottawa, in collaboration with Danilo Zia and Fabio Sciarrino from the Sapienza University of Rome, recently demonstrated a novel technique that allows the visualization of the wave function of two entangled photons, the elementary particles that constitute light, in real-time.


The concept of entanglement can be likened to a pair of shoes: select one shoe at random, and the moment you identify it, the nature of the other (whether it is the left or the right shoe) is instantly discerned, regardless of its location in the universe. The intriguing part, however, is the inherent uncertainty associated with the identification process until the exact moment of observation.

The wave function, a central tenet in quantum mechanics, provides a comprehensive understanding of a particle's quantum state. For instance, in the shoe example, the "wave function" of the shoe could carry information such as left or right, the size, the color, and so on.

More precisely, the wave function enables quantum scientists to predict the probable outcomes of various measurements on a quantum entity, e.g. position, velocity, etc.

This predictive capability is invaluable, especially in the rapidly progressing field of quantum technology, where knowing the quantum state that is generated by, or fed into, a quantum computer allows researchers to test the computer itself. Moreover, quantum states used in quantum computing are extremely complex, involving many entities that may exhibit strong non-local correlations (entanglement).

Knowing the wave function of such a quantum system is a challenging task; it is known as quantum state tomography, or quantum tomography for short. With the standard approaches (based on so-called projective operations), a full tomography requires a number of measurements that rapidly increases with the system's complexity (dimensionality).
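
A back-of-the-envelope Python count makes that scaling concrete. It tallies the free real parameters of a density matrix, an illustrative proxy for how much a full projective tomography must pin down, not the exact accounting of the paper:

for d in (2, 10, 100):
    single = d**2 - 1     # free real parameters of one d-dimensional state
    biphoton = d**4 - 1   # two entangled photons: joint dimension d * d
    print(f"d = {d}: single photon {single:,}, biphoton pair {biphoton:,}")

For photons carrying 100 distinguishable modes each, the pair already needs on the order of a hundred million numbers pinned down, which is why hours-to-days measurement times were the norm.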

Previous experiments conducted with this approach by the research group showed that characterizing or measuring the high-dimensional quantum state of two entangled photons can take hours or even days. Moreover, the result's quality is highly sensitive to noise and depends on the complexity of the experimental setup.

The projective measurement approach to quantum tomography can be thought of as looking at the shadows of a high-dimensional object projected on different walls from independent directions. All a researcher can see is the shadows, and from them they can infer the shape (state) of the full object. In a CT scan (computed tomography), for instance, the information about a 3D object is reconstructed in just this way from a set of 2D images.


In classical optics, however, there is another way to reconstruct a 3D object. This is called digital holography, and it is based on recording a single image, called an interferogram, obtained by interfering the light scattered by the object with a reference light.

The team, led by Ebrahim Karimi, Canada Research Chair in Structured Quantum Waves, co-director of uOttawa Nexus for Quantum Technologies (NexQT) research institute and associate professor in the Faculty of Science, extended this concept to the case of two photons.

Reconstructing a biphoton state requires superimposing it with a presumably well-known quantum state, and then analyzing the spatial distribution of the positions where two photons arrive simultaneously. Imaging the simultaneous arrival of two photons is known as a coincidence image. These photons may come from the reference source or the unknown source. Quantum mechanics states that the source of the photons cannot be identified.

This results in an interference pattern that can be used to reconstruct the unknown wave function. This experiment was made possible by an advanced camera that records events with nanosecond resolution on each pixel.

Dr. Alessio D'Errico, a postdoctoral fellow at the University of Ottawa and one of the co-authors of the paper, highlighted the immense advantages of this innovative approach, "This method is exponentially faster than previous techniques, requiring only minutes or seconds instead of days. Importantly, the detection time is not influenced by the system's complexity—a solution to the long-standing scalability challenge in projective tomography."

The impact of this research goes beyond just the academic community. It has the potential to accelerate quantum technology advancements, such as improving quantum state characterization, quantum communication, and developing new quantum imaging techniques.

The study "Interferometric imaging of amplitude and phase of spatial biphoton states" was published in Nature Photonics.
Title: Imaging Electrons in a Billionth of a Billionth of a Second
Post by: Body-by-Guinness on October 13, 2023, 11:11:12 PM
Ye gods, Nobel prize winners find a way to illuminate electrons on so fleeting a scale that they appear almost still, allowing processes that involve electron exchange to be better visualized and understood:

https://www.quantamagazine.org/physicists-who-explored-tiny-glimpses-of-time-win-nobel-prize-20231003/?fbclid=IwAR2FAUnBBh_lZkvWxW69ngtmYWwn7aNiX8FMj-26e4Ii6MCMCIsKnHe9Af8
Title: G-Hat Symmetries & Unified Math Theory?
Post by: Body-by-Guinness on October 14, 2023, 12:52:49 AM
2nd post:

The math here is WAY beyond me (I very much regret how little math I’ve learned and wish I had studied it more in school, or better yet found a math teacher able to deal with the drummer I dance to) but suggests some beautiful symmetries that may very well build bridges across various disciplines that will likely bear momentous fruit.

https://www.quantamagazine.org/echoes-of-electromagnetism-found-in-number-theory-20231012/?fbclid=IwAR3XJwaeIsRpzvSBYsG666rFnmliXMBQAuh9BwpRARL9UpvKFqLimTZvvBo
Title: Re: Physics & Mathematics
Post by: Crafty_Dog on October 18, 2023, 05:39:55 AM
Though well over my head with nary a look back, glad to see you posting such things for they too are part of what this forum is about.
Title: First Instance of AI Solving Unsolvable Math Problem
Post by: Body-by-Guinness on December 17, 2023, 09:17:43 AM
Expect reports like this to increase, though I wonder if mathematicians' ability to understand the proofs will keep pace:

https://www.technologyreview.com/2023/12/14/1085318/google-deepmind-large-language-model-solve-unsolvable-math-problem-cap-set/
Title: Simultaneous Realities?
Post by: Body-by-Guinness on January 25, 2024, 04:11:13 AM
Physicists conduct an experiment confirming a photon can be in two states at the same instant, raining on objective reality’s parade:

https://digitimed.com/two-contradictory-versions-of-reality-exist-simultaneously-in-quantum-experiment/?fbclid=IwAR3DnB-rdVisDFZyFY9BfGGVkQev1aJkd-3gurBrgHeSMx2FhkdAbroD4Fc
Title: Unifying Classic & Quantum Theories?
Post by: Body-by-Guinness on January 25, 2024, 09:11:46 AM
If this pans out it is a Very Big Deal, and indeed appears to have an intrinsic elegance I find intriguing:

https://charmingscience.com/breaking-new-theory-unites-einsteins-gravity-with-quantum-mechanics/
Title: Small Scale Gravity Measurements First Step in Unified Theory?
Post by: Body-by-Guinness on March 04, 2024, 04:59:33 PM
My grandfather was a self-taught engineer for Allis-Chalmers back in the day and was forever tinkering and conducting various home brew experiments, one of them not too far removed from the one mentioned below. In his case he didn't have the kind of small-scale measurement accuracy shown here. He tore up his garage floor, placed a tub of water in the center of it on which he could float a large cork, put a finned piece of metal on the cork, and poured concrete in such a way that clear channel "spokes" hit the tub at an acute angle. He developed a method to rotate the copper fins at precise speeds with the acute channels in the concrete and then against them, measuring the difference.

Here the difference measured is a billionth of a billionth of a newton, which I imagine the friction of the water grandpop floated his cork in beats a million or so times over. Though he didn't have the tools to measure things so precisely, it made me grin to read this and recall helping him pour concrete in his garage. If this stuff pans out it could help unify classical and quantum theories, among other possibilities:

Gravity Experiments on the Kitchen Table: Why a Tiny, Tiny Measurement May Be a Big Leap Forward for Physics
Singularity Hub / by Sam Baron / Mar 4, 2024 at 3:08 PM
Just over a week ago, European physicists announced they had measured the strength of gravity on the smallest scale ever.

In a clever tabletop experiment, researchers at Leiden University in the Netherlands, the University of Southampton in the UK, and the Institute for Photonics and Nanotechnologies in Italy measured a force of around 30 attonewtons on a particle with just under half a milligram of mass. An attonewton is a billionth of a billionth of a newton, the standard unit of force.
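
A quick Newton's-law estimate gives a feel for that scale. The 1 kg source mass and 1 m separation below are illustrative assumptions, not the experiment's actual geometry:

G = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
m = 4.3e-7      # test particle: just under half a milligram, in kg
M = 1.0         # illustrative 1 kg source mass
r = 1.0         # illustrative 1 m separation

F = G * m * M / r**2
print(f"F = {F:.1e} N, about {F / 1e-18:.0f} attonewtons")  # ~29 aN

A kilogram a metre away tugs on the half-milligram particle with roughly the 30 attonewtons the team measured, which shows just how gentle the forces in play are.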

The researchers say the work could “unlock more secrets about the universe’s very fabric” and may be an important step toward the next big revolution in physics.

But why is that? It’s not just the result: it’s the method, and what it says about a path forward for a branch of science critics say may be trapped in a loop of rising costs and diminishing returns.

Gravity

From a physicist’s point of view, gravity is an extremely weak force. This might seem like an odd thing to say. It doesn’t feel weak when you’re trying to get out of bed in the morning!

Still, compared with the other forces that we know about—such as the electromagnetic force that is responsible for binding atoms together and for generating light, and the strong nuclear force that binds the cores of atoms—gravity exerts a relatively weak attraction between objects.

And on smaller scales, the effects of gravity get weaker and weaker.

It’s easy to see the effects of gravity for objects the size of a star or planet, but it is much harder to detect gravitational effects for small, light objects.

The Need to Test Gravity

Despite the difficulty, physicists really want to test gravity at small scales. This is because it could help resolve a century-old mystery in current physics.

Physics is dominated by two extremely successful theories.

The first is general relativity, which describes gravity and spacetime at large scales. The second is quantum mechanics, which is a theory of particles and fields—the basic building blocks of matter—at small scales.

These two theories are in some ways contradictory, and physicists don’t understand what happens in situations where both should apply. One goal of modern physics is to combine general relativity and quantum mechanics into a theory of “quantum gravity.”

One example of a situation where quantum gravity is needed is to fully understand black holes. These are predicted by general relativity—and we have observed huge ones in space—but tiny black holes may also arise at the quantum scale.

At present, however, we don’t know how to bring general relativity and quantum mechanics together to give an account of how gravity, and thus black holes, work in the quantum realm.

New Theories and New Data

A number of approaches to a potential theory of quantum gravity have been developed, including string theory, loop quantum gravity, and causal set theory.

However, these approaches are entirely theoretical. We currently don’t have any way to test them via experiments.

To empirically test these theories, we’d need a way to measure gravity at very small scales where quantum effects dominate.

Until recently, performing such tests was out of reach. It seemed we would need very large pieces of equipment: even bigger than the world’s largest particle accelerator, the Large Hadron Collider, which sends high-energy particles zooming around a 27-kilometer loop before smashing them together.

Tabletop Experiments

This is why the recent small-scale measurement of gravity is so important.

The experiment conducted jointly between the Netherlands and the UK is a “tabletop” experiment. It didn’t require massive machinery.

The experiment works by floating a particle in a magnetic field and then swinging a weight past it to see how it “wiggles” in response.

This is analogous to the way one planet “wiggles” when it swings past another.

By levitating the particle with magnets, it can be isolated from many of the influences that make detecting weak gravitational influences so hard.

The beauty of tabletop experiments like this is they don’t cost billions of dollars, which removes one of the main barriers to conducting small-scale gravity experiments, and potentially to making progress in physics. (The latest proposal for a bigger successor to the Large Hadron Collider would cost $17 billion.)

Work to Do

Tabletop experiments are very promising, but there is still work to do.

The recent experiment comes close to the quantum domain, but doesn’t quite get there. The masses and forces involved will need to be even smaller to find out how gravity acts at this scale.

We also need to be prepared for the possibility that it may not be possible to push tabletop experiments this far.

There may yet be some technological limitation that prevents us from conducting experiments of gravity at quantum scales, pushing us back toward building bigger colliders.

Back to the Theories

It’s also worth noting some of the theories of quantum gravity that might be tested using tabletop experiments are very radical.

Some theories, such as loop quantum gravity, suggest space and time may disappear at very small scales or high energies. If that’s right, it may not be possible to carry out experiments at these scales.

After all, experiments as we know them are the kinds of things that happen at a particular place, across a particular interval of time. If theories like this are correct, we may need to rethink the very nature of experimentation so we can make sense of it in situations where space and time are absent.

On the other hand, the very fact we can perform straightforward experiments involving gravity at small scales may suggest that space and time are present after all.

Which will prove true? The best way to find out is to keep going with tabletop experiments, and to push them as far as they can go.

https://singularityhub.com/2024/03/04/gravity-experiments-on-the-kitchen-table-why-a-tiny-tiny-measurement-may-be-a-big-leap-forward-for-physics/
Title: Re: Physics & Mathematics
Post by: Crafty_Dog on March 04, 2024, 05:08:09 PM
Your Grandad sounds like quite the trip!
Title: Re: Physics & Mathematics
Post by: Body-by-Guinness on March 04, 2024, 05:43:29 PM
Your Grandad sounds like quite the trip!
His basement had just about one of everything a mad scientist might need, none of it was kid-proof, and I snuck down there all the time. When I think of all the exposed amperage and such I could get into down there, I'm kinda surprised I made it out alive.
Title: Understanding of Proton Structure Refined
Post by: Body-by-Guinness on March 14, 2024, 06:25:47 PM
Fascinating experiments provide a glimpse of proton structure and its constituent parts and forces:

https://www.quantamagazine.org/swirling-forces-crushing-pressures-measured-in-the-proton-20240314/
Title: Invisibility Shield?
Post by: Body-by-Guinness on March 28, 2024, 01:35:37 PM
I’m at a loss on where to best post this and am indeed tempted to start a “Stuff BBG Don’t Know Where to Put” thread, but will call this thread as close as I can get:

https://gearjunkie.com/technology/invisibility-shield-2-kickstarter

Back in my misspent youth this puppy would have come in handy when dodging the local constabulary….
Title: Re: Invisibility Shield?
Post by: DougMacG on March 28, 2024, 07:00:30 PM
Ok this is really cool.
Title: Re: Physics & Mathematics
Post by: Crafty_Dog on March 29, 2024, 05:48:02 AM
This thread will do  :-D  This one is an option too:

https://firehydrantoffreedom.com/index.php?topic=1385.msg13240#msg13240
Title: Quantum Compasses Catching Qubits?
Post by: Body-by-Guinness on May 22, 2024, 08:34:32 PM
Fascinating piece and interview with an author stalking dark matter:

https://www.quantamagazine.org/he-seeks-mystery-magnetic-fields-with-his-quantum-compass-20240517/
Title: James Clerk Maxwell
Post by: Body-by-Guinness on June 18, 2024, 07:43:34 AM
Sounds like a fellow we owe a lot:

Today is the birth anniversary of James Clerk Maxwell.

James Clerk Maxwell (1831-1879) was one of the greatest scientists of the nineteenth century. He is best known for the formulation of the theory of electromagnetism and for making the connection between light and electromagnetic waves. He also made significant contributions in the areas of physics, mathematics, astronomy and engineering. He is considered by many to be the father of modern physics.

Maxwell was born in Edinburgh, Scotland in 1831. Even though most of his formal higher education took place in London, he was always drawn back to his family home in the hills of Scotland. As a young child, Maxwell was fascinated with geometry and mechanical models. When he was only 14 years old, he published his first scientific paper on the mathematics of oval curves and ellipses that he traced with pins and thread. Maxwell continued to publish papers on a variety of subjects. These included the mathematics of human perception of colors, the kinetic theory of gases, the dynamics of a spinning top, theories of soap bubbles, and many others.

Maxwell's early education took place at Edinburgh Academy and the University of Edinburgh. In 1850 he went on to study at the University of Cambridge and, upon graduation from Cambridge, Maxwell became a professor of natural philosophy at Marischal College in Aberdeen until 1860. He then moved to London to become a professor of natural philosophy and astronomy at King's College. In 1865, Maxwell's father died and he returned to the family home in Scotland to devote his time to research. In 1871 he accepted a position as the first professor of experimental physics at Cambridge where he set up the world famous Cavendish Laboratory in 1874.

While at Aberdeen, Maxwell was challenged by the subject of the Adams Prize of 1857: the motion of Saturn's rings. He had previously thought and theorized about the nature of the rings when he was only 16 years old. He decided to compete for the prize, and the next two years were taken up with developing a theory to explain the physical composition of the rings. He was finally able to demonstrate, by purely mathematical reasoning, that the stability of rings could only be achieved if they consisted of numerous small particles. His theory won him the prize and, more significantly, nearly a hundred years later, the Voyager 1 space probe proved his theory right.

Much of modern technology has been developed from the basic principles of electromagnetism formulated by Maxwell. The field of electronics, including the telephone, radio, television, and radar, stems from his discoveries and formulations. While Maxwell relied heavily on previous discoveries about electricity and magnetism, he also made a significant leap in unifying the theories of magnetism, electricity, and light. His revolutionary work led to the development of quantum physics in the early 1900s and to Einstein's theory of relativity.

Maxwell began his work in electromagnetism by extending Michael Faraday's theories of electricity and magnetic lines of force. He then began to see the connections between the approaches of Faraday, Riemann and Gauss. As a result, he was able to derive one of the most elegant theories yet formulated. Using four equations, he described and quantified the relationships between electricity, magnetism and the propagation of electromagnetic waves. The equations are now known as Maxwell's Equations.

One of the first things that Maxwell did with the equations was to calculate the speed of an electromagnetic wave, and he found that it was almost identical to the speed of light. Based on this discovery, he was the first to propose that light was an electromagnetic wave. In 1862 Maxwell wrote:

"We can scarcely avoid the conclusion that light consists in the transverse undulations of the same medium which is the cause of electric and magnetic phenomena."

This was a remarkable achievement, for it not only unifies the theories of electricity and magnetism, but of optics as well. Electricity, magnetism and light can now be understood as aspects of a single phenomenon: electromagnetic waves.
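
That calculation is easy to reproduce with modern SI constants. A minimal Python sketch (the constant values are standard textbook figures, added here for illustration):

import math

mu0 = 4 * math.pi * 1e-7     # vacuum permeability, H/m (classical value)
eps0 = 8.8541878128e-12      # vacuum permittivity, F/m

c = 1 / math.sqrt(mu0 * eps0)
print(f"1/sqrt(mu0 * eps0) = {c:,.0f} m/s")  # ~299,792,458 m/s

The value that falls out, about 3.00 x 10^8 m/s, matches the measured speed of light, the coincidence that led Maxwell to his conclusion.
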
Maxwell also described the thermodynamic properties of gas molecules using statistical mechanics. His improvements to the kinetic theory of gases included showing that temperature and heat are caused only by molecular movement. Though Maxwell did not originate the kinetic theory, he was the first to apply probability and statistics to describe temperature changes at the molecular level. His theory is still widely used by scientists as a model for rarefied gases and plasmas.

Maxwell also contributed to the development of color photography. His analysis of color perception led to his invention of the trichromatic process. By using red, green and blue filters he created the first color photograph. The trichromatic process is the basis of modern color photography.

Maxwell's particular gift was in applying mathematical reasoning in solving complex theoretical problems. Maxwell's Electromagnetic Equations are perfect examples of how mathematics can be used to provide relatively simple and elegant explanations of the complex mysteries of the universe. Richard Feynman wrote of Maxwell:

"From a long view of the history of mankind, seen from, say, ten thousand years from now, there can be little doubt that the most significant event of the nineteenth century will be judged as Maxwell's discovery of the laws of electrodynamics."

Maxwell continued his work at the Cavendish Laboratory until illness forced him to resign in 1879. He returned to Scotland and died soon afterwards. He was buried with little ceremony in a small churchyard in the village of Parton in Scotland.

Source:FSU
Title: Re: James Clerk Maxwell
Post by: DougMacG on June 18, 2024, 10:47:15 AM
Yes. Wow! Rock star of modern science.

Wish we had more like him today.
Title: Swimmers Using Olympic Level Math
Post by: Body-by-Guinness on July 10, 2024, 09:34:35 PM
A very interesting synthesis of math and athletic performance discussed here. It would be interesting to see these tools applied to martial arts:

https://www.quantamagazine.org/how-americas-fastest-swimmers-use-math-to-win-gold-20240710/
Title: Re: Physics & Mathematics
Post by: ccp on July 11, 2024, 06:09:30 AM
fascinating.
this is great

using math physics biometrics to get the human athlete to perform at maximum efficiency.

more honest and not cheating like steroids in my view.

the only sport this would not be useful for might be chess or poker.

might even have use in internet games by studying head, eye and hand motions
Title: Re: Physics & Mathematics
Post by: Crafty_Dog on July 11, 2024, 08:03:08 AM
I will give this a careful read.
Title: Cryptography and Secrets
Post by: Body-by-Guinness on August 02, 2024, 03:02:07 AM
Tempted to start a cryptography thread off this piece. Don’t know that it would attract a lot of posts, but the ones it did would likely be fascinating:

Can you keep a secret? Modern techniques for maintaining the confidentiality of information are based on mathematical problems that are inherently too difficult for anyone to solve without the right hints. Yet what does that mean when quantum computers capable of solving many problems astronomically faster are on the horizon? In this episode, host Janna Levin talks with computer scientist Boaz Barak about the cryptographic techniques that keep information confidential, and why “security through mathematics” beats “security through obscurity.”


[Theme plays]

JANNA LEVIN: We all have secrets we want to obscure, from childhood notes between friends, to Da Vinci’s notebooks, to the wartime messages famously cracked by Alan Turing and a cohort of English cryptographers. To share secrets with a friend, an ally, a co-conspirator, there is cryptography. There are codes and ciphers, ingenious means to safeguard information against prying eyes. But in lockstep, there are codebreakers and equally ingenious means to decipher the hidden information.

Cryptography has become crucial to modern life and commerce to protect our emails, our banks, and our national security. While developing more and more secure encryptions, researchers have recently made some unexpected discoveries that reveal deeper truths about the theoretical limits of secrecy itself.

I’m Janna Levin and this is “The Joy of Why,” a podcast from Quanta Magazine where I take turns at the mic with my cohost, Steve Strogatz, exploring the biggest questions in math and science today.

In this episode, theoretical computer scientist Boaz Barak demystifies cryptography as we ask: Is it possible to perfectly protect secrets?

[Theme fades out]

Boaz is the Gordon McKay professor of computer science at Harvard University. He’s also a member of the scientific advisory board for Quanta Magazine and the Simons Institute for the Theory of Computing. His research centers on cryptography, computational complexity, quantum computing and the foundations of machine learning.

Boaz, welcome to the show.


BOAZ BARAK: Thank you. Thank you very much.

LEVIN: So glad to have you. This is quite a challenging subject, and to open, I kind of want to do the opposite of encrypting this conversation. We’re here to share ideas. And so let’s start by opening the dictionary of terms here. What is cryptography? What are ciphers?

BARAK: So, cryptography’s meaning has really evolved over the years. I think in the early days, since humans began writing, they had this notion of secret writing, some ways to obscure their secrets. And cryptography was kind of synonymous with that, with basically encryption.

But then, more recently, since the 1970s, cryptography has really expanded and evolved, and it’s no longer just encryption, it’s also authentication — like digital signatures and even more, fancier things like zero-knowledge proofs, multiparty secure computation, and many other ways to basically protect not just communication but also computation.

LEVIN: So it’s as though we figured out how to write language, and then we decide sometimes we don’t want to share our inner thoughts, and so we write secretly. And that must go back quite a long way. So what are some of the earliest encryption and decryption techniques?

BARAK: So a famous example in encryption is Caesar's cipher, which is attributed to Julius Caesar (I believe it predated him) and is a very, very simple system of obscuring data or messages, where the idea is just that you shift letters of the alphabet. So, for example, the letter A maps to, say, the letter F, the letter B maps to G, the letter C maps to H, et cetera. This is a very simplistic approach, which is not that hard to break.

Generally, the Caesar cipher is a special case of what's known as a substitution cipher, where you have some kind of a table mapping the letters of the message you're trying to encrypt (what we typically call the plaintext) into the encrypted form, which we call the ciphertext.

And these types of substitution ciphers have been very common. One of the famous examples was used by Mary, Queen of Scots, when she was planning a coup against her cousin, Elizabeth. And substitution ciphers are not very secure; you can typically break them just by looking at how often each symbol appears in the ciphertext. For example, you can figure out that the most frequent symbol is probably the encoding of E, because that's the most frequent letter in English text.

LEVIN: I was going to guess E.

BARAK: Yes. And using that, Queen Elizabeth’s spies managed to crack Mary’s cipher, and it didn’t end up well for Mary, who was executed.

And that has actually been kind of a typical story with encryption throughout the years, where someone comes up with an encryption scheme, they believe it is unbreakable, they use it for life-and-death applications. And it turns out that it is breakable. And typically when you use something for life-and-death applications and it doesn’t work, it doesn’t bode well for you.
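
[A minimal Python sketch of the Caesar shift and the letter-frequency attack described above; the sample message and the assume-the-commonest-symbol-is-E heuristic are illustrative, not from the episode:]

from collections import Counter

def caesar(text, shift):
    # Shift every letter of the alphabet by a fixed amount.
    out = []
    for ch in text.upper():
        if ch.isalpha():
            out.append(chr((ord(ch) - ord('A') + shift) % 26 + ord('A')))
        else:
            out.append(ch)
    return ''.join(out)

def crack(ciphertext):
    # Guess the shift by assuming the most frequent symbol encodes E.
    letters = [ch for ch in ciphertext if ch.isalpha()]
    top = Counter(letters).most_common(1)[0][0]
    shift = (ord(top) - ord('E')) % 26
    return caesar(ciphertext, -shift)

secret = caesar("DEFEND THE EAST WALL OF THE CASTLE", 5)
print(secret)         # IJKJSI YMJ JFXY BFQQ TK YMJ HFXYQJ
print(crack(secret))  # DEFEND THE EAST WALL OF THE CASTLE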

LEVIN: Yeah. Dire consequences.  To speak to that point, the 20th century really was a critical point in cryptography. I mean, there were two world wars where encryption and decryption played a really major role. And at the same time, maybe as a result, the field of cryptography began to become a very serious and important subject, both for intellectual and scientific reasons, but also for survival — for the fate of the world, right?

And we mentioned one of the central figures, like Alan Turing, and his role famously in cracking the German Enigma cipher. What else was really shifting in the significance of cryptography in the 20th century?

BARAK: So, from a cryptography point of view, I think, yes, the Enigma cipher, which was believed to be unbreakable by the Germans, partly because of the sheer number of possible combinations for the secret key of the Enigma, which involved setting the wires of several rotors …

LEVIN: It was kind of like a typewriter, almost.

BARAK: Yes, it looked exactly like a typewriter. When I teach at Harvard, I always have to bring up a photo of a typewriter because now these days students don’t know what the typewriter looks like.

But it looked like a typewriter. But when you hit a key, then something else would come out. So you hit the letter A, maybe a letter Z would come out. And it wasn’t a simple substitution cipher in the sense that, say, if you hit the letter A again, then another letter would come out. So the state of the system would be constantly evolving, which made it much harder to break.

And the secret key was actually the setting of the wires of the rotors inside this typewriter. So there were several rotors, which were wired in a certain way. And the number of possibilities for the secret key was absolutely enormous, something like 10^100. Even with today's computers, if we were trying to break the Enigma by brute force, by simply trying out all possibilities, the sun would die out and collapse before we were done.

Let alone with the computing devices that they had in the 1940s. So it took a lot of mathematical ingenuity, by Alan Turing and many other people at Bletchley Park (and even before that, some insights came from the Polish intelligence services), to actually break the Enigma.

And one of the lessons that cryptography took from that is that trying to build security by having a very complicated system like the Enigma is actually not going to succeed. Cryptography transitioned into relying in some sense on simpler systems, but with a more sound mathematical analysis. And a key figure in bringing about the mathematical analysis of cryptography was Claude Shannon, who in the late 1940s wrote some influential papers, starting with a mathematical theory of encryption.

And then in the 1970s, people like [Whitfield] Diffie, [Martin] Hellman and [Ralph] Merkle — these are three separate people — started with the mathematical theory of public key cryptography, which was then built up. And really in the late ’70s and early ’80s we started to have a mathematical theory of cryptography that, rather than being based on obscure problems, like the Enigma, was actually based on very simple problems, like, say, the problem of finding the prime factors of large integers, that have been around for thousands of years, but which, despite this history, we still don’t know an efficient algorithm for.

LEVIN: Now, it’s interesting that Alan Turing comes up not only in these world changing crises over cracking codes, but also he is well known as the inventor of the modern concept of computation at all. You know, Turing thought about mechanizing thought, and it led him to the notion of a computer that wasn’t a human, which is how the term “computer” had originally been used. Computers were people who computed, and Alan Turing changed that to the idea of a machine that could compute. So you’re talking about these wonderful, theoretical changes. How did cryptography change with the advent of actual machines that we now call computers?

BARAK: So, indeed, I think some of the earliest mechanical computers were built in Bletchley Park exactly for the task of automating some of the analysis in breaking the Enigma and other ciphers. And Alan Turing had a broader vision than that. So specific computing devices have always been around to mechanize computation to some extent.

But Alan Turing had this broader vision of a general-purpose computer. I should say that Charles Babbage had this same vision maybe 70 years earlier, but he had never gotten around to actually building the device.

And Turing had this idea that you could actually build this device that would be general purpose. In some sense, there is a notion of a universal computer, or as we call it today, a universal Turing machine, that is the same piece of hardware that can run arbitrary functionality by basically being supplied with software.

LEVIN: It is quite an amazing evolution. So now here we are, and cryptography plays a part in nearly everything we do — whether we’re fully aware of it or not. I mean, we use encryption to hide private messages, of course, but also to secure information, to compact it and secure it. It shows up in telecommunications, medical records, how photos are stored on our phones. So tell us a little bit about how cryptography is integrated into all of our everyday lives.

BARAK: Yeah. So people don’t realize, for example, that conversations such as the one you and I are having, which millions of people are having using Zoom or other telecommunication frameworks, often use wireless connections. That basically means the signals we are transmitting go through the air and anyone can pick them up. The reason our conversations are still secure is that they are encrypted.

Now, also, all of us basically carry around a device that both stores all of our private information and has a microphone and a camera that can record, potentially, all of our communication.

And, moreover, this device is a fully programmable computer. All our smartphones are fully programmable computers, and they can get over-the-air updates that completely change their functionality. The reason that, say, hackers can’t send us over-the-air updates that convert our smartphones into listening, recording and surveillance devices is that we use cryptography, and specifically digital signatures: even when the device gets a piece of software as an update, it can verify, using digital signatures, that the update actually came from the manufacturer.
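
[Illustrative sketch, not from the episode: the sign-then-verify pattern in Python, using a textbook RSA-style signature. The primes, the "firmware" message and the key sizes are hypothetical toys; real devices use standardized schemes such as Ed25519 or padded RSA with much larger keys.]

    import hashlib

    # Manufacturer's key generation (tiny, insecure primes for illustration).
    p, q = 1009, 1013
    n = p * q                      # public modulus
    phi = (p - 1) * (q - 1)
    e = 65537                      # public verification exponent
    d = pow(e, -1, phi)            # private signing exponent (Python 3.8+)

    update = b"firmware v2.1"      # a hypothetical software update
    h = int.from_bytes(hashlib.sha256(update).digest(), "big") % n

    signature = pow(h, d, n)       # the manufacturer signs the hash with d

    # The phone knows only (e, n) and checks the update before installing:
    assert pow(signature, e, n) == h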

LEVIN: So fascinating that all of this is really part of our absolutely ordinary, everyday lives, not just life or death stuff. We’re going to take a little break and be right back after this message.

[Break insertion for ads]

LEVIN: Welcome back to “The Joy of Why.” We’re speaking with Boaz Barak about the art and science of cryptography.

Now, you’ve mentioned some of the ways where today we’re trying to protect information — and why we’re trying to protect information. So, what still makes us vulnerable? I mean, we have these algorithms, we have data that we can encrypt, we have all kinds of rules about passwords and user privacy. But what makes a system vulnerable? And how do they typically break?

BARAK: So that’s a great question. So there are generally two ways in which systems can be broken. So first of all, while we have these great encryption schemes, we actually don’t have a mathematical proof that they are secure. And proving that they are secure is actually related to some of the greatest open questions in computer science and science at large, and specifically the P-versus-NP question.

LEVIN: Can you remind us what P and NP mean?

BARAK: Yes. So the P means polynomial time and NP means non-deterministic polynomial time.

So P refers to the class of problems that can be solved efficiently by a computing device, whether that computing device is our standard digital computer or any other computing device. And NP refers to the class of problems whose solutions can be efficiently verified by a computing device. So the P-versus-NP question really asks whether every problem whose solution can be efficiently verified can also be efficiently solved.

Now, intuitively, we think that P should be different than NP, that there are some problems where it’s much easier to verify a solution once someone gave it to you than to find it yourself from scratch. But we actually don’t have a mathematical proof that this is the case, even though it is widely conjectured.

And one example of a problem of this type: If someone gave you the secret key that could decrypt all of the communication between two parties, you could easily verify that the secret key actually works. But finding the secret key yourself can take time, which, as I said, could potentially be longer than the time it would take for the sun to die out.
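
[Illustrative sketch, not from the episode: the verify-versus-find asymmetry in Python, using integer factoring. The two primes are arbitrary small choices; cryptographic moduli are hundreds of digits long.]

    import math

    p, q = 104729, 1299709     # the 10,000th and 100,000th primes
    n = p * q

    # Verifying a claimed answer is a single multiplication:
    assert p * q == n

    # Finding the factors from scratch by trial division:
    def factor(n):
        for d in range(2, math.isqrt(n) + 1):
            if n % d == 0:
                return d, n // d

    print(factor(n))           # ~10^5 steps here; hopeless for 2048-bit moduli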

So if P equals NP, then in some sense we could break every possible encryption scheme. But it is widely conjectured that P does not equal NP. So we don’t have a proof that the encryption schemes we use are unbreakable, and once in a while people do manage to break the underlying cryptography. Although at least the main cryptosystems we are using have basically been unbroken since the 1970s.

But it’s far more common to go around the cryptography. And that’s how hackers actually break into our systems. So one metaphor that I like to use is that cryptography, when properly implemented, is basically like a hundred-ton steel door, but the systems where we use it are basically like a wooden shack. So if you’re installing like a hundred-ton steel door on a wooden shack, then yes, a thief would not be able to break the door, but they might find a different way around it.

And the sheer complexity of modern systems means that it’s really hard to secure all sides of them. So hackers typically don’t break the cryptography, at least when it’s properly implemented, but go around the cryptography.

LEVIN: Fascinating, because a lot of this also dates back to those very deep concepts in mathematics about what’s provable and unprovable. And in a way, this is kind of the computing manifestation of that. That’s a whole other episode. Very deep stuff.

So before the 1970s, most cryptography was symmetric, in the sense that the same key would be used to encrypt and decrypt a message. Is that what you’re referring to, that since the 1970s, cryptography is asymmetric, with a public key used for encryption and a private key used for decryption?

BARAK: Yes. So this was one of the major changes that happened in the 1970s. Before the 1970s, indeed, cryptography was basically what we call private key cryptography, where the sender and receiver share a secret key that they use for communication.

And this kind of worked okay for the military applications of cryptography, where you have a spy, and maybe before you send them off on a mission you give them a code book that they use to communicate with the home base. But it doesn’t really work for the modern economic applications of cryptography.
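
[Illustrative sketch, not from the episode: the symmetric idea in Python via the classic one-time pad. The code book is the shared key; both sides must already hold the same random bytes, which is exactly the distribution problem public key cryptography later solved.]

    import secrets

    msg = b"attack at dawn"
    key = secrets.token_bytes(len(msg))            # the shared secret key
    ct  = bytes(m ^ k for m, k in zip(msg, key))   # encrypt: XOR with the key
    pt  = bytes(c ^ k for c, k in zip(ct, key))    # decrypt: XOR again
    assert pt == msg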

So I don’t know about you, but, you know, I’m a Gmail user, and I rely on encrypted communication between me and the Google servers to look at my email securely. But I’ve never actually paid a visit to Google headquarters to exchange a private key with them. And if every user of Google had to physically exchange a key with Google’s servers, the modern internet would not exist.

So in public key cryptography, two parties can communicate over an unsecured channel and exchange confidential information by basically having one party, let’s say the receiver, send their public key to the sender. And then the sender can use that public key to encrypt their message, send it over the unsecured channel. And the receiver can use their secret key to decrypt it.

Point being that, yes, people could be listening on this channel and they could all learn the public key — but the public key can only be used for encryption. It cannot be used for decryption.
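
[Illustrative sketch, not from the episode: that flow in Python with textbook RSA, using the standard classroom numbers. Real systems use 2048-bit moduli and randomized padding.]

    p, q = 61, 53
    n = p * q                 # 3233, part of the receiver's public key
    phi = (p - 1) * (q - 1)   # 3120
    e = 17                    # public encryption exponent, coprime to phi
    d = pow(e, -1, phi)       # 2753, the receiver's private exponent

    m = 65                    # the message, encoded as a number < n
    c = pow(m, e, n)          # sender encrypts using only the public (e, n)
    assert pow(c, d, n) == m  # receiver decrypts with the private d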

LEVIN: It does make me wonder, though — all of that’s quite amazing. That’s such tremendous progress. We exchanged Gmails today and I feel pretty confident nobody read our Gmails, not least because they weren’t that interesting, right? “I’ll see you there then,” you know, “I’m running late,” whatever. But are there theoretical limits to cryptography? And can things truly be unconditionally secure?

BARAK: So there are two answers to this question.

First of all, yes, it is known that public key cryptography can never be unconditionally secure, in the sense that it could always be broken by simply trying all possible keys. But trying all possible keys is an effort that scales exponentially with the key size. So at very moderate key sizes, that already requires spending more compute cycles than there are atoms in the observable universe, or similarly astronomical quantities, with which you are probably more familiar than I am. So that’s not really a barrier.
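
[Back-of-the-envelope sketch, not from the episode, assuming a hypothetical attacker making 10^18 guesses per second, roughly an exascale machine:]

    n = 128                            # key length in bits
    guesses = 2 ** n                   # worst-case brute-force work
    rate = 10 ** 18                    # guesses per second (assumed)
    years = guesses / rate / (3600 * 24 * 365)
    print(f"{years:.1e} years")        # ~1.1e13 years, ~800x the universe's age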

So, theoretically, we could have a mathematical theorem that tells us that this cryptosystem cannot be broken by any attacker that would spend less than, say, a number of operations that scales like 2^n, where n is the length of the key.

We don’t have a theorem like that. And the reason is that such a theorem would in particular also imply that P is different from NP, which is a major unsolved problem that we haven’t been able to solve. So at the moment, we have to settle for conditional proofs of security that are conditional based on certain conjectures.

LEVIN: Now, there are also twists on this whole idea. Sometimes I don’t want to completely obscure everything that’s going on. Sometimes I want to let you know I know something, but I don’t necessarily want to reveal all, right? I believe these are known as zero-knowledge proofs. Can you expand on that? Explain to me why sometimes I want to prove to you that I know something, so that you don’t have to just take my word for it, without revealing the information I’m protecting.

BARAK: Sure. Actually, let me give you an example in the context of nuclear disarmament. Say Russia and the U.S. both have a huge stockpile of nuclear warheads, far more than necessary to destroy major parts of the world several times over. It’s very expensive. It’s actually in both countries’ interest to reduce this stockpile, and it’s also in the interest, obviously, of world safety, because the fewer warheads out there, the less likely we are to have a completely devastating nuclear war.

But part of the problem is that this is an equilibrium, and it’s hard to agree on reducing the stockpiles. Another part is: How do you verify that the other side really did, say, destroy the warheads? One solution seems simple. Say, for example, the Russians declare they are going to destroy these hundred warheads.

They invite American inspectors to come to the warehouse where the warheads are stored, take a look at them and examine them, and then the warheads go into whatever machine destroys all of these things. That is great — except that the design of a nuclear warhead is one of the most classified secrets a country has. And the Russians have no intention of letting American inspectors anywhere near opening up a warhead and examining it.

So then it becomes a question of, say, I have a barrel. Can I prove to you that there is a nuclear warhead inside the barrel without giving you any details of the exact design of this warhead?

And this is the type of question that zero-knowledge proofs are designed to address. You want to prove that something satisfies a certain predicate. So, for example: This barrel contains a nuclear warhead. Or: This number is a composite number. Or even: It’s a composite number where one of the prime factors has 7 as its last digit.

So, you have a certain object, and you want to prove that it satisfies a certain predicate without giving away the information such as the design of the warhead or the factorization of the number that really proves why the object satisfies this predicate.
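
[Illustrative sketch, not from the episode: one classical zero-knowledge protocol, a Schnorr-style proof of knowledge, in Python. The prover convinces the verifier she knows x with y = g^x mod p while revealing nothing about x beyond that. The tiny group parameters are hypothetical; real deployments use roughly 256-bit groups.]

    import secrets

    p, q, g = 2039, 1019, 4   # p = 2q + 1; g generates the order-q subgroup

    x = secrets.randbelow(q)  # the prover's secret
    y = pow(g, x, p)          # the public value

    # One round of the interactive protocol:
    r = secrets.randbelow(q)  # prover's fresh randomness
    t = pow(g, r, p)          # commitment, sent to the verifier
    c = secrets.randbelow(q)  # the verifier's random challenge
    s = (r + c * x) % q       # the prover's response

    # The verifier accepts if g^s == t * y^c (mod p); the transcript (t, c, s)
    # reveals nothing about x because r masks it.
    assert pow(g, s, p) == (t * pow(y, c, p)) % p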

LEVIN: Fascinating example, and unfortunately timely. Does this relate to the concept of “obfuscation” that I’ve been reading about?

BARAK: So obfuscation is kind of a vast generalization of a lot of things, including zero-knowledge proofs. Basically, the idea is: Could you take, say, a computer program and transform it in such a way that it becomes like a virtual black box, in the sense that you will be able to run it but you will not be able to examine its internals?

So you could have, say, some computer program that potentially takes as input some of the secret information but only produces a single 0-or-1 bit: Does the secret information satisfy a certain predicate or not? So obfuscation can be used to achieve zero-knowledge proofs. It can be used to achieve a lot of other cryptographic primitives.

And one of them, for example, is the idea of secure multiparty computation. The idea is: Maybe you have a certain private input and I have a certain private input. Maybe you are a hospital with your own patient records, and I am another hospital with my own patient records. For reasons of patient confidentiality, we cannot share the patient records with each other. But could we run some computation such that, at the end of it, we learn some statistical information about both sets of records without revealing any of the secret information? That falls under secure multiparty computation.
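
[Illustrative sketch, not from the episode: one building block of multiparty computation, additive secret sharing, in Python. Two hospitals learn only the total of their private counts; the numbers are invented, and real MPC protocols add authentication and defenses against cheating parties.]

    import secrets

    Q = 2 ** 61 - 1                  # public modulus (a Mersenne prime)

    def share(value):
        # Split value into two random-looking shares summing to it mod Q.
        r = secrets.randbelow(Q)
        return r, (value - r) % Q

    a = 1234                         # hospital A's private count
    b = 5678                         # hospital B's private count

    a1, a2 = share(a)                # A keeps a1, sends a2 to B
    b1, b2 = share(b)                # B keeps b1, sends b2 to A

    s_A = (a1 + b2) % Q              # computed locally by A
    s_B = (b1 + a2) % Q              # computed locally by B

    total = (s_A + s_B) % Q          # only the sum is ever revealed
    assert total == a + b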

LEVIN: Now, in particular, I’ve read the phrase “indistinguishability obfuscation.” Doesn’t exactly roll off the tongue, but I believe I’ve heard you refer to it as, you know, “the one cryptographic primitive to rule them all.”

BARAK: Yes. Obfuscation, in some sense, if you could have it generically, would let you do basically anything you want in cryptography. And unfortunately, in 2001, some colleagues and I proved that the most natural notion of obfuscation — virtual black-box obfuscation, which is kind of a mathematical translation of what I said before, that you take a program and compile it in such a way that it is virtually a black box — is impossible to achieve.

But then we said, there are weaker notions of obfuscation, in particular this notion of indistinguishability obfuscation, which our impossibility proof didn’t apply to. But we had no idea whether that notion of indistinguishability obfuscation is possible to achieve and, if so, whether it would actually be useful for all of the applications.

So then in 2013, there was a breakthrough showing that, yes, indistinguishability obfuscation can be constructed and, in fact, that it can be useful for many applications. And since then there’s been a steady stream of works showing more and more applications of indistinguishability obfuscation, and also works trying to make it not just theoretically possible, but also practically feasible.

The latter is still very much a work in progress. So right now, the overhead is such that we cannot use it. But the theoretical feasibility shows that perhaps in the future, we will be able to improve the efficiency and make it usable.

LEVIN: Now, there’s this beautiful thing on the horizon that keeps looming and receives a lot of discussion, and that’s quantum computers, of which we have no large-scale examples yet. But how would these methods fare in a quantum world, against quantum technologies?

BARAK: So this is a fascinating question, and actually a very timely one at the time we are recording this interview. So, first of all, quantum computers are still computers. They can be mathematically modeled, and while they appear to give exponential speedups for some very structured computational problems, they are not a magic bullet that can speed up any computation whatsoever. In particular, essentially all of the secret key encryptions that we currently use will remain secure also against quantum computers.

However, maybe the most important and famous problem for which quantum computers can offer an exponential speedup is the integer factoring problem, which lies at the heart of the RSA encryption scheme.

That means that if scalable quantum computers are built, then all the public key encryption schemes based on these number-theoretic objects (the RSA system, based on integer factoring; Diffie-Hellman, based on the discrete logarithm; and its variant based on elliptic curves) will all be broken.

Unfortunately, it’s very hard to construct public key encryption, so we don’t have many candidates. The number-theoretic schemes are one major family of candidates, and there is one other major family, based on problems relating to lattices. These lattice problems are not known to be breakable by quantum computers, so they form the basis for what is sometimes known as post-quantum cryptography, at least in the public key encryption setting.
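
[Illustrative sketch, not from the episode: a toy lattice-style, learning-with-errors encryption of a single bit in Python, the idea underlying post-quantum schemes such as Kyber. The parameters here are hypothetical and insecure; real schemes tune dimensions and noise carefully.]

    import secrets

    n, m, q = 16, 32, 3329           # dimensions and modulus (toy values)

    def noise():
        return secrets.randbelow(5) - 2              # small error in -2..2

    # Key generation: b = A*s + noise (mod q); the noise hides the secret s.
    s = [secrets.randbelow(q) for _ in range(n)]                 # secret key
    A = [[secrets.randbelow(q) for _ in range(n)] for _ in range(m)]
    b = [(sum(A[i][j] * s[j] for j in range(n)) + noise()) % q
         for i in range(m)]                                      # public key

    def encrypt(bit):
        r = [secrets.randbelow(2) for _ in range(m)]  # random subset of rows
        u = [sum(r[i] * A[i][j] for i in range(m)) % q for j in range(n)]
        v = (sum(r[i] * b[i] for i in range(m)) + bit * (q // 2)) % q
        return u, v

    def decrypt(u, v):
        d = (v - sum(u[j] * s[j] for j in range(n))) % q
        return 1 if q // 4 < d < 3 * q // 4 else 0    # noise stays small

    assert decrypt(*encrypt(0)) == 0 and decrypt(*encrypt(1)) == 1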

LEVIN: Now, you’ve already mentioned a number of other major techniques used in cryptography, and we don’t really have a chance to pick apart each of them: what the lattice approach means, or the integer one. But I think we get the impression. They’re rooted deeply in fundamental mathematics, which is an important point that maybe not everyone realizes. The tools are somehow fundamental to how nature encodes math and how math encodes information in a deep way. And a lot of these complex techniques go back to fundamental mathematics and keep relying on that core discipline to progress.

BARAK: Absolutely. And one of the things I tell students when I lecture on cryptography is that we have moved from a “security through obscurity” to “security through mathematics.”

And it turns out that security through mathematics is much more reliable. So even though attackers today have access to computational resources that far dwarf what people had at Bletchley Park, we still have cryptographic schemes that nobody has been able to break, which, as I said, is why attackers go around the cryptography instead.

And the reason is that rather than trying to build very complicated, esoteric systems like the Enigma, we are relying on simpler principles backed by mathematical understanding. And that is how we are getting more reliably secure.

LEVIN: Yeah, that’s a big change. So I guess it’s natural for me to ask (maybe you’ve already implicitly answered this): Where is cryptography headed next?

BARAK: So there are maybe four different research strands in cryptography. One is expanding the reach of cryptography: going beyond, say, secret writing to public key encryption, and then to even fancier things like zero-knowledge proofs, secure multiparty computation and obfuscation.

The second is bringing things from theory to practice: taking theoretical constructions that initially have far too much overhead to be used in practice and improving them.

The third is sharpening our understanding of the cryptographic schemes we currently have by attempting to break them. So cryptanalysis, the attempt to break cryptographic schemes, is also an important area of research.

And the fourth is really understanding the basic assumptions we are using in cryptography, things like integer factoring and lattice-based assumptions, and maybe finding new ones: trying to understand their validity and whether we can provide rigorous evidence for their correctness, or show that they are incorrect.

LEVIN: It’s a field that’s taken many turns, from the kind of natural instinctive urge to have secret notes to the depths of mathematics to computing to quantum computing. It’s really a fascinating subject. And, at this point, we have a question we like to ask, which is, what about this research brings you joy?

BARAK: When I started as a graduate student at the Weizmann Institute of Science, I didn’t actually intend to study cryptography. I didn’t know much about cryptography, and I thought it was a very technical field having to do with number theory, which is nice, but not something I was passionate about.

The thing that kind of blew my mind was when I took a cryptography class with Oded Goldreich, who became my advisor, and I realized that you could actually mathematically define what it means to be secure. I just found it fascinating that this intuitive notion, which I had thought had no connection to formal mathematics, can actually be captured by mathematics, and then we can actually prove things about it.

And this is the thing that I still find so fascinating about cryptography: that it brings math into places where we didn’t really think it would hold.

LEVIN: And it reminds me of these very deep ideas of a hundred years ago that we can prove that there are unknowable facts about math.

BARAK: Yes, some of the techniques are actually sometimes similar.

Maybe another reason I find cryptography fascinating is that, as human beings and as scientists, we have a long history of being fascinated by impossibility results. There was the impossibility of deriving the parallel postulate from the other axioms of geometry, the impossibility of trisecting an angle with just a compass and straightedge, and of course Gödel’s theorems on the impossibility of proving all true facts about mathematics.

But cryptography is about the practical application of impossibility, which I find really fascinating. We take what we thought of as fundamentally negative results, of purely intellectual interest with no practical implication whatsoever, and we turn them into something that we actually apply and use every day to do commerce over the internet.

LEVIN: So compelling. Thank you so much. We’ve been speaking with computer scientist Boaz Barak about cryptography in the modern era. Boaz, thank you for joining us on “The Joy of Why.”

BARAK: Thank you very much.

[Theme plays]

LEVIN: Thanks for listening. If you’re enjoying “The Joy of Why” and you’re not already subscribed, hit the subscribe or follow button where you’re listening. You can also leave a review for the show — it helps people find this podcast.

“The Joy of Why” is a podcast from Quanta Magazine, an editorially independent publication supported by the Simons Foundation. Funding decisions by the Simons Foundation have no influence on the selection of topics, guests or other editorial decisions in this podcast or in Quanta Magazine.

RELATED:
Complexity Theory’s 50-Year Journey to the Limits of Knowledge
Cryptographers Discover a New Foundation for Quantum Secrecy
Mathematicians Seal Back Door to Breaking RSA Encryption
How Claude Shannon Invented the Future
“The Joy of Why” is produced by PRX Productions. The production team is Caitlin Faulds, Livia Brock, Genevieve Sponsler and Merritt Jacob. The executive producer of PRX Productions is Jocelyn Gonzales. Morgan Church and Edwin Ochoa provided additional assistance.

From Quanta Magazine, John Rennie and Thomas Lin provided editorial guidance, with support from Matt Carlstrom, Samuel Velasco, Arleen Santana and Meghan Willcoxon.

Samir Patel is Quanta’s Editor in Chief.

Our theme music is from APM Music. Julian Lin came up with the podcast name. The episode art is by Peter Greenwood and our logo is by Jaki King and Kristina Armitage. Special thanks to the Columbia Journalism School and Bert Odom-Reed at the Cornell Broadcast Studios.

I’m your host, Janna Levin. If you have any questions or comments for us, please email us at quanta@simonsfoundation.org. Thanks for listening.

[Theme plays]

https://www.quantamagazine.org/how-does-math-keep-secrets-20240801/
Title: Re: Physics & Mathematics
Post by: Crafty_Dog on August 02, 2024, 12:56:43 PM
See Reply 505 at

https://firehydrantoffreedom.com/index.php?topic=1586.msg115798;topicseen#msg115798

Also use the search function for Cryptography, Cryptogram, Crypto-Gram, Bruce Schneier

Then see if any of the existing threads would serve your purpose (perhaps with the addition of a term in the Subject line?) or if I should have the webmaster merge some threads or, if not, then start a new thread.
Title: Re: Physics & Mathematics
Post by: Body-by-Guinness on August 02, 2024, 07:49:22 PM
I searched the term “crypto” and here’s what came up:

https://firehydrantoffreedom.com/index.php?action=search2
Title: Re: Physics & Mathematics
Post by: Crafty_Dog on August 04, 2024, 04:42:00 PM
This is another one for me to clean up when I get back.
Title: This is gigantic if true - toward a unified field theory
Post by: ccp on October 23, 2024, 06:18:28 AM
https://www.msn.com/en-us/news/technology/revolutionary-new-theory-finally-unites-quantum-mechanics-and-einstein-s-theory-of-general-relativity/ar-AA1sKLme?ocid=msedgntp&pc=DCTS&cvid=a7f292c62db141d595e83c7390899d1a&ei=14

I've long dreamed that in my new life I would be smart enough to find this.

I haven't read the entire article (and would not really understand it anyway :-o) but it sounds cool
Title: The Universe Understood as a Mirror Metaphor
Post by: Body-by-Guinness on October 29, 2024, 03:33:49 PM
The “mirror universe” theory posited here is intriguing in its elegance and simplicity. In particular I like how it rids us of string theory, something that always struck me as a fictional mechanism meant to impose a structure, even a not particularly useful one, on a poorly understood area of physics. Here a mirror metaphor is used to find a needed yet elusive symmetry.

Did the Early Cosmos Inflate Like a Balloon? A Mirror Universe Going Backwards in Time May Be a Simpler Explanation

Singularity Hub / by Neil Turok / Oct 29, 2024 at 4:53 PM

We live in a golden age for learning about the universe. Our most powerful telescopes have revealed that the cosmos is surprisingly simple on the largest visible scales. Likewise, our most powerful “microscope,” the Large Hadron Collider, has found no deviations from known physics on the tiniest scales.

These findings were not what most theorists expected. Today, the dominant theoretical approach combines string theory, a powerful mathematical framework with no successful physical predictions as yet, and “cosmic inflation”—the idea that, at a very early stage, the universe ballooned wildly in size. In combination, string theory and inflation predict the cosmos to be incredibly complex on tiny scales and completely chaotic on very large scales.

The nature of the expected complexity could take a bewildering variety of forms. On this basis, and despite the absence of observational evidence, many theorists promote the idea of a “multiverse”: an uncontrolled and unpredictable cosmos consisting of many universes, each with totally different physical properties and laws.

So far, the observations indicate exactly the opposite. What should we make of the discrepancy? One possibility is that the apparent simplicity of the universe is merely an accident of the limited range of scales we can probe today, and that when observations and experiments reach small enough or large enough scales, the asserted complexity will be revealed.

The other possibility is that the universe really is very simple and predictable on both the largest and smallest scales. I believe this possibility should be taken far more seriously. For, if it is true, we may be closer than we imagined to understanding the universe’s most basic puzzles. And some of the answers may already be staring us in the face.

The Trouble With String Theory and Inflation

The current orthodoxy is the culmination of decades of effort by thousands of serious theorists. According to string theory, the basic building blocks of the universe are minuscule, vibrating loops and pieces of sub-atomic string. As currently understood, the theory only works if there are more dimensions of space than the three we experience. So, string theorists assume that the reason we don’t detect them is that they are tiny and curled up.

Unfortunately, this makes string theory hard to test, since there are an almost unimaginable number of ways in which the small dimensions can be curled up, with each giving a different set of physical laws in the remaining, large dimensions.

Meanwhile, cosmic inflation is a scenario proposed in the 1980s to explain why the universe is so smooth and flat on the largest scales we can see. The idea is that the infant universe was small and lumpy, but an extreme burst of ultra-rapid expansion blew it up vastly in size, smoothing it out and flattening it to be consistent with what we see today.

Inflation is also popular because it potentially explains why the energy density in the early universe varied slightly from place to place. This is important because the denser regions would have later collapsed under their own gravity, seeding the formation of galaxies.

Over the past three decades, the density variations have been measured more and more accurately both by mapping the cosmic microwave background—the radiation from the big bang—and by mapping the three-dimensional distribution of galaxies.

In most models of inflation, the early extreme burst of expansion which smoothed and flattened the universe also generated long-wavelength gravitational waves—ripples in the fabric of space-time. Such waves, if observed, would be a “smoking gun” signal confirming that inflation actually took place. However, so far the observations have failed to detect any such signal. Instead, as the experiments have steadily improved, more and more models of inflation have been ruled out.

Furthermore, during inflation, different regions of space can experience very different amounts of expansion. On very large scales, this produces a multiverse of post-inflationary universes, each with different physical properties.

[Image: The history of the universe according to the model of cosmic inflation. Credit: Wikipedia, CC BY-SA]
The inflation scenario is based on assumptions about the forms of energy present and the initial conditions. While these assumptions solve some puzzles, they create others. String and inflation theorists hope that somewhere in the vast inflationary multiverse, a region of space and time exists with just the right properties to match the universe we see.

However, even if this is true (and not one such model has yet been found), a fair comparison of theories should include an “Occam factor,” quantifying Occam’s razor, which penalizes theories with many parameters and possibilities over simpler and more predictive ones. Ignoring the Occam factor amounts to assuming that there is no alternative to the complex, unpredictive hypothesis—a claim I believe has little foundation.

Over the past several decades, there have been many opportunities for experiments and observations to reveal specific signals of string theory or inflation. But none have been seen. Again and again, the observations turned out simpler and more minimal than anticipated.

It is high time, I believe, to acknowledge and learn from these failures and to start looking seriously for better alternatives.

A Simpler Alternative

Recently, my colleague Latham Boyle and I have tried to build simpler and more testable theories that do away with inflation and string theory. Taking our cue from the observations, we have attempted to tackle some of the most profound cosmic puzzles with a bare minimum of theoretical assumptions.

Our first attempts succeeded beyond our most optimistic hopes. Time will tell whether they survive further scrutiny. However, the progress we have already made convinces me that, in all likelihood, there are alternatives to the standard orthodoxy—which has become a straitjacket we need to break out of.

I hope our experience encourages others, especially younger researchers, to explore novel approaches guided strongly by the simplicity of the observations—and to be more skeptical about their elders’ preconceptions. Ultimately, we must learn from the universe and adapt our theories to it rather than vice versa.

Boyle and I started out by tackling one of cosmology’s greatest paradoxes. If we follow the expanding universe backward in time, using Einstein’s theory of gravity and the known laws of physics, space shrinks away to a single point, the “initial singularity.”

In trying to make sense of this infinitely dense, hot beginning, theorists including Nobel laureate Roger Penrose pointed to a deep symmetry in the basic laws governing light and massless particles. This symmetry, called “conformal” symmetry, means that neither light nor massless particles actually experience the shrinking away of space at the big bang.

By exploiting this symmetry, one can follow light and particles all the way back to the beginning. Doing so, Boyle and I found we could describe the initial singularity as a “mirror”: a reflecting boundary in time (with time moving forward on one side, and backward on the other).

Picturing the big bang as a mirror neatly explains many features of the universe which might otherwise appear to conflict with the most basic laws of physics. For example, for every physical process, quantum theory allows a “mirror” process in which space is inverted, time is reversed, and every particle is replaced with its anti-particle (a particle similar to it in almost all respects, but with the opposite electric charge).

According to this powerful symmetry, called CPT symmetry, the “mirror” process should occur at precisely the same rate as the original one. One of the most basic puzzles about the universe is that it appears to violate CPT symmetry because time always runs forward and there are more particles than anti-particles.

Our mirror hypothesis restores the symmetry of the universe. When you look in a mirror, you see your mirror image behind it: if you are left-handed, the image is right-handed and vice versa. The combination of you and your mirror image are more symmetrical than you are alone.

Likewise, when Boyle and I extrapolated our universe back through the big bang, we found its mirror image, a pre-bang universe in which (relative to us) time runs backward and antiparticles outnumber particles. For this picture to be true, we don’t need the mirror universe to be real in the classical sense (just as your image in a mirror isn’t real). Quantum theory, which rules the microcosmos of atoms and particles, challenges our intuition, so at this point the best we can do is think of the mirror universe as a mathematical device which ensures that the initial condition for the universe does not violate CPT symmetry.

Surprisingly, this new picture provided an important clue to the nature of the unknown cosmic substance called dark matter. Neutrinos are very light, ghostly particles which, typically, move at close to the speed of light and which spin as they move along, like tiny tops. If you point the thumb of your left hand in the direction the neutrino moves, then your four fingers indicate the direction in which it spins. The observed, light neutrinos are called “left-handed” neutrinos.

Heavy “right-handed” neutrinos have never been seen directly, but their existence has been inferred from the observed properties of light, left-handed neutrinos. Stable, right-handed neutrinos would be the perfect candidate for dark matter because they don’t couple to any of the known forces except gravity. Before our work, it was unknown how they might have been produced in the hot early universe.

Our mirror hypothesis allowed us to calculate exactly how many would form and to show they could explain the cosmic dark matter.

A testable prediction followed: If the dark matter consists of stable, right-handed neutrinos, then one of three light neutrinos that we know of must be exactly massless. Remarkably, this prediction is now being tested using observations of the gravitational clustering of matter made by large-scale galaxy surveys.

The Entropy of Universes

Encouraged by this result, we set about tackling another big puzzle: Why is the universe so uniform and spatially flat, not curved, on the largest visible scales? The cosmic inflation scenario was, after all, invented by theorists to solve this problem.

Entropy is a concept which quantifies the number of different ways a physical system can be arranged. For example, if we put some air molecules in a box, the most likely configurations are those which maximize the entropy—with the molecules more or less smoothly spread throughout space and sharing the total energy more or less equally. These kinds of arguments are used in statistical physics, the field which underlies our understanding of heat, work, and thermodynamics.
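
[A toy illustration of that counting, added for this posting rather than taken from the article: N molecules each sit in the left or right half of a box; the number of arrangements W, and hence the entropy ln W, peaks at the even split.]

    from math import comb, log

    N = 100                          # molecules in the box
    for k in (0, 10, 25, 50):        # molecules in the left half
        W = comb(N, k)               # number of microstates with this split
        print(k, round(log(W), 1))   # entropy S = ln W is largest at k = 50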

The late physicist Stephen Hawking and collaborators famously generalized statistical physics to include gravity. Using an elegant argument, they calculated the temperature and the entropy of black holes. Using our “mirror” hypothesis, Boyle and I managed to extend their arguments to cosmology and to calculate the entropy of entire universes.

To our surprise, the universe with the highest entropy (meaning it is the most likely, just like the atoms spread out in the box) is flat and expands at an accelerated rate, just like the real one. So statistical arguments explain why the universe is flat and smooth and has a small positive accelerated expansion, with no need for cosmic inflation.

How would the primordial density variations, usually attributed to inflation, have been generated in our symmetrical mirror universe? Recently, we showed that a specific type of quantum field (a dimension zero field) generates exactly the type of density variations we observe, without inflation. Importantly, these density variations aren’t accompanied by the long wavelength gravitational waves which inflation predicts—and which haven’t been seen.

These results are very encouraging. But more work is needed to show that our new theory is both mathematically sound and physically realistic.

Even if our new theory fails, it has taught us a valuable lesson. There may well be simpler, more powerful and more testable explanations for the basic properties of the universe than those the standard orthodoxy provides.

By facing up to cosmology’s deep puzzles, guided by the observations and exploring directions as yet unexplored, we may be able to lay more secure foundations for both fundamental physics and our understanding of the universe.

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Image Credit: The mirror universe, with the big bang at the center / Neil Turok, CC BY-SA

https://singularityhub.com/2024/10/29/did-the-early-cosmos-inflate-like-a-balloon-a-mirror-universe-going-backwards-in-time-may-be-a-simpler-explanation/
Title: Settled: No Number of Monkeys will Ever Write the Works of Shakespeare
Post by: Body-by-Guinness on November 01, 2024, 03:26:32 PM
Glad we got that out of the way:

https://www.bbc.com/news/articles/c748kmvwyv9o
Title: A Physics PhD and Dog Brother friend comments
Post by: Crafty_Dog on November 04, 2024, 05:14:44 AM


Hey Marc - great catching up with you today.

Quick feedback on the article - it's definitely interesting. Worth reading. However.... (long pause) .... it's just a "theory". Actually, it's just an "hypothesis" - something is called a "theory" when its predictions match experimentally observed results. The theorists say they don't have any results (yet). Though they say they "rigorously tested the theory" - meaning they checked the mathematics diligently. But no data yet.

From the article:
"To put their theory to the test, the researchers propose a groundbreaking experiment aimed at detecting fluctuations in mass over time."

There are THOUSANDS of candidates for a Grand Unified Theory, so this is another one. It may be really interesting, it may be catching the eye of the media and other scientists. All good. But until it actually predicts something and that prediction shows up experimentally, it's just another "theory" (or rather "hypothesis"). Einstein's Theories of Special Relativity and General Relativity are notable ONLY because experiments have shown that they correctly predict real phenomena.

NOTE: The experimentalist who proves a theory usually wins the Nobel Prize, not the theorist. That's because there are far more theories than good experiments to test them.

BUT... I will say that it is notable that they think they CAN do experiments to test it - lots of famous "theories" like String Theory cannot be tested. Sadly.