The Interesting, The Strange, The News.

Archive for the ‘Physics’ Category

Scientists plan $1.5bn laser strong enough ‘to tear the fabric of space’

A laser powerful enough to tear apart the fabric of space could be built in Britain.

The major scientific project will follow in the footsteps of the Large Hadron Collider and aims to answer fundamental questions about the universe.

The laser will be capable of producing a beam of light so intense that it will be comparable to the light the Earth receives from the sun, but focused onto a speck smaller than a pinprick.

Extreme: A laser powerful enough to tear apart the fabric of space could be built in Britain.

Scientists say it will be so powerful that they will be able to boil the very fabric of space: the vacuum.

The vacuum fizzes with mysterious particles that pop in and out of existence, but the phenomenon happens so fast that no one has ever been able to prove they exist.

It is hoped the Extreme Light Infrastructure Ultra-High Field Facility would allow scientists to prove the particles are real by pulling the vacuum fabric apart.

Scientists even believe it might help them to prove whether other dimensions actually exist.

This latest experiment will follow the footsteps of the Large Hadron Collider and be the next big scientific experiment.

Professor John Collier, a scientific leader for the ELI project and director of the Central Laser Facility at Rutherford Appleton Laboratory in Didcot, Oxfordshire, said the laser would be the most powerful on earth.

‘At this kind of intensity we start to get into unexplored territory as it is an area of physics that we have never been before,’ he told the Sunday Telegraph.

The ELI ultra-high field laser, which will be completed by the end of the decade, will cost £1bn and the UK is among a number of European countries in the running to house it.

The European Commission has already authorised plans for three more lasers which will become prototypes for the ultra-high field laser.

Scientists hope the laser will also allow them to see how the particles inside an atom behave, and that it might solve the mystery of why the universe contains more matter than has ever been directly detected, by revealing what dark matter really is.


  • The ultra-high field laser will be made up of 10 beams – each more powerful than the prototype lasers.
  • It will produce 200 petawatts of power – more than 100,000 times the power of the world’s combined electricity production – but for less than a trillionth of a second.
  • The energy needed to power the laser will be stored up beforehand and then used to produce beams several feet wide, which will be combined and eventually focused down onto a tiny spot.
  • The beam will be so intense that it will produce extreme conditions that do not exist even in the center of the sun.
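The headline numbers in the list above can be sanity-checked with one line of arithmetic. A minimal sketch (the ~2 TW figure for average world electricity generation is a rough outside estimate, not from the article):

```python
# Back-of-envelope check of the quoted ELI figures.
laser_power_w = 200e15           # 200 petawatts
pulse_duration_s = 1e-12         # "less than a trillionth of a second"
world_electric_power_w = 2.0e12  # ~2 TW average world generation (assumed, rough)

ratio = laser_power_w / world_electric_power_w   # ~100,000x, matching the claim
pulse_energy_j = laser_power_w * pulse_duration_s  # ~200 kJ in a single pulse
```

The extreme power comes almost entirely from compressing a modest amount of stored energy into a vanishingly short pulse, not from an enormous energy supply.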

Powerful: The ultra-high field laser will be made up of 10 beams – each more powerful than the prototypes.

Via DailyMail

‘Super-Earth’ Found in Habitable Zone

The Milky Way abounds with low-mass planets, including small, rocky ones such as Earth. That’s the main conclusion of a team of European astronomers, based on their latest haul of extrasolar planets. The new discoveries—55 new planets, including 19 “super-Earths”—were presented here today at the Extreme Solar Systems II conference by team leader Michel Mayor of the University of Geneva in Switzerland. “We find that 40% of all Sun-like stars are accompanied by at least one planet smaller than Saturn,” he says. The number of Earth-like planets is expected to be even higher.

The new planets were found with HARPS (High Accuracy Radial velocity Planet Searcher), an extremely sensitive instrument used to analyze starlight, mounted on the 3.6-meter telescope of the European Southern Observatory (ESO) at Cerro La Silla in northern Chile. HARPS detects the minute periodic wobbles of stars, caused by the gravity of orbiting planets. So far, HARPS has discovered 155 exoplanets, including two-thirds of all planets less massive than Neptune.
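The size of the "wobble" HARPS must resolve can be sketched with the standard radial-velocity formula for a circular, edge-on orbit; the constants and planet values below are textbook figures, not numbers from the article:

```python
import math

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30   # solar mass, kg
YEAR = 3.156e7     # one year, s

def rv_semi_amplitude(m_planet_kg, period_s, m_star_kg=M_SUN):
    """Stellar reflex velocity (m/s) for a circular orbit viewed edge-on:
    K = (2*pi*G/P)^(1/3) * m_p / (M_star + m_p)^(2/3)."""
    return (2 * math.pi * G / period_s) ** (1 / 3) * m_planet_kg / (
        m_star_kg + m_planet_kg) ** (2 / 3)

k_jupiter = rv_semi_amplitude(1.898e27, 11.86 * YEAR)  # ~12 m/s
k_earth = rv_semi_amplitude(5.972e24, 1.0 * YEAR)      # ~0.09 m/s (9 cm/s)
```

HARPS works at roughly meter-per-second precision (an outside figure, not from the article), which is why Jupiter-mass planets are straightforward targets while a true Earth analog, tugging its star at only centimeters per second, still requires the sensitivity upgrades Mayor mentions below.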

Of the 19 newly found super-Earths (exoplanets between a few and 10 times the mass of Earth), the most intriguing is HD 85512b, which weighs in at only 3.6 Earth masses. Its orbit lies in the habitable zone of its parent star, which means temperatures are just right for liquid water to exist on its surface, says Lisa Kaltenegger of the Max Planck Institute for Astronomy in Heidelberg, Germany. “We’re entering an incredibly exciting period in history.”

Meanwhile, scientists disagree about which technique offers the best chances of finding the first true “Earth analog”—an Earth-like planet orbiting in the habitable zone of its Sun-like star. (HD 85512b is too massive, and its star is too cool.) Mayor says HARPS might find this Holy Grail of exoplanet research within 5 years or so, after new upgrades to increase the instrument’s sensitivity. But planet hunter Geoffrey Marcy of the University of California, Berkeley, disagrees. NASA’s Kepler space telescope is “by far the best,” he says. “We will find them if they’re there, probably within the next 2 or 3 years.”

At the meeting, Kepler co-investigator Natalie Batalha of NASA’s Ames Research Center announced that the number of exoplanet candidates from the Kepler mission has increased by some 50% since last February, to 1781. Most are less than three times the size of the Earth. Kepler, launched in March 2009, finds planets by measuring the slight periodic dimming of their parent stars, when they happen to pass between the star and Earth.
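The dimming Kepler measures is set by how much of the star's disk the planet covers, so the transit depth is just the square of the radius ratio. A minimal sketch, using standard Earth and Sun radii rather than figures from the article:

```python
R_SUN_KM = 696_000   # solar radius (standard value)
R_EARTH_KM = 6_371   # Earth radius (standard value)

def transit_depth(r_planet_km, r_star_km=R_SUN_KM):
    """Fractional dimming when the planet crosses the star: (R_p / R_*)^2."""
    return (r_planet_km / r_star_km) ** 2

depth_earth = transit_depth(R_EARTH_KM)      # ~8.4e-5, i.e. 0.0084%
depth_3x = transit_depth(3 * R_EARTH_KM)     # ~7.5e-4 for a 3-Earth-radius planet
```

A true Earth twin dims a Sun-like star by less than a hundredth of a percent, which is why most of Kepler's candidates so far are the easier-to-spot planets up to a few times Earth's size.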

No matter who finds the first Earth analog, the HARPS planets offer better prospects for detailed follow-up observations, Mayor says, because HARPS focuses on relatively nearby stars, while almost all Kepler stars are much farther away. For instance, ESO astronomer Markus Kissler-Patig predicts that the future 39.2-meter European Extremely Large Telescope (E-ELT) should be able to directly image HD 85512b. Analyzing the starlight it reflects will provide important information about the planet’s atmospheric composition. “The E-ELT will be able to probe for biomarkers,” Kissler-Patig says, referring to chemicals thought to indicate the presence of life.

While ESO is planning more-sensitive planet-hunting instruments for its existing Very Large Telescope and for the future E-ELT, Kepler is facing an uncertain future. “Kepler’s goal of finding true Earth analogs can only be reached by extending the mission duration” past its planned operational lifetime of 3.5 years, Batalha says. In February 2012, NASA will decide on a possible mission extension. Marcy is optimistic. Kepler is so incredibly successful, he says, that it seems unlikely NASA will terminate the mission next year. “I’m sure NASA is wiser than that.”


Via ScienceNow

SpaceX chief sets his sights on Mars

Brendan Smialowski / Getty Images

SpaceX CEO Elon Musk stands alongside rocket models at the National Press Club as he announces plans to build the Falcon Heavy rocket. Observers say the heavy-lift launch system could send an 11-ton payload to Mars.

Don’t expect to hear any nostalgia about the soon-to-end space shuttle era from Elon Musk, the millionaire founder of Space Exploration Technologies. Musk isn’t prone to look to the past, but rather to the future — to a “new era of spaceflight” that eventually leads to Mars.

SpaceX may be on the Red Planet sooner than you think: When I talked with him in advance of the shuttle Atlantis’ last liftoff, the 40-year-old engineer-entrepreneur told me the company’s Dragon capsule could take on a robotic mission to Mars as early as 2016. And he’s already said it’d be theoretically possible to send humans to Mars in the next 10 to 20 years —  bettering NASA’s target timeframe of the mid-2030s.

You can’t always take Musk’s timelines at face value. This is rocket science, after all, and Musk himself acknowledges that his company’s projects don’t always finish on time. But if he commits himself to a task, he tends to see it through. “It may take more time than I expected, but I’ll always come through,” he told me a year ago.

Since that interview, a lot of things have come through for SpaceX. The company has conducted successful tests of its Falcon 9 rocket and Dragon capsule. Before the end of the year, another test flight is expected to send a Dragon craft all the way to the space station for the first time. If that test is successful, SpaceX can start launching cargo to the International Space Station under the terms of a $1.6 billion NASA contract.

The company is also in line to receive $75 million more from NASA to start turning the Dragon into a crew-worthy space taxi for astronauts by 2015 or so. And just today, the company broke ground on a California launch pad that could be used by the next-generation Falcon Heavy rocket starting in 2013.

Once the Dragon and the Falcon Heavy are in service, the main pieces would be in place for a Mars mission, Musk said.

“One of the ideas we’re talking to NASA about is … using Dragon as a science delivery platform for Mars and a few other locations,” he told me. “This would possibly be several tons of payload — actually, a single Dragon mission could land with more payload than has been delivered to Mars cumulatively in history.”

SpaceX is working with NASA’s Ames Research Center in California on an interplanetary mission concept that could theoretically be put into effect for a launch “five or six years from now,” Musk said.

By that time, astronauts will once again be riding on U.S.-made spaceships to the space station, including the Dragon — that is, if the current schedules hold true. But there’s a lot of doubt surrounding those schedules. As you’d expect, the end of the space shuttle program and the shape of spaceships to come were major themes in my conversation with Musk. Here’s an edited version of the Q&A on those subjects:

Cosmic Log: A lot of people are saying that when the space shuttle stops flying, that might be the end of the American space program. The idea is that commercial spaceflight providers are not going to be able to do the job, and there won’t be sustainable interest in building the beyond-Earth-orbit rocket that NASA has on the drawing board. What’s your response to the claim that this is really the end?

Elon Musk: It flies in the face of the facts. Six months ago, we had the second launch of the Falcon 9 and the first launch of the Dragon. The Dragon orbited Earth twice, it performed orbital maneuvers, it made a precision re-entry under the control of thrusters, and it landed within a mile of our target. We brought the Dragon back, and it was actually in good enough condition that we could fly it again if we wanted to.

So as far as I’m concerned, it’s not the death of anything. What we’re really facing is quite the opposite. I think we’re at the dawn of a new era of spaceflight, one which is going to advance much faster than it ever has in the past.

The space shuttle was designed in the ’70s, and it really didn’t improve after almost 40 years. They’ve upgraded the electronics here and there, but that’s about it. That’s incredibly static when you consider how other fields of technology have improved.

Now, with the public-private partnership that NASA has established with SpaceX, and the efforts made by other companies, we’re actually going to see dramatic improvements in spaceflight technology for the first time since the ’60s. The Dragon is taking technology to a whole new level beyond the shuttle.

The shuttle is fairly constrained because it’s a winged vehicle with a landing gear. It can’t land anywhere except Earth, and even on Earth, it can land only on certain runways. It doesn’t have any ability to go beyond Earth orbit. But because the Dragon has a propulsion-based landing system and a much more capable heatshield than the shuttle’s, it can land anywhere in the solar system with a solid surface — as long as you can throw it there. The Falcon Heavy can throw it pretty much anywhere in the solar system.

Q: The Dragon certainly looks different from the shuttle, and some people might get the impression that it’s a step backward, back to the days of Apollo.

A: I’ve heard that. But I hope we can make it clear that this is actually a big step forward from the shuttle. It can do all sorts of things that the shuttle can’t do. People look at something like wings and say, yeah, that’s how a spaceship should look. But let’s say you had a boat, and you put wheels on it and drove it down the road. It’d look pretty silly, right? Well, why do you have wings in a vacuum?

Q: One of the issues that always comes up when discussing commercial involvement in NASA spaceflight is the safety issue. A lot of the critics of your program have focused on that concern as the sticking point. NASA certainly devotes a lot of attention to safety assurance, and some say that’s why it’s so expensive to put humans into space. Any attempt to cut corners on that would make the whole enterprise look questionable. How do you respond to that?

A: Well, first of all, I suspect that the people saying that wouldn’t have a problem flying on Southwest Airlines or driving a car or taking other types of transport that are not government-operated. The government does have a role in safety oversight, and anything we do for NASA goes through an extremely rigorous safety and liability examination. But I think what actually needs to happen is a dramatic improvement in safety. The current state of affairs with the shuttle is not acceptable at all. The shuttle’s accident rate is not OK. Who would get on an airplane if you had a 1.5 percent chance of dying?

Q: Do you see any sign that NASA has different standards for oversight of commercial operations and for the shuttle program? After all, there’s a whole army of engineers dealing with shuttle operations and processing.

A: I do think there are different standards. For us, the standards are higher. The shuttle, for example, has no escape system. We would not launch [astronauts on] our vehicle without an escape system, nor would NASA want us to. Also, with our vehicle, there’s far less to go wrong on any given flight. With the shuttle, if anything serious goes wrong with this extremely complex vehicle, it’s curtains. There’s no escape. If the shuttle’s level of reliability was acceptable, we could fly astronauts this year.

Q: Do you think NASA has the right vision for spaceflight? The idea is that space station resupply in low Earth orbit would be left to commercial ventures, freeing NASA up to develop the heavy-lift Space Launch System for exploration beyond Earth orbit. Some people have wondered whether the Space Launch System is really going to be necessary.

A: Personally, my view is that space transport overall should be much more of a private-public partnership, and that applies to heavy lift as well. The best use of NASA’s resources is to focus on the unique scientific instruments and payloads that are truly one-off items. That’s actually how it works right now for Earth-observing and space science missions. They launch the spacecraft primarily on United Launch Alliance rockets, a Delta or an Atlas. If it’s a probe to Mars, or to the asteroid belt, or it’s a weather satellite, it’ll go up on a United Launch Alliance rocket. Obviously, in the future, they’ll go up on our vehicles as well. I think that works pretty well, and I think it makes sense to extend that model to all sizes of rockets.

Q: So it sounds as if you see a role for SpaceX in exploration beyond Earth orbit. Do you see any scenario where a mission to the moon or Mars could be completely private-sector?

A: It’s not out of the question. I do think missions like that are ideally handled as public-private partnerships. There are questions about how you’d pay for the missions. But the absolute goal of SpaceX is to develop the technologies to make life multiplanetary, which means being able to transport huge volumes of people and cargo to Mars. So we’ll do whatever is necessary to achieve that goal.



A step closer to explaining our existence

Fred Ullrich / Fermilab

Confidence is growing in results from a particle physics experiment at the Tevatron collider that may help explain why the universe is full of matter.

Why are we here? It remains one of the largest unexplained mysteries of the universe, but particle physicists are gaining more confidence in a result from an atom smashing experiment that could be a step toward providing an answer.

We exist because the universe is full of matter and not its opposite, so-called antimatter. When the Big Bang occurred, equal parts of both should have been created and immediately annihilated each other, leaving nothing left over to build the stars, planets and us.

Thankfully, it didn’t happen that way. There’s an asymmetry between matter and antimatter. Why this is so remains inadequately explained, Stefan Soldner-Rembold, a co-spokesman for the particle physics experiment at the Fermi National Accelerator Laboratory outside Chicago, told me on Thursday.

“We are looking for a larger asymmetry than we currently know in the best theories in physics, which is called the standard model,” said Soldner-Rembold, who is based at the University of Manchester in England.

Using Fermilab’s Tevatron collider, members of the DZero experiment are smashing together protons and their antiparticles, antiprotons — an initial state that is perfectly symmetric between matter and antimatter, he explained.

“So you expect what comes out will also be symmetric in terms of matter and antimatter,” he said. “But what we observe is that there is a slight, on the order of 1 percent, asymmetry where more matter particles are produced than antiparticles.”

This 1 percent asymmetry is larger than predicted by the standard model and thus could help explain why there is more matter than antimatter in the universe.

The DZero team announced this finding of asymmetry in 2010, but their confidence in the result wasn’t sufficient to call it a discovery. At that point, there was a 0.07 percent chance the result was due to a random fluctuation in the data.

The team has now analyzed 1.5 times more data with a refined technique, increasing their confidence in the result. The probability that the asymmetry is due to a random fluctuation is now just 0.005 percent. They’d like to get that probability below 0.00005 percent before popping open the champagne.
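The probabilities quoted here map onto the "sigma" levels particle physicists use when deciding what counts as a discovery. A small sketch of that conversion using the one-sided Gaussian tail (reading the current result as roughly 3.9 sigma is my own arithmetic, not a number from the article):

```python
import math

def tail_probability(n_sigma):
    """One-sided Gaussian tail: chance a pure fluctuation reaches n_sigma."""
    return math.erfc(n_sigma / math.sqrt(2)) / 2

p3 = tail_probability(3)  # ~0.13%: "evidence", the level T2K-style hints live at
p5 = tail_probability(5)  # ~0.00003%: the customary "discovery" threshold

# DZero's 0.005% fluke probability sits just below 4 sigma -- strong, but
# short of the ~5-sigma (<0.00005%) bar the team wants before celebrating.
p_current = tail_probability(3.9)  # ~0.005%
```

This is why the champagne stays corked: going from 0.005 percent to 0.00005 percent is the gap between "very suggestive" and the field's formal standard of proof.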

The new results were presented Thursday at Fermilab.

“There are very high thresholds in physics so that people can really call something a discovery and be absolutely sure,” Soldner-Rembold said. “We are going in the right direction.”

Even more work at Fermilab and further, complementary experiments with the Large Hadron Collider in Geneva will be required to shore up confidence that what they are seeing really is real, and thus a step toward explaining why the universe has much more matter than antimatter.

“To really understand how the universe evolved is the next step,” he said. “We do a particular process in the lab. In order to say is this enough to explain the amount of matter around us is not as easy as saying 1 percent sounds good.”

And for those hoping that science has all the answers, Soldner-Rembold cautions that science will never answer the question of “why we are here, it only tries to understand the underlying laws of nature.”


Via MSNBC/John Roach

Fermilab’s MINOS Experiment Also Sees Neutrino Quick-Change


Physicists continue to close in on the mystery of neutrino oscillation — the process by which one type of neutrino morphs into another as it travels through space.

Two weeks ago, the Japanese T2K (Tokai to Kamioka) experiment announced the first evidence of a rare form of neutrino oscillation, whereby muon neutrinos turn into electron neutrinos as they travel from the beam source to the detectors.

Now Fermilab’s Main Injector Neutrino Oscillation Search (MINOS) has reported findings consistent with the T2K results, using different methods and analysis techniques than the Japanese researchers. The neutrinos in question traveled 450 miles from Fermilab’s Main Injector accelerator to a detector in the Soudan Underground Laboratory in Minnesota.

Neutrinos are tiny subatomic particles that travel very near the speed of light. They’re extremely difficult to detect, because they very rarely interact with any type of matter, even though they’re the most abundant type of particle in the known universe. Only one out of every 1,000 billion solar neutrinos would collide with an atom on its journey through the Earth.

The Standard Model of particle physics calls for three different kinds of neutrinos (electron, muon and tau, paired to the leptons known as electron, muon and tau). These “ghost particles” have no charge and very little mass, and experiments conducted over the last 10 years indicate that they can change from one type of neutrino into another.

Prior experiments — by MINOS and the OPERA experiment at the Gran Sasso National Laboratory — provided compelling evidence of muon neutrinos morphing into tau neutrinos, but catching a muon neutrino in the act of morphing into an electron neutrino is far more difficult.


The T2K signal was small: just shy of “3-sigma.” But it was still statistically strong enough, given the rarity of the event, to be considered a genuine signal, not just background noise. The experiment detected 88 candidate events for the oscillation of muon neutrinos into electron neutrinos, based on data collected between January 2010 and March 11, 2011.

In contrast, MINOS recorded a total of 62 candidate events; if this particular type of quick change does not occur, they should have recorded only 49 such events. If the T2K analysis is correct, MINOS should have seen 71 events. The slight discrepancy enables physicists to further narrow the range of values for the rate at which this transformation occurs.
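A rough feel for how strong the MINOS hint is comes from the simple square-root-of-N Poisson estimate; this back-of-envelope calculation is mine, not a figure from the article:

```python
import math

observed = 62             # candidate events MINOS recorded
expected_background = 49  # events expected if no muon->electron oscillation occurs
expected_if_t2k = 71      # events expected if the T2K analysis is exactly right

# Crude significance of the excess over background: (N_obs - N_bkg) / sqrt(N_bkg)
excess_sigma = (observed - expected_background) / math.sqrt(expected_background)
# ~1.9 sigma: an interesting hint, consistent with T2K but far from a discovery.
```

The observed count lands between the no-oscillation and full-T2K predictions, which is exactly why the comparison lets physicists narrow the allowed range for the transformation rate rather than settle the question outright.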

As always, more data is needed before an actual “discovery” can be claimed. The T2K data run was cut short because the major earthquake that devastated Japan also damaged the experiment’s muon neutrino source. But researchers expect to have the machine back online and taking more data by January 2012. With more data, the current 3-sigma signal should strengthen sufficiently to claim a solid discovery. MINOS will also continue collecting data until February 2012.

Physicists want to know more about neutrino oscillations, and their masses, because this provides a potential clue to why there is something in the universe, rather than nothing. Back when our universe was still in its infancy, matter and antimatter were colliding and annihilating each other out of existence constantly.

This process slowed down as our universe gradually cooled, but there should have been equal parts matter and antimatter. Instead, there were slightly more matter particles than antimatter, and that slight excess formed everything around us. Physicists think that neutrinos, with their teensy-tiny bits of mass, might have been the tipping point that tilted the scales to matter’s favor.

Image credit: Fermilab


Via Discovery

We May Not Live in a Hologram After All


You may remember the hubbub that a Fermilab physicist caused last year when he started to investigate some strange results coming from the GEO600 gravitational wave experiment.

In a nutshell, GEO600 — a mindbogglingly sensitive piece of kit — started to detect what particle physicist Craig Hogan interpreted as quantum “fuzziness.” This fuzziness, or blurriness on the smallest possible scales, could be interpreted as evidence for the “holographic universe” hypothesis.

This hypothesis describes the 3-dimensional universe we live in as a projection from a 2-dimensional “shell” at the very edge of the universe. As with any projection, the projected “pixels” will become fuzzy the closer you zoom in on them. The quantum fuzziness GEO600 seemed to detect could be evidence for this projection effect. The Universe is therefore a hologram, so the idea goes.

Spurred on by the GEO600 results, Hogan is currently working on a project to build a “Holometer” at Fermilab to probe these quantum scales, hopefully shedding some light on what this fuzziness could be.

However, as announced this week, a space-borne European satellite that should also be able to measure these small scales doesn’t appear to be registering any quantum fuzziness. In fact, it has yet to detect anything quantum at all, indicating that spacetime’s “graininess” must be composed of quanta far smaller than predicted — and, in my view, this puts a question mark over the interpretation of the GEO600 results.

Gamma-Ray Bursts and Grains of Quanta

The European Space Agency’s Integral gamma-ray observatory can make very precise measurements of the gamma-rays emitted by energetic (and often mysterious) gamma-ray bursts (GRBs).

GRBs are thought to be caused by the collapse of massive stars as they reach the end of their lives, explode and form neutron stars or black holes. As they explode, they blast a high-energy pulse of gamma-ray radiation from their poles, outshining entire galaxies. If a burst happens to be aligned with Earth, we detect it as a bright, transient flash.

As the gamma-rays — high-energy photons that exist at the extreme end of the electromagnetic spectrum — travel through space, their polarization (or “twist”) is affected by the spacetime they travel through.

If spacetime is composed of tiny quantum “grains,” the gamma-ray photons’ polarization should change from random polarization (at the GRB source) to biased toward a certain polarization when received by the Integral spacecraft.

Also, high-energy gamma-rays should be more twisted than lower energy gamma-rays; the difference in the polarization can therefore be used to estimate the size of the quantum grains.

What’s the Polarization?

If spacetime were smooth and continuous (as Einstein viewed the universe), the polarization would remain random, and there would be no difference between high- and low-energy photons no matter how far the gamma-rays travel. But if spacetime is composed of grains (as some quantum gravity theories predict), the farther the gamma-rays travel, the greater the polarization difference.

So, Philippe Laurent of CEA Saclay and his collaborators analyzed the polarization of gamma-rays from a very energetic gamma-ray burst. GRB 041219A occurred on Dec. 19, 2004, and it was immediately recognized as being in the top one percent of GRBs for brightness.

Also, due to its distance — 300 million light-years away — data from this explosion should have also revealed a measurable difference in the polarization between low- and high-energy gamma-ray photons.

Alas, no polarization difference was detected.

Some theories predict the quantum graininess should manifest itself at scales of around 10⁻³⁵ meters — a scale known as the Planck length, the fundamental scale for quantum dynamics. Through the precise nature of its polarization measurements, Integral hasn’t found any quantum graininess down to a scale of 10⁻⁴⁸ meters; that’s 10,000,000,000,000 times smaller than the “fundamental” Planck length.
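The gap between those two scales can be checked in one line (the rounded Planck length used here is an assumption for the sketch; the accepted value is about 1.6 × 10⁻³⁵ m):

```python
planck_length_m = 1e-35   # rounded Planck length, as quoted in the article
integral_limit_m = 1e-48  # smallest scale at which Integral finds no graininess

ratio = planck_length_m / integral_limit_m  # ~1e13, i.e. ten trillion
```

Thirteen orders of magnitude below the scale where graininess was expected is what makes the null result so constraining for quantum gravity models.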

So, if quantum predictions are correct, the spacetime quanta must be made from grains that are 10⁻⁴⁸ meters in scale or less.

What does this mean?

Holographic Universe… or Not?

For Hogan’s interpretation of the GEO600 results to be correct, this graininess should be measurable over larger scales. In fact, GEO600 started to detect quantum fuzziness at scales of around 10⁻¹⁶ meters — that’s 10,000,000,000,000,000,000 times larger than the Planck length.

At first glance, the Integral results appear to contradict the GEO600 interpretation, disputing the holographic universe hypothesis altogether. If these “fuzzy” 10⁻¹⁶-meter scales aren’t detected through Integral’s polarization measurements of gamma-rays, perhaps the GEO600 quantum fuzziness is the effect of an overlooked instrumental error.

However, all may not be lost.

The Integral polarization results depend on spacetime being constructed from discrete quanta that behave in a way that fits with quantum theory. The holographic universe hypothesis goes one step further, constructing 3-dimensional spacetime from projections of a 2-dimensional “shell” — perhaps gamma-ray photons behave differently in this fuzzy, projected, quantum world, and this could be why no polarization difference between gamma-ray photons is detected.

Proving or disproving a holographic universe, of course, isn’t the focus of this Integral study; it is an attempt at revealing the very fabric of spacetime, helping physicists understand what our Universe is made of.

“This is a very important result in fundamental physics and will rule out some string theories and loop quantum gravity theories,” said Laurent in the ESA press release.

“Fundamental physics is a less obvious application for the gamma-ray observatory, Integral,” added Christoph Winkler, ESA’s Integral Project Scientist. “Nevertheless, it has allowed us to take a big step forward in investigating the nature of space itself.”


Via Physorg.com

Discovery Adds Mystery to Earth’s Genesis

Artist's conception of a dusty planet-forming disk orbiting a stellar object known as IRS 46.

Earth and the other rocky planets aren’t made out of the solar system’s original starting material, two new studies reveal.

Scientists examined solar particles snagged in space by NASA’s Genesis probe, whose return capsule crash-landed on Earth in 2004. These salvaged samples show that the sun’s basic building blocks differ significantly from those of Earth, the moon and other denizens of the inner solar system, researchers said.

Nearly 4.6 billion years ago, the results suggest, some process altered many of the tiny pieces that eventually coalesced into the rocky planets, after the sun had already formed.

“From any kind of consensus view, or longer historical view, this is a surprising result,” said Kevin McKeegan of UCLA, lead author of one of the studies. “And it’s just one more example of how the Earth is not the center of everything.”

Salvaging the samples

The Genesis spacecraft launched in 2001 and set up shop about 900,000 miles (1.5 million kilometers) from Earth. It spent more than two years grabbing bits of the solar wind, the million-mph stream of charged particles blowing from the sun.

The idea was to give scientists an in-depth look at the sun’s composition, which in turn could help them better understand the formation and evolution of the solar system.

To that end, Genesis sent its sample-loaded return capsule back to Earth in September 2004. But things didn’t go well; the capsule’s parachute failed to deploy, and it smashed into the Utah dirt at 190 mph (306 kph).

While some of Genesis’ samples were destroyed in the crash, others were salvageable, as the two new studies show. Two different research teams looked at the solar wind particles’ oxygen and nitrogen — the most abundant elements found in Earth’s crust and atmosphere, respectively.

And they did so with a great deal of care, knowing that the crash had limited their supplies of pristine solar material.

“The stakes were raised on the samples that did survive well,” McKeegan told SPACE.com. “There wasn’t as much to go around.”

The Genesis return capsule slammed into the Utah dirt at nearly 200 mph on Sept. 8, 2004 when its parachute failed to deploy.

Analyzing oxygen

McKeegan and his team measured the abundance of solar wind oxygen isotopes. Isotopes are versions of an element that have different numbers of neutrons in their atomic nuclei. Oxygen has three stable isotopes: oxygen-16 (eight neutrons), oxygen-17 (nine neutrons) and oxygen-18 (ten neutrons).

The researchers found that the sun has significantly more oxygen-16, relative to the other two isotopes, than Earth. Some process enriched the stuff that formed our planet — and the other rocky bodies in the inner solar system — with oxygen-17 and oxygen-18 by about 7 percent.
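Comparisons like this are just percentage differences between two isotope ratios. A minimal sketch of the arithmetic, using illustrative placeholder ratios rather than actual Genesis measurements:

```python
# Percent difference between a sample's isotope ratio and a reference ratio.
# The numeric values below are illustrative placeholders, not Genesis data.

def enrichment_percent(sample_ratio: float, reference_ratio: float) -> float:
    """Return the percent by which sample_ratio exceeds reference_ratio."""
    return (sample_ratio / reference_ratio - 1.0) * 100.0

# Hypothetical example: Earth's oxygen-18/oxygen-16 ratio sitting about
# 7 percent above the solar value, as the study reports for rocky bodies.
solar = 0.0020          # placeholder solar 18O/16O ratio
earth = solar * 1.07    # 7 percent enrichment relative to the sun
print(round(enrichment_percent(earth, solar), 1))  # prints 7.0
```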

While scientists don’t yet know for sure how this happened, they have some ideas. The leading contender, McKeegan said, may be a process called “isotopic self-shielding.”

About 4.6 billion years ago, the planets had not yet coalesced out of the solar nebula, a thick cloud of dust and gas. Much of the oxygen in this cloud was probably bound up in gaseous carbon monoxide (CO) molecules.

But the oxygen didn’t stay bound up forever. High-energy ultraviolet light from the newly formed sun (or nearby stars) blasted into the cloud, breaking apart the CO. The liberated oxygen quickly glommed onto other atoms, forming molecules that eventually became the rocky building blocks of planets.

Breaking up a CO molecule required photons of slightly different energies depending on which oxygen isotope the molecule contained. Because oxygen-16 is far more common than either of the other two isotopes, CO containing oxygen-16 would have been far more abundant throughout the solar nebula, researchers said.

The result, the self-shielding theory goes, is that many of the photons needed to break up the oxygen-16 CO were “used up,” or absorbed, at the edges of the solar nebula, leaving much of the oxygen-16 CO in the cloud’s interior intact.

By contrast, relatively more of the photons that could strip out oxygen-17 and oxygen-18 got through to the inner parts of the cloud, freeing these isotopes, which were eventually incorporated into the rocky planets. And that, according to the theory, is why the sun and Earth’s oxygen isotope abundances are so different.

“The result that we’re publishing this week gives support to the self-shielding idea,” McKeegan said. “But we don’t know the answer yet.”

Nitrogen, too

In a separate study, another research team led by Bernard Marty of Nancy University in France analyzed the nitrogen isotopes in Genesis’ samples. (Nitrogen has two stable isotopes: nitrogen-14, which has seven neutrons, and nitrogen-15, which has eight.)

Marty and his colleagues found an even more dramatic difference than McKeegan’s group did: The solar wind has about 40 percent less nitrogen-15 (compared to nitrogen-14) than do samples taken from Earth’s atmosphere.

Previous studies had hinted that the sun’s nitrogen might be very different from that of Earth, Mars and other rocky bodies in the inner solar system, Marty said. But the new study establishes this firmly.

“Before Genesis and the present measurement of the N isotopic composition of the solar wind and by extension of the sun, it was not possible to understand the logic of such variations,” Marty told SPACE.com in an email interview. “Now we understand that the starting composition, the solar nebula, was poor in 15N, so that variations among solar system objects are the result of mixing with a 15N-rich end-member.”

As to how this enrichment of nitrogen-15 could have happened, Marty also suggests some type of self-shielding as a possible mechanism. But it’s not a certainty.

“This is a scenario that is consistent with present-day observations,” he said. “We cannot eliminate yet the possibility that these 15N-rich compounds were imported from outer space as dust in the solar system.”

The new results also suggest that most nanodiamonds — tiny carbon specks that are a major component of stardust — likely formed in our own solar system, because they share similar nitrogen isotope ratios with the sun. Some scientists have regarded nanodiamonds as being primarily presolar, thinking they were ejected from other stellar systems by supernova explosions.

Both studies appear in the June 23 issue of the journal Science.

Genesis’ legacy

The two new studies should help scientists get a better understanding of the solar system’s early days, researchers said.

And the results should help rehabilitate the reputation of the $264 million Genesis mission, showing that the capsule crash didn’t render it a failure, McKeegan said.

“We managed to accomplish all the science that we set out to do, all the important stuff,” he said. “The enduring image in everybody’s mind — the picture of the crashed spacecraft in the desert — will be more of a footnote instead of the primary thing that people remember. That’s my hope, anyway.”


Via Space