The Interesting, The Strange, The News.

Japan’s citizen scientists map radiation, DIY-style

With the Japanese government only providing spotty information about the radiation leaking from the damaged Fukushima nuclear plant in the early days after the devastating March 11 earthquake and tsunami, a group of tech-minded citizen scientists set out to fill in the “black holes” in the knowledge base.

They did so by crafting their own Geiger counters and handing them out to volunteers in the disaster area to measure the fallout. Months later, they have assembled thousands of radiation readings plotted on maps that they hope will one day be an invaluable resource for researchers studying the impact of the meltdown at the crippled nuclear complex.

Volunteer Toshikatsu Watanabe, left, and Safecast’s Kalin Kozhuharov take radiation measurements in Koriyama, Japan.

The volunteer network of scientists, tech enthusiasts and residents of Japan collectively known as Safecast (an amalgam of “safety” and “broadcast”) sprang to life in the weeks after the devastating 9.0-magnitude earthquake and tsunami struck Japan, cutting off power to the Fukushima Daiichi Nuclear Power Station and knocking out its backup generators. That shut down the plant’s cooling system, triggering meltdowns or partial meltdowns in three of the plant’s six reactors, followed by explosions that released radioactive substances into the air and allowed contaminated water to leak into the ocean.

“For the scientific community, this is a huge chance to further understand what this all means,” said Pieter Franken, co-founder of Safecast and a senior researcher at Keio University in Tokyo, which is collaborating on the project. “Chernobyl was 25 years ago and delivered lots of information. But we’re now in the Internet age, and we have a huge opportunity to do a much better job in measuring it and tracking it.”

Residents in the surrounding areas were understandably alarmed, but in the early days after the disaster, information from the government came in bits and pieces, and was difficult to find.

Franken and Sean Bonner, a Los Angeles-based technology buff involved in numerous online citizen-involved projects, saw an opportunity to use technology to augment the government’s reports and to make the information widely available.

The pair found Uncorked Studios, a Portland, Ore., website development firm, which wanted to map the radiation numbers from all sources “to try to get a better picture of things on a larger scale,” Bonner said.

The initial effort resulted in a map that revealed the dearth of information available: “We realized that there were some massive holes and that the data that was being published was not that specific,” said Bonner. “There would be one reading for an entire city. But we wouldn’t know exactly where in the city that reading was taken.”

With so many “unknowns,” the group decided to buy as many Geiger counters as possible and distribute them to people in the map’s “black holes,” Bonner said. But that wasn’t feasible because the supply of the radiation-measuring devices was limited, he said.

So Safecast turned to a source they knew well: Hackerspaces, a loose confederation of high-tech tinkerers around the globe.

The TokyoHackerSpace had already drafted a to-do list in the disaster’s aftermath that included radiation monitoring. But with Safecast’s encouragement, the group stepped up its efforts. Members soon figured out how to build basic Geiger counters from Geiger tubes (the sensors that detect radiation) purchased through an initial fundraising campaign, and modified them so they could be attached to vehicles and upload data to the Internet, Christopher Wang, a specialist in sensor networks also known by his hacker nickname “Akiba,” wrote in an email to msnbc.com.

After meeting Safecast, the hackers decided the best use of the jury-rigged devices would be to drive around taking measurements, allowing one “Geiger counter to cover a huge amount of range,” Wang wrote.

“We put together a custom circuit board that would mount on the outside of a car and had GPS (for timestamp and location data), an input for the Geiger counter, an SD card slot (for data logging), and wireless communication (to send the data inside the car and let the driver know if they are in an area with high radiation),” he said.
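
Safecast’s actual firmware isn’t described here, but the data path is simple enough to sketch. The following Python fragment is purely illustrative: the record layout and the counts-per-microsievert factor are assumptions, not Safecast specifications. Each record pairs a GPS timestamp and position with the raw count and a derived dose rate.

    # Illustrative sketch only; not Safecast's actual firmware or log format.
    # Assumes a tube that emits one pulse per detected count and a GPS
    # receiver that supplies time and position for each logging interval.

    CPM_PER_USV_H = 334.0  # tube-dependent conversion factor; a value often
                           # cited for pancake tubes, treated here as an assumption

    def dose_rate_usv_h(counts, seconds):
        """Convert raw counts over an interval to an approximate dose rate."""
        cpm = counts * 60.0 / seconds
        return cpm / CPM_PER_USV_H

    def log_line(timestamp, lat, lon, counts, seconds=60.0):
        """One CSV record: when, where, raw counts, derived microsieverts/hour."""
        return "%s,%.5f,%.5f,%d,%.3f" % (
            timestamp, lat, lon, counts, dose_rate_usv_h(counts, seconds))

    # Example: 105 counts over one minute at an arbitrary point
    print(log_line("2011-06-14T09:30:00Z", 37.40000, 140.38000, 105))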

Other hackerspaces around the world — such as CRASH space in Los Angeles — soon enlisted in the effort and before long Safecast had the resources to launch an ambitious measuring and mapping effort.

Components of the jury-rigged Geiger counters.

While signing up volunteers, Safecast also developed a training regimen so the recruits would be able to take reliable readings with the instruments and send the data to the group.

Having average citizens involved was crucial, Franken said.

“We want to bring the radiation levels to people’s doorstep, so people can see around their house what is happening,” he said.

Safecast took its first reading on April 16. Today, it has about 50 regular volunteers who collect data from their homes or while driving, build devices or assist in other ways. Those using vehicles equipped with Geiger counters cover an area that Franken estimates to be about 620 miles long by 185 miles wide. To date, they’ve collected 251,000 data points from their drives and fixed reporting stations, and have received about 60,000 more from other sources, including people with their own Geiger counters.

Safecast publishes the data on its website and pushes it out to a number of other places so the information can be used by the greatest number of people, Bonner said. It also aggregates radiation data from a number of sources, including the Japanese government.
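
For those who want the raw numbers, the dataset is also reachable programmatically. A minimal Python sketch follows, assuming the measurements endpoint and parameter names of Safecast’s public API at api.safecast.org; verify against the current documentation before relying on them.

    # Fetch readings near a point from Safecast's public API.
    # The endpoint and parameter names are assumptions based on the API's
    # public documentation and may have changed.
    import requests

    def nearby_measurements(lat, lon, meters=1000):
        resp = requests.get(
            "https://api.safecast.org/measurements.json",
            params={"latitude": lat, "longitude": lon, "distance": meters},
            timeout=30,
        )
        resp.raise_for_status()
        return resp.json()  # a list of readings with value, unit, captured_at

    for m in nearby_measurements(37.4225, 141.0329)[:5]:
        print(m.get("captured_at"), m.get("value"), m.get("unit"))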

A Safecast map shows radiation readings from northeastern Japan.

The color-coded maps that Safecast has published don’t always agree with the government’s readings. But Franken said the effort isn’t intended to suggest that the government’s information is bad. The government maintains a website with readings of environmental radioactivity levels by prefecture.

“We really don’t want to say that the government is wrong,” he said. “And, in fact, in many cases we find that the measurements are fairly much in sync where they are comparable — we have just much more data points and locations measured.”

For example, Safecast’s mapping has revealed some radiation hotspots far from the plant, while other areas closer to it show lower levels. This is due to local weather conditions and air flow, meaning distribution of radioactive materials is not just a matter of proximity, Franken said.

“It’s not so predictable and it really pays to go and map the whole area, and literally find areas that are higher or lower as we go,” he said, noting that in some cases radiation levels can vary by street and even within a home.

“It’s kind of a heavy task because it requires a certain amount of guts to go and do it,” he said of the volunteers, noting he had recently trained a woman and her 12-year-old son in Fukushima City how to measure radiation.

But knowing what the levels are has helped ease some of the anxiety over the radiation exposure, Franken said.

“The measurements may or may not affect people’s decisions but in many cases we see that it more or less gives a sense of confidence that this is what it is and, ‘yeah, I’m going to stay and this is probably going to be manageable,’ or ‘no, I really don’t want to take the risk for my family, I’m going to avoid this.’”

One of the volunteers helping in the effort is Brett Waterman, a 46-year-old Australian who runs an English-language after-school program for children nearly 30 miles from the Fukushima plant, in the city of Iwaki. He has been surveying the radiation levels using a Geiger counter mounted on his car.

“There are many people who have decided that the lack of information implied that there was too much risk so they just decided to leave,” he said.

But through his work, he has learned that the radiation levels were low in the area.

“We can’t see it, but if we map it out, like we are doing street by street, we can sort of start to see it in a sense. We can get a picture of what this radiation stuff is,” he said.

His 13-year-old son is a “significant motivator” for him to take the readings. He noted that though residents don’t yet know what the long-term effects of the radiation will be, the information will be key in the future.

“In 10 years or 20 years’ time, you can’t go back to three months after the event and then find out what the data was like. But if we record it now, and then we continue to record it over the months and years to come, then from a scientific and a community point of view there is a database that can be referenced.”

Some researchers and government agencies welcome Safecast’s endeavor. Andrew Maidment, associate professor of radiology at the Hospital of the University of Pennsylvania, said the efforts were “necessary and helpful,” though he added two “cautionary notes.”

“The first is that the data are only useful, if it is clear (1) how the measurements were performed and (2) exactly where the measurements are performed,” he wrote in an email to msnbc.com. “In general, it is very easy to get erroneous measurements; consistency in following a specific protocol and lots of practice are necessary to do this right. … However, I will say that the data looks consistent since there are repeated measurements and they are spatially correlated. The second problem is that interpretation of the data is hard. Thus, the use of a color code is questionable.”

Japan’s Ministry of Education, Culture, Sports, Science and Technology did not respond to emails and a call seeking comment on the project.

The U.S. Nuclear Regulatory Commission said it was not in a position to comment on the initiative, but public affairs officer Scott Burnell noted in an email: “Speaking very generally, significant training and specialized equipment is required to provide the most accurate surveying and analysis of radioactive materials in the environment.”

Franken said Safecast encouraged dialogue with critics and supporters: “We feel that it is good to have an independent measurement available to people … I think just having more is probably better,” he said.

And Bonner said the initiative has the potential to eventually extend far beyond Japan.

“What all of this did sort of brought to light the fact that this data doesn’t exist in the quantities that it should and is not as readily available as would be helpful,” he said. “So while Japan is the focus at the moment, you know, longer term we sort of are shifting to a global outlook. There is a lot more ground to cover once everything in Japan is wrapped up.”


Via MSNBC/Miranda Leitsinger


Microbe could make biofuels hot


A 94°C geothermal pool, with a level-maintaining siphon, near Gerlach, Nevada. Sediment from the floor of this pool was enriched on pulverized miscanthus at 90°C and subsequently transferred to filter paper in order to isolate microbes able to subsist on cellulose alone.

A record-breaking microbe that thrives while munching plant material at near boiling temperatures has been discovered in a Nevada hot spring, researchers announced in a study published today.

Scientists are eyeing the microbe’s enzyme responsible for breaking down cellulose — called a cellulase — as a potential workhorse in the production of biofuels and other industrial processes.

Cellulose is a chain of linked sugar molecules that makes up the woody fiber of plants. To produce biofuels, enzymes are required to break down cellulose into its constituent sugars so that yeasts can then ferment them into the type of alcohol that makes cars (not people) go vroom.
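
In outline, the chemistry is the standard two-step route: enzymatic hydrolysis of the cellulose polymer into glucose, then yeast fermentation of the glucose into ethanol:

    (C6H10O5)n + n H2O   --cellulase-->   n C6H12O6      (cellulose to glucose)
    C6H12O6   --yeast-->   2 C2H5OH + 2 CO2              (glucose to ethanol)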

At the industrial scale, this process is done most efficiently at high temperatures that kill other microbes that could otherwise contaminate the reaction, Douglas Clark, a chemical and biomolecular engineer at the University of California at Berkeley, told me today.

“So finding cellulases that can operate at those temperatures are of interest,” he said.

Hot spring
That’s what led Clark, microbiologist Frank Robb from the University of Maryland, and colleagues to collect sediment and water samples from the Great Boiling Springs near Gerlach, Nevada. The spring is 203 degrees F, just short of boiling.

“It’s on private land and has been surrounded by a low wall to keep cattle from going into it and that maintains the temperature,” Robb explained to me today, noting that most hot springs have varying temperatures depending on the weather and water levels in the spring.

In addition, a siphon has been added to the spring to keep it from overflowing. The combination gives whatever microbes are in there no choice but to grow at high temperatures, Robb noted. Bits of grass and woody material blown into the spring serve as a food source.

The team grew microbes found in the samples on pulverized miscanthus, a type of grass that is a common biofuel feedstock, to isolate the microbes that grow with plant fiber as their only source of carbon.

They then sequenced the community of surviving microbes, which indicated that three species of Archaea, a type of single-celled microorganism, were able to use cellulose as food. Genetic techniques identified the specific cellulase involved in the breakdown of cellulose.

This cellulase, dubbed EBI-244, was found in the most abundant of the three Archaea.

“We didn’t really expect to find an organism that could grow at such a high temperature and degrade cellulose in this particular environment. But you never know,” Clark told me. “It really underscores the diversity of life. And, obviously, if you don’t look, you won’t find it.”

Too hot
The enzyme EBI-244 works optimally at 228 degrees F (109 degrees C), which is actually too hot for the efficient breakdown of cellulose into fermentable sugars due to side reactions that can occur, Clark noted.

“But it is interesting to know that such cellulases are out there,” Clark said. “And then this cellulase might also serve as a good starting point to be engineered to work at a lower temperature but maintain the high stability that it has naturally evolved to work at such high temperatures.”

Robb likened this engineering process to building a street car from parts used on cars found at the racetrack. “The enzyme itself could be the parts bin,” he said.

So, the enzyme itself probably won’t be hard at work anytime soon producing fuel to put in your gas tank, but it does lead researchers down the road to engineering the biofuels of the future. What’s more, EBI-244 is a record holder for heat tolerance in cellulase.

“It is always nice to have a record breaker,” Clark noted. “It adds to that wow factor a little bit.”



A step closer to explaining our existence

Fred Ullrich / Fermilab

Confidence is growing in results from a particle physics experiment at the Tevatron collider that may help explain why the universe is full of matter.

Why are we here? It remains one of the greatest unsolved mysteries of the universe, but particle physicists are gaining more confidence in a result from an atom-smashing experiment that could be a step toward providing an answer.

We exist because the universe is full of matter and not its opposite, so-called antimatter. When the Big Bang occurred, equal parts of both should have been created and immediately annihilated each other, leaving nothing left over to build the stars, planets and us.

Thankfully, it didn’t happen that way. There’s an asymmetry between matter and antimatter. Why this is so remains inadequately explained, Stefan Soldner-Rembold, a co-spokesman for the particle physics experiment at the Fermi National Accelerator Laboratory outside of Chicago, told me on Thursday.

“We are looking for a larger asymmetry than we currently know in the best theories in physics, which is called the standard model,” said Soldner-Rembold, who is based at the University of Manchester in England.

Using Fermilab’s Tevatron collider, members of the DZero experiment are smashing together protons and their antiparticles, called antiprotons, which are perfectly symmetric in terms of matter and antimatter, he explained.

“So you expect what comes out will also be symmetric in terms of matter and antimatter,” he said. “But what we observe is that there is a slight, on the order of 1 percent, asymmetry where more matter particles are produced than antiparticles.”

This 1 percent asymmetry is larger than predicted by the standard model and thus helps explain why there is more matter than antimatter in the universe.

The DZero team announced this finding of asymmetry in 2010, but their confidence in the result wasn’t sufficient to call it a discovery. At that point, there was a 0.07 percent chance the result was due to a random fluctuation in the data.

The team has now analyzed 1.5 times more data with a refined technique, increasing their confidence in the result. The probability that the asymmetry is due to a random fluctuation is now just 0.005 percent. They’d like to get to an uncertainty of less than 0.00005 percent before popping open the champagne.
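
Those percentages map onto the “sigma” levels physicists usually quote, and the conversion is a one-liner, though conventions differ on whether one counts fluctuations in one direction or both. A quick sketch in Python:

    # Convert a significance in standard deviations ("sigma") into the
    # probability that a pure fluctuation would look at least this big.
    # Particle physics results are often quoted one-tailed.
    from math import erfc, sqrt

    def p_one_tailed(sigma):
        return 0.5 * erfc(sigma / sqrt(2))

    def p_two_tailed(sigma):
        return erfc(sigma / sqrt(2))

    for s in (3.0, 3.9, 5.0):
        print("%.1f sigma: one-tailed %.6f%% (two-tailed %.6f%%)"
              % (s, 100 * p_one_tailed(s), 100 * p_two_tailed(s)))
    # 3.9 sigma gives roughly the 0.005 percent quoted above; 5 sigma,
    # around 0.00005 percent, is the conventional bar for a discovery.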

The new results were presented Thursday at Fermilab.

“There are very high thresholds in physics so that people can really call something a discovery and be absolutely sure,” Soldner-Rembold said. “We are going in the right direction.”

Even more work at Fermilab, and further complementary experiments with the Large Hadron Collider in Geneva, will be required to shore up confidence that what they are seeing is real, and thus a step toward explaining why the universe has much more matter than antimatter.

“To really understand how the universe evolved is the next step,” he said. “We do a particular process in the lab. In order to say is this enough to explain the amount of matter around us is not as easy as saying 1 percent sounds good.”

And for those hoping that science has all the answers, Soldner-Rembold cautions that science will never answer the question of “why we are here, it only tries to understand the underlying laws of nature.”


Via MSNBC/John Roach


Fermilab’s MINOS Experiment Also Sees Neutrino Quick-Change


Physicists continue to close in on the mystery of neutrino oscillation — the process by which one type of neutrino morphs into another as it travels through space.

Two weeks ago, the Japanese T2K (Tokai to Kamioka) experiment announced the first evidence of a rare form of neutrino oscillation, whereby muon neutrinos turn into electron neutrinos as they travel from the beam source to the detectors.

Now Fermilab’s Main Injector Neutrino Oscillation Search (MINOS) has reported findings consistent with the T2K results, using different methods and analysis techniques than the Japanese researchers. The neutrinos in question traveled 450 miles from Fermilab’s Main Injector accelerator to a detector in the Soudan Underground Laboratory in Minnesota.

Neutrinos are tiny subatomic particles that travel very near the speed of light. They’re extremely difficult to detect, because they very rarely interact with any type of matter, even though they’re the most abundant type of particle in the known universe. Only one out of every 1,000 billion solar neutrinos would collide with an atom on its journey through the Earth.

The Standard Model of particle physics calls for three different kinds of neutrinos (electron, muon and tau, paired to the leptons known as electron, muon and tau). These “ghost particles” have no charge and very little mass, and experiments conducted over the last 10 years indicate that they can change from one type of neutrino into another.

Prior experiments — by MINOS and the OPERA experiment at the Gran Sasso National Laboratory — provided compelling evidence of muon neutrinos morphing into tau neutrinos, but a muon neutrino morphing into an electron neutrino is far more difficult to catch in the act.


The T2K signal was small: just shy of “3-sigma.” But it was still statistically strong enough, given the rarity of the event, to be considered a genuine signal, not just background noise. The experiment detected 88 candidate events for the oscillation of muon neutrinos into electron neutrinos, based on data collected between January 2010 and March 11, 2011.

In contrast, MINOS recorded a total of 62 candidate events; if this particular type of quick change does not occur, they should have recorded only 49 such events. If the T2K analysis is correct, MINOS should have seen 71 events. The slight discrepancy enables physicists to further narrow the range of values for the rate at which this transformation occurs.
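
As a rough gauge of what those counts mean, simple Poisson statistics (which ignore the experiments’ careful treatment of backgrounds and systematic uncertainties) put the MINOS excess at around two standard deviations:

    # Crude Poisson estimate of the MINOS excess; a counting-statistics
    # sketch only, not the collaboration's actual analysis.
    from math import sqrt

    observed = 62          # candidate events seen
    background = 49        # expected with no muon -> electron oscillation
    t2k_prediction = 71    # expected if the T2K-preferred rate is right

    excess = observed - background
    significance = excess / sqrt(background)  # rough sigma
    print("excess: %d events, roughly %.1f sigma" % (excess, significance))
    # ~1.9 sigma: a hint favoring some oscillation, while falling short
    # of the 71 events the T2K-preferred rate would imply.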

As always, more data is needed before an actual “discovery” can be claimed. The T2K data run was cut short because the major earthquake that devastated Japan also damaged the experiment’s muon neutrino source. But researchers expect to have the machine back online and taking more data by January 2012. With more data, the current 3-sigma signal should strengthen sufficiently to claim a solid discovery. MINOS will also continue collecting data until February 2012.

Physicists want to know more about neutrino oscillations, and their masses, because this provides a potential clue to why there is something in the universe, rather than nothing. Back when our universe was still in its infancy, matter and antimatter were colliding and annihilating each other out of existence constantly.

This process slowed down as our universe gradually cooled, but there should have been equal parts matter and antimatter. Instead, there were slightly more matter particles than antimatter, and that slight excess formed everything around us. Physicists think that neutrinos, with their teensy-tiny bits of mass, might have been the tipping point that tilted the scales to matter’s favor.

Image credit: Fermilab


Via Discovery


Solstice Sun Storm May Spark Dazzling Northern Lights Today

Norwegian photographer and skywatcher Terje Sorgjerd created an amazing video of the March 2011 auroras, or northern lights, which appear in this still from his project, entitled “The Aurora.” CREDIT: Terje Sorgjerd

A wave of sun particles unleashed during a strong solar flare this week is arriving at Earth today (June 24) and could touch off a dazzling northern lights display, NASA officials say.

The solar storm occurred Tuesday, June 21, during Earth’s solstice, which marked the first day of summer in the Northern Hemisphere and the start of winter in the Southern Hemisphere.

The storm triggered a powerful explosion on the sun, called a coronal mass ejection, which sent a vast wave of solar particles directly at Earth at a speed of about 1.4 million mph (2.3 million kph). Those particles are now buffeting Earth’s magnetic field in interactions that could amplify the planet’s polar auroras, also known as the northern and southern lights.
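
That speed squares with the roughly three-day gap between Tuesday’s eruption and today’s arrival. Treating the speed as constant (real ejections speed up or slow down in the solar wind), the arithmetic is simple:

    # Back-of-the-envelope Sun-to-Earth transit time, assuming constant speed.
    distance_miles = 93_000_000   # mean Sun-Earth distance
    speed_mph = 1_400_000         # speed quoted for this ejection

    hours = distance_miles / speed_mph
    print("~%.0f hours, about %.1f days" % (hours, hours / 24))  # ~66 h, ~2.8 days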

“High-latitude sky watchers should be alert for auroras,” officials with NASA’s Goddard Space Flight Center said in an update today.

The SOHO sun observatory caught this view of a large solar flare and coronal mass ejection (top of sun) erupting from the sun’s surface early on June 21, 2011. CREDIT: SOHO/NASA/ESA

Supercharged auroras

Auroras occur when solar wind particles collide with atoms of oxygen and nitrogen in Earth’s upper atmosphere. The interaction excites the atoms, which then emit light (the aurora) as they return to their normal energy level.

Tuesday’s solar flare registered as a class C7.7 and lasted for several hours. There are three main classes of solar flares: C-class flares are the weakest, M-class flares are medium-strength, and the most intense solar storms register as X-class flares.
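
The letters index a logarithmic scale of peak soft X-ray flux measured by the GOES satellites: each class is ten times stronger than the last, and the trailing number is a multiplier within the class. A small sketch, using the standard GOES flux definitions:

    # GOES soft X-ray flare classes: peak flux at each class letter, in W/m^2.
    # Each class is ten times the previous; the number is a multiplier, so
    # this week's C7.7 flare peaked near 7.7e-6 W/m^2.
    CLASS_FLUX = {"C": 1e-6, "M": 1e-5, "X": 1e-4}  # weaker A and B classes exist too

    def peak_flux(flare):
        return CLASS_FLUX[flare[0]] * float(flare[1:])

    print(peak_flux("C7.7"))  # 7.7e-06
    print(peak_flux("M5.0"))  # 5e-05, about 6.5 times stronger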

There is a 30 percent to 35 percent chance of a minor geomagnetic storm in Earth’s atmosphere today from this week’s storm, NASA officials said.


Partial Halo Coronal Mass Ejection

A broadly widening cloud of particles, observed by SOHO’s C3 coronagraph, rushed away from the Sun as a coronal mass ejection (CME) erupted over about 12 hours (June 14, 2011). Data from the Solar Dynamics Observatory shows an eruptive prominence breaking away from the Sun about where the event originated. While the originating event did not appear to be substantial, the particle cloud was pretty impressive. The bright circle with an extending horizontal line (above and left of the blue occulting disk) is a distortion caused by the brightness of planet Mercury. CREDIT: SOHO (ESA & NASA)

The active sun

This week’s solar flare was detected by the space-based Solar and Heliospheric Observatory (SOHO) operated by NASA and the European Space Agency. It came just weeks after another strong solar flare on June 7, which unleashed a massive coronal mass ejection that stunned astronomers with its intensity.

The June 7 event kicked up a wave of plasma that rained back down on the sun over an area 75 times the width of Earth. The leading edge of the particles that erupted from the sun was traveling at about 3.5 million mph (5.7 million kph), SOHO officials have said.

Another coronal mass ejection on June 14 unleashed an eerie wave of material that formed a partial halo as it expanded into space.

The most severe solar storms, when aimed at Earth, can pose a danger to astronauts in space, satellites and even ground-based communications and power systems. This week’s solar flare, however, is not powerful enough to pose a serious risk, NASA officials said.

The sun is currently in an active period of its 11-year solar cycle. NASA and other space and weather agencies are keeping a close watch on the sun using space-based observatories, satellites and ground-based monitoring systems.


Via Space


Ice spray shooting out of Saturn moon points to a giant ocean lurking beneath its surface

Scientists have collected the strongest evidence yet that Saturn’s moon Enceladus has a large saltwater ocean lurking beneath its surface.

Samples of ice spray shooting out of the moon have been collected by Nasa’s Cassini spacecraft during one of its frequent Saturn fly-bys.

The plumes shooting water vapor and tiny grains of ice into space were originally discovered emanating from Enceladus – one of Saturn’s dozens of known moons – by the Cassini spacecraft in 2005.

Samples of ice spray shooting out of Saturn moon Enceladus have been collected by Nasa’s Cassini spacecraft. Scientists believe it is the strongest evidence yet that Enceladus has a large saltwater ocean lurking beneath its surface

The plumes originate from the so-called ‘tiger stripe’ surface fractures at the moon’s south pole and have apparently created the material for the faint E Ring that traces the orbit of Enceladus around Saturn.

During three of Cassini’s passes through the plume in 2008 and 2009, the Cosmic Dust Analyser (CDA) on board measured the composition of freshly ejected plume grains.

The icy particles hit the detector’s target at speeds of up to 11 miles per second, instantly vaporising them. The CDA separated the constituents of the resulting vapor clouds, allowing scientists to analyse them.

The ice grains found further out from Enceladus are relatively small and mostly salt-poor, closely matching the composition of the E Ring. Closer to the moon, however, the Cassini observations indicate that relatively large, salt-rich grains dominate.

Lead researcher Frank Postberg, of the University of Heidelberg in Germany, said: ‘There currently is no plausible way to produce a steady outflow of salt-rich grains from solid ice across all the tiger stripes other than the salt water under Enceladus’ icy surface.’

Plumes, both large and small, spray water ice from multiple locations along the ‘tiger stripes’ near the south pole of Enceladus.

Co-author Sascha Kempf, of the University of Colorado Boulder, added: ‘The study indicates that “salt-poor” particles are being ejected from the underground ocean through cracks in the moon at a much higher speed than the larger, salt-rich particles.

‘The E Ring is made up predominately of such salt-poor grains, although we discovered that 99 per cent of the mass of the particles ejected by the plumes was made up of salt-rich grains, which was an unexpected finding.

‘Since the salt-rich particles were ejected at a lower speed than the salt-poor particles, they fell back onto the moon’s icy surface rather than making it to the E Ring.’

According to the researchers, the salt-rich particles have an ‘ocean-like’ composition that indicates most, if not all, of the expelled ice comes from the evaporation of liquid salt water rather than from the icy surface of the moon.

When salt water freezes slowly the salt is ‘squeezed out’, leaving pure water ice behind. If the plumes were coming from the surface ice, there should be very little salt in them, which was not the case, according to the research team.

Dwarfed: Enceladus can be seen near Saturn’s south pole at the bottom of this image

The scientists believe that perhaps 50 miles beneath the surface crust of Enceladus, a layer of water exists between the rocky core and the icy mantle. It is kept in a liquid state by gravitationally driven tidal forces created by Saturn and several neighboring moons, as well as by heat generated by radioactive decay.

It is thought that roughly 440 lbs of water vapor are lost every second from the plumes, along with smaller amounts of ice grains.

Calculations show the liquid ocean must have a sizable evaporating surface or it would easily freeze over, halting the formation of the plumes.

‘This study implies that nearly all of the matter in the Enceladus plumes originates from a saltwater ocean that has a very large evaporating surface,’ said Dr Kempf.

The team’s study is published in the journal Nature.


Via DailyMail


Fusion Experiment Faces New Hurdles

Some of the world’s most ambitious nuclear experiments are escalating at Lawrence Livermore National Laboratory.

Federal researchers there are seeking to fuse some of the lightest atoms in the universe to study — and hopefully harness — the type of energy produced by hydrogen bombs and the sun.

The tests were delayed six months while safety devices were installed to protect workers from radiation at the National Ignition Facility, a stadium-sized laboratory that contains 192 lasers trained on a target the size of a BB. The goal is to generate temperatures of more than 100 million degrees to fuse hydrogen atoms and release nuclear energy.

Scientists describe this process, which they hope to achieve next year, as the creation of a miniature star on earth.

But the $3.5 billion ignition facility, derided by some critics as taxpayer-financed science fiction, is running into new challenges that may further delay and perhaps scuttle its goal.

Among those challenges is the unanticipated presence of particles that clog filters designed to prevent the escape of radioactive material. Officials have proposed bypassing the filters for some experiments and venting radioactive particles directly into the air.

Officials say the radiation risks to people living in the surrounding area and to Lawrence Livermore researchers not involved with the experiments will be negligible. But according to a worst-case scenario outlined in a draft environmental report, an average of one worker involved in the experiments could die every 18 years from cancer caused by radiation exposure.

Tri-Valley CAREs, a watchdog group that monitors Lawrence Livermore, argues that the National Nuclear Security Administration, which financed construction of the facility, should not allow an increase in the amount of radiation produced by the fusion project.

“There is no safe level of exposure,” said Marylia Kelley, the group’s executive director.

The ignition facility was designed to help the United States government monitor the safety of nuclear weapons without having to test them. One of its primary missions is to help improve the United States’ weapons arsenal, but officials also describe the facility as an effort to revolutionize nuclear power.

If researchers can fuse atoms and control the energy that is released, an era of abundant carbon-free power could dawn. The technology would minimize the waste and storage issues faced by fission-based nuclear power plants, which split heavy atoms into smaller ones.

The tipping point for nuclear fusion is “ignition,” the moment when the fusion reactions release as much energy as the lasers deliver to the fuel target.
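
Expressed as a formula, using the usual figure of merit for such experiments, ignition is the break-even point of the target gain, the ratio of fusion energy released to laser energy delivered:

    gain G = E_fusion / E_laser,   with ignition at G >= 1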

But that goal has remained elusive.

“If it was easy, we would have done it 50 years ago,” said Doug Eddy, a senior nuclear security agency operations manager working on the project.

Mr. Eddy said the ignition facility was engaged in a “tuning campaign,” raising the amount of fuel used and the amount of energy generated by the lasers.

“You keep bringing it up a bit more and more and more,” he said. “You don’t want to go big-time straight off the bat.”

Researchers have discovered that more power will be needed for some tests than first thought, Mr. Eddy said. They propose nearly tripling the energy released by the experiments to 120 megajoules, roughly the amount of energy produced by 50 pounds of TNT, which will increase radiation levels.
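
The TNT comparison is easy to check, taking the standard convention of about 4.2 megajoules per kilogram of TNT:

    # Sanity check on the "120 megajoules is roughly 50 pounds of TNT" figure.
    TNT_MJ_PER_KG = 4.184        # standard TNT-equivalence convention
    pounds = 50
    kilograms = pounds * 0.4536
    print("%.0f MJ" % (kilograms * TNT_MJ_PER_KG))  # ~95 MJ, the right ballpark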

The types of hydrogen that will be fused are called deuterium and tritium. Tritium is radioactive, and fine molecular filters are installed at the facility to prevent it from escaping.

But the tritium is proving difficult to manage. Its molecules are so small that filters fine enough to capture them also trap other tiny particles, and workers must frequently enter the experiment chamber to change the clogged filters.

To solve that problem, officials propose allowing more tritium to accumulate before the filters are removed and sent to Nevada as low-level radioactive waste.

More controversially, the officials have proposed bypassing the filters during some experiments and venting tritium through an exhaust system into the air.

Tritium dissolves in water, persists for decades in the environment and can cause cancer.

“It will bind to DNA, so it gets pretty much everywhere in the body once it’s been absorbed,” said Mark Little, a senior scientist at the National Cancer Institute who has published papers dealing with tritium’s hazards. “With large quantities, damage can be done. As long as the releases are kept within mandatory limits, I would imagine the risks are small.”

Officials at the Department of Energy say the tritium releases at Lawrence Livermore would remain below safety limits set by the Environmental Protection Agency. But Tri-Valley CAREs points to a long list of tritium accidents and airborne releases from Livermore facilities, which have caused radioactive material to accumulate in Livermore’s water, food, honey and wine.

“When tritium gets into the environment and it’s on top of radiation being released from other parts of the laboratory, it potentially increases the dose and potentially increases the risk,” Ms. Kelley said.

And tritium is not her organization’s only concern, she added.

When tritium and deuterium fuse to create helium, a neutron is squeezed out and radiation is released. The neutrons can seep out of the building and rise into the atmosphere, where they cause additional radiation called skyshine to rain back down.
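
The underlying reaction is standard deuterium-tritium fusion, which releases 17.6 MeV per fusion, most of it carried off by the neutron:

    deuterium + tritium  ->  helium-4 (3.5 MeV)  +  neutron (14.1 MeV)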

“If it’s high-enough energy, it can scatter and go up to the atmosphere, scatter in the atmosphere and bounce back down,” Mr. Eddy said. “Where it will scatter down is mostly around the site, but there’s no guarantee.”

Officials said that they would determine an area around the Livermore building where radiation might exceed federal safety standards, and that Livermore personnel not involved with the research would be evacuated from those areas. Employees would be warned not to enter the area until after the experiment.

Despite several delays, Mr. Eddy said he was confident that ignition would occur next year. But some scientists question whether ignition will ever be possible.

“It’s a tough job, and some of the peer review questioned whether it would work,” said Frank von Hippel, a Princeton University physics professor and former science adviser to President Bill Clinton. “I think there are still skeptics out there.”


Via NYTimes