Monday, August 28, 2006

Researchers find way to control gold nanocatalysis



Researchers at the Georgia Institute of Technology have made a discovery that could allow scientists to exercise more control over the catalytic activity of gold nanoclusters. The finding – that the dimensionality and structure, and thus the catalytic activity, of gold nanoclusters change as the thickness of their supporting metal-oxide films is varied – is an important one in the rapidly developing field of nanotechnology. This and further advances in nanocatalysis may help lower the cost of manufacturing materials ranging from plastics to fertilizers. The research appeared in the July 21, 2006 issue of the journal Physical Review Letters.

"We've been searching for methods for controlling and tuning the nanocatalytic activity of gold nanoclusters," said Uzi Landman, director of the Center for Computational Materials Science and Regents' professor and Callaway chair of physics at Georgia Tech. "I believe the effect we discovered, whereby the structure and dimensionality of supported gold nanoclusters can be influenced and varied by the thickness of the underlying magnesium-oxide film may open new avenues for controlled nanocatalytic activity," he said.

Landman's research group has been exploring the catalytic properties of gold, which is inert in its bulk form, for about seven years. In 1999, along with the experimental group of Ueli Heiz and Wolf-Dieter Schneider at the University of Lausanne, Landman's group showed that gold exhibits remarkable catalytic capabilities to speed the rate of chemical reactions if it is clustered in groups of eight to about two dozen atoms in size.

Last year in the journal Science, the teams of Landman and Heiz (now at the Technical University of Munich) showed that this catalytic activity involves defects, in the form of missing oxygen atoms, in the catalytic bed on which the gold clusters rest. These defect sites, referred to as F-centers, serve as sites for the gold to anchor itself, giving the gold clusters a slight negative charge. The charged gold transfers an electron to the reacting molecules, weakening the chemical bonds that keep them together. Once the bond is sufficiently weakened, it may be broken, allowing reactions to occur between the adsorbed reactants.

Now Landman's group has found that by using a thin catalytic bed of magnesium oxide up to 1 nanometer (nm), or 4-5 layers, thick, one may activate the gold nanoclusters, which can then act as catalysts even if the bed is defect-free. A model reaction tested in these studies is one in which carbon monoxide and molecular oxygen combine to form carbon dioxide, even at low temperatures. In these reactions, the bond connecting the two atoms in the adsorbed oxygen molecule weakens, thus promoting the reaction with CO.

In this study, Landman and company simulated the behavior of gold nanoclusters containing eight, sixteen and twenty atoms when placed on catalytic beds of magnesium oxide with a molybdenum substrate supporting the magnesium oxide film. Quantum mechanical calculations showed that when the magnesium oxide film was greater than 5 layers, or 1 nm, in thickness, the gold cluster kept its three-dimensional structure. However, when the film was less than 1 nm thick, the cluster changed its structure and lay flat on the magnesia bed – wetting and adhering to it.

The gold flattens because the electronic charge from the molybdenum penetrates through the thin layer of magnesium oxide and accumulates at the region where the gold cluster is anchored to the magnesium oxide. With a negative charge underneath the gold nanocluster, its attraction to the molybdenum substrate, located under the magnesia film, causes the cluster to collapse.
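The reported trend can be summarized as a simple rule of thumb. The sketch below is purely illustrative, not the paper's method: the ~1 nm (4-5 layer) crossover comes from the article, but the per-layer thickness value is an assumed typical MgO interlayer spacing, and treating the crossover as a sharp cutoff is a simplification.

```python
# Illustrative rule of thumb for the film-thickness effect described above.
# The ~0.21 nm per-layer figure is an assumed MgO(100) interlayer spacing.
MGO_LAYER_NM = 0.21

def cluster_geometry(n_layers: int) -> str:
    """Predict supported gold-cluster geometry from MgO film thickness."""
    thickness_nm = n_layers * MGO_LAYER_NM
    # Below ~1 nm, charge from the molybdenum substrate reaches the gold
    # and the cluster wets the magnesia film; above it, the cluster stays 3D.
    return "2D (flat, wetting)" if thickness_nm < 1.0 else "3D"

for n in (3, 5, 8):
    print(n, "layers ->", cluster_geometry(n))
```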

"It's the charge that controls the adhesive strength of gold to the magnesia film, and at the same time it makes gold catalytically active," said Landman. "When you have a sufficiently thin layer of magnesium oxide, the charge from the underlying metal penetrates through – all the way to the interface of the gold cluster."

In the previous experimental studies, defects in the magnesium oxide were required to bring about charging of the adsorbed clusters.

"Until now, the metal substrate was regarded only as an experimental necessity for growing the magnesium oxide films on top of it. Now we found that it can be used as a design feature of the catalytic system. This field holds many surprises," said Landman.

Landman's group is currently undertaking further explorations into possibilities to regulate the charge, and hence the catalytic activity, in gold nanocatalytic systems.

Landman and Heiz's book titled "Nanocatalysis" is scheduled to be published this month.

The current research was performed at the Center for Computational Materials Science by postdoctoral fellows Davide Ricci and Angelo Bongiorno under the supervision of Landman. The research team also included Dr. Gianfranco Pacchioni, a colleague from the University of Milano.

Contact: David Terraso
d.terraso@gatech.edu
404-385-2966
Georgia Institute of Technology

Saturday, August 26, 2006

Microcapsules open in tumor cells

Medicines are most helpful when they directly affect the diseased organs or cells - for example, tumour cells. Scientists at the Max Planck Institute of Colloids and Interfaces in Potsdam, Germany, and Ludwig-Maximilian-University in Munich, have come one step closer to that goal: they have intentionally released a substance in a tumour cell. The scientists placed the substance in a tiny capsule which gets channelled into cancer cells, and is then "unpacked" with a laser pulse. The laser light cracks the capsule's polymer shell by heating it up, and the capsule's contents are released. (Angewandte Chemie, July 2006).

Treating malignant tumours is difficult. Doctors have to destroy the tumour, but healthy tissue needs to be preserved. Chemotherapy tends to kill diseased cells, at the same time causing great damage to the body in general. So scientists are looking for ways to destroy only the rampant tumour cells. One way to achieve this is to transport substances inside of microcapsules into the tumour cells and release them there. Researchers led by Andre Skirtach and Gleb Sukhorukov at the Max Planck Institute of Colloids and Interfaces in Potsdam, Germany, along with Wolfgang Parak at Ludwig-Maximilian-University in Munich, have now used a laser as a means of opening microcapsules inserted into a tumour cell. The capsules subsequently release their contents, a fluorescent test substance, into the cell. The scientists used a light microscope to monitor how the luminous materials distribute themselves within the cell.

The vehicle that the researchers used was a polymer capsule only a few micrometres in diameter. The walls of the capsules were built from a number of layers of charged polymers, alternating positive and negative. In the laboratory, at least, this is an established way of producing transport containers for medicines, cosmetics, or nutrients, which can also pass through cell membranes. Andre Skirtach and his colleagues equipped the capsules with a kind of "open sesame". But it didn't require any magic - just nanoparticles made of gold or silver atoms. The scientists mixed charged metal nanoparticles in with the polymers composing the walls of the capsule. The tumour cells absorbed the microcapsules, and then the scientists aimed an infrared laser at them. Metal nanoparticles are particularly good at absorbing laser light and passing the heat on to their surroundings, warming the capsule walls. The walls became so hot that the bonds between the polymers broke and the capsules eventually opened.

For the time being, the scientists have only been trying out their methods on isolated tumour cells. "In principle, however, active substances could be released into the body this way," says Helmuth Mohwald, director of the Max Planck Institute of Colloids and Interfaces, and one of the participating scientists. This has to do with the fact that infrared laser light can penetrate at least one centimetre deep into tissue. The cells of the body heat up negligibly, because laser light at this wavelength is only weakly absorbed in tissue. Only the metal particles in the walls of the microcapsules absorb the light - even when the microcapsules are inside a cell, because the laser affects only them.

Besides serving as a "thermal opener", heat has also given the scientists a way of making the capsules more stable. They simply heat the newly created microcapsules very slightly, so that the diameter of the hollow capsules shrinks. At the same time, the molecules in the shell pack closer together, thickening the capsule walls and better protecting their contents.

There is still, however, a major problem to solve before scientists can use this technology to create medicines that deliver microcapsules into tumour cells: there is still no way to "steer" the microcapsules. Helmuth Mohwald says, "We have to add some kind of feature to the capsules so that they only recognise the target cells." Only these cells would then allow microcapsules through their membrane.

Contact: Dr. Andre Skirtach
skirtach@mpikg.mpg.de
0049-331-567-9235
Max-Planck-Gesellschaft

Burning palm oil fuels climate change

Proposals by RWE npower to run the Littlebrook power station in Kent on palm oil have been criticised by Friends of the Earth. The group warned that use of palm oil as a biofuel threatens to exacerbate climate change, because it would lead to a major increase in demand for palm oil, driving even more rainforest destruction.

Palm oil is grown in lowland rainforest areas in South East Asia, and growing demand is already leading to the destruction of rainforest, including through burning. This increases emissions of carbon dioxide into the atmosphere [1]. Forest clearance also leads to the exposure and burning of lowland peat bogs, again emitting greenhouse gases. Tropical deforestation already contributes between 10 and 30 per cent of global warming emissions [2].

Palm oil, which is currently used as an ingredient in food and cosmetics, is one of the cheapest vegetable oils on the world market. As such, it is seen as an attractive option by energy companies.

Friends of the Earth's Palm Oil Campaigner Ed Matthew said:

"Current levels of demand for palm oil for the food industry are already threatening the forests of Indonesia with annihilation. It is a big enough challenge converting this into demand for sustainably grown palm oil. These forests and the people and wildlife they support simply cannot cope with a steep rise in global demand for palm oil for the energy industry. It will sound the death knell for the orang-utan and create further conflict between palm oil companies and local communities. But it will also hamper the fight against climate change, the very problem the biofuels industry is supposed to be helping overcome."

The destruction of lowland rainforest in Indonesia and Malaysia also threatens the survival of the orang-utan, one of man's closest relatives, and has resulted in accusations of human rights abuses from local communities, many of whom depend on the forests for their livelihoods.

More than 90 per cent of world exports of palm oil come from Malaysia and Indonesia where the rapid expansion of the palm oil industry is leading to forest clearance and is threatening wildlife and local communities. Last year, experts warned that the industry threatened the survival of the orang-utan, one of man's closest relatives.

Up to 10 million hectares of rainforest have already been destroyed as a result of the palm oil industry and pressure is growing to increase the area of forest available for plantations, with proposals to expand into the Tanjung Puting National Park - one of the world's most important sites for orang-utan conservation.

Friends of the Earth said that many palm oil companies in Indonesia have used fire to clear the land, exacerbating the problem. Burning forests adds to emissions of carbon dioxide. It is estimated that the great forest fires in Indonesia of 1997-1998 resulted in carbon emissions equivalent to 40 per cent of all emissions from burning fossil fuels in the world that year. Many oil palm plantations in Indonesia are also being developed in areas of peat swamp - which if degraded can also result in the release of huge amounts of global warming gases.

Palm oil should not be used as a fuel source until governments in Malaysia and Indonesia can demonstrate that rainforest is not being cleared to make way for the plantations and that the rights of indigenous peoples and local communities are fully recognised in law, Friends of the Earth said.

Why are so many people dying on Everest?

Why are so many people dying on Mount Everest, asks doctor and climber Andrew Sutherland in this week's BMJ.

It used to be thought that it would be physiologically impossible to climb Mount Everest, with or without oxygen. In 1953 Hillary and Tenzing proved that it was possible to reach the summit with oxygen, and in 1978 Messner and Habeler demonstrated it was possible without.

Everest itself has not changed, and we now have a better understanding of acclimatisation, improved climbing equipment, and established routes, so it would seem logical that climbing Everest should have become an altogether less deadly activity.

However, this year the unofficial body count on Mount Everest has reached 15, the most since the disaster of 1996 when 16 people died, eight in one night following an unexpected storm.

The death rate on Mount Everest has not changed over the years, with about one death for every 10 successful ascents. Anyone who reaches the summit has about a 1 in 20 chance of not making it down again.
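The two statistics use different denominators, which can be confusing: the first counts all deaths against successful ascents, the second counts only summiteers who die. A toy calculation shows how both can hold at once; the implied 50/50 split of deaths between summiteers and non-summiteers is an illustration that falls out of the arithmetic, not a figure from the article.

```python
# Illustrative reconciliation of the two quoted death-rate statistics.
summits = 1000
total_deaths = summits // 10       # ~1 death per 10 successful ascents
summiteer_deaths = summits // 20   # summiteers: ~1 in 20 don't make it down

print("deaths per successful ascent:", total_deaths / summits)   # 0.1
print("risk for a summiteer:", summiteer_deaths / summits)       # 0.05
# The remainder would be climbers who die without reaching the top.
print("implied summiteer share of deaths:",
      summiteer_deaths / total_deaths)                           # 0.5
```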

So why are there so many people dying on Mount Everest? And more importantly, can we reduce this number?

The main reasons for people dying while climbing Mount Everest are injuries and exhaustion. However, there is also a large proportion of climbers who die from altitude related illness, specifically from high altitude cerebral oedema (HACE) and high altitude pulmonary oedema (HAPE).

This year, the author was on the north side of Everest as the doctor on the Everestmax expedition (www.everestmax.com) and was shocked by both the amount of altitude related illness and the relative lack of knowledge among people attempting Everest.

He writes: "On our summit attempt we were able to help with HAPE at 7000 metres, but higher up the mountain we passed four bodies of climbers who had been less fortunate. The last body we encountered was of a Frenchman who had reached the summit four days earlier but was too exhausted to descend. His best friend had tried in vain to get him down the mountain, but they had descended only 50 metres in six hours and he had to abandon him."

"Some people believe that part of the reason for the increase in deaths is the number of inexperienced climbers, who pay large sums of money to ascend Everest," he says. "In my view, climbers are not climbing beyond their ability but instead beyond their altitude ability. Unfortunately it is difficult to get experience of what it is like climbing above Camp 3 (8300 metres) without climbing Everest. Climbers invariably do not know what their ability above 8300 metres is going to be like."

He suggests that climbers need to think less about 'the climb' and more about their health on the way up.

No matter what the affliction, whether it be HACE, HAPE, or just exhaustion, the result is invariably the same: the climber starts to climb more slowly, he explains. Climbing too slowly is a sign that something is wrong, and the chances of not making it off the mountain are greatly increased. But with the summit in sight this warning sign is too often ignored.

When the author visited the French consulate in Kathmandu to confirm the Frenchman's death, the consul, not a climbing or altitude expert, shook his head and said, "He didn't reach the summit until 12.30; that is a 14-hour climb – it is too long."

Contact: Emma Dickinson
edickinson@bmj.com
44-207-383-6529
BMJ-British Medical Journal

Adult stem cells are touchy-feely, need environmental cues

A certain type of adult stem cell can turn into bone, muscle, neurons or other types of tissue depending on the "feel" of its physical environment, according to researchers at the University of Pennsylvania.

The researchers discovered that mesenchymal stem cells, which regularly reside in the bone marrow as part of the body's natural regenerative mechanism, depend on physical cues from their local environment in order to transform into different types of tissue. The researchers were even able to manipulate the stem cells by changing the firmness of the gel on which they were grown.

The researchers believe that their findings, which appear in the Aug. 25 issue of the journal Cell, could change the way in which people work with stem cells.

"Basically, mesenchymal stem cells feel where they're at and become what they feel," said Dennis Discher, a professor in Penn's School of Engineering and Applied Science. "The results begin to establish a physical basis for both stem-cell use against diseases and for stem-cell behavior in embryonic development,"

Much of the work in stem-cell science has involved the study of the chemical microenvironment, the soup of chemical messenger signals that are generally thought to guide stem cells through the process of differentiation, in which relatively "blank" stem cells turn into specific cell types. For the first time, the Penn researchers have shown that the physical microenvironment is also crucial for guiding the cells through differentiation. According to Adam Engler, the first author on the study and a graduate student in the School of Engineering and Applied Science, soft microenvironments that mimic the brain guide the cells toward becoming neurons; stiffer microenvironments that mimic muscle guide the cells toward becoming muscle cells; and comparatively rigid microenvironments guide the cells toward becoming bone.
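The stiffness-to-fate trend described above can be sketched as a simple lookup. In the minimal sketch below, the elastic-modulus cutoffs (in kilopascals) are placeholder values chosen to separate brain-like, muscle-like, and bone-like stiffness regimes; they are not figures from this press release.

```python
# Illustrative lookup for the substrate-stiffness trend described above.
# The kPa cutoffs are assumed placeholders, not values from the article.
def predicted_lineage(stiffness_kpa: float) -> str:
    if stiffness_kpa < 1.0:      # soft, brain-like gels
        return "neurogenic"
    if stiffness_kpa < 20.0:     # intermediate, muscle-like gels
        return "myogenic"
    return "osteogenic"          # comparatively rigid, bone-like substrates

for e_kpa in (0.5, 10.0, 40.0):
    print(e_kpa, "kPa ->", predicted_lineage(e_kpa))
```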

"While I anticipated that the physical environment might limit the fate of stem cells, I never really thought that it would be sufficient to direct cell fate," said Lee Sweeney, a coauthor of the study and chairman of Physiology in Penn's School of Medicine. "When I saw Adam's first images, I was stunned to see that the physical environment alone was telling the stem cells to become neurons or muscle or bone. "

Mesenchymal stem cells sense their environment through the force it takes them to push against surrounding objects. Each cell has its own skeleton and molecular motors that it uses as muscles. According to the researchers, the amount of force the stem cell needs to move its cellular muscles triggers an internal chemical signal that, in turn, directs the cell to differentiate.

"The cytoskeleton uses motors that, like our muscles, are based on the mechanical tension created by molecules of actin and myosin," Engler said. "When we deprive these stem cells of myosin, the cells do not respond to their physical environment, only their chemical environment."

But the physical microenvironment can change due to injury and disease, which could make it difficult to use these stem cells in certain types of therapy. After a heart attack, for example, the heart becomes so scarred that stem cells seem ineffective at fixing the damage by turning into replacement cardiac muscle.

"The cardiac tissue may have been so damaged during the heart attack that the stem cells do not recognize the microenvironment as a guide for turning into heart muscle," Discher said; "however, our studies show that it might be possible to 'prime' stem cells for therapy in the lab, before implanting them in the heart, spine or whatever damaged environment you want to place them."

Contact: Greg Lester
glester@pobox.upenn.edu
215-573-6604
University of Pennsylvania

Nanowire arrays can detect signals along individual neurons

Merger of nanowires and neurons could boost efforts to measure and understand brain activity

CAMBRIDGE, Mass. -- Opening a whole new interface between nanotechnology and neuroscience, scientists at Harvard University have used slender silicon nanowires to detect, stimulate, and inhibit nerve signals along the axons and dendrites of live mammalian neurons.

Harvard chemist Charles M. Lieber and colleagues report on this marriage of nanowires and neurons this week in the journal Science.

"We describe the first artificial synapses between nanoelectronic devices and individual mammalian neurons, and also the first linking of a solid-state device -- a nanowire transistor -- to the neuronal projections that interconnect and carry information in the brain," says Lieber, the Mark Hyman, Jr., Professor of Chemistry in Harvard's Faculty of Arts and Sciences and Division of Engineering and Applied Sciences. "These extremely local devices can detect, stimulate, and inhibit propagation of neuronal signals with a spa-tial resolution unmatched by existing techniques."

Electrophysiological measurements of brain activity play an important role in understanding signal propagation through individual neurons and neuronal networks, but existing technologies are relatively crude: Micropipette electrodes poked into cells are invasive and harmful, and microfabricated electrode arrays are too bulky to detect activity at the level of individual axons and dendrites, the neuronal projections responsible for electrical signal propagation and inter-neuron communication.

By contrast, the tiny nanowire transistors developed by Lieber and colleagues gently touch a neuronal projection to form a hybrid synapse, making them noninvasive, and are thousands of times smaller than the electronics now used to measure brain activity.

Lieber's group has previously shown that nanowires can detect, with great precision, molecular markers indicating the presence of cancer in the body, as well as single viruses. Their latest work takes advantage of the size similarities between ultra-fine silicon nanowires and the axons and dendrites projecting from nerve cells: Nanowires, like neuronal offshoots, are just tens of nanometers in width, making the thin filaments a good match for intercepting nerve signals.

Because the nanowires are so slight -- their contact with a neuron is no more than 20 millionths of a meter in length -- Lieber and colleagues were able to measure and manipulate electrical conductance at as many as 50 locations along a single axon.

The current work involves measurement of signals only within single mammalian neurons; the researchers are now working toward monitoring signaling among larger networks of nerve cells. Lieber says the devices could also eventually be configured to measure or detect neurotransmitters, the chemicals that leap synapses to carry electrical impulses from one neuron to another.

"This work could have a revolutionary impact on science and technology," Lieber says. "It provides a powerful new approach for neuroscience to study and manipulate signal propagation in neuronal networks at a level unmatched by other techniques; it provides a new paradigm for building sophisticated interfaces between the brain and external neural prosthetics; it represents a new, powerful, and flexible approach for real-time cellular assays useful for drug discovery and other applications; and it opens the possibility for hybrid circuits that couple the strengths of digital nanoelectronic and biological computing components."

Contact: Steve Bradt
steve_bradt@harvard.edu
617-496-8070
Harvard University

Planet Earth may have 'tilted' to keep its balance

Imagine a shift in the Earth so profound that it could force our entire planet to spin on its side after a few million years, tilting it so far that Alaska would sit at the equator. Princeton scientists have now provided the first compelling evidence that this kind of major shift may have happened in our world's distant past.

By analyzing the magnetic composition of ancient sediments found in the remote Norwegian archipelago of Svalbard, Princeton University's Adam Maloof has lent credence to a 140-year-old theory regarding the way the Earth might restore its own balance if an unequal distribution of weight ever developed in its interior or on its surface.

The theory, known as true polar wander, postulates that if an object of sufficient weight -- such as a supersized volcano -- ever formed far from the equator, the force of the planet's rotation would gradually pull the heavy object away from the axis the Earth spins around. If the volcanoes, land and other masses that exist within the spinning Earth ever became sufficiently imbalanced, the planet would tilt and rotate itself until this extra weight was relocated to a point along the equator.

"The sediments we have recovered from Norway offer the first good evidence that a true polar wander event happened about 800 million years ago," said Maloof, an assistant professor of geosciences. "If we can find good corroborating evidence from other parts of the world as well, we will have a very good idea that our planet is capable of this sort of dramatic change."

Maloof's team, which includes researchers from Harvard University, the California Institute of Technology and the Massachusetts Institute of Technology as well as Princeton, will publish their findings in the Geological Society of America Bulletin on Friday, Aug. 25.

True polar wander is different from the more familiar idea of "continental drift," which is the inchwise movement of individual continents relative to one another across the Earth's surface. Polar wander can tip the entire planet on its side at a rate of perhaps several meters per year, about 10 to 100 times as fast as the continents drift due to plate tectonics. Though the poles themselves would still point in the same direction with respect to the solar system, the process could conceivably shift entire continents from the tropics to the Arctic, or vice versa, within a relatively brief geological time span.
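The quoted rate allows a quick sanity check of the "few million years" timescale mentioned earlier: at several meters per year, moving a point a quarter of the way around the globe (pole to equator) takes on the order of one to a few million years. A rough back-of-the-envelope version, using round numbers only:

```python
# Back-of-the-envelope check of the polar-wander timescale quoted above.
# Earth's circumference is ~40,000 km, so pole-to-equator is ~10,000 km.
EARTH_CIRCUMFERENCE_KM = 40_000
quarter_turn_m = EARTH_CIRCUMFERENCE_KM / 4 * 1000   # ~1e7 m

for rate_m_per_yr in (3, 10):   # "several meters per year"
    years = quarter_turn_m / rate_m_per_yr
    print(f"{rate_m_per_yr} m/yr -> ~{years / 1e6:.1f} million years")
```

At 3 m/yr the shift takes roughly 3.3 million years, consistent with the article's "after a few million years."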

While the idea that the continents are slowly moving in relation to one another is a well-known concept, the less familiar theory of true polar wander has been around since the mid-19th century, several decades before continental drift was ever proposed. But when the continents were proven to be moving under the influence of plate tectonics in the 1960s, it explained so many dynamic processes in the Earth's surface so well that true polar wander became an obscure subject.

"Planetary scientists still talk about polar wander for other worlds, such as Mars, where a massive buildup of volcanic rock called Tharsis sits at the Martian equator," Maloof said. "But because Earth's surface is constantly changing as the continents move and ocean crustal plates slide over and under one another, it's more difficult to find evidence of our planet twisting hundreds of millions of years ago, as Mars likely did while it was still geologically active."

However, the sediments that the team studied in Svalbard from 1999 to 2005 may have provided just such long-sought evidence. It is well known that as rock particles sink to the ocean floor to form layers of new sediment, tiny magnetic grains within the particles align themselves with the Earth's magnetic field lines. Once this rock hardens, it becomes a reliable record of the direction the Earth's magnetic field was pointing at the time of the rock's formation. So, if a rock has been spun around by a dramatic geological event, its magnetic field will have an apparently anomalous orientation that geophysicists like those on Maloof's team seek to explain.

"We found just such anomalies in the Svalbard sediments," Maloof said. "We made every effort to find another reason for the anomalies, such as a rapid rotation of the individual crustal plate the islands rest upon, but none of the alternatives makes as much sense as a true polar wander event when taken in the context of geochemical and sea level data from the same rocks."

The findings, he said, could possibly explain odd changes in ocean chemistry that occurred about 800 million years ago. Other similar changes in the ocean have cropped up in ancient times, Maloof said, but at these other times scientists know that an ice age was to blame.

"Scientists have found no evidence for an ice age occurring 800 million years ago, and the change in the ocean at this juncture remains one of the great mysteries in the ancient history of our planet," he said. "But if all the continents were suddenly flipped around and their rivers began carrying water and nutrients into the tropics instead of the Arctic, for example, it could produce the mysterious geochemical changes science has been trying to explain."

Because the team obtained all its data from the islands of Svalbard, Maloof said their next priority would be to seek corroborating evidence within sediments of similar age from elsewhere on the planet. This is difficult, Maloof said, because most 800-million-year-old rocks have long since disappeared. Because the Earth's crustal plates slide under one another over time, they take most of geological history back into the planet's deep interior. However, Maloof said, a site his team has located in Australia looks promising.

"We cannot be certain of these findings until we find similar patterns in rock chemistry and magnetics on other continents," Maloof said. "Rocks of the same age are preserved in the Australian interior, so we'll be visiting the site over the next two years to look for additional evidence. If we find some, we'll be far more confident about this theory's validity."

Maloof said that true polar wander was most likely to occur when the Earth's landmasses were fused together to form a single supercontinent, something that has happened at least twice in the distant past. But he said we should not worry about the planet going through a major shift again any time soon.

"If a true polar wander event has occurred in our planet's history, it's likely been when the continents formed a single mass on one side of the Earth," he said. "We don't expect there to be another event in the foreseeable future, though. The Earth's surface is pretty well balanced today."

Contact: Chad Boutin
cboutin@princeton.edu
609-258-5729
Princeton University

Researchers explore turning fuel ethanol into beverage alcohol

Fuel ethanol could be cheaply and quickly converted into the purer, cleaner alcohol that goes into alcoholic drinks, cough medicines, mouthwashes and other products requiring food-grade alcohol, say Iowa State University researchers.

But there's still a lot of purifying and studying to be done before fuel made from corn is turned into your next vodka or mixed into your morning mouthwash.

Jacek Koziel, an Iowa State assistant professor of agricultural and biosystems engineering, is leading a research project that's attempting to develop and refine two technologies that work together to efficiently purify and remove bad-tasting components from fuel ethanol. The project is partially supported by a $79,900 grant from the state's Grow Iowa Values Fund.

Koziel is collaborating on the project with Hans van Leeuwen, the vice president of MellO3z, a Cedar Rapids company that has developed technology for purifying alcoholic beverages. Van Leeuwen is also an Iowa State professor of civil, construction and environmental engineering.

Iowa certainly has an abundance of fuel ethanol for the researchers to work with. Iowa is the country's leading producer of fuel ethanol. The Iowa Corn Promotion Board says the state has 25 plants capable of producing 1.5 billion gallons per year with more plants on the way.

Van Leeuwen said the fuel produced by those plants and the alcohol produced for the beverage industry are very similar. But alcohol produced for fuel isn't made with the same care and purity as alcohol for consumption, he said. The multiple distillations required to make food-grade alcohol push production costs about 50 cents per gallon higher than those of fuel ethanol.

Van Leeuwen said the researchers are working to develop technologies that can purify fuel into beverage alcohol for less than an additional penny per gallon.

"That's the whole point," van Leeuwen said. "And based on my experience treating water and wastewater with these technologies, this could cost a lot less than a cent per gallon."

The potential to cut costs has one large producer of ethanol and food-grade alcohol interested in the research project, van Leeuwen said.

Koziel said the researchers are using two purification technologies: they're bubbling ozone gas through the fuel to remove impurities and they're filtering the fuel through granular activated carbon to adsorb impurities. A patent for the process is pending.

Underpinning the research is sophisticated chemical and sensory analysis of the raw fuel and the purified alcohol. Koziel will use a technology called solid phase microextraction to collect samples of the compounds in the alcohols. He'll also use a technology called gas chromatography-mass spectrometry to identify and quantify all the compounds in the samples. And he'll use his lab's olfactometry equipment to separate and analyze the smells created by the various compounds.

"If this is viable," Koziel said, "we are looking at adding a lot of value to relatively cheap fuel-grade ethanol."

Contact: Jacek Koziel
koziel@iastate.edu
515-294-4206
Iowa State University

Weather forecast accuracy gets boost with new computer model

An advanced forecasting model that predicts several types of extreme weather with substantially improved accuracy has been adopted for day-to-day operational use by civilian and military weather forecasters. The new computer model was created through a partnership that includes the National Oceanic and Atmospheric Administration (NOAA), the National Center for Atmospheric Research (NCAR), and more than 150 other organizations and universities in the United States and abroad.

The high-resolution Weather Research and Forecasting model (WRF) is the first model to serve as both the backbone of the nation's public weather forecasts and a tool for cutting-edge weather research. Because the model fulfills both functions, it is easier for research findings to be translated into improved operational models, leading to better forecasts.

The model was adopted for use by NOAA's National Weather Service (NWS) as the primary model for its one-to-three-day U.S. forecasts and as a key part of the NWS's ensemble modeling system for short-range forecasts. The U.S. Air Force Weather Agency (AFWA) also has used WRF for several areas of operations around the world.

"The Weather Research and Forecasting model development project is the first time researchers and operational scientists have come together to collaborate on a weather modeling project of this magnitude," says Louis Uccellini, director of NOAA's National Centers for Environmental Prediction.

By late 2007, the new model will shape forecasts that serve more than a third of the world's population. It is being adopted by the national weather agencies of Taiwan, South Korea, China, and India.

"WRF is becoming the world's most popular model for weather prediction because it serves forecasters as well as researchers," says NCAR director Tim Killeen.

Multiple benefits

Tests over the last year at NOAA and AFWA have shown that the new model offers multiple benefits over its predecessor models. For example:

* Errors in nighttime temperature and humidity across the eastern United States are cut by more than 50%.

* The model depicts flight-level winds in the subtropics that are stronger and more realistic, thus leading to improved turbulence guidance for aircraft.

* The model outperformed its predecessor in more than 70% of the situations studied by AFWA.

* WRF incorporates data from satellites, radars, and a wide range of other tools with greater ease than earlier models.

Advanced research

NCAR has been experimenting with an advanced research version of WRF, with very fine resolution and innovative techniques, to demonstrate where potential may exist for improving the accuracy of hurricane track, intensity, and rainfall forecasts. A special hurricane-oriented version of WRF, the HWRF, is now being developed by scientists from NOAA, the Naval Research Laboratory, the University of Rhode Island, and Florida State University to support NOAA hurricane forecasting. The high-resolution HWRF will track waves and other features of the ocean and atmosphere, including the heat and moisture exchanged between them. Its depiction of hurricane cores and the ocean below them will be enhanced by data from satellites, aircraft, and other observing tools.

WRF also is skilled at depicting intense squall lines, supercell thunderstorms, and other types of severe weather. Although no model can pinpoint hours ahead of time where a thunderstorm will form, WRF outpaces many models in its ability to predict what types of storms could form and how they might evolve.

Approximately 4,000 people in 77 countries are registered users of WRF. Many of these users suggest improvements, which are tested for operational usefulness at a testbed facility based at NCAR and supported by NOAA.

"WRF will continue to improve because of all the research and development pouring into it from our nation's leading academic and scientific institutions," said AFWA commander Patrick Condray.

Contact: David Hosansky
hosansky@ucar.edu
303-497-8611
National Center for Atmospheric Research/University Corporation for Atmospheric Research

Dennis Feltgen
dennis.feltgen@noaa.gov
301-763-0622, ext. 127
National Oceanic and Atmospheric Administration

Miles Brown
miles.brown@afwa.af.mil
402-294-2862
Air Force Weather Agency Public Affairs

Engineers forge greener path to iron production

MIT engineers have demonstrated an eco-friendly way to make iron. The new method eliminates the greenhouse gases usually associated with iron production.

The American Iron and Steel Institute (AISI) announced today that the team, led by Donald R. Sadoway of the Department of Materials Science and Engineering, has shown the technical viability of producing iron by molten oxide electrolysis (MOE).

"What sets molten oxide electrolysis apart from other metal-producing technologies is that it is totally carbon-free and hence generates no carbon dioxide gases -- only oxygen," said Lawrence W. Kavanagh, AISI vice president of manufacturing and technology.

The work was funded by the AISI/Department of Energy Technology Roadmap Program (TRP). The TRP goal is to increase the competitiveness of the U.S. steel industry while saving energy and enhancing the environment. According to the AISI, the MIT work "marks one of TRP's breakthrough projects toward meeting that goal."

Unlike other iron-making processes, MOE works by passing an electric current through a liquid solution of iron oxide. The iron oxide then breaks down into liquid iron and oxygen gas, allowing oxygen to be the main byproduct of the process.
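As a rough illustration of the electrochemistry involved (a back-of-envelope sketch, not a figure from the article), Faraday's law relates the charge passed through the cell to the mass of iron produced, assuming Fe(III) is reduced with three electrons per atom at full current efficiency:

```python
# Back-of-envelope Faraday's-law estimate for molten oxide electrolysis.
# All cell parameters below are hypothetical illustrations.
F = 96485.0      # Faraday constant, C/mol
M_FE = 55.845    # molar mass of iron, g/mol
N_ELECTRONS = 3  # Fe3+ + 3e- -> Fe

def iron_mass_grams(current_amps, seconds, efficiency=1.0):
    """Mass of iron deposited by a given charge, assuming the stated
    current efficiency (1.0 = every electron reduces iron)."""
    charge = current_amps * seconds * efficiency   # coulombs
    moles_fe = charge / (N_ELECTRONS * F)          # mol of Fe produced
    return moles_fe * M_FE

# Example: a hypothetical 10 kA cell running for one hour
print(round(iron_mass_grams(10_000, 3600) / 1000, 1), "kg of iron")  # -> 6.9 kg of iron
```

The same relation governs aluminum smelting, which is one reason the process rests on familiar metallurgical ground.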

Electrolysis itself is nothing new -- all of the world's aluminum is produced this way. And that is one advantage of the new process: It is based on a technology that metallurgists are already familiar with. Unlike aluminum smelting, however, MOE is carbon-free.

"What's different this time is that we have the resources to take the time to unravel the underlying basic science," said Sadoway, the John F. Elliott Professor of Materials Chemistry. "No one has ever studied the fundamental electrochemistry of a process operating at 1600°C. We're doing voltammetry at white heat!"

The result? "I now can confirm that in molten oxide electrolysis we'll see iron productivities at least five times that of aluminum, maybe as high as 10 times. This changes everything when it comes to assessing technical viability at the industrial scale."

MIT will continue further experiments to determine how to increase the rate of iron production and to discover new materials capable of extending the life of certain reactor components to industrially practical limits. This work will set the stage for construction of a pilot-scale cell to further validate the viability of the MOE process and identify scale-up parameters.

Mountain climate change trends could predict water resources

New research into climate change in the Western Himalaya and the surrounding Karakoram and Hindu Kush mountains could explain why many glaciers there are growing and not melting.

The findings suggest this area, known as the Upper Indus Basin, could be reacting differently to global warming, the phenomenon blamed for causing glaciers in the Eastern Himalaya, in Nepal and India, to melt and shrink.

Researchers from Newcastle University, UK, who publish their findings in the American Meteorological Society's Journal of Climate, looked at temperature trends in the Upper Indus Basin over the last century.

They found a recent increase in winter temperatures and a cooling of summer temperatures. These trends, combined with an increase in snow and rainfall - a finding from earlier in their research - could be causing glaciers to grow, at least in the higher mountain regions.

These findings are particularly significant because temperature, rain and snow trends in the Upper Indus Basin also affect water availability for more than 50 million Pakistani people.

Melt water from glaciers and the previous winter's snow supplies water for the summer 'runoff' which feeds irrigation both in the mountains and in the plains of the Lower Indus. The vast Indus Basin Irrigation System is the mainstay of the national economy of Pakistan, which has 170,000 square kilometres of irrigated land, an area two-thirds the size of the United Kingdom.

Being able to predict trends could contribute to more effective, forward-thinking management of the two major dams in the Upper Indus Basin - the Mangla Dam and the Tarbela Dam - and thus allow a better long-term control of water for irrigation and power supplies. These dams have the capacity to produce around 5,000 megawatts of electric power.

The amount of runoff depends on an elaborate interplay of weather conditions. One third of the runoff - the portion that comes from the higher mountain regions - is largely dependent on summer temperature, research shows. Specifically, a fall of one degree Celsius in mean summer temperature since 1961 is thought to have caused a 20 per cent drop in runoff feeding the higher mountain rivers.

Yet two-thirds of runoff - that from the lower mountain regions - is dependent on the amount of snow in the previous winter. Heavy winter snowfall is followed by a greater volume of summer runoff.
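The two relationships above can be combined into an illustrative toy model. The one-third/two-thirds split and the 20-per-cent-per-degree figure come from the article; everything else, including the baseline value and the linear form, is a hypothetical simplification:

```python
# Illustrative toy model of Upper Indus runoff (hypothetical beyond the
# article's stated figures): a high-mountain component driven by summer
# temperature and a low-mountain component driven by winter snowfall.

def predicted_runoff(baseline, summer_temp_anomaly_c, snowfall_ratio):
    """baseline: long-term mean runoff (arbitrary units)
    summer_temp_anomaly_c: departure from mean summer temperature, deg C
    snowfall_ratio: previous winter's snowfall vs. the mean (1.0 = average)
    """
    high = (baseline / 3) * (1 + 0.20 * summer_temp_anomaly_c)  # glacier melt: ~20% per deg C
    low = (2 * baseline / 3) * snowfall_ratio                   # snowmelt tracks winter snowfall
    return high + low

# A summer 1 deg C cooler than average, with average snowfall: the
# high-mountain third drops 20%, so total runoff falls about 6.7%.
print(round(predicted_runoff(300.0, -1.0, 1.0), 1))  # -> 280.0
```

A model of this shape is what would let reservoir managers translate the previous winter's snowfall and a summer temperature outlook into an advance runoff estimate.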

Dr Hayley Fowler, lead author on the research paper and a senior research associate with Newcastle University's School of Civil Engineering and Geosciences, said: "Very little research of this kind has been carried out in this region and yet the findings from our work have implications for the water supplies of around 50 million people in Pakistan who are dependent on the activity of the glaciers.

"Our research suggests we could be able to predict in advance the volume of summer runoff, which is very useful in planning ahead for water resources and also the output from the dams."

Co-researcher Mr David Archer, a visiting fellow with Newcastle University, added: "Our research is concerned with both climate change and the climate variability that is happening from year to year.

"Information on variability is more important for the management of the water system as it will help to forecast the inflow into reservoirs and allow for better planning of water use for irrigation.

"However, information on the impacts of climatic change is important for the longer term management of water resources and to help us understand what is happening in the mountains under global warming."

Contact: Dr. Hayley Fowler
h.j.fowler@ncl.ac.uk
0191-222-7113
University of Newcastle upon Tyne

Friday, August 25, 2006

New ultrasonic technology could help prevent train derailments

Researchers have developed a new technique they said is better able than currently used technology to find defects in steel railroad tracks.

UCSD researchers in March 2006 successfully detected internal defects and surface cuts with a prototype vehicle at a test track in Gettysburg, PA.

Researchers at the University of California, San Diego have developed a new technique they said is better able than currently used technology to find defects in steel railroad tracks, including hard-to-find internal cracks that can break under the weight of passing trains. Track defects account for about one-third of the 2,200 annual train derailments in the U.S., according to the Federal Railroad Administration (FRA), the federal agency charged with enforcing rail safety regulations.

A team led by UCSD structural engineering professor Francesco Lanza di Scalea describes in the Aug. 22 issue of the Journal of Sound and Vibration a defect-detection technique that uses laser beam pulses to gently "tap" on steel rails. Each laser tap sends ultrasonic waves traveling along the steel rails. Downward-facing microphones are positioned a few inches above the rail and 12 inches from the downward-pointed laser beam. As the prototype vehicle rolls down the test track delivering laser-beam taps at one-foot intervals, the microphones detect any telltale reductions in the strength of the ultrasonic signals, pinpointing surface cuts, internal cracks, and other defects.
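The screening step described above - watching for telltale drops in received signal strength - can be sketched as a simple thresholding routine. The data, threshold, and rolling-median baseline here are illustrative assumptions, not UCSD's actual software:

```python
# Illustrative sketch: flag rail positions where the received ultrasonic
# amplitude falls well below a running baseline, suggesting a defect
# between the laser tap and the microphone. Threshold is hypothetical.
import statistics

def flag_defects(amplitudes, drop_fraction=0.5):
    """amplitudes: received signal strength at successive one-foot taps.
    Returns indices where the amplitude drops below drop_fraction of the
    median of all readings so far (a crude rolling baseline)."""
    flagged = []
    for i, a in enumerate(amplitudes):
        baseline = statistics.median(amplitudes[: i + 1])
        if a < drop_fraction * baseline:
            flagged.append(i)
    return flagged

# A sudden 70% signal drop at tap 4 stands out against healthy rail:
print(flag_defects([1.0, 0.98, 1.02, 0.99, 0.3, 1.01]))  # -> [4]
```

In practice the research vehicle's software would also have to filter out noise and other sources of variability, as the article notes, before a drop could be trusted as a defect signature.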

In March 2006, Lanza di Scalea, project scientist Piervincenzo Rizzo and doctoral students Stefano Coccia and Ivan Bartoli tested a prototype vehicle equipped with the UCSD technology at a test track in Gettysburg, PA. The researchers detected 76.9 to 100 percent of internal defects and 61.5 to 90 percent of surface cuts in dry and wet conditions, respectively.

The UCSD team was supported by ENSCO, Inc., an engineering and technology company headquartered in Falls Church, VA, that develops inspection technologies for the Department of Transportation and other government agencies.

Lanza di Scalea and his team will test an improved design of their technology this fall in Gettysburg as part of an ongoing study funded by the FRA.

"Some of the worst derailments in this country have occurred on tracks recently inspected by the current generation of technology, which often doesn't detect interior cracks in rails that happen to lie under areas of superficial cracking," said Lanza di Scalea. "Our technique is much better able to find such defects, and it can work under varying weather conditions while the inspection vehicle is zipping along a track at speeds of up to 70 mph."

Rail carriers moved 42 percent of America's total coal, chemicals, minerals, food and other goods shipped in 2005, according to the U.S. Department of Transportation. Track-related damage from derailments and other incidents doubled from $55 million in 1993 to $111 million in 2000, and the trend toward longer trains pulling heavier cars at higher speeds creates the potential for even greater losses.

The current generation of track-inspection technologies relies on a variety of techniques, including water-filled wheels or sleds that move over track surfaces at roughly 30 mph while sending ultrasonic pulses downward into the track. The inaudible ultrasonic pulses reflect back as echoes when they encounter cracks. Unfortunately, superficial surface cracks routinely block the signals, preventing them from reaching the more dangerous internal cracks below.

Surface cracking does not interfere with the movement of ultrasonic pulses in the UCSD technology. "The ultrasonic sound we use doesn't come from the top of the rail, but instead travels along the rail," said Lanza di Scalea. "Our pulsed-laser technique, combined with ultrasonic microphones positioned a few inches above the rails and sophisticated software that filters out noise and other sources of variability, is potentially very effective at finding internal rail defects."

Contact: Rex Graham
ragraham@ucsd.edu
858-822-3075
University of California - San Diego

Ever-happy mice may hold key to new treatment of depression

A new breed of permanently 'cheerful' mouse is providing hope of a new treatment for clinical depression. TREK-1 is a gene that can affect transmission of serotonin in the brain. Serotonin is known to play an important role in mood, sleep and sexuality. By breeding mice with an absence of TREK-1, researchers were able to create a depression-resistant strain. The details of this research, which involved an international collaboration with scientists from the University of Nice, France, are published in Nature Neuroscience this week.

"Depression is a devastating illness, which affects around 10% of people at some point in their life," says Dr. Guy Debonnel an MUHC psychiatrist, professor in the Department of Psychiatry at McGill University, and principal author of the new research. "Current medications for clinical depression are ineffective for a third of patients, which is why the development of alternate treatments is so important."

Mice without the TREK-1 gene ('knock-out' mice) were created and bred in collaboration with Dr. Michel Lazdunski, co-author of the research, in his laboratory at the University of Nice, France. "These 'knock-out' mice were then tested using separate behavioral, electrophysiological and biochemical measures known to gauge 'depression' in animals," says Dr. Debonnel. "The results really surprised us; our 'knock-out' mice acted as if they had been treated with antidepressants for at least three weeks."

This research represents the first time depression has been eliminated through genetic alteration of an organism. "The discovery of a link between TREK-1 and depression could ultimately lead to the development of a new generation of antidepressant drugs," noted Dr. Debonnel.

According to Health Canada and Statistics Canada, approximately 8% of Canadians will suffer from depression at some point in their lifetime. Around 5% of Canadians seek medical advice for depression each year; a figure that has almost doubled in the past decade. Figures in the U.S. are comparable, with approximately 18.8 million American adults (about 9.5% of the population) suffering from depression during their lifetime.

Contact: Ian Popple
ian.popple@muhc.mcgill.ca
514-843-1560
McGill University

Blood clots can be treated by injections at home

Treatment of blood clots in the deep veins of the legs or the lungs with an older, less expensive form of the anticoagulant medication heparin can be just as safe and effective as similar treatment with a newer and more expensive heparin, according to a study led by Clive Kearon, professor of medicine at McMaster University, published in the August 23 issue of JAMA (The Journal of the American Medical Association).

When injected subcutaneously (beneath the skin), unfractionated (regular) heparin was shown in a randomized trial to work just as well as subcutaneous injection of the more expensive, low-molecular weight heparin in the treatment of venous thromboembolism. Traditionally, when unfractionated heparin is used in treatment, it is administered intravenously and accompanied by coagulation monitoring, which requires hospitalization. This standard approach includes ongoing dose adjustment in response to measurement of the APTT, a test that measures how fast the blood clots in a test tube under certain conditions.

The newer low-molecular weight heparins, which are administered by injection in fixed-weight doses, have gradually been replacing unfractionated heparin.

Kearon and colleagues conducted a randomized trial to study how fixed-dose, subcutaneous injection of unfractionated heparin compared to injection with the newer heparin in the treatment of blood clots in the legs or lungs.

The study was conducted from September 1998 through February 2004 at six university-affiliated clinical centres in Canada and New Zealand. Patients in the trial received either unfractionated or low-molecular-weight heparin administered subcutaneously every 12 hours. About 70 per cent of both groups were treated as outpatients. All patients received three months of warfarin (an anticoagulant drug) therapy.

Recurrent thromboembolism occurred in 3.8 per cent of the 345 patients in the unfractionated heparin group, and in 3.4 per cent of the 352 patients in the low-molecular-weight heparin group. The rate of major bleeding was comparable in the two groups.

The authors estimate that drug costs for a six-day course of treatment with low-molecular-weight heparin would be $712 (US), while unfractionated heparin would cost just $37 - assuming both drugs are administered in the regimens used in the study. The study indicates the potential for huge cost savings.
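A quick arithmetic check on the figures above (no new study data, just the article's own numbers):

```python
# Working through the article's reported figures for the two trial arms.
unfractionated = {"patients": 345, "recurrence_pct": 3.8, "six_day_cost_usd": 37}
low_mol_weight = {"patients": 352, "recurrence_pct": 3.4, "six_day_cost_usd": 712}

# Implied number of patients with recurrent thromboembolism in each arm
for name, arm in [("unfractionated", unfractionated), ("LMW", low_mol_weight)]:
    implied = arm["patients"] * arm["recurrence_pct"] / 100
    print(f"{name}: ~{implied:.0f} of {arm['patients']} patients")

# Drug-cost saving per six-day course when using the older heparin
saving = low_mol_weight["six_day_cost_usd"] - unfractionated["six_day_cost_usd"]
print(f"saving per course: ${saving}")  # -> saving per course: $675
```

The roughly 13 versus 12 recurrences implied by the percentages underline how close the two arms were clinically, while the $675-per-course gap drives the cost argument.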

"Fixed-dose subcutaneous unfractionated heparin is as effective and safe as low-molecular-weight heparin for initial treatment of patients with venous thromboembolism and is suitable for treatment at home," concluded Dr. Kearon, who is a physician at Hamilton Health Sciences. "In addition, the results of this study question the value of APTT monitoring in patients who are treated with currently recommended doses of unfractionated heparin."

"We've come a long way from having to spend several weeks in hospital with an intravenous heparin drip to a possible out-patient treatment that is safe, efficient, and less expensive," says Dr. Andreas Wielgosz, spokesperson for the Heart and Stroke Foundation.

Contact: Veronica McGuire
vmcguir@mcmaster.ca
905-525-9140, ext. 22169
McMaster University

Hydrogen peroxide sensor could aid security

A new family of molecules used to detect hydrogen peroxide and other reactive chemicals in living cells could be a useful addition to anti-terrorist arsenals, says the University of California, Berkeley, chemist who developed these substances last year.

When mixed with a material that contains even trace amounts of hydrogen peroxide, the powdery sensors turn bright yellow or red and light up. "You don't need anything to read the selective reaction other than your eye," said the chemist, Christopher Chang, an assistant professor at UC Berkeley's Department of Chemistry. "You could also use a hand-held black-light lamp to detect the fluorescence."

British authorities say that some two dozen suspects they arrested earlier this month in an alleged plot to blow up as many as 10 jetliners may have been planning to blend liquids on board the aircraft to create explosives. Peroxides and acetone top the list of liquids under suspicion.

Chang's sensors, which are non-toxic white powders, could be easily designed and modified to change to any color or to detect any number of peroxides or other chemicals that may be of interest to authorities, he said. "They could check for many things at once," he added. "You'd just have to look for the color."

To make the sensors even easier to use, they could be manufactured in a variety of forms, including paper strips similar to the ones used to measure pH levels, the chemist said. Authorities testing for illicit substances could then simply dip the strip into the suspect liquid or gel and look for a color change.

Chang has developed these sensors for use by cell biologists and other researchers working on diseases associated with aging, such as cancer and neurodegenerative diseases. What these maladies have in common is an unregulated production in cells of compounds that can trigger oxidative damage to tissues and organs. Hydrogen peroxide is a major byproduct of such processes, and its presence is a good indicator that cells have undergone oxidative stress.

The sensors are based on a fluorescent dye called fluorescein used in many biological research applications. They are composed of small molecules that incorporate what Chang calls "chemical cages," molecular fragments he designed to react selectively with hydrogen peroxide. In what is known as a cleavage reaction, hydrogen peroxide strips boronates off of this class of molecules, turning them into their fluorescent counterparts.

Chang's paper describing these substances was published in the Journal of the American Chemical Society in November 2005. Co-authors of the paper are Evan Miller and Aaron Albers at UC Berkeley's Department of Chemistry and Arnd Pralle and Ehud Isacoff at UC Berkeley's Department of Molecular and Cell Biology.

Funding for Chang's work on the sensors comes from UC Berkeley, the Dreyfus and Beckman foundations, the American Federation for Aging Research and the National Science Foundation.

By Liese Greensfelder

New lab technique churns out fungus' potential cancer fighter

For the first time, researchers have developed a way to synthesize a cancer-killing compound called rasfonin in enough quantity to learn how it works.

Derived from a fungus discovered clinging to the walls of a New Zealand cave, the chemical tricks certain cancer cells into suicide while leaving healthy cells untouched.

"In 2000, scientists in Japan discovered that this compound might have some tremendous potential as a prototype anticancer agent, but no one has been able to study or develop it because it's so hard to get enough of it from natural sources," says Robert K. Boeckman, professor of chemistry.

"You either grow the fungus that makes it, or you go through a complicated chemical synthesis process that still yields only a minute amount," he says. "Now, after five years of effort, we've worked out a process that lets researchers finally produce enough rasfonin to really start investigating how it functions, and how we might harness it to fight cancer."

In 2000, researchers from Chiba University in Japan and the University of Tokyo simultaneously discovered a compound in certain fungi that selectively destroyed cells that depend on a gene called ras--one of the first known cancer-causing genes. They had found rasfonin, a compound that seemed tailor-made to knock out ras-dependent cancers like pancreatic cancer.

After six years, however, rasfonin's secrets remain a mystery because researchers can't make enough of it to carry out tests.

To develop a new drug, organic chemists must produce the chemical in enough quantity to test it under many different circumstances and tease out its modus operandi. Until now, no method existed to generate rasfonin, aside from growing more fungus--a time-consuming and terribly inefficient approach. Boeckman, the Marshall D. Gates, Jr. Professor of Chemistry at the University of Rochester, has now revealed a process that produces 67 times more rasfonin than any previous method. For the first time, scientists can obtain enough rasfonin to conduct proper biological tests on it.

"At a guess, I'd say that rasfonin itself will not be the final compound that might come to market," says Boeckman. "But we need to figure out how it works, how it triggers the cancer cell to shut itself down. The key is to find exactly what buttons rasfonin is pushing, and then figure out if there's a way we can safely and more simply push those same buttons. But we couldn't do that until we have enough to test."

Even Boeckman's simplified process is notably complex, employing sophisticated organic reactions. Instead of the original method's 23 steps, Boeckman's has just 16--but finding them took five years of his team's hard work, skill and intuition.

Boeckman's paper, published in the Aug. 30 issue of the Journal of the American Chemical Society, outlines the sequence of steps showing how Boeckman's group inserted, removed, or altered the three-dimensional and chemical structure of their compound until they produced complete rasfonin. Diagrams of the complete process are available on the Web at pubs.acs.org.

"Very soon, researchers should be able to scale up this process rather easily to whatever volume they need," says Boeckman. "It may be a long road to a possible treatment, but at least we're now past the first hurdle."

Contact: Jonathan Sherwood
jonathan.sherwood@rochester.edu
585-273-4726
University of Rochester

Water filtration technique removes dangerous freshwater algae toxins

A water filtration technique that normally cleans up agricultural chemicals is also effective at removing a toxin secreted by algae found in lakes and rivers, an Ohio State University study has found.

Engineers here determined that the technique greatly outperformed other methods by removing at least 95 percent of a toxin secreted by Microcystis, a blue-green algae.

Some water filtration plants around the country already use the technique, which couples activated carbon with membrane filters, said Hal Walker, associate professor of civil and environmental engineering and geodetic science at Ohio State.

Microcystis is native to freshwater lakes and rivers around the country, and secretes toxins that can cause liver damage in animals, including humans. Worsening environmental pollution in Lake Erie during the last decade has caused algal blooms, the most recent of which began this August.

Some 13 million people rely on Lake Erie for their water supply, so Microcystis is a growing concern there, Walker said. But dangerous algal blooms have occurred across the country this summer, from Massachusetts to California.

And while many water filtration plants are beginning to use high-tech ultrafiltration membranes with very fine holes to filter water, microcystin toxins are small enough to slip through. For example, the toxin used in this study was microcystin-LR, a tiny molecule made up of only seven amino acids.

The study will appear in the journal Environmental Science & Technology, and has been published in advance on the journal's Web site.

Rather than invent a new technology for filtering microcystin-LR, Walker and his colleagues decided to test whether combining activated carbon with membrane filters would do the trick. That technology has already proven effective for removing herbicides and pesticides from drinking water.

"This toxin is an organic molecule, and we knew that activated carbon is good at removing organics," Walker said, "so we coupled the carbon with membranes. Together, they provide a way for water treatment plants to remove the toxin by basically upgrading the membrane system they already have."

Water treatment plants that already had membranes in place could add carbon to their systems without purchasing new equipment, he added.

Activated carbon is a highly porous form of charcoal that sticks to organic molecules. It's often used to filter water and clean up environmental spills, and it's even administered to poison victims to clean toxins from the digestive tract.

The engineers combined the activated carbon with three different commercially available membrane filters to remove microcystin-LR from samples of Lake Erie drinking water. Each combination produced good results: one removed 95 percent of the toxin, one removed 97 percent, and the other removed 99 percent. Without the carbon, even the most effective ultrafiltration membrane removed only 78 percent of the toxin.
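Those removal percentages translate into effluent concentrations as follows. The removal rates are the study's; the influent concentration and system labels are assumed values for illustration:

```python
# Percent-removal arithmetic for the filtration results. Removal rates
# are from the study; influent concentration and labels are hypothetical.
removal_rates = {"membrane A + carbon": 95, "membrane B + carbon": 97,
                 "membrane C + carbon": 99, "best membrane alone": 78}

influent_ug_per_l = 10.0  # assumed raw-water microcystin-LR concentration
for system, pct in removal_rates.items():
    effluent = influent_ug_per_l * (1 - pct / 100)
    print(f"{system}: {effluent:.2f} ug/L remaining")
```

The gap matters at the tail: going from 78 to 99 percent removal cuts the toxin reaching consumers by more than a factor of twenty.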

This is the first time this technique has been used to remove an algal toxin, and Walker cautioned that more research needs to be done before commercial water treatment plants could adopt it wholesale.

"Microcystis secretes a whole range of toxins, and we only looked at the one we thought would be the most important for health reasons," he said. "Then there's a whole host of other toxic algae that secrete their own toxins. And we don't know if there are synergistic effects between the toxins. Still, I suspect this technology would be pretty effective for all these toxins."

He would like to start a pilot project with a water treatment plant that uses membrane filters, ideally to test the system during an algal bloom.

Contact: Hal Walker
Walker.455@osu.edu
614-292-8263
Ohio State University

Microcapsules open in tumor cells

Medicines are most helpful when they directly affect the diseased organs or cells - for example, tumour cells. Scientists at the Max Planck Institute of Colloids and Interfaces in Potsdam, Germany, and Ludwig-Maximilian-University in Munich have come one step closer to that goal: they have intentionally released a substance inside a tumour cell. The scientists placed the substance in a tiny capsule which gets channelled into cancer cells, and is then "unpacked" with a laser pulse. The laser light heats the capsule's polymer shell until it cracks, and the contents are released. (Angewandte Chemie, July 2006).

Treating malignant tumours is difficult. Doctors have to destroy the tumour, but healthy tissue needs to be preserved. Chemotherapy tends to kill diseased cells, at the same time causing great damage to the body in general. So scientists are looking for ways to destroy only the rampant tumour cells. One way to achieve this is to transport substances inside of microcapsules into the tumour cells and release them there. Researchers led by Andre Skirtach and Gleb Sukhorukov at the Max Planck Institute of Colloids and Interfaces in Potsdam, Germany, along with Wolfgang Parak at Ludwig-Maximilian-University in Munich, have now used a laser as a means of opening microcapsules inserted into a tumour cell. The capsules subsequently release their contents, a fluorescent test substance, into the cell. The scientists used a light microscope to monitor how the luminous materials distribute themselves within the cell.

The vehicle that the researchers used was a polymer capsule only a few micrometres in diameter. The walls of the capsules were built from several layers of charged polymers, alternating positive and negative. In the laboratory, at least, this is an established way of producing transport containers for medicines, cosmetics, or nutrients that can also pass through cell membranes. Andre Skirtach and his colleagues equipped the capsules with a kind of "open sesame". It required no magic - just nanoparticles made of gold or silver atoms. The scientists mixed charged metal nanoparticles into the polymers composing the capsule walls. The tumour cells absorbed the microcapsules, and the scientists then aimed an infrared laser at them. Metal nanoparticles are particularly good at absorbing laser light and passing the heat on to their surroundings, heating the capsule walls. The walls became so hot that the bonds between the polymers broke and the capsules eventually opened.

For the time being, the scientists have only tried out their method on isolated tumour cells. "In principle, however, active substances could be released into the body this way," says Helmuth Mohwald, director of the Max Planck Institute of Colloids and Interfaces and one of the participating scientists. This is because infrared laser light can penetrate at least one centimetre into tissue. The body's cells heat up only negligibly, because tissue absorbs laser light at this wavelength very weakly; only the metal particles in the walls of the microcapsules absorb it appreciably, even when the microcapsules are inside a cell.

Besides the "thermal opener", the scientists have found a heat-based way of making the capsules more stable: they heat the newly created microcapsules very slightly, so that the diameter of the hollow capsules shrinks. The molecules in the shell then sit closer together, thickening the capsule walls and better protecting their contents.

A major problem remains to be solved, however, before scientists can use this technology to deliver medicines into tumour cells: there is still no way to "steer" the microcapsules. "We have to add some kind of feature to the capsules so that they only recognise the target cells," says Helmuth Mohwald. Only those cells would then let the microcapsules through their membrane.

Contact: Dr. Andre Skirtach
skirtach@mpikg.mpg.de
0049-331-567-9235
Max-Planck-Gesellschaft

Tiny ion pump sets new standard in cooling hot computer chips

University of Washington researchers have succeeded in building a cooling device tiny enough to fit on a computer chip that could work reliably and efficiently with the smallest microelectronic components.

The device, which uses an electrical charge to create a cooling air jet right at the surface of the chip, could be critical to advancing computer technology because future chips will be smaller, more tightly packed and are likely to run hotter than today's chips. As a result, tomorrow's computers will need cooling systems far more efficient than the fans and heat sinks that are used today.

"With this pump, we are able to integrate the entire cooling system right onto a chip," said Alexander Mamishev, associate professor of electrical engineering and principal investigator on the project. "That allows for cooling in applications and spaces where it just wasn't realistic to do before." The micro-pump also represents the first time that anyone has built a working device at this scale that uses this method, Mamishev added.

"The idea has been around for several years," he said. "But until now it hasn't been physically demonstrated in terms of a working prototype."

Mamishev and doctoral students Nels Jewell-Larsen and Chi-Peng Hsu presented a paper on the device at the American Institute of Aeronautics and Astronautics/American Society of Mechanical Engineers Joint Thermophysics and Heat Transfer Conference earlier this summer and are scheduled to give an additional presentation this fall. In addition, the UW researchers and collaborators with Kronos Advanced Technologies and Intel Corp. have been awarded a $100,000 grant from the Seattle-based Washington Technology Center for the second phase of the project.

The device uses an electric field to accelerate air to speeds previously achievable only with traditional blowers. Trial runs showed that the prototype device significantly cooled an actively heated surface on just 0.6 watts of power.

The prototype cooling chip contains two basic components: an emitter and a collector. The emitter has a tip radius of about 1 micron – so small that up to 300 tips could fit across a human hair. The tip creates air ions, electrically charged particles that are propelled in an electric field to the collector surface. As the ions travel from tip to collector, they create an air jet that blows across the chip, taking heat with it. The volume of the airflow can be controlled by varying the voltage between the emitter and collector.

The findings are significant for future computing applications, which will incorporate denser circuitry to boost computing power. More circuitry equals more heat and a greater need for innovative cooling technologies that go beyond bulky, noisy and relatively inefficient fans and heat sinks – metal plates with fins to increase surface area and help dissipate heat. Circulating liquids among the chips to draw away heat is one possibility, but computer chips and liquids don't mix well; the cost of a cooling system breakdown could be steep.

"Our goal is to develop advanced cooling systems that can be built right onto next-generation microchips," Jewell-Larsen said. "Such systems could handle both the increased heat generation of future chips and the fact that they would be distributed throughout a computer or electronic device." Added Mamishev: "It promises a new dimension in thermal management strategy and design."

A few challenges remain, he added. One involves developing the mathematical models to control vast systems of chips with built-in coolers. "These pumps end up being very complicated, dynamic systems," Mamishev said. "You have flow on a microscale, electrohydrodynamic forces, electrical fields and moving charges."

A second challenge is identifying the best materials to use in building devices that are high-performing and durable. "There is evidence that nanotubes and other nano-structures could give significant performance gains," Jewell-Larsen said. "Those are avenues we are currently pursuing."

Contact: Rob Harrill
rharrill@u.washington.edu
206-543-2580
University of Washington

Unusual rods

Get thicker when stretched, thinner when compressed: simulations identify auxetic molecules

Day-to-day experience teaches us that stretching an object makes it thinner; pushing it together makes it thicker. However, there are also materials that behave contrary to our expectations: they get thicker when stretched and thinner when compressed. Known as "auxetic" substances, these materials include some foams and special crystals. Researchers at Bar-Ilan University and the Israel Institute of Technology have now used quantum mechanical calculations to identify the first class of chemical compounds that behaves auxetically on a molecular level.

When an ordinary material is hit by a ball, for example, the material "flows" outward from the impact zone, weakening the point of impact. In auxetic materials, by contrast, the matter "flows" inward, strengthening this zone. Such materials would be advantageous for bulletproof vests. Auxetic materials also offer interesting possibilities for medical technology: the introduction of implants such as stents, which hold blood vessels open, would be easier if, under pressure, the device got thinner rather than thicker in the perpendicular direction.
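In the language of elasticity, auxetic behavior corresponds to a negative Poisson's ratio. The minimal sketch below, using the standard linear-elastic relation between axial and transverse strain, shows how the sign of the ratio flips the behavior described above; the specific ratio values are illustrative, not measured properties of any material in the article.

```python
def transverse_strain(axial_strain, poisson_ratio):
    """Linear elasticity: transverse strain = -nu * axial strain."""
    return -poisson_ratio * axial_strain

# Ordinary material (nu > 0): stretching (positive axial strain)
# produces negative transverse strain, i.e. the material gets thinner.
ordinary = transverse_strain(0.01, 0.3)

# Auxetic material (nu < 0): the same stretch produces positive
# transverse strain, i.e. the material gets thicker.
auxetic = transverse_strain(0.01, -0.3)

print(f"ordinary: {ordinary:+.4f}  auxetic: {auxetic:+.4f}")
```

Compression simply reverses the signs: the ordinary material bulges while the auxetic one contracts sideways.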

In the auxetic materials known to date, the unusual behavior is a macroscopic property that stems from a special arrangement of the particles within the material, such as a particular weblike structure. Nanoscale auxetic materials are so far unknown.

By using quantum mechanical calculations, a team led by Shmaryahu Hoz has now predicted that there also exist certain molecules that behave auxetically: a class of compounds known as polyprismanes. These are rod-shaped molecules built up of several three-, four-, five-, or six-membered rings of carbon atoms stacked on top of each other. The prismanes made of three- and four-membered carbon rings show roughly equal auxetic effects, regardless of the number of stacked rings. The ones made of five- and six-membered carbon rings demonstrate significantly higher auxetic effects. Of all of the variations for which calculations were carried out, the prismane made of four six-membered rings showed the strongest effect. The researchers have not yet been able to unambiguously explain why prismane molecules behave auxetically.

"Although prismanes were discovered over 30 years ago, very few representatives of this class of compounds have been synthesized so far," says Hoz. "We hope that our insights will act as an incentive to produce and characterize more prismanes."

Contact: Shmaryahu Hoz
shoz@mail.biu.ac.il
972-353-18318
John Wiley & Sons, Inc.

Astronomers revise planet definition

Pluto is now a 'dwarf'

The "United Nations" of astronomers has announced a new definition of what a planet is, slightly revising the description preferred by an international panel including an MIT professor that was tasked with the challenge.

Members of the International Astronomical Union (IAU) voted on August 24 to define a planet as an object that is in orbit around the sun, is large enough for its own gravity to pull it into a nearly spherical shape, and has cleared the neighborhood around its orbit -- in other words, it has no other large bodies crossing its path.
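The three conditions form a simple decision cascade, which can be sketched as a predicate. The function and category names below are an informal illustration of the voted definition, not official IAU terminology (the full resolution also handles satellites and other cases).

```python
def classify(orbits_sun, nearly_round, cleared_neighborhood):
    """Rough sketch of the 2006 IAU planet-classification logic."""
    if not orbits_sun:
        return "not covered by the definition (e.g. a moon)"
    if not nearly_round:
        return "small solar-system body"
    if not cleared_neighborhood:
        return "dwarf planet"
    return "planet"

print(classify(True, True, True))    # Earth: meets all three criteria
print(classify(True, True, False))   # Pluto: fails the third criterion
```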

The third condition was added to the draft definition of a planet submitted to the IAU about a week ago by MIT's Richard Binzel and colleagues. Binzel, a professor of planetary science in the Department of Earth, Atmospheric and Planetary Sciences, took responsibility for presenting the final version of the resolution at the time of the final vote.

This means that Pluto is no longer a full-fledged planet: it is now classified as a dwarf planet, because it has not cleared the neighborhood around its orbit. It is joined in that category by Ceres and 2003 UB313 (a temporary name for an object discovered only three years ago). More dwarf planets are expected to be announced by the IAU in the coming months and years.

As a result of the new definition, our solar system now contains eight planets: Mercury, Venus, Earth, Mars, Jupiter, Saturn, Uranus, Neptune, plus the three dwarf planets led by Pluto.

Elizabeth A. Thomson, News Office