Larock, a University Professor of chemistry at Iowa State University, found the thin, square piece he was looking for and smacked it against his hand. This one is made from soybean oil reinforced with glass fibers, he said. And it's the kind of tough bioplastic he and his industrial collaborators will use to develop, test and manufacture new hog feeders.
Richard Larock displays some of the plastics he has made from corn, soybean and other bio-based oils. Larock said his research project is about as Iowa as you can get. The state, after all, is the country's leading producer of corn, soybeans and pork.
The project is partially supported by a grant of $96,000 from the Grow Iowa Values Fund, a state economic development program. Larock is working with AgVantage Inc., a Rockford, Ill., company with manufacturing facilities in Iowa, and R3 Composites, a Muscatine manufacturer.
Larock has invented and patented a process for producing various bioplastics from inexpensive natural oils, which make up 40 percent to 80 percent of the plastics. Larock said the plastics have excellent thermal and mechanical properties and are very good at dampening noises and vibrations. They're also very good at returning to their original shapes when they're heated.
And so Larock is optimistic about the future of bioplastics in commercial applications: "This project should create new technology and jobs, expand opportunities for bio-based industries and agricultural suppliers, decrease our dependence on oil, strengthen the agricultural economy of Iowa, utilize ISU patented technology, provide new markets for farmers and marry new agricultural product development with sophisticated manufacturing skills and the knowledge to commercialize these projects," he wrote in a summary of the hog feeder project.
Ron Hagemann, a principal with AgVantage, said designs for a bioplastic hog feeder have been drawn up. The designs include radio frequency identification technology that can monitor and record the feeding habits of individual hogs. Molds for the high-tech feeders should be completed later this year and prototypes should be ready for testing in a hog building next spring. If all goes well, he said a product should be ready for commercialization by the end of next year.
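The release gives no detail about the monitoring software behind that RFID feature, so the sketch below is purely hypothetical: the names FeedEvent, FeederLog and daily_intake are invented for illustration, and a real system would also have to handle tag read errors and feeder calibration.

```python
# Hypothetical sketch of per-animal feed logging keyed by RFID ear-tag reads.
from collections import defaultdict
from dataclasses import dataclass
from datetime import datetime, date

@dataclass
class FeedEvent:
    tag_id: str       # RFID ear-tag read at the feeder
    start: datetime   # when the hog began feeding
    grams: float      # feed consumed during the visit

class FeederLog:
    """Accumulates per-animal feeding records from RFID reads."""

    def __init__(self) -> None:
        self.events: dict[str, list[FeedEvent]] = defaultdict(list)

    def record(self, event: FeedEvent) -> None:
        self.events[event.tag_id].append(event)

    def daily_intake(self, tag_id: str, day: date) -> float:
        """Total feed (grams) consumed by one hog on a given day."""
        return sum(e.grams for e in self.events[tag_id] if e.start.date() == day)

# Toy usage
log = FeederLog()
log.record(FeedEvent("HOG-0042", datetime(2006, 9, 23, 7, 15), 310.0))
log.record(FeedEvent("HOG-0042", datetime(2006, 9, 23, 17, 40), 275.0))
print(log.daily_intake("HOG-0042", date(2006, 9, 23)))   # 585.0
```

Keying events by ear-tag ID is the natural structure here, since the stated goal is a per-animal record of feeding habits.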
Hagemann said the feeders' biggest advantage in the marketplace will be material costs. Corn and soybean oils are significantly cheaper than petrochemicals. And that's particularly true when oil prices are high.
Hagemann said he expects this project to be a very good test of Larock's plastics.
Hogs, after all, aren't known for being gentle with their feeders.
"I've told Richard that if we can do this, it's all downhill from here," Hagemann said.
But Larock isn't stopping with the feeder project. He's looking at adding other low-cost agricultural ingredients to his bioplastics. He's now studying whether distillers dried grains, a co-product of ethanol production that's sold as animal feed, can add strength to his bioplastics.
Contact: Richard Larock
larock@iastate.edu
515-294-4660
Iowa State University
Saturday, September 23, 2006
Scientists develop technology for roll-up laptop screens
New ‘morphing’ structures have multiple applications
Scientists at the University of Cambridge have developed a range of unique, shape-changing structures, which can be used as roll-up display screens (such as laptop screens), re-usable packaging, roll-up keyboards and self-erecting, temporary habitats.
These structures, also known as ‘morphing’ structures, afford multiple configurations without the need of complex parts or sophisticated manufacturing. Dr Keith Seffen, from the Department of Engineering, has developed the structures and is currently exploring various applications for their ingenious behaviour with co-worker Dr Simon Guest and graduate student Alex Norman.
Dr Seffen said, “They offer substantial shape-changing capabilities whilst preserving structural integrity. They are simply made and their operation does not rely upon advanced materials. They afford compact, inexpensive solutions for multifunctional devices, which are required to be lightweight, stiff, but foldable on demand.”
By using an ordinary sheet of metal, Dr Seffen can produce structures with no moving parts but which can be configured between at least two distinct, self-locking and stable forms. For example, an A5-sized flat screen can be snapped into the shape of a tube for compact carriage in a briefcase or pocket.
The operation does not require hinges, latches or locks, and without these extra parts, production times and costs are reduced compared to traditional folding structures.
Genevieve Maul, University of Cambridge Office of Communications, Tel: + 44 (0) 1223 332300, e-mail: Genevieve.Maul@admin.cam.ac.uk
Robert Fender, Cambridge Enterprise Tel: + 44 (0) 1223 760339 e-mail: robert.fender@enterprise.cam.ac.uk
Hope for significant new diabetes treatment in Stanford discovery
STANFORD, Calif. - Certain immune-suppressing drugs, such as those taken by patients who have had organ transplants, greatly increase the risk of developing diabetes. These drugs are known to put a stranglehold on a protein called calcineurin.
So it's not exactly a surprise that Seung Kim, MD, PhD, assistant professor of developmental biology at the Stanford University School of Medicine, chose to study why calcineurin inhibition leads to the disease. What is surprising is just how central calcineurin turns out to be in the health and happiness of the insulin-producing pancreatic beta cells. His findings, to be published in the Sept. 21 issue of Nature, could shake up diabetes research, lead to new classes of diabetes drugs and aid in efforts to develop stem cell treatments for diabetes.
"This work has the potential to be big," said Scott Campbell, PhD, vice president of research for the American Diabetes Association. He said that drugs based on this research could potentially expand the numbers of the few beta cells that remain in diabetics and make those cells perform better. "That would have a major impact on the lives of people with diabetes."
In diabetes, the beta cells produce too little insulin or none at all, which prevents cells of the body from being able to take in sugar after a meal. Sugar accumulates in the blood, damaging the blood vessels, kidneys and eyes. Diabetics are also prone to nerve damage. In the United States, 20.8 million people, or 7 percent of the population, have diabetes.
Knowing the potential link between calcineurin-inhibiting drugs and diabetes, Kim and MD/PhD graduate student Jeremy Heit collaborated with Gerald Crabtree, MD, professor of pathology, in a series of experiments to clarify the connection. They worked with mice that had been bred to produce calcineurin in the pancreas only until they were born. After birth, the pancreas in each mouse stopped producing the protein. By 12 weeks of age, the mice, which had been born with a normal number of beta cells, were severely diabetic.
Squelching calcineurin prevented the beta cells from increasing their numbers as the mice grew - more body mass requires more beta cells to keep blood sugar in check. It also reduced the amount of insulin made by the existing beta cells. What's more, calcineurin was found to regulate 10 genes that already had been associated with diabetes.
"This work has led us and others to think in entirely new ways about diabetes," Heit said. Until now people had identified individual genes or processes that were involved in diabetes. The new findings show that these lines of research are connected through a common regulator in calcineurin.
Heit and Kim used further genetic trickery to bypass calcineurin by artificially activating its protein sidekick, called NFAT. Beta cells lacking calcineurin but with active NFAT behaved normally, multiplying as the mice aged and producing normal amounts of insulin.
The implications of these findings are many:
Drugs that enhance the activity of calcineurin or NFAT could become a new treatment for type 2, or adult-onset, diabetes, in which the beta cells don't produce enough insulin.
Drugs that inhibit calcineurin or NFAT could treat diseases in which the beta cells produce too much insulin, such as hypoglycemia or some pancreatic tumors.
Treating isolated beta cells with drugs that enhance calcineurin could make those cells divide, producing more cells for transplantation.
Activating calcineurin could help Kim in his efforts to direct embryonic stem cells to become insulin-producing cells.
Kim, whose work in diabetes includes the development of islet cells, identifying new drug targets and potential stem cell treatments, said the calcineurin findings have wide-ranging implications. "The finding that the calcineurin pathway regulates other pathways in the beta cell makes it highly relevant to many areas of diabetes research," he said.
Campbell said the next step is to verify that the findings in mice also hold true in humans. "This is a step in the right direction and a major leap forward, but now we need to take it into humans," he said.
Contact: Amy Adams
amyadams@stanford.edu
650-723-3900
Stanford University Medical Center
Thursday, September 21, 2006
Climate secrets -- past, present and future revealed with new tool
Made possible through National Science Foundation funding, the XRF Core Scanner will be able to chemically analyze earth and marine sediment cores quickly to find answers to historic climate changes.
VIRGINIA KEY, FLA. A few years ago, chemical analyses of deep-sea muds using a new X-ray technology helped explain why the Classic Mayan civilization collapsed more than a thousand years ago. At the University of Miami Rosenstiel School of Marine and Atmospheric Science, a new tool will apply a similar technology to find answers to historic climate changes from earth and marine sediment core samples. The XRF (X-ray Fluorescence) Core Scanner is only the second to make its way to the United States, and the first of this new and improved model made by Avaatech, a company based in the Netherlands.
"From a paleoclimate researcher's perspective, this is a dream come true," said Larry C. Peterson, associate dean of students and the marine geology professor whose lab houses the scanner. "There is a tremendous amount of information about earth history preserved in the chemical composition of sediments deposited on the ocean floor, in lakes, and on land. By measuring the concentration of specific elements in these sediments, the XRF Core Scanner can help us document the history of drastic climate variations and past geological events, giving us more of an idea of the current and future state of our environment."
Made possible through National Science Foundation funding, the XRF Core Scanner will be able to chemically analyze sediment cores quickly and without any physical damage. "Previously, analyses of this type could only be done by a time-consuming process of sampling the cores, then preparing and chemically analyzing the individual samples. The Core Scanner now allows us to determine the complete chemical composition of the same cores without disturbing them, and at a speed and measurement resolution previously unimaginable. What normally would take weeks or months of laboratory time can now be done within a few hours," said Peterson. Data collected from each scan are transferred directly to computers in his lab for analysis. Once cores are loaded in the Core Scanner, the instrument can be operated from remote locations over the Internet.
Peterson and his German collaborator, Gerald Haug, were featured in the July/August 2005 issue of American Scientist for their work studying core samples taken from the Cariaco Basin off the Venezuelan coast. Using a similar XRF machine, the scientists were able to find geological records of severe droughts between 800 and 1000 AD – coincident with the collapse of Classic Mayan civilization.
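The published Cariaco Basin analysis is not reproduced in the article; the following is only a generic illustration of how one might flag sustained low intervals in an elemental proxy series of the kind an XRF scanner produces. The function name, smoothing window and threshold are illustrative assumptions, not the researchers' method.

```python
import numpy as np

def flag_low_intervals(proxy: np.ndarray, window: int = 25, z_thresh: float = -1.5) -> np.ndarray:
    """Boolean mask of samples whose smoothed value falls well below the series mean."""
    kernel = np.ones(window) / window
    smoothed = np.convolve(proxy, kernel, mode="same")   # simple running mean
    z = (smoothed - smoothed.mean()) / smoothed.std()    # standardise
    return z < z_thresh

# Toy usage: a synthetic proxy series with a depressed stretch in the middle.
rng = np.random.default_rng(0)
series = rng.normal(100.0, 5.0, 1000)
series[400:520] -= 20.0      # simulate a sustained low (dry-like) interval
print(int(flag_low_intervals(series).sum()), "samples flagged")
```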
"We have a collection of several thousand sediment cores from all the world's oceans stored here at the university," Peterson said. "For each sample we take or receive, we usually study half and archive the remaining portion. Those archives will comprise the greater part of our research right now. We have a number of ongoing research projects, focusing mostly on climate change in the tropics, for which this new instrument will be invaluable."
Contact: Ivy Kupec
ikupec@miami.edu
305-421-4704
University of Miami Rosenstiel School of Marine & Atmospheric Science
How space travel affects the hearts of Space Shuttle astronauts
Study evaluates loss of heart mass in astronauts
Andover, Mass. – Royal Philips Electronics (NYSE: PHG, AEX: PHI) announced today that the National Aeronautics and Space Administration (NASA) is using the Philips iE33 echocardiography system and QLAB Quantification software to evaluate the effects of space flight on the hearts of Space Shuttle astronauts and, in the near future, astronauts on the International Space Station and ground-based analogs. Of interest to NASA researchers is the loss of heart mass brought on by space flight.
Astronauts are commonly thought to lose heart mass during prolonged flight. Two-dimensional echocardiography measurements show a 5 percent decrease, a loss that usually reverses within three days of being back on Earth. Researchers are interested in learning the cause of these changes. Possible explanations include heart atrophy caused by weightlessness, dehydration from space travel, or error introduced by the geometric assumptions used in two-dimensional echo.
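The release does not say which formula underlies NASA's comparison; as one common example of the kind of geometric assumption at stake, the widely used ASE "cube" formula estimates left-ventricular mass from three one-dimensional measurements by treating the ventricle as a simple geometric shell:

\[
\mathrm{LV\ mass} \;\approx\; 0.8 \times 1.04\left[(\mathrm{IVSd} + \mathrm{LVIDd} + \mathrm{PWTd})^{3} - \mathrm{LVIDd}^{3}\right] + 0.6\ \mathrm{g},
\]

where IVSd, LVIDd and PWTd are the end-diastolic septal thickness, internal diameter and posterior-wall thickness in centimetres, and 1.04 g/cm³ is the density of myocardium. Because the linear terms are cubed, small measurement or shape errors are amplified in the mass estimate, which is exactly the sort of uncertainty a full-volume 3D acquisition is meant to remove.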
The new technology being used captures a full-volume image of the beating heart in less than a minute and allows physicians to examine the heart as if they were holding it in their hands. It also allows the researchers to make accurate measurements of heart mass, ejection fraction, blood flow, strain rate and cardiac wall motion pre- and post-flight.
“We have a very short window of time in which to do an echo exam on the astronauts,” said David S. Martin of Wyle Laboratories, Inc., ultrasound lead for the NASA Cardiovascular Laboratory at the Johnson Space Center in Houston, Texas. “Live 3D Echo allows us to quickly grab all the image data we need to do a full examination of the heart anatomy and function and send the astronauts on their way. Following the image acquisition, we use off-line analysis software to do several measurements that help us evaluate changes after space travel.”
The use of this heart imaging and measurement technology will be part of ongoing research at the NASA Cardiovascular Laboratory. It will also complement the imaging done by a modified Philips HDI 5000 ultrasound system that was installed in the International Space Station’s Human Research Facility in 2001.
“These new ultrasound technologies help us efficiently conduct sophisticated cardiac research of astronauts and the effects of microgravity,” said Martin.
For further information please contact:
Steve Kelly
Philips Medical Systems
Tel +1 425 487 7479
email steve.kelly@philips.com
About Royal Philips Electronics
Royal Philips Electronics of the Netherlands (NYSE: PHG, AEX: PHI) is one of the world's biggest electronics companies and Europe's largest, with sales of EUR 30.4 billion in 2005. With activities in the three interlocking domains of healthcare, lifestyle and technology and 158,000 employees in more than 60 countries, it has market leadership positions in medical diagnostic imaging and patient monitoring, color television sets, electric shavers, lighting and silicon system solutions.
Wednesday, September 20, 2006
Arctic summer ice anomaly shocks scientists
The image is an Envisat ASAR mosaic of Arctic ice acquired on Aug. 24, 2005. (Courtesy: Polar View)
Satellite images acquired from 23 to 25 August 2006 have shown for the first time dramatic openings – over a geographic extent larger than the size of the British Isles – in the Arctic's perennial sea ice pack north of Svalbard, and extending into the Russian Arctic all the way to the North Pole.
Using data from Envisat's Advanced Synthetic Aperture Radar (ASAR) instrument and the AMSR-E instrument aboard the EOS Aqua satellite, scientists determined that around 5-10 percent of the Arctic's perennial sea ice, which had survived the summer melt season, was fragmented by late summer storms. The area between Spitzbergen, the North Pole and Severnaya Zemlya is confirmed by AMSR-E to have had much lower ice concentrations than witnessed during earlier years.
Mark Drinkwater of ESA's Oceans/Ice Unit said: "This situation is unlike anything observed in previous record low ice seasons. It is highly imaginable that a ship could have passed from Spitzbergen or Northern Siberia through what is normally pack ice to reach the North Pole without difficulty.
"If this anomaly trend continues, the North-East Passage or 'Northern Sea Route' between Europe and Asia will be open over longer intervals of time, and it is conceivable we might see attempts at sailing around the world directly across the summer Arctic Ocean within the next 10-20 years."
During the last 25 years, satellites have been observing the Arctic and have witnessed reductions in the minimum ice extent – the lowest amount of ice recorded in the area annually – at the end of summer from around 8 million km² in the early 1980s to the historic minimum of less than 5.5 million km² in 2005, changes widely viewed as a consequence of greenhouse warming.
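As a rough back-of-envelope reading of those two figures (not a calculation from the article itself), the drop from about 8 million km² to under 5.5 million km² corresponds to

\[
\frac{8.0 - 5.5}{8.0} \approx 0.31,
\]

i.e. roughly a 30 percent reduction in minimum ice extent, or an average loss on the order of 100,000 km² per year over the 25-year satellite record.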
Satellite observations in the past couple of years have also shown that the extent of perennial ice is rapidly declining, but this strange condition in late August marks the first time the perennial ice pack appears to exhibit thinner and more mobile conditions in the European sector of the Central Arctic than in earlier years.
Both sets of images were taken by two different satellite instruments – ASAR on the left and AMSR-E on the right. In the coloured AMSR-E images, ice cover, or the concentration of ice, is represented by colour: pink represents pack ice, blue represents open water, and the intermediate colours orange, yellow and green indicate lower ice concentrations of 70%, 50% and 30%, respectively. In the ASAR images, ice cover is represented by the uniform grey area which extends radially outwards from the North Pole, represented by the central black hole.
The set of images on the top were both acquired on 24 August 2005, while the bottom left ASAR image was acquired on 23 August 2006 and the AMSR-E on 24 August 2006. In 2005, the uniform grey area in the ASAR image and the pink colour in the AMSR-E image are both consistent all the way around the pole (black hole), indicating pack ice with 100% ice concentration.
However in 2006 there is a significant extent of leads – fractures and openings in the sea-ice cover – just below the pole in both the ASAR image, seen as splashes of dark grey and black, and the AMSR-E image (with British Isles shown for scale), seen by the high concentration of yellow, orange and green colours, signifying low ice concentrations.
In recent weeks, what was open water has begun to freeze as autumn air temperatures over the Arctic fall. Although a considerable fraction of darker leads can still be seen in the area using ASAR, the AMSR-E sensor no longer shows openings.
ASAR is an active microwave instrument which sends periodic radar pulses toward the Earth and measures the signal's return. AMSR-E is a passive microwave instrument which does not send radar pulses down but receives radiation naturally emitted from the Earth. Passive microwave data contain a certain amount of ambiguity in the interpretation of ice types, particularly in mid summer during melting. However, this ambiguity is removed in high-resolution active microwave data.
Though the reason for the considerable change in the ice pack configuration is still unknown, it is likely due to the stormy weather conditions that characterised August.
The effect stormy conditions have on ice is illustrated in this ASAR image, taken on 25 August 2006, as the ice in the red circle is divergent as a consequence of a low pressure system centred on the North Pole.
"As autumn freeze-up begins, the current pattern will undoubtedly precondition the ice situation in the Central Arctic for the subsequent ice season," Drinkwater said.
Contact: Mariangela D'Acunto
mariangela.dacunto@esa.int
39-069-418-0856
European Space Agency
You don't need a big lottery win for long term happiness
Researchers at the University of Warwick and Watson Wyatt have been examining just how much money one needs to win in the lottery to have a long-term impact on personal happiness. Unsurprisingly, the researchers found that small wins of tens or hundreds of pounds made little long-term difference, but they also found that one did not need to win the jackpot to gain a significant increase in long-term mental wellbeing.
In work to be published in the Journal of Health Economics, researchers Professor Andrew Oswald from the University of Warwick and Dr Jonathan Gardner from Watson Wyatt showed that medium-sized lottery wins, ranging from around just £1,000 to £120,000, had a long-term, sustained impact on the overall happiness of the winners. On average, two years after their win, medium-sized lottery winners had a mental wellbeing GHQ score 1.4 points better than previously - meaning, loosely, that two years after their win they were just over 10% happier than the average person with no win or only a tiny one.
Intriguingly the researchers also found that this increased happiness is not obvious immediately after the medium-sized win and takes some time to show through. Economist Professor Andrew Oswald from the University of Warwick said:
"This delay could be due the short term disruptive effect on one's live of actually winning, but a more plausible explanation of the delay is that initially many windfall lottery funds are saved and spent later."
The researchers studied 14 years of longitudinal data from the British Household Panel Survey (BHPS) which tracks 5,000 British households.
Contact: Peter Dunn
p.j.dunn@warwick.ac.uk
0247-652-3708
University of Warwick
First evidence that musical training affects brain development in young children
Researchers have found the first evidence that young children who take music lessons show different brain development and improved memory over the course of a year compared to children who do not receive musical training.
The findings, published today (20 September 2006) in the online edition of the journal Brain [1], show that not only do the brains of musically-trained children respond to music in a different way to those of the untrained children, but also that the training improves their memory as well. After one year the musically trained children performed better in a memory test that is correlated with general intelligence skills such as literacy, verbal memory, visiospatial processing, mathematics and IQ.
The Canadian-based researchers reached these conclusions after measuring changes in brain responses to sounds in children aged between four and six. Over the period of a year they took four measurements in two groups of children – those taking Suzuki music lessons and those taking no musical training outside school – and found developmental changes over periods as short as four months. While previous studies have shown that older children given music lessons had greater improvements in IQ scores than children given drama lessons, this is the first study to identify these effects in brain-based measurements in young children.
Dr Laurel Trainor, Professor of Psychology, Neuroscience and Behaviour at McMaster University and Director of the McMaster Institute for Music and the Mind, said: "This is the first study to show that brain responses in young, musically trained and untrained children change differently over the course of a year. These changes are likely to be related to the cognitive benefit that is seen with musical training." Prof Trainor led the study with Dr Takako Fujioka, a scientist at Baycrest's Rotman Research Institute.
The research team designed their study to investigate (1) how auditory responses in children matured over the period of a year, (2) whether responses to meaningful sounds, such as musical tones, matured differently than responses to noises, and (3) how musical training affected normal brain development in young children.
At the beginning of the study, six of the children (five boys, one girl) had just started to attend a Suzuki music school; the other six children (four boys, two girls) had no music lessons outside school.
The researchers chose children being trained by the Suzuki method for several reasons: it ensured the children were all trained in the same way, were not selected for training according to their initial musical talent and had similar support from their families. In addition, because there was no early training in reading music, the Suzuki method provided the researchers with a good model of how training in auditory, sensory and motor activities induces changes in the cortex of the brain.
Brain activity was measured by magnetoencephalography (MEG) while the children listened to two types of sounds: a violin tone and a white noise burst. MEG is a non-invasive brain scanning technology that measures the magnetic fields outside the head that are associated with the electrical fields generated when groups of neurons (nerve cells) fire in synchrony. When a sound is heard, the brain processes the information from the ears in a series of stages. MEG provides millisecond-by-millisecond information that tracks these stages of processing; the stages show up as positive or negative deflections (or peaks), called components, in the MEG waveform. Earlier peaks tend to reflect sensory processing and later peaks, perceptual or cognitive processing.
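The study's own MEG analysis is not reproduced here; the sketch below only illustrates, in generic terms, how a component's latency can be read off an evoked waveform as the time of the largest deflection inside a predefined window. The function name and the window values (loosely around 250 ms, by analogy with the N250m discussed later) are illustrative assumptions.

```python
import numpy as np

def component_latency(waveform: np.ndarray, times_ms: np.ndarray,
                      window=(200.0, 300.0)) -> tuple[float, float]:
    """Return (latency_ms, amplitude) of the largest deflection inside the window."""
    mask = (times_ms >= window[0]) & (times_ms <= window[1])
    idx_in_window = int(np.argmax(np.abs(waveform[mask])))
    idx = int(np.where(mask)[0][idx_in_window])
    return float(times_ms[idx]), float(waveform[idx])

# Toy usage: a simulated evoked response sampled at 1 kHz for 500 ms,
# with a negative-going peak placed near 248 ms.
times = np.arange(0.0, 500.0, 1.0)
toy = -np.exp(-((times - 248.0) ** 2) / (2 * 15.0 ** 2))
latency, amplitude = component_latency(toy, times)
print(f"peak at {latency:.0f} ms, amplitude {amplitude:.2f}")
```

A decrease in that latency from one session to the next is the kind of change the paragraph above describes as faster conduction between neurons.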
The researchers recorded the measurements four times during the year, and during the first and fourth session the children also completed a music test (in which they were asked to discriminate between same and different harmonies, rhythms and melodies) and a digit span memory test (in which they had to listen to a series of numbers, remember them and repeat them back to the experimenter).
Analysis of the MEG responses showed that across all children, larger responses were seen to the violin tones than to the white noise, indicating that more cortical resources were put to processing meaningful sounds. In addition, the time that it took for the brain to respond to the sounds (the latency of certain MEG components) decreased over the year. This means that as children matured, the electrical conduction between neurons in their brains worked faster.
Of most interest, the Suzuki children showed a greater change over the year in response to violin tones in an MEG component (N250m) related to attention and sound discrimination than did the children not taking music lessons.
Analysis of the music tasks showed greater improvement over the year in melody, harmony and rhythm processing in the children studying music compared to those not studying music. General memory capacity also improved more in the children studying music than in those not studying music.
Prof Trainor said: "That the children studying music for a year improved in musical listening skills more than children not studying music is perhaps not very surprising. On the other hand, it is very interesting that the children taking music lessons improved more over the year on general memory skills that are correlated with non-musical abilities such as literacy, verbal memory, visiospatial processing, mathematics and IQ than did the children not taking lessons. The finding of very rapid maturation of the N250m component to violin sounds in children taking music lessons fits with their large improvement on the memory test. It suggests that musical training is having an effect on how the brain gets wired for general cognitive functioning related to memory and attention."
Dr Fujioka added: "Previous work has shown assignment to musical training is associated with improvements in IQ in school-aged children. Our work explores how musical training affects the way in which the brain develops. It is clear that music is good for children's cognitive development and that music should be part of the pre-school and primary school curriculum."
The next phase of the study will look at the benefits of musical training in older adults.
Contact: Laurel Trainor
ljt@mcmaster.ca
905-525-9140 x23007
Oxford University Press
Imaging technology restores 700-year-old sacred Hindu text
RIT scientists travel twice to India to work on damaged manuscript
Scientists who worked on the Archimedes Palimpsest are using modern imaging technologies to digitally restore a 700-year-old palm-leaf manuscript containing the essence of Hindu philosophy.
The project led by P.R. Mukund and Roger Easton, professors at Rochester Institute of Technology, will digitally preserve the original Hindu writings known as the Sarvamoola granthas attributed to scholar Shri Madvacharya (1238-1317). The collection of 36 works contains commentaries written in Sanskrit on sacred Hindu scriptures and conveys the scholar's Dvaita philosophy of the meaning of life and the role of God.
The document is difficult to handle and to read, the result of centuries of inappropriate storage techniques, botched preservation efforts and degradation due to improper handling. Each of the manuscript's 340 palm leaves measures 26 inches long and two inches wide; the leaves, cracked and chipped at the edges, are bound together with braided cord threaded through two holes and sandwiched between heavy wooden covers. Time and a misguided application of oil have aged the palm leaves dark brown, obscuring the Sanskrit writings.
"It is literally crumbling to dust," says Mukund, the Gleason Professor of Electrical Engineering at RIT.
According to Mukund, 15 percent of the manuscript is missing.
"The book will never be opened again unless there is a compelling reason to do so," Mukund says. "Because every time they do, they lose some. After this, there won't be a need to open the book."
Mukund first became involved with the project when his spiritual teacher in India brought the problem to his attention and urged him to find a solution. This became a personal goal for Mukund, who studies and teaches Hindu philosophy or "our way of life" and understood the importance of preserving the document for future scholars. The accuracy of existing printed copies of the Sarvamoola granthas is unknown.
Mukund sought the expertise of RIT colleague Easton, who imaged the Dead Sea Scrolls and is currently working on the Archimedes Palimpsest. Easton, a professor at RIT's Chester F. Carlson Center for Imaging Science, brought in Keith Knox, an imaging senior scientist at Boeing LTS, as a consultant. Mukund added Ajay Pasupuleti, a doctoral candidate in microsystems at RIT, and the team was formed.
The scientists traveled to India in December 2005 to assess the document, which is stored at a monastery-like matha in Udupi. Sponsored by a grant from RIT, the team returned to the monastery in June and spent six days imaging the document using a scientific digital camera and an infrared filter to enhance the contrast between the ink and the palm leaf. Images of each palm leaf, back and front, were captured in eight to 10 sections, processed and digitally stitched together. The scientists ran the 7,900 total images through various image-processing algorithms using Adobe Photoshop and Knox's own custom software.
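Knox's custom software and the Photoshop steps are not described in the release, so the following is only a minimal sketch of the general idea (contrast-stretch each captured section, then join the sections of one leaf side by side). The function names stretch_contrast and stitch_sections are assumptions, and a real pipeline would also register and blend the overlapping regions of adjacent sections.

```python
import numpy as np

def stretch_contrast(section: np.ndarray, low_pct: float = 2, high_pct: float = 98) -> np.ndarray:
    """Rescale pixel values so the low/high percentiles map to 0..255."""
    lo, hi = np.percentile(section, [low_pct, high_pct])
    stretched = np.clip((section - lo) / max(hi - lo, 1e-6), 0.0, 1.0)
    return (stretched * 255).astype(np.uint8)

def stitch_sections(sections: list[np.ndarray]) -> np.ndarray:
    """Naively concatenate equally sized sections of one palm leaf left to right."""
    return np.hstack([stretch_contrast(s) for s in sections])

# Toy usage: eight fake 200x400 grayscale sections of one leaf.
leaf = stitch_sections([np.random.rand(200, 400) for _ in range(8)])
print(leaf.shape)   # (200, 3200)
```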
"This is a very significant application of the same types of tools that we have used on the Archimedes Palimpsest," Easton says. "Not incidentally, this also has been one of the most enjoyable projects in my career, since the results will be of great interest to a large number of people in India."
The processed images of the Sarvamoola granthas will be stored in a variety of media formats, including electronically, in published books and on silicon wafers for long-term preservation. Etching the sacred writings on silicon wafers was the idea of Mukund's student Pasupuleti. The process, called aluminum metallization, transfers an image to a wafer by creating a negative of the image and depositing metal on the silicon surface.
According to Pasupuleti, each wafer can hold the image of three leaves. More than 100 wafers will be needed to store the entire manuscript. As an archival material, silicon wafers are both fire- and waterproof, and readable with the use of a magnifying glass.
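Those figures square with the leaf count given earlier: at three leaves per wafer, the 340-leaf manuscript needs

\[
\frac{340\ \text{leaves}}{3\ \text{leaves per wafer}} \approx 114\ \text{wafers},
\]

consistent with the estimate of more than 100 wafers.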
Mukund and Pasupuleti will return to India at the end of November to give printed and electronic versions of the Sarvamoola granthas to the monastery in Udupi in a public ceremony in Bangalore, the largest city in the Karnataka region.
"We feel we were blessed to have this opportunity to do this," Mukund says. "It was a fantastic and profoundly spiritual experience. And we all came away cleansed."
Based on the success of this project, Mukund is seeking funding to image other Dvaita manuscripts in the Udupi region written since the time of Shri Madvacharya. He estimates the existence of approximately 800 palm leaf manuscripts, some of which are in private collections.
Contact: Susan Gawlowicz
smguns@rit.edu
585-475-5061
Rochester Institute of Technology
Ceramic microreactors developed for on-site hydrogen production
CHAMPAIGN, Ill. -- Scientists at the University of Illinois at Urbana-Champaign have designed and built ceramic microreactors for the on-site reforming of hydrocarbon fuels, such as propane, into hydrogen for use in fuel cells and other portable power sources.
Applications include power supplies for small appliances and laptop computers, and on-site rechargers for battery packs used by the military.
"The catalytic reforming of hydrocarbon fuels offers a nice solution to supplying hydrogen to fuel cells while avoiding safety and storage issues related to gaseous hydrogen," said Paul Kenis, a professor of chemical and biomolecular engineering at Illinois and corresponding author of a paper accepted for publication in the journal Lab on a Chip, and posted on its Web site.
In previous work, Kenis and colleagues developed an integrated catalyst structure and placed it inside a stainless steel housing, where it successfully stripped hydrogen from ammonia at temperatures up to 500 degrees Celsius.
In their latest work, the researchers incorporated the catalyst structure within a ceramic housing, which enabled the steam reforming of propane at operating temperatures up to 1,000 degrees Celsius. Using the new ceramic housing, the researchers also demonstrated the successful decomposition of ammonia at temperatures up to 1,000 degrees Celsius. High-temperature operation is essential for peak performance in microreactors, said Kenis, who also is a researcher at the university's Beckman Institute for Advanced Science and Technology. When reforming hydrocarbons such as propane, temperatures above 800 degrees Celsius prevent the formation of soot that can foul the catalyst surface and reduce performance.
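The paper's reaction scheme is not quoted in the release; for reference, the idealised overall reactions for the two chemistries mentioned are

\[
\mathrm{C_3H_8 + 3\,H_2O \;\rightarrow\; 3\,CO + 7\,H_2}
\qquad\text{and}\qquad
\mathrm{2\,NH_3 \;\rightarrow\; N_2 + 3\,H_2},
\]

with the water-gas shift reaction, CO + H₂O → CO₂ + H₂, typically run downstream to convert the carbon monoxide and release additional hydrogen. The soot mentioned above arises when the hydrocarbon instead cracks to solid carbon on the catalyst.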
"The performance of our integrated, high-temperature microreactors surpasses that of other fuel reformer systems," Kenis said. "Our microreactors are superior in both hydrogen production and in long-term stability." Kenis and his group are now attempting to reform other, higher hydrocarbon fuels, such as gasoline and diesel, which have well-developed distribution networks around the world.
Contact: James E. Kloeppel
kloeppel@uiuc.edu
217-244-1073
University of Illinois at Urbana-Champaign
Potential sea level rise worse than previously expected
Scientific studies over the last year show that ice is being discharged from Greenland and the West Antarctic Peninsula at a much greater rate than glaciologists previously thought possible, Professor Chris Rapley told the Climate Clinic in Brighton today (Tuesday 19 September). Professor Rapley also said the discharge from Greenland, and probably also Antarctica, is accelerating. The findings have profound consequences for the world's sea levels.
The Director of the British Antarctic Survey - an acknowledged world expert - said that scientific understanding of what was happening was moving quickly, with significant new evidence on the speed of ice loss coming to light in the last few months.
Satellite gravity data shows a loss of about 210 cubic kilometres per year from Greenland. In the West Antarctic a similar amount of ice is being lost annually, while on the Antarctic Peninsula 87 per cent of glaciers are retreating. The worrying behaviour of the ice sheets is almost certainly a consequence of global warming, Professor Rapley said.
"It's like opening a window and seeing what's going on, and the message is it's worse than we thought," Professor Rapley said.
He added that although the complexity of the situation made it difficult to predict the impact on sea level, historical evidence pointed to a potential rise of five metres in mean sea levels. The most intense sea level rise in recent history, known as a 'meltwater pulse', saw levels rise by 5m in a single century. Professor Rapley says a similar catastrophic rate is unlikely to occur in the near future.
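For readers who want to check the scale of those numbers, a back-of-envelope conversion using standard values (an ice density of about 0.92 g/cm3 and a global ocean area of roughly 3.6 x 10^8 km^2, neither of which comes from Professor Rapley's talk) runs as follows:

    210\ \mathrm{km^3\ ice/yr} \times 0.92 \approx 193\ \mathrm{km^3\ water/yr}
    193\ \mathrm{km^3} \div (3.6\times10^{8}\ \mathrm{km^2}) \approx 5.4\times10^{-7}\ \mathrm{km} \approx 0.5\ \mathrm{mm/yr}

Doubling that to allow for a comparable loss from West Antarctica gives on the order of 1 mm of sea level rise per year from ice discharge alone, while a meltwater pulse of five metres in a century corresponds to roughly 50 mm per year.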
But even if carbon dioxide levels are successfully stabilised, sea levels will continue to rise into the future as a result of greenhouse gases already emitted, leaving a legacy for future generations.
Politicians must respond to the urgency of the issue, he said, adding that current international political action on curbing greenhouse gas emissions is inadequate.
"Climate change is real, climate change is serious, and climate change is now," he said.
Professor Rapley was speaking at the Climate Clinic, a global warming conference within the Lib Dem conference in Brighton which has been organised by the UK's leading green organisations with the support of business and the Energy Saving Trust.
Climate Clinic Spokesperson Phillip Sellwood said:
"Politicians from all parties must listen to what Professor Rapley has to say. We are facing a situation unlike anything we have ever faced before and they must respond accordingly. All parties must support urgent action to avert crisis and prevent the kind of devastating sea level rises that Professor Rapley tells us are possible."
Notes
Organised by the UK's leading green organisations and supported by business and the Energy Saving Trust, the Climate Clinic is taking place at each of the main party political conferences this year. The Clinic is calling on politicians of all parties to support urgent government action to avert crisis by preventing global temperatures rising more than two degrees above pre-industrial levels. See www.climateclinic.org.uk
British Antarctic Survey is a world leader in research into global issues in an Antarctic context. It is the UK's national operator and is a component of the Natural Environment Research Council. It has an annual budget of around £40 million, runs nine research programmes and operates five research stations, two Royal Research Ships and five aircraft in and around Antarctica. More information about the work of the Survey can be found at: www.antarctica.ac.uk
Chris Rapley: Prof Chris Rapley CBE is Director of the British Antarctic Survey (BAS). Prior to this he was for four years the Executive Director of the International Geosphere-Biosphere Programme (IGBP) at the Royal Swedish Academy of Sciences in Stockholm. This followed an extended period as Professor of Remote Sensing Science and Associate Director of University College London's Mullard Space Science Laboratory. He has a first degree in physics from Oxford, an M.Sc. in radioastronomy from Manchester University, and a Ph.D. in X-ray astronomy from University College London. He has been a Principal Investigator on both NASA and European Space Agency satellite missions and is a member of the NASA JPL Cassini mission Science Team. He has been a member of numerous national and international committees and boards including Vice President of the Scientific Committee for Antarctic Research and Chair of the International Council for Science's (ICSU) International Polar Year 2007-2008 (IPY) Planning Group. He is currently a member of the European Polar Board's Executive and ICSU - World Meteorological Organisation (WMO) Joint Committee for IPY. He is a Fellow of St Edmund's College Cambridge, and is an Honorary Professor at University College London and at the University of East Anglia.
Contact details:
Friends of the Earth
26-28 Underwood St.
LONDON
N1 7JQ
Tel: 020 7490 1555
Fax: 020 7490 0881
Email: info@foe.co.uk
Website: www.foe.co.uk
Linspire Offers Cash Incentive For Pre-installing Linux on Desktop Computers
New System Builder Program Automatically Shares Revenues When Customers Purchase Linux Software and Services
SAN DIEGO, Linspire, Inc., developer of the commercial desktop Linux operating system of the same name and Freespire, the free community desktop Linux operating system, launched a revamped partner program today that pays system builders a percentage on all commercial Linux software and services purchased through CNR (Click 'N Run) technology by users of Linspire or Freespire pre-installed desktop and laptop computers. Offering 18 months of revenue share per computer, the new program also features an automated, real-time revenue share system that reports computer light-up data, CNR user registration percentage, and average per-user revenue generated through CNR.
"Linspire has been very successful in selling products and services to desktop Linux users via our CNR technology," said Kevin Carmony, CEO of Linspire." We are excited to now share this successful model with our valued system builder partners. Providing post-sale revenue will further entice PC suppliers around the world to pre-install Linux on desktop and laptop computers, critical for Linux's success."
Free to join, the Builder Program http://www.linspire.com/builder has no annual fees or volume commitments, and offers system builders the option to build pre-installed computer systems with both the commercial Linspire and community-driven Freespire desktop Linux operating systems. By logging into a personalized portal, system builders can see real-time data when their computer systems are turned on and connected to the Internet for the first time, as well as conversion data for purchases and their share of the revenue from each purchase. Checks are then automatically mailed on a quarterly basis to participating system builders. System builders can utilize this data to help them monitor their success in the program, and make adjustments to marketing messaging, initial out-of-box experience, support, and so on.
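The mechanics of the revenue share are simple to model. The sketch below is purely illustrative: the share percentage, registration rate and per-user spend are hypothetical placeholders, since the release does not disclose Linspire's actual terms.

    # Hypothetical model of a system builder's quarterly CNR revenue share.
    # All figures, including the builder's share, are illustrative assumptions,
    # not Linspire's actual terms.
    def quarterly_payout(units_lit_up, cnr_registration_rate,
                         avg_revenue_per_user, builder_share):
        """Estimate one quarter's check for a participating system builder."""
        registered_users = units_lit_up * cnr_registration_rate
        cnr_revenue = registered_users * avg_revenue_per_user
        return cnr_revenue * builder_share

    # Example: 1,000 machines light up, 40% of users register with CNR,
    # $20 average CNR spend, and an assumed 20% builder share -> $1,600.
    print(quarterly_payout(1000, 0.40, 20.00, 0.20))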
"Having served the desktop Linux channel for over four years now, we continually listen to our partners." said Larry Kettler, Vice President, Worldwide Sales & Marketing of Linspire. "They wanted us to eliminate all barriers to enter the desktop Linux market for system builders of all sizes and help them generate more profit per computer shipped out the door...this new program does both."
In addition to the new Builder Program, a new, easy-to-use Linspire channel web portal was launched at http://partners.linspire.com, which provides marketing, sales and support resources and tools for resellers and software publishers as well. The revamped Linspire Reseller Program makes it easier for computer and software resellers to effectively sell desktop Linux software and solutions. As with the Builder Program, the Reseller Program is free to join, and members can purchase products directly from Linspire or Ingram Micro. The program also includes more sales materials and sales training resources that will help resellers better promote and understand the value proposition for Linspire and desktop Linux.
About Linspire, Inc.
Linspire, Inc. (www.linspire.com) was founded in 2001 to bring choice into the operating system market. The company's flagship product, the Linspire operating system, is an affordable, easy-to-use Linux-based operating system for home, school, and business users. Linspire pioneered CNR ("Click 'N Run") Technology, which allows Linspire users access to thousands of software programs, each of which can be downloaded and installed with just one mouse click. The thousands of software titles available in the CNR Warehouse (www.linspire.com/cnr) include full office and productivity suites, games, multimedia players, photo management software, accounting tools, and more.
About Freespire
Freespire (www.freespire.org) is a community-driven, Linux-based operating system that combines the best that free, open source software has to offer (community driven, freely distributed, open source code, etc.), but also provides users the choice of including proprietary codecs, drivers and applications as they see fit. With Freespire, the choice is yours as to what software is installed on your computer, with no limitations or restrictions placed on that choice. How you choose to maximize the performance of your computer is entirely up to you.
About CNR ("Click 'N Run")
CNR, with access to over 20,000 Linux software applications, makes it extremely easy for non-technical users to install, uninstall, update and manage Linux software on their desktop or laptop computers (http://wiki.freespire.org/index.php/CNR_Warehouse). With the CNR Service you can freely install thousands of Linux software titles direct from the CNR Warehouse (http://linspire.com/warehouse), all with just a single mouse click. Users also get a powerful way to manage their entire software library, with advanced features, such as customizable "aisles" where you can install entire groups of software with a single click.
For more information and interview requests:
Linspire, Inc.
858-587-6700, ext. 283
858-587-8095 Fax
pr@linspireinc.com
Doctors cut repeat LASIK visits dramatically
Ophthalmologists have developed a formula that slashes by nearly two-thirds the likelihood that patients will need repeat visits to an eye surgeon to adjust their vision after their initial LASIK visit. That's because the formula makes it more likely that surgeons will get it right the first time.
The new results, presented at the European Society of Cataract and Refractive Surgery meeting in London, are the result of a complex computer formula compiled by doctors and scientists at the University of Rochester Medical Center that takes into account myriad imperfections within the eye that weren’t even known to exist a decade ago.
Even though most patients come out of refractive surgery with vision that is 20/20 or better, doctors have noticed that some patients exit the surgery slightly farsighted – not enough to seriously degrade their quality of vision or to require contact lenses or reading glasses, but enough to be a leading reason why people complain about the results of the surgery. A few others end up slightly nearsighted. While many of these patients still see at a level around 20/20, the slight farsightedness or nearsightedness is one of the chief barriers preventing them from seeing even better, at a level around 20/16.
Eye surgeon Scott MacRae, M.D., of the University of Rochester Eye Institute presented the results showing a dramatic drop in farsightedness among LASIK patients. In a recent study where MacRae and colleagues used the formula, known as the University of Rochester Nomogram, during surgery, just six of 445 eyes or 1.3 percent were slightly farsighted after LASIK. He compared this to results from a previous study five years ago without the formula. In that study of 340 eyes, even though 91 percent of patients had 20/20 vision or better – the highest known percentage of any large study in the world at the time – 74 of the 340 eyes treated, or 21.8 percent, were slightly farsighted.
"Though those results were among the best anyone had gotten to date, we thought we could do better," said MacRae, who worked for two years with post-doctoral associate Manoj Venkiteshwar, Ph.D., to develop the formula.
While some doctors have noticed that patients are more likely to be slightly farsighted than nearsighted after LASIK, doctors have had no way to predict which patients would be affected, MacRae said. If a doctor adjusted all of his or her surgeries to avoid the problem, then the other 80 percent of patients would wind up slightly nearsighted.
The new formula takes the guesswork out of the picture and establishes a scientific basis for the phenomenon.
The software developed by Venkiteshwar and MacRae controls how the laser beam dances around the surface of the cornea during a LASIK procedure, allowing the surgeon to sculpt the cornea into just the right shape so that it produces as flawless an image as possible. During a procedure that typically might last anywhere from 15 to 60 seconds, the laser beam hits the cornea about 50 times per second, with generally 750 to 3,000 pulses. The timing and aim, controlled by both the surgeon and the software, have to be precise.
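Those figures are internally consistent: at about 50 pulses per second,

    50\ \mathrm{pulses/s} \times 15\ \mathrm{s} = 750 \quad\text{and}\quad 50\ \mathrm{pulses/s} \times 60\ \mathrm{s} = 3000,

which matches the quoted range of 750 to 3,000 pulses per procedure.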
By taking into account the unique anomalies in each person’s eye, the formula predicts which patients are most likely to be slightly farsighted after a LASIK procedure, then adjusts the laser to avoid that outcome.
Ironically, Venkiteshwar and MacRae found that the cause of the shift was the new capability doctors have to fix subtle visual imperfections that weren’t even known to exist until David Williams, Ph.D., at the University of Rochester developed a system to see them.
Williams' system opened the door, for the first time in history, to the possibility of fixing not only the three major flaws in the eye that reading glasses and contact lenses have corrected for decades, but also approximately 60 additional imperfections that were never known before. Nearly everyone has these flaws in their eyes to some extent; while most people don’t notice them, they hurt our quality of vision in subtle ways. Since Williams' discovery, several companies have introduced technology that makes possible a technique known as customized ablation, a form of LASIK that corrects these imperfections, bringing about a super-crisp quality of eyesight. Beyond making vision on the order of 20/15 or 20/16 possible or even commonplace in some groups of patients, the technology also increases the eye’s ability to see in situations where there is low light or little contrast.
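The release does not say how the count of roughly 60 is arrived at, but it is consistent with one standard way of describing the eye's wavefront, namely Zernike polynomials up to 10th radial order; treating that as an assumption, the mode count works out as

    \sum_{k=0}^{10}(k+1) = \frac{(10+1)(10+2)}{2} = 66\ \text{modes},

and setting aside the low-order terms that spectacles and contact lenses already address (defocus and the two astigmatism terms) along with terms that do not affect image quality (piston and the two tilts) leaves about 60 higher-order aberrations.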
MacRae and Venkiteshwar were surprised to find that fixing these subtle imperfections affects vision in unexpected ways. They found that some of the improvements make an eye undergoing LASIK more prone to becoming slightly farsighted in some patients, and slightly nearsighted in a few patients. They’ve found the relationship in at least three different laser systems used in LASIK procedures.
"This is not something anyone would have predicted," said MacRae, who is a professor of Ophthalmology and of Visual Science. "When you fix these flaws, it can affect vision in ways that were previously unpredictable."
For instance, the team found that treating coma – a subtle imperfection where a point of light looks like it has the tail of a comet – affects a patient’s astigmatism as well as his or her degree of nearsightedness or farsightedness. Other common flaws that can now be fixed, but which also affect a person’s vision more broadly, include spherical aberration, where a point of light appears to have several rings of light around it; trefoil, where a point of light seems to be surrounded by three other points; and others such as secondary astigmatism, quadrafoil and pentafoil.
MacRae credits the new formula, part of a procedure he calls second-generation customized ablation, with slashing the need for repeat treatments in patients from about 8 percent to 3 percent.
The latest results are part of an ongoing program by MacRae, a pioneer in the field of customized ablation, to bring Williams' findings to the clinic and improve patients’ vision to unprecedented levels. Each year, MacRae says, scientists and physicians learn new things that help future patients.
"We’ve taken a very good procedure and made it even better. I am extremely confident in this technology, which I’ve even had done on myself. A conscientious, systematic approach to evaluating patients is key. Not everyone is a good candidate for LASIK. Surgeons need to be extremely diligent about their pre-operative evaluations to maximize safety and the outcomes for their patients," said MacRae, the author of the best-selling book on customized ablation, Customized Corneal Ablation: The Quest for Supervision.
Contact: Tom Rickey
tom_rickey@urmc.rochester.edu
585-275-7954
University of Rochester Medical Center
Road wends its way through stomach
A computer model or "virtual stomach" revealed a central "road" in the human stomach, dubbed the Magenstrasse, that could explain why pharmaceuticals sometimes have a large variability in drug activation times, according to a team creating computer simulations of stomach contractions.
"We are predicting variables that we wish we could measure, but we cannot," says Dr. James G. Brasseur, professor of mechanical engineering, bioengineering and mathematics at Penn State. "Now that we know the Magenstrasse exists, we can look for it, but, it will not be easy to measure its existence and could require expensive technology."
Brasseur, working with Anupam Pal, research associate, Penn State and Bertil Abrahamsson, AstraZeneca, was interested in how the stomach empties its contents and how material passes from the stomach into the small intestines.
"The sphincter between the stomach and the small intestine is interactive," said Brasseur. "The sphincter opens and closes in a controlled way to regulate the flow of nutrient to the small intestines. Sensor cells in the intestines modulate the opening and closing."
Two types of muscle contractions control food movement in the stomach. One type, antral contractions, occurs in the lower portion of the stomach and breaks down and mixes stomach contents. The other type, fundic contractions, occurs over the upper surface of the stomach. It was thought that the fundic contractions move food from the top of the stomach, where it enters from the esophagus, to the bottom of the stomach, where the chyme leaves and enters the small intestine. The assumption was that particles left the stomach in the same order they entered it.
The researchers modeled the stomach contents and discovered that a narrow path forms in the center of the stomach along which food exits the stomach more rapidly than the regions near the walls of the stomach. They used MRI data from human subjects to create the proper geometry of the muscle contractions.
"We looked at a ten-minute window of digestion and we tagged all the particles as they left the virtual stomach," said Brasseur. "We then reversed the flow on the computer and saw where the particles came from."
In essence, they ran the simulation backwards and were surprised to see a central road appear. Particles in the virtual stomach that were on the central road exited the stomach within 10 minutes. The Magenstrasse extended all the way from the stomach's exit up to the top of the stomach's fundus. Material that entered the stomach off this Magenstrasse could remain in the stomach much longer, even hours in the real stomach.
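In computational terms, "reversing the flow" amounts to advecting particles backwards through the velocity field. The fragment below is not the team's code; it is a minimal sketch, on a made-up two-dimensional flow, of how particles exiting a domain can be traced back to their origins.

    # Minimal sketch of backward particle tracing, not the Penn State model.
    # velocity() is a made-up placeholder flow; the real simulation used
    # MRI-derived stomach geometry and peristaltic wall motion.
    import math

    def velocity(x, y):
        # Hypothetical smooth 2-D velocity field.
        return math.cos(y), math.sin(x)

    def trace_backwards(x0, y0, dt=0.01, steps=1000):
        """Integrate a particle backwards in time from its exit position."""
        x, y = x0, y0
        for _ in range(steps):
            u, v = velocity(x, y)
            x -= u * dt  # reversed flow: step against the velocity field
            y -= v * dt
        return x, y

    # Tag a few "exiting" particles and trace where they came from.
    exits = [(0.0, 0.0), (0.5, 0.2), (1.0, -0.3)]
    print([trace_backwards(x, y) for x, y in exits])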
"This discovery might explain observed high variability in drug initiation time, and may have important implications to both drug delivery and digestion," the researchers report online in the Journal of Biomechanics. The paper will appear in a print edition in 2007.
Because most drugs target the small intestines for absorption, a pill typically disintegrates in the stomach and its drug becomes active in the small intestines. With this new understanding of how the stomach works, where in the stomach a pill or capsule disintegrates becomes very important. Drug delivery times may differ from 10 minutes to hours depending on location.
"Therefore, drugs released on the Magenstrasse will enter the duodenum rapidly and at a high concentration," the researchers report. "Drug released off the gastric emptying Magenstrasse, however, will mix well and enter the duodenum much later, at low concentration."
For some drugs, rapid release is important, for others, slow release over long periods of time is the desired outcome.
"If you do not know a Magenstrasse exists, you will not factor it into the designs," says Brasseur. "Now that we know, perhaps researchers can design pills with higher densities to sit around at the bottom of the stomach, outside the Magenstrasse, and let the drug out slowly."
Contact: A'ndrea Elyse Messer
aem1@psu.edu
814-865-9481
Penn State
"We are predicting variables that we wish we could measure, but we cannot," says Dr. James G. Brasseur, professor of mechanical engineering, bioengineering and mathematics at Penn State. "Now that we know the Magenstrasse exists, we can look for it, but, it will not be easy to measure its existence and could require expensive technology."
Brasseur, working with Anupam Pal, research associate, Penn State and Bertil Abrahamsson, AstraZeneca, was interested in how the stomach empties its contents and how material passes from the stomach into the small intestines.
"The sphincter between the stomach and the small intestine is interactive," said Brasseur. "The sphincter opens and closes in a controlled way to regulate the flow of nutrient to the small intestines. Sensor cells in the intestines modulate the opening and closing."
Two types of muscle contractions control food movement in the stomach. One type of contraction, antral contractions, occur in the lower portion of the stomach and break down and mix stomach contents. The other type of contraction, fundic contractions, is over the upper surface of the stomach. It was thought that the fundic contractions move food from the top of the stomach where it enters from the esophagus, to the bottom of the stomach where the chyme leaves and enters the small intestine. The assumption was that particles left the stomach in the same order they entered the stomach.
The researchers modeled the stomach contents and discovered that a narrow path forms in the center of the stomach along which food exits the stomach more rapidly than the regions near the walls of the stomach. They used MRI data from human subjects to create the proper geometry of the muscle contractions.
"We looked at a ten-minute window of digestion and we tagged all the particles as they left the virtual stomach," said Brasseur. "We then reversed the flow on the computer and saw where the particles came from."
In essence they ran the simulation backwards and were surprised to see a central road appear. Those particles in the virtual stomach that were on the central road, exited the stomach in 10 minutes. The Magenstrasse extended all the way from the stomach's exit up to the top of the stomach's fundus. Material that entered the stomach off this Magenstrasse could remain in the stomach a long time, even hours in the real stomach.
"This discovery might explain observed high variability in drug initiation time, and may have important implications to both drug delivery and digestion," the researchers report online in the Journal of Biomechanics. The paper will appear in a print edition in 2007.
Because most drugs target the small intestines for absorption, a pill disintegrates in the stomach and activates in the small intestines. With this new understanding of how the stomach works, where in the stomach a pill or capsule disintegrates becomes very important. Drug delivery times may differ from 10 minutes to hours depending on location.
"Therefore, drugs released on the Magenstrasse will enter the duodenum rapidly and at a high concentration," the researchers report. "Drug released off the gastric emptying Magenstrasse, however, will mix well and enter the duodenum much later, at low concentration."
For some drugs, rapid release is important, for others, slow release over long periods of time is the desired outcome.
"If you do not know a Magenstrasse exists, you will not factor it into the designs," says Brasseur. "Now that we know, perhaps researchers can design pills with higher densities to sit around at the bottom of the stomach, outside the Magenstrasse, and let the drug out slowly."
Contact: A'ndrea Elyse Messer
aem1@psu.edu
814-865-9481
Penn State
Tuesday, September 19, 2006
Engine on a chip promises to best the battery
MIT researchers are putting a tiny gas-turbine engine inside a silicon chip about the size of a quarter. The resulting device could run 10 times longer than a battery of the same weight can, powering laptops, cell phones, radios and other electronic devices.
It could also dramatically lighten the load for people who can't connect to a power grid, including soldiers who now must carry many pounds of batteries for a three-day mission -- all at a reasonable price.
The researchers say that in the long term, mass-production could bring the per-unit cost of power from microengines close to that for power from today's large gas-turbine power plants.
Making things tiny is all the rage. The field -- called microelectromechanical systems, or MEMS -- grew out of the computer industry's stunning success in developing and using micro technologies. "Forty years ago, a computer filled up a whole building," said Professor Alan Epstein of the Department of Aeronautics and Astronautics. "Now we all have microcomputers on our desks and inside our thermostats and our watches."
While others are making miniature devices ranging from biological sensors to chemical processors, Epstein and a team of 20 faculty, staff and students are looking to make power -- personal power. "Big gas-turbine engines can power a city, but a little one could 'power' a person," said Epstein, whose colleagues are spread among MIT's Gas Turbine Laboratory, Microsystems Technology Laboratories, and Laboratory for Electromagnetic and Electronic Systems.
How can one make a tiny fuel-burning engine? An engine needs a compressor, a combustion chamber, a spinning turbine and so on. Making millimeter-scale versions of those components from welded and riveted pieces of metal isn't feasible. So, like computer-chip makers, the MIT researchers turned to etched silicon wafers.
Their microengine is made of six silicon wafers, piled up like pancakes and bonded together. Each wafer is a single crystal with its atoms perfectly aligned, so it is extremely strong. To create the necessary components, the wafers are individually prepared using an advanced etching process that eats away selected material. When the wafers are piled up, the surfaces and the spaces in between produce the needed features and functions.
Making microengines one at a time would be prohibitively expensive, so the researchers again followed the lead of computer-chip makers. They make 60 to 100 components on a large wafer that they then (very carefully) cut apart into single units.
Challenges ahead
The MIT team has now used this process to make all the components needed for their engine, and each part works. Inside a tiny combustion chamber, fuel and air quickly mix and burn at the melting point of steel. Turbine blades, made of low-defect, high-strength microfabricated materials, spin at 20,000 revolutions per second -- 100 times faster than those in jet engines. A mini-generator produces 10 watts of power. A little compressor raises the pressure of air in preparation for combustion. And cooling (always a challenge in hot microdevices) appears manageable by sending the compressed air around the outside of the combustor.
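Converted to more familiar units (simple arithmetic, not a figure from the MIT team):

    20{,}000\ \mathrm{rev/s} \times 60\ \mathrm{s/min} = 1.2\times10^{6}\ \mathrm{RPM},

which squares with the "100 times faster" comparison, since the spools of full-size jet engines typically turn at speeds on the order of 10,000 to 20,000 RPM.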
"So all the parts work…. We're now trying to get them all to work on the same day on the same lab bench," Epstein said. Ultimately, of course, hot gases from the combustion chamber need to turn the turbine blades, which must then power the generator, and so on. "That turns out to be a hard thing to do," he said. Their goal is to have it done by the end of this year.
Predicting how quickly they can move ahead is itself a bit of a challenge. If the bonding process is done well, each microengine is a monolithic piece of silicon, atomically perfect and inseparable. As a result, even a tiny mistake in a single component will necessitate starting from scratch. And if one component needs changing -- say, the compressor should be a micron smaller -- the microfabrication team will have to rethink the entire design process.
For all the difficulties, Epstein said the project is "an astonishing amount of fun" -- and MIT is the ideal place for it. "Within 300 feet of my office, I could find the world's experts on each of the technologies needed to make the complete system," he said.
In addition, the project provides an excellent opportunity for teaching. "No matter what your specialty is -- combustion or bearings or microfabrication -- it's equally hard," he said. "As an educational tool, it's enormously useful because the students realize that their success is dependent upon other people's success. They can't make their part easier by making somebody else's part harder, because then as a team we don't succeed."
This research was funded by the U.S. Army Research Laboratory.
Nancy Stauffer, Laboratory for Energy and the Environment
HP Dramatically Simplifies Network Storage for Small and Medium Businesses
NEW YORK,
HP today announced the industry’s first easy-to-use storage systems that drive down cost and complexity for small and medium businesses (SMBs) grappling with the exponential growth of data.
The new HP StorageWorks All-in-One (AiO) Storage Systems deliver simple, application-centric storage management, reliable data protection and affordable data storage. Using the systems, SMBs with limited or no storage expertise are able to store, share, manage, back up and protect their rapidly growing application and file data in a flexible network storage environment.
More than 60 percent of SMBs have not deployed networked storage,(1) and the HP All-in-One Storage Systems address what industry analyst firm IDC predicts will be a $5.7 billion market opportunity by 2010.(2)
“With these systems, HP is now the benchmark for storage solutions for SMBs,” said Bob Schultz, senior vice president and general manager, StorageWorks Division, HP. “Developed from the ground up to address the specific problems SMBs face, the HP All-in-One Storage Systems deliver functionality without the complexity of competing offerings.”
The HP AiO Storage Systems handle all storage tasks and are accessed via an easy-to-use graphical interface, saving administrators time and providing newfound levels of control.
“It’s clear the HP All-in-One Storage Systems were designed with the needs of small businesses like St. John’s in mind,” said Charles Love, director of IT for St. John’s Episcopal School in Tampa, Fla. “From the intuitive set-up wizards to the easy-to-use interface, the system has proven to be an extremely simple, complete and low-cost solution to help us manage our growing storage needs.”
The systems are specifically designed for managing storage in Microsoft® environments, including point-and-click tools for working with Microsoft Exchange and SQL Server. In fewer than 10 clicks, for example, customers can fully set up shared storage for an Exchange mailstore.
“We’re excited to work with HP, a strategic partner with deep expertise in Windows and networked storage technology, to deliver a breakthrough storage innovation,” said Gabriel Broner, general manager, Storage, Microsoft Corp. “HP All-in-One Storage Systems bring together HP’s unique design with Microsoft’s Storage Server software to provide an extremely compelling offering for SMBs. Microsoft is committed to making universal distributed storage a reality, by reducing storage costs and providing customers with high-end functionality on industry-standard hardware.”
A full version of HP StorageWorks Data Protector Express Software is integrated into HP All-in-One Storage Systems to enable data backup and recovery from tape, virtual tape, optical or external disk on the network. The HP AiO Storage Systems also run Windows® Storage Server 2003 R2, which enables replication from one system to another and rapid recovery in the event of a disaster.
“The HP All-in-One is filling a hole in the market that no one else is addressing by taking the complexity out of storage,” said Don Zurbrick, sales manager, Big Sur Technologies, a Florida-based solutions and services provider. “We are confident that the simplicity of the AiO will help drive incremental business and strengthen our position as a trusted advisor to our customers.”
The HP StorageWorks All-in-One Storage Systems are available now and will be sold primarily through HP’s extensive network of more than 145,000 channel resellers worldwide.
More information about HP StorageWorks All-in-One Storage Systems is available in an online press kit at www.hp.com/go/AiOLaunch.
About HP
HP is a technology solutions provider to consumers, businesses and institutions globally. The company’s offerings span IT infrastructure, global services, business and home computing, and imaging and printing. For the four fiscal quarters ended July 31, 2006, HP revenue totaled $90.0 billion. More information about HP (NYSE, Nasdaq: HPQ) is available at http://www.hp.com.
MSN Launches Beta of Soapbox on MSN Video
MSN expands industry-leading MSN Video service by enabling people to actively participate in the MSN content experience.
REDMOND, Wash. — MSN today announced the U.S. beta release of Soapbox on MSN® Video, a user-uploaded video service that makes it easy for people to express themselves by uploading, discovering and sharing personal videos with the Soapbox community and others around the world. Soapbox will be available on MSN Video and will be deeply integrated throughout Microsoft Corp.’s portfolio of online services, including Windows Live™ Spaces and Windows Live Messenger.
“Soapbox delivers on a critical component of the MSN growth strategy of deepening audience engagement by enabling people to participate in the content experience,” said Rob Bennett, general manager of Entertainment and Video Services for MSN. “By adding a user-uploaded video service, we are rounding out our existing investments in commercially produced and original content on MSN Video.”
Advanced Technology for Exceptional Performance
Soapbox on MSN Video utilizes powerful Web 2.0 technologies to provide a dynamic, fun and entertaining experience and offers these benefits:
• Easy uploading and sharing of video creations. By providing single-step uploading, background server-side video processing and acceptance of all major digital video formats, Soapbox makes uploading videos a snap.
• Finding and discovering the most entertaining videos. Viewers can search, browse through 15 categories, find related videos, subscribe to RSS feeds, and share their favorites with their friends — all without interrupting whatever video they are watching.
• Participation in the Soapbox community. Soapbox users can rate, comment on and tag the videos they view, share links with their friends via e-mail, and include the embeddable Soapbox player directly on their Web site or blog.
Availability
The beta of Soapbox on MSN Video is available on an invitation-only basis in the U.S. Those interested in participating in the beta can sign up for the waiting list now at http://soapbox.msn.com. Access to the beta will expand over time by enabling existing beta testers to invite a limited number of friends. The beta of Soapbox on MSN Video is available to users of Microsoft® Internet Explorer® 6 or later running on Windows® XP and Firefox 1.0.5 or later running on Windows XP or Macintosh OS X.
About MSN Video
MSN Video is one of the largest video-only streaming services on the Web, watched by more than 11 million unique users per month. In addition to streaming news, entertainment and sports video clips from more than 45 content partners including “Today,” FOX Sports, MSNBC, JibJab Media Inc. and Fox Entertainment Group, MSN Video presents a broad array of live events to online audiences worldwide. More than 50 top advertisers support MSN Video, which is available to consumers at no charge. MSN Video is available on the Web to consumers in the U.S. MSN Video is also live in Australia, Canada, Japan and the U.K., and in Spanish in the U.S.; the service also is in beta testing in France.
Overall, MSN attracts more than 465 million unique users worldwide per month. With localized versions available globally in 42 markets and 21 languages, MSN is a world leader in delivering Web services to consumers and online advertising opportunities to businesses worldwide.
About Microsoft
Founded in 1975, Microsoft (Nasdaq “MSFT”) is the worldwide leader in software, services and solutions that help people and businesses realize their full potential.
Note to editors: If you are interested in viewing additional information on Microsoft, please visit the Microsoft Web page at http://www.microsoft.com/presspass on Microsoft’s corporate information pages. Web links, telephone numbers and titles were correct at time of publication, but may since have changed. For additional assistance, journalists and analysts may contact Microsoft’s Rapid Response Team or other appropriate contacts listed at http://www.microsoft.com/presspass/contactpr.mspx.
Monday, September 18, 2006
'No time to exercise' is no excuse
A new study, published in The Journal of Physiology, shows that short bursts of very intense exercise -- equivalent to only a few minutes per day -- can produce the same results as traditional endurance training.
"The most striking finding from our study was the remarkably similar improvements in muscle health and performance induced by two such diverse training strategies," says Martin Gibala, an associate professor of kinesiology at McMaster University.
Gibala's team made headlines last year when they suggested that a few minutes of high-intensity exercise could be as effective as an hour of moderate activity. However, their previous work did not directly compare sprint versus endurance training.
The new study was conducted on 16 college-aged students who performed six training sessions over two weeks. Eight subjects performed between four and six 30-second bursts of "all out" cycling separated by 4 minutes of recovery during each training session. The other eight subjects performed 90-120 minutes of continuous moderate-intensity cycling each day. Total training time commitment including recovery was 2.5 hours in the sprint group, whereas the endurance group performed 10.5 hours of total exercise over two weeks. Despite the marked difference in training volume, both groups showed similar improvements in exercise performance and muscle parameters associated with fatigue resistance.
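The time savings quoted above can be checked with back-of-the-envelope arithmetic. The short Python sketch below tallies both protocols as described in the study summary; the mid-range values used where the article gives a range (five bursts per session, 105 minutes of endurance cycling) and the assumption of a recovery period after every burst are illustrative choices, not figures from the paper.

# Back-of-the-envelope comparison of the two training protocols described above.
# Assumptions (not from the paper): 5 bursts per session (mid-range of 4-6),
# a 4-minute recovery after every burst, and 105 minutes of endurance cycling
# per session (mid-range of 90-120).

SESSIONS = 6  # six training sessions over two weeks

bursts_per_session = 5
sprint_work_min = bursts_per_session * 0.5        # 30-second "all out" bursts
sprint_recovery_min = bursts_per_session * 4      # 4 minutes of recovery each
sprint_total_h = SESSIONS * (sprint_work_min + sprint_recovery_min) / 60

endurance_total_h = SESSIONS * 105 / 60           # continuous moderate cycling

print(f"Sprint group:    ~{sprint_total_h:.1f} h total, including recovery")
print(f"Endurance group: ~{endurance_total_h:.1f} h total")
# Prints roughly 2.2 h versus 10.5 h, close to the 2.5-hour and 10.5-hour
# commitments reported for the two groups.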
"Our study demonstrates that interval-based exercise is a very time-efficient training strategy," said Gibala. "This type of training is very demanding and requires a high level of motivation. However, short bursts of intense exercise may be an effective option for individuals who cite 'lack of time' as a major impediment to fitness."
"The most striking finding from our study was the remarkably similar improvements in muscle health and performance induced by two such diverse training strategies," says Martin Gibala, an associate professor of kinesiology at McMaster University.
Gibala's team made headlines last year when they suggested that a few minutes of high-intensity exercise could be as effective as an hour of moderate activity. However, their previous work did not directly compare sprint versus endurance training.
The new study was conducted on 16 college-aged students who performed six training sessions over two weeks. Eight subjects performed between four and six 30-second bursts of "all out" cycling separated by 4 minutes of recovery during each training session. The other eight subjects performed 90-120 minutes of continuous moderate-intensity cycling each day. Total training time commitment including recovery was 2.5 hours in the sprint group, whereas the endurance group performed 10.5 hours of total exercise over two weeks. Despite the marked difference in training volume, both groups showed similar improvements in exercise performance and muscle parameters associated with fatigue resistance.
"Our study demonstrates that interval-based exercise is a very time-efficient training strategy," said Gibala. "This type of training is very demanding and requires a high level of motivation. However, short bursts of intense exercise may be an effective option for individuals who cite 'lack of time' as a major impediment to fitness."
Sunday, September 17, 2006
Acoustic data may reveal hidden gas, oil supplies
Just as doctors use ultrasound to image internal organs and unborn babies, MIT Earth Resources Laboratory researchers listen to the echoing language of rocks to map what's going on tens of thousands of feet below the Earth's surface.
With the help of a new $580,000 US Department of Energy (DOE) grant, the earth scientists will use their skills at interpreting underground sound to seek out "sweet spots"--pockets of natural gas and oil contained in fractured porous rocks--in a Wyoming oil field. If the method proves effective at determining where to drill wells, it could eventually be used at oil and gas fields across the country.
A major domestic source of natural gas is low-permeability or "tight" gas formations. Oil and gas come from organic materials that have been cooked for eons under the pressure and high heat of the Earth's crust. Some underground reservoirs contain large volumes of oil and gas that flow easily through permeable rocks, but sometimes the fluids are trapped in rocks with small, difficult-to-access pores, forming separate scattered pockets. Until recently, there was no technology available to get at tight gas.
Tight gas is now the largest of three unconventional gas resources, which also include coal beds and shale. Production of unconventional gas in the United States represented around 40 percent of the nation's total gas output in 2004, according to the DOE, but could grow to 50 percent by 2030 if advanced technologies are developed and implemented.
One such advanced technology is the brainchild of Mark E. Willis and Daniel R. Burns, research scientists in the Department of Earth, Atmospheric and Planetary Sciences (EAPS), and M. Nafi Toksoz, professor of EAPS. Their method involves combining data from two established, yet previously unrelated, means of seeking out hidden oil and gas reserves.
To free up the hydrocarbons scattered in small pockets from one to three miles below ground, oil companies use a process called hydraulic fracturing, or hydrofrac, which forces water into the bedrock through deep wells to create fractures and increase the size and extent of existing fractures. The fractures open up avenues for the oil and gas to flow to wells.
To monitor the effectiveness of fracturing and to detect natural fractures that may be sweet spots of natural gas, engineers gather acoustic data from the surface and from deep within wells. "Surface seismic methods are like medical ultrasound. They give us images of the subsurface geology," Burns said. Three-dimensional seismic surveys involve creating vibrations on the surface and monitoring the resulting underground echoes. "When the echoes change, fractures are there," Willis said.
A method called time-lapse vertical seismic profiling (VSP) tends to be more accurate because it collects acoustic data directly underground through bore holes. "Putting the receivers down into a well is like making images with sensors inside the body in the medical world," Burns said. "The result is the ability to see finer details and avoid all the clutter that comes from sending sound waves through the skin and muscle tissue to get at the thing we are most interested in seeing."
Time-lapse VSP is expensive and not routinely used in oil and gas exploration. The EAPS research team, working with time-lapse VSP data collected by industry partner EnCana Corp., came up with unique ways to look at the data together with microseismic data from the tiny earthquakes that are produced when the rock is fractured. "If we record and locate these events just as the US Geological Survey does with large earthquakes around the world, we get an idea of where the hydrofrac is located. Then we look at the time-lapse VSP data at those spots and try to get a more detailed image of the fracture," Burns said.
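To illustrate the kind of analysis this combination makes possible, the Python sketch below compares a baseline VSP trace with a repeat (time-lapse) trace in a short window around the arrival time associated with a located microseismic event and reports a simple change metric. It is a generic illustration under assumed parameters (the synthetic traces, the 500-samples-per-second rate and the 0.2-second window are all hypothetical), not the EAPS group's actual processing.

import numpy as np

def timelapse_change(baseline, monitor, event_time_s, fs, window_s=0.2):
    """Normalized RMS difference between baseline and repeat VSP traces in a
    short window centred on the arrival time tied to a located microseismic
    event. Values near zero mean the echoes are unchanged; larger values
    suggest the rock near the event has been altered (e.g. new fractures)."""
    i0 = int((event_time_s - window_s / 2) * fs)
    i1 = int((event_time_s + window_s / 2) * fs)
    diff = monitor[i0:i1] - baseline[i0:i1]
    return np.sqrt(np.mean(diff ** 2)) / (np.sqrt(np.mean(baseline[i0:i1] ** 2)) + 1e-12)

# Hypothetical example: two otherwise identical synthetic traces, with a small
# perturbation near t = 1.0 s standing in for a fracture-induced change.
fs = 500.0                               # samples per second (assumed)
t = np.arange(0, 2.0, 1.0 / fs)
baseline = np.sin(2 * np.pi * 30 * t) * np.exp(-t)
monitor = baseline.copy()
monitor[int(0.95 * fs):int(1.05 * fs)] += 0.1

print(round(timelapse_change(baseline, monitor, event_time_s=1.0, fs=fs), 3))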
The MIT team hopes to show that this new approach is the most effective way to find sweet spots. "If we can demonstrate the value of time-lapse VSP, this tool could be used in a wider fashion across the United States on many fields," Willis said.
Deborah Halber
Weather forecast accuracy gets boost with new computer model
BOULDER--An advanced forecasting model that predicts several types of extreme weather with substantially improved accuracy has been adopted for day-to-day operational use by civilian and military weather forecasters. The new computer model was created through a partnership that includes the National Oceanic and Atmospheric Administration (NOAA), the National Center for Atmospheric Research (NCAR), and more than 150 other organizations and universities in the United States and abroad.
The high-resolution Weather Research and Forecasting model (WRF) is the first model to serve as both the backbone of the nation's public weather forecasts and a tool for cutting-edge weather research. Because the model fulfills both functions, it is easier for research findings to be translated into improved operational models, leading to better forecasts.
The model was adopted for use by NOAA's National Weather Service (NWS) as the primary model for its one-to-three-day U.S. forecasts and as a key part of the NWS's ensemble modeling system for short-range forecasts. The U.S. Air Force Weather Agency (AFWA) also has used WRF for several areas of operations around the world.
"The Weather Research and Forecasting model development project is the first time researchers and operational scientists have come together to collaborate on a weather modeling project of this magnitude," says Louis Uccellini, director of NOAA's National Centers for Environmental Prediction.
By late 2007, the new model will shape forecasts that serve more than a third of the world's population. It is being adopted by the national weather agencies of Taiwan, South Korea, China, and India.
"WRF is becoming the world's most popular model for weather prediction because it serves forecasters as well as researchers," says NCAR director Tim Killeen.
Multiple benefits
Tests over the last year at NOAA and AFWA have shown that the new model offers multiple benefits over its predecessor models. For example:
• Errors in nighttime temperature and humidity across the eastern United States are cut by more than 50%.
• The model depicts flight-level winds in the subtropics that are stronger and more realistic, thus leading to improved turbulence guidance for aircraft.
• The model outperformed its predecessor in more than 70% of the situations studied by AFWA.
• WRF incorporates data from satellites, radars, and a wide range of other tools with greater ease than earlier models.
Advanced research
NCAR has been experimenting with an advanced research version of WRF, with very fine resolution and innovative techniques, to demonstrate where potential may exist for improving the accuracy of hurricane track, intensity, and rainfall forecasts. A special hurricane-oriented version of WRF, the HWRF, is now being developed by scientists from NOAA, the Naval Research Laboratory, the University of Rhode Island, and Florida State University to support NOAA hurricane forecasting. The high-resolution HWRF will track waves and other features of the ocean and atmosphere, including the heat and moisture exchanged between them. Its depiction of hurricane cores and the ocean below them will be enhanced by data from satellites, aircraft, and other observing tools.
WRF also is skilled at depicting intense squall lines, supercell thunderstorms, and other types of severe weather. Although no model can pinpoint hours ahead of time where a thunderstorm will form, WRF outpaces many models in its ability to predict what types of storms could form and how they might evolve.
Approximately 4,000 people in 77 countries are registered users of WRF. Many of these users suggest improvements, which are tested for operational usefulness at a testbed facility based at NCAR and supported by NOAA.
"WRF will continue to improve because of all the research and development pouring into it from our nation's leading academic and scientific institutions," said AFWA commander Patrick Condray.
Contact: David Hosansky
hosansky@ucar.edu
303-497-8611
National Center for Atmospheric Research/University Corporation for Atmospheric Research
Dennis Feltgen
dennis.feltgen@noaa.gov
301-763-0622, ext. 127
National Oceanic and Atmospheric Administration
Miles Brown
miles.brown@afwa.af.mil
402-294-2862
Air Force Weather Agency Public Affairs
M. D. Anderson teaches the art of aromatherapy to soothe and heal
A bubble bath that improves memory. A kitchen cleaner that wards off nausea and energizes. A scented handkerchief that calms a patient entering the MRI. The benefits of aromatherapy are real. Below, learn the uses, healing properties and how-tos of using aromatherapy to heal and de-stress from The University of Texas M. D. Anderson Cancer Center in Houston.
Scan the shelves of the local bath and body stores and one is sure to find products labeled for aromatherapy. Many might be surprised to learn the science behind it. So what is aromatherapy, how is it used and will those products actually work?
Cherie Perez, a supervising research nurse in the Department of Genitourinary Medical Oncology, teaches a monthly aromatherapy class to answer those questions for cancer patients and caregivers undergoing treatment at M. D. Anderson Cancer Center. Perez's classes are offered free of charge through M. D. Anderson's Place... of wellness, a center within the institution that focuses on helping patients and caregivers deal with the non-medical issues of living with cancer. It is the first complementary therapy facility built on the campus of a comprehensive cancer center.
Perez, who first became involved with aromatherapy to help relieve the physical pain and discomfort caused by fibromyalgia, covers the basics of aromatherapy, reviews safety precautions and leads interactive demonstrations in each hour-long class.
Oils and healing
While essential oils may not directly stimulate the immune system, they can complement cancer treatment by boosting the system's ability to fight off infections, says Perez.
Certain oils can also stimulate lymphatic drainage or have antibacterial properties. Since aromatherapy has many potential uses, ranging from managing anxiety and nausea to helping with sleep, general relaxation, memory and attention, many individuals, including cancer patients, can benefit from it [See Sidebar 1: Five Oils to Reduce Stress and Relieve Ailments.]
There are a variety of products and methods of diffusion for obtaining the healing benefits of oils. Some oils, like lavender, ylang ylang and sandalwood, can be applied directly to the skin, while others are too concentrated and need to be diluted into carriers such as massage oils, bath soaps and lotions [See Sidebar 2: Everyday Uses for Aromatherapy.] Most typically, Perez advises patients to put a few drops of an oil, or a combination of oils, onto a handkerchief and "fan themselves like Scarlett O'Hara." Burning oils or incense is not recommended because most such products are poorly made and give off unhealthy fumes and soot.
Who should, or shouldn't, use oils?
Widely sold in health food stores and beauty chain stores, essential oils do have chemical properties that can affect the brain and enter the bloodstream, and for some patients they may be toxic when combined with common cancer therapies such as chemotherapy and radiation therapy. Perez says essential oils, like many medicines, can increase a person's sensitivity to the sun and should be used with caution. Patients should always inform their physicians, and discuss the risks with them, before using aromatherapy oils to complement treatment of a medical condition.
People with high blood pressure should avoid hyssop, rosemary, sage and thyme, while diabetics should avoid angelica oil. Women who are pregnant or nursing should avoid a number of oils that stimulate the uterus, including star anise, basil and juniper, and should use peppermint, rose and rosemary with caution in the first trimester. According to Perez, pediatric patients can use aromatherapy essential oils in very low concentrations. [See Sidebar 3: Tips for Buying Oils.]
Aromatherapy's role in cancer treatment
"The nature of aromatherapy makes it challenging to study due to the fact that it is difficult to create a placebo and every person is different in their nasal sensitivities and skin absorption rates," says Perez. In the future, however, she would be interested in designing research to examine how aromatherapy can be used to treat/heal burns caused from radiation treatment safely and effectively, soothe pre-treatment anxiety and manage loss-of-memory issues in cancer survivors.
Contact: Lindsay Anderson
lindsay.anderson@gabbe.com
212-220-4444
University of Texas M. D. Anderson Cancer Center
Engineers forge greener path to iron production
MIT engineers have demonstrated an eco-friendly way to make iron. The new method eliminates the greenhouse gases usually associated with iron production.
The American Iron and Steel Institute (AISI) announced today that the team, led by Donald R. Sadoway of the Department of Materials Science and Engineering, has shown the technical viability of producing iron by molten oxide electrolysis (MOE).
"What sets molten oxide electrolysis apart from other metal-producing technologies is that it is totally carbon-free and hence generates no carbon dioxide gases -- only oxygen," said Lawrence W. Kavanagh, AISI vice president of manufacturing and technology.
The work was funded by the AISI/Department of Energy Technology Roadmap Program (TRP). The TRP goal is to increase the competitiveness of the U.S. steel industry while saving energy and enhancing the environment. According to the AISI, the MIT work "marks one of TRP's breakthrough projects toward meeting that goal."
Unlike other iron-making processes, MOE works by passing an electric current through a liquid solution of iron oxide. The iron oxide then breaks down into liquid iron and oxygen gas, allowing oxygen to be the main byproduct of the process.
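The arithmetic linking electric current to metal output follows from Faraday's law of electrolysis. The sketch below is a generic back-of-the-envelope calculation under that law, not a description of the MIT cell; the 1,000-ampere current and 24-hour run in the example are arbitrary, and a real cell would produce somewhat less because of current-efficiency losses.

# Faraday's-law arithmetic for molten oxide electrolysis of iron oxide.
# Overall reaction: 2 Fe2O3 -> 4 Fe + 3 O2, with iron reduced at the cathode
# (Fe3+ + 3e- -> Fe) and oxygen gas released at the anode.

FARADAY = 96485.0   # coulombs per mole of electrons
M_FE = 55.845       # g/mol, molar mass of iron

def iron_mass_kg(current_a, hours, current_efficiency=1.0):
    """Mass of iron (kg) deposited by a given current over a given time,
    assuming perfect current efficiency unless stated otherwise."""
    charge = current_a * hours * 3600.0            # coulombs
    moles_fe = current_efficiency * charge / (3.0 * FARADAY)
    return moles_fe * M_FE / 1000.0

# Hypothetical example: a 1,000 A cell run for 24 hours yields roughly 17 kg
# of iron (and, by stoichiometry, about 7 kg of oxygen gas as the byproduct).
print(f"{iron_mass_kg(1000, 24):.1f} kg of iron")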
Electrolysis itself is nothing new -- all of the world's aluminum is produced this way. And that is one advantage of the new process: It is based on a technology that metallurgists are already familiar with. Unlike aluminum smelting, however, MOE is carbon-free.
"What's different this time is that we have the resources to take the time to unravel the underlying basic science," said Sadoway, the John F. Elliott Professor of Materials Chemistry. "No one has ever studied the fundamental electrochemistry of a process operating at 1600ÂșC. We're doing voltammetry at white heat!"
The result? "I now can confirm that in molten oxide electrolysis we'll see iron productivities at least five times that of aluminum, maybe as high as 10 times. This changes everything when it comes to assessing technical viability at the industrial scale."
MIT will continue further experiments to determine how to increase the rate of iron production and to discover new materials capable of extending the life of certain reactor components to industrially practical limits. This work will set the stage for construction of a pilot-scale cell to further validate the viability of the MOE process and identify scale-up parameters.
Elizabeth A. Thomson, News Office
Planet Earth may have 'tilted' to keep its balance
Imagine a shift in the Earth so profound that it could force our entire planet to spin on its side after a few million years, tilting it so far that Alaska would sit at the equator. Princeton scientists have now provided the first compelling evidence that this kind of major shift may have happened in our world's distant past.
By analyzing the magnetic composition of ancient sediments found in the remote Norwegian archipelago of Svalbard, Princeton University's Adam Maloof has lent credence to a 140-year-old theory regarding the way the Earth might restore its own balance if an unequal distribution of weight ever developed in its interior or on its surface.
The theory, known as true polar wander, postulates that if an object of sufficient weight --such as a supersized volcano -- ever formed far from the equator, the force of the planet's rotation would gradually pull the heavy object away from the axis the Earth spins around. If the volcanoes, land and other masses that exist within the spinning Earth ever became sufficiently imbalanced, the planet would tilt and rotate itself until this extra weight was relocated to a point along the equator.
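The mechanics behind this can be sketched with a standard rigid-body argument: a spinning planet is most stable rotating about its axis of maximum moment of inertia, so a large enough off-equator mass anomaly tilts that axis and the surface reorients until the anomaly sits on the equator. The Python sketch below adds a hypothetical surface point mass to approximate textbook values of the Earth's moments of inertia and reports how far the maximum-inertia axis moves; the anomaly masses are invented for illustration, and the model ignores the mantle's ability to deform.

import numpy as np

# Rough textbook values for the Earth (assumptions, not from the article):
C = 8.034e37   # kg m^2, polar moment of inertia
A = 8.008e37   # kg m^2, equatorial moment of inertia (C - A ~ 2.6e35)
R = 6.371e6    # m, mean radius

def max_inertia_tilt_deg(anomaly_mass_kg, colatitude_deg):
    """Angle between the original spin axis (z) and the axis of maximum
    inertia after adding a surface point-mass anomaly at the given
    colatitude. The planet tends to reorient toward the new axis."""
    th = np.radians(colatitude_deg)
    r = R * np.array([np.sin(th), 0.0, np.cos(th)])
    inertia = np.diag([A, A, C]) + anomaly_mass_kg * (r @ r * np.eye(3) - np.outer(r, r))
    eigvals, eigvecs = np.linalg.eigh(inertia)
    axis = eigvecs[:, np.argmax(eigvals)]      # max-inertia principal axis
    return float(np.degrees(np.arccos(abs(axis[2]))))

# Hypothetical anomalies at 45 degrees from the pole: a modest load barely
# moves the axis, while one whose m*R^2 rivals the equatorial bulge (C - A)
# tilts it by tens of degrees.
for mass in (1e18, 1e21, 1e22):
    print(f"{mass:.0e} kg anomaly -> tilt {max_inertia_tilt_deg(mass, 45.0):.2f} deg")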
"The sediments we have recovered from Norway offer the first good evidence that a true polar wander event happened about 800 million years ago," said Maloof, an assistant professor of geosciences. "If we can find good corroborating evidence from other parts of the world as well, we will have a very good idea that our planet is capable of this sort of dramatic change."
Maloof's team, which includes researchers from Harvard University, the California Institute of Technology and the Massachusetts Institute of Technology as well as Princeton, will publish their findings in the Geological Society of America Bulletin on Friday, Aug. 25.
True polar wander is different from the more familiar idea of "continental drift," which is the inchwise movement of individual continents relative to one another across the Earth's surface. Polar wander can tip the entire planet on its side at a rate of perhaps several meters per year, about 10 to 100 times as fast as the continents drift due to plate tectonics. Though the poles themselves would still point in the same direction with respect to the solar system, the process could conceivably shift entire continents from the tropics to the Arctic, or vice versa, within a relatively brief geological time span.
While the idea that the continents are slowly moving in relation to one another is a well-known concept, the less familiar theory of true polar wander has been around since the mid-19th century, several decades before continental drift was ever proposed. But when the continents were proven to be moving under the influence of plate tectonics in the 1960s, it explained so many dynamic processes in the Earth's surface so well that true polar wander became an obscure subject.
"Planetary scientists still talk about polar wander for other worlds, such as Mars, where a massive buildup of volcanic rock called Tharsis sits at the Martian equator," Maloof said. "But because Earth's surface is constantly changing as the continents move and ocean crustal plates slide over and under one another, it's more difficult to find evidence of our planet twisting hundreds of millions of years ago, as Mars likely did while it was still geologically active."
However, the sediments that the team studied in Svalbard from 1999 to 2005 may have provided just such long-sought evidence. It is well known that when rock particles sink to the ocean floor to form layers of new sediment, tiny magnetic grains within the particles align themselves with the Earth's magnetic field lines. Once this rock hardens, it becomes a reliable record of the direction the Earth's magnetic field was pointing at the time of the rock's formation. So, if a rock has been spun around by a dramatic geological event, its recorded magnetization will have an apparently anomalous orientation that geophysicists like those on Maloof's team seek to explain.
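One standard calculation applied to such records is the geocentric axial dipole relation, which converts a rock's measured magnetic inclination into the latitude at which it formed; a paleolatitude far from the expected one is the sort of anomaly being described. The sketch below is a textbook conversion, not the team's analysis, and the example inclinations are invented.

import math

def paleolatitude_deg(inclination_deg):
    """Latitude at which a rock acquired its magnetization, from its recorded
    magnetic inclination, via the geocentric axial dipole relation
    tan(inclination) = 2 * tan(latitude)."""
    return math.degrees(math.atan(math.tan(math.radians(inclination_deg)) / 2.0))

# Invented example: a sediment expected to have formed near 60 degrees N
# (inclination about 74 degrees) instead records a shallow 20-degree
# inclination, implying it formed near 10 degrees N; a mismatch of that size
# is the kind of anomaly that demands an explanation such as plate rotation
# or true polar wander.
for inclination in (74.0, 20.0):
    print(f"inclination {inclination:5.1f} deg -> paleolatitude {paleolatitude_deg(inclination):5.1f} deg")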
"We found just such anomalies in the Svalbard sediments," Maloof said. "We made every effort to find another reason for the anomalies, such as a rapid rotation of the individual crustal plate the islands rest upon, but none of the alternatives makes as much sense as a true polar wander event when taken in the context of geochemical and sea level data from the same rocks."
The findings, he said, could possibly explain odd changes in ocean chemistry that occurred about 800 million years ago. Other similar changes in the ocean have cropped up in ancient times, Maloof said, but at these other times scientists know that an ice age was to blame.
"Scientists have found no evidence for an ice age occurring 800 million years ago, and the change in the ocean at this juncture remains one of the great mysteries in the ancient history of our planet," he said. "But if all the continents were suddenly flipped around and their rivers began carrying water and nutrients into the tropics instead of the Arctic, for example, it could produce the mysterious geochemical changes science has been trying to explain."
Because the team obtained all its data from the islands of Svalbard, Maloof said their next priority would be to seek corroborating evidence within sediments of similar age from elsewhere on the planet. This is difficult, Maloof said, because most 800-million-year-old rocks have long since disappeared. Because the Earth's crustal plates slide under one another over time, they take most of geological history back into the planet's deep interior. However, Maloof said, a site his team has located in Australia looks promising.
"We cannot be certain of these findings until we find similar patterns in rock chemistry and magnetics on other continents," Maloof said. "Rocks of the same age are preserved in the Australian interior, so we'll be visiting the site over the next two years to look for additional evidence. If we find some, we'll be far more confident about this theory's validity."
Maloof said that true polar wander was most likely to occur when the Earth's landmasses were fused together to form a single supercontinent, something that has happened at least twice in the distant past. But he said we should not worry about the planet going through a major shift again any time soon.
"If a true polar wander event has occurred in our planet's history, it's likely been when the continents formed a single mass on one side of the Earth," he said. "We don't expect there to be another event in the foreseeable future, though. The Earth's surface is pretty well balanced today."
Contact: Chad Boutin
cboutin@princeton.edu
609-258-5729
Princeton University
Why are so many people dying on Everest?
Personal view: Why are so many people dying on Everest? BMJ Volume 333 p 452
Why are so many people dying on Mount Everest, asks doctor and climber Andrew Sutherland in this week's BMJ.
It used to be thought that it would be physiologically impossible to climb Mount Everest with or without oxygen. In 1953 Hillary and Tenzing proved that it was possible to reach the summit with oxygen and in 1978 Messner and Habeler demonstrated it was possible without oxygen.
Everest itself has not changed, and we now have a better understanding of acclimatisation, improved climbing equipment, and established routes, so it would seem logical that climbing Everest should have become an altogether less deadly activity.
However, this year the unofficial body count on Mount Everest has reached 15, the most since the disaster of 1996 when 16 people died, eight in one night following an unexpected storm.
The death rate on Mount Everest has not changed over the years, with about one death for every 10 successful ascents. Anyone who reaches the summit has about a 1 in 20 chance of not making it down again.
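The two figures are consistent if roughly half of all deaths on the mountain occur during the descent after a successful summit; that split is an assumption made here for illustration, not a statistic from the article, as the rough arithmetic below shows:

# Hypothetical arithmetic sketch; the half-on-descent split is an assumption,
# not a figure from the article.
summits = 1000                        # imagined number of successful ascents
deaths_total = summits / 10           # about 1 death per 10 successful ascents
deaths_on_descent = deaths_total / 2  # assume half of all deaths follow a summit
print(deaths_on_descent / summits)    # 0.05, i.e. about a 1 in 20 risk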
So why are there so many people dying on Mount Everest? And more importantly, can we reduce this number?
The main reasons people die while climbing Mount Everest are injuries and exhaustion. However, a large proportion of climbers also die from altitude-related illness, specifically from high altitude cerebral oedema (HACE) and high altitude pulmonary oedema (HAPE).
This year, the author was on the north side of Everest as the doctor on the Everestmax expedition (www.everestmax.com) and was shocked both by the amount of altitude-related illness and by the relative lack of knowledge among people attempting Everest.
He writes: "On our summit attempt we were able to help with HAPE at 7000 metres, but higher up the mountain we passed four bodies of climbers who had been less fortunate. The last body we encountered was of a Frenchman who had reached the summit four days earlier but was too exhausted to descend. His best friend had tried in vain to get him down the mountain, but they had descended only 50 metres in six hours and he had to abandon him."
"Some people believe that part of the reason for the increase in deaths is the number of inexperienced climbers, who pay large sums of money to ascend Everest," he says. "In my view, climbers are not climbing beyond their ability but instead beyond their altitude ability. Unfortunately it is difficult to get experience of what it is like climbing above Camp 3 (8300 metres) without climbing Everest. Climbers invariably do not know what their ability above 8300 metres is going to be like."
He suggests that climbers need to think less about 'the climb' and more about their health on the way up.
No matter what the affliction, whether it be HACE, HAPE, or just exhaustion, the result is invariably the same – the climber starts to climb more slowly, he explains. If you are too slow, something is wrong and your chances of not making it off the mountain are greatly increased. But with the summit in sight this advice is too often ignored.
When the author visited the French consulate in Kathmandu to confirm the Frenchman's death, the consul, not a climbing or an altitude expert, shook his head and said, "He didn't reach the summit until 12.30; that is a 14 hour climb – it is too long."
Contact: Emma Dickinson
edickinson@bmj.com
44-207-383-6529
BMJ-British Medical Journal
New lab technique churns out fungus' potential cancer fighter
For the first time, researchers have developed a way to synthesize a cancer-killing compound called rasfonin in enough quantity to learn how it works.
Derived from a fungus discovered clinging to the walls of a New Zealand cave, the chemical tricks certain cancer cells into suicide while leaving healthy cells untouched.
"In 2000, scientists in Japan discovered that this compound might have some tremendous potential as a prototype anticancer agent, but no one has been able to study or develop it because it's so hard to get enough of it from natural sources," says Robert K. Boeckman, professor of chemistry.
"You either grow the fungus that makes it, or you go through a complicated chemical synthesis process that still yields only a minute amount," he says. "Now, after five years of effort, we've worked out a process that lets researchers finally produce enough rasfonin to really start investigating how it functions, and how we might harness it to fight cancer."
In 2000, researchers from Chiba University in Japan and the University of Tokyo simultaneously discovered a compound in certain fungi that selectively destroyed cells that depend on a gene called ras--one of the first known cancer-causing genes. They had found rasfonin, a compound that seemed tailor-made to knock out ras-dependent cancers like pancreatic cancer.
After six years, however, rasfonin's secrets remain a mystery because researchers can't make enough of it to carry out tests.
To develop a new drug, organic chemists must produce the chemical in enough quantity to test it under many different circumstances and tease out its modus operandi. Until now, no practical method existed to generate useful amounts of rasfonin, aside from growing more fungus--a time-consuming and terribly inefficient approach. Boeckman, the Marshall D. Gates, Jr. Professor of Chemistry at the University of Rochester, has now revealed a process that produces 67 times more rasfonin than any previous method. For the first time, scientists can obtain enough rasfonin to conduct proper biological tests on it.
"At a guess, I'd say that rasfonin itself will not be the final compound that might come to market," says Boeckman. "But we need to figure out how it works, how it triggers the cancer cell to shut itself down. The key is to find exactly what buttons rasfonin is pushing, and then figure out if there's a way we can safely and more simply push those same buttons. But we couldn't do that until we have enough to test."
Even Boeckman's simplified process is notably complex, employing sophisticated organic reactions. Instead of the original method's 23 steps, Boeckman's has just 16--but finding them took five years of his team's hard work, skill and intuition.
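The 67-fold gain reported above presumably reflects improved individual steps as well as the shorter route; the article gives no per-step yields, but the general point that the overall yield of a linear synthesis is the product of its step yields can be sketched with purely hypothetical numbers:

# Hypothetical per-step yield chosen only for illustration; not a figure
# from Boeckman's paper.
avg_step_yield = 0.80
old_route = avg_step_yield ** 23      # roughly 0.6% overall yield over 23 steps
new_route = avg_step_yield ** 16      # roughly 2.8% overall yield over 16 steps
print(round(new_route / old_route, 1))  # about 4.8x more product per unit input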
Boeckman's paper, published in the Aug. 30 issue of the Journal of the American Chemical Society, outlines the sequence of steps by which Boeckman's group built up, pared back, or altered the three-dimensional and chemical structure of their compound until they had produced complete rasfonin. Diagrams of the complete process are available on the Web at pubs.acs.org.
"Very soon, researchers should be able to scale up this process rather easily to whatever volume they need," says Boeckman. "It may be a long road to a possible treatment, but at least we're now past the first hurdle."
Contact: Jonathan Sherwood
jonathan.sherwood@rochester.edu
585-273-4726
University of Rochester