Using X-ray lasers, researchers at Stockholm University have been able to map how water fluctuates between two different states when it is cooled. At -44°C these fluctuations reach a maximum, suggesting that water can exist as two distinct liquids.
Water, both common and necessary for life on Earth, behaves very strangely compared with other substances. Its density, specific heat, viscosity and compressibility respond to changes in pressure and temperature in a way completely opposite to the other liquids we know.
Most matter shrinks when it is cooled, which increases its density. We would therefore expect water to be densest at its freezing point. But a glass of ice water turns this expectation upside down: water at 0°C, surrounded by ice, should sink to the bottom of the glass, yet as we know ice cubes float. Strangely enough, liquid water is densest at 4°C, and it is this water that sits at the bottom, whether in a glass or in an ocean.
If you chill water below 4°C, it starts to expand again. If you continue to cool pure water (where the rate of crystallization is low) below 0°C, it keeps expanding, and the expansion even speeds up as it gets colder. Many more properties, such as compressibility and heat capacity, become increasingly strange as water is cooled. Now researchers at Stockholm University, with the help of ultra-short pulses from X-ray lasers in Japan and South Korea, have succeeded in determining that water reaches the peak of its strange behaviour at -44°C.
Water is unique in that it can exist in two liquid states that bond the water molecules together in different ways. The water fluctuates between these states as if it can’t make up its mind, and these fluctuations reach a maximum at -44°C. It is this ability to shift from one liquid state to another that gives water its unusual properties, and since the fluctuations increase upon cooling, the strangeness increases as well.
“What was special was that we were able to X-ray unimaginably fast before the ice froze and could observe how it fluctuated between the two states,” says Anders Nilsson, Professor of Chemical Physics at Stockholm University. “For decades there have been speculations and different theories to explain these remarkable properties and why they grow stronger as water becomes colder. Now we have found such a maximum, which means that there should also be a critical point at higher pressures.”
Another remarkable finding of the study is that the unusual properties differ between normal and heavy water, and are more enhanced for the lighter one. “The differences between the two isotopes, H2O and D2O, shown here demonstrate the importance of nuclear quantum effects,” says Kyung Hwan Kim, postdoc in Chemical Physics at Stockholm University. “The possibility to make new discoveries in a much-studied topic such as water is totally fascinating and a great inspiration for my further studies,” says Alexander Späh, PhD student in Chemical Physics at Stockholm University.
“It was a dream come true to be able to measure water under such low-temperature conditions without freezing,” says Harshad Pathak, postdoc in Chemical Physics at Stockholm University. “Many attempts all over the world have been made to look for this maximum.”
“There has been an intense debate about the origin of the strange properties of water for over a century, since the early work of Wilhelm Röntgen,” further explains Anders Nilsson. “Researchers studying the physics of water can now settle on the model that water has a critical point in the supercooled regime. The next stage is to find the location of the critical point in terms of pressure and temperature, a big challenge for the next few years.”
Being able to identify microbes in real time aboard the International Space Station, without having to send them back to Earth for identification first, would be revolutionary for the world of microbiology and space exploration. The Genes in Space-3 team turned that possibility into a reality this year, when it completed the first-ever sample-to-sequence process entirely aboard the space station.
The ability to identify microbes in space could aid in the ability to diagnose and treat astronaut ailments in real time, as well as assisting in the identification of DNA-based life on other planets. It could also benefit other experiments aboard the orbiting laboratory. Identifying microbes involves isolating the DNA of samples, then amplifying that DNA — making many copies of it — so that it can be sequenced, or identified.
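As a toy illustration of the final identification step, a sequenced read can be matched against known reference sequences. The sketch below is not the actual flight software or any real pipeline; the reference sequences, organism labels, and the read are invented for illustration. It scores each candidate organism by how many length-k subsequences (k-mers) it shares with the read:

```python
# Illustrative sketch of sequence-based identification (not NASA's pipeline).
# All sequences and labels below are made-up examples.

REFERENCES = {
    "Staphylococcus (example)": "ATGGCGTACGTTAGCTAGGATCCGATCG",
    "Bacillus (example)":       "TTGACCGGTAACGTTAGGCCATCGATTA",
}

def identify(read, references, k=8):
    """Score each reference by the number of k-mers it shares with the read."""
    read_kmers = {read[i:i + k] for i in range(len(read) - k + 1)}
    scores = {}
    for name, seq in references.items():
        ref_kmers = {seq[i:i + k] for i in range(len(seq) - k + 1)}
        scores[name] = len(read_kmers & ref_kmers)
    # The best-scoring reference is the putative identification.
    return max(scores, key=scores.get), scores

read = "GCGTACGTTAGCTAGGATCC"  # a short fragment of the first reference
best, scores = identify(read, REFERENCES)
print(best)
```

Real identification tools work on the same matching principle but against databases of millions of sequences, with statistics to handle sequencing errors.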
The investigation was broken into two parts: the collection of the microbial samples and amplification by Polymerase Chain Reaction (PCR), then sequencing and identification of the microbes. NASA astronaut Peggy Whitson conducted the experiment aboard the orbiting laboratory, with NASA microbiologist and the project’s Principal Investigator Sarah Wallace and her team watching and guiding her from Houston.
As part of regular microbial monitoring, petri plates were touched to various surfaces of the space station. Working within the Microgravity Science Glovebox (MSG) about a week later, Whitson transferred cells from growing bacterial colonies on those plates into miniature test tubes, something that had never been done before in space.
Once the cells were successfully collected, it was time to isolate the DNA and prepare it for sequencing, enabling the identification of the unknown organisms — another first for space microbiology. An historic weather event, though, threatened the ground team’s ability to guide the progress of the experiment.
“We started hearing the reports of Hurricane Harvey the week in between Peggy performing the first part of collecting the sample and gearing up for the actual sequencing,” said Wallace.
When JSC became inaccessible due to dangerous road conditions and rising flood waters, the team at Marshall Space Flight Center’s Payload Operations Integration Center in Huntsville, Alabama, who serve as “Mission Control” for all station research, worked to connect Wallace to Whitson using Wallace’s personal cell phone.
With a hurricane wreaking havoc outside, Wallace and Whitson set out to make history. Wallace offered support to Whitson, a biochemist, as she used the MinION device to sequence the amplified DNA. The data were downlinked to the team in Houston for analysis and identification.
“Once we actually got the data on the ground we were able to turn it around and start analyzing it,” said Aaron Burton, NASA biochemist and the project’s co-investigator. “You get all these squiggle plots and you have to turn that into As, Gs, Cs and Ts.”
Those As, Gs, Cs and Ts are Adenine, Guanine, Cytosine and Thymine — the four bases that make up each strand of DNA and can tell you what organism the strand of DNA came from.
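Turning “squiggle plots” into bases is called basecalling. Real nanopore basecalling uses neural networks over raw current traces; the toy sketch below only illustrates the idea, assigning each (invented) current reading to the nearest of four made-up reference levels:

```python
# A toy illustration of basecalling (far simpler than real nanopore
# basecalling, which uses machine learning on raw current traces).
# The level values here are invented for this sketch.

LEVELS = {"A": 1.0, "C": 2.0, "G": 3.0, "T": 4.0}

def basecall(squiggle):
    """Map each current reading to the base with the nearest reference level."""
    calls = []
    for reading in squiggle:
        base = min(LEVELS, key=lambda b: abs(LEVELS[b] - reading))
        calls.append(base)
    return "".join(calls)

print(basecall([1.1, 2.9, 3.8, 2.2, 0.9]))  # → "AGTCA"
```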
“Right away, we saw one microorganism pop up, and then a second one, and they were things that we find all the time on the space station,” said Wallace. “The validation of these results would be when we got the sample back to test on Earth.”
Soon after, the samples returned to Earth, along with Whitson, aboard the Soyuz spacecraft. Biochemical and sequencing tests were completed in ground labs to confirm the findings from the space station. They ran tests multiple times to confirm accuracy. Each time, the results were exactly the same on the ground as in orbit.
“We did it. Everything worked perfectly,” said Sarah Stahl, microbiologist.
Developed in partnership by NASA’s Johnson Space Center and Boeing, this National Lab sponsored investigation is managed by the Center for the Advancement of Science in Space.
Genes in Space-1 marked the first time the PCR was used in space to amplify DNA with the miniPCR thermal cycler, followed shortly after by Biomolecule Sequencer, which used the MinION device to sequence DNA. Genes in Space-3 married these two investigations to create a full microbial identification process in microgravity.
“It was a natural collaboration to put these two pieces of technology together because individually, they’re both great, but together they enable extremely powerful molecular biology applications,” said Wallace.
Many animals have evolved to stand out. Bright colours are easy to spot, but they warn predators off by signalling toxicity or foul taste.
Yet if every individual predator has to eat colourful prey to learn this unappetising lesson, it’s a puzzle how conspicuous colours had the chance to evolve as a defensive strategy.
Now, a new study using the great tit species as a “model predator” has shown that if one bird observes another being repulsed by a new type of prey, then both birds learn the lesson to stay away.
By filming a great tit having a terrible dining experience with conspicuous prey, then showing it on a television to other tits before tracking their meal selection, researchers found that birds acquired a better idea of which prey to avoid: those that stand out.
The team behind the study, published in the journal Nature Ecology & Evolution, say the ability of great tits to learn bad food choices through observing others is an example of “social transmission.”
The scientists scaled up data from their experiments through mathematical modelling to reveal a tipping point: where social transmission has occurred sufficiently in a predator species for its potential prey to stand a better chance with bright colours over camouflage.
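The intuition behind such a tipping point can be sketched with a toy model (this is not the paper’s actual mathematics; the parameters are illustrative assumptions). Without social transmission, every predator must sacrifice one conspicuous prey item to learn; with it, a single attack also informs the onlookers, so far fewer prey are lost before the whole predator population avoids them:

```python
# Toy model of social transmission among predators (illustrative only,
# not the study's equations). Each naive predator must attack once to
# learn; informed predators avoid the prey. With social transmission,
# each attack also teaches some number of observers.

def attacks_until_all_informed(n_predators, observers_per_attack):
    """Number of prey lost before every predator avoids them."""
    informed, attacks = 0, 0
    while informed < n_predators:
        attacks += 1                           # one prey item sacrificed
        informed += 1 + observers_per_attack   # attacker plus onlookers learn
        informed = min(informed, n_predators)
    return attacks

for observers in (0, 1, 4, 9):
    print(observers, attacks_until_all_informed(100, observers))
```

As the number of observers per attack rises, the per-capita cost of being conspicuous falls sharply, which is the qualitative effect the researchers’ modelling captures.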
“Our study demonstrates that the social behaviour of predators needs to be considered to understand the evolution of their prey,” said lead author Dr Rose Thorogood, from the University of Cambridge’s Department of Zoology.
“Without social transmission taking place in predator species such as great tits, it becomes extremely difficult for conspicuously coloured prey to outlast and outcompete alternative prey, even if they are distasteful or toxic.
“There is mounting evidence that learning by observing others occurs throughout the animal kingdom. Species ranging from fruit flies to trout can learn about food using social transmission.
“We suspect our findings apply over a wide range of predators and prey. Social information may have evolutionary consequences right across ecological communities.”
Thorogood (also based at the Helsinki Institute of Life Science) and colleagues from the University of Jyväskylä and University of Zurich captured wild great tits in the Finnish winter. At Konnevesi Research Station, they trained the birds to open white paper packages with pieces of almond inside as artificial prey.
The birds were given access to aviaries covered in white paper dotted with small black crosses. These crosses were also marked on some of the paper packages: the camouflaged prey.
One bird was filmed unwrapping a package stamped with a square instead of a cross: the conspicuous prey. This time the contents were unpalatable — an almond soaked in bitter-tasting fluid.
The bird’s reaction was played on a TV in front of some great tits but not others (a control group). When foraging in the cross-covered aviaries containing both cross and square packages, the birds exposed to the video were quicker to select their first item, and 32% less likely to choose the ‘conspicuous’ square prey.
“Just as we might learn to avoid certain foods by seeing a facial expression of disgust, observing another individual headshake and wipe its beak encouraged the great tits to avoid that type of prey,” said Thorogood.
“By modelling the social spread of information from our experimental data, we worked out that predator avoidance of more vividly conspicuous species would become enough for them to survive, spread, and evolve.”
Great tits — a close relation of North America’s chickadee — make a good study species as they are “generalist insectivores” that forage in flocks, and are known to spread other forms of information through observation.
Famously, species of tit learned how to pierce milk bottle lids and siphon the cream during the middle of last century — a phenomenon that spread rapidly through flocks across the UK.
Something great tits don’t eat, however, is a seven-spotted ladybird. “One of the most common ladybird species is bright red, and goes untouched by great tits. Other insects that are camouflaged, such as the brown larch ladybird or green winter moth caterpillar, are fed on by great tits and their young,” said Thorogood.
“The seven-spotted ladybird is so easy to see that if every predator had to eat one before they discovered its foul taste, it would have struggled to survive and reproduce.
“We think it may be the social information of their unpalatable nature spreading through predator species such as great tits that makes the paradox of conspicuous insects such as seven-spotted ladybirds possible.”
Direct electrical stimulation of the human amygdala, a region of the brain known to regulate memory and emotional behaviors, can enhance next-day recognition of images when applied immediately after the images are viewed, neuroscientists have found.
The findings are the first example of electrical brain stimulation in humans giving a time-specific boost to memory lasting more than a few minutes, the scientists say. Patients’ recognition only increased for stimulated images, and not for control images presented in between the stimulated images. The experiments were conducted at Emory University Hospital in 14 epilepsy patients undergoing intracranial monitoring, an invasive procedure for the diagnosis of seizure origin, during which electrodes are introduced into the brain.
“We were able to tag specific memories to be better remembered later,” says co-first author Cory Inman, PhD, postdoctoral fellow in the Department of Neurosurgery. “One day, this could be incorporated into a device aimed at helping patients with severe memory impairments, like those with traumatic brain injuries or mild cognitive impairment associated with various neurodegenerative diseases. However, right now, this is more of a scientific finding than a therapeutic one.”
“We see this as a platform for the further study of memory enhancement,” says senior author Jon T. Willie, MD, PhD, assistant professor of neurosurgery and neurology at Emory University School of Medicine. “The time specificity enables a lot of other experiments, since we know that there’s not a carry-over effect from one image to the next.”
Deep brain stimulation (DBS), with current delivered continuously by an implanted device, is an established clinical method for the treatment of movement disorders such as Parkinson’s disease, and is being tested for psychiatric conditions such as depression. In contrast with DBS’s invasiveness, researchers elsewhere have experimented with non-invasive electrical stimulation as an approach for enhancing memory or cognition, with several rounds of stimulation applied while learning.
“The advantage of DBS is that it can selectively modulate a specific brain circuit without broad off-target effects,” says Willie, who performed surgeries on patients in the study.
The amygdala’s key roles in emotional responses and fear-associated learning have been well-studied. So the Emory scientists made sure that amygdala stimulation at low levels of current (0.5 milliamps) did not result in emotional responses, an elevated heart rate, or other signs of arousal. Study participants reported that they did not notice the stimulation at any point in the study.
The researchers avoided direct stimulation of the hippocampus, figuring that would be too close to the machinery of memory itself, like introducing a live wire into a computer’s motherboard.
“We chose the amygdala because of decades of research in rodents, showing that it interacts with several other memory structures in a modulatory role,” says co-first author Joseph Manns, PhD, associate professor of psychology. “We wanted to stimulate its endogenous function, which we think is to signal salience — something standing out — so that specific experiences are remembered in the future.”
Manns and his colleagues had already shown that electrical amygdala stimulation increases rodents’ ability to recognize images later. The human experiments were designed to closely resemble how his lab’s tests with rodents were set up.
Study participants first viewed 160 neutral objects (emotional faces were excluded, for example) and were asked to judge whether each object belonged indoors or outdoors. For half of the images, participants received stimulation for one second after the image disappeared from the screen. They were quizzed on half of the stimulated images and half of the unstimulated images immediately, and on the remaining halves the next day. Forty new images were used as decoys. The effect of stimulation on immediate recognition was not statistically strong. However, the next day, the effects on stimulated images were clear.
“This makes sense because the amygdala is thought to be important for memory consolidation — making sure important events stick over time,” Manns says.
Eleven of the 14 participants (79 percent) showed an improvement on the overnight memory tests, while the remaining three showed no improvement or an impairment. Compared with no stimulation, the increase in the number of images accurately recognized ranged from around 8 percent up to several hundred percent.
Some of the patients had impaired memory as a result of their epilepsy; the patients in whom a greater effect was seen generally had poorer baseline memory performance. For instance, one patient essentially forgot all of the control images, but maintained good memory for the stimulated images. However, a substantial effect was also observed in people who had an average memory to start with.
“The average was like having a ‘B’-level memory performance move up to an ‘A’,” Willie says.
The researchers were also able to see signs of the previous day’s stimulation, in terms of electrical interactions within the brain, when those images were viewed again. The Emory team is now fine-tuning amygdala stimulation parameters, so that memory enhancement might be optimized. They are beginning to look at other types of memory tests, such as spatial or verbal tasks, as well as tasks that more closely mimic memory for real-world events.
“We want to understand the brain’s endogenous mechanisms for memory modulation better before moving ahead with a device,” Manns says.
Somewhere in our galaxy, an exoplanet is probably orbiting a star that’s colder than our sun, but instead of freezing solid, the planet might be cozy warm thanks to a greenhouse effect caused by methane in its atmosphere.
NASA astrobiologists from the Georgia Institute of Technology have developed a comprehensive new model that shows how planetary chemistry could make that happen. The model was based on a likely scenario on Earth three billion years ago and was built around the planet’s possible geological and biological chemistry.
The sun produced a quarter less light and heat then, but Earth remained temperate, and methane may have saved our planet from an eon-long deep-freeze, scientists hypothesize. Had it not, we and most other complex life probably wouldn’t be here.
The new model combined multiple microbial metabolic processes with volcanic, oceanic and atmospheric activities, which may make it the most comprehensive of its kind to date. But while studying Earth’s distant past, the Georgia Tech researchers aimed their model light-years away, wanting it to someday help interpret conditions on recently discovered exoplanets.
The researchers set the model’s parameters broadly so that they could apply not only to our own planet but potentially also to its siblings with their varying sizes, geologies, and lifeforms.
Earth and its siblings
“We really had an eye to future use with exoplanets for a reason,” said Chris Reinhard, the study’s principal investigator and an assistant professor in Georgia Tech’s School of Earth and Atmospheric Sciences. “It’s possible that the atmospheric methane models that we are exploring for the early Earth represent conditions common to biospheres throughout our galaxy because they don’t require such an advanced stage of evolution like we have here on Earth now.”
The research was supported by the NASA Postdoctoral Program, the Japan Society for the Promotion of Science, the NASA Astrobiology Institute and the Alfred P. Sloan Foundation.
Previous models have examined the mix of atmospheric gases needed to keep Earth warm in spite of the sun’s former faintness, or studied isolated microbial metabolisms that could have made the needed methane. “In isolation, each metabolism hasn’t made for productive models that accounted well for that much methane,” Reinhard said.
The Georgia Tech researchers synergized those isolated microbial metabolisms, including ancient photosynthesis, with geological chemistry to create a model reflective of the complexity of an entire living planet. And the model’s methane production ballooned.
“It’s important to think about the mechanisms controlling the atmospheric levels of greenhouse gases in the framework of all biogeochemical cycles in the ocean and atmosphere,” said first author Ozaki, a postdoctoral assistant.
Carl Sagan and the faint Sun
The Georgia Tech model strengthens a leading hypothesis that attempts to explain a mystery called the “faint young Sun paradox” pointed out by iconic late astronomer Carl Sagan and his Cornell University colleague George Mullen in 1972.
Astronomers noticed long ago that stars burn brighter as they mature and more weakly in their youth. They calculated that about two billion years ago, our sun must have shone about 25 percent fainter than it does today.
That would have been too cold for any liquid water to exist on Earth, but paradoxically, strong evidence says that liquid water did exist. “Based on the observation of the geological record, we know that there must have been liquid water,” Reinhard said, “and in some cases, we know that temperatures were similar to how they are today, if not a little warmer.”
Sagan and Mullen postulated that Earth’s atmosphere must have created a greenhouse effect that saved it. Back then, they suspected ammonia was at work, but chemically, that idea proved less feasible.
“Methane has taken a lead role in this hypothesis,” Reinhard said. “When oxygen and methane enter the atmosphere, they chemically cancel each other out over time in a complex chain of chemical reactions. Because there was extremely little oxygen in the air back then, it would have allowed methane to build up to much higher levels than it can today.”
Iron, and rust photosynthesis
At the core of the model are two different types of photosynthesis. But three billion years ago, the dominant type of photosynthesis we know today that pumps out oxygen may not have even existed yet.
Instead, two other very primitive bacterial photosynthetic processes likely were essential to Earth’s ancient biosphere. One transformed iron in the ocean into rust, and the other photosynthesized hydrogen into formaldehyde.
“The model relied on lots of volcanic activity spewing out hydrogen,” Ozaki said. Other bacteria fermented the formaldehyde, and other bacteria, still, turned the fermented product into methane.
The two photosynthetic processes served as the watch spring of the model’s clockwork, which pulled in 359 previously established biogeochemical reactions spanning land, sea and air.
3,000,000 runs and raging methane
The model was not the type of simulation that produces a video animation of Earth’s ancient biogeochemistry. Instead, the model mathematically analyzed the processes, and the output was numbers and graphs.
Ozaki ran the model more than 3 million times, varying parameters, and found that when the model contained both forms of photosynthesis operating in tandem, 24 percent of the runs produced enough methane to create the balance needed in the atmosphere to maintain the greenhouse effect and keep ancient Earth, or possibly an exoplanet, temperate.
“That translates into about a 24 percent probability that this model would produce a stable, warm climate on the ancient Earth with a faint sun or on an Earth-like exoplanet around a dimmer star,” Reinhard said. “Other models that looked at these photosynthetic metabolisms in isolation have much lower probabilities of producing enough methane to keep the climate warm.”
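The statistical approach here is essentially Monte Carlo: run the model many times with randomly varied parameters and report the fraction of runs that clear a threshold. The schematic sketch below is nothing like the real model, which couples 359 biogeochemical reactions; it simply samples two invented “productivity” parameters for the two photosynthetic pathways and counts how often their combined methane flux exceeds an arbitrary warm-climate threshold:

```python
# Schematic Monte Carlo sketch (illustrative only; the real model couples
# 359 biogeochemical reactions). Parameters and threshold are invented.
import random

random.seed(42)  # make the run reproducible

def run_once():
    iron_photosynthesis = random.uniform(0.0, 1.0)      # rust-forming pathway
    hydrogen_photosynthesis = random.uniform(0.0, 1.0)  # formaldehyde pathway
    methane_flux = iron_photosynthesis * hydrogen_photosynthesis
    return methane_flux > 0.25  # arbitrary "warm climate" threshold

runs = 100_000
warm = sum(run_once() for _ in range(runs))
print(f"{warm / runs:.1%} of runs kept the climate warm")
```

The reported number is then a probability estimate of the kind Reinhard quotes: the fraction of parameter combinations that yield a stable, warm climate.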
“We’re confident that this rather unique statistical approach means that you can take the basic insights of this new model to the bank,” he said.
Other explanations for the “faint young Sun paradox” have been more cataclysmic and perhaps less regular in their dynamics. They include ideas about routine asteroid strikes stirring up seismic activity thus resulting in more methane production, or about the sun consistently firing coronal mass ejections at Earth, heating it up.
Luke Skywalker’s bionic hand is a step closer to reality for amputees in this galaxy. Researchers at the Georgia Institute of Technology have created an ultrasonic sensor that allows amputees to control each of their prosthetic fingers individually. It provides fine motor hand gestures that aren’t possible with current commercially available devices.
The first amputee to use it, a musician who lost part of his right arm five years ago, is now able to play the piano for the first time since his accident. He can even strum the Star Wars theme song.
“Our prosthetic arm is powered by ultrasound signals,” said Gil Weinberg, the Georgia Tech College of Design professor who leads the project. “By using this new technology, the arm can detect which fingers an amputee wants to move, even if they don’t have fingers.”
Jason Barnes is the amputee working with Weinberg. The 28-year-old was electrocuted during a work accident in 2012, forcing doctors to amputate his right arm just below the elbow. Barnes no longer has his hand and most of his forearm but does have the muscles in his residual limb that control his fingers.
Barnes’ everyday prosthesis is similar to the majority of devices on the market. It’s controlled by electromyogram (EMG) sensors attached to his muscles. He switches the arm into various modes by pressing buttons on the arm. Each mode has two programmed moves, which are controlled by him either flexing or contracting his forearm muscles. For example, flexing allows his index finger and thumb to clamp together; contracting closes his fist.
“EMG sensors aren’t very accurate,” said Weinberg, director of Georgia Tech’s Center for Music Technology. “They can detect a muscle movement, but the signal is too noisy to infer which finger the person wants to move. We tried to improve the pattern detection from EMG for Jason but couldn’t get finger-by-finger control.”
But then the team looked around the lab and saw an ultrasound machine. They partnered with two other Georgia Tech professors, Minoru Shinohara (College of Sciences) and Levent Degertekin (Woodruff School of Mechanical Engineering), and attached an ultrasound probe to the arm. The same kind of probe doctors use to see babies in the womb could watch how Barnes’ muscles moved.
“That’s when we had a eureka moment,” said Weinberg.
When Barnes tries to move his amputated ring finger, the muscle movements differ from those seen when he tries to move any other digit. Weinberg and the team fed each unique movement into an algorithm that can quickly determine which finger Barnes wants to move. The ultrasound signals and machine learning can detect continuous and simultaneous movements of each finger, as well as how much force he intends to use.
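The core idea is a classifier: feature vectors extracted from the ultrasound images are mapped to the finger the user intends to move. The team’s actual algorithm is not described in detail here, so the sketch below uses a simple nearest-centroid rule on invented toy data, purely to illustrate the mapping from muscle-movement features to a finger label:

```python
# Illustrative nearest-centroid classifier for finger intent (not the
# team's actual algorithm). Training vectors below are invented toy data
# standing in for features extracted from ultrasound images.

def centroid(vectors):
    """Component-wise mean of a list of equal-length feature vectors."""
    return [sum(col) / len(vectors) for col in zip(*vectors)]

def train(examples):
    """examples: {finger_name: [feature_vector, ...]} -> centroid per finger."""
    return {finger: centroid(vecs) for finger, vecs in examples.items()}

def classify(model, features):
    """Return the finger whose centroid is closest to the new features."""
    def dist(c):
        return sum((a - b) ** 2 for a, b in zip(features, c))
    return min(model, key=lambda finger: dist(model[finger]))

training = {
    "index": [[0.9, 0.1], [1.0, 0.2]],
    "ring":  [[0.1, 0.9], [0.2, 1.0]],
}
model = train(training)
print(classify(model, [0.85, 0.15]))  # closest to the "index" centroid
```

A real system would add continuous force estimation and run in real time on streaming image features, but the decision step follows this pattern.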
“It’s completely mind-blowing,” said Barnes. “This new arm allows me to do whatever grip I want, on the fly, without changing modes or pressing a button. I never thought we’d be able to do this.”
This is the second device Weinberg’s lab has built for Barnes. His first love is the drums, so the team fitted him with a prosthetic arm with two drumsticks in 2014. He controlled one of the sticks. The other moved on its own by listening to the music in the room and improvising.
The device gave him the chance to drum again. The robotic stick could play faster than any drummer in the world. Worldwide attention has sent Barnes and Weinberg’s robots around the globe for concerts across four continents. They’ve also played at the Kennedy Center in Washington, D.C. and Moogfest.
That success pushed Weinberg to take the next step and create something that gives Barnes the dexterity he’s lacked since 2012.
“If this type of arm can work on music, something as subtle and expressive as playing the piano, this technology can also be used for many other types of fine motor activities such as bathing, grooming and feeding,” said Weinberg. “I also envision able-bodied persons being able to remotely control robotic arms and hands by simply moving their fingers.”
University of Maryland School of Medicine (UMSOM) researchers have found a two-way link between traumatic brain injury (TBI) and intestinal changes. These interactions may contribute to increased infections in these patients, and may also worsen chronic brain damage.
This is the first study to find that TBI in mice can trigger delayed, long-term changes in the colon and that subsequent bacterial infections in the gastrointestinal system can increase posttraumatic brain inflammation and associated tissue loss. The findings were published recently in the journal Brain, Behavior, and Immunity.
“These results indicate strong two-way interactions between the brain and the gut that may help explain the increased incidence of systemic infections after brain trauma and allow new treatment approaches,” said the lead researcher, Alan Faden, MD, the David S. Brown Professor in Trauma in the Departments of Anesthesiology, Anatomy & Neurobiology, Psychiatry, Neurology, and Neurosurgery at UMSOM, and director of the UMSOM Shock, Trauma and Anesthesiology Research Center.
Researchers have known for years that TBI has significant effects on the gastrointestinal tract, but until now, scientists have not recognized that brain trauma can make the colon more permeable, potentially allowing harmful microbes to migrate from the intestine to other areas of the body, causing infection. People are 12 times more likely to die from blood poisoning after TBI, which is often caused by bacteria, and 2.5 times more likely to die of a digestive system problem, compared with those without such injury.
In this study, the researchers examined mice that received an experimental TBI. They found that the intestinal wall of the colon became more permeable after trauma, changes that were sustained over the following month.
It is not clear how TBI causes these gut changes. A key factor in the process may be enteric glial cells (EGCs), a class of cells that exist in the gut. These cells are similar to brain astroglial cells, and both types of glial cells are activated after TBI. In the brain, such activation is associated with inflammation that contributes to delayed tissue damage. Researchers don’t know whether activation of EGCs after TBI contributes to intestinal injury or is instead an attempt to compensate for the injury.
The researchers also focused on the two-way nature of the process: how gut dysfunction may worsen brain inflammation and tissue loss after TBI. They infected the mice with Citrobacter rodentium, a species of bacteria that is the rodent equivalent of E. coli, which infects humans. In mice with a TBI that were infected with these bacteria, brain inflammation worsened. Furthermore, in the hippocampus, a key region for memory, the mice that had TBI and were then infected lost more neurons than animals without infection.
This suggests that TBI may trigger a vicious cycle, in which brain injury causes gut dysfunction, which then has the potential to worsen the original brain injury. “These results really underscore the importance of bi-directional gut-brain communication on the long-term effects of TBI,” said Dr. Faden.