Exotic quantum particle observed in bilayer graphene

A team led by James Hone, Wang Fong-Jen Professor of Mechanical Engineering at Columbia Engineering, and Cory Dean, assistant professor of physics at Columbia University, has definitively observed an intensely studied anomaly in condensed matter physics, the even-denominator fractional quantum Hall (FQH) state, via transport measurements in bilayer graphene.

“Observing the 5/2 state in any system is a remarkable scientific opportunity, since it encompasses some of the most perplexing concepts in modern condensed matter physics, such as emergence, quasi-particle formation, quantization, and even superconductivity,” Dean says. “Our observation that, in bilayer graphene, the 5/2 state survives to much higher temperatures than previously thought possible not only allows us to study this phenomenon in new ways, but also shifts our view of the FQH state from being largely a scientific curiosity to now having great potential for real-world applications, particularly in quantum computing.”

First discovered in the 1980s in gallium arsenide (GaAs) heterostructures, the 5/2 fractional quantum Hall state remains the singular exception to the otherwise strict rule that fractional quantum Hall states can only exist with odd denominators. Soon after the discovery, theoretical work suggested that this state could represent an exotic type of superconductor, notable in part for the possibility that such a phase could enable a fundamentally new approach to quantum computation. However, confirmation of these theories has remained elusive, largely due to the fragile nature of the state; in GaAs it is observable only in the highest-quality samples, and even then only at millikelvin temperatures (as much as 10,000 times colder than the freezing point of water).

The Columbia team has now observed this same state in bilayer graphene, where it appears at much higher temperatures, reaching several kelvin. “While it’s still 100 times colder than the freezing point of water, seeing the even-denominator state at these temperatures opens the door to a whole new suite of experimental tools that previously were unthinkable,” says Dean. “After several decades of effort by researchers all over the world, we may finally be close to solving the mystery of the 5/2.”
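A quick back-of-the-envelope check of these temperature comparisons, using representative values (roughly 25 mK for GaAs dilution-refrigerator experiments and a few kelvin for bilayer graphene; these are illustrative numbers, not the paper's exact figures):

```python
# Rough sanity check of the "times colder than freezing water" comparisons.
# The two experimental temperatures below are assumed, representative values.
T_FREEZE = 273.15   # freezing point of water, in kelvin
T_GAAS = 0.025      # ~25 mK, typical millikelvin scale for GaAs experiments
T_BLG = 2.5         # "several kelvin" scale reported for bilayer graphene

print(f"GaAs: {T_FREEZE / T_GAAS:,.0f}x colder than freezing water")
print(f"Bilayer graphene: {T_FREEZE / T_BLG:,.0f}x colder than freezing water")
```

With these assumed values the ratios come out near 10,000 and 100, consistent with the factors quoted in the text.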

One of the outstanding problems in the field of modern condensed matter physics is understanding the phenomenon of “emergence,” the result of a large collection of quantum particles behaving in concert due to interactions between the particles and giving rise to new characteristics that are not a feature of the individual parts. For instance, in superconductors, a large number of electrons all collapse to a single quantum state, which can then propagate through a metal without any energy loss. The fractional quantum Hall effect is another state in which electrons collude with one another, in the presence of a magnetic field, resulting in quasiparticles with potentially exotic quantum properties.

Very difficult to predict theoretically, emergence often challenges our foundational understanding of how particles behave. For example, since any two electrons have the same charge, we think of electrons as objects that want to repel each other. However, in a superconducting metal, electrons unexpectedly pair up, forming a new object known as a Cooper pair. Individual electrons scatter when moving through a metal, giving rise to resistance, but spontaneously formed Cooper pairs behave collectively in such a way that they move through the material with no resistance at all.

“Think of trying to make your way through a crowd at a rock concert where everyone is dancing with a lot of energy and constantly bumping into you, compared to a ballroom dance floor where pairs of dancers are all moving in the same, carefully choreographed way, and it is easy to avoid each other,” says Dean. “Part of what makes the even-denominator fractional quantum Hall effect so fascinating is that its origin is believed to be very similar to that of a superconductor, but, instead of simply forming Cooper pairs, an entirely new kind of quantum particle emerges.”

According to quantum mechanics, elementary particles fall into two categories, Fermions and Bosons, and behave in very different ways. Two Fermions, such as electrons, cannot occupy the same state, which is why, for example, the electrons in atoms fill successive orbitals. Bosons, such as photons, or particles of light, can occupy the same state, allowing them to act coherently as in the light emission from a laser. When two identical particles are interchanged, the quantum mechanical wave-function describing their combined state is multiplied by a phase factor of 1 for Bosons, and -1 for Fermions.

Soon after the discovery of the fractional quantum Hall effect, it was suggested on theoretical grounds that the quasiparticles associated with this state behave neither as Bosons nor Fermions but instead as what is called an anyon: when anyon quasiparticles are interchanged, the phase factor is neither 1 nor -1 but is fractional. Despite several decades of effort, there still is no conclusive experimental proof confirming that these quasiparticles are anyons. The 5/2 state, thought to host a non-abelian anyon, is believed to be even more exotic. In theory, non-abelian anyons obey anyonic statistics as in other fractional quantum Hall states, but with the special feature that this phase cannot simply be undone by reversing the process. This inability to simply unwind the phase would make any information stored in the system uniquely stable, and is why many people believe the 5/2 could be a great candidate for quantum computation.
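These exchange rules can be written compactly. For two identical particles with joint wavefunction ψ(x₁, x₂), swapping them multiplies the wavefunction by a phase (a standard textbook result, sketched here rather than taken from the paper):

```latex
\psi(x_2, x_1) = e^{i\theta}\,\psi(x_1, x_2),
\qquad
e^{i\theta} =
\begin{cases}
+1 & \text{Bosons } (\theta = 0) \\
-1 & \text{Fermions } (\theta = \pi) \\
e^{i\theta},\ \theta \neq 0, \pi & \text{anyons}
\end{cases}
```

For non-abelian anyons, such as those conjectured for the 5/2 state, the phase factor generalizes to a unitary matrix acting on a set of degenerate states; successive exchanges then depend on their order and cannot simply be undone by reversing the process.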

“Demonstration of the predicted 5/2 statistics would represent a tremendous achievement,” says Dean. “In many regards, this would confirm that, by fabricating a material system with just the right thickness and just the right number of electrons, and then applying just the right magnetic fields, we could effectively engineer fundamentally new classes of particles, with properties that do not otherwise exist among known particles naturally found in the universe. We still have no conclusive evidence that the 5/2 state exhibits non-abelian properties, but our discovery of this state in bilayer graphene opens up exciting new opportunities to test these theories.”

Until now, all of those conditions have needed to be not only just right but also extreme. In conventional semiconductors, the even-denominator states are very difficult to isolate, and exist only in ultra-pure materials, at extremely low temperatures and high magnetic fields. While certain features of the state have been observable, devising experiments that can investigate the state without destroying it has been challenging.

“We needed a new platform,” says Hone. “With the successful isolation of graphene, these atomically thin layers of carbon atoms emerged as a promising platform for the study of electrons in 2D in general. One of the keys is that electrons in graphene interact even more strongly than in conventional 2D electron systems, theoretically making effects such as the even-denominator state even more robust. But while there have been predictions that bilayer graphene could host the long-sought even-denominator states, at higher temperatures than seen before, these predictions have not been realized due mostly to the difficulty of making graphene clean enough.”

The Columbia team built on many years of pioneering work to improve the quality of graphene devices, creating ultra-clean devices entirely from atomically flat 2D materials: bilayer graphene for the conducting channel, hexagonal boron nitride as a protective insulator, and graphite used for electrical connections and as a conductive gate to change the charge carrier density in the channel.

A crucial component of the research was having access to the high-magnetic-field facilities available at the National High Magnetic Field Laboratory in Tallahassee, Fla., a nationally funded user facility with which Hone and Dean have had extensive collaborations. They studied the electrical conduction through their devices under magnetic fields up to 34 Tesla, and achieved clear observation of the even-denominator states.

“By tilting the sample with respect to the magnetic field, we were able to provide new confirmation that this FQH state has many of the properties predicted by theory, such as being spin-polarized,” says Jia Li, the paper’s lead author and a postdoctoral researcher working with Dean and Hone. “We also discovered that in bilayer graphene, this state can be manipulated in ways that are not possible in conventional materials.”

The Columbia team’s result, which demonstrates measurement in transport — how electrons flow in the system — is a crucial step forward towards confirming the possible exotic origin of the even-denominator state. Their findings are reported contemporaneously with a similar report by a research group at the University of California, Santa Barbara. The UCSB study observed the even-denominator state by capacitance measurement, which probes the existence of an electrical gap associated with the onset of the state.

“For many decades now it has been thought that if the 5/2 state does indeed represent a non-abelian anyon, it could theoretically revolutionize efforts to build a quantum computer,” Dean observes. “In the past, however, the extreme conditions necessary to see the state at all, let alone use it for computation, were always a major concern of practicality. Our results in bilayer graphene suggest that this dream may now not actually be so far from reality.”

No clear evidence that most new cancer drugs extend or improve life

Most cancer drugs approved by the European Medicines Agency (EMA) between 2009 and 2013 entered the market without clear evidence that they extended survival or improved quality of life, the researchers found. Even where drugs did show survival gains over existing treatments, these were often marginal.

Many of the drugs were approved on the basis of indirect (‘surrogate’) measures that do not always reliably predict whether a patient will live longer or feel better, raising serious questions about the current standards of drug regulation.

The researchers, based at King’s College London and the London School of Economics, say: “When expensive drugs that lack clinically meaningful benefits are approved and paid for within publicly funded healthcare systems, individual patients can be harmed, important societal resources wasted, and the delivery of equitable and affordable care undermined.”

Of 68 cancer indications approved during this period, 57% (39) came onto the market on the basis of a surrogate endpoint and without evidence that they extended survival or improved the quality of patients’ lives.

After a median of 5 years on the market, only an additional 8 drug indications had shown survival or quality of life gains.

Thus, out of 68 cancer indications approved by the EMA, and with a median 5 years follow-up, only 35 (51%) had shown a survival or quality of life gain over existing treatments or placebo. For the remaining 33 (49%), uncertainty remains over whether the drugs extend survival or improve quality of life.

The researchers outline some study limitations which could have affected their results, but say their findings raise the possibility that regulatory evidence standards “are failing to incentivise drug development that best meets the needs of patients, clinicians, and healthcare systems.”

Taken together, these facts paint a sobering picture, says Vinay Prasad, Assistant Professor at Oregon Health & Science University, in a linked editorial.

He calls for “rigorous testing against the best standard of care in randomized trials powered to rule in or rule out a clinically meaningful difference in patient centered outcomes in a representative population” and says “the use of uncontrolled study designs or surrogate endpoints should be the exception not the rule.”

He adds: “The expense and toxicity of cancer drugs means we have an obligation to expose patients to treatment only when they can reasonably expect an improvement in survival or quality of life.” These findings suggest “we may be falling far short of this important benchmark.”

This study comes at a time when European governments are starting to seriously challenge the high cost of drugs, says Dr Deborah Cohen.

She points to examples of methodological problems with trials that EMA has either failed to identify or overlooked, including trial design, conduct, analysis and reporting.

“The fact that so many of the new drugs on the market lack good evidence that they improve patient outcomes puts governments in a difficult position when it comes to deciding which treatments to fund,” she writes. “But regulatory sanctioning of a comparator that lacks robust evidence of efficacy, means the cycle of weak evidence and uncertainty continues.”

In a patient commentary, Emma Robertson says: “It’s clear to me and thousands of other patients like me that our current research and development model has failed.”

Emma is leader of Just Treatment, a patient led campaign with no ties to the pharmaceutical industry, which is calling for a new system that rewards and promotes innovation, so that more effective and accessible cancer medicines are brought within reach.

More traits associated with your Neanderthal DNA

After humans and Neanderthals met many thousands of years ago, the two species began interbreeding. Although Neanderthals aren’t around anymore, about two percent of the DNA in non-African people living today comes from them. Recent studies have shown that some of those Neanderthal genes have contributed to human immunity and modern diseases. Now researchers have found that Neanderthal DNA also influences other traits, including skin tone, hair color, sleep patterns, mood, and even a person’s smoking status.

Inspired by an earlier study that found associations between Neanderthal DNA and disease risk, Janet Kelso at the Max Planck Institute for Evolutionary Anthropology in Germany says her team was interested in exploring connections between Neanderthal DNA and traits unrelated to disease. In other words, they wanted to uncover the “influence Neanderthal DNA might be having on ordinary variation in people today.”

Because Neanderthal alleles are relatively rare, the researchers needed data representing a really large number of people. They found what they were looking for in data representing more than 112,000 participants in the UK Biobank pilot study. The Biobank includes genetic data along with information on many traits related to physical appearance, diet, sun exposure, behavior, and disease.

Earlier studies had suggested that human genes involved in skin and hair biology were strongly influenced by Neanderthal DNA, Kelso says. But it hadn’t been clear how.

“We can now show that it is skin tone, and the ease with which one tans, as well as hair color that are affected,” Kelso says.

The researchers observed multiple different Neanderthal alleles contributing to skin and hair tones. Somewhat surprisingly, they found that some Neanderthal alleles are associated with lighter skin tones and others with darker skin tones. The same was true for hair color.

“These findings suggest that Neanderthals might have differed in their hair and skin tones, much as people now do,” adds Michael Dannemann, first author of the study.

Kelso notes that the traits influenced by Neanderthal DNA, including skin and hair pigmentation, mood, and sleeping patterns are all linked to sunlight exposure. When modern humans arrived in Eurasia about 100,000 years ago, Neanderthals had already lived there for thousands of years. They were likely well adapted to lower and more variable levels of ultraviolet radiation from the sun than the new human arrivals from Africa were accustomed to.

“Skin and hair color, circadian rhythms and mood are all influenced by light exposure,” the researchers wrote. “We speculate that their identification in our analysis suggests that sun exposure may have shaped Neanderthal phenotypes and that gene flow into modern humans continues to contribute to variation in these traits today.”

Kelso and her colleagues say they’ll continue to explore Neanderthals’ influence on modern-day traits as more data becomes available.

New light shed on how Earth and Mars were created

Analysing a mixture of Earth samples and meteorites, scientists from the University of Bristol have shed new light on the sequence of events that led to the creation of the planets Earth and Mars.

Planets grow by a process of accretion — a gradual accumulation of additional material — in which they collisionally combine with their neighbours.

This is often a chaotic process and material gets lost as well as gained.

Massive planetary bodies impacting at several kilometres per second generate substantial heat which, in turn, produces magma oceans and temporary atmospheres of vaporised rock.

Before planets get to approximately the size of Mars, gravitational attraction is too weak to hold onto this inclement silicate atmosphere.

Repeated loss of this vapour envelope during continued collisional growth causes the planet’s composition to change substantially.

Dr Remco Hin, from the University of Bristol’s School of Earth Sciences, led the research, which is published today in Nature.

He said: “We have provided evidence that such a sequence of events occurred in the formation of the Earth and Mars, using high precision measurements of their magnesium isotope compositions.

“Magnesium isotope ratios change as a result of silicate vapour loss, which preferentially contains the lighter isotopes. In this way, we estimated that more than 40 per cent of the Earth’s mass was lost during its construction.

“This cowboy building job, as one of my co-authors described it, was also responsible for creating the Earth’s unique composition.”
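The connection between vapour loss and isotope ratios can be sketched with a simple Rayleigh distillation model, in which the escaping vapour slightly prefers the lighter isotope, so the remaining rock grows isotopically heavier. The fractionation factor below is purely illustrative, not a value from the study:

```python
def rayleigh_delta(f_remaining, alpha):
    """Per-mil enrichment of the heavy isotope in the residue after
    Rayleigh distillation, relative to the starting composition.
    alpha < 1 means the escaping vapour prefers the light isotope."""
    return (f_remaining ** (alpha - 1) - 1) * 1000.0

# Illustrative only: ~40% mass loss (f = 0.6) with a tiny, hypothetical
# vapour/residue fractionation factor.
alpha = 0.9996
delta = rayleigh_delta(0.6, alpha)
print(f"residue is {delta:.3f} per mil heavier in the heavy isotope")
```

Even a minuscule per-event fractionation, repeated over large cumulative mass loss, yields a measurable isotopic shift, which is why high-precision magnesium isotope measurements can act as a tracer of this process.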

The research was carried out in an effort to resolve a decades-long debate in Earth and planetary sciences about the origin of the distinctive, volatile-poor compositions of planets.

Did this result from processes that acted in the mixture of gas and dust in the nebula of the earliest solar system, or is it a consequence of their violent growth?

Researchers analysed samples of the Earth together with meteorites from Mars and the asteroid Vesta, using a new technique to get higher quality (more accurate and more precise) measurements of magnesium isotope ratios than previously obtained.

The main findings are three-fold:

  • Earth, Mars and the asteroid Vesta have magnesium isotope ratios distinct from any plausible nebular starting materials
  • The isotopically heavy magnesium compositions of the planets identify substantial (~40 per cent) mass loss following repeated episodes of vaporisation during their accretion
  • This slipshod construction process results in other chemical changes during growth that generate the unique chemical characteristics of Earth

Dr Hin added: “Our work changes our views on how planets attain their physical and chemical characteristics.

“While it was previously known that building planets is a violent process and that the compositions of planets such as Earth are distinct, it was not clear that these features were linked.

“We now show that vapour loss during the high energy collisions of planetary accretion has a profound effect on a planet’s composition.

“This process seems common to planet building in general, not just for Earth and Mars, but for all planets in our Solar System and probably beyond, but differences in the collision histories of planets will create a diversity in their compositions.”

Is the Milky Way an ‘outlier’ galaxy? Studying its ‘siblings’ for clues

The most-studied galaxy in the universe — the Milky Way — might not be as “typical” as previously thought, according to a new study.

The Milky Way, which is home to Earth and its solar system, is host to several dozen smaller galaxy satellites. These smaller galaxies orbit around the Milky Way and are useful in understanding the Milky Way itself.

Early results from the Satellites Around Galactic Analogs (SAGA) Survey indicate that the Milky Way’s satellites are much more tranquil than other systems of comparable luminosity and environment. Many satellites of those “sibling” galaxies are actively pumping out new stars, but the Milky Way’s satellites are mostly inert, the researchers found.

This is significant, according to the researchers, because many models for what we know about the universe rely on galaxies behaving in a fashion similar to the Milky Way.

“We use the Milky Way and its surroundings to study absolutely everything,” says Yale astrophysicist Marla Geha, lead author of the study. “Hundreds of studies come out every year about dark matter, cosmology, star formation, and galaxy formation, using the Milky Way as a guide. But it’s possible that the Milky Way is an outlier.”

The SAGA Survey began five years ago with a goal of studying the satellite galaxies around 100 Milky Way siblings. Thus far it has studied eight other Milky Way sibling systems, which the researchers say is too small of a sample to come to any definitive conclusions. SAGA expects to have studied 25 Milky Way siblings in the next two years.

Yet the survey already has people talking. At a recent conference where Geha presented some of SAGA’s initial findings, another researcher told her, “You’ve just thrown a monkey wrench into what we know about how small galaxies form.”

“Our work puts the Milky Way into a broader context,” said SAGA researcher Risa Wechsler, an astrophysicist at the Kavli Institute at Stanford University. “The SAGA Survey will provide a critical new understanding of galaxy formation and of the nature of dark matter.”

Wechsler, Geha, and their team said they will continue to improve the efficiency of finding satellites around Milky Way siblings. “I really want to know the answer to whether the Milky Way is unique, or totally normal,” Geha said. “By studying our siblings, we learn more about ourselves.”

Neuron types in brain are defined by gene activity shaping their communication patterns


In a major step forward in research on brain cell types, a team at Cold Spring Harbor Laboratory (CSHL) has shown that neuron types are defined by the gene activity shaping their communication patterns. Neurons are the basic building blocks that wire up brain circuits supporting mental activities and behavior. The study, which involves sophisticated computational analysis of the messages transcribed from genes that are active in a neuron, points to patterns of cell-to-cell communication as the core feature that makes possible rigorous distinctions among neuron types across the mouse brain.

The team, led by CSHL Professor Z. Josh Huang, likens their discovery to the way communication styles and patterns enable us to learn important — definitive — things about people. “To figure out who I am,” says Anirban Paul, Ph.D., the paper’s first author, “you can learn a great deal by studying with whom and how I communicate. When I call my grandma, do I use an app or a phone? How do I speak to her, in terms of tone? How does that compare with when I communicate with my son, or my colleague?”

Using six genetically identifiable types of cortical inhibitory neurons that all release the neurotransmitter GABA, the team sought to discover factors capable of distinguishing their core molecular features. Using a high-resolution RNA sequencing method optimized by Paul and computational tools developed by CSHL Assistant Professor Jesse Gillis and colleague Megan Crow, Ph.D., the team searched for families of genes whose activity exhibited characteristic patterns in each neuron type.

Out of more than 600 gene families, the team found about 40 families whose activity patterns could be used to distinguish the six groups of cells. Surprisingly, Huang says, these cell-defining features fell into just six functional categories of gene families, all of which are vital for cell-to-cell communications. They include proteins that are expressed on either side of the communications interface in neurons: along or near membranes that face one another across the synaptic gap — the narrow space separating a (pre-synaptic) neuron sending a signal and a (post-synaptic) neuron facing it which receives the signal.
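The logic of the analysis can be caricatured in a few lines: collapse gene-level activity into family-level profiles, then ask whether those profiles alone separate the cell types. Everything below (the synthetic data, the family groupings, and the nearest-centroid rule) is a hypothetical illustration, not the study's actual pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 3 neuron types, 20 cells each, 12 genes in 4 families.
# None of these numbers or groupings come from the study itself.
n_types, cells_per_type = 3, 20
family_of_gene = np.repeat(np.arange(4), 3)   # genes 0-2 -> family 0, etc.

# Each cell type gets a fixed family-level expression signature, plus noise.
signatures = np.array([[1, 1, 4, 4],
                       [4, 1, 1, 4],
                       [1, 4, 4, 1]], dtype=float)
labels = np.repeat(np.arange(n_types), cells_per_type)
expr = signatures[labels][:, family_of_gene] + rng.normal(0, 0.3, (60, 12))

# Collapse gene-level expression to family-level means for every cell.
fam_expr = np.stack([expr[:, family_of_gene == f].mean(axis=1)
                     for f in range(4)], axis=1)

# Nearest-centroid classification on the family-level profiles.
centroids = np.stack([fam_expr[labels == t].mean(axis=0)
                      for t in range(n_types)])
pred = ((fam_expr[:, None, :] - centroids) ** 2).sum(axis=2).argmin(axis=1)
accuracy = (pred == labels).mean()
print("accuracy:", accuracy)
```

When family-level signatures genuinely differ between cell types, even this crude classifier separates them cleanly; the study's finding is that, of 600-plus gene families, roughly 40 (falling into six communication-related functional categories) carry that kind of distinguishing signal.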

The key genes dictate which other cells a neuron connects to and how it communicates with those partners, Huang explains. “Ultimately, the genes code for one striking feature, which is with whom and how these neurons talk to each other.”

The communications architecture is effectively hardwired into the molecular makeup of neurons, very likely across the brain, the team says. The ability to generalize the GABA results is based on re-analysis performed by the team of published data from the Allen Brain Atlas. It shows that the six “defining” gene/protein families they identified are also statistically distinguishing in random sets of neurons, including excitatory neurons, throughout the brain.

The findings should help scientists sort out the bewildering array of neurons that are intertwined in the brain. Over the years, as neuroscientists have proposed various approaches for classifying neurons, one question has been whether these classifications reflect cells’ biological properties, or are merely useful but arbitrary designations. Hundreds or thousands of types of neurons probably exist in the brain but there has been no agreement among neuroscientists about how to define different types and on what biological basis. In a real sense, says Huang, the CSHL team has discovered the rules of a “core genetic recipe” for constructing diverse types of nerve cells.

Positive, negative or neutral, it all matters: NASA explains space radiation

Charged particles may be small, but they matter to astronauts. NASA’s Human Research Program (HRP) is investigating these particles to solve one of its biggest challenges for a human journey to Mars: space radiation and its effects on the human body.

“One of our biggest challenges on a mission to Mars is protecting astronauts from radiation,” said NASA Space Radiation Element Scientist Lisa Simonsen, Ph.D. “You can’t see it; you can’t feel it. You don’t know you’re getting bombarded by radiation.”

A common misconception of space radiation is that it’s similar to radiation on Earth. It’s actually quite different. On Earth, radiation coming from the sun and space is mainly absorbed and deflected by our atmosphere and magnetic field.

The main type of radiation people think of on Earth is found in the dentist’s office: X-rays. Shielding against X-rays and other types of electromagnetic radiation usually consists of wearing a heavy, lead blanket.

Space radiation, however, is different because it has sufficient energy to collide violently with the nuclei that make up shielding and human tissue. These so-called nuclear collisions cause both the incoming space radiation and the shielding nuclei to break up into many different types of new particles, referred to as secondary radiation.

“In space, there is particle radiation, which is basically everything on the periodic table, hydrogen all the way up through nickel and uranium, moving near the speed of light,” said NASA Research Physicist Tony Slaba, Ph.D. “NASA doesn’t want to use heavy materials like lead for shielding spacecraft because the incoming space radiation will suffer many nuclear collisions with the shielding, leading to the production of additional secondary radiation. The combination of the incoming space radiation and secondary radiation can make the exposure worse for astronauts.”

The HRP is focused on investigating the effects of space radiation on the human body, especially those associated with galactic cosmic rays (GCRs).

“There are three main sources of space radiation, but GCRs are of most concern to researchers for a mission to Mars,” said NASA Research Physicist John Norbury, Ph.D. “GCRs that come from exploding stars known as supernovae outside the solar system are the most harmful to the human body.”

Other space radiation sources include the Van Allen Belts where radiation particles are trapped around the Earth and solar particle events (SPEs) which are associated with solar flares and coronal mass ejections and are more likely to occur during times of intense solar activity.

But GCRs are first in mind for the HRP researchers who create countermeasures to protect astronauts from space radiation. The challenge is obtaining adequate data on GCR exposure and its biological consequences. Researchers use NASA’s Space Radiation Laboratory (NSRL) to investigate the effects of ionizing radiation, but space radiation is difficult to simulate on Earth: a radiation dose in a lab setting can be more concentrated and given over a shorter timeframe than what an astronaut actually experiences during a year in space.

As NASA prepares for a journey to Mars, it will continue to use, enhance and develop a variety of technologies to protect astronauts. International Space Station dosimeters, Orion’s Hybrid Electronic Radiation Assessor, and the Radiation Assessment Detector can measure and identify high-energy radiation. Protons, neutrons and electrons may be small but they will always matter to NASA.