Embracing Uncertainty Helps Bring Order to Quantum Chaos

In physics, chaos is something unpredictable. A butterfly flapping its wings somewhere in Guatemala might seem insignificant, but those flits and flutters might be the ultimate cause of a hurricane over the Indian Ocean. The butterfly effect captures what it means for something to behave chaotically: Two very similar starting points—a butterfly that either flaps its wings or doesn’t—could lead to two drastically different results, like a hurricane or calm winds.

But there's also a tamer, more subtle form of chaos in which similar starting points don’t cause drastically different results—at least not right away. This tamer chaos, known as ergodicity, is what allows a cup of coffee to slowly cool down to room temperature or a piece of steak to heat up in a frying pan. It forms the basis of the field of statistical mechanics, which describes large collections of particles and how they exchange energy to arrive at a shared temperature. Chaos almost always grows out of ergodicity, forming its most eccentric variant.

A system is ergodic if a particle traveling through it will eventually visit every possible point. In quantum mechanics, you never know exactly what point a particle is at, making ergodicity hard to track. In this schematic, the available space is divided into quantum-friendly cells, and an ergodic particle (left) winds through each of the cells, while a non-ergodic one (right) only visits a few. (Credit: Amit Vikram/JQI)

Where classical, 19th-century physics is concerned, ergodicity is pretty well understood. But we know that the world is fundamentally quantum at the smallest scales, and the quantum origins of ergodicity have remained murky to this day—the uncertainty inherent in the quantum world makes classical notions of ergodicity fail. Now, Victor Galitski and colleagues in the Joint Quantum Institute (JQI) have found a way to translate the concept of ergodicity into the quantum realm. They recently published their results in the journal Physical Review Research. This work was supported by the DOE Office of Science (Office of Basic Energy Sciences).

“Statistical mechanics is based on the assumption that systems are ergodic,” Galitski says. “It’s an assumption, a conjecture, and nobody knows why. And our work sheds light on this conjecture.”

In the classical world, ergodicity is all about trajectories. Imagine an air hockey puck bouncing around a table. If you set it in motion, it will start bouncing off the walls, changing direction with each collision. If you wait long enough, that puck will eventually visit every point on the table's surface. This is what it means to be ergodic—to visit every nook and cranny available, given enough time. If you paint the puck’s path as you go, you will eventually color in the whole table. If lots of pucks are unleashed onto the table, they will bump into each other and eventually spread out evenly over the table.
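The puck's "visit every nook and cranny" behavior can be sketched in a few lines of code. This toy simulation (invented for illustration, not from the paper) folds a straight-line trajectory back into a unit-square table, divides the table into a grid of cells, and counts the fraction of cells the puck visits—close to 1 for an ergodic trajectory, much less for a repeating one:

```python
import math

def bounce(start, t):
    """Position along one axis at time t, with elastic wall reflections
    folded into a period-2 sawtooth."""
    x = (start + t) % 2.0
    return x if x <= 1.0 else 2.0 - x

def visited_fraction(vx, vy, n_cells=20, steps=100_000, dt=0.02):
    """Fraction of grid cells a puck with velocity (vx, vy) visits."""
    seen = set()
    for k in range(steps):
        t = k * dt
        x, y = bounce(0.1, vx * t), bounce(0.2, vy * t)
        seen.add((min(int(x * n_cells), n_cells - 1),
                  min(int(y * n_cells), n_cells - 1)))
    return len(seen) / n_cells**2

f_irrational = visited_fraction(1.0, math.sqrt(2))  # incommensurate velocities
f_rational = visited_fraction(1.0, 1.0)             # closed, repeating orbit
print(f_irrational, f_rational)
```

With incommensurate velocity components the trajectory never repeats and eventually colors in every cell, while equal velocities trace the same closed path forever and leave most of the table blank.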

To translate this idea of ergodicity into the quantum world of individual particles is tough. For one, the very notion of a trajectory doesn't quite make sense. The uncertainty principle dictates that you cannot know the precise position and momentum of a particle at the same time, so the exact path it follows ends up being a little bit fuzzy, making the normal definitions of chaos and ergodicity challenging to apply. 

Physicists have thought up several alternate ways to look for ergodicity or chaos in quantum mechanics. One is to study the particle’s quantum energy levels, especially how they space out and bunch up. If the way they bunch up has a particular kind of randomness, the theory goes, this is a type of quantum chaos. This might be a nice theoretical tool, but it’s difficult to connect to the actual motion of a quantum particle. Without such a connection to dynamics, the authors say there’s no fundamental reason to use this energy level signature as the ultimate definition of quantum chaos. “We don't really know what quantum chaos [or ergodicity] is in the first place,” says Amit Vikram, a graduate student in physics at JQI and lead author of the paper. “Chaos is a classical notion. And so what people really have are different diagnostics, essentially different things that they intuitively associate with chaos.”

Galitski and Vikram have found a way to define quantum ergodicity that closely mimics the classical definition. Just as an air hockey puck traverses the surface of the table, quantum particles traverse a space of quantum states—a surface like the air hockey table that lives in a more abstract world. But to capture the uncertainty inherent to the quantum world, the researchers break the space up into small cells rather than treating it as individual points. It's as if they divided the abstract air hockey table into cleverly chosen chunks and then checked to see if the uncertainty-widened particle has a decent probability of visiting each of the chunks.

“Quantum mechanically you have this uncertainty principle that says that your resolution in trajectories is a little bit fuzzy. These cells kind of capture that fuzziness,” Vikram says. “It's not the most intuitive thing to expect that some classical notion would just carry over to quantum mechanics. But here it does, which is rather strange, actually.”

Picking the correct cells to partition the space into is no easy task—a random guess will almost always fail. Even if there is only one special choice of cells where the particle visits each one, the system is quantum ergodic according to the new definition. The team found that the key to finding that magic cell choice, or ruling that no such choice exists, lies in the particle’s quantum energy levels, the basis of previous definitions of quantum chaos. This connection enabled them to calculate that special cell choice for particular cases, as well as connect to and expand the previous definition.

One advantage of this approach is that it's closer to something an experimentalist can see in the dynamics—it connects to the actual motion of the particle. This not only sheds light on quantum ergodicity, quantum chaos and the possible origins of thermalization, but it may also prove important for understanding why some quantum computing algorithms work while others do not.

As Galitski puts it, every quantum algorithm is just a quantum system trying to fight thermalization. The algorithm will only work if the thermalization is avoided, which would only happen if the particles are not ergodic. “This work not only relates to many body systems, such as materials and quantum devices, but that also relates to this effort on quantum algorithms and quantum computing,” Galitski says.

Original story by Dina Genkina: https://jqi.umd.edu/news/embracing-uncertainty-helps-bring-order-quantum-chaos

Reference Publication
Dynamical quantum ergodicity from energy level statistics, A. Vikram Anand and V. Galitski, Physical Review Research, 5 (2023)

Advocating for Quantum Simulation of Extreme Physics

The Big Bang, supernovae, collisions of nuclei at breakneck speeds—our universe is filled with extreme phenomena, both natural and human-made. But the surprising thing is that all of these seemingly distinct processes are governed by the same underlying physics: a combination of quantum mechanics and Einstein’s theory of special relativity known as quantum field theory.

Theoretical nuclear and particle physicists wield quantum field theory in their efforts to understand interactions between many particles or the behavior of particles with extremely large energies. This is no easy feat: At least theoretically, quantum field theory plays out in an infinite universe with particles constantly popping in and out of existence. Even the world’s biggest supercomputer would never be able to model it exactly. Fortunately, there are many computational tricks that can make the problem more tractable—like cutting up the infinite universe into a finite grid and taking judicious statistical samples instead of tracking every parameter of every particle—but they can only help so much. 
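The "judicious statistical samples" can be seen in miniature with a toy model (this sketch uses a simple ring of classical spins, invented for illustration—nothing like the full machinery of quantum field theory): instead of summing a quantity over every one of the 2^n configurations, one estimates it from a few thousand random samples.

```python
import random

def energy(config):
    """Nearest-neighbor coupling energy on a 1-D ring of +/-1 spins."""
    n = len(config)
    return -sum(config[i] * config[(i + 1) % n] for i in range(n))

def exact_average(n):
    """Brute force: enumerate all 2**n configurations."""
    total = 0.0
    for bits in range(2 ** n):
        config = [1 if (bits >> i) & 1 else -1 for i in range(n)]
        total += energy(config)
    return total / 2 ** n

def sampled_average(n, samples, rng):
    """Monte Carlo: average over random configurations instead."""
    return sum(energy([rng.choice((-1, 1)) for _ in range(n)])
               for _ in range(samples)) / samples

print(exact_average(16))                            # enumerates 65,536 configs
print(sampled_average(16, 4000, random.Random(1)))  # a few thousand samples
```

The brute-force cost doubles with each added spin; the sampling cost does not, which is why statistical sampling makes otherwise hopeless calculations tractable—up to a point.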

Over the past few years, a growing group of scientists has become wise to the potential of quantum computers to approach these calculations in a completely new way. With a fully functioning quantum computer, a lot of the approximations could be avoided, and the quantum nature of the universe could be modeled with true quantum hardware. However, quantum computers are not yet big and reliable enough to really tackle these problems, and the algorithms nuclear and particle physicists would need to run on them are not yet fully developed.

“Even if we have large-scale, fully capable quantum computers tomorrow,” said Zohreh Davoudi, associate professor of physics at UMD, “we don’t actually have all the theoretical tools and techniques to use them to solve our grand-challenge problems.”

Classical computers require exponential resources to simulate quantum physics. To simulate one extra tick of the clock or include one extra particle, the amount of computing power must grow significantly. So, the classical methods resort to approximations that fall short because they leave out details and lose the ability to address certain kinds of questions. For one, they can’t keep up with the real-time quantum evolution of the early universe. Additionally, they can’t track what happens during collisions of heavy nuclei. And finally, they are forced to ignore the quantum interactions between the myriad particles in high-energy settings, like those that are emitted from an exploding star. A quantum computer, however, could tackle these problems on their own quantum turf, without needing as many resources or resorting to as many approximations.
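That exponential growth is concrete enough to tabulate. A back-of-the-envelope sketch (the 16-bytes-per-amplitude figure assumes one double-precision complex number per basis state):

```python
BYTES_PER_AMPLITUDE = 16  # one complex double: two 8-byte floats

def state_vector_bytes(n_particles: int) -> int:
    """Memory to store the full quantum state of n two-level particles."""
    return (2 ** n_particles) * BYTES_PER_AMPLITUDE

for n in (10, 30, 50):
    print(n, state_vector_bytes(n))
```

Ten particles fit in a few kilobytes, 30 already demand gigabytes, and 50 would require petabytes—each added particle doubles the cost, which is why classical methods must resort to approximations.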

Now, researchers want to make sure the nascent effort to use quantum computers to simulate the extreme events of the universe continues to thrive. Davoudi, along with JQI Adjunct Fellow and College Park Professor of Physics Chris Monroe and other researchers, penned a whitepaper laying out the case for funding quantum simulation research in particle physics, published in the journal PRX Quantum in May 2023. Davoudi also co-authored a similar whitepaper in the field of nuclear physics, available on the arXiv preprint server.  

“It's a responsibility of researchers to also think at a larger scale,” said Davoudi, who is also a Fellow of the Joint Center for Quantum Information and Computer Science (QuICS) and the associate director of education at the National Science Foundation Quantum Leap Challenge Institute for Robust Quantum Simulation (RQS). “If we think this field is intellectually promising, interesting, and worth investing in as a scientist, we have to make sure that it stays healthy and lively for generations to come.”

Some sub-fields of physics, including the nuclear and particle physics communities, engage in long-term planning for the future of their field. Nuclear physicists in the U.S. plan seven years ahead, and particle physicists plan a full decade ahead. Researchers from many universities and national laboratories come together in meetings, seminars, and panel discussions over the course of a year to decide what the highest priorities in the field should be. Funding agencies in the U.S. and worldwide have historically taken these conclusions seriously. The whitepapers developed by Davoudi and her co-authors are a part of those efforts. In them, they argue for the importance of studying quantum simulation for nuclear and particle physics and make specific recommendations for further development. 

“These new research directions in both nuclear physics and high-energy physics were not part of the last U.S. long-range planning processes, because the idea had simply not been introduced at the time,” Davoudi said.

Indeed, the ideas weren’t even on Davoudi’s radar six years ago when she came to UMD to join the physics faculty as a theoretical nuclear physicist. While she was busy searching for an apartment, Davoudi saw an announcement for a workshop hosted by QuICS exploring the intersection of her field with quantum computing. Instead of looking for a place to live, she spent several days at the workshop, talking to theorists and experimentalists alike. 

Davoudi was enticed by the promise of quantum simulations to solve the kinds of problems she was unable to address with classical computational tools, and it changed the course of her career. In the years since, she has developed new theoretical techniques and collaborated with experimentalists to push the boundaries of what quantum simulators can do to help uncover the basic physics of the universe.

Davoudi wants to ensure that this burgeoning field continues to thrive into the future. In the whitepapers, she and her co-authors identified specific problems where quantum computing holds the most promise. Then, they made three main recommendations to ensure the success of the field for the next seven to 10 years. 

First, they recommended funding for theoretical efforts to develop algorithms that run on quantum hardware. Even though the potential of quantum computing is clear, detailed algorithms for simulating quantum field theory on a quantum computer are still in their infancy. Developing these will require a dedicated effort by the nuclear and particle physics communities. 

Second, they advocated for greater interdisciplinary communication between the nuclear, particle and quantum physics communities. Different quantum computer architectures will have different quirks and advantages, and the field theory folks will need to have access to them to figure out how to make the best use of each one. Certain implementations may, in turn, become motivated to engineer specific capabilities for the kinds of problems nuclear and particle physicists want to study. This can only be accomplished through close interdisciplinary collaboration, the authors claim. 

“As a community, we cannot isolate ourselves from the quantum information and quantum technology communities,” Davoudi said.

Third, Davoudi and her co-authors believe it is key to bring in junior researchers, train them with a diverse set of skills, and give them opportunities to contribute to this growing effort. As with the QuICS workshop that inspired Davoudi, the community should invest in education and training for the relevant skills through partnerships between universities, national labs and the private sector. 

“This is a new field, and you have to build the workforce,” Davoudi said. “I think it's important for our field to bring in diverse talent that would allow the field to continue to intellectually grow, and be able to solve the problems that we would like to eventually solve.”

 

Written by Dina Genkina

Novel Quantum Speed Limits Tackle Messy Reality of Disorder

The researchers and engineers studying quantum technologies are exploring uncharted territory. Due to the unintuitive quirks of quantum physics, the terrain isn’t easy to scout, and the path of progress has been littered with wrong turns and dead ends.

Sometimes, though, theorists have streamlined progress by spotting roadblocks in the distance or identifying the rules of the road. For instance, researchers have found several quantum speed limits—called Lieb-Robinson bounds—that are impassable caps on how quickly information can travel through collections of quantum particles. They’ve even developed protocols for quantum computers that achieve the best possible speeds for specific cases. But to make calculating the limits easier, physicists have mostly neglected the influence of disorder. In the real world, disorder can’t always be ignored, so researchers need to understand its potential effects.

JQI postdoctoral researcher Chris Baldwin, JQI Fellow and Adjunct Professor Alexey Gorshkov and other JQI researchers are facing down the impact disorder has on speed limits. In an article published on June 22, 2023 in the journal PRX Quantum, they described novel methods for pulling insights from the mess created by disorder and identified new types of quantum speed limits that apply when disorder is present.

"We were motivated both by the beautiful theoretical problem of proving and saturating new speed limits and by the implications that our work would have on quantum computers that inevitably have some disorder," says Gorshkov, who is also a physicist at the National Institute of Standards and Technology and a Fellow of the Joint Center for Quantum Information and Computer Science.

Spin Bucket Brigade: A chain of quantum spins can pass information down a line like a bucket brigade, but sometimes disorder (represented here by the red hand and bucket) can slow down the communication. The arrows in spheres in the buckets are a geometrical representation of a quantum state. (Credit: Sean Kelley/NIST)

Baldwin, Gorshkov and colleagues began by tackling the case of a one-dimensional line of particles, where each particle can only directly interact with its neighbors. They specifically focused on the spin—a quantum property related to magnetism—of each quantum particle. A spin is like a compass needle that wants to point along a magnetic field, but, being quantum, it can point in more than one direction at a time—a phenomenon called superposition.

Spins pass information to each other through interactions, so a line of spins can act like a bucket brigade passing quantum information: One jiggles its neighbor, which jiggles its neighbor on the other side, and the information makes its way down the line.

But if something is slightly off about a spin’s connection to a neighbor—there’s some disorder—the spin will fumble handing over the quantum data and slow things down. With their imperfect handovers, spins resemble people in a bucket brigade each working at a slightly different speed. Most people probably take a similar amount of time to pass a bucket, maybe clustered around a couple of seconds. But if enough random people are pulled in, a few speed demons may only need a second while others might take five seconds or more.

To account for the full range of possibilities in quantum systems, Baldwin and colleagues didn’t limit themselves to a fixed number of possible speeds. Instead, they analyzed distributions of speeds that stretch from the quickest transfers, for ideal connections between spins, all the way down to vanishingly small chances of handoffs taking millennia or longer for arbitrarily bad connections.
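The effect of such a distribution is easy to see numerically. In this sketch (the particular heavy-tailed distribution and all numbers are invented for illustration, not taken from the paper), each handoff time is drawn from a Pareto distribution—mostly near one, occasionally enormous—and longer chains reliably contain slower and slower links:

```python
import random

def handoff_times(n_links, rng, alpha=1.5):
    """Heavy-tailed (Pareto) handoff times: mostly ~1, occasionally huge."""
    return [rng.paretovariate(alpha) for _ in range(n_links)]

rng = random.Random(0)
for n in (10, 100, 10_000):
    links = handoff_times(n, rng)
    print(f"{n:>6} links: slowest handoff {max(links):9.1f}, "
          f"total transfer time {sum(links):11.1f}")
```

Because the tail is heavy, the single slowest link tends to grow without bound as the chain lengthens, so no fixed worst-case handoff speed can be assumed.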

In future quantum computers, experts expect millions of spins to work together. With so many spins, even long odds of any individual being a slowpoke can combine into a safe bet that one, or even several, will be present.

To make sense of the sea of possibilities presented by disorder’s influence on handoff speeds, Baldwin and colleagues pulled out the tools of probability theory. These tools allowed them to glean information about speed limits from the statistics of how transfer speeds are peppered throughout the line. With probability theory, they derived new speed limits for whole groups of spin chains based on the big picture without needing to know anything about the links between any particular spins.

The team was particularly interested in investigating whether the speeds information can reach in different systems depend on the distance it is traveling. Some physical processes, like light traveling through space, resemble a car steadily cruising down an empty highway, where the travel time is directly proportional to the distance—it takes twice as long to move twice as far. But the speeds of other processes, like perfume diffusing through a room, don’t have such a straightforward proportional behavior and can look more like a flagging runner who takes longer and longer the farther they push themselves. Knowing the relationship between speed and distance for quantum information is valuable when researchers are weighing their options for scaling up quantum computers.
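The car-versus-runner distinction amounts to how travel time scales with distance. A minimal sketch (coefficients invented for illustration):

```python
def ballistic_time(d, v=1.0):
    """Steady cruising: time grows linearly with distance."""
    return d / v

def diffusive_time(d, D=1.0):
    """Flagging runner: time grows with the square of the distance."""
    return d * d / (2.0 * D)

for d in (1, 2, 4, 8):
    print(d, ballistic_time(d), diffusive_time(d))
```

Doubling the distance doubles the ballistic time but quadruples the diffusive one—over long chains, the difference between the two regimes is enormous.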

With their new results, the researchers determined that information can’t always propagate at a steady speed indefinitely, and they identified the border between the conditions that allow a steady speed and those that only allow a deteriorating pace.

They also found that their method allowed them to define two distinct types of limits that tell them different things. One they dubbed “almost always bounds” because the bounds hold for almost all the sections of a chain. These limits apply to any sufficiently long stretch of spins even though they might occasionally be violated for small sections of the chain—like if there is an unusual clump of speed demons in the brigade. These limits allow researchers to guarantee conditions, like that a particular spin won’t be disturbed by activity further down the line within a particular time window.

The researchers called the second type of limit “infinitely often bounds” because they are guaranteed to apply to some stretches of an infinite chain, but there is no guarantee that the limit will hold for any particular stretch, no matter how long a section is being considered. These limits are expected to pop up occasionally along the chain, sitting below the ceiling set by the almost always bounds—like a car occasionally entering a work zone on the highway. Having an idea of these lower speed limits can help researchers judge the reasonable minimum amount of time to dedicate to getting the bucket all the way across a stretch of the brigade.

The newly defined limits allowed the team members to resolve a lingering discrepancy: The existing Lieb-Robinson bounds had set a higher ceiling than any information transfer protocol had reached. The mismatch could have been the result of either researchers not being creative enough in designing the protocols or them failing to account for something that enforced a lower limit. Accounting for disorder more carefully dropped the theoretical ceiling down to match the speed of existing protocols.

“For a while, we had this gap,” Baldwin says. “The main exciting thing of this work was figuring out how we could completely close this gap.”

The researchers say there is further work to be done exploring the applications of these limits and determining when the two types of bounds have significant impacts.

“The main direction I want to take this going forward is going beyond one dimension,” Baldwin says. “My suspicion is that the picture will end up looking very different, but I think it's still worth having this one-dimensional case in mind when we start to do that.”

Original story by Bailey Bedford: https://jqi.umd.edu/news/novel-quantum-speed-limits-tackle-messy-reality-disorder

In addition to Baldwin and Gorshkov, authors on the publication included UMD graduate student Adam Ehrenberg and former UMD graduate student Andrew Guo.

About the Research

Reference Publication
Disordered Lieb-Robinson bounds in one dimension, C. Baldwin, A. Ehrenberg, A. Y. Guo, and A. V. Gorshkov, PRX Quantum, 4, 020349 (2023)


UMD Researchers Study the Intricate Processes Underpinning Gene Expression

A new study led by University of Maryland physicists sheds light on the cellular processes that regulate genes. Published in the journal Science Advances, the paper explains how the dynamics of a polymer called chromatin—the structure into which DNA is packaged—regulate gene expression.

Through the use of machine learning and statistical algorithms, a research team led by Professor Arpita Upadhyaya and National Institutes of Health Senior Investigator Gordon Hager discovered that chromatin can switch between a lower and higher mobility state within seconds. The team found that the extent to which chromatin moves inside cells is an overlooked but important process, with the lower mobility state being linked to gene expression.
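The switching the team detected can be mimicked with a toy model (the step sizes, switching rate and crude window classifier here are invented for illustration; the study's actual analysis uses more sophisticated machine-learning and statistical algorithms): a tracked spot hops between a small-step and a large-step state, and labeling windows of the track by their average step size recovers the hidden state.

```python
import random

STEP = {1: 0.1, 2: 1.0}   # typical step size in each mobility state
SWITCH_PROB = 0.02        # chance per frame of hopping to the other state

def simulate_track(n_frames, rng):
    """1-D random walk that toggles between low- and high-mobility states."""
    state, x = 1, 0.0
    positions, states = [], []
    for _ in range(n_frames):
        if rng.random() < SWITCH_PROB:
            state = 3 - state          # toggle 1 <-> 2
        x += rng.gauss(0.0, STEP[state])
        positions.append(x)
        states.append(state)
    return positions, states

def classify_windows(positions, window=25, threshold=0.5):
    """Label each window as state 1 (small steps) or state 2 (large steps)."""
    labels = []
    for start in range(0, len(positions) - window, window):
        steps = [abs(positions[i + 1] - positions[i])
                 for i in range(start, start + window - 1)]
        labels.append(1 if sum(steps) / len(steps) < threshold else 2)
    return labels

positions, states = simulate_track(5000, random.Random(7))
print(classify_windows(positions)[:20])
```

Runs of 1s and 2s in the recovered labels correspond to dwells in the low- and high-mobility states, the same qualitative signature the researchers extracted from real chromatin tracks.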

Notably, transcription factors (TFs)—proteins that bind specific DNA sequences within the chromatin polymer and turn genes on or off—exhibit the same mobility as that of the piece of chromatin they are bound to. In their study, the researchers analyzed a group of TFs called nuclear receptors, which are targeted by drugs that treat a variety of diseases and conditions.

“The nuclear receptors in our study are important therapeutic targets for breast cancer, prostate cancer and diabetes,” explained the study’s first author, Kaustubh Wagh (Ph.D. ’23, physics). “Understanding their basic mechanism of action is essential to establish a baseline for how these proteins function.”

As a result, these findings could have broad applications in medicine.

On the move

The genetic information that children inherit from their parents is contained in DNA—the set of instructions for all possible proteins that cells can make. A DNA molecule is about 2 meters in length when stretched from end to end, and it must be compacted 100,000 times in a highly organized manner to fit inside a cell’s nucleus. To achieve this, DNA is packaged into chromatin in the nucleus of a cell, but that bundle of genetic material doesn’t stay stationary.

“We know that how the genome is organized in the nucleus of our cells has profound consequences for gene expression,” Wagh said. “However, an often-overlooked fact is that chromatin is constantly moving around inside the cell, and this mobility may have important consequences for gene regulation.”

Researchers discovered that chromatin can dynamically switch between two states of mobility: state 1, in which chromatin moves a shorter distance (shown in red on the right), and state 2, in which it moves farther (shown in blue on the left).

The research team—including collaborators from the National Cancer Institute, the University of Buenos Aires and the University of Southern Denmark—showed that chromatin switches between two distinct mobility states: a lower one (state 1) and a higher one (state 2). Earlier theories suggested that different parts of the nucleus had fixed chromatin mobilities, but the researchers demonstrated that chromatin is much more dynamic.

“Previous studies have proposed that different chromatin mobility states occupy distinct regions of the cell nucleus. However, these studies were performed on a sub-second timescale,” said Upadhyaya, who holds a joint appointment in the Institute for Physical Science and Technology. “We extend this model by showing that on longer timescales, the chromatin polymer can locally switch between two mobility states.”

The researchers found that transcriptionally active TFs preferred to bind to chromatin in state 1. They were also surprised to discover that TF molecules in a lower mobility state bound for longer periods of time, likely affecting gene regulation.

Finding a raft in the ocean

This study advances scientists’ understanding of chromatin dynamics and gene expression. The researchers will use their framework to study how mutations affect the function of TFs, which can offer insight into the onset of various diseases.

“We are now in a position to answer whether a particular disease phenotype occurs due to the TF binding for too much or too little time, or not binding in the right chromatin state,” Wagh said.

The team also plans to investigate how TFs achieve the challenging feat of finding their targets. TFs target a specific base pair sequence of DNA, and only by finding and binding this sequence can they recruit other proteins to activate nearby genes.

“A TF finding its target site is like finding a single raft in the middle of the ocean,” Upadhyaya said. “It’s a miracle it even happens, and we plan to figure out how.”

###

Their paper, “Dynamic switching of transcriptional regulators between two distinct low-mobility chromatin states,” was published in Science Advances on June 14, 2023.

This work was supported by the National Institutes of Health (Award No. R35 GM145313), National Cancer Institute Intramural Program, NCI-UMD Partnership for Integrative Cancer Research, Center for Cancer Research, National Science Foundation (Award Nos. NSF MCB 2132922 and NSF PHY 1915534), Vissing Foundation, William Demant Foundation, Knud Højgaard Foundation, Frimodt-Heineke Foundation, Director Ib Henriksen Foundation, Ove and Edith Buhl Olesen Memorial Foundation, Academy of Finland, Cancer Foundation Finland, Sigrid Jusélius Foundation, Villum Foundation (Award No. 73288), Independent Research Fund Denmark (Award No. 12-125524), Danish National Research Foundation (Award No. 141) to the Center for Functional Genomics and Tissue Plasticity, CONICET and the Agencia Nacional de Programación Científica y Tecnológica (Award Nos. 2019-0397 and PICT 2018-0573). This story does not necessarily reflect the views of these organizations.

This article is adapted from text provided by Kaustubh Wagh. Originally published here: https://cmns.umd.edu/news-events/news/umd-researchers-study-intricate-processes-underpinning-gene-expression

Media Relations Contact: Emily Nunez
301-405-9463

 

Crystal Imperfections Reveal Rich New Phases of Familiar Matter

Matter—all the stuff we see around us—can be classified into familiar phases: our chairs are solid, our coffee is liquid, and the oxygen we breathe is a gas. This grouping obscures the nitty gritty details of what each molecule or atom is up to and reduces all that complexity down to a few main features that are most salient in our everyday lives.

But those are not the only properties of matter that matter. Focusing on solids, physicists have found that they can group things according to symmetries. For example, atoms in solids arrange themselves into repeating patterns, forming crystals that can be grouped according to whether they look the same left and right, up and down, rotated about, and more. In the 1980s, physicists discovered a new paradigm: In addition to symmetries, solids can be classified using topology—a field of math that does for geometrical shapes the same kind of thing that symmetries do for crystalline solids. All the shapes without holes (a ball, a pizza) are in the same topological “phase,” while those with one hole (a donut, a coffee mug) are in a different “phase,” and so on with each new hole.

Within physics, topology doesn’t usually refer to the shape a piece of metal is cut into. Rather, the topology of how electrons are arranged inside a crystal provides information about the material’s electrical conductance and other properties. Now, theorists at the Joint Quantum Institute have found that these same crystals hide a richer set of topological phases than previously thought. In two separate works, they revealed a host of possible topological phases that become apparent when two different kinds of defects develop in crystals, or when they study the twirling properties of the electronic arrangement. They published their findings in the journal Physical Review X on July 14, 2023 and in the journal Physical Review Letters in Dec. 2022.

“Condensed matter physics is about understanding all the properties of phases of matter,” says Naren Manjunath, a graduate student at JQI and an author on both results. “I think that our work is really expanding our understanding of how to define new topological properties and how to characterize phases of matter better.”

Topology was first recognized as an important matter-classification tool after the discovery of the quantum Hall effect in the 1980s. When thin sheets of certain materials are pierced by a strong magnetic field, the electrons inside the materials spin around in circles—the larger the magnetic field, the tighter their turns. Once the circles get small enough, quantum mechanics kicks in and dictates that the size of the circles can only take certain discrete values (the “quantum” in the quantum Hall effect). As the magnetic field is increased, nothing changes for a while—there is a plateau. Then, when the field gets large enough, electrons suddenly hop into a tighter orbit—an abrupt, step-wise change.
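The shrinking of the orbits with field strength can be illustrated with the textbook Landau-level radius formula, r_n = sqrt((2n + 1)ħ/(eB)). This is a standard result from quantum mechanics, not taken from the paper; the function name is illustrative:

```python
import math

HBAR = 1.054571817e-34      # reduced Planck constant, J*s
E_CHARGE = 1.602176634e-19  # elementary charge, C

def cyclotron_radius(n, B):
    """Radius of the n-th quantized cyclotron orbit in a magnetic field B (tesla).

    Only discrete values of n (0, 1, 2, ...) are allowed, which is why the
    orbit size changes in abrupt jumps rather than continuously.
    """
    return math.sqrt((2 * n + 1) * HBAR / (E_CHARGE * B))

# At 10 tesla the smallest orbit (n = 0) is about 8 nanometers across,
# and quadrupling the field halves the radius—tighter turns, as described above.
r0 = cyclotron_radius(0, 10.0)
r0_strong = cyclotron_radius(0, 40.0)
```

Because n can only be an integer, the allowed radii form a discrete ladder; sweeping the field moves electrons between rungs in the step-wise fashion described above.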

This jump from one radius of a spinning orbit to another can be thought of as a change in the topological phase—the geometry of the electron motion in the material switches. This sudden hopping is extremely precise, and it results in abrupt jumps in the electrical conductivity of the metal sheet, making the topological phase easy to measure experimentally.

Hofstadter butterfly (Adapted from Osadchy and Avron, J. Math. Phys. 42, 5665–5671 (2001) https://doi.org/10.1063/1.1412464)

Even more interesting things would happen if the magnetic field in the quantum Hall effect were cranked up so high that the electron orbits became about as small as the atomic spacing in the crystal. There, electrons arrange themselves into different topological phases that depend on how many electrons are around in the first place and on the magnetic field piercing each little bit of the crystal. A color-coded plot of conductivity as it depends on the electron density and the magnetic field appears as a winged butterfly, called the Hofstadter butterfly after the theoretical physicist who first studied this model.
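The butterfly's bands can be computed numerically: when the magnetic flux through each cell of the lattice is a rational fraction p/q of a flux quantum, the standard Harper/Hofstadter model reduces to a q-by-q matrix whose eigenvalues give the allowed energies. A minimal sketch (the function name and the choice of momenta are illustrative, not from the paper):

```python
import numpy as np

def hofstadter_spectrum(p, q, kx=0.0, ky=0.0, t=1.0):
    """Allowed energies of the Harper/Hofstadter model at flux p/q per plaquette.

    Builds the q-by-q Bloch Hamiltonian in the Landau gauge: a cosine
    potential on the diagonal, nearest-neighbor hopping t off the diagonal,
    and a Bloch phase closing the chain into a ring.
    """
    alpha = p / q
    m = np.arange(q)
    H = np.diag(2 * t * np.cos(ky + 2 * np.pi * alpha * m)).astype(complex)
    for i in range(q - 1):
        H[i, i + 1] = H[i + 1, i] = t
    # Periodic boundary term carrying the Bloch momentum kx
    H[0, q - 1] += t * np.exp(1j * q * kx)
    H[q - 1, 0] += t * np.exp(-1j * q * kx)
    return np.sort(np.linalg.eigvalsh(H))

# Collect (flux, energy) points over many rational fluxes; a scatter plot
# of these pairs traces out the fractal wings of the Hofstadter butterfly.
points = [(p / q, E) for q in range(1, 30) for p in range(q + 1)
          if np.gcd(p, q) == 1 for E in hofstadter_spectrum(p, q)]
```

Each rational flux p/q splits the single band of the lattice into q sub-bands, which is why the plot fractures into ever-finer structure as q grows.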

“We're furthering this program of trying to find all possible quantized numbers that could be associated with phases of matter,” says JQI Fellow and Associate Professor Maissam Barkeshli, a principal investigator on the work. “And this is a long-term program and we made a lot of progress on it recently.”

Manjunath, Barkeshli, and their collaborators found that there may be more intricate details hiding in the Hofstadter butterfly’s wings than previously thought. Some spots on the butterfly might have the same color, and therefore the same topological phase in the original treatment, and yet behave differently from each other in important ways.

These extra distinguishing features are always present, but they become most obvious when the crystal develops defects—little mistakes in its otherwise perfectly regular pattern. The way electrons behave around these defects differs depending on the underlying topological phase. And different defects can help uncover different kinds of phases.

The team first studied an imperfection called a disclination, which occurs when a piece of the crystal is taken out and the remaining atoms are stitched back together, as seen in the diagram below. The researchers found that electric charge tends to cluster around this defect. And how much charge pops up at the defect depends on a new quantity, which the team called the shift. Much like the size of electron orbits in the quantum Hall effect, the shift is quantized: It can only be an integer or a half-integer. A different value of the shift corresponds to a different phase of matter. The electric charge appearing at a disclination would be a multiple of this shift, which, weirdly enough, could even be a fraction of a single electron’s charge. They published the results of their theoretical investigation in the journal Physical Review Letters in December 2022.

After disclinations, the team focused their attention on another kind of imperfection called a dislocation. Unlike a disclination, no atoms are missing in a dislocation. Instead, the connections between atoms in a crystal are rewired in a different order. Instead of being connected to its closest neighbor, one or more of the atoms bonds with the next atom over, creating a skewed ladder of links.

Dislocations turned out to have another quantized quantity associated with them, this time named a quantized polarization. Inside a perfectly regular crystal, every tiny square of the lattice may hide a bit of charge polarization—one side becomes somewhat positively charged while the other side becomes a bit negatively charged. This polarization is hard to spot. If a dislocation is introduced, however, the researchers found that one side of this polarized charge gets trapped in the defect, revealing the extent of the polarization. Exactly how much charge gets trapped depends directly on the underlying quantized polarization. The team published this result in the journal Physical Review X.

Each of these quantities—the shift and the quantized polarization—has consequences even without any defects. These consequences have to do with the way the electrons tend to twist around different points inside the crystal lattice. But these twists are tricky to find experimentally, while crystal defects offer a tangible way of measuring and observing these new quantities by trapping charges in their vicinity.

New butterflies, cousins of the original Hofstadter butterfly, pop up thanks to the shift and quantized polarization. Both can be plotted as a function of electron density and magnetic field and exhibit the same winged, fractal butterfly-like structure. The researchers believe more such quantities and their associated phases remain to be uncovered, associated with yet more defects. “Eventually we expect we will have a large number of beautiful colored butterfly figures,” Barkeshli says, “one for each of these topological properties.”

For now, testing these predictions experimentally seems just out of reach. The Hofstadter model requires such large magnetic fields that it cannot be realized in regular materials. Instead, people have resorted to simulating this model with synthetic lattices of atoms or photons, or in layered graphene structures. These synthetic lattices are not quite large enough to measure the charge distributions required, but with some engineering advances, they might be up to the task in the coming years. It may also be possible to create these lattices using small, noisy quantum computers that have already been built, or topological photonic systems.

“We only considered the Hofstadter model,” says Manjunath, “but you could measure the same thing for more exotic phases of matter. And some of those phases might actually have some applications in the very distant future.”

Original story by Dina Genkina: https://jqi.umd.edu/news/crystal-imperfections-reveal-rich-new-phases-familiar-matter

In addition to Manjunath and Barkeshli, authors on the publications included UMD graduate students Yuxuan Zhang and Gautam Nambiar.

About the Research

Reference Publications