Novel Quantum Speed Limits Tackle Messy Reality of Disorder

Researchers and engineers studying quantum technologies are exploring uncharted territory. Due to the unintuitive quirks of quantum physics, the terrain isn’t easy to scout, and the path of progress has been littered with wrong turns and dead ends.

Sometimes, though, theorists have streamlined progress by spotting roadblocks in the distance or identifying the rules of the road. For instance, researchers have found several quantum speed limits—called Lieb-Robinson bounds—that are impassable caps on how quickly information can travel through collections of quantum particles. They’ve even developed protocols for quantum computers that achieve the best possible speeds for specific cases. But to make calculating the limits easier, physicists have mostly neglected the influence of disorder. In the real world, disorder can’t always be ignored, so researchers need to understand its potential effects.

JQI postdoctoral researcher Chris Baldwin, JQI Fellow and Adjunct Professor Alexey Gorshkov and other JQI researchers are facing down the impact disorder has on speed limits. In an article published on June 22, 2023 in the journal PRX Quantum, they described novel methods for pulling insights from the mess created by disorder and identified new types of quantum speed limits that apply when disorder is present.

"We were motivated both by the beautiful theoretical problem of proving and saturating new speed limits and by the implications that our work would have on quantum computers that inevitably have some disorder," says Gorshkov, who is also a physicist at the National Institute of Standards and Technology and a Fellow of the Joint Center for Quantum Information and Computer Science.

Spin Bucket Brigade: A chain of quantum spins can pass information down a line like a bucket brigade, but sometimes disorder (represented here by the red hand and bucket) can slow down the communication. The arrows in the spheres in the buckets are a geometrical representation of a quantum state. (Credit: Sean Kelley/NIST)

Baldwin, Gorshkov and colleagues began by tackling the case of a one-dimensional line of particles, where each particle can only directly interact with its neighbors. They specifically focused on the spin—a quantum property related to magnetism—of each quantum particle. A spin is like a compass needle that wants to point along a magnetic field, but, being quantum, it can point in more than one direction at a time—a phenomenon called superposition.

Spins pass information to each other through interactions, so a line of spins can act like a bucket brigade passing quantum information: One jiggles its neighbor, which jiggles its neighbor on the other side, and the information makes its way down the line.

But if something is slightly off about a spin’s connection to a neighbor—there’s some disorder—the spin will fumble handing over the quantum data and slow things down. With their imperfect handovers, spins resemble people in a bucket brigade each working at a slightly different speed. Most people probably take a similar amount of time to pass a bucket, maybe clustered around a couple of seconds. But if enough random people are pulled in, a few speed demons may only need a second while others might take five seconds or more.

To account for the full range of possibilities in quantum systems, Baldwin and colleagues didn’t limit themselves to a fixed set of possible speeds. Instead, they analyzed distributions of speeds that stretch from the quickest transfers, achieved over ideal connections between spins, all the way down to arbitrarily slow ones, including the slight chance of a handoff taking millennia or longer over an especially bad connection.
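To get a feel for such a distribution, here is a toy simulation, not taken from the paper: the Pareto distribution, the shape parameter and the chain length are all illustrative assumptions. It models a bucket brigade whose per-link handoff times are usually close to the ideal minimum but occasionally come out very long, so rare slow links dominate the total travel time.

```python
import random

def chain_transfer_time(n_links, alpha=1.5, seed=0):
    """Total time for information to traverse a disordered chain.

    Each link's handoff time is drawn from a heavy-tailed Pareto
    distribution (minimum 1, shape alpha), so most handoffs are quick
    but a few rare links are dramatically slower.
    """
    rng = random.Random(seed)
    return sum(rng.paretovariate(alpha) for _ in range(n_links))

# An ideal chain, where every handoff takes exactly the minimum time:
ideal = 1.0 * 1000
disordered = chain_transfer_time(1000)
print(f"ideal total: {ideal:.0f}, disordered total: {disordered:.0f}")
```

The disordered total always exceeds the ideal one, and for heavy-tailed choices of `alpha` a handful of the slowest links account for most of the excess.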

In future quantum computers, experts expect millions of spins to work together. With so many spins, even long odds of any individual being a slowpoke can combine into a safe bet that one, or even several, will be present.
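The arithmetic behind that "safe bet" is simple: if each link independently has even a tiny chance of being a slowpoke, the probability that none of millions of links is one shrinks exponentially. A minimal sketch, where the one-in-a-million figure and the link count are illustrative assumptions:

```python
def prob_at_least_one(p_slow, n_links):
    """Chance that at least one of n_links is a 'slowpoke',
    assuming each link is independently slow with probability p_slow."""
    return 1 - (1 - p_slow) ** n_links

# Even a one-in-a-million chance per link becomes a near-certainty
# across millions of links:
print(prob_at_least_one(1e-6, 5_000_000))  # ≈ 0.993
```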

To make sense of the sea of possibilities presented by disorder’s influence on handoff speeds, Baldwin and colleagues pulled out the tools of probability theory. These tools allowed them to glean information about speed limits from the statistics of how transfer speeds are peppered throughout the line. With probability theory, they derived new speed limits for whole groups of spin chains based on the big picture without needing to know anything about the links between any particular spins.

The team was particularly interested in investigating whether the speeds information can reach in different systems depend on the distance it is traveling. Some physical processes, like light traveling through space, resemble a car steadily cruising down an empty highway, where the travel time is directly proportional to the distance: it takes twice as long to move twice as far. But the speeds of other processes, like perfume diffusing through a room, don’t have such a straightforward proportional behavior and can look more like a flagging runner who takes longer and longer the farther they push themselves. Knowing the relationship between speed and distance for quantum information is valuable when researchers are weighing their options for scaling up quantum computers.
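The two travel-time behaviors in the analogy can be sketched as simple scaling laws. This is a toy comparison, not the paper's bounds: linear time for steady, ballistic propagation, and, as one common example of a deteriorating pace, quadratic time for diffusive spreading.

```python
def ballistic_time(d, v=1.0):
    """Cruising car: travel time grows linearly with distance."""
    return d / v

def diffusive_time(d, D=1.0):
    """Diffusive spreading: travel time grows with distance squared."""
    return d ** 2 / D

for d in (10, 20, 40):
    print(f"d={d:3d}  ballistic={ballistic_time(d):6.1f}"
          f"  diffusive={diffusive_time(d):8.1f}")
```

Doubling the distance doubles the ballistic time but quadruples the diffusive one, which is why the distinction matters so much when scaling a device up.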

With their new results, the researchers determined that information can’t always propagate at a steady speed indefinitely, and they identified the border between conditions that allow a steady speed and those that only allow a deteriorating pace.

They also found that their method allowed them to define two distinct types of limits that tell them different things. One they dubbed “almost always bounds” because the bounds hold for almost all the sections of a chain. These limits apply to any sufficiently long stretch of spins even though they might occasionally be violated for small sections of the chain—like if there is an unusual clump of speed demons in the brigade. These limits allow researchers to guarantee conditions, like that a particular spin won’t be disturbed by activity further down the line within a particular time window.

The researchers called the second type of limit “infinitely often bounds” because they are guaranteed to apply to some stretches of an infinite chain, but there is no guarantee that they hold for any particular stretch, no matter how long a section is considered. These limits are expected to occasionally pop up along the chain and to dip below the ceiling set by the almost always bounds, like a car occasionally entering a work zone on the highway. Having an idea of these lower speed limits helps researchers judge the reasonable minimum amount of time to dedicate to getting the bucket all the way across a stretch of the brigade.

The newly defined limits allowed the team to resolve a lingering discrepancy: The existing Lieb-Robinson bounds set a higher ceiling than any information transfer protocol had reached. The mismatch could have meant either that researchers weren’t creative enough in designing the protocols or that they had failed to account for something that enforced a lower limit. Accounting for disorder more carefully dropped the theoretical ceiling down to match the speed of existing protocols.

“For a while, we had this gap,” Baldwin says. “The main exciting thing of this work was figuring out how we could completely close this gap.”

The researchers say there is further work to be done exploring the applications of these limits and determining when the two types of bounds have significant impacts.

“The main direction I want to take this going forward is going beyond one dimension,” Baldwin says. “My suspicion is that the picture will end up looking very different, but I think it's still worth having this one-dimensional case in mind when we start to do that.”

Original story by Bailey Bedford: https://jqi.umd.edu/news/novel-quantum-speed-limits-tackle-messy-reality-disorder

In addition to Baldwin and Gorshkov, authors on the publication included UMD graduate student Adam Ehrenberg and former UMD graduate student Andrew Guo.

About the Research

Reference Publication
C. Baldwin, A. Ehrenberg, A. Y. Guo, and A. V. Gorshkov, “Disordered Lieb-Robinson bounds in one dimension,” PRX Quantum 4, 020349 (2023).

Related Articles

New Approach to Information Transfer Reaches Quantum Speed Limit

New Quantum Information Speed Limits Depend on the Task at Hand

New Perspective Blends Quantum and Classical to Understand Quantum Rates of Change

UMD Researchers Study the Intricate Processes Underpinning Gene Expression

A new study led by University of Maryland physicists sheds light on the cellular processes that regulate genes. Published in the journal Science Advances, the paper explains how the dynamics of a polymer called chromatin—the structure into which DNA is packaged—regulate gene expression.

Through the use of machine learning and statistical algorithms, a research team led by Professor Arpita Upadhyaya and National Institutes of Health Senior Investigator Gordon Hager discovered that chromatin can switch between a lower and higher mobility state within seconds. The team found that the extent to which chromatin moves inside cells is an overlooked but important process, with the lower mobility state being linked to gene expression.

Notably, transcription factors (TFs)—proteins that bind specific DNA sequences within the chromatin polymer and turn genes on or off—exhibit the same mobility as that of the piece of chromatin they are bound to. In their study, the researchers analyzed a group of TFs called nuclear receptors, which are targeted by drugs that treat a variety of diseases and conditions.

“The nuclear receptors in our study are important therapeutic targets for breast cancer, prostate cancer and diabetes,” explained the study’s first author, Kaustubh Wagh (Ph.D. ’23, physics). “Understanding their basic mechanism of action is essential to establish a baseline for how these proteins function.”

As a result, these findings could have broad applications in medicine.

On the move

The genetic information that children inherit from their parents is contained in DNA—the set of instructions for all possible proteins that cells can make. A DNA molecule is about 2 meters in length when stretched from end to end, and it must be compacted 100,000 times in a highly organized manner to fit inside a cell’s nucleus. To achieve this, DNA is packaged into chromatin in the nucleus of a cell, but that bundle of genetic material doesn’t stay stationary.
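A quick back-of-envelope check of that compaction figure, using only the numbers stated above (the comparison to a typical nucleus, roughly 10 micrometers across, is an assumed rough value):

```python
dna_length_m = 2.0           # DNA stretched end to end, as stated above
compaction_factor = 100_000  # degree of compaction, as stated above

# Size of the packed genetic material, in micrometers:
packed_size_um = dna_length_m / compaction_factor * 1e6
print(f"{packed_size_um:.0f} micrometers")  # 20 micrometers, nuclear scale
```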

“We know that how the genome is organized in the nucleus of our cells has profound consequences for gene expression,” Wagh said. “However, an often-overlooked fact is that chromatin is constantly moving around inside the cell, and this mobility may have important consequences for gene regulation.”

Researchers discovered that chromatin can dynamically switch between two states of mobility: state 1, in which chromatin moves a shorter distance (shown in red font on the right), and state 2 (shown in blue font on the left).

The research team—including collaborators from the National Cancer Institute, the University of Buenos Aires and the University of Southern Denmark—showed that chromatin switches between two distinct mobility states: a lower one (state 1) and a higher one (state 2). Earlier theories suggested that different parts of the nucleus had fixed chromatin mobilities, but the researchers demonstrated that chromatin is much more dynamic.

“Previous studies have proposed that different chromatin mobility states occupy distinct regions of the cell nucleus. However, these studies were performed on a sub-second timescale,” said Upadhyaya, who holds a joint appointment in the Institute for Physical Science and Technology. “We extend this model by showing that on longer timescales, the chromatin polymer can locally switch between two mobility states.”

The researchers found that transcriptionally active TFs preferred to bind to chromatin in state 1. They were also surprised to discover that TF molecules in a lower mobility state bound for longer periods of time, likely affecting gene regulation.

Finding a raft in the ocean

This study advances scientists’ understanding of chromatin dynamics and gene expression. The researchers will use their framework to study how mutations affect the function of TFs, which can offer insight into the onset of various diseases.

“We are now in a position to answer whether a particular disease phenotype occurs due to the TF binding for too much or too little time, or not binding in the right chromatin state,” Wagh said.

The team also plans to investigate how TFs achieve the challenging feat of finding their targets. TFs target a specific base pair sequence of DNA, and only by finding and binding this sequence can they recruit other proteins to activate nearby genes.

“A TF finding its target site is like finding a single raft in the middle of the ocean,” Upadhyaya said. “It’s a miracle it even happens, and we plan to figure out how.”

###

Their paper, “Dynamic switching of transcriptional regulators between two distinct low-mobility chromatin states,” was published in Science Advances on June 14, 2023.

This work was supported by the National Institutes of Health (Award No. R35 GM145313), National Cancer Institute Intramural Program, NCI-UMD Partnership for Integrative Cancer Research, Center for Cancer Research, National Science Foundation (Award Nos. NSF MCB 2132922 and NSF PHY 1915534), Vissing Foundation, William Demant Foundation, Knud Højgaard Foundation, Frimodt-Heineke Foundation, Director Ib Henriksen Foundation, Ove and Edith Buhl Olesen Memorial Foundation, Academy of Finland, Cancer Foundation Finland, Sigrid Jusélius Foundation, Villum Foundation (Award No. 73288), Independent Research Fund Denmark (Award No. 12-125524), Danish National Research Foundation (Award No. 141) to the Center for Functional Genomics and Tissue Plasticity, CONICET and the Agencia Nacional de Programación Científica y Tecnológica (Award Nos. 2019-0397 and PICT 2018-0573). This story does not necessarily reflect the views of these organizations.

This article is adapted from text provided by Kaustubh Wagh. Originally published here: https://cmns.umd.edu/news-events/news/umd-researchers-study-intricate-processes-underpinning-gene-expression

Media Relations Contact: Emily Nunez
301-405-9463

 

Crystal Imperfections Reveal Rich New Phases of Familiar Matter

Matter—all the stuff we see around us—can be classified into familiar phases: our chairs are solid, our coffee is liquid, and the oxygen we breathe is a gas. This grouping obscures the nitty-gritty details of what each molecule or atom is up to and reduces all that complexity down to a few main features that are most salient in our everyday lives.

But those are not the only properties of matter that matter. Focusing on solids, physicists have found that they can group things according to symmetries. For example, atoms in solids arrange themselves into repeating patterns, forming crystals that can be grouped according to whether they look the same left and right, up and down, rotated about, and more. In the 1980s, physicists discovered a new paradigm: In addition to symmetries, solids can be classified using topology—a field of math that does for geometrical shapes the same kind of thing that symmetries do for crystalline solids. All the shapes without holes (a ball, a pizza) are in the same topological “phase,” while those with one hole (a donut, a coffee mug) are in a different “phase,” and so on with each new hole.

Within physics, topology doesn’t usually refer to the shape a piece of metal is cut into. Rather, the topology of how electrons are arranged inside a crystal provides information about the material’s electrical conductance and other properties. Now, theorists at the Joint Quantum Institute have found that these same crystals hide a richer set of topological phases than previously thought. In two separate works, they revealed a host of possible topological phases that become apparent when two different kinds of defects develop in crystals, or when they study the twirling properties of the electronic arrangement. They published their findings in the journal Physical Review X on July 14, 2023 and in the journal Physical Review Letters in Dec. 2022.

“Condensed matter physics is about understanding all the properties of phases of matter,” says Naren Manjunath, a graduate student at JQI and an author on both results. “I think that our work is really expanding our understanding of how to define new topological properties and how to characterize phases of matter better.”

Topology was first recognized as an important matter-classification tool after the discovery of the quantum Hall effect in the 1980s. When thin sheets of certain materials are pierced by a strong magnetic field, the electrons inside the materials spin around in circles—the larger the magnetic field the tighter their turns. Once the circles get small enough, quantum mechanics kicks in and dictates that the size of the circles can only have certain discrete values (the “quantum” in the quantum Hall effect). As the magnetic field is increased, nothing changes for a while—there is a plateau. Then, when the field gets large enough, electrons suddenly hop into a tighter orbit—an abrupt, step-wise change.

This jump from one radius of a spinning orbit to another can be thought of as a change in the topological phase—the geometry of the electron motion in the material switches. This sudden hopping is extremely precise, and it results in abrupt jumps in the electrical conductivity of the metal sheet, making the topological phase easy to measure experimentally.

Hofstadter butterfly (Adapted from Osadchy and Avron, J. Math. Phys. 42, 5665–5671 (2001), https://doi.org/10.1063/1.1412464)

Even more interesting things would happen if the magnetic field in the quantum Hall effect were cranked up so high that the electron orbitals became about as small as the atomic spacing in the crystal. There, electrons arrange themselves into different topological phases that depend on how many electrons were around in the first place and on the magnetic field piercing each little bit of the crystal. A color-coded plot of conductivity as a function of the electron density and the magnetic field appears as a winged butterfly, called the Hofstadter butterfly after Douglas Hofstadter, the theoretical physicist who first studied this model.

“We're furthering this program of trying to find all possible quantized numbers that could be associated with phases of matter,” says JQI Fellow and Associate Professor Maissam Barkeshli, a principal investigator on the work. “And this is a long-term program and we made a lot of progress on it recently.”

Manjunath, Barkeshli, and their collaborators found that there may be more intricate details hiding in the Hofstadter butterfly’s wings than previously thought. Some spots on the butterfly might have the same color, and therefore the same topological phase in the original treatment, and yet behave differently from each other in important ways.

These extra distinguishing features are always present, but they become most obvious when the crystal develops defects—little mistakes in its otherwise perfectly regular pattern. The way electrons behave around this defect would differ depending on the underlying topological phase. And different defects can help uncover different kinds of phases.

The team first studied an imperfection called a disclination, which occurs when a piece of the crystal is taken out and the remaining atoms are stitched back together, as seen in the diagram below. The researchers found that electric charge tends to cluster around this defect. And how much charge pops up at the defect depends on a new quantity, which the team called the shift. Much like the size of electron orbits in the quantum Hall effect, the shift is quantized: It can only be an integer or a half-integer. A different value of the shift corresponds to a different phase of matter. The electric charge appearing at a disclination would be a multiple of this shift, which, weirdly enough, could even be a fraction of a single electron’s charge. They published the results of their theoretical investigation in the journal Physical Review Letters in December 2022.

After disclinations, the team focused their attention on another kind of imperfection called a dislocation. Unlike a disclination, no atoms are missing in a dislocation. Instead, the connections between atoms in a crystal are rewired in a different order. Instead of being connected to its closest neighbor, one or more of the atoms bonds with the next atom over, creating a skewed ladder of links.

Dislocations turned out to have another quantized quantity associated with them, this time named a quantized polarization. Inside a perfectly regular crystal, every tiny square of the lattice may hide a bit of charge polarization—one side becomes somewhat positively charged while the other side becomes a bit negatively charged. This polarization is hard to spot. If a dislocation is introduced, however, the researchers found that one side of this polarized charge gets trapped in the defect, revealing the extent of the polarization. Exactly how polarized they would become depended directly on the underlying quantized polarization. The team published this result in the journal Physical Review X.

Each of these quantities—the shift and the quantized polarization—has consequences even without any defects. These consequences have to do with the way the electrons tend to twist around different points inside the crystal lattice. But these twists are tricky to find experimentally, while crystal defects offer a tangible way of measuring and observing these new quantities by trapping charges in their vicinity.

New butterflies, cousins of the original Hofstadter butterfly, pop up thanks to the shift and quantized polarization. Both can be plotted as a function of electron density and magnetic field and exhibit the same winged, fractal butterfly-like structure. The researchers believe more such quantities and their associated phases remain to be uncovered, associated with yet more defects. “Eventually we expect we will have a large number of beautiful colored butterfly figures,” Barkeshli says, “one for each of these topological properties.”

For now, testing these predictions experimentally seems just out of reach. The Hofstadter model requires such large magnetic fields that it cannot be realized in regular materials. Instead, people have resorted to simulating this model with synthetic lattices of atoms or photons, or in layered graphene structures. These synthetic lattices are not quite large enough to measure the charge distributions required, but with some engineering advances, they might be up to the task in the coming years. It may also be possible to create these lattices using small, noisy quantum computers that have already been built, or topological photonic systems.

“We only considered the Hofstadter model,” says Manjunath, “but you could measure the same thing for more exotic phases of matter. And some of those phases might actually have some applications in the very distant future.”

Original story by Dina Genkina: https://jqi.umd.edu/news/crystal-imperfections-reveal-rich-new-phases-familiar-matter

In addition to Manjunath and Barkeshli, authors on the publications included UMD graduate students Yuxuan Zhang and Gautam Nambiar.

About the Research

Reference Publications

 

 

New Study Identifies Mechanism Driving the Sun’s Fast Wind

The fastest winds ever recorded on Earth reached more than 200 miles per hour, but even those gusts pale in comparison to the sun’s wind.

In a paper published June 7, 2023 in the journal Nature, a team of researchers used data from NASA’s Parker Solar Probe to explain how the solar wind is capable of surpassing speeds of 1 million miles per hour. They discovered that the energy released from the magnetic field near the sun’s surface is powerful enough to drive the fast solar wind, which is made up of ionized particles—called plasma—that flow outward from the sun.

This illustration shows NASA’s Parker Solar Probe near the sun. Credit: NASA/Johns Hopkins APL/Steve Gribben.

James Drake, a Distinguished University Professor in the University of Maryland’s Department of Physics and Institute for Physical Science and Technology (IPST), co-led this research alongside first author Stuart Bale of UC Berkeley. Drake said scientists have been trying to understand solar wind drivers since the 1950s—and with the world more interconnected than ever, the implications for Earth are significant.

The solar wind forms a giant magnetic bubble, known as the heliosphere, that protects planets in our solar system from a barrage of high-energy cosmic rays that whip around the galaxy. However, the solar wind also carries plasma and part of the sun’s magnetic field, which can crash into Earth’s magnetosphere and cause disturbances, including geomagnetic storms.

These storms occur when the sun experiences more turbulent activity, including solar flares and enormous expulsions of plasma into space, known as coronal mass ejections. Geomagnetic storms are responsible for spectacular aurora light shows that can be seen near the Earth’s poles, but at their most powerful, they can knock out a city’s power grid and potentially even disrupt global communications. Such events, while rare, can also be deadly to astronauts in space.

“Winds carry lots of information from the sun to Earth, so understanding the mechanism behind the sun’s wind is important for practical reasons on Earth,” Drake said. “That’s going to affect our ability to understand how the sun releases energy and drives geomagnetic storms, which are a threat to our communication networks.”

Previous studies revealed that the sun’s magnetic field was somehow driving the solar wind, but researchers didn’t know the underlying mechanism. Earlier this year, Drake co-authored a paper which argued that the heating and acceleration of the solar wind is driven by magnetic reconnection—a process that Drake has dedicated his scientific career to studying.

The authors explained that the entire surface of the sun is covered in small “jetlets” of hot plasma that are propelled upward by magnetic reconnection, which occurs when magnetic fields pointing in opposite directions cross-connect. In turn, this triggers the release of massive amounts of energy.

“Two things pointing in opposite directions often wind up annihilating each other, and in this case doing so releases magnetic energy,” Drake said. “These explosions that happen on the sun are all driven by that mechanism. It’s the annihilation of a magnetic field.”

To better understand these processes, the authors of the new Nature paper used data from the Parker Solar Probe to analyze the plasma flowing out of the corona—the outermost and hottest layer of the sun. In April 2021, Parker became the first spacecraft to enter the sun’s corona and has been nudging closer to the sun ever since. The data cited in this paper was taken at a distance of 13 solar radii, or roughly 5.6 million miles from the sun.

“When you get very close to the sun, you start seeing stuff that you just can’t see from Earth,” Drake said. “All the satellites that surround Earth are 210 solar radii from the sun, and now we’re down to 13. We’re about as close as we’re going to get.”

Using this new data, the Nature paper authors provided the first characterization of the bursts of magnetic energy that occur in coronal holes, which are openings in the sun’s magnetic field as well as the source of the solar wind.

The researchers demonstrated that magnetic reconnection between open and closed magnetic fields—known as interchange reconnection—is a continuous process, rather than a series of isolated events as previously thought. This led them to conclude that the rate of magnetic energy release, which drives the outward jet of heated plasma, was powerful enough to overcome gravity and produce the sun’s fast wind.

By understanding these smaller releases of energy that are constantly occurring on the sun, researchers hope to understand—and possibly even predict—the larger and more dangerous eruptions that launch plasma out into space. In addition to the implications for Earth, findings from this study can be applied to other areas of astronomy as well.

“Winds are produced by objects throughout the universe, so understanding what drives the wind from the sun has broad implications,” Drake said. “Winds from stars, for example, play a crucial role in shielding planetary systems from galactic cosmic rays, which can impact habitability.”

This would not only aid our understanding of the universe, but possibly also the search for life on other planets.

###

In addition to Drake, Marc Swisdak, a research scientist in UMD’s Institute for Research in Electronics and Applied Physics, co-authored this study.

Their paper, “Interchange reconnection as the source of the fast solar wind within coronal holes,” was published in Nature on June 7, 2023. 

This study was supported by NASA (Contract No. NNN06AA01C). This story does not necessarily reflect the views of this organization.

 

Original story by Emily C. Nunez: https://cmns.umd.edu/news-events/news/new-study-identifies-mechanism-driving-suns-fast-wind

Insight into How Cells Get Signals from Physical Senses Could Lead to New Disease Treatments

The body’s cells are constantly receiving and reacting to signals from their environment. A lot is known about how a cell senses and responds to chemical signals, or biomolecules, such as the virus that causes COVID-19. But little is known about how signals from the physical environment, like touch, temperature or light, direct a cell’s activity. Understanding that process could lead to new ways of treating cancer and other diseases.

Image showing how the red mechano-chemical waves (actin waves) guide the signaling molecules (green). Image courtesy of UMD MURI team.

A new study published May 1, 2023 in the Proceedings of the National Academy of Sciences by a University of Maryland-led Multidisciplinary University Research Initiative (MURI) funded by the Air Force Office of Scientific Research has opened the door to seeing how cells react to physical signals.

“We elucidated a cell's sense of touch,” said Professor Wolfgang Losert, a team leader of the study. “We think how cells sense the physical environment may be quite distinct from how they sense the chemical environment. This may help us develop new treatment options for conditions that involve altered physical cellular environments, such as tumors, immune disease and wound healing.”

A major difference between chemical and physical signals is size. Chemical signals are 100,000 times smaller than the width of a human hair. Physical cues are the heavyweights in the ring.

“We looked at how cells sense crucial physical cues from their environment that are on the order of 100 times larger than chemical signaling molecules,” said Losert, who also has a joint appointment in UMD’s Institute for Physical Science and Technology (IPST).

“We’re really answering a kind of long-standing mystery of how cells react to cues in their environment that are on a physical rather than chemical-size scale,” said paper co-author and MURI team member John T. Fourkas, a professor in UMD’s Department of Chemistry and Biochemistry with a joint appointment in IPST.

The MURI team studied the major players in a cell’s interaction with its physical environment: the cytoskeleton, a network of proteins that surrounds a cell and acts as a direct sensor of the physical environment; actin, the protein that keeps cells connected; and the cell’s signaling pathways.

Qixin Yang (Ph.D. ’22, physics), who led the experiments and analysis for her Ph.D. research at UMD, said, “I think our work related to the cytoskeleton shows that it plays an important role in sensing physical cues, like pain.”

The MURI team found that the networks that guide cell migration are upstream for chemical sensing and downstream for physical, topographic sensing; and that actin is the direct sensor for both types of signals.

“I think this is the first real crucial confirmation that actin itself is the sensor and that the waves are really where they are in the sensing pathway, not way downstream, but up front and center,” Fourkas said.

“Our findings reveal that, in much the same way that patterns of waves in the ocean allow an expert surfer to understand the undersea topography, the so-called ‘mechano-chemical’ waves in cells are key in sensing signals from their physical environment that are much larger than single proteins,” Losert said. “That has implications for how you might design physical interventions to change the behavior of cells.”

For instance, previous research by a co-author of this study, Peter Devreotes of Johns Hopkins University, found that actin dynamics were different for cancer cells considered most invasive.

“Understanding how drugs impact waves is an important additional piece of information that may be used in making decisions on treatment options,” Losert said. “I see our study also providing pointers on how you can improve the ability of immune cells to be guided to their target.”

The MURI team is made up of researchers in physics, chemistry, biology, bioengineering and dermatology from the University of Maryland and several other institutions.

###

In addition to Losert, Fourkas and Yang, UMD chemistry graduate student Matt Hourwitz was a co-author of the paper.

The paper, “Nanotopography modulates intracellular excitable systems through cytoskeleton actuation," was published in PNAS on May 1, 2023.

This research was supported by the Air Force Office of Scientific Research (Award No. FA9550-16-1-0052). This story does not necessarily reflect the views of this organization.

Original story by Ellen Ternes: https://cmns.umd.edu/news-events/news/afosr-muri-insight-how-cells-get-signals-physical-senses-could-lead-new-disease-treatments