Pristine Quantum Light Source Created at the Edge of Silicon Chip

Researchers configure silicon rings on a chip to emit high-quality photons for use in quantum information processing. Credit: E. Edwards/JQI.

Protected Pathways for Light Offer a Way to Streamline Single Photon Production

The smallest amount of light you can have is one photon, so dim that it’s pretty much invisible to humans. While imperceptible, these tiny blips of energy are useful for carrying quantum information around. Ideally, every quantum courier would be the same, but there isn’t a straightforward way to produce a stream of identical photons. This is particularly challenging when individual photons come from fabricated chips.

Now, researchers at the Joint Quantum Institute (JQI) have demonstrated a new approach that enables different devices to repeatedly emit nearly identical single photons. The team, led by JQI Fellow Mohammad Hafezi, made a silicon chip that guides light around the device’s edge, where it is inherently protected against disruptions. Previously, Hafezi and colleagues showed that this design can reduce the likelihood of optical signal degradation. In a paper published online on Sept. 10 in Nature, the team explains that the same physics which protects the light along the chip’s edge also ensures reliable photon production.

Single photons, which are an example of quantum light, are more than just really dim light. This distinction has a lot to do with where the light comes from. “Pretty much all of the light we encounter in our everyday lives is packed with photons,” says Elizabeth Goldschmidt, a researcher at the US Army Research Laboratory and co-author on the study. “But unlike a light bulb, there are some sources that actually emit light, one photon at a time, and this can only be described by quantum physics,” adds Goldschmidt.

Many researchers are working on building reliable quantum light emitters so that they can isolate and control the quantum properties of single photons. Goldschmidt explains that such light sources will likely be important for future quantum information devices as well as further understanding the mysteries of quantum physics. “Modern communications relies heavily on non-quantum light,” says Goldschmidt. “Similarly, many of us believe that single photons are going to be required for any kind of quantum communication application out there.”

Scientists can generate quantum light using a natural color-changing process that occurs when a beam of light passes through certain materials. In this experiment the team used silicon, a common industrial choice for guiding light, to convert infrared laser light into pairs of different-colored single photons.
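In silicon, this color-changing process is known as spontaneous four-wave mixing: two photons from the pump beam are converted into a signal photon and an idler photon, and energy conservation fixes the relationship between the three colors. A minimal sketch of that bookkeeping (the specific wavelengths are illustrative assumptions typical of silicon photonics, not values from the paper):

```python
# Energy conservation in spontaneous four-wave mixing:
#   2 / lambda_pump = 1 / lambda_signal + 1 / lambda_idler
# The wavelengths below are illustrative assumptions, not values
# reported in the paper.

def idler_wavelength(pump_nm: float, signal_nm: float) -> float:
    """Idler wavelength (nm) fixed by energy conservation."""
    return 1.0 / (2.0 / pump_nm - 1.0 / signal_nm)

pump_nm = 1550.0     # infrared pump, a common silicon-photonics band
signal_nm = 1545.0   # one photon of the generated pair
print(f"idler: {idler_wavelength(pump_nm, signal_nm):.1f} nm")
# -> idler: 1555.0 nm; the pair straddles the pump in color
```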

They injected light into a chip containing an array of minuscule silicon loops. Under the microscope, the loops look like linked-up glassy racetracks. The light circulates around each loop thousands of times before moving on to a neighboring loop. Stretched out, the light’s path would be several centimeters long, but the loops make it possible to fit the journey in a space that is about 500 times smaller. The relatively long journey is necessary to get many pairs of single photons out of the silicon chip.
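To get a feel for those numbers, here is the back-of-the-envelope arithmetic with stand-in values (the ring circumference and round-trip count are assumptions for illustration; the article states only that the path is several centimeters and the footprint roughly 500 times smaller):

```python
# Illustrative numbers only, not measurements from the paper.
circumference_um = 60.0   # assumed circumference of one silicon loop
round_trips = 1_000       # assumed circulations before light moves on

path_cm = circumference_um * round_trips * 1e-4   # um -> cm
print(f"effective path: {path_cm:.1f} cm")        # -> 6.0 cm

footprint_um = path_cm / 500 * 1e4   # the ~500x compression from the article
print(f"footprint scale: {footprint_um:.0f} um")  # -> 120 um
```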

Such loop arrays are routinely used as single photon sources, but small differences between chips will cause the photon colors to vary from one device to the next. Even within a single device, random defects in the material may reduce the average photon quality. This is a problem for quantum information applications where researchers need the photons to be as close to identical as possible.

The team circumvented this issue by arranging the loops in a way that always allows the light to travel undisturbed around the edge of the chip, even if fabrication defects are present. This design not only shields the light from disruptions; it also restricts how single photons form within those edge channels. The loop layout essentially forces each photon pair to be nearly identical to the next, regardless of microscopic differences among the rings. The central part of the chip does not contain protected routes, and so any photons created in those areas are affected by material defects.

The researchers compared their chips to ones without any protected routes. They collected pairs of photons from the different chips, counting the number emitted and noting their color. They observed that their quantum light source reliably produced high-quality, single-color photons time and again, whereas the conventional chip’s output was more unpredictable.

“We initially thought that we would need to be more careful with the design, and that the photons would be more sensitive to our chip’s fabrication process,” says Sunil Mittal, a JQI postdoctoral researcher and lead author on the new study. “But, astonishingly, photons generated in these shielded edge channels are always nearly identical, regardless of how bad the chips are.”

Mittal adds that this device has one additional advantage over other single photon sources. “Our chip works at room temperature. I don’t have to cool it down to cryogenic temperatures like other quantum light sources, making it a comparatively very simple setup.”

The team says that this finding could open up a new avenue of research, which unites quantum light with photonic devices having built-in protective features. “Physicists have only recently realized that shielded pathways fundamentally alter the way that photons interact with matter,” says Mittal. “This could have implications for a variety of fields where light-matter interactions play a role, including quantum information science and optoelectronic technology.”

Written by D. Genkina and E. Edwards

Author Affiliations 

Sunil Mittal, Joint Quantum Institute, Institute for Research in Electronics and Applied Physics (IREAP)

Elizabeth Goldschmidt, Joint Quantum Institute and US Army Research Laboratory

Mohammad Hafezi, Joint Quantum Institute, UMD Electrical and Computer Engineering, IREAP and Department of Physics

Reference Publication

"A topological source of quantum light," S. MittalElizabeth A. GoldschmidtMohammad HafeziNature, , (2018)

Research Contact

S. Mittal

LHC Scientists Finally Detect Most Favored Higgs Decay

Candidate event showing the associated production of a Higgs boson and a Z boson, with the subsequent decay of the Higgs boson to a bottom quark and its antiparticle.

Scientists now know the fate of the vast majority of all Higgs bosons produced in the LHC.

Today at CERN, the Large Hadron Collider experiments ATLAS and CMS jointly announced the discovery of the Higgs boson transforming into bottom quarks as it decays. This is predicted to be the most common way for Higgs bosons to decay, yet was a difficult signal to isolate because it closely mimics ordinary background processes. This new discovery is a big step forward in the quest to understand how the Higgs enables fundamental particles to acquire mass.

After several years of refining their techniques and gradually incorporating more data, both experiments finally saw evidence of the Higgs decaying to bottom quarks that exceeds the 5-sigma threshold of statistical significance typically required to claim a discovery. Both teams found their results were consistent with predictions based on the Standard Model. UMD professors Alberto Belloni, Drew Baden, Sarah Eno, Nick Hadley and Andris Skuja are members of the CMS collaboration.

Higgs bosons are only produced in roughly one out of a billion LHC collisions and live only one-septillionth of a second before their energy is converted into a cascade of other particles. Because it is impossible to see Higgs bosons directly, scientists use these secondary particles to study the Higgs’ properties. Since its discovery in 2012, scientists have been able to identify only about thirty percent of all the predicted Higgs boson decays, while its decays to bottom quarks, which should occur sixty percent of the time, had not yet been observed.

“The Higgs boson is an integral component of our universe and theorized to give all fundamental particles their mass,” said Alberto Belloni of the University of Maryland. “But previously we had only directly seen the Higgs couplings to the tau lepton, and the W and Z bosons. Now we have seen the decay of the Higgs to a quark-antiquark pair. This measurement shows for the first time that the Higgs gives mass to a quark.”

The Higgs field is theorized to interact with all massive particles in the Standard Model, the best theory scientists have to explain the behavior of subatomic particles. But many scientists suspect that the Higgs could also interact with massive particles outside the Standard Model, such as dark matter. By finding and mapping the Higgs bosons’ interactions with known particles, scientists can simultaneously probe for new phenomena.

The next step is to increase the precision of these measurements so that scientists can study this decay mode with a much greater resolution and explore what secrets the Higgs boson might be hiding.

Further information:

ATLAS: https://atlas.cern/updates/press-statement/observation-higgs-boson-decay-pair-bottom-quarks

CMS: http://cms.cern/higgs-observed-decaying-b-quarks-submitted

JQI Scientists Monroe and Gorshkov are Part of a New, $15 Million NSF Quantum Computing Project

A fabricated trap that researchers use to capture and control atomic ion qubits (quantum bits). (Credit: K. Hudek/IonQ and E. Edwards/JQI)

NSF has announced a $15 million award to a collaboration of seven institutions including the University of Maryland. The goal: Build the world’s first practical quantum computer.

"Quantum computers will change everything about the technology we use and how we use it, and we are still taking the initial steps toward realizing this goal," said NSF Director France Córdova. "Developing the first practical quantum computer would be a major milestone. By bringing together experts who have outlined a path to a practical quantum computer and supporting its development, NSF is working to take the quantum revolution from theory to reality."

Dubbed the Software-Tailored Architecture for Quantum co-design (STAQ) project, the new effort seeks to demonstrate a quantum advantage over traditional computers within five years using ion trap technology.

The project is the result of a National Science Foundation Ideas Lab—a week-long, free-form exchange among researchers from a wide range of fields that aims to spawn creative, collaborative proposals to address a given research challenge. The result of each Ideas Lab is interdisciplinary research that is high-risk, high-reward, cutting-edge and unlikely to be funded through traditional grant mechanisms.

JQI Fellow Christopher Monroe will lead the team developing the hardware. JQI Fellow Alexey Gorshkov will be involved in the theory side of the collaboration.

Text for this news item was adapted from the Duke University and NSF press releases on the award.

RESEARCH CONTACT
Christopher Monroe
MEDIA CONTACT
Emily Edwards

Complexity Test Offers New Perspective on Small Quantum Computers

Simulating the behavior of quantum particles hopping around on a grid may be one of the first problems tackled by early quantum computers. (Credit: E. Edwards/JQI)

State-of-the-art quantum devices are not yet large enough to be called full-scale computers. The biggest comprise just a few dozen qubits—a meager count compared to the billions of bits in an ordinary computer’s memory. But steady progress means that these machines now routinely string together 10 or 20 qubits and may soon hold sway over 100 or more.

In the meantime, researchers are busy dreaming up uses for small quantum computers and mapping out the landscape of problems they’ll be suited to solving. A paper by researchers from the Joint Quantum Institute (JQI) and the Joint Center for Quantum Information and Computer Science (QuICS), published recently in Physical Review Letters, argues that a novel non-quantum perspective may help sketch the boundaries of this landscape and potentially even reveal new physics in future experiments.

The new perspective involves a mathematical tool—a standard measure of computational difficulty known as sampling complexity—that gauges how easy or hard it is for an ordinary computer to simulate the outcome of a quantum experiment. Because the predictions of quantum physics are probabilistic, a single experiment could never verify that these predictions are accurate. You would need to perform many experiments, just like you would need to flip a coin many times to convince yourself that you’re holding an everyday, unbiased nickel.

If an ordinary computer takes a reasonable amount of time to mimic one run of a quantum experiment—by producing samples with approximately the same probabilities as the real thing—the sampling complexity is low; if it takes a long time, the sampling complexity is high.
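As a toy version of the coin-flip point, the sketch below (the bias and flip counts are arbitrary choices for illustration) shows why a handful of runs proves nothing: distinguishing a fair coin from one biased by a small amount eps takes on the order of 1/eps² flips.

```python
import random

# Toy illustration: estimating a coin's bias from samples.
# A single flip tells you almost nothing; only the statistics of
# many samples reveal the underlying probabilities -- the same reason
# one run of a quantum experiment cannot verify its predictions.

def estimate_bias(p: float, flips: int) -> float:
    """Fraction of heads observed in `flips` samples of a p-biased coin."""
    return sum(random.random() < p for _ in range(flips)) / flips

for flips in (10, 1_000, 100_000):
    print(flips, round(estimate_bias(p=0.55, flips=flips), 3))
# With few flips the estimate scatters widely; only with many
# samples does a 0.55 bias become distinguishable from 0.5.
```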

Few expect that quantum computers wielding lots of qubits will have low sampling complexity—after all, quantum computers are expected to be more powerful than ordinary computers, so simulating them on your laptop should be hard. But while the power of quantum computers remains unproven, exploring the crossover from low complexity to high complexity could offer fresh insights about the capabilities of early quantum devices, says Alexey Gorshkov, a JQI and QuICS Fellow who is a co-author of the new paper.

“Sampling complexity has remained an underappreciated tool,” Gorshkov says, largely because small quantum devices have only recently become reliable. “These devices are now essentially doing quantum sampling, and simulating this is at the heart of our entire field.”

To demonstrate the utility of this approach, Gorshkov and several collaborators proved that sampling complexity tracks the easy-to-hard transition of a task that small- and medium-sized quantum computers are expected to perform faster than ordinary computers: boson sampling.

Bosons are one of the two families of fundamental particles (the other being fermions). In general, two bosons can interact with one another, but that’s not the case for the boson sampling problem. “Even though they are non-interacting in this problem, bosons are sort of just interesting enough to make boson sampling worth studying,” says Abhinav Deshpande, a graduate student at JQI and QuICS and the lead author of the paper.

In the boson sampling problem, a fixed number of identical particles are allowed to hop around on a grid, spreading out into quantum superpositions over many grid sites. Solving the problem means sampling from this smeared-out quantum probability cloud, something a quantum computer would have no trouble doing.
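Classically, the hard part is that each output amplitude for noninteracting bosons is the permanent of a submatrix of the single-particle transition matrix, and permanents are notoriously costly to evaluate. A minimal two-boson sketch (the 50/50 beam-splitter matrix is an illustrative stand-in, not the lattice-hopping dynamics studied in the paper):

```python
import itertools
import numpy as np

# For noninteracting bosons, the amplitude for particles entering
# modes `ins` to exit in modes `outs` is the permanent of the
# corresponding submatrix of the transition matrix U -- the quantity
# whose evaluation makes classical simulation hard.

def permanent(m: np.ndarray) -> complex:
    """Brute-force permanent; cost grows factorially with matrix size."""
    n = m.shape[0]
    return sum(
        np.prod([m[i, p[i]] for i in range(n)])
        for p in itertools.permutations(range(n))
    )

# Illustrative stand-in: a 50/50 beam splitter with one boson per input.
U = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
ins, outs = (0, 1), (0, 1)  # one boson in each mode, one out of each
amp = permanent(U[np.ix_(ins, outs)])
print(abs(amp) ** 2)  # -> 0.0: the two bosons always exit together
```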

Deshpande, Gorshkov and their colleagues proved that there is a sharp transition between how easy and hard it is to simulate boson sampling on an ordinary computer. If you start with a few well-separated bosons and only let them hop around briefly, the sampling complexity remains low and the problem is easy to simulate. But if you wait longer, an ordinary computer has no chance of capturing the quantum behavior, and the problem becomes hard to simulate.

The result is intuitive, Deshpande says, since at short times the bosons are still relatively close to their starting positions and not much of their “quantumness” has emerged. For longer times, though, there’s an explosion of possibilities for where any given boson can end up. And because it’s impossible to tell two identical bosons apart from one another, the longer you let them hop around, the more likely they are to quietly swap places and further complicate the quantum probabilities. In this way, the dramatic shift in the sampling complexity is related to a change in the physics: Things don’t get too hard until bosons hop far enough to switch places.

Gorshkov says that looking for changes like this in sampling complexity may help uncover physical transitions in other quantum tasks or experiments. Conversely, a lack of ramping up in complexity may rule out a quantum advantage for devices that are too error-prone. Either way, Gorshkov says, future results arising from this perspective shift should be interesting. “A deeper look into the use of sampling complexity theory from computer science to study quantum many-body physics is bound to teach us something new and exciting about both fields,” he says.

Story by Chris Cesare

Reference Publication
"Dynamical Phase Transitions in Sampling Complexity," Abhinav Deshpande, Bill Fefferman, Minh C. Tran, Michael Foss-Feig, Alexey V. Gorshkov, Phys. Rev. Lett., 121, 030501 (2018)
Research Contact: Abhinav Deshpande; Alexey Gorshkov

Original story: https://jqi.umd.edu/news/complexity-test-offers-new-perspective-on-small-quantum-computers

Chris Monroe Co-authors Piece on National Quantum Initiative - The Washington Times

Quantum technology harnesses the radical power of quantum systems — such as isolated atoms, photons and electrons — to transform how we process and communicate information. But that potential can be realized only if our nation’s resources are focused in a way that helps bring quantum research from the laboratory to the marketplace.
