Machine Learning Pushes Quantum Computing Forward

  • March 24, 2020
  • admin

Researchers have created a machine learning framework to precisely locate atom-sized quantum bits in silicon. It’s a crucial step for building a large-scale silicon quantum computer, the researchers report. 

Here, Muhammad Usman and Lloyd Hollenberg of the University of Melbourne explain their research and what it means for the future of quantum computers:


Quantum computers are expected to offer tremendous computational power for complex problems—currently intractable even on supercomputers—in the areas of drug design, data science, astronomy, and materials chemistry, among others.

The high technological and strategic stakes mean major technology companies as well as ambitious start-ups and government-funded research centers are all in the race to build the world’s first universal quantum computer.

A map of electron wave function patterns, where the symmetry, brightness, and size of features are directly related to the position of a phosphorus atom in the silicon lattice. (Credit: M. Usman/U. Melbourne)

Qubits & Quantum Computers

In contrast to today’s classical computers, where information is encoded in bits (0 or 1), quantum computers process information stored in quantum bits (qubits). These are hosted by quantum mechanical objects like electrons, the negatively charged particles of an atom.

A qubit can also take one of two values but, unlike a classical bit, it can effectively be in both at the same time, a property known as quantum superposition. This offers a computational space that grows exponentially with the number of qubits.
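As a rough illustration (not from the article itself), the joint state of n qubits is described by 2^n complex amplitudes, which is where the exponential growth comes from. The short Python sketch below shows an equal superposition of a single qubit and how quickly the amplitude count grows; the specific numbers are illustrative only.

```python
import numpy as np

# A single qubit in an equal superposition of |0> and |1>:
# amplitudes 1/sqrt(2) each, so measuring gives 0 or 1 with probability 0.5.
qubit = np.array([1, 1], dtype=complex) / np.sqrt(2)
print(np.abs(qubit) ** 2)  # [0.5 0.5]

# The joint state of n qubits requires 2**n complex amplitudes,
# which is why the computational space grows exponentially with qubit count.
for n in (1, 10, 30, 50):
    print(f"{n} qubits -> {2**n:,} amplitudes")
```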

This unique data-crunching power is further boosted by entanglement, another magical property of quantum mechanics in which the state of one qubit can dictate the state of another qubit without any physical connection, making them all 1s, for example. Einstein famously called it “spooky action at a distance.”

Different research groups around the world are pursuing different kinds of qubits, each with its own benefits and limitations. Some qubits offer potential for scalability, while others come with very long coherence times, that is, the time for which quantum information can be robustly stored.

Qubits in silicon are highly promising as they offer both, making them one of the front-runner candidates for the design and implementation of a large-scale quantum computer architecture.

One way to implement large-scale quantum computer architecture in silicon is by placing individual phosphorus atoms on a two-dimensional grid.

The single- and two-qubit logical operations are controlled by a grid of nanoelectronic wires, bearing some resemblance to the classical logic gates of conventional microelectronic circuits. However, the key to this scheme is ultra-precise placement of the phosphorus atoms on the silicon grid.

What’s Holding Things Back?

Even with state-of-the-art fabrication technologies, placing phosphorus atoms at precise locations in the silicon lattice is a very challenging task. Small variations in their positions, of the order of one atomic lattice site, are often observed and can have a huge impact on the efficiency of two-qubit operations.

The problem arises from the ultra-sensitive dependence of the exchange interaction on the positions of the phosphorus atoms that host the electron qubits in silicon. The exchange interaction is a fundamental quantum mechanical property in which two subatomic particles, such as electrons, interact in real space when their wave functions overlap and form interference patterns, much like two traveling waves interfering on a water surface.

The exchange interaction between electrons on phosphorus atom qubits can be exploited to implement fast two-qubit gates, but any unknown variation in it can be detrimental to the accuracy of the quantum gate. Like logic gates in a conventional computer, quantum gates are the building blocks of a quantum circuit.

For phosphorus qubits in silicon, even an uncertainty in the location of the qubit atom of the order of one atomic lattice site can alter the corresponding exchange interaction by orders of magnitude, leading to errors in two-qubit gate operations.
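To get a feel for what “orders of magnitude” means here, the toy model below assumes a simple exponential decay of the exchange coupling with donor separation; the functional form, decay length, and numbers are illustrative assumptions, not the study’s actual exchange calculation.

```python
import numpy as np

J0 = 1.0   # coupling prefactor, arbitrary units (assumed)
a = 0.1    # assumed decay length, in units of the lattice spacing

def exchange(d):
    """Toy exchange coupling that decays exponentially with separation d."""
    return J0 * np.exp(-d / a)

d_target = 5.0                 # intended donor separation (lattice sites)
d_actual = d_target + 1.0      # placement error of a single lattice site

print(f"J at intended separation: {exchange(d_target):.3e}")
print(f"J one site further apart: {exchange(d_actual):.3e}")
print(f"ratio: {exchange(d_target) / exchange(d_actual):.1e}")  # ~2.2e4
```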

Such errors, accumulated over a large-scale architecture, may severely impede the efficiency of the quantum computer, diminishing any quantum advantage expected from the quantum mechanical properties of the qubits.

Pinpointing Qubit Atoms

So in 2016, we worked with researchers at the Center for Quantum Computation & Communication Technology at the University of New South Wales to develop a technique that could pinpoint the exact locations of phosphorus atoms in silicon.

The technique, reported in Nature Nanotechnology, was the first to use computed scanning tunneling microscope (STM) images of phosphorus atom wave functions to pinpoint their spatial locations in silicon.

The images were calculated using a computational framework that allowed electronic structure calculations to be performed on millions of atoms, utilizing Australia’s national supercomputer facilities at the Pawsey Supercomputing Centre.

These calculations produced maps of electron wave function patterns, where the symmetry, brightness, and size of features were directly related to the position of the phosphorus atom in the silicon lattice around which the electron was bound.

Because each donor atom position led to a distinct map, pinpointing of qubit atom locations, known as spatial metrology, was achieved with single-lattice-site precision.
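A minimal sketch of the spatial-metrology idea (all names, image sizes, and the matching metric below are hypothetical): each candidate donor position has a distinct computed image, so a measured image can be matched back to a lattice site by comparing it against a reference library.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical library of computed STM-style images, one per candidate
# lattice-site offset of the donor (96x96 pixels, random stand-ins here).
reference_images = {(dx, dy): rng.random((96, 96))
                    for dx in range(-1, 2) for dy in range(-1, 2)}

def pinpoint(measured_image):
    """Return the candidate position whose reference image is closest (L2 norm)."""
    return min(reference_images,
               key=lambda pos: np.linalg.norm(measured_image - reference_images[pos]))

# A "measured" image that is the (1, 0) reference plus a little noise.
measured = reference_images[(1, 0)] + 0.01 * rng.standard_normal((96, 96))
print(pinpoint(measured))  # (1, 0)
```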

The technique worked very well at the individual qubit level. However, the next big challenge was to build a framework that could perform this exact spatial pinpointing of atoms with high speed and minimal human interaction, meeting the requirements of a universal fault-tolerant quantum computer.

Machine Learning To The Rescue

Machine learning is an emerging area of research that is revolutionizing almost every field, from medical science to image processing, robotics, and materials design.

A carefully trained machine learning algorithm can process very large data sets with enormous efficiency.

One branch of machine learning is the convolutional neural network (CNN), an extremely powerful tool for image recognition and classification problems. When a CNN is trained on thousands of sample images, it can precisely recognize unknown images (even in the presence of noise) and classify them.

Recognizing that the principle underpinning the established spatial metrology of qubit atoms is essentially the recognition and classification of feature maps in STM images, we decided to train a CNN on the computed STM images. The work is published in the journal npj Computational Materials.
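As a rough sketch of the kind of model involved, a small convolutional classifier in PyTorch might look like the following. The layer sizes, image resolution, and number of candidate-position classes are assumptions for illustration, not the architecture reported in the paper.

```python
import torch
import torch.nn as nn

IMG_SIZE = 96        # assumed resolution of the computed STM-style images
NUM_POSITIONS = 16   # assumed number of candidate donor-position classes

class StmCnn(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),   # 96 -> 48
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),   # 48 -> 24
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 24 * 24, 128), nn.ReLU(),
            nn.Linear(128, NUM_POSITIONS),   # one logit per candidate position
        )

    def forward(self, x):
        return self.classifier(self.features(x))

model = StmCnn()
dummy_batch = torch.randn(4, 1, IMG_SIZE, IMG_SIZE)   # four fake images
print(model(dummy_batch).shape)                        # torch.Size([4, 16])
```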

The training involved 100,000 STM images and achieved a remarkable learning accuracy of above 99% for the CNN. We then tested the trained CNN on 17,600 test images, including the blurring and asymmetry noise typically present in realistic environments.

The CNN classified the test images with an accuracy of above 98%, confirming that this machine learning-based technique could process qubit measurement data with high throughput, high precision, and minimal human interaction.
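A minimal sketch of how noisy test images and a simple accuracy score might be produced; the blur-plus-additive-noise model below is an assumption standing in for the blurring and asymmetry noise described above, not the paper’s exact procedure.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(0)

def add_test_noise(image, blur_sigma=1.5, noise_level=0.05):
    """Assumed noise model: Gaussian blur plus additive pixel noise."""
    blurred = gaussian_filter(image, sigma=blur_sigma)
    return blurred + noise_level * rng.standard_normal(image.shape)

def accuracy(predicted, true):
    predicted, true = np.asarray(predicted), np.asarray(true)
    return float((predicted == true).mean())

clean = rng.random((96, 96))        # stand-in for a computed STM image
noisy = add_test_noise(clean)       # degraded version used for testing
print(noisy.shape)                  # (96, 96)
print(accuracy([3, 1, 2, 2], [3, 1, 2, 0]))  # 0.75
```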

This technique also has the potential to scale up for qubits consisting of more than one phosphorus atom, where the number of possible image configurations increases exponentially. However, a machine learning-based framework could readily accommodate any number of possible configurations.

In the coming years, as the number of qubits increases and the size of quantum devices grows, qubit characterization via manual measurements is likely to be highly challenging and onerous.

This work shows how machine learning techniques such as the one developed here could play a crucial role in this aspect of realizing a full-scale fault-tolerant universal quantum computer, the ultimate goal of the global research effort.

Source: University of Melbourne

Original Study DOI: 10.1038/s41524-020-0282-0
Source: Futurity
