  • Artificial Intelligence
  • Research

Our Progress Toward Quantum Error Correction

  • March 1, 2023
  • relay

Three years ago, our quantum computers were the first to demonstrate a computational task on which they outperformed the fastest supercomputers. It was a significant milestone on our roadmap toward building a large-scale quantum computer, and the “hello world” moment so many of us had been hoping for. Yet in the long arc of scientific progress, it was just one step toward making quantum applications meaningful to humanity.

Now, we’re taking another big step forward: For the first time ever, our Quantum AI researchers have experimentally demonstrated that it’s possible to reduce errors by increasing the number of qubits. In quantum computing, a qubit is a basic unit of quantum information that can take on richer states extending beyond just 0 and 1. Our breakthrough represents a significant shift in how we operate quantum computers. Instead of working on the physical qubits on our quantum processor one by one, we are treating a group of them as one logical qubit. As a result, a logical qubit that we made from 49 physical qubits was able to outperform one we made from 17 physical qubits. Nature is publishing our research today.

Here’s why this milestone is important: Our quantum computers work by manipulating qubits in an orchestrated fashion that we call quantum algorithms. The challenge is that qubits are so sensitive that even stray light can cause calculation errors — and the problem worsens as quantum computers grow. This has significant consequences, since the best quantum algorithms that we know for running useful applications require the error rates of our qubits to be far lower than we have today. To bridge this gap, we will need quantum error correction.
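To get a feel for why the error rates must be “far lower,” consider a toy calculation (the numbers below are illustrative assumptions, not figures from the paper): if each operation independently fails with probability p, the chance that an algorithm of N operations runs error-free is (1 − p)^N, which collapses quickly as N grows.

```python
# Toy model (assumed numbers): probability that an algorithm of N operations
# completes with no error, if each operation fails independently with
# probability p. Useful algorithms can require millions of operations.
p = 1e-3  # an optimistic per-operation error rate for today's hardware

for n_ops in (1_000, 100_000, 1_000_000):
    p_success = (1 - p) ** n_ops
    print(f"{n_ops:>9} operations: success probability ≈ {p_success:.3g}")
```

Even at a one-in-a-thousand error rate, a million-operation computation almost certainly fails, which is why error correction, rather than better hardware alone, is considered necessary.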


Quantum error correction protects information by encoding it across multiple physical qubits to form a “logical qubit,” and is believed to be the only way to produce a large-scale quantum computer with error rates low enough for useful calculations. Instead of computing on the individual qubits themselves, we will then compute on logical qubits. By encoding larger numbers of physical qubits on our quantum processor into one logical qubit, we hope to reduce the error rates to enable useful quantum algorithms.
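The scaling idea can be illustrated with the simplest error-correcting code of all, a classical repetition code with majority-vote decoding (the actual processor uses a surface code, but the intuition is the same): when the physical error rate p is small enough, adding more physical qubits makes a logical error less likely, because a logical error now requires a majority of the qubits to fail at once.

```python
from math import comb

def logical_error_rate(n: int, p: float) -> float:
    """Probability that a majority of n copies flip, i.e. that
    majority-vote decoding of an n-qubit repetition code fails,
    given independent per-qubit flip probability p."""
    return sum(comb(n, k) * p**k * (1 - p) ** (n - k)
               for k in range(n // 2 + 1, n + 1))

p = 0.01  # assumed physical error rate, below the code's break-even point
for n in (3, 5, 7):
    print(f"n={n}: logical error rate ≈ {logical_error_rate(n, p):.2e}")
```

With p = 0.01, each step from n = 3 to n = 5 to n = 7 suppresses the logical error rate by roughly an order of magnitude; above a threshold value of p, the trend reverses and larger codes make things worse, which is why demonstrating suppression on real hardware is the milestone.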

It’s the first time anyone has achieved this experimental milestone of scaling a logical qubit. We’ve been working towards this milestone and the ones ahead because quantum computers have the potential to bring tangible benefits to the lives of millions. Someday, we believe quantum computers will be used to identify molecules for new medicines, create fertilizer using less energy, design more efficient sustainable technologies from batteries to nuclear fusion reactors, and produce physics research that will lead to advances we can’t yet imagine. That’s why we’re working on eventually making quantum hardware, tools and applications available to customers and partners, including through Google Cloud, so that they can harness the power of quantum in new and exciting ways.

Helping others to realize the full potential of quantum will require us to achieve even more technical milestones in order to scale to thousands of logical qubits with low error rates. There’s a long road ahead — several components of our technology will need to be improved, from cryogenics to control electronics to the design and materials of our qubits. With such developments, large-scale quantum computers will come into clearer view. Developing quantum processors is also an excellent testbed for AI-assisted engineering as we explore the use of machine learning to improve our processes.


We are also taking steps to develop quantum computing responsibly, given its powerful potential. Our partnerships with governments and the security community are helping to create systems that can protect internet traffic from future quantum computer attacks. And we’re making sure services like Google Cloud, Android and Chrome remain safe and secure in a quantum future.

I am inspired by what quantum computing could mean for the future of our users, customers and partners, and the world. We’ll continue to work towards a day when quantum computers can work in tandem with classical computers to expand the boundaries of human knowledge and help us find solutions to some of the world’s most complex problems.

Source: Cyberpogo
