Stanford Engineers Present New Chip That Ramps Up AI Computing Efficiency

  • August 22, 2022
  • liwaiwai.com

AI-powered edge computing is already pervasive in our lives. Devices like drones, smart wearables, and industrial IoT sensors are equipped with AI-enabled chips so that computing can occur at the “edge” of the internet, where the data originates. This allows real-time processing and helps keep data private.

The NeuRRAM chip is not only twice as energy efficient as state-of-the-art chips, it’s also versatile and delivers results that are just as accurate as conventional digital chips. (Image credit: David Baillot/University of California San Diego.)

 



However, AI functionalities on these tiny edge devices are limited by the energy provided by a battery. Therefore, improving energy efficiency is crucial. In today’s AI chips, data processing and data storage happen at separate places – a compute unit and a memory unit. The frequent data movement between these units consumes most of the energy during AI processing, so reducing the data movement is the key to addressing the energy issue.
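To put rough numbers on this bottleneck, here is an illustrative back-of-the-envelope sketch. The per-operation energies are representative order-of-magnitude figures from the computer-architecture literature (roughly a 45 nm process), not measurements of NeuRRAM or any specific chip:

```python
# Illustrative per-operation energies in picojoules (order-of-magnitude
# literature figures; actual values vary widely by process and design).
MAC_PJ = 3.2          # one 32-bit multiply-accumulate in the compute unit
SRAM_READ_PJ = 5.0    # one 32-bit read from on-chip SRAM
DRAM_READ_PJ = 640.0  # one 32-bit read from off-chip DRAM

def layer_energy_pj(macs, weight_reads, weights_offchip=True):
    """Energy (pJ) for one layer: arithmetic plus weight traffic."""
    read_cost = DRAM_READ_PJ if weights_offchip else SRAM_READ_PJ
    return macs * MAC_PJ + weight_reads * read_cost

# A fully connected layer with a 1000 x 1000 weight matrix, one read per weight:
macs = 1000 * 1000
offchip = layer_energy_pj(macs, weight_reads=macs, weights_offchip=True)
onchip = layer_energy_pj(macs, weight_reads=macs, weights_offchip=False)
print(f"off-chip weights: {offchip / 1e6:.1f} uJ")
print(f"on-chip  weights: {onchip / 1e6:.1f} uJ")
print(f"fraction of off-chip energy spent moving data: "
      f"{macs * DRAM_READ_PJ / offchip:.1%}")
```

Under these assumed figures, fetching weights from off-chip memory dominates the energy budget by roughly two orders of magnitude, which is exactly the overhead compute-in-memory designs aim to eliminate.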

Stanford University engineers have come up with a potential solution: a novel resistive random-access memory (RRAM) chip that does the AI processing within the memory itself, thereby eliminating the separation between the compute and memory units. Their “compute-in-memory” (CIM) chip, called NeuRRAM, is about the size of a fingertip and does more work with limited battery power than what current chips can do.

“Having those calculations done on the chip instead of sending information to and from the cloud could enable faster, more secure, cheaper, and more scalable AI going into the future, and give more people access to AI power,” said H.-S. Philip Wong, the Willard R. and Inez Kerr Bell Professor in the School of Engineering.

“The data movement issue is similar to spending eight hours in commute for a two-hour workday,” added Weier Wan, a recent Stanford graduate who led the project. “With our chip, we are showing a technology to tackle this challenge.”

They presented NeuRRAM in a recent article in the journal Nature. While compute-in-memory has been around for decades, this chip is the first to demonstrate a broad range of AI applications directly on hardware rather than through simulation alone.


Putting computing power on the device

To overcome the data movement bottleneck, the researchers implemented what is known as compute-in-memory (CIM), a novel chip architecture that performs AI computing directly within memory rather than in separate compute units. The memory technology NeuRRAM uses is resistive random-access memory (RRAM), a type of non-volatile memory – memory that retains data even when power is off – that has begun appearing in commercial products. RRAM can store large AI models in a small area footprint and consumes very little power, making it well suited for small, low-power edge devices.
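Concretely, an RRAM crossbar performs a matrix-vector multiplication physically: weights are programmed as cell conductances, inputs are applied as row voltages, and by Ohm’s and Kirchhoff’s laws each column current sums the products. The sketch below is a simplified numerical illustration of that principle – not the NeuRRAM circuit itself – and it ignores real-device non-idealities such as wire resistance, noise, and limited conductance precision:

```python
import numpy as np

def crossbar_mvm(weights, inputs, g_min=1e-6, g_max=1e-4):
    """Simulate an idealized analog matrix-vector multiply on an RRAM crossbar.

    weights: (rows, cols) array in [-1, 1] -- the stored model weights
    inputs:  (rows,) activation vector, applied as row voltages
    Each signed weight maps to a PAIR of positive conductances
    (a "positive" and a "negative" column), a common differential scheme.
    """
    w = np.clip(weights, -1.0, 1.0)
    g_pos = g_min + (g_max - g_min) * np.clip(w, 0, None)   # positive part
    g_neg = g_min + (g_max - g_min) * np.clip(-w, 0, None)  # negative part
    v = np.asarray(inputs)                                  # row voltages
    i_pos = v @ g_pos   # Kirchhoff: each column current sums G*V contributions
    i_neg = v @ g_neg
    # Differential readout recovers the signed dot product (up to a scale)
    return (i_pos - i_neg) / (g_max - g_min)

rng = np.random.default_rng(0)
W = rng.uniform(-1, 1, (4, 3))
x = rng.uniform(0, 1, 4)
print(np.allclose(crossbar_mvm(W, x), x @ W))  # prints True
```

The design point worth noticing is that the multiply-accumulate happens in the physics of the array itself: no weight ever moves to a separate compute unit, which is where the energy savings come from.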

Even though the concept of CIM chips is well established, and the idea of implementing AI computing in RRAM isn’t new, “this is one of the first instances to integrate a lot of memory right onto the neural network chip and present all benchmark results through hardware measurements,” said Wong, who is a co-senior author of the Nature paper.

The architecture of NeuRRAM allows the chip to perform analog in-memory computation at low power and in a compact-area footprint. It was designed in collaboration with the lab of Gert Cauwenberghs at the University of California, San Diego, who pioneered low-power neuromorphic hardware design. The architecture also enables reconfigurability in dataflow directions, supports various AI workload mapping strategies, and can work with different kinds of AI algorithms – all without sacrificing AI computation accuracy.

To show the accuracy of NeuRRAM’s AI abilities, the team tested how it functioned on different tasks. They found that it’s 99% accurate at recognizing handwritten digits from the MNIST dataset, 85.7% accurate on image classification from the CIFAR-10 dataset, and 84.7% accurate on Google speech command recognition, and it showed a 70% reduction in image-reconstruction error on a Bayesian image recovery task.
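Matching digital accuracy is a notable result because analog compute must cope with imperfect, noisy weight storage. The toy sketch below – synthetic data and a hypothetical linear classifier, not the paper’s benchmarks – illustrates how perturbing stored weights, as imperfect RRAM conductances would, erodes classification accuracy:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic stand-in task: two Gaussian classes separated along the first
# axis, plus a linear classifier whose ideal weight vector points along it.
n = 500
X = np.vstack([rng.normal([-2.0, 0.0], 1.0, (n, 2)),
               rng.normal([+2.0, 0.0], 1.0, (n, 2))])
y = np.array([0] * n + [1] * n)
w_ideal = np.array([1.0, 0.0])

def accuracy(w):
    """Fraction of points the rule 'predict 1 if x @ w > 0' gets right."""
    return float(np.mean((X @ w > 0) == y))

clean = accuracy(w_ideal)

def mean_noisy_accuracy(sigma, trials=200):
    """Average accuracy when stored weights are perturbed by noise,
    emulating device-to-device variation in analog weight storage."""
    accs = [accuracy(w_ideal + rng.normal(0.0, sigma, 2))
            for _ in range(trials)]
    return float(np.mean(accs))

print(f"clean weights:            {clean:.3f}")
print(f"weight noise sigma=0.1:   {mean_noisy_accuracy(0.1):.3f}")
print(f"weight noise sigma=0.5:   {mean_noisy_accuracy(0.5):.3f}")
```

Even this crude model shows accuracy sliding as weight noise grows, which is why the hardware-software co-optimization the team describes below is needed to hold analog computation to digital-level accuracy.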


“Efficiency, versatility, and accuracy are all important aspects for broader adoption of the technology,” said Wan. “But to realize them all at once is not simple. Co-optimizing the full stack from hardware to software is the key.”

“Such full-stack co-design is made possible with an international team of researchers with diverse expertise,” added Wong.

Fueling edge computations of the future

Right now, NeuRRAM is a physical proof-of-concept but needs more development before it’s ready to be translated into actual edge devices.

But this combined efficiency, accuracy, and ability to do different tasks showcases the chip’s potential. “Maybe today it is used to do simple AI tasks such as keyword spotting or human detection, but tomorrow it could enable a whole different user experience. Imagine real-time video analytics combined with speech recognition all within a tiny device,” said Wan. “To realize this, we need to continue improving the design and scaling RRAM to more advanced technology nodes.”

“This work opens up several avenues of future research on RRAM device engineering, and programming models and neural network design for compute-in-memory, to make this technology scalable and usable by software developers,” said Priyanka Raina, assistant professor of electrical engineering and a co-author of the paper.

If successful, RRAM compute-in-memory chips like NeuRRAM have almost unlimited potential. They could be embedded in crop fields to do real-time AI calculations for adjusting irrigation systems to current soil conditions. Or they could turn augmented reality glasses from clunky headsets with limited functionality to something more akin to Tony Stark’s viewscreen in the Iron Man and Avengers movies (without intergalactic or multiverse threats – one can hope).


If mass-produced, these chips would be cheap enough, adaptable enough, and low-power enough to advance technologies already improving our lives, said Wong, such as medical devices that allow home health monitoring.

They could help address global societal challenges as well: AI-enabled sensors could play a role in tracking and responding to climate change. “By having these kinds of smart electronics that can be placed almost anywhere, you can monitor the changing world and be part of the solution,” Wong said. “These chips could be used to solve all kinds of problems from climate change to food security.”

Additional co-authors of this work include researchers from University of California San Diego (co-lead), Tsinghua University, University of Notre Dame, and University of Pittsburgh. Former Stanford graduate student Sukru Burc Eryilmaz is also a co-author. Wong is a member of Stanford Bio-X and the Wu Tsai Neurosciences Institute, and an affiliate of the Precourt Institute for Energy. He is also Faculty Director of the Stanford Nanofabrication Facility and the founding faculty co-director of the Stanford SystemX Alliance – an industrial affiliate program at Stanford focused on building systems.

This research was funded by the National Science Foundation Expeditions in Computing, SRC JUMP ASCENT Center, Stanford SystemX Alliance, Stanford NMTRI, Beijing Innovation Center for Future Chips, National Natural Science Foundation of China, and the Office of Naval Research.

 

 

Source: Stanford



Related Topics
  • Chip
  • CIM
  • NeuRRAM
  • Stanford