System Brings Deep Learning To “Internet Of Things” Devices

  • November 14, 2020
  • liwaiwai.com
IoT devices
MIT researchers have developed a system, called MCUNet, that brings machine learning to microcontrollers. The advance could enhance the function and security of devices connected to the Internet of Things (IoT).

Deep learning is everywhere. This branch of artificial intelligence curates your social media and serves your Google search results. Soon, deep learning could also check your vitals or set your thermostat. MIT researchers have developed a system that could bring deep learning neural networks to new — and much smaller — places, like the tiny computer chips in wearable medical devices, household appliances, and the 250 billion other objects that constitute the “internet of things” (IoT).

The system, called MCUNet, designs compact neural networks that deliver unprecedented speed and accuracy for deep learning on IoT devices, despite limited memory and processing power. The technology could facilitate the expansion of the IoT universe while saving energy and improving data security.


The research will be presented at next month’s Conference on Neural Information Processing Systems. The lead author is Ji Lin, a PhD student in Song Han’s lab in MIT’s Department of Electrical Engineering and Computer Science. Co-authors include Han and Yujun Lin of MIT, Wei-Ming Chen of MIT and National Taiwan University, and John Cohn and Chuang Gan of the MIT-IBM Watson AI Lab.

The Internet of Things

The IoT was born in the early 1980s. Grad students at Carnegie Mellon University, including Mike Kazar ’78, connected a Coca-Cola machine to the internet. The group’s motivation was simple: laziness. They wanted to use their computers to confirm the machine was stocked before trekking from their office to make a purchase. It was the world’s first internet-connected appliance. “This was pretty much treated as the punchline of a joke,” says Kazar, now a Microsoft engineer. “No one expected billions of devices on the internet.”

Since that Coke machine, everyday objects have become increasingly networked into the growing IoT. That includes everything from wearable heart monitors to smart fridges that tell you when you’re low on milk. IoT devices often run on microcontrollers — simple computer chips with no operating system, minimal processing power, and less than one thousandth of the memory of a typical smartphone. So pattern-recognition tasks like deep learning are difficult to run locally on IoT devices. For complex analysis, IoT-collected data is often sent to the cloud, making it vulnerable to hacking.

“How do we deploy neural nets directly on these tiny devices? It’s a new research area that’s getting very hot,” says Han. “Companies like Google and ARM are all working in this direction.” Han is too.

With MCUNet, Han’s group codesigned two components needed for “tiny deep learning” — the operation of neural networks on microcontrollers. One component is TinyEngine, an inference engine that directs resource management, akin to an operating system. TinyEngine is optimized to run a particular neural network structure, which is selected by MCUNet’s other component: TinyNAS, a neural architecture search algorithm.
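
To make that division of labor concrete, here is a minimal Python sketch of the two-stage flow. Every name and number in it is hypothetical (MCUNet’s real interfaces and device budgets differ); it only shows the shape of the codesign: the search picks a network that fits a device’s budget, and the engine compiles in only the kernels that network uses.

```python
# A hypothetical sketch of MCUNet's two-component codesign flow.
# Names, numbers, and interfaces are illustrative, not the project's real API.
from dataclasses import dataclass

@dataclass
class DeviceBudget:
    flash_kb: int  # read-only storage shared by weights and code
    sram_kb: int   # working memory for activations

def tiny_nas_select(budget: DeviceBudget) -> list[str]:
    """Stand-in for TinyNAS: return the ops of a network sized to the budget."""
    # A real search optimizes accuracy under the budget; here we just pretend
    # a small convolutional network was selected for this device.
    return ["conv2d", "depthwise_conv2d", "relu", "avg_pool", "dense"]

def tiny_engine_compile(ops: list[str]) -> set[str]:
    """Stand-in for TinyEngine: emit only the kernels the network uses."""
    full_library = {"conv2d", "depthwise_conv2d", "relu", "avg_pool",
                    "max_pool", "dense", "softmax", "lstm"}
    return full_library & set(ops)  # dead-weight kernels never reach flash

budget = DeviceBudget(flash_kb=1024, sram_kb=320)  # assumed MCU-class budget
network_ops = tiny_nas_select(budget)
kernels = tiny_engine_compile(network_ops)
print(sorted(kernels))  # only the five kernels this model needs get compiled
```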

System-algorithm codesign

Designing a deep network for microcontrollers isn’t easy. Existing neural architecture search techniques start with a big pool of possible network structures based on a predefined template, then gradually narrow it down to one with high accuracy and low cost. While the method works, it’s not the most efficient. “It can work pretty well for GPUs or smartphones,” says Lin. “But it’s been difficult to directly apply these techniques to tiny microcontrollers, because they are too small.”

So Lin developed TinyNAS, a neural architecture search method that creates custom-sized networks. “We have a lot of microcontrollers that come with different power capacities and different memory sizes,” says Lin. “So we developed the algorithm [TinyNAS] to optimize the search space for different microcontrollers.” The customized nature of TinyNAS means it can generate compact neural networks with the best possible performance for a given microcontroller — with no unnecessary parameters. “Then we deliver the final, efficient model to the microcontroller,” says Lin.
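
TinyNAS itself is described in the group’s paper; as a rough illustration of the idea in that quote, the toy Python sketch below tailors the search space to one device by pruning candidates that would not fit its memory, then keeps the most promising survivor. The budget numbers, cost model, and accuracy proxy are all assumptions made up for the example.

```python
# A toy, hypothetical memory-constrained architecture search. It is not
# TinyNAS, but it shows the idea: shrink the search space to what fits a
# particular microcontroller, then pick the best candidate that remains.
import itertools

SRAM_LIMIT_KB = 320    # assumed activation budget of the target MCU
FLASH_LIMIT_KB = 1024  # assumed weight/code budget ("one megabyte of flash")

def estimate_cost_kb(width_mult: float, resolution: int) -> tuple[float, float]:
    """Crude stand-in cost model: activations grow with resolution^2 * width,
    weights grow with width^2. A real system profiles or measures this."""
    activation_kb = (resolution ** 2) * width_mult * 0.01
    weight_kb = (width_mult ** 2) * 800
    return activation_kb, weight_kb

def proxy_accuracy(width_mult: float, resolution: int) -> float:
    """Stand-in for an accuracy predictor: wider and sharper scores higher."""
    return width_mult * 50 + resolution * 0.2

best = None
for w, r in itertools.product([0.25, 0.5, 0.75, 1.0], [96, 128, 160, 224]):
    act_kb, wt_kb = estimate_cost_kb(w, r)
    if act_kb > SRAM_LIMIT_KB or wt_kb > FLASH_LIMIT_KB:
        continue  # this candidate cannot fit the device, so prune it
    if best is None or proxy_accuracy(w, r) > proxy_accuracy(*best):
        best = (w, r)

print(f"selected width={best[0]}, input resolution={best[1]} for this device")
```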

To run that tiny neural network, a microcontroller also needs a lean inference engine. A typical inference engine carries some dead weight — instructions for tasks it may rarely run. The extra code poses no problem for a laptop or smartphone, but it could easily overwhelm a microcontroller. “It doesn’t have off-chip memory, and it doesn’t have a disk,” says Han. “Everything put together is just one megabyte of flash, so we have to really carefully manage such a small resource.” Cue TinyEngine.

The researchers developed their inference engine in conjunction with TinyNAS. TinyEngine generates the essential code necessary to run TinyNAS’ customized neural network. Any deadweight code is discarded, which cuts down on compile-time. “We keep only what we need,” says Han. “And since we designed the neural network, we know exactly what we need. That’s the advantage of system-algorithm codesign.” In the group’s tests of TinyEngine, the compiled binary code was between 1.9 and five times smaller than comparable microcontroller inference engines from Google and ARM. TinyEngine also contains innovations that reduce runtime, including in-place depth-wise convolution, which cuts peak memory usage nearly in half. After codesigning TinyNAS and TinyEngine, Han’s team put MCUNet to the test.
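
The in-place trick exploits the fact that a depth-wise layer filters each channel independently, so a channel’s output can safely overwrite its own input as soon as it is computed. The NumPy sketch below is a conceptual model of that memory saving, not TinyEngine’s actual (heavily optimized) kernel: peak memory becomes one feature map plus a small per-channel scratch buffer, rather than two full feature maps.

```python
# A conceptual NumPy sketch of in-place depth-wise convolution (stride 1,
# 'same' padding). Not TinyEngine's real kernel: it only shows why writing
# each channel's result back over its own input is safe, which avoids
# allocating a second full-size output feature map.
import numpy as np

def depthwise_conv_inplace(x: np.ndarray, kernels: np.ndarray) -> np.ndarray:
    """x: (H, W, C) feature map; kernels: (K, K, C), one filter per channel."""
    H, W, C = x.shape
    K = kernels.shape[0]
    pad = K // 2
    scratch = np.empty((H, W))  # single-channel scratch buffer
    for c in range(C):          # channels never read each other's data
        padded = np.pad(x[:, :, c], pad)  # small per-channel padded copy
        for i in range(H):
            for j in range(W):
                scratch[i, j] = np.sum(padded[i:i+K, j:j+K] * kernels[:, :, c])
        x[:, :, c] = scratch    # overwrite the input channel in place
    return x

x = np.random.rand(16, 16, 8).astype(np.float32)
k = np.random.rand(3, 3, 8).astype(np.float32)
y = depthwise_conv_inplace(x, k)  # no second 16x16x8 buffer was allocated
```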

MCUNet’s first challenge was image classification. The researchers used the ImageNet database to train the system with labeled images, then to test its ability to classify novel ones. On a commercial microcontroller they tested, MCUNet successfully classified 70.7 percent of the novel images — the previous state-of-the-art neural network and inference engine combo was just 54 percent accurate. “Even a 1 percent improvement is considered significant,” says Lin. “So this is a giant leap for microcontroller settings.”

The team found similar results in ImageNet tests of three other microcontrollers. And on both speed and accuracy, MCUNet beat the competition for audio and visual “wake-word” tasks, where a user initiates an interaction with a computer using vocal cues (think: “Hey, Siri”) or simply by entering a room. The experiments highlight MCUNet’s adaptability to numerous applications.

“Huge potential”

The promising test results give Han hope that MCUNet will become the new industry standard for microcontrollers. “It has huge potential,” he says.

The advance “extends the frontier of deep neural network design even farther into the computational domain of small energy-efficient microcontrollers,” says Kurt Keutzer, a computer scientist at the University of California at Berkeley, who was not involved in the work. He adds that MCUNet could “bring intelligent computer-vision capabilities to even the simplest kitchen appliances, or enable more intelligent motion sensors.”

MCUNet could also make IoT devices more secure. “A key advantage is preserving privacy,” says Han. “You don’t need to transmit the data to the cloud.”

Analyzing data locally reduces the risk of personal information being stolen — including personal health data. Han envisions smart watches with MCUNet that don’t just sense users’ heartbeat, blood pressure, and oxygen levels, but also analyze and help them understand that information. MCUNet could also bring deep learning to IoT devices in vehicles and rural areas with limited internet access.

Plus, MCUNet’s slim computing footprint translates into a slim carbon footprint. “Our big dream is for green AI,” says Han, adding that training a large neural network can burn carbon equivalent to the lifetime emissions of five cars. MCUNet on a microcontroller would require a small fraction of that energy. “Our end goal is to enable efficient, tiny AI with less computational resources, less human resources, and less data,” says Han.

This article is republished from MIT News by Daniel Ackerman.

