Mammoth AI Report Says Era Of Deep Learning May Fade, But That’s Unlikely

  • October 13, 2021
  • admin

Scholars see deep learning’s pitfalls and limitations; the computer industry just sees a huge opportunity in matrix multiplications.

The era of deep learning began in 2006, when Geoffrey Hinton, a professor at the University of Toronto and one of the founders of that particular approach to artificial intelligence, theorized that greatly improved results could be achieved by adding many more artificial neurons to a machine learning program. The “deep” in deep learning refers to the depth of a neural network: the number of layers of artificial neurons that data is passed through.
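To make “depth” concrete, here is a minimal sketch (an illustrative toy, not code from the report): each (weights, bias) pair in the list below is one layer of artificial neurons, and stacking more of them is exactly what makes the network “deep.”

```python
import numpy as np

def relu(x):
    # Element-wise nonlinearity applied between layers.
    return np.maximum(0.0, x)

def deep_forward(x, layers):
    # "Depth" is simply len(layers): each (weights, bias) pair is one
    # more layer of artificial neurons that the data passes through.
    for weights, bias in layers:
        x = relu(x @ weights + bias)
    return x

rng = np.random.default_rng(0)
sizes = [8, 16, 16, 16, 4]                        # a toy four-layer network
layers = [(0.1 * rng.standard_normal((m, n)), np.zeros(n))
          for m, n in zip(sizes[:-1], sizes[1:])]

print(deep_forward(rng.standard_normal(8), layers).shape)  # (4,)
```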


Hinton’s insight led to breakthroughs in the practical performance of AI programs on tests such as the ImageNet image recognition task. The subsequent fifteen years have been called the deep learning revolution.

A report put out last week by Stanford University, in conjunction with multiple institutions, argues that the dominance of the deep learning approach may fade in coming years, as it runs out of answers for tough questions of building AI.

“The recent dominance of deep learning may be coming to an end,” the report’s authors write. “To continue making progress, AI researchers will likely need to embrace both general- and special-purpose hand-coded methods, as well as ever faster processors and bigger data.”

The report, known formally as the One Hundred Year Study on Artificial Intelligence (AI100), is the second installment in what is planned to be a series of reports every five years on the state of the discipline. The report is put together by a collection of academics who make up a standing committee and organize workshops whose findings are summarized in the study.

The report’s prediction about deep learning may be premature, for the simple reason that unlike in past eras, when AI was on the outskirts of computer science, the mathematics that power deep learning have now become firmly embedded in the world of commercial computing.

Hundreds of billions of dollars in market value are now ascribed to deep learning fundamentals. Deep learning, unlike any AI approach before it, is now the establishment.


Decades ago, companies with ambitious computing quests went out of business for lack of money. Thinking Machines was the gem of the 1980s – 1990s artificial intelligence quest. It went bankrupt in 1994 after burning through $125 million in venture capital.

The idea of today’s startups going bankrupt seems a lot less likely, stuffed as they are with unprecedented amounts of cash. Cerebras Systems, Graphcore, and SambaNova Systems have collectively raised billions, and have access to much more money in both the debt and equity markets.

More important, the leader in AI chips, Nvidia, is a powerhouse that is worth $539 billion in market value and makes $10 billion annually selling chips for deep learning training and inference. This is a company with a lot of runway to build more and sell more and get even richer off the deep learning revolution.

Why is the business of deep learning so successful? Because deep learning, regardless of whether it actually leads to anything resembling intelligence, has created a paradigm for using faster and faster computing to automate a lot of computer programming. Hinton, along with his co-conspirators, has been honored for moving computer science forward, regardless of what AI scholars may think of their contribution to AI.

The authors of the AI100 report make the case that deep learning is running up against practical limits in its insatiable desire for data and compute power. The authors write,

But now, in the 2020s, these general methods are running into limits—available computation, model size, sustainability, availability of data, brittleness, and a lack of semantics—that are starting to drive researchers back into designing specialized components of their systems to try to work around them.

All that may well be true, but, again, it is a call to arms that the computer industry is happy to spend billions answering. The tasks of deep learning have become the target for the strongest computers. AI is no longer a special discipline; it is the heart of computer engineering.

It takes just fourteen seconds for one of the fastest computers on the planet, built by Google, to automatically be “trained” to solve ImageNet, according to MLPerf benchmark results published earlier this year. That is not a measure of thinking, per se; it is a measure of how fast a computer can transform input into output: pictures in, linear regression answer out.


All computers, ever since Alan Turing conceived of them, do one thing and one thing only: they transform a series of ones and zeros into a different series of ones and zeros. All deep learning is, is a way for the computer to come up with the rules of transformation automatically rather than have a person specify them.
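To make that distinction concrete, the sketch below (a hypothetical toy, not anything from the report) learns a transformation rule from examples instead of having a programmer write it down: gradient descent recovers the rule y = 2x + 1 from input/output pairs alone.

```python
import numpy as np

# The rule we want the machine to discover: y = 2*x + 1.
# A programmer could hand-code it; the learning approach instead
# infers the parameters from input/output examples alone.
x = np.linspace(-1.0, 1.0, 50)
y = 2.0 * x + 1.0

w, b = 0.0, 0.0          # start with no knowledge of the rule
lr = 0.1

for _ in range(500):
    error = (w * x + b) - y
    # Gradients of mean squared error with respect to w and b.
    w -= lr * 2.0 * np.mean(error * x)
    b -= lr * 2.0 * np.mean(error)

print(round(w, 3), round(b, 3))  # converges toward 2.0 and 1.0
```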

What Google and Nvidia are helping to build is simply the future of all computers. Every computer program can benefit from having some of its transformations automated, rather than being laboriously coded by a person.

The incredibly simple approach that underlies that automation, matrix multiplication, is sublime because it is such a basic mathematical operation. It’s an easy target for computer makers.

That means that every chip is becoming a deep learning chip, in the sense that every chip is now a matrix multiplication chip.
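The claim is easy to see at the level of a single layer: applying a layer to a whole batch of inputs is literally one matrix multiplication, so a chip that multiplies matrices faster runs the layer faster. A minimal sketch, with hypothetical sizes:

```python
import numpy as np

batch, in_features, out_features = 32, 784, 256

inputs = np.random.standard_normal((batch, in_features))
weights = np.random.standard_normal((in_features, out_features))

# One neural-network layer over the whole batch is a single matmul:
# (32 x 784) @ (784 x 256) -> (32 x 256), roughly 2*32*784*256 FLOPs.
activations = inputs @ weights
print(activations.shape)
```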

“Neural nets are the new apps,” Raja M. Koduri, senior vice president and general manager of Intel’s Accelerated Computing Systems and Graphics Group, recently told ZDNet. “What we see is that every socket, it’s not CPU, GPU, IPU, everything will have matrix acceleration,” said Koduri.

When you have a hammer, everything is a nail. And the computer industry has a very big hammer.

Cerebras’s WSE chip, the biggest semiconductor in the world, is a giant machine for doing one thing over and over, the matrix multiplications that power deep learning.

[Image: Cerebras WSE-2] The computer industry wants to feast on giant matrix multiplication engines, such as Cerebras’s WSE chip, the biggest chip in the world, dedicated to performing fast multiplications in parallel. (Image: Cerebras Systems)

The MLPerf benchmark has become the yardstick by which companies purchase computers, based on their speed at deep learning compute. The deep learning revolution has become the deep learning industry as it has established matrix math as the new measure of compute.
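MLPerf itself measures full training and inference workloads, but its spirit can be caricatured with a toy micro-benchmark (a hypothetical sketch, not the actual suite): time repeated matrix multiplications and report the sustained throughput.

```python
import time
import numpy as np

n, repeats = 2048, 10
a = np.random.standard_normal((n, n)).astype(np.float32)
b = np.random.standard_normal((n, n)).astype(np.float32)

start = time.perf_counter()
for _ in range(repeats):
    _ = a @ b                    # the operation chip makers race to accelerate
elapsed = time.perf_counter() - start

flops = 2 * n**3 * repeats       # multiply-adds in one n x n matmul, times repeats
print(f"{flops / elapsed / 1e9:.1f} GFLOP/s sustained")
```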

The scholars who assembled the AI100 report are making a point about research directions. Many scholars are concerned that deep learning has gotten no closer to the goal of understanding or achieving human-like intelligence, and doesn’t seem likely to do so any time soon.


Deep learning critics such as NYU psychologist Gary Marcus have organized whole seminars to explore a way to merge deep learning with other approaches, such as symbolic reasoning, to find a way past what seems the limited nature of deep learning’s monotonic approach.

The critique is elegantly encapsulated by one of the report’s study panel members, Melanie Mitchell of the Santa Fe Institute and Portland State University. Mitchell wrote in a paper this year, titled “Why AI is harder than we think,” that deep learning is running up against severe limitations despite the optimistic embrace of the approach. Mitchell cites as evidence the fact that much-ballyhooed goals such as the long-heralded age of self-driving cars have failed to materialize.

As Mitchell argues, quite astutely, deep learning barely knows how to talk about intelligence, much less replicate it:

It’s clear that to make and assess progress in AI more effectively, we will need to develop a better vocabulary for talking about what machines can do. And more generally, we will need a better scientific understanding of intelligence as it manifests in different systems in nature. This will require AI researchers to engage more deeply with other scientific disciplines that study intelligence.

All that is no doubt true, and yet the computer industry loves incrementalism. Sixty years of making integrated circuits double in speed, again and again, have gotten the computer world hooked on things that can be easily replicated. Deep learning, based on a sea of matrix multiplications, is, again, a sublime target: a terrifically simple task to run faster and faster.

As long as computer companies can keep churning out improvements in matrix acceleration, the deep learning industry, as the mainstream of compute, will have a staying power that has to be reckoned with.

This article originally appeared in ZDNet.

