Robot Cognition Requires Machines That Both Think And Feel

  • June 9, 2019

For more than two millennia, Western thinkers have separated emotion from cognition – emotion being the poorer sibling of the two. Cognition helps to explain the nature of space-time and sends humans to the Moon. Emotion might save the lioness in the savannah, but it also makes humans act irrationally with disconcerting frequency. 

From Ex Machina (2015). Photo courtesy United Pictures International

In the quest to create intelligent robots, designers tend to focus on purely rational, cognitive capacities. It’s tempting to disregard emotion entirely, or include only as much as necessary. But without emotion to help determine the personal significance of objects and actions, I doubt that true intelligence can exist – not the kind that beats human opponents at chess or the game of Go, but the sort of smarts that we humans recognise as such. Although we can refer to certain behaviours as either ‘emotional’ or ‘cognitive’, this is really a linguistic short-cut. The two can’t be teased apart.

What counts as sophisticated, intelligent behaviour in the first place? Consider a crew of robots on a mission to Mars. To act intelligently, the robots can’t just scuttle about taking pictures of the environment and collecting dirt and mineral samples. They’d need to be able to figure out how to reach a target destination, and come up with alternative tactics if the most direct path is blocked. If pressed for time, the team of robots would have to know which materials are more important and should be prioritised as part of the expedition.

Part of being intelligent, then, is about the ability to function autonomously in various conditions and environments. Emotion is helpful here because it allows an agent to piece together the most significant kinds of information. For example, emotion can instil a sense of urgency in actions and decisions. Imagine crossing a patch of desert in an unreliable car, during the hottest hours of the day. If the vehicle breaks down, what you need is a quick fix to get you to the next town, not a more permanent solution that might be perfect but could take many hours to complete in the beating sun. In real-world scenarios, a ‘good’ outcome is often all that’s required, but without the external pressure of perceiving a ‘stressful’ situation, an android might take too long trying to find the optimal solution.
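
To make the idea concrete, here is a minimal, hypothetical sketch (not from the article) of satisficing under time pressure: an urgency signal both shortens the planning budget and lowers the bar for what counts as an acceptable route, so a stressed agent settles for a good-enough plan instead of searching for the optimum. The names `plan_route`, `evaluate` and `propose` are invented for illustration.

```python
import random
import time

def plan_route(evaluate, propose, urgency, budget_s=5.0):
    """Anytime, satisficing search.

    evaluate(route) -> score in [0, 1]; propose() -> a candidate route.
    urgency in [0, 1]: 0 = no time pressure, 1 = extreme time pressure.
    """
    # A calm agent holds out for near-optimal plans; a stressed one settles sooner.
    acceptance_threshold = 0.95 - 0.5 * urgency
    # Higher urgency also shrinks the time available for deliberation.
    deadline = time.monotonic() + budget_s * (1.0 - 0.8 * urgency)

    best_route, best_score = None, float("-inf")
    while True:
        route = propose()
        score = evaluate(route)
        if score > best_score:
            best_route, best_score = route, score
        # Stop as soon as a plan is good enough for the current pressure,
        # or when the (urgency-shortened) deliberation budget runs out.
        if best_score >= acceptance_threshold or time.monotonic() >= deadline:
            break
    return best_route, best_score

# Toy usage: random candidate "routes" scored by a noisy quality function.
if __name__ == "__main__":
    propose = lambda: [random.random() for _ in range(5)]
    evaluate = lambda r: 1.0 - abs(sum(r) / len(r) - 0.5)
    print("calm:", plan_route(evaluate, propose, urgency=0.1)[1])
    print("stressed:", plan_route(evaluate, propose, urgency=0.9)[1])
```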


Most proposals for emotion in robots involve the addition of a separate ‘emotion module’ – some sort of bolted-on affective architecture that can influence other abilities such as perception and cognition. The idea would be to give the agent access to an enriched set of properties, such as the urgency of an action or the meaning of facial expressions. These properties could help to determine issues such as which visual objects should be processed first, what memories should be recollected, and which decisions will lead to better outcomes.
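
As a rough illustration of that modular design (my own sketch, not a description of any actual system), the code below bolts a separate affective component onto a perception pipeline: the module tags percepts with learned emotional significance, and those tags decide which objects get processed first. All class and variable names are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Percept:
    label: str
    salience: float               # bottom-up visual salience, 0..1
    affective_value: float = 0.0  # filled in by the emotion module, 0..1

class EmotionModule:
    """A bolted-on affective component: it appraises incoming percepts with
    learned emotional significance and feeds that back to perception."""
    def __init__(self, learned_significance):
        self.learned_significance = learned_significance  # label -> value

    def appraise(self, percept: Percept) -> Percept:
        percept.affective_value = self.learned_significance.get(percept.label, 0.0)
        return percept

def processing_order(percepts, emotion: EmotionModule):
    # Perception consults the emotion module; emotionally significant objects
    # jump the queue even when they are not especially salient visually.
    tagged = [emotion.appraise(p) for p in percepts]
    return sorted(tagged, key=lambda p: p.salience + p.affective_value, reverse=True)

scene = [Percept("rock", 0.7), Percept("dust devil", 0.4), Percept("crevasse", 0.3)]
emotion = EmotionModule({"crevasse": 0.9})  # hazards carry strong significance
for p in processing_order(scene, emotion):
    print(p.label)  # the crevasse is processed first despite low salience
```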

But research from the behavioural and brain sciences suggests that emotion is not just an ‘added feature’ layered on top of ‘standard’ cognition. Instead, it’s an integral part of our cognitive machinery. In one of the experiments from my lab, people in a scanner watched videos of rapid, flashed-up images of either a house or a skyscraper. They had to identify which of these scenes was present in the video, a task designed to be very difficult. We then introduced an element of emotional manipulation. Before viewing the clips, half of the participants received a mild electric shock while viewing a series of skyscrapers; the other half received the same mild shock while viewing a series of houses. This is what’s known as classical conditioning: it links an initially neutral stimulus (a nondescript picture) with the emotional meaning of an unpleasant stimulus (the shock).

The outcome: participants conditioned to the skyscrapers were better at detecting them than at detecting houses; conversely, participants conditioned to houses detected those better than they did skyscrapers. And in each case, responses in the visual cortex were stronger for the type of stimulus (house or skyscraper) to which participants had been conditioned. This study shows that perception is not a passive process that merely reflects the external world. Rather, it involves picking up on the significance of objects, and that significance shapes how they are processed. Vision is never neutral; it is always laden with affective meaning.


It’s the architecture of the brain, with its short- and long-range connections, that allows these properties to emerge. Emotion doesn’t come about from local computations in a single region, such as the amygdala, which is frequently called the ‘emotion centre’ in the brain. Instead, anatomical studies have revealed that the areas in the brain associated with perception, cognition, emotion, motivation, action and bodily sensations are closely intertwined. Looking at the brain as a complex network helps to clarify why some brain structures, such as the amygdala, are important for emotion: they’re hubs, much like major airports that link to a very large number of destinations. As a consequence, these regions can influence and be influenced by many parts of the brain – which also suggests that it’s not possible to subtract emotion without affecting cognition.
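
The airport analogy can be made concrete with a toy graph (illustrative only; the nodes and edges are invented, not anatomical data). In the sketch below, a single hub node scores highest on standard centrality measures, and deleting it fragments the network, which is one way to see why ‘subtracting’ such a region would ripple through the rest of the system.

```python
import networkx as nx

# A toy network with one hub: the 'amygdala' stand-in links to every region,
# while only a couple of direct region-to-region connections exist.
G = nx.Graph()
hub = "amygdala"
regions = ["visual", "auditory", "prefrontal", "motor", "hippocampus", "insula"]
G.add_edges_from((hub, r) for r in regions)
G.add_edges_from([("visual", "prefrontal"), ("motor", "prefrontal")])

print(nx.degree_centrality(G))             # the hub has the highest degree
print(nx.betweenness_centrality(G)[hub])   # most shortest paths pass through it

G.remove_node(hub)                          # 'subtract' the hub...
print(nx.number_connected_components(G))   # ...and the network breaks into pieces
```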

The point, then, is not simply that emotion is needed for intelligent, autonomous robots (it is), but that emotion needs to be hooked up to everything that goes on in a cognitive system. Emotion is not an ‘add-on’ module that endows a robot with feelings or allows it to express an internal state, such as the current risk of overheating. Its integration is a design principle of the information-processing architecture. Without emotion, no being that we might create can have any hope of aspiring to true intelligence.

 

Luiz Pessoa

This article was originally published at Aeon and has been republished under Creative Commons.
