Emotion Recognition

 

Introduction

The Emotion Recognition (ER) Engine recognizes emotions from the user's facial expressions, voice, and typed input.

By recognizing human emotions, the ER Engine enables natural verbal and nonverbal communication between the user and the machine.

 

Process of ER Engine

 

ER Engine by ThinQ.AI supports the following features:

  • Various input data: Uses audio, image, and text as input data.
  • Various emotion recognition: Determines seven emotions (anger, disgust, fear, happiness, sadness, surprise, and neutral).
  • Perception: Recognizes the user's emotions in two ways:
      • Unimodal: uses a single modality
      • Multimodal: uses multiple modalities simultaneously
  • Server communication: Connects to the server over HTTP/2 based on Transport Layer Security (TLS) for enhanced security.
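The unimodal and multimodal perception modes above can be sketched in a few lines. The seven emotion labels come from the feature list; the fusion-by-averaging strategy (late fusion of per-modality scores) is an illustrative assumption, not the engine's documented algorithm.

```python
# Sketch of unimodal vs. multimodal perception over the seven emotions.
# Score values and the averaging fusion rule are illustrative assumptions.

EMOTIONS = ["anger", "disgust", "fear", "happiness",
            "sadness", "surprise", "neutral"]

def unimodal_predict(scores: dict) -> str:
    """Pick the top emotion from a single modality's scores."""
    return max(scores, key=scores.get)

def multimodal_predict(*modality_scores: dict) -> str:
    """Late fusion: average each emotion's score across all modalities."""
    fused = {e: sum(s.get(e, 0.0) for s in modality_scores) / len(modality_scores)
             for e in EMOTIONS}
    return max(fused, key=fused.get)

# Example: audio alone leans "neutral", but image and text agree on "happiness".
audio = {"happiness": 0.3, "neutral": 0.5, "sadness": 0.2}
image = {"happiness": 0.6, "neutral": 0.3, "surprise": 0.1}
text  = {"happiness": 0.5, "neutral": 0.4, "anger": 0.1}

print(unimodal_predict(audio))                 # neutral
print(multimodal_predict(audio, image, text))  # happiness
```

Combining modalities can correct a single modality's mistake, which is the motivation for multimodal perception.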

 

Engine Structure

All functions of the ER Engine run on a server. The ER Engine takes text, visual, and audio data as input and generates appropriate responses.
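Because recognition runs server-side, a client must bundle its modalities into a request and parse the server's answer. The following sketch shows one plausible shape for that exchange; the field names and response format are assumptions for illustration (the actual transport is HTTP/2 over TLS, per the feature list).

```python
import base64
import json

# Hypothetical request/response payloads for the server-side ER Engine.
# Field names ("text", "audio", "image", "emotion") are assumptions,
# not the engine's documented API.

def build_request(text: str, audio_bytes: bytes, image_bytes: bytes) -> str:
    """Bundle the three supported modalities into one JSON request body.
    Binary audio/image data is base64-encoded for transport."""
    return json.dumps({
        "text": text,
        "audio": base64.b64encode(audio_bytes).decode("ascii"),
        "image": base64.b64encode(image_bytes).decode("ascii"),
    })

def parse_response(body: str) -> str:
    """Extract the recognized emotion label from a JSON response body."""
    return json.loads(body)["emotion"]

request_body = build_request("I love this!", b"\x00\x01", b"\xff\xd8")
print(parse_response('{"emotion": "happiness", "confidence": 0.92}'))  # happiness
```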

 

ER Engine Architecture

 

Examples of Use

The ER Engine is widely used in everyday life.

 

  • Friendship with AI robots

Allows humans and robots to interact and communicate with each other emotionally.

 

AI robots act like friends

  • Chatbot service that recognizes users' emotions

Allows customer issues to be escalated quickly through real-time monitoring of the chatbot service. By recognizing customer emotions and complaints, businesses can take action so that human agents, rather than the chatbot, respond.
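The escalation described above can be reduced to a simple routing rule: hand the conversation to a human agent when a negative emotion is detected. The specific set of trigger emotions below is an assumption for illustration.

```python
# Hypothetical escalation rule for an emotion-aware chatbot.
# Which emotions count as "negative" enough to escalate is an assumption.

NEGATIVE_EMOTIONS = {"anger", "disgust", "fear", "sadness"}

def route(emotion: str) -> str:
    """Decide who handles the next turn of the conversation."""
    return "human_agent" if emotion in NEGATIVE_EMOTIONS else "chatbot"

print(route("anger"))      # human_agent
print(route("happiness"))  # chatbot
```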

Chatbot service that recognizes users' emotions