Controlling Embedded Systems Remotely via Internet-of-Things Based on Emotional Recognition

Nowadays, much research attention is focused on human–computer interaction (HCI), specifically on biosignals, which have recently been used for remote control, offering benefits especially for disabled people or for protecting against contagions such as coronavirus. In this paper, a biosi...


Bibliographic Details
Main Authors: Mohammad J. M. Zedan, Ali I. Abduljabbar, Fahad Layth Malallah, Mustafa Ghanem Saeed
Format: Article
Language:English
Published: Wiley 2020-01-01
Series:Advances in Human-Computer Interaction
Online Access:http://dx.doi.org/10.1155/2020/8895176
_version_ 1832547713282998272
author Mohammad J. M. Zedan
Ali I. Abduljabbar
Fahad Layth Malallah
Mustafa Ghanem Saeed
author_facet Mohammad J. M. Zedan
Ali I. Abduljabbar
Fahad Layth Malallah
Mustafa Ghanem Saeed
author_sort Mohammad J. M. Zedan
collection DOAJ
description Nowadays, much research attention is focused on human–computer interaction (HCI), specifically on biosignals, which have recently been used for remote control, offering benefits especially for disabled people or for protecting against contagions such as coronavirus. In this paper, a biosignal type, namely, the facial emotional signal, is proposed for controlling electronic devices remotely via emotional vision recognition. The objective is to convert only two facial emotions, a smiling or non-smiling vision signal captured by the camera, into a remote-control signal. The methodology combines the machine learning (for smiling recognition) and embedded systems (for remote-control IoT) fields. For smiling recognition, the GENKI-4K database is exploited to train a model built in the following sequence of steps: real-time video, snapshot image, preprocessing, face detection, feature extraction using HOG, and finally SVM for classification. The achieved recognition rate is up to 89% for training and testing with 10-fold validation of the SVM. For IoT, Arduino and MCU (Tx and Rx) nodes are exploited to transfer the resulting biosignal remotely as a server and client via the HTTP protocol. Promising experimental results are achieved by conducting experiments on 40 individuals who participated in controlling several devices with their emotional biosignals, such as closing and opening a door and turning an alarm on or off over Wi-Fi. The system implementing this research is developed in MATLAB; it connects a webcam to an Arduino and an MCU node as an embedded system.
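The recognition pipeline described above (face image, HOG features, SVM, 10-fold validation) can be sketched as follows. This is an illustrative re-sketch in Python, not the authors' MATLAB system: `simple_hog` is a minimal, hypothetical HOG descriptor, and the striped synthetic images stand in for GENKI-4K face crops.

```python
# Minimal sketch of the HOG -> SVM smile-recognition pipeline (assumptions:
# synthetic images replace GENKI-4K; simple_hog is a toy HOG, no block norm).
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

def simple_hog(img, cell=8, bins=9):
    """One unsigned gradient-orientation histogram per cell, concatenated."""
    gy, gx = np.gradient(img)
    mag = np.hypot(gx, gy)
    ang = np.mod(np.arctan2(gy, gx), np.pi)  # fold orientation into [0, pi)
    feats = []
    for i in range(0, img.shape[0], cell):
        for j in range(0, img.shape[1], cell):
            hist, _ = np.histogram(ang[i:i + cell, j:j + cell],
                                   bins=bins, range=(0.0, np.pi),
                                   weights=mag[i:i + cell, j:j + cell])
            feats.append(hist)
    v = np.concatenate(feats)
    return v / (np.linalg.norm(v) + 1e-9)

def sample(vertical, rng):
    """Synthetic 64x64 'face crop': stripe direction stands in for the class."""
    img = np.zeros((64, 64))
    if vertical:
        img[:, ::4] = 1.0
    else:
        img[::4, :] = 1.0
    return img + rng.normal(0.0, 0.05, img.shape)

rng = np.random.default_rng(0)
X = np.array([simple_hog(sample(v, rng)) for v in [True] * 20 + [False] * 20])
y = np.array([1] * 20 + [0] * 20)  # 1 = smiling, 0 = non-smiling

# 10-fold cross-validation of a linear SVM, matching the paper's protocol
scores = cross_val_score(SVC(kernel="linear"), X, y, cv=10)
print(f"mean 10-fold accuracy: {scores.mean():.2f}")
```

On real face crops the preprocessing and face-detection steps listed in the abstract would precede feature extraction; here they are omitted for brevity.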
format Article
id doaj-art-7d6c6aaa7db140529fa2661a7842d4cb
institution Kabale University
issn 1687-5893
1687-5907
language English
publishDate 2020-01-01
publisher Wiley
record_format Article
series Advances in Human-Computer Interaction
spelling doaj-art-7d6c6aaa7db140529fa2661a7842d4cb 2025-02-03T06:43:37Z eng Wiley
Advances in Human-Computer Interaction 1687-5893 1687-5907 2020-01-01 2020 10.1155/2020/8895176 8895176
Controlling Embedded Systems Remotely via Internet-of-Things Based on Emotional Recognition
Mohammad J. M. Zedan (Computer and Information Department, Electronics Engineering College, Ninevah University, Mosul 41002, Iraq)
Ali I. Abduljabbar (Computer and Information Department, Electronics Engineering College, Ninevah University, Mosul 41002, Iraq)
Fahad Layth Malallah (Computer and Information Department, Electronics Engineering College, Ninevah University, Mosul 41002, Iraq)
Mustafa Ghanem Saeed (Department of Computer Science, Cihan University Sulaimaniya, Sulaimaniya 46001, Iraq)
(Abstract as given in the description field.)
http://dx.doi.org/10.1155/2020/8895176
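The server/client HTTP exchange between the Tx and Rx nodes described in the abstract can be sketched as below. This is a hedged, single-process stand-in in Python, not the authors' Arduino/MCU firmware: the `/emotion` endpoint name, the port, and the 0/1 payload are illustrative assumptions.

```python
# Sketch of the Tx (server) / Rx (client) HTTP exchange; both sides run in
# one process here, whereas the paper uses two Wi-Fi-connected nodes.
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

SMILE_DETECTED = 1  # biosignal bit produced by the recognition stage

class EmotionHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Tx side: publish the latest smile (1) / non-smile (0) bit
        body = str(SMILE_DETECTED).encode()
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # silence per-request logging

server = HTTPServer(("127.0.0.1", 8080), EmotionHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# Rx side: poll the server; a real node would drive a relay (door, alarm)
with urllib.request.urlopen("http://127.0.0.1:8080/emotion") as resp:
    signal = int(resp.read())
print("actuate" if signal == 1 else "idle")
server.shutdown()
```

In the deployed system the client's decision would toggle a GPIO pin on the embedded node rather than print a string.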
spellingShingle Mohammad J. M. Zedan
Ali I. Abduljabbar
Fahad Layth Malallah
Mustafa Ghanem Saeed
Controlling Embedded Systems Remotely via Internet-of-Things Based on Emotional Recognition
Advances in Human-Computer Interaction
title Controlling Embedded Systems Remotely via Internet-of-Things Based on Emotional Recognition
title_full Controlling Embedded Systems Remotely via Internet-of-Things Based on Emotional Recognition
title_fullStr Controlling Embedded Systems Remotely via Internet-of-Things Based on Emotional Recognition
title_full_unstemmed Controlling Embedded Systems Remotely via Internet-of-Things Based on Emotional Recognition
title_short Controlling Embedded Systems Remotely via Internet-of-Things Based on Emotional Recognition
title_sort controlling embedded systems remotely via internet of things based on emotional recognition
url http://dx.doi.org/10.1155/2020/8895176
work_keys_str_mv AT mohammadjmzedan controllingembeddedsystemsremotelyviainternetofthingsbasedonemotionalrecognition
AT aliiabduljabbar controllingembeddedsystemsremotelyviainternetofthingsbasedonemotionalrecognition
AT fahadlaythmalallah controllingembeddedsystemsremotelyviainternetofthingsbasedonemotionalrecognition
AT mustafaghanemsaeed controllingembeddedsystemsremotelyviainternetofthingsbasedonemotionalrecognition