A Brain-Computer Interface for Teleoperation of a Semiautonomous Mobile Robotic Assistive System Using SLAM

Bibliographic Details
Main Authors: Vidya Nandikolla, Bryan Ghoslin, Kevin Matsuno, Daniel A. Medina Portilla
Format: Article
Language: English
Published: Wiley 2022-01-01
Series: Journal of Robotics
Online Access: http://dx.doi.org/10.1155/2022/6178917
author Vidya Nandikolla
Bryan Ghoslin
Kevin Matsuno
Daniel A. Medina Portilla
collection DOAJ
description The proposed assistive hybrid brain-computer interface (BCI) semiautonomous mobile robotic arm demonstrates a design that is (1) adaptable, observing environmental changes with sensors and deploying alternate solutions, and (2) versatile, receiving commands from the user’s brainwave signals through a noninvasive electroencephalogram cap. Composed of three integrated subsystems, a hybrid BCI controller, an omnidirectional mobile base, and a robotic arm, the proposed robot has commands mapped to the user’s brainwaves related to a set of specific physical or mental tasks. The sensors and camera systems enable both the mobile base and the arm to operate semiautonomously. The mobile base’s SLAM algorithm provides obstacle avoidance and path planning to help the robot maneuver safely. The robot arm calculates and executes the joint movements needed to pick up or drop off an object selected by the user via a brainwave-controlled cursor on a camera feed. Validation, testing, and implementation of the subsystems were conducted in Gazebo. Communication between the BCI controller and each subsystem was tested independently: a loop of prerecorded brainwave data for each specific task was used to verify that the mobile base commands were executed, and the same prerecorded file was used to move the robot arm cursor and initiate a pick-up or drop-off action. A final system test was conducted in which the BCI controller input moved the cursor and selected a goal point. Successful virtual demonstrations of the assistive robotic arm show the feasibility of restoring movement capability and autonomy for a disabled user.
format Article
id doaj-art-38b7d6ccb3c44b398c8ea915fd2b9e3e
institution Kabale University
issn 1687-9619
language English
publishDate 2022-01-01
publisher Wiley
record_format Article
series Journal of Robotics
affiliation Department of Mechanical Engineering (all four authors)
title A Brain-Computer Interface for Teleoperation of a Semiautonomous Mobile Robotic Assistive System Using SLAM
url http://dx.doi.org/10.1155/2022/6178917
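
The abstract describes commands mapped to specific mental or physical tasks, with subsystem tests driven by a looped file of prerecorded brainwave data that exercises the mobile base and the arm's cursor and pick/drop actions. As a rough illustration of that command-mapping loop, the sketch below replays hypothetical task labels and routes them to omnidirectional base velocity commands or arm cursor actions. The task names, command fields, and print-based dispatch are assumptions for illustration, not details taken from the article; a real system would forward these commands to the robot middleware driving the Gazebo simulation.

```python
# Minimal sketch (not the authors' implementation) of the command-mapping
# loop described in the abstract: prerecorded, already-classified brainwave
# "tasks" are replayed and mapped to omnidirectional-base velocity commands
# or arm cursor / pick-and-drop actions. All labels and fields are hypothetical.

from dataclasses import dataclass
import time


@dataclass
class BaseCommand:
    vx: float = 0.0  # forward/backward velocity (m/s)
    vy: float = 0.0  # lateral velocity (omnidirectional base)
    wz: float = 0.0  # yaw rate (rad/s)


# Hypothetical mapping from classified tasks to mobile-base motion.
TASK_TO_BASE = {
    "imagine_both_feet":  BaseCommand(vx=0.2),
    "imagine_left_hand":  BaseCommand(vy=0.2),
    "imagine_right_hand": BaseCommand(vy=-0.2),
    "rest":               BaseCommand(),  # stop
}

# Hypothetical mapping from other tasks to arm cursor actions.
TASK_TO_ARM = {
    "jaw_clench": "move_cursor",
    "eye_blink":  "confirm_pick_or_drop",
}


def dispatch(task: str) -> None:
    """Route one classified task label to the base or the arm subsystem."""
    if task in TASK_TO_BASE:
        cmd = TASK_TO_BASE[task]
        print(f"[base] vx={cmd.vx:+.2f} vy={cmd.vy:+.2f} wz={cmd.wz:+.2f}")
    elif task in TASK_TO_ARM:
        print(f"[arm]  {TASK_TO_ARM[task]}")
    else:
        print(f"[warn] unmapped task: {task}")


if __name__ == "__main__":
    # Stand-in for the looped file of prerecorded, classified brainwave data
    # used to drive the subsystem tests described in the abstract.
    prerecorded = ["rest", "imagine_both_feet", "imagine_left_hand",
                   "jaw_clench", "eye_blink"]
    for task in prerecorded:
        dispatch(task)
        time.sleep(0.5)
```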