A generic noninvasive neuromotor interface for human-computer interaction

Sussillo, David, Kaifosh, Patrick, Reardon, Thomas (February 2024) A generic noninvasive neuromotor interface for human-computer interaction. bioRxiv. (Submitted)

PDF: s42003-024-06087-8.pdf (Submitted Version)
Available under a Creative Commons Attribution-NonCommercial-NoDerivatives license.

Download (5MB)
DOI: 10.1101/2024.02.23.581779

Abstract

Since the advent of computing, humans have sought computer input technologies that are expressive, intuitive, and universal. While diverse modalities have been developed, including keyboards, mice, and touchscreens, they require interaction with an intermediary device that can be limiting, especially in mobile scenarios. Gesture-based systems use cameras or inertial sensors to avoid an intermediary device, but they tend to perform well only for unobscured or overt movements. Brain-computer interfaces (BCIs) have been imagined for decades as a solution to the interface problem, allowing input to computers via thought alone. However, high-bandwidth communication has been demonstrated only with invasive BCIs whose decoders are designed for single individuals, an approach that cannot scale to the general public. In contrast, neuromotor signals found at the muscle offer access to subtle gestures and force information. Here we describe the development of a noninvasive neuromotor interface that allows computer input using surface electromyography (sEMG). We developed a highly sensitive and robust hardware platform, easily donned and doffed, that senses myoelectric activity at the wrist and transforms intentional neuromotor commands into computer input. We paired this device with an infrastructure optimized to collect training data from thousands of consenting participants, which allowed us to develop generic sEMG neural-network decoding models that work across many people without per-person calibration. Test users not included in the training set achieve closed-loop median performance of 0.5 target acquisitions per second in a continuous navigation task, 0.9 gesture detections per second in a discrete-gesture task, and handwriting at 17.0 adjusted words per minute. We demonstrate that input bandwidth can be further improved by up to 30% by personalizing sEMG decoding models to the individual, anticipating a future in which humans and machines co-adapt to provide seamless translation of human intent. To our knowledge, this is the first high-bandwidth neuromotor interface that directly leverages biosignals with performant out-of-the-box generalization across people.
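The abstract describes generic neural-network decoders that map multi-channel wrist sEMG to discrete gestures without per-person calibration. As a purely illustrative sketch of that decoding pipeline (the paper's actual architecture is not given here; the channel count, window length, gesture set, and layer sizes below are invented for demonstration), a minimal window-to-gesture classifier might look like:

```python
import numpy as np

rng = np.random.default_rng(0)

N_CHANNELS = 16    # assumed wrist-band electrode count (illustrative)
WINDOW = 200       # sEMG samples per decoding window (illustrative)
GESTURES = ["pinch", "swipe", "tap", "rest"]  # hypothetical gesture set


class TinySEMGDecoder:
    """Toy feed-forward decoder: per-channel RMS features -> softmax over gestures."""

    def __init__(self, n_channels, n_gestures, hidden=32):
        # Random weights stand in for parameters that would be trained
        # on sEMG recordings from many participants.
        self.W1 = rng.normal(0.0, 0.1, (n_channels, hidden))
        self.b1 = np.zeros(hidden)
        self.W2 = rng.normal(0.0, 0.1, (hidden, n_gestures))
        self.b2 = np.zeros(n_gestures)

    def forward(self, emg_window):
        # emg_window: (WINDOW, N_CHANNELS) array of raw sEMG samples.
        feats = np.sqrt(np.mean(emg_window ** 2, axis=0))  # per-channel RMS
        h = np.maximum(0.0, feats @ self.W1 + self.b1)     # ReLU hidden layer
        logits = h @ self.W2 + self.b2
        exp = np.exp(logits - logits.max())                # stable softmax
        return exp / exp.sum()                             # gesture probabilities


decoder = TinySEMGDecoder(N_CHANNELS, len(GESTURES))
window = rng.normal(0.0, 1.0, (WINDOW, N_CHANNELS))        # simulated sEMG window
probs = decoder.forward(window)
print(GESTURES[int(np.argmax(probs))])
```

Because the weights here are untrained, the output is arbitrary; the point is only the data flow — a fixed-length sEMG window reduced to features and classified by a single shared ("generic") model, rather than one calibrated per user.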

Item Type: Paper
Subjects: bioinformatics
Communities: CSHL labs > Hou lab
SWORD Depositor: CSHL Elements
Depositing User: CSHL Elements
Date: 28 February 2024
Date Deposited: 11 Apr 2024 14:14
Last Modified: 11 Apr 2024 14:14
URI: https://repository.cshl.edu/id/eprint/41490
