
Brain-Computer Interface Control of a Motorized Wheelchair

Short video demonstrating my recent work on controlling a motorized wheelchair with a calibration-free, steady-state visual evoked potential (SSVEP) based BCI.
SSVEP BCI control of a motorized wheelchair for disabled individuals.

This video gives an overview of the ASPEN lab (Advanced Signal Processing in Engineering and Neuroscience) at Old Dominion University and demonstrates the steady-state visual evoked potential (SSVEP) based BCI system controlling a wheelchair. The SSVEP BCI uses several flickering stimuli, each modulating at a distinct, constant frequency. When a user visually fixates on a stimulus, oscillatory components at the stimulus frequency and its harmonics become entrained in the user's brain signals. The scalp-recorded electroencephalography (EEG) signals are decoded in real time and translated into movement commands that are sent to the wheelchair.
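The video does not name the decoding algorithm, but a common calibration-free approach to SSVEP decoding is canonical correlation analysis (CCA): the multichannel EEG is compared against sine/cosine templates at each candidate stimulus frequency (and its harmonics), and the best-matching frequency wins. The sketch below is a minimal illustration of that idea, not the system's actual implementation:

```python
import numpy as np

def cca_corr(X, Y):
    """Largest canonical correlation between two multichannel signals.

    X: (channels, samples) EEG segment; Y: (2*harmonics, samples) references.
    """
    # Center each channel, then use the QR-based formulation: the canonical
    # correlations are the singular values of Qx^T Qy.
    X = X - X.mean(axis=1, keepdims=True)
    Y = Y - Y.mean(axis=1, keepdims=True)
    Qx, _ = np.linalg.qr(X.T)
    Qy, _ = np.linalg.qr(Y.T)
    return np.linalg.svd(Qx.T @ Qy, compute_uv=False)[0]

def ssvep_references(freq, fs, n_samples, harmonics=2):
    """Sine/cosine templates at the stimulus frequency and its harmonics."""
    t = np.arange(n_samples) / fs
    rows = []
    for h in range(1, harmonics + 1):
        rows.append(np.sin(2 * np.pi * h * freq * t))
        rows.append(np.cos(2 * np.pi * h * freq * t))
    return np.array(rows)

def classify_ssvep(eeg, fs, stim_freqs):
    """Return the candidate frequency whose templates best match the EEG."""
    scores = [cca_corr(eeg, ssvep_references(f, fs, eeg.shape[1]))
              for f in stim_freqs]
    return stim_freqs[int(np.argmax(scores))]
```

Because the templates are pure sinusoids derived only from the known stimulus frequencies, this kind of decoder needs no user-specific training data, which is what makes a calibration-free system possible.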


The stimuli are displayed on an Android phone for direct BCI control (first half of the video) and on an LCD monitor for telepresence BCI control (second half). The four stimuli correspond to forward, backward, left, and right movement commands for the wheelchair.
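The description does not state the actual stimulus frequencies, but the translation from a decoded frequency to one of the four wheelchair commands could be sketched as below (the frequency values and tolerance here are hypothetical placeholders):

```python
# Hypothetical frequency-to-command mapping; the real stimulus
# frequencies are not stated in the video description.
COMMANDS = {8.0: "forward", 10.0: "backward", 12.0: "left", 15.0: "right"}

def to_command(decoded_freq, tolerance=0.25):
    """Translate a decoded SSVEP frequency into a wheelchair command."""
    for freq, command in COMMANDS.items():
        if abs(decoded_freq - freq) <= tolerance:
            return command
    return "stop"  # fail-safe: no confident match, halt the wheelchair
```

Falling back to "stop" when no frequency matches is a natural fail-safe design choice for a safety-critical output like a wheelchair.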


The system uses dry electrodes fixed to a custom headband, which reduces the obtrusiveness of the EEG setup and increases user comfort.


Additionally, the decoding algorithm is completely calibration-free: it does not require any training data from the user. Anyone can simply put on the EEG headband and start controlling the wheelchair.


All aspects of this system were designed and developed by me, in affiliation with Old Dominion University.

SSVEP BCI control of a prosthetic hand attached to a robotic manipulator arm.

Short video demonstrating my previous work on controlling a robotic manipulator arm to draw and write text using steady-state visual evoked potentials.
SSVEP BCI control of a prosthetic hand / robotic arm for augmented communication.

This is a Brain-Computer Interface (BCI) that uses Steady-State Visual Evoked Potentials (SSVEP) to drive a prosthetic hand attached to a robotic manipulator arm.

The stimulus display (the black box in front of the robot) contains nine LEDs, each flashing at a distinct, constant frequency. Each LED represents a movement direction for the prosthetic hand (e.g., left, right, up, down, and the diagonals).

When the user attends to a flashing LED, an SSVEP response at that frequency is elicited over the occipital regions of the brain. The user's EEG signals are then decoded, and the robotic arm moves in the corresponding direction.
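One simple way to illustrate this frequency-tagged decoding (not necessarily the method used here) is to compare spectral power at each LED's fundamental and harmonic frequencies in an occipital channel, a minimal sketch under those assumptions:

```python
import numpy as np

def band_power(signal, fs, freq, half_width=0.5):
    """Periodogram power in a narrow band around `freq` (Hz)."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    band = (freqs >= freq - half_width) & (freqs <= freq + half_width)
    return spectrum[band].sum()

def detect_led(occipital, fs, led_freqs, harmonics=2):
    """Index of the LED whose fundamental + harmonics carry the most power."""
    scores = [sum(band_power(occipital, fs, h * f)
                  for h in range(1, harmonics + 1))
              for f in led_freqs]
    return int(np.argmax(scores))
```

Summing power over harmonics as well as the fundamental exploits the fact that the SSVEP response contains harmonic components of the flicker frequency.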

In this particular application, the robotic arm is grasping a dry-erase pen, and the user is spelling the word "g.tec".

The bio-signal amplifier (shown on the right) is the g.USBamp amplifier from g.tec.
