Glovus 3

Glovus is a prototype of a new human-machine interface: an instrumented wireless glove that recognizes the movements of a human hand and uses them to control a computer.

Glovus started out as a school project by seven computer and electrical engineering students at the Université de Sherbrooke.

Demo

Source Code

The source code and hardware documentation are available on SVN. To get them, check out the code from the SVN repository with the following command:

svn co https://glovus3.svn.sourceforge.net/svnroot/glovus3 glovus3

The application has been tested under OS X and Windows. It should also work under Linux, although a little work might be needed to get it to build.

Also check out our SourceForge project page.

System overview

Hardware

The hardware schematics for the glove, the PCB routing, and the bill of materials will shortly be made available on SVN so that people can build their own glove. We use 4PCB as our PCB supplier. For about $200 you should be able to build your own glove.

Power supply

This is the part of the hardware responsible for supplying power to the rest of the glove. Power comes from a single-cell 3.7 V Li-Po battery, and the board also includes battery charging circuitry.

Sensors

We use a total of 10 sensors to recognize the movements of the human hand: a tri-axis accelerometer, a dual-axis gyroscope, a single-axis gyroscope, and 4 flex resistors for the fingers.
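
For illustration, one raw sample from the glove can be seen as a frame of those 10 values. The struct below is only a hypothetical sketch; the field names and types are not the glove's actual data format.

// Hypothetical representation of one raw sample from the glove's 10 sensor
// channels (3 accelerometer axes + 2+1 gyroscope axes + 4 flex resistors).
// Field names and types are illustrative only.
struct GloveSample {
    float accelX, accelY, accelZ;   // tri-axis accelerometer
    float gyroPitch, gyroRoll;      // dual-axis gyroscope
    float gyroYaw;                  // single-axis gyroscope
    float finger[4];                // flex resistors, one per instrumented finger
};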

PIC Microcontroller

The microcontroller on the glove is a PIC18F from Microchip. The source code of the PIC firmware will soon be made available on SVN.

Wireless link

The wireless link is implemented with Bluetooth, using the Serial Port Profile (SPP).
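
Since SPP exposes the glove as a virtual serial port, reading from it on a POSIX system (OS X, Linux) looks like opening any other serial device. The sketch below is an assumption about one way to do that; the device path and baud rate are placeholders, not the glove's actual settings.

#include <fcntl.h>
#include <termios.h>
#include <unistd.h>
#include <cstdio>

// Minimal sketch of opening the glove's SPP virtual serial port on a POSIX
// system. The device path and baud rate are hypothetical.
int open_glove_port(const char *device = "/dev/tty.Glovus-SPP") {
    int fd = open(device, O_RDONLY | O_NOCTTY);
    if (fd < 0) {
        perror("open");
        return -1;
    }
    termios tio{};
    tcgetattr(fd, &tio);
    cfmakeraw(&tio);                 // raw byte stream, no line processing
    cfsetispeed(&tio, B115200);      // assumed baud rate
    cfsetospeed(&tio, B115200);
    tcsetattr(fd, TCSANOW, &tio);
    return fd;
}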

Data acquisition interface

Hardware Layer

This is the module that reads data from the Bluetooth glove.

Data acquisition system

This is the layer that reads data from the glove. We used the bridge design pattern to decouple the type of data source from the signal processing code, which makes the signal processing code unit-testable with a simulated source.
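
The sketch below shows what such a decoupling can look like; the class and method names are illustrative only, not the project's actual interfaces.

#include <cstddef>
#include <utility>
#include <vector>

// Abstraction side of the bridge: the signal processing code only sees this
// interface, never the concrete Bluetooth code.
class IGloveDataSource {
public:
    virtual ~IGloveDataSource() = default;
    // Reads one frame of 10 sensor values; returns false when no data is left.
    virtual bool readFrame(std::vector<float> &frame) = 0;
};

// Implementation used with real hardware: wraps the Bluetooth SPP serial link.
class BluetoothGloveSource : public IGloveDataSource {
public:
    bool readFrame(std::vector<float> &frame) override {
        // ... read and decode bytes from the serial port ...
        (void)frame;
        return true;
    }
};

// Implementation used in unit tests: replays pre-recorded frames.
class RecordedGloveSource : public IGloveDataSource {
public:
    explicit RecordedGloveSource(std::vector<std::vector<float>> frames)
        : frames_(std::move(frames)) {}
    bool readFrame(std::vector<float> &frame) override {
        if (next_ >= frames_.size()) return false;
        frame = frames_[next_++];
        return true;
    }
private:
    std::vector<std::vector<float>> frames_;
    std::size_t next_ = 0;
};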

Move definition interface

This is the user interface that allows the user to define moves and train the system to recognize them.

Move matching system

Kalman Filters

We implemented a Kalman filter to compensate for the limited quality of the sensors used. Using the Kalman filters, we obtain non-drifting pitch and roll estimates from the glove.
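
The sketch below shows a single-axis Kalman filter of the kind commonly used to fuse a gyroscope rate with an accelerometer-derived angle; one instance would be run for pitch and one for roll. The noise constants are illustrative guesses, not values tuned for the Glovus hardware.

// Minimal single-axis Kalman filter fusing a gyroscope rate with an
// accelerometer-derived angle. The state is [angle, gyro bias]; the noise
// constants are assumptions, not the project's tuned values.
class AngleKalmanFilter {
public:
    // accelAngle: angle from the accelerometer (degrees),
    // rate: gyro reading (deg/s), dt: time step (seconds).
    float update(float accelAngle, float rate, float dt) {
        // Predict: integrate the bias-corrected gyro rate.
        angle_ += dt * (rate - bias_);
        P_[0][0] += dt * (dt * P_[1][1] - P_[0][1] - P_[1][0] + qAngle_);
        P_[0][1] -= dt * P_[1][1];
        P_[1][0] -= dt * P_[1][1];
        P_[1][1] += qBias_ * dt;

        // Update: correct with the drift-free but noisy accelerometer angle.
        float S = P_[0][0] + rMeasure_;
        float K0 = P_[0][0] / S;
        float K1 = P_[1][0] / S;
        float y = accelAngle - angle_;
        angle_ += K0 * y;
        bias_  += K1 * y;

        float P00 = P_[0][0], P01 = P_[0][1];
        P_[0][0] -= K0 * P00;
        P_[0][1] -= K0 * P01;
        P_[1][0] -= K1 * P00;
        P_[1][1] -= K1 * P01;
        return angle_;
    }

private:
    float angle_ = 0.0f, bias_ = 0.0f;
    float P_[2][2] = {{0.0f, 0.0f}, {0.0f, 0.0f}};
    float qAngle_ = 0.001f, qBias_ = 0.003f, rMeasure_ = 0.03f;  // assumed tuning
};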

Pre-neural-network filters

This module prepares the signal before it is processed by the neural network. We downsample the signal and extract some features manually.
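
As an illustration, downsampling a recorded sensor channel to a fixed number of points could look like the sketch below; the output length is a placeholder, and the manually extracted features used by the project are not shown here.

#include <algorithm>
#include <cstddef>
#include <vector>

// Hypothetical pre-processing step: downsample one sensor channel by averaging
// blocks of samples, so every recorded move is reduced to a fixed number of
// points before being fed to the neural network.
std::vector<float> downsample(const std::vector<float> &signal, std::size_t outSize) {
    std::vector<float> out(outSize, 0.0f);
    if (signal.empty() || outSize == 0) return out;
    for (std::size_t i = 0; i < outSize; ++i) {
        std::size_t begin = i * signal.size() / outSize;
        std::size_t end = std::max(begin + 1, (i + 1) * signal.size() / outSize);
        float sum = 0.0f;
        for (std::size_t j = begin; j < end; ++j) sum += signal[j];
        out[i] = sum / static_cast<float>(end - begin);   // block average
    }
    return out;
}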

Artificial neural network

We use an artificial neural network to do the move recognition, built with the FANN (Fast Artificial Neural Network) library and the Cascade-Correlation topology.
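
A minimal cascade-correlation training run with FANN could look like the sketch below; the training-data file name, network sizes, and stopping criteria are placeholders, not the project's actual values.

#include <fann.h>

// Sketch of cascade-correlation training with FANN. Cascade training starts
// from a "shortcut" network with only inputs and outputs; hidden neurons are
// added one by one during training.
int main() {
    const unsigned int num_inputs = 10;   // assumed: one value per sensor channel
    const unsigned int num_outputs = 4;   // assumed: one output per trained move
    struct fann *ann = fann_create_shortcut(2, num_inputs, num_outputs);

    struct fann_train_data *data = fann_read_train_from_file("moves.data");
    fann_cascadetrain_on_data(ann, data,
                              30,       // max hidden neurons to add
                              1,        // report after every added neuron
                              0.001f);  // desired error

    fann_save(ann, "moves.net");
    fann_destroy_train(data);
    fann_destroy(ann);
    return 0;
}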

System control modules

Control interface

This is the user interface that allows the user to assign movements to MIDI notes or hotkeys.

Computer control interface

This is the module that sends hotkeys to the computer when movements are recognized.
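
As an illustration only, on Windows a hotkey could be emitted with the Win32 SendInput API as sketched below. This is an assumption about one possible implementation, not the project's actual code; the OS X side would use a different API.

#ifdef _WIN32
#include <windows.h>

// Illustrative Windows-only sketch: emit Ctrl+S when a movement is recognized.
void sendCtrlS() {
    INPUT inputs[4] = {};
    inputs[0].type = INPUT_KEYBOARD; inputs[0].ki.wVk = VK_CONTROL;   // Ctrl down
    inputs[1].type = INPUT_KEYBOARD; inputs[1].ki.wVk = 'S';          // S down
    inputs[2].type = INPUT_KEYBOARD; inputs[2].ki.wVk = 'S';
    inputs[2].ki.dwFlags = KEYEVENTF_KEYUP;                           // S up
    inputs[3].type = INPUT_KEYBOARD; inputs[3].ki.wVk = VK_CONTROL;
    inputs[3].ki.dwFlags = KEYEVENTF_KEYUP;                           // Ctrl up
    SendInput(4, inputs, sizeof(INPUT));
}
#endif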

MIDI Interface

This is the module that can send MIDI notes when movements are recognized. We use the RtMidi library.
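
Sending a note with RtMidi is short; the sketch below shows a note-on message, with the port number, channel, note, and velocity as placeholders rather than the project's actual mapping.

#include <RtMidi.h>
#include <vector>

// Sketch of sending a MIDI note-on message with RtMidi when a movement is
// recognized.
void sendNoteOn(unsigned char note) {
    RtMidiOut midiOut;
    if (midiOut.getPortCount() == 0) return;  // no MIDI output available
    midiOut.openPort(0);                      // assumed: first available port

    std::vector<unsigned char> message;
    message.push_back(0x90);  // note-on, channel 1
    message.push_back(note);  // note number
    message.push_back(100);   // velocity
    midiOut.sendMessage(&message);
}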