
POLITECNICO DI MILANO

DEPARTMENT OF BIOENGINEERING

PHD PROGRAM IN BIOENGINEERING

A hardware and software based framework for the

development of a domestic Brain Computer Interface

PhD dissertation of:

Luca Maggi

Tutor:

Chiar.mo Prof. Antonio Pedotti

Supervisors:

Prof. Giuseppe Andreoni

Ing. Luigi Arnone

Chairman of PhD Program:

Prof. Maria Gabriella Signorini

XX Edition

2005 - 2008

A HARDWARE AND SOFTWARE BASED FRAMEWORK FOR

THE DEVELOPMENT OF A DOMESTIC BRAIN COMPUTER

INTERFACE

©LUCA MAGGI

TO EMANUELA, TO MY PARENTS,

AND TO ALL THOSE PEOPLE WHO HAVE DEVOTED MORE

THAN TEN SECONDS TO UNDERSTANDING WHAT I DO

ACKNOWLEDGEMENTS

I wish to thank all the partners who cooperated with Politecnico di Milano in the realization of this research: first of all STMicroelectronics, which funded my Ph.D. and supported my work. Many thanks also to La Nostra Famiglia IRCCS of Bosisio Parini, Ospedale San Camillo IRCCS of Venice, SXT - Sistemi per telemedicina SRL (Lecco, Italy) and BTicino (Erba, Italy).

SUMMARY

This work was carried out in the WowsLab of the TBMLab and in the Sensibilab of the Bioengineering Department of Politecnico di Milano (Milan, Italy), in collaboration with STMicroelectronics (Agrate Brianza, Italy). It aimed at providing a set of tools for the development of home Brain Computer Interfaces based on a low cost acquisition device, usable both by end-users in their domestic activities and by researchers developing new protocols and algorithms. Such a system could be the link between laboratory research activity and the domestic use of a BCI.

A Brain Computer Interface (BCI) is a man-machine interface that establishes a new communication channel between the brain and the computer. The communication is provided by a complex system that detects and identifies a specific brain activity in order to associate it with a predetermined command; the command is then sent to a computer in order to drive a specific application.

PRESENT DAY BCIs

Though many technologies are able to reveal brain activity with a high signal-to-noise ratio (SNR) and a good spatial resolution, the electroencephalographic (EEG) signal is generally adopted thanks to the technological and practical aspects that distinguish this technique from the others:

• relatively short time constants;
• it can operate in most environments;
• it requires quite simple and inexpensive equipment;
• it doesn't involve invasive sensors or stimulation;

all of which guarantee the domestic applicability of a BCI.

A typical EEG based BCI system is composed of an EEG unit for signal amplification and acquisition, and a computer which executes one or more software modules. Usually there are at least two main software modules:

• a signal processing and classification algorithm;

• the controlled application.

The operation is constrained by a set of rules that form the operating protocol, which reduces the number of degrees of freedom of the system and binds it to a limited number of well-known electrophysiological signals. These phenomena can be of endogenous or exogenous origin, depending on whether an external stimulation is needed in order to elicit the addressed signal. Depending on the nature of the signal, a stimulation module may be necessary; it can be external or embedded in the user interface, depending on the specific requirements and on the design choices.

A BCI is a communication system in which the messages or the commands that an individual sends to the external world do not pass through the brain's normal output pathways of peripheral nerves and muscles. For example, in an EEG based BCI the messages are encoded in EEG activity. A BCI thus provides the user with an alternative method for acting on the external world. BCIs fall into two classes: dependent and independent.

A dependent BCI does not use the brain's normal output pathways to carry the message, but activity in these pathways is needed to generate the brain activity (e.g. EEG) that does carry it. For example, one dependent BCI presents the user with a matrix of letters that flash one at a time; the user selects a specific letter by looking directly at it, so that the visual evoked potential (VEP) recorded from the scalp over the visual cortex when that letter flashes is much larger than the VEPs produced when other letters flash. In this case, the brain's output channel is EEG, but the generation of the EEG signal depends on gaze direction, and therefore on the extraocular muscles and the cranial nerves that activate them [Wolpaw, 2002]. The following table illustrates the classification of the most widespread BCI protocols.

Some protocols, like P300 and VEP, rely on the identification of a specific pattern within a precise time interval after the stimulus and require an extremely tight synchronization between the acquired signal and the stimulation hardware; other signals have less stringent timing requirements.

The set-up of a new BCI system involves not only technical and soft-computing knowledge, but also requires an understanding of the problems of man-machine interaction, which range from bioengineering to physiology and psychology.

The international literature attributes a key role to feedback. As a replacement for the brain's normal neuromuscular output channels, a BCI depends on feedback and on the adaptation of brain activity based on that feedback. Thus, a BCI system must provide feedback and must interact in a productive fashion with the adaptations the brain makes in response to that feedback.

Tab 1. Definition of different BCI typologies

A BCI system includes the user in a closed-loop operation. The BCI is the result of the interaction of two adaptive controllers: the user's brain, which produces the signals measured by the BCI, and the BCI itself, which translates these signals into specific commands. Successful BCI operation requires that the user develop and maintain a new skill, a skill that consists not of proper muscle control but of proper control of specific electrophysiological signals; it also requires that the BCI translate that control into output that accomplishes the user's intent.

The first studies on Brain Computer Interfaces aimed at providing supplemental control pathways to soldiers, for example to fly a plane or to activate a weapon; nowadays the BCI is considered a potential aid for people with minimal or no residual mobility caused by an invalidating pathology or a trauma. In the last decade an international community composed of many different research units has formed. The two main BCI research centres are the Graz University of Technology (Graz, Austria) and the Wadsworth Center (Albany, USA). Many groups have focused on specific aspects and proposed their own vision of a BCI: some aimed at the development of new algorithms, others focused on the protocols, and others on enhancing the command capabilities achievable with a low bit-rate interface.

OVERVIEW OF THE SYSTEM

In our vision the ideal BCI behaves like a common input device for a normal computer application: the system should be recognised by the operating system as a pointing device or a keyboard.

The proposed methodological approach for the development of a domestic BCI is presented here step by step:

• Data-set recording;

• Offline algorithm prototyping in Matlab®;

• Online test;

• Identification of the possible user-specific customization;

• High level profiling of the most important blocks (Memory, execution time);

• Optimization of the algorithm and the protocol;

• Porting into a lower level language (C);

• Execution by the ARM7 CPU of the EEG device;

• User Interface customization.

In order to achieve this target it is essential to provide both the final device and the intermediate tools necessary for the setup of the protocol and the algorithms. During this work a complete framework composed of hardware and software tools was

developed. The framework was validated by setting up a BCI system based on the

SSVEP protocol.

The framework was composed of:

• A low cost acquisition unit;

• A hardware interface module;

• A mathematical software library (C4M);

• A dynamic mathematical engine;

• A graphic user interface;

• A home automation system interface.

THE ACQUISITION UNIT

The low cost acquisition unit was called KimeraII and was composed of an EEG amplifier and a digital acquisition and elaboration unit. It was an evolution of its predecessor, Kimera: while keeping the same analog front-end architecture, it was improved by using a new, specifically designed acquisition module produced by SxT - Sistemi per Telemedicina s.r.l. (Lecco, Italy).

The EEG amplifier was developed in the laboratory and adopted a patented topology for the signal conditioning. Though being 3.3 V battery powered, the device is robust against artefacts and out-of-band noise thanks to the adopted topology, which provides fast recovery from saturation and an enhanced dynamic reserve.

The digital acquisition board executed a Real Time Operating System (RTOS) and a specifically designed firmware. The system was controlled by an ARM7 32-bit microcontroller which performed the following tasks:

• Analog to digital conversion using an external eight channel ADC;

• Local data storage on a micro-SD card in a PC compatible format (FAT16);

• Communication with the PC or a PDA by an onboard Bluetooth® module

(SPP profile);

• On board data processing using a specifically designed mathematical library

written in C language.
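The firmware itself is not reproduced here, but the coexistence of a fast ADC task with slower storage and Bluetooth tasks is a classic producer-consumer problem. The following is a minimal, hypothetical sketch (portable C++, not the actual ARM7 firmware; all names are illustrative) of the kind of ring buffer such a design could use to decouple the sampling interrupt from the consumer tasks:

```cpp
#include <array>
#include <cstddef>
#include <optional>

// Minimal single-producer/single-consumer ring buffer: the ADC task pushes
// samples, the storage/communication/DSP tasks pop them at their own pace.
template <typename T, std::size_t N>
class RingBuffer {
public:
    bool push(const T& v) {
        std::size_t next = (head_ + 1) % N;
        if (next == tail_) return false;   // buffer full: sample dropped
        buf_[head_] = v;
        head_ = next;
        return true;
    }
    std::optional<T> pop() {
        if (tail_ == head_) return std::nullopt;  // buffer empty
        T v = buf_[tail_];
        tail_ = (tail_ + 1) % N;
        return v;
    }
private:
    std::array<T, N> buf_{};
    std::size_t head_ = 0, tail_ = 0;   // one slot is kept free
};
```

A real RTOS implementation would additionally make the indices interrupt-safe (e.g. atomic), but the explicit drop-on-full policy shown here reflects a decision any fixed-memory acquisition device has to make.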

HARDWARE INTERFACE MODULE

The Hardware Interface Module (HIM) was a software tool designed to assist the researcher during the prototyping of a new BCI system. It provides a solid structure for the acquisition, storage and visualization of the signal, for the communication with the BCI application user interface, and for the real-time execution of algorithms in both C and Matlab® environments. This software was written in C++ using the cross-platform wxWidgets library.

The source code was structured in order to promote application-specific customization through inheritance and polymorphism. The basic version supports several kinds of instruments, some real, some virtual:

• Kimera version II;
• Kimera version I;
• other devices not designed for BCI use;
• an ideal signal generator;
• an instrument simulator which loads a previously recorded signal from a file and plays it back at the same sample frequency;
• an interactive file player which can load pieces of a dataset depending on the operator's needs.

It is possible to add a new instrument by deriving a specific class from the basic

Instrument model.
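As a sketch of that extension mechanism (with invented class and method names, not the actual HIM interface), a new real or virtual signal source is added by overriding the acquisition entry point of a common base class, so the rest of the framework stays source-agnostic:

```cpp
#include <cmath>
#include <cstddef>
#include <string>
#include <vector>

// Illustrative base class: every signal source exposes the same interface.
class Instrument {
public:
    virtual ~Instrument() = default;
    virtual std::string name() const = 0;
    virtual std::vector<double> acquire() = 0;  // one block of samples
};

// A minimal "ideal signal generator": one sine cycle per 8-sample block.
class SineGenerator : public Instrument {
public:
    std::string name() const override { return "SineGenerator"; }
    std::vector<double> acquire() override {
        const double pi = 3.14159265358979323846;
        std::vector<double> block(8);
        for (std::size_t i = 0; i < block.size(); ++i)
            block[i] = std::sin(2.0 * pi * static_cast<double>(i) / block.size());
        return block;
    }
};
```

A file player or a Bluetooth-backed device would derive from the same base class and only change what `acquire()` returns, which is the point of the design.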

The HIM also provides a basic algorithm class which handles all the operations related to data buffer management, performance monitoring and, if necessary, communication with Matlab®. In order to test a new DSP technique, the researcher only has to write the specific code, even in the Matlab language.

THE C4M LIBRARY

In order to simplify the porting of BCI algorithms to different platforms, especially those not supporting Matlab® based development, the C4M mathematical library was developed.

The code was carefully organized after a deep analysis of the hardware characteristics of the target computers: all the necessary optimizations were adopted in order to obtain fast execution of the functions, and special attention was dedicated to memory management. The C4M library is a powerful tool for the efficient porting of generic algorithms to single-chip embedded systems, whose speed and memory resources are limited when compared with those available in a standard PC.

A mathematical engine provides a high-level command-line language which allows new functions to be tested directly on the final hardware device: using a terminal program and a serial port, it is possible to use a standard personal computer to interact with the mathematical engine running on the device, in a manner similar to a simplified Matlab® version.

THE USER INTERFACE MODULE

The user interface was developed in collaboration with another PhD student as a flexible tool intended to simplify the implementation of new operating protocols or BCI based user applications. This module was written using an OpenGL based graphical engine in order to provide a more realistic and engaging experience to the user. A TCP/IP socket based layer manages the communication with the hardware interface module, which can run either on the same personal computer or on a different one.
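The messages crossing that socket must have an agreed byte layout. The following is a hypothetical sketch of such a wire format (a type tag, a command code such as RIGHT/LEFT/UP/DOWN/CENTRE, and a confidence score); it is illustrative only, not the thesis's actual BCIMessage structure:

```cpp
#include <cstdint>
#include <cstring>
#include <vector>

// Illustrative command message exchanged between the hardware interface
// module and the user interface over the TCP/IP layer.
struct BciMessage {
    std::uint8_t type;     // e.g. 1 = classification result
    std::uint8_t command;  // selected class
    float score;           // classifier confidence

    // Flatten to bytes before writing to the socket (host byte order).
    std::vector<std::uint8_t> serialize() const {
        std::vector<std::uint8_t> out{type, command};
        const auto* p = reinterpret_cast<const std::uint8_t*>(&score);
        out.insert(out.end(), p, p + sizeof(float));
        return out;
    }

    // Rebuild a message from received bytes.
    static BciMessage parse(const std::vector<std::uint8_t>& bytes) {
        BciMessage m{bytes[0], bytes[1], 0.0f};
        std::memcpy(&m.score, bytes.data() + 2, sizeof(float));
        return m;
    }
};
```

Since both endpoints typically run on the same PC (or at least the same architecture), host byte order is acceptable here; a protocol crossing heterogeneous machines would fix the endianness explicitly.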

THE HOME AUTOMATION SYSTEM INTERFACE

A specific software module was implemented in order to provide an interaction layer with a home automation system. A MyHome® gateway was provided by BTicino spa (Erba, Italy) and a basic demonstrator was set up in our laboratory. The standard physical communication layer of the gateway is RS-232; in order to maximize the ease of installation, a specific RS-232 to Bluetooth module was designed. The application layer was implemented using the OpenWebNet® language proposed by BTicino. Although the current demonstrator supports only a few lamps and some devices, the software module supports all the options provided by the home automation system.

THE DEVELOPMENT EXPERIENCE: THE SSVEP BASED

SYSTEM

The framework was validated by developing an SSVEP based BCI system. The visual evoked potential (VEP) is an electrical potential generated in the brain and recorded from the occipital region of the scalp after the presentation of a visual stimulus: when the stimulus repetition rate is equal to or higher than 6 Hz, the evoked response appears as a periodic activity. This activity increases the power spectral density (PSD) at the integer multiples of the stimulus flashing frequency; this particular type of brain response is known as the steady-state visual evoked potential (SSVEP).
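This spectral signature can be illustrated with a small numeric experiment (a synthetic signal, not thesis data): a non-sinusoidal periodic "response" at 8 Hz concentrates power in the 8 Hz bin and its harmonic, while an off-frequency bin stays near zero. Bin powers are evaluated directly, Goertzel-style:

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

// Power of a single DFT bin, evaluated directly.
double binPower(const std::vector<double>& x, double freqHz, double fs) {
    double re = 0.0, im = 0.0;
    const double pi = 3.14159265358979323846;
    for (std::size_t n = 0; n < x.size(); ++n) {
        double phase = 2.0 * pi * freqHz * static_cast<double>(n) / fs;
        re += x[n] * std::cos(phase);
        im -= x[n] * std::sin(phase);
    }
    return (re * re + im * im) / static_cast<double>(x.size());
}

// One second of a crude periodic "evoked response": an 8 Hz wave with a
// second harmonic, so power appears at 8 Hz and 16 Hz.
std::vector<double> periodicResponse(double fs) {
    std::vector<double> x(static_cast<std::size_t>(fs));
    const double pi = 3.14159265358979323846;
    for (std::size_t n = 0; n < x.size(); ++n) {
        double t = static_cast<double>(n) / fs;
        x[n] = std::sin(2*pi*8*t) + 0.5*std::sin(2*pi*16*t);
    }
    return x;
}
```

With a one-second window at 256 Hz the bins fall exactly on integer frequencies, so the power at 8 Hz and 16 Hz is large while an off-stimulus bin such as 11 Hz is essentially zero.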

A specifically designed stimulation system, connected to the computer by a USB port and driving four LEDs applied to the edges of the screen, was adopted. In order to simplify the signal processing, the choice of the frequencies was constrained to the set of frequencies whose period can be represented by an integer number of samples at the 256 Hz sampling rate.

The system was able to discriminate between five classes: RIGHT, LEFT, UP,

DOWN, CENTRE.

The BCI algorithm was based on a supervised multiclass classifier using Regularized Linear Discriminant Analysis. The selected features were the ratio between the amplitude of a stimulus-synchronous average of the signal and the estimated amplitude of the raw signal. A simplified surface Laplacian combined a reduced number of electrodes with a good signal-to-noise ratio.
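A hypothetical single-channel sketch of such a ratio feature (simplified: RMS amplitudes only, and none of the thesis's regularized classification) shows why it discriminates: activity locked to the stimulation period survives the folding average, while activity at other frequencies is attenuated:

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

// Stimulus-synchronous average: fold the signal over the stimulation
// period (an integer number of samples) and average the epochs.
std::vector<double> syncAverage(const std::vector<double>& x, std::size_t period) {
    std::vector<double> avg(period, 0.0);
    std::size_t epochs = x.size() / period;
    for (std::size_t e = 0; e < epochs; ++e)
        for (std::size_t i = 0; i < period; ++i)
            avg[i] += x[e * period + i];
    for (double& v : avg) v /= static_cast<double>(epochs);
    return avg;
}

double rmsAmplitude(const std::vector<double>& x) {
    double s = 0.0;
    for (double v : x) s += v * v;
    return std::sqrt(s / static_cast<double>(x.size()));
}

// Feature in the spirit of the text: amplitude of the synchronous average
// over amplitude of the raw signal. Coherent SSVEP activity yields a ratio
// near 1; uncorrelated background EEG averages toward 0.
double syncRatio(const std::vector<double>& x, std::size_t period) {
    return rmsAmplitude(syncAverage(x, period)) / rmsAmplitude(x);
}
```

A signal whose period exactly matches the folding window gives a ratio close to 1, while a signal at an unrelated frequency cancels almost completely in the average.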

The framework proved very flexible, and its wide set of debugging instruments dramatically simplified the debugging and testing of a new system. Starting from a previous experience and a set of algorithms for the SSVEP protocol, it was possible to set up a complete BCI system in the Sensibilab laboratory at CampusPoint@Lecco (Lecco, Italy) in a few weeks. The resulting system also proved usable: it was possible to complete a test of eight selections among four possibilities without errors in about one minute.

CHAPTER ORGANIZATION

This thesis is organized in five chapters plus a general conclusion. The first one is an introduction to the concept of weak users and to our objectives; the second one deals with the Brain Computer Interface and with the state of the art in this field, and aims at giving a general overview of the topic. The third chapter is focused on the development of the framework: each module will be described in detail in order to explain its specific role in the system and in the BCI research process, so this chapter also works as an introductory reference to the system. The fourth chapter describes the developed SSVEP based system.

The fifth chapter illustrates the outcomes of this work: rather than focusing on the electronic specifications, the main emphasis is on the developer's experience in using the system.

The last chapter is a general conclusion, provided in order to show the cohesion of the two parts of the thesis (the framework and the implemented system) and the importance of this study in the BCI field.

A HARDWARE AND SOFTWARE BASED FRAMEWORK FOR THE DEVELOPMENT

OF A DOMESTIC BRAIN COMPUTER INTERFACE

INTRODUCTION
  Weak users and ICF
  Our objectives

BRAIN COMPUTER INTERFACES AND ENABLING TECHNOLOGIES
  Brain Computer Interfaces
    Dependent and independent BCIs
    Two adaptive controllers
    Synchronous and asynchronous BCIs
    Protocols
    Structure of a typical BCI system
    Conclusions
  Diseases which affect autonomy
  Home automation systems

THE FRAMEWORK
  Proposed methodological approach for the development of an embedded BCI
  The system
  The C4M library
    Structure of the library
    Rules
    Structures
    Functions and conventions
    Optimizations
    Mex files
    Dynamic memory allocation manager
  The C4MEngine
    Workspace manager
    The command line parser and dynamic functions manager
  The device
    The analog design
    Results and discussion
  The digital acquisition unit
    Electronic design
    Firmware design
  Stimulation device
  Software modules
    Instrument interface module
    AEnima: the Graphic User Interface module
  Home automation system
  Cooperating user interfaces: the dynamic keyboard
    Static keyboard layout
    Dynamic keyboard layout
    Static keyboard with positioning on the most probable letter
    Comparison

DEVELOPMENT OF A SSVEP BASED BCI SYSTEM
  The protocols
    Screening protocol
    Training or calibration protocol
    Testing protocol
    BCI-User protocol
    AstroBrainFight protocol
    MyHome protocol
  Signal processing
    Signal pre-processing block
    Feature extraction block
    Window length affects frequency selectivity and resolution
    Effect of the base EEG
    Biofeedback generation
    Classification

RESULTS AND DISCUSSION
  Power requirement and DSP performances
    Test setup
    Results
  BCI users and performances
    Testing procedure
  BCI development experience

CONCLUSIONS AND FURTHER WORKS

APPENDIX A - SCHEMATICS OF THE GENERAL PURPOSE ACQUISITION BOARD

REFERENCES


Index of the figures

Fig 1. The person and all the environmental aspects which are considered in the ICF

.................................................................................................................................... 26

Fig 2 - ICF Disability model ...................................................................................... 27

Fig 3. Set up of the two levels of adaptation.............................................................. 34

Fig 4. Structure of a typical BCI system.................................................................... 38

Fig 5. Generic functional model of an EEG based BCI............................................. 38

Fig 6. Home automation for elderly or disabled people............................................. 41

Fig 7. Applications of the home automation.............................................................. 42

Fig 8. Bus structure or a home automation system .................................................... 43

Fig 9. Graphical representation of the framework components and their relation and

interconnections ......................................................................................................... 49

Fig 10. Structure of a C4M function acting as a gateway function from Matlab®

environment to c4m library........................................................................................ 56

Fig 11. Dynamic memory buffer management .......................................................... 58

Fig 12. Variable descriptor......................................................................................... 59

Fig 13. Structure of the c4mAmxFunctionStruct structure........................................ 60

Fig 14. Physical realization of the EEG unit.............................................................. 61

Fig 15. The amplification chain proposed by the OpenEEG project ......................... 62

Fig 16. Block diagram of the system ......................................................................... 63

Fig 17. Frequency response of a system with 500V/V gain and a 2nd order low pass

75Hz filter .................................................................................................................. 64

Fig 18. Bode plot of the compensate filter against the original one .......................... 65

Fig 19. Phase diagram of the resulting GLoop .......................................................... 66

Fig 20. Root locus of a 4th order system ................................................................... 66

Fig 21. Root locus of a 9th order system ................................................................... 67

Fig 22. Main schematic of the digital acquisition board............................................ 70

Fig 23. Firmware structure......................................................................................... 72

Fig 24. Screenshot of the Hardware Interface module: A - Main window, B - Signal

plot window,............................................................................................................... 74

Fig 25. Current hierarchical tree of the instrument class ........................................... 75

Fig 26. Structure of the AEnima module ................................................................... 78

Fig 27. Structure of a BCIMessage ............................................................................ 79

Fig 28. Asymmetry of BCI communication bandwidth............................................. 81

Fig 29. Keyboard structure......................................................................................... 82

Fig 30. Layout composing strategy............................................................................ 83

Fig 31. Typical evolution of the population. The black line represents the best

performance, the blue dots the average performance of the population .................... 85

Fig 32. Static keyboard layout ................................................................................... 85

Fig 33. Static keyboard with positioning on the most probable letter ....................... 85

Fig 34. 10-20 reference system .................................................................................. 89

Fig 35. Screening protocol GUI (Active phase)......................................................... 91

Fig 36. Screening application..................................................................................... 91

Fig 37. Training sequence .......................................................................................... 92

Fig 38. Main window of the training application....................................................... 93

Fig 39. The user during the testing phase .................................................................. 94

Fig 40. Main menu ..................................................................................................... 95

Fig 41. Astrobrainfight application ............................................................................ 96

Fig 42. Main menu of MyHome protocol .................................................................. 97

23

Fig 43. Cursor movement grid ................................................................................... 97

Fig 44. Planimetry of the acquisition room................................................................ 98

Fig 45. Structure of the algorithm.............................................................................. 99

Fig 46. Most common spatial filters .......................................................................... 99

Fig 47. Feature plot in time ...................................................................................... 101

Fig 48. Selectivity plot at 8 Hz ................................................................................. 102

Fig 49. Frequency selectivity with 2, 3 and 4 second windows ............................... 102

Fig 50. Frequency response to a sinusoidal signal with white noise ....................... 103

Fig 51. Comparison of the response with and without band-pass filter................... 104

Fig 52. Three different situations: (A) low separability due to high variance and low distance, (B) better separability due to variance reduction, (C) good separability due to high distance and low variance ................................................................................ 105

Fig 53. Time course of the feedback on the basis of the evaluation window length106

Fig 54. Time plot of the debug pin and measurement of elaboration time .............. 113

Fig 55. Time plot of the debug pin and measurement of communication time ....... 113

Fig 56. SD card profiling ......................................................................................... 114

Fig 57. Screening subject A – SSVEP response at all stimulation frequencies,

presence of harmonics.............................................................................................. 115

Fig 58. Screening subject B - SSVEP response only in the upper frequencies, no

harmonics ................................................................................................................. 116

Fig 59. Subject C - SSVEP response on some frequencies, some have harmonics,

other not ................................................................................................................... 116

Fig 60. A subject during the acquisition .................................................................. 118


Index of the tables

Table 1. Comparison of two different coding styles: optimized on the left, more readable on the right ................................................................................................... 55

Table 2. Required operations corresponding to the previous table............................ 55

Table 3. Genetic algorithm parameters ...................................................................... 84

Table 4. Comparison of different keyboard layouts................................................... 86

Table 5. Power consumption in different configurations......................................... 112

Table 6. Subject performances using the SSVEP based BCI................................... 119


INTRODUCTION

Home automation and ambient intelligence are long-term objectives of international research; multidisciplinary skills ranging from computer science to microelectronics and mechanics are involved in an integration process aimed at the definition of new scenarios and solutions. In the future, the surrounding environment will be characterized by an integrated, non-intrusive technological presence able to perceive and control environmental conditions according to the user's wishes or needs [Andreoni, 2007].

In the last decade, the structure of the family nucleus in the most developed countries has changed, making the presence of elderly or disabled people in the home more frequent. The increased awareness of the needs of so-called "weak users" makes it necessary to develop enabling technologies that improve the usability of the environment. The International Classification of Functioning, Disability and Health (ICF) is the World Health Organization's (WHO) framework for measuring health and disability at both individual and population levels. The ICF was officially endorsed by 191 WHO Member States in the Fifty-fourth World Health Assembly on 22 May 2001 (resolution WHA 54.21).

The following paragraph gives an overview of the current approach to disability defined by the ICF.

Weak users and ICF

The overall aim of the developers of the ICF was to provide a unified and standard

language and framework for the description of all aspects of human health and some

health-relevant aspects of well-being [WHO, 2001].

The ICF comprises a biopsychosocial model in which a person's functioning and disability are conceived as a dynamic interaction between health conditions and both environmental and personal contextual factors [Badley, 2008]; thus, as shown in Fig 1, an ICF-compliant classification should take into account all the aspects related to the interaction between the person and the environment.


Fig 1. The person and all the environmental aspects which are considered in the ICF

The ICF organizes the domains of activity and participation into sub domains. The

sub domains are the same for both domains and include the following:

• Learning and applying knowledge;

• General tasks and demands;

• Communication;

• Mobility;

• Self-care;

• Domestic life;

• Interpersonal interactions and relationships;

• Major life areas; and

• Community, social, and civic life.

The ICF identifies three levels of human function: functioning at the level of body

parts, the whole person, and the whole person in their complete environment. These

levels, in turn, contain three domains of human function: body functions and

structures, activities, and participation [Jette, 2006].

Each ICF component can be expressed in both positive and negative terms. At one end of this scale are the terms that indicate non-problematic (i.e., neutral and positive) aspects of health and health-related states, and at the other end are the terms that can be used to indicate problems. Non-problematic aspects of health are summarized under the umbrella term "functioning," whereas "disability" serves as an umbrella term for impairment, activity limitation, or participation restriction [Steiner, 2002].


The ICF also provides a model of functioning and disability, which reflects interactions between the components of the ICF. The Model of Functioning and Disability is a biopsychosocial model designed to provide a coherent view of the various dimensions of health at biological, individual, and social levels.

Fig 2 - ICF Disability model

As illustrated in Fig 2, an individual's functioning or disability in a specific domain represents an interaction between the "health condition" (e.g., diseases, disorders, injuries, traumas) and the contextual factors (i.e., "environmental factors" and "personal factors"). The interactions of the components in the model are bidirectional, and interventions in one component can potentially modify one or more of the other components.

Our objectives

This work is a technological approach that aims at improving the ability of severely disabled people to interact with the external environment; it therefore acts not on the user's health condition but on the activity and participation layer, contributing to an increase in the disabled person's quality of life.

“A BCI is a communication system in which messages or commands that an individual sends to the external world do not pass through the brain’s normal output pathways of peripheral nerves and muscles” [Wolpaw, 2002]. In other words, a BCI is a communication device that detects a specific brain activity and associates it with a command in order to control an application.


Until a few years ago, BCI research was mainly focused on the development of new protocols and better-performing algorithms; most of the scientific results were proven on healthy people, and systems were developed without taking into account their real applicability to daily life. In the last two years a great effort has been made to bring the BCI into real users' homes: the availability of low-cost, high-performance computers has eased the set-up of such systems, but problems related to system complexity, software licence costs, and the portability and cost of the hardware still slow down this process: "… there are a number of impediments in efforts to provide BCIs to potential users. First and foremost, all current BCIs share the common problem that they require excessive effort to set up, calibrate, and operate. Most BCIs are currently at the research stage: a research team of technicians, engineers, computer scientists, psychologists and neurologists is necessary for implementation, data analysis and maintenance. Thus, BCIs are not currently in people's homes for continuous use in their daily lives" [Kubler, 2006].

This work aimed at providing a set of tools for the development of home Brain Computer Interfaces based on a low-cost acquisition and processing device, together with all the software modules necessary to ease the interfacing of the system with external devices (e.g., home automation systems).

This work is not a mere miniaturization of existing devices, but the definition of the new acquisition conditions, system architecture and algorithm structures which must be considered to ensure the feasibility of the project. The developed tools form both a framework for laboratory research and a technological platform for the development of user-oriented systems which could be used by end-users in their domestic activities.

From the point of view of BCI research, the framework will lower the technological and economic barriers to the entrance of new research groups into the BCI field; moreover, the adoption of a common framework will assure the portability of new solutions among the multiple developed user applications.

From the point of view of the final user, the adoption of a common framework will ease the process of technology assessment necessary to turn a prototype into a product, thus reducing the time to availability and the costs of commercialization.

This work was developed in the WowsLab of the TBMLab and in the Sensibilab of the Bioengineering Department of Politecnico di Milano (Milan, Italy) in collaboration with STMicroelectronics (Agrate Brianza, Italy); a permanent demonstrator was installed in the Sensibilab of the Industrial Design Department of Politecnico di Milano (Lecco, Italy).



BRAIN COMPUTER INTERFACES AND ENABLING TECHNOLOGIES

Brain computer interfaces

A Brain Computer Interface (BCI) is a system that implements a direct

communication pathway between the brain and a controlled device. Conditions such

as amyotrophic lateral sclerosis (ALS), brainstem stroke, and brain or spinal cord

injury can impair the neural pathways that control muscles or the muscles

themselves. People who are most severely affected may lose all or nearly all voluntary muscle control, even eye movements and respiration, and may be essentially "locked in" to their bodies, unable to communicate in any way or limited to slow, unreliable single-switch methods. The ultimate goal of brain–computer interface (BCI) technology is to provide communication and control capacities to people with severe motor disabilities [Vaughan, 2006].

Many studies over the past 20 years show that the scalp-recorded

electroencephalogram (EEG) can be the basis for brain–computer interfaces (BCIs)

[Kuebler, 2001; Donoghue, 2002; Birbaumer, 1999; Wolpaw, 1991] that restore

communication and control to these severely disabled individuals: among the other

technologies (fMRI, NIRS), the electroencephalographic (EEG) signal was adopted

thanks to the technological and practical aspects that distinguish this technique

among the others:

• relatively short time constants;

• can operate in most environments;

• require quite simple and inexpensive equipment;

• doesn’t involve the use of invasive sensor or stimulation;

• which guarantee the domestic applicability of a BCI.

A typical EEG based BCI system is composed of an EEG unit for the signal

amplification and acquisition and a computer which executes one or more software

modules. Usually there are at least two main software modules:

• a signal processing and classification algorithm;

• the controlled application.


The operation is constrained by a set of rules that form the operative protocol, in order to reduce the number of degrees of freedom of the system and to bind it to a limited number of well-known electrophysiological signals. These phenomena can be of endogenous or exogenous origin, depending on whether an external stimulation is needed to elicit the addressed signal. Depending on the nature of the signal, a stimulation module may be necessary; it can be external or embedded in the user interface, depending on the specific requirements and on the design choices.

Dependent and independent BCIs

A BCI is a communication system in which the messages or the commands, that an

individual sends to the external world, do not pass through the brain’s normal output

pathways of peripheral nerves and muscles. A BCI provides its user with an

alternative method for acting on the world. BCIs fall into two classes: dependent

and independent. A dependent BCI does not use the brain’s normal output pathways

to carry the message, but activity in these pathways is needed to generate the brain

activity (e.g. EEG) that does carry it. For example, one dependent BCI presents the

user with a matrix of letters that flash one at a time, and the user selects a specific

letter by looking directly at it so that the visual evoked potential (VEP) recorded

from the scalp over visual cortex when that letter flashes is much larger than the

VEPs produced when other letters flash. In this case, the brain’s output channel is

EEG, but the generation of the EEG signal depends on gaze direction, and therefore

on extraocular muscles and the cranial nerves that activate them [Wolpaw, 2002].

Approaches for independent BCIs are of greater theoretical interest than for

dependent BCIs, because they offer the brain a completely new output pathway and

are likely to be more useful for people with the most severe neuromuscular disabilities

[Krepki, 2007].

Two adaptive controllers

A BCI system includes the user in a closed loop operation. The BCI is the result of

the interaction of two adaptive controllers: the user’s brain, which produces the

signals measured by the BCI; and the BCI itself, which translates these signals into

specific commands. Successful BCI operation requires that the user develop and maintain a new skill, a skill that consists not of proper muscle control but rather of


proper control of specific electrophysiological signals; and it also requires the BCI to

translate that control into output that accomplishes the user’s intent.

Fig 3. Set up of the two levels of adaptation

This form of adaptation is obtained using the strategy described in Fig 3. In the first phase, a learning session without feedback is performed in order to calibrate the classification parameters; during the second phase, the user tests the classifier's performance and learns how to control his/her own EEG patterns. If necessary, the calibration can be repeated in order to follow the improvement in the repeatability of the EEG patterns. The international literature attributes a key role to the feedback and to the adaptation of brain activity based on that feedback: BCI systems must provide feedback and must interact in a productive fashion with the adaptations the brain makes in response to that feedback, in order to maximize the performance of the whole system.

Synchronous and asynchronous BCIs

The mode of operation determines when the user performs a mental task and,

therewith, intends to transmit a message. In principle, this step can be divided into

two distinct modes of operation, the first being externally paced by a cue stimulus

and the second internally paced without using any cue stimulus. In the former case,

the brain signal has to be analyzed and classified synchronously with each cue stimulus in fixed, predefined time windows. Such a system is therefore termed cue-based or a synchronous BCI. After each cue stimulus (visual or auditory), the user has to act and produce a specific brain state [Pfurtscheller 2001, 2006].

In the case of an internally or self-paced BCI, no cue stimulus is necessary and the user intends an operation at free will. This kind of BCI is called asynchronous or un-cued. Such an asynchronous protocol requires the continuous, sample-by-sample analysis and feature extraction of the recorded signal; thus, it is more demanding and more complex than BCIs operating with a fixed time scheme. The main issue here is to discriminate between intentional brain states, resulting in true positive output signals, and non-intentional states, resulting in false positives.
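The trade-off between true and false positives in an un-cued protocol can be illustrated with a minimal sketch (window, step and threshold values are invented for illustration; this is not the detector developed in this work): a window slides continuously over the recording and a command is flagged only when the window energy clearly exceeds the baseline.

```python
import numpy as np

def asynchronous_detector(signal, window=64, step=16, threshold=2.0):
    """Slide a window over the signal and flag 'intentional' segments.

    A segment is flagged when its RMS amplitude exceeds `threshold`
    times the RMS of the whole recording; raising the threshold reduces
    false positives at the cost of missed intentional states.
    """
    baseline = np.sqrt(np.mean(signal ** 2))
    detections = []
    for start in range(0, len(signal) - window + 1, step):
        seg = signal[start:start + window]
        rms = np.sqrt(np.mean(seg ** 2))
        if rms > threshold * baseline:
            detections.append(start)
    return detections
```

In a real asynchronous BCI the decision statistic would of course be a classifier output rather than raw RMS, but the windowed, continuously running structure is the same.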

Protocols

It is possible to identify two main approaches typically accepted in the specific BCI

field in order to generate those brain activity variations used as the input control

commands of a system. The first is the operant conditioning approach based on the

self-driving and self-regulation of the EEG response. The second is the cognitive

states detection approach (also known as pattern recognition approach) based on the

identification of specific brain variations in response to a predefined cognitive mental

task. Starting from this distinction, BCIs can then be categorized accordingly to what

kind of imagery or mental tasks the users need to perform in order to drive or evoke

the command-related EEG response. This choice is strictly correlated with the type

of brain activity or neuromechanism that is going to be used as input in a BCI

system. According to this categorization, the following typical BCI paradigms can be

identified [Wolpaw, 2002]:

Mu rhythm control. The mu rhythm is similar to the alpha rhythm but shows substantial physiological and topographic differences. It can be detected over the motor or somatosensory cortex in the 8-13 Hz frequency band. The subject can learn to control the amplitude of these waves in relation to actual or imagined limb movements, by performing intense mental tasks, or simply by increasing his/her selective attention.

Event-Related Synchronization/Desynchronization (ERS/ERD) [Pfurtscheller 1998, 1999, 2001]. These signals are recordable as movement-related increments (ERS) or decrements (ERD) of power in specific frequency bands. They are localized over the sensorimotor cortex and their amplitude can be modified through active feedback. These signals exist even in the case of imagined movement. The cerebral activity related to the programming of a movement without the actual muscular activity is known as motor imagery; the preparation of the movement does not require any peripheral activity, as it is just a preparation task. The main difference between imagined and executed movement is that the former is just a preliminary stage of the latter, whose evolution is blocked at an unspecified cortico-spinal level.

P300 [Wolpaw, 2002; Beverina 2004]. Rare or particularly significant auditory, visual, or somatosensory stimuli, when interspersed with frequent or routine stimuli, typically evoke in the EEG over parietal cortex a positive peak at about 300 ms. This wave is best recorded over the median line of the scalp.

Short-latency Visual Evoked Potentials (VEP) [Sutter, 1992]. VEPs represent the exogenous response of the brain to fast and short visual stimuli; they are recorded over the visual cortex, corresponding to the occipital region of the scalp.

Slow Cortical Potentials (SCP). Among the lowest-frequency features of the scalp-recorded EEG are slow voltage changes generated in cortex, consisting of potential shifts which occur over 300 ms to many seconds.

Steady-State Visual Evoked Potentials (SSVEP) [Andreoni, 2004]. Visually evoked potentials are the result of those electrophysiological phenomena that reflect visual information processing at the CNS level. The evoked response is considered transient if the stimulation frequency is less than or equal to 2 Hz, and steady-state if the visual stimulus repetition rate is greater than 6 Hz. In the latter case the resulting EEG signal shows an additional periodic response, which leads to an increased spectral power localized at the integer multiples of the stimulus repetition rate. The elicited signal is particularly relevant over the occipital regions of the scalp. The amplitude of the SSVEP is strictly related to the user's attentive condition; thus it can be modulated and controlled by providing the subject with an efficient feedback index.
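The increased spectral power at integer multiples of the stimulus rate can be quantified roughly as follows (function name, band half-width and harmonic count are illustrative assumptions, not the detector used in this work):

```python
import numpy as np

def ssvep_harmonic_power(eeg, fs, stim_freq, n_harmonics=2, bw=0.5):
    """Sum the spectral power of `eeg` in narrow bands centred on the
    stimulation frequency and its integer multiples (harmonics).

    A large value relative to the same measure at neighbouring
    frequencies suggests an SSVEP response at `stim_freq`.
    `bw` is the half-width of each band in Hz.
    """
    spectrum = np.abs(np.fft.rfft(eeg)) ** 2
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    power = 0.0
    for k in range(1, n_harmonics + 1):
        band = (freqs >= k * stim_freq - bw) & (freqs <= k * stim_freq + bw)
        power += spectrum[band].sum()
    return power
```

Comparing this measure across the set of stimulation frequencies (one per command) yields a simple SSVEP command detector.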

Structure of a typical BCI system

As stated before, a BCI has to exploit the information about the brain's physiological activity in order to understand the user's intention. In order to perform such an operation, the system is typically composed of the following blocks:

• a data acquisition system;

• a real-time data processing system;

• a user interface.

The most widespread method for the acquisition of brain activity is the electroencephalogram (EEG). The EEG signal reflects the electrophysiological activity of the brain and can be recorded by means of superficial electrodes applied on the scalp [Kennedy, 2000]. In order to extract the user's will from the signal there is a data processing system: this system must process all the data as fast as possible in order to provide real-time operation. The user has to perform some kind of activity in order to produce the signal associated with a specific command: each command is identified by a class of similar brain activities.

The first part is a block which performs data pre-processing in order to reject artefacts and to increase the signal-to-noise ratio (SNR); typical kinds of processing are band-pass filtering, adaptive filtering and envelope extraction.
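As a sketch of the band-pass pre-processing stage, a crude zero-phase filter can be built by zeroing FFT bins outside the band of interest (illustrative only; a real-time system would need a causal IIR or FIR filter, since the whole signal is not available in advance):

```python
import numpy as np

def bandpass_fft(x, fs, low, high):
    """Crude zero-phase band-pass: zero out all FFT bins outside
    [low, high] Hz and transform back.

    This only illustrates the SNR-raising role of the pre-processing
    block; it is not suitable for sample-by-sample real-time operation.
    """
    spectrum = np.fft.rfft(x)
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    spectrum[(freqs < low) | (freqs > high)] = 0.0
    return np.fft.irfft(spectrum, n=len(x))
```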

After that, a feature extraction block extracts the parameters (called 'features') of the signal which can be used to discriminate between the different classes of possible commands. The goal of feature extraction is to find a suitable representation (signal features) of the electrophysiological data that simplifies the subsequent classification or detection of brain patterns [Pfurtscheller, 2006]. Typical features are the power density in certain areas of the spectrum, the value and the index of the spectral maximum, and the root mean square amplitude.
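For a single EEG window, the features listed above can be computed roughly as follows (the band edges are set to the mu band purely for illustration; the function name is not from this work):

```python
import numpy as np

def extract_features(x, fs, band=(8.0, 13.0)):
    """Build a feature vector from one EEG window: power in a spectral
    band, value and frequency of the spectral maximum, and the RMS
    amplitude of the window.
    """
    spectrum = np.abs(np.fft.rfft(x)) ** 2 / len(x)
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    band_power = spectrum[in_band].sum()
    peak_idx = int(np.argmax(spectrum))
    return np.array([band_power,            # power density in the band
                     spectrum[peak_idx],    # value of the maximum
                     freqs[peak_idx],       # frequency of the maximum
                     np.sqrt(np.mean(x ** 2))])  # RMS amplitude
```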

The feature vector identifies a multidimensional space inside which each class occupies a different region. Given the coordinates of the signal in the feature space, the classifier has to identify the discrimination border between the regions in order to determine to which class the signal corresponds.
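The role of the discrimination step can be shown with a minimal nearest-centroid classifier over the feature space (real BCIs typically use LDA, SVM or similar; this sketch only illustrates where the classifier sits in the chain):

```python
import numpy as np

class NearestCentroidClassifier:
    """Each command class is represented by the centroid of its training
    feature vectors; a new vector is assigned to the closest centroid,
    which implicitly defines the discrimination borders between classes.
    """

    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.centroids_ = np.array(
            [X[y == c].mean(axis=0) for c in self.classes_])
        return self

    def predict(self, X):
        # Distance of every sample to every class centroid.
        d = np.linalg.norm(X[:, None, :] - self.centroids_[None], axis=2)
        return self.classes_[np.argmin(d, axis=1)]
```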

After the class is identified, it is possible to associate it with a command. The command is executed by the user interface: it can be a word-processing application, a functional electrical stimulation of some muscles or the activation of a home automation device. Usually a feedback device is used to let the user know the quality of the produced signal and to understand the best mental strategy to produce a better signal in terms of separation of the classes.


Fig 4. Structure of a typical BCI system

From the point of view of the functional model of the system, many architectures have been proposed for the development of an EEG-based BCI [Mason, 2003]; the following picture illustrates the most general one.

Fig 5. Generic functional model of an EEG based BCI

According to Fig 5, the control display can also be embedded in the controlled device; a visual stimulation device has been considered for those protocols which need an external stimulation to elicit a specific EEG pattern. A synchronization signal from the stimulation device to the feature extractor is sometimes necessary [Beverina, 2004].

Conclusions

The following table illustrates the classification of the most diffused BCI protocols. Some protocols, like P300 and VEP, rely on the identification of a specific pattern within a precise time interval after the stimulus and require an extremely tight synchronization between the acquired signal and the stimulation hardware; other signals are characterized by less demanding requirements [Piccione, 2006].

The set-up of a new BCI system does not involve only technical and soft-computing knowledge, but also requires an understanding of the problems related to man-machine interaction, which range from bioengineering to physiology and psychology.

The first studies about Brain Computer Interfaces aimed at providing supplemental control pathways to soldiers in order to fly a plane or to activate a weapon; nowadays the BCI is considered a potential aid for people with minimal or no residual mobility caused by an invalidating pathology or a trauma. In the last decade a sort of international community composed of many different research units was born. The two main BCI research centres are the Graz University of Technology (Graz, Austria) and the Wadsworth Center (Albany, USA). Many groups have focused on specific aspects and proposed their own vision of a BCI: some aimed at the development of new algorithms, others focused on the protocols, and others on the enhancement of the command capabilities using a low bit-rate interface.

Diseases which affect autonomy

Potential beneficiaries of assistive or rehabilitative BCI technology can be classified by disease, injury, or functional impairment. Diseases can be broadly divided into two categories: progressive and non-progressive. Progressive conditions include neuromuscular diseases (e.g., amyotrophic lateral sclerosis (ALS) and muscular dystrophy) in which muscle use is typically lost over time, potentially leading to a full loss of movement capability ("locked-in" condition), whereas non-progressive conditions include stroke, traumatic brain injury, spinal cord injury, and amputation [Bauer, 1979; Chia, 1991; Leon-Carrion, 2002].

Within the degenerative conditions the most disabling ones are:

• Muscular dystrophy;

• Multiple sclerosis;

• Amyotrophic lateral sclerosis.

The generic term muscular dystrophy refers to a group of pathologies that determine a gradual atrophy of the musculature in some parts of the body and a general sense of weakness. The various forms of dystrophy are differentiated by the age of onset, the severity of the lesions, and the rapidity of the progression. In all these cases microscopic anomalies of the skeletal musculature are observed. The most common form in children is Duchenne dystrophy: the first symptoms usually appear at an age between two and six years and quickly worsen. Duchenne dystrophy generally leads to paralysis, and death is caused by atrophy of the respiratory muscles.

Multiple sclerosis (MS) is a chronic, inflammatory, demyelinating disease that affects the central nervous system (CNS). MS results in a thinning or complete loss of myelin and, less frequently, the cutting of the neurons' extensions, or axons. When the myelin is lost, the neurons can no longer effectively conduct their electrical signals. The most common symptoms range from the loss of movement function to perceptual and behavioural dysfunctions.

Amyotrophic Lateral Sclerosis (ALS) is a progressive, usually fatal, neurodegenerative disease caused by the degeneration of motor neurons. This disorder causes muscle weakness and atrophy throughout the body as both the upper and lower motor neurons degenerate, ceasing to send messages to the muscles. Without any input the muscles gradually weaken, develop fasciculations because of denervation, and eventually atrophy. With voluntary muscle action progressively affected, patients in the later stages of the disease may become totally paralyzed and require assistive breathing devices.

Spinal Muscular Atrophy (SMA) is a term applied to a number of different disorders, all having in common a genetic cause and the manifestation of weakness due to loss of the motor neurons of the spinal cord and brainstem. The different kinds of SMA are differentiated by the age of onset and by the severity of the disorder.

Non-progressive diseases are usually caused by traumatic lesions or strokes, which can lead to various forms of disability. The most disabling condition is the so-called "locked-in syndrome", a rare neurological disorder characterized by complete paralysis of the voluntary muscles in all parts of the body except for those that control eye movement. It may result from traumatic brain injury, diseases of the circulatory system, diseases that destroy the myelin sheath surrounding nerve cells, or medication overdose. Individuals with locked-in syndrome are conscious and can think and reason, but are unable to speak or move: the disorder leaves them completely mute and paralyzed. Communication may be possible with blinking eye movements. In rare cases some patients may regain certain functions, but the chances for motor recovery are very limited.

The needs of these different groups of potential users may differ profoundly, and for those with progressive disorders they may change substantially over time. Thus, it seems most effective to focus on addressing the level of functional loss rather than targeting particular diseases.

Home automation systems

Home automation (or Domotics) is a field within building automation, specializing

in the specific technical requirements of private homes and in the application of

technologies for the comfort and security of its residents.

Many technological fields are involved in the realization of a home automation system, ranging from electronics and computer science to communication networks and the Internet. From the application point of view, this technology will change the interaction paradigm between the user and the domestic environment: while generally conceived as a commercial technology for appealing consumer applications or extremely high-quality buildings, the application of home automation to the environments where disabled people live can significantly increase the quality of their life.

Fig 6. Home automation for elderly or disabled people


From the technological research point of view, the main focus is the creation of a smart system able to control and integrate the different installations in the house, such as:

• Heating, ventilating, and air conditioning (HVAC);

• Lighting;

• Water delivery;

• Access control;

• Audio and video switching and distribution;

• Intercommunication;

• Remote process monitoring and control;

in a more efficient fashion, thus achieving reduced power consumption and an increased sensation of comfort and safety [Andreoni, 2006].

Fig 7. Applications of the home automation

A typical home automation system is composed of sensors, actuators and user interfaces; all these components are usually connected by a bus and coordinated by a controlling system. Since the control tasks are often trivial, the system can also be managed by a distributed computing system embedded in the sensors and the actuators.

Fig 8. Bus structure or a home automation system

Home automation systems can be distinguished on the basis of the technology used to implement the bus; many producers have proposed their own physical and logical layers, and the lack of compatibility and standardization is currently slowing the diffusion of this technology. The following paragraphs give a general overview of the available interconnection technologies:

EIB (European Installation Bus) is an open standard usable with different physical layers (twisted pair, Ethernet, powerline). It is quite widespread and is the basis of the more modern Konnex bus;

EHS (European Home System) was aimed at home appliance control and communication using PLC (power line communications), and was developed by EHSA, the European Home Systems Association. After merging with two other protocols, it is now part of the KNX protocol and has a chance to be the basis for the first open standard for home and building control;

X10 is the American standard for home automation and is based on a central system which sends commands using a signal modulation on the power line. It is very popular for the inexpensiveness of its components, but data transmission is so slow that the technology is confined to turning devices on and off or other very simple operations.

KNX (Konnex) is the successor to, and convergence of, three previous standards: the

European Home Systems Protocol (EHS), BatiBUS, and the European Installation

Bus (EIB). The KNX standard is administered by the Konnex Association. KNX

defines several physical communication media:


• Twisted pair wiring (inherited from the BatiBUS and EIB Instabus

standards);

• Power line networking (inherited from EIB and EHS - similar to that used by

X10);

• Radio;

• Infrared;

• Ethernet (also known as EIBnet/IP or KNXnet/IP).

KNX is designed to be independent of any particular hardware platform. A KNX Device Network can be controlled by anything from an 8-bit microcontroller to a PC, according to the needs of a particular implementation;

SCS-BUS: The SCS is a proprietary bus technology proposed by BTicino. It is based on a distributed system of actuators and commands which are connected by a twisted pair cable.



THE FRAMEWORK

In this chapter a detailed description of the developed framework will be proposed.

In the first part a methodological approach for the development of an embedded BCI

based on a generic operative protocol will be discussed; in the second part, after a

general overview of the system, each part will be presented in detail.

The purpose of this chapter is to act as an introductory reference for new developers approaching the system. From the development point of view, the platform exports a set of blocks with documented interfaces which can be considered as independent representations of the three technological activities involved in the development of a BCI:

• EEG instrumentation;

• Signal processing;

• Application customization.

The framework is composed of:

• A low cost acquisition unit;

• A hardware interface module;

• A mathematical C library (C4M);

• A dynamic mathematical engine;

• A graphic user interface;

• A home automation system interface.

Proposed methodological approach for the development of

an embedded BCI

The development process doesn't significantly differ from the one currently adopted by most research groups; a few aspects and phases, which make the process more "product oriented" or "user oriented" rather than research focused, were added to a consolidated and validated approach. The proposed development process is shown in the following flowchart and is composed of the following steps:


• Recording of a dataset containing the signals of interest for the given

protocol, (this step can be omitted if compatible databases are already present

in the literature);

• Offline algorithm prototyping in Matlab or other languages oriented to signal

processing provided that they export an interface with other software for the

following step;

• Online test using the Matlab algorithm in order to assess the performances and the usability of the system.

When the algorithm is satisfactory and the structure can be considered definitive it is

possible to proceed to the next step of the process:

• High level profiling of the most important blocks in terms of memory

requirement and execution time;

• Algorithm optimization in order to reduce any found overhead;

• Porting to C language in order to make it compatible with almost any

electronic platform;

• Execution of the translation algorithm by the on board CPU of the device.

The aspects which differentiate this development process from that of current day BCIs are located in the second part of the list and are dedicated to algorithm optimization and porting to C language; thus they will be discussed in detail. During this part of the system development it is also possible to perform a parallel and partially independent activity dedicated to user interface customization, on the basis of the chosen operative protocol and of the kind of application the final user will need to control.

When the algorithm is validated and thus satisfies the addressed performance target, a high level analysis of the processing is done in order to identify the most resource consuming blocks. When the workload appears to be concentrated in any specific block, a deeper analysis should be done in order to understand whether the introduction of optimized or approximated ad-hoc solutions is effective.

This activity is independent of the programming language used and can be performed both on the basis of theoretical analysis and using the profiling tools available in the development environment; the effects of any approximation should be evaluated offline using the previously recorded datasets.


The following guidelines are proposed for the high level optimization activity; each guideline is associated to one of the following groups:

• Level I: activities which do not affect the results of the function and do not

increase code complexity;

• Level II: approximations which are not mathematically equivalent to the original code and which could affect the global performances of the algorithm;

• Level III: optimizations which strongly affect code complexity in order to

achieve better performances.

It is worth underlining that not all the activities must be performed: on the basis of the associated mark, the following aspects should be taken into account:

Level I activities must be performed or the resulting code will be bulky and

inefficient and will not be suitable for use in more complex algorithms;

Level II activities should be taken into account if the actual algorithm resource

requirement are above 80% of the available resources;

Level III activities should be done only when the actual code is too complex for real

time execution.

The rules adopted in our approach are:

1. When blocks with a linear transfer function, or blocks that satisfy the requirements of the commutative property, are present, it is possible to move those blocks which reduce the dimensionality of the elaborated signal to the beginning of the algorithm, thus reducing the computational workload and the memory required by the following blocks – Level I;

2. When the algorithm contains a memory of the previous states, the use of an autoregressive structure usually helps reducing the usage of memory – Level II;

3. When available, adopt simplified or approximated estimations of variables –

Level II;

4. Usually mathematical languages represent data at the maximum allowed working precision: in most cases it is useful to introduce an adequate data scaling, thus reducing RAM and CPU usage – Level I;

5. The mathematical definition of a quantity is often not the fastest way to compute it; there are a lot of scientific publications related to optimized calculations [Cambridge university, 2007] – Level I;


6. Algorithms are often based on signal elaboration windows which overlap over time: the introduction of an incremental structure can dramatically reduce the elaboration time – Level III;

7. Relevant code portions are often composed of common functions which perform similar intermediate calculations in order to reach their results: it is possible to globally rewrite those code portions in order to achieve a considerable performance gain.
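Rule 6 can be illustrated with a minimal sketch (hypothetical code, not part of the C4M library): the mean over overlapping windows is updated with one addition and one subtraction per sample, instead of re-summing the whole window at every step.

```c
#include <stddef.h>

/* Sketch of rule 6 (hypothetical, not C4M code): incremental mean over a
   sliding window of WIN_LEN samples. Each new sample costs one add and one
   subtract, instead of WIN_LEN-1 adds for a full recomputation. */
#define WIN_LEN 4

typedef struct {
    double ring[WIN_LEN];  /* circular buffer of the last samples */
    double sum;            /* running sum of the window           */
    int    idx;            /* next write position                 */
    int    filled;         /* number of valid samples so far      */
} SlidingMean;

double sliding_mean_push(SlidingMean *s, double x)
{
    if (s->filled == WIN_LEN)
        s->sum -= s->ring[s->idx];  /* drop the oldest sample */
    else
        s->filled++;
    s->ring[s->idx] = x;
    s->sum += x;                    /* add the newest sample  */
    s->idx = (s->idx + 1) % WIN_LEN;
    return s->sum / s->filled;
}
```

Feeding the samples 1, 2, 3, 4, 5 yields the means 1, 1.5, 2, 2.5 and then 3.5 once the first sample leaves the window.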

The system

The framework provides both the final device and all the intermediate tools

necessary for the setup of the protocol and the algorithms following the depicted

development approach.

The framework is composed of:

• A low cost acquisition unit;

• A mathematical C library (C4M);

• A dynamic mathematical engine (C4MEngine);

• A hardware interface module;

• A graphic user interface (AEnima);

• A home automation system interface.


Fig 9. Graphical representation of the framework components and their relation and interconnections


The low cost acquisition unit is used during the dataset recording and during the online phase of the development. The unit is provided with signal elaboration capability and is thus able to perform EEG translation to commands in the last phase of the development.

The C4M library and the C4MEngine are used during the algorithm optimization, profiling and porting stage.

The hardware interface module is a software module with data storage and visualization capabilities that, during the intermediate development phases, performs signal elaboration and interfaces to the AEnima module using a TCP/IP connection. The AEnima module implements the user interface, the operative protocol and the visual biofeedback display.

The c4m library

In order to avoid undesired or unsafe behaviour of the controlled system, BCI algorithms have to operate in real-time. Many state of the art BCI algorithms are developed and executed using specific mathematical languages strictly dependent on the use of a PC or similar devices. Notwithstanding the importance of studying and obtaining more reliable algorithms, the possible future diffusion of daily life applications will also have to face the problems related to the possibility of transporting the BCI from the traditional PC to the devices and systems used by disabled people. Furthermore, we point out that most of the high level digital signal processing languages are typically oriented to the reduction of development time rather than to the optimization of resources and real-time execution [Mazzucco, 2006].

In this part of the work the problem of porting BCI algorithms to different platforms, especially those not supporting high level mathematical language based development, was addressed; our attention was focused mainly on the application to embedded devices. Since the most diffused tool for the prototyping of BCI algorithms is Matlab® (Mathworks Inc., Massachusetts, USA) [Scherer, 2004], and since the C language is considered the standard for embedded processors, the structure of the proposed library was organized in order to simplify the porting from Matlab to C language.

A framework capable of facilitating the porting of BCI algorithms to any platform was studied, paying special attention to embedded solutions and without affecting the development time. A careful management of the programming code was realized after a deep analysis of the hardware characteristics of the target computers: all the necessary optimizations were adopted in order to obtain fast execution of the functions. Special attention was also dedicated to the management of the memory.

This library was developed in collaboration with the AST group of ST Microelectronics (Agrate, Italy) and was also applied to other problems like face tracking and face recognition algorithms for consumer devices.

The library was called C4M (acronym of C For(4) Mathematics).

The C4M library is a powerful tool for the efficient porting of algorithms for generic applications to single chip embedded systems, which have limited performances in terms of speed and memory resources when compared with those available in a standard PC.

Structure of the library

The features that make Matlab suitable for designing signal processing solutions are the matrix-based architecture and the availability of several basic and recurrent functions able to address both time and frequency domain operations. The adopted design specifications were:

• Data structures similar to those adopted in Matlab in order to ease porting;

• Availability of a wide set of basic functions;

• Execution time optimization of the basic functions;

• Working precision scalability;

• Memory usage reduction and monitoring.

Rules

In order to obtain the maximum performances, portability and usability, the C4M library proposes and uses some specific rules:

• Since not all the platforms provide heap management (dynamic allocation), the C4M functions can't dynamically allocate memory (the program that operates at the higher level must provide the necessary memory);

• in order to adapt the algorithm to the architecture of the CPU and to the

requirements in terms of precision, both single and double floating-point

precision version of each function should be provided;


• in order to avoid waste of memory, each function should use the input variables' space to store the output variables: the calling level function will make a copy of the input variables if necessary;

• Input and output arguments are in the form of matrices and are represented using a matrix structure.
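The in-place rule can be sketched as follows (hypothetical function names, not actual C4M calls): the routine overwrites its input buffer, and the caller copies the input first only when the original values are still needed.

```c
#include <string.h>

/* Sketch of the in-place rule (hypothetical, not a real C4M function):
   the operation stores its output in the input buffer. */
static void scale_inplace(double *buf, int n, double k)
{
    for (int i = 0; i < n; i++)
        buf[i] *= k;   /* input space reused for the output */
}

/* Caller-level copy, made only because the original data is still needed. */
static double demo(void)
{
    double in[3] = {1.0, 2.0, 3.0};
    double keep[3];
    memcpy(keep, in, sizeof in);   /* the calling level makes the copy */
    scale_inplace(in, 3, 2.0);     /* in[] now holds {2, 4, 6}         */
    return in[2] + keep[2];        /* 6 + 3 = 9 */
}
```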

Structures

Since each compiler has a specific implementation of the basic datatypes, an intermediate datatype definition block was used in order to assure the same results on any platform. The following code snippet is the definition block for a generic 32bit architecture.

typedef unsigned int uint32; //32bit positive integer

typedef unsigned short uint16; //16bit positive integer

typedef unsigned char uint8; //8bit positive integer

typedef int int32; //32bit integer

typedef short int16; //16bit integer

typedef char int8; //8bit integer

typedef double float64; //floating point double precision (64bit)

typedef float float32; //floating point single precision (32bit)

typedef double float64; //floating point double precision (64bit)

typedef float float32; //floating point single precision (32bit)

The basic data type is a four-field structure, called c4mMatrix, which is used to represent matrices and vectors:

typedef struct c4mMatrix

{

void *pdata;

uint32 rows,cols;

uint8 esize;

} c4mMatrix;

The pdata member is a generic pointer to the memory area where the data is stored; the rows and cols elements are used to store information about the matrix dimensions; the esize variable is the size (expressed in bytes) of a single element of the matrix and allows the user to retrieve the amount of memory used by the matrix.
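As an illustration, a c4mMatrix header can be bound to caller-owned storage like this (a sketch based on the structure above; the helper functions are hypothetical, and the typedefs are repeated to keep the snippet self-contained):

```c
typedef unsigned int  uint32;
typedef unsigned char uint8;
typedef double        float64;

typedef struct c4mMatrix {
    void  *pdata;
    uint32 rows, cols;
    uint8  esize;
} c4mMatrix;

/* Hypothetical helper: bind a matrix header to caller-owned storage,
   so no allocation happens inside the library. */
static void matrix_bind(c4mMatrix *m, float64 *buf, uint32 rows, uint32 cols)
{
    m->pdata = buf;
    m->rows  = rows;
    m->cols  = cols;
    m->esize = sizeof(float64);
}

/* Memory used by the matrix data, recoverable from the header alone. */
static uint32 matrix_bytes(const c4mMatrix *m)
{
    return m->rows * m->cols * m->esize;
}
```

A 2x3 float64 matrix laid over a static buffer reports 48 bytes of data memory.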


Complex numbers are represented using a specific structure and can be represented

both in single and double precision:

typedef struct c4mComplex32

{

float32 r; //Real part

float32 i; //Imaginary part

} c4mComplex32;

typedef struct c4mComplex64

{

float64 r; //Real part

float64 i; //Imaginary part

} c4mComplex64;

Functions and conventions

The basic functions, or atomic functions, operate on input and output arguments represented as matrices. Some operators may need temporary working space: since matrices can have variable size, the allocation of such space in the function stack is not possible; in order to avoid dynamic allocation of memory, the calling application must provide the temporary working memory. If a temporary space is needed, it will be provided as a c4mMatrix. For each function both a single precision and a double precision version should be provided: most of the double precision versions of the functions can be automatically generated starting from the single precision ones. The version of the function can be identified by the prototype; the naming convention is the following:

int8 c4m<NameOfTheFunction>_F/_D(c4mMatrix *pIn, c4mMatrix *pOut,

c4mMatrix *pTemp)

The name of the function should be preceded by the prefix c4m and followed by the suffix _F or _D depending on whether it is the single precision or the double precision version of the function.

The function should always return an int8 in order to report whether the operation was successfully executed or not (C4M_RET_OK or C4M_GENERIC_ERROR). In order to avoid waste of time and stack memory in the copying of the c4mMatrix input structs, the input parameters should be pointers to c4mMatrix and should be listed in the following order: inputs, outputs, temporary space.

The following code snippet shows the typical internal structure of a c4m function.

int8 c4mMatrixProd_D(c4mMatrix *x, c4mMatrix *y, c4mMatrix *out)
{
    uint16 m, p, n, r, c;
    float64 *ddx, *ddy, *dout, sum, *ddyend;
    int8 ret_code = C4M_RET_OK;

    ddx = (float64*)x->pdata;
    ddy = (float64*)y->pdata;
    dout = (float64*)out->pdata;
    m = x->rows;
    p = x->cols;
    n = y->cols;

#ifdef C4M_CHECKS
    /* BEGIN Check parameters */
    if((x->esize != y->esize) || (x->esize != out->esize))
        ret_code = c4mSetError(C4M_GENERIC_ERROR, "Input and output elements must have the same size");
    if(p != y->rows)
        ret_code = c4mSetError(C4M_GENERIC_ERROR, "The first matrix columns must be equal to the second matrix rows.");
    if((m != out->rows) || (n != out->cols))
        ret_code = c4mSetError(C4M_GENERIC_ERROR, "Wrong output matrix dimensions.");
    if((m == 1) && (p == 1) && (n == 1))
        ret_code = c4mSetError(C4M_GENERIC_ERROR, "Input parameters must be matrix.");
    if(ret_code != C4M_RET_OK)
        return(ret_code);
    /* END Check parameters */
#endif /* C4M_CHECKS */

    for(c = 0; c < n; c++)
    {
        for(r = 0; r < m; r++)
        {
            sum = 0.0;
            ddx = (float64*)x->pdata + r;
            ddy = (float64*)y->pdata + c*p;
            ddyend = ddy + p;
            for(; ddy < ddyend;)
            {
                sum += *ddx * *ddy++;
                ddx += m;
            }
            *dout++ = sum;
        }
    }
    return(ret_code);
}


The first part is dedicated to variable declaration, the second part is devoted to the parameter checks and can be omitted at compile time using the proper compiler directives, and the third part is the mathematical function.

Optimizations

Thanks to the performance improvements of computers, the prevailing coding style has moved from performance optimization to readability optimization. The following code snippets represent the same internal loop of c4mMatrixProd_D, coded using the performance oriented style on the left and the readability oriented style on the right:

ddx = (float64*)x->pdata + r;

ddy = (float64*)y->pdata + c*p;

ddyend = ddy + p;

for (; ddy<ddyend;)

{

sum += *ddx * *ddy++;

ddx += m;

}

ddx = (float64*)x->pdata + r;

ddy = (float64*)y->pdata + c*p;

for (i=0; i<p; i++)

{

sum += ddx[m*i] * ddy[i];

}

Table 1. Comparison of two different coding styles: optimized on the left, more readable on the right side

While the code on the right uses the standard vector and index notation and executes the loop on the basis of the value of the index, the code on the left was optimized by removing the index and by calculating the end value of the pointer in order to manage the loop. Considering just the operations related to data addressing and loop management, the two snippets require:

Optimized version: 1 comparison, 2 additions

Readable version: 1 comparison, 3 additions, 3 multiplications

Table 2. Required operations corresponding to the previous table

While on modern processors with many internal registers, a long instruction pipeline and a cache memory there is just a slight performance difference, a significant difference can be found when executing this code on microprocessors for embedded applications: for example, on an ARM7TDMI core microcontroller, the optimized code is executed about 20% faster than the other.

Mex Files

Matlab supports an external interface called MEX-functions which allows the execution of C and Fortran based functions within its working environment. A MEX function acts as a gateway between Matlab and the C4M function. This technology was exploited both in order to verify the correctness, the performance and the reliability of the c4m functions, and to immediately exploit the performance advantages of the c4m library.

The following figure shows the structure of a typical MEX function: an input variable check block is followed by a switch for the selection of the proper working precision; then variables are retrieved from the Matlab workspace, the space for the output is allocated and the c4mFunction is executed.


Fig 10. Structure of a C4M function acting as a gateway function from Matlab® environment to c4m library

The following code snippet represents the structure of the typical mex function.

void mexFunction(int nlhs, mxArray *plhs[], int nrhs, const mxArray *prhs[])
{
    c4mMatrix in1;
    c4mMatrix in2;
    int8 ret_code = C4M_RET_OK;

    if(nrhs != 2)
        ret_code = c4mErrMsgTxt("c4mAdd requires two input arguments.");
    if(nlhs > 1)
        ret_code = c4mErrMsgTxt("c4mAdd returns one output value.");
    if(ret_code != C4M_RET_OK)
        return;

    plhs[0] = c4mMxMatrixClone(&in1, prhs[0]);
    c4mMxMatrixInit(&in2, prhs[1]);
    if(in1.rows != in2.rows || in1.cols != in2.cols)
    {
        ret_code = c4mErrMsgTxt("The first parameter and the second parameter must have the same dimension");
    }
    if(ret_code != C4M_RET_OK)
        return;

    if(mxIsDouble(prhs[0]) && mxIsDouble(prhs[1]))
    {
        ret_code = c4mAdd_D(&in1, &in2);
    }
    else if(mxIsSingle(prhs[0]) && mxIsSingle(prhs[1]))
    {
        ret_code = c4mAdd_F(&in1, &in2);
    }
    else
    {
        ret_code = c4mSetError(C4M_GENERIC_ERROR, "c4mAdd: input type not supported.");
    }
    if(ret_code != C4M_RET_OK)
    {
        c4mErrMsgTxt(c4mGetErrorMsg());
        return;
    }
}

Dynamic memory allocation manager

While during the last years many microcontrollers for embedded applications with high clock frequencies and good computational performances have been proposed, they are still constrained for what regards RAM availability. Some compilers and some RTOSes provide different implementations of dynamic memory allocation: each implementation has its own constraints and the probability of memory unavailability due to heap fragmentation is high. For those reasons the basic functions of the C4M library are not allowed to use dynamic memory allocation.

In order to ease higher-level code composition, a proprietary dynamic memory manager was implemented as a separate module of the c4m library and was called c4mMalloc. C4mMalloc isn't actually a dynamic memory manager: it performs an automated partitioning of a pre-allocated memory block of user defined size. The module also provides some simple mechanisms for the profiling of memory resource usage and main buffer fragmentation. A memory integrity check is a useful tool for error tracking.

The library is composed of the following functions:

int8 c4mInit(int8 *pBuff, c4mSizeType size); //Performs the initialization of the c4mMalloc module

int8* c4mMalloc(c4mSizeType size); //Allocates the requested memory and marks it as "in use"

c4mSizeType c4mFree(int8 *pBuff); //Deallocates the memory and marks it as available

void c4mMallocStat(void); //Prints on the stdout some statistics about memory usage and fragmentation

void ShowMemHistory(void); //Prints the history of allocated memory

c4mSizeType is defined at compile time and can be an int16 or an int32 depending on the target architecture and on the maximum addressed memory.

Fig 11. Dynamic memory buffer management

As shown in the previous figure, the main block is treated as a double-ended stack. The allocated blocks grow from the top of the stack, while the block descriptors grow from the bottom of the same stack. When the descriptors stack reaches the blocks stack, the available memory is exhausted. A block descriptor is composed of the pointer to the first element of the block and the size of the block. In order to provide a basic memory error management, each allocated block is composed of the allocated memory plus an element at the end of the block. This element is a c4mSizeType and is equal to the size of the block: in order to verify whether a block was corrupted, it is possible to check whether that value is correct or not. In order to reduce the effect of memory fragmentation, when a block is requested the actually allocated memory is rounded to a multiple of a constant (C4M_NEXT_LIN_STEP) defined at compile time.
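The double-ended scheme can be sketched as follows (a simplified, hypothetical model, not the actual c4mMalloc code: here data blocks grow from the start of a single static buffer, descriptors grow from its end, and allocation fails when the two regions would meet):

```c
#include <stddef.h>
#include <stdalign.h>

/* Simplified sketch of a c4mMalloc-style partitioner (hypothetical names). */
typedef struct { size_t offset, size; } BlockDesc;

#define POOL_SIZE 1024
#define STEP      16   /* rounding step, like C4M_NEXT_LIN_STEP */

static alignas(BlockDesc) unsigned char pool[POOL_SIZE];
static size_t data_top   = 0;  /* bytes used by data blocks  */
static size_t desc_count = 0;  /* number of live descriptors */

static void *pool_alloc(size_t size)
{
    /* round up to a multiple of STEP to reduce fragmentation */
    size = (size + STEP - 1) / STEP * STEP;
    size_t desc_bytes = (desc_count + 1) * sizeof(BlockDesc);
    if (data_top + size + desc_bytes > POOL_SIZE)
        return NULL;   /* the two stacks would collide: out of memory */
    BlockDesc *desc = (BlockDesc *)(void *)(pool + POOL_SIZE)
                      - (ptrdiff_t)(desc_count + 1);
    desc->offset = data_top;   /* descriptor records where the block is */
    desc->size   = size;
    desc_count++;
    data_top += size;
    return pool + desc->offset;
}
```

A 10-byte request is rounded up to 16 bytes, so two consecutive allocations of 10 and 20 bytes start 16 bytes apart; a request larger than the remaining free span returns NULL.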

The c4mEngine

A mathematical engine provides a command line high level language which allows testing new functions directly on the final hardware device: using a terminal software and a serial port, it is possible to use a standard personal computer in order to interact with the mathematical engine running on the device, in a manner which is similar to a simplified Matlab® version. The mathematical engine was compiled to run on an STR710 evaluation board that was enhanced by the addition of an SD card socket. The main purpose of this software is to create a function benchmark platform with all the necessary instruments to test and evaluate C4M functions; as a second aspect, using the c4mEngine module it is possible to create a real-time processing firmware that is reconfigurable without cross-compilation and without updating the MCU firmware.

The main components of this software are:

1. a variables workspace;

2. a command line parser and interpreter;

3. an SD card manager;

4. a FAT16 file-system;

5. a USB stack featuring a mass storage profile;

6. some profiling and utility functions.

The engine is similar to Matlab® MEX-functions: on the basis of the input command, a gateway function called amxFunction performs input variable retrieval, creates space for the output arguments and calls the proper function.

Workspace manager

The workspace is based on the c4mDinMem module and is used to handle the online creation of and access to algorithm variables. The maximum number of simultaneously available variables is defined at compile time, while the maximum total usable memory depends on the c4mMalloc module. The workspace manages a stack of variable descriptors, shown in the following diagram:

Fig 12. Variable descriptor

Each variable is identified inside the workspace by a natural language name composed of up to eight ASCII characters. The VariableKind field is used to store the kind of variable pointed to by pVariable. The implemented variable kinds are the following:

• Single precision floating point matrix (‘f’);

• Double precision floating point matrix (‘d’);

• Fixed point matrix (‘i’);

• Scalar floating point value (‘s’);

• uint8 Matrix (‘8’);

• uint16 Matrix (‘6’);

• uint32 Matrix (‘2’);

• int8 Matrix (‘9’);

• int16 Matrix (‘7’);

• int32 Matrix (‘3’).

When a new variable is created, a block of memory is allocated and the corresponding descriptor is created and pushed onto the descriptors stack. When a variable has to be retrieved, the corresponding descriptor is searched on the basis of the given name.
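The descriptor stack and its name-based search can be sketched like this (hypothetical structure and function names, not the actual c4mEngine code):

```c
#include <string.h>

/* Hypothetical sketch of the workspace descriptor stack and the
   name-based lookup used when a variable has to be retrieved. */
#define MAX_VARS 8
#define NAME_LEN 8   /* up to eight ASCII characters */

typedef struct {
    char  name[NAME_LEN + 1];
    char  kind;        /* 'f', 'd', 'i', ... as listed above */
    void *pVariable;   /* pointer to the variable data       */
} VarDesc;

static VarDesc workspace[MAX_VARS];
static int     var_count = 0;

static VarDesc *ws_find(const char *name)
{
    for (int i = 0; i < var_count; i++)
        if (strncmp(workspace[i].name, name, NAME_LEN) == 0)
            return &workspace[i];
    return NULL;   /* no descriptor with that name */
}

static VarDesc *ws_create(const char *name, char kind, void *data)
{
    if (var_count == MAX_VARS || ws_find(name) != NULL)
        return NULL;                 /* workspace full or name taken */
    VarDesc *d = &workspace[var_count++];
    strncpy(d->name, name, NAME_LEN);
    d->name[NAME_LEN] = '\0';
    d->kind = kind;
    d->pVariable = data;
    return d;
}
```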

The command line parser and dynamic functions manager

The command line parser receives the input string from the microcontroller UART. It is possible to interact with it using a terminal emulation software like HyperTerminal. The syntax of a command is the following:

OUT1, OUT2, … , OUTM = FunctionName(IN1,IN2, …, INN)

When a carriage-return-terminated string is fed into the mathparser, if the syntax is correct, the following c4mAmxFunctionStruct structure is filled:

Fig 13. Structure of the c4mAmxFunctionStruct structure


The structure is passed to the amxFunction identified by the 'FunctionName' field. The amxFunction acts as a gateway between the c4mEngine workspace and the c4mFunction, performing input variable retrieval and output and temporary memory allocation.
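The split performed on a command string can be sketched as follows (an illustrative routine, not the actual parser; it assumes a well-formed command with no blanks and does no trimming):

```c
#include <string.h>

/* Illustrative split of "OUT1,OUT2=FunctionName(IN1,IN2)" into its three
   parts: output list, function name, input list. Not the c4mEngine code. */
static int parse_command(const char *line, char *outs, char *name, char *ins)
{
    const char *eq = strchr(line, '=');
    const char *po = strchr(line, '(');
    const char *pc = strrchr(line, ')');
    if (!eq || !po || !pc || eq > po || pc < po)
        return -1;                        /* malformed command */
    memcpy(outs, line, (size_t)(eq - line));
    outs[eq - line] = '\0';               /* comma-separated outputs */
    memcpy(name, eq + 1, (size_t)(po - eq - 1));
    name[po - eq - 1] = '\0';             /* function name           */
    memcpy(ins, po + 1, (size_t)(pc - po - 1));
    ins[pc - po - 1] = '\0';              /* comma-separated inputs  */
    return 0;
}
```

For instance, "y=fft(x)" splits into outputs "y", name "fft" and inputs "x".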

The device

The acquisition device is composed of two independent parts: a set of analog EEG amplifiers and an acquisition and elaboration unit. The device is battery powered and is connected to the computer using an onboard Bluetooth® module. Thanks to this configuration, all the issues related to the isolation from the mains are overcome without any risk for the patient. The analog amplifiers adopted a novel circuit topology for the conditioning of biomedical signals [WO 2007/085383 A1]. The amplifier was composed of an amplification chain and relied on a double feedback path which assured the stability of the system whatever the amplification block gain and the order of the low-pass filter were. During normal operation the offset recovery circuit had a linear transfer function; when it detected a saturation of the amplifier, it automatically switched to the fast recovery mode and restored the baseline in a few milliseconds. Four two-channel EEG acquisition units performed the signal conditioning in order to obtain an eight monopolar channel acquisition; the topology will be illustrated in detail in the following paragraph. The digital acquisition board adopted an ARM7® STR711 microcontroller which managed data acquisition, processing and communication/storage issues.

Fig 14. Physical realization of the EEG unit


The analog design

This design was developed on the basis of our previous experience in this field

[Maggi, 2004] and focused on obtaining a more robust acquisition chain.

Fig 15 shows a typical, state of the art biosignal detection circuit which is composed

of a set of independent stages connected in a chain. At the beginning there is a pre-

filtering stage, the pre-amplification stage which is followed by the offset rejection

circuit and by an amplification and filtering circuit.

Fig 15. The amplification chain proposed by the OpenEEG project

This kind of solution is simple and effective when the wide power supply range provides a high dynamic reserve (avoiding problems related to the clipping of the output) and when the mechanical specifications allow the use of high capacity capacitors, or when the specific application doesn't require a very low frequency high-pass filter. It is worth underlining that the maximum tolerable offset is given by the following equation:

VOff(MAX) ≅ (1/2) · (V+sup − V−sup) / Gpreamp

where V+sup and V−sup are the values of the supply rails and Gpreamp is the gain of the preamplification stage. It is worth noting that, in case of a change in the input signal that causes the amplifier to saturate, the output of the system will remain latched for a time which depends on the signal amplitude; it is possible to overcome this limitation by increasing the system complexity and inserting a baseline reset circuit which is activated by the saturation of the system itself.

General description

The proposed system is composed of a differential pre-amplification stage P(s), usually realized using an Instrumentation Amplifier (INA). The F(s) block is a unity gain inverting filter (low-pass or low-pass plus notch filter) of any order. A(s) is an amplification stage, while I(s) is an offset compensation network. In the proposed version it is a non-linear circuit which acts as an attenuated inverting integrator when Vin is inside the linear region and as an amplified inverting integrator when the signal is over threshold; its behaviour can be expressed as follows:

I(s) = −a · 1/(RCs)   if (V−sup + Th) < Vin < (V+sup − Th)

I(s) = −k · 1/(RCs)   otherwise

where Th is the threshold value which identifies a saturated state, 'a' is an attenuation factor and 'k' the amplification factor.

Fig 16. Block diagram of the system

The small signal transfer function and the GLoop of the system are represented by the following equations:

TF(s) = P(s)·F(s)·A(s) / (1 − I(s)·A(s)·[F(s) − 1])

GLoop = −A(s)·I(s)·[1 − F(s)]

considering that I(s) and F(s) are inverting, the equations can be expressed as follows:

TF(s) = −P(s)·F(s)·A(s) / (1 − I(s)·A(s)·[F(s) + 1])

GLoop = A(s)·I(s)·[F(s) + 1]

The transfer function is that of a band-pass amplifier with a single pole high-pass and a low-pass whose shape depends on F(s). Fig 17 shows the Bode diagram of a system with the following characteristics:

• F(s): 2nd order low pass at 75Hz;

• A(s): amplifier gain 100 V/V;

• P(s): pre-amplifier gain 5V/V;

• I(s): integrator 1/100 * 1/s.


Fig 17. Frequency response of a system with 500V/V gain and a 2nd order low pass 75Hz filter

Offset Compensation Issues

On the basis of the final output, the offset compensation value is fed both directly to

the preamplifier P(s) reference pin and by modifying the polarization of the amplifier

A(s). The proposed structure introduces a systemic offset compensation method

which ensures that, thanks to the double feedback path, even when the pre-

amplification output is close to the power supply rail, the following stages work

inside the linear region; this property doubles the maximum tolerable offset:

$$Off_{MAX} \cong \frac{V_{sup}^{+} - V_{sup}^{-}}{G_{pream}}$$

Thanks to this improvement it is possible to increase the gain of the pre-amplification

stage, taking full advantage of the qualities of the INA in terms of CMRR and noise

figure.

As proposed in our previous work, the AC-coupling of the amplifier using a

feedback integrator allows the tuning of the high-pass pole frequency just by varying

the open loop gain of the system [Maggi, 2004]. When setting the parameters for

biosignal acquisition, it is useful to insert an attenuation factor in the I(s) block in

order to compensate the amplifier gain: keeping the GLoop below unity, the

high-pass pole is moved to lower frequencies.

The I(s) automatically identifies a saturation of the amplifier using a threshold

method: if the value is outside a predefined interval, the attenuated integrator is

switched into an amplified integrator that quickly brings the system output inside the

linear interval.


The k value defines the speed of the offset recovery of the system: for example, we

can have a 0.05 Hz high-pass pole during the linear phase and switch it to a 100 Hz

one during the offset recovery phase, achieving a baseline recovery in about 10 ms.

Stability of the loop

During normal operation the GLoop is usually kept low, using the attenuation net of

I(s), in order to achieve the desired high-pass frequency; when saturation occurs,

the I(s) is switched to a high gain configuration: in this section the stability of this

configuration will be discussed by considering both the Bode stability criterion and

the root-locus method.

Bode Stability Criterion

Provided that A(s) has a sufficient bandwidth to be considered an ideal

amplifier, and that F(s) and I(s) are stable, the poles of F(s) and I(s) are the possible

instability causes of the system. Considering the open-loop transfer function, the I(s)

provides a single pole at low frequencies, while the F(s) puts a variable number of

poles at the higher bandwidth limit. Thanks to the second feedback path, the poles of

F(s) are compensated by the same number of zeroes. Fig 18 shows the Bode

diagram of the original F(s) and the compensated one.

The newly created zeros must be very close to the F(s) poles in order to keep the

difference between the DC gain and the high frequency gain to just 6 dB. The

nearness of poles and zeroes makes the phase plot very flat: for any complexity of

the filter the plot stays between +90 and +270 degrees.

Fig 18. Bode plot of the compensated filter against the original one


Fig 19. Phase diagram of the resulting GLoop

Fig 19 shows the phase diagram of the resulting GLoop, including the pole introduced

by the integration process: it is possible to notice that it never crosses the instability

region.

Root locus study

The root loci (Fig 20 and Fig 21) show that all the resulting closed-loop poles are in

the left half-plane even with a 9th order low-pass filter. For the highest open-loop

gains the phase margin can be less than 45 degrees, but during the nonlinear phase

the overshoot can make the settling faster.

Fig 20. Root locus of a 4th order system


Fig 21. Root locus of a 9th order system

Results and discussion

This configuration was adopted both in a commercial wearable polygraph, in order to

acquire the ECG signal, and in the EEG acquisition device. Figure 8 shows the

adopted implementation for the EEG acquisition device. The system is 3.3 V single-

supply, powered using a Li-ion battery and a low-dropout linear voltage regulator. The

pre-amplification stage has a gain of 100 V/V and P(s) is realized using an INA118

(Texas Instruments). The other four operational amplifiers are contained in a single

integrated circuit (TLC2254, Texas Instruments).

The F(s) is an inverting double-pole low-pass filter, and A(s) is an amplification

chain. The I(s) is composed of an attenuation network (R30 and R31), an inverting

integrator (IC2B, C1 and R12) and the nonlinear activation network (K2, H2, R22,

R23, R25, R27, R26).

The R27 and R26 network is used in order to set the intervention threshold of the

offset recovery circuit: when the Vbe of K2 and H2 is kept below 0.7 V the

transistors are turned off. The Th parameter is defined as follows:

$$Th = 0.7\,V \cdot \frac{R27 + R26}{R27}$$

K2 is switched on when the amplifier output voltage reaches the upper saturation

limit, while H2 is switched on in case of lower saturation. When one of the transistors

is turned on, it injects a current into the inverting integrator causing the fast offset

recovery. R22, R23 and R25 are necessary in order to limit the transistor current and

avoid instabilities related to 2nd order effects of the components. The final

amplification stage is optional and provides a last anti-aliasing filtering.

The system circuit has been successfully used in a brain computer interface

application and has an offset recovery time of less than 10 ms.

The proposed architecture is a smart and cost effective solution, based on an analog

design, to the problems related to the acquisition of biosignals in difficult acquisition

conditions; thanks to the evolution of modern digital devices, it is possible to

adopt other methods in order to achieve similar results.

The strength of the proposed topology is that a simple local solution doesn't require a

full systemic redesign and supports the development of modular multi-parametric

wearable devices.

The discussion doesn't take into account second order effects related to the components'

physical limitations: even if the architecture is robust and the frequency range of

biosignals is reduced, the design of an amplifier based on the proposed topology

should be approached with care.

The digital acquisition unit

The electrical design and the production of the digital acquisition unit were provided

by an external supplier (SXT – Sistemi per telemedicina s.r.l., Lecco, Italy) on the

basis of our specifications; the firmware was internally implemented for our specific

needs.

Electronic design

After a detailed review of the application-specific needs, the following

design specifications were defined:

1. High capacity battery support;

2. Battery recharge management system;

3. Micro SD card support;

4. Onboard 8-channel 12-bit ADC;

5. Separated voltage regulator with external connector for analog sensors;

6. ARM7® microcontroller;

7. RTC support;

8. Support for serial to Bluetooth®/ZigBee® bridge modules;

9. User signalling LED;


10. Possibility of entering in very low power mode.

The complete system is controlled by a 32-bit ARM7TDMI® RISC core CPU; more

in detail, it is an STR711 produced by STMicroelectronics (Agrate Brianza, Italy).

The main features provided by this solution are computational efficiency and

low power consumption, supporting also an extreme flexibility of configuration. The

chip is equipped with:

• 256 KB Flash

• 16 KB Data Flash

• 64 KB RAM

• 4-channel 12-bit A/D converter

• 5 x 16-bit timers

• Embedded RTC oscillator (external crystal)

• SPI, I2C, UARTs and other peripherals

The CPU works with a single 3.3 V (down to 3 V) power supply, requiring a reduced

number of external components. It can support up to 45 MIPS at 50 MHz when

working from Flash. These features, coupled with the 32-bit architecture, ensure

excellent processing capabilities without affecting the power consumption.

In IDLE mode, with the RTC enabled, it typically requires 30 µA. This CPU supports

more than four low power-consumption operating modes, and an internal PLL

allowed the implementation of a dynamic frequency switching driver.

The data required by the processing algorithms are stored in the CPU RAM, without

the need of external components.

Since the internal ADC isn’t suitable for biosignal acquisition applications, an

external ADS7844 ADC (Texas Instruments, USA) is connected to the MCU

using an SPI bus. The adopted device is a successive approximation (SAR) ADC

with a multiplexed input which can handle up to eight channels. The maximum sample

rate over the eight channels is 200 kHz with a 12-bit resolution. The PCB was

designed to form a multipurpose socket which can be used to

connect either a Bluetooth or a ZigBee module: in the current prototype the Bluetooth

connection was preferred.

The class II BT transmission module provides the standard BT SPP protocol. The

module is controlled by the CPU, which can manage the pairing, the

connection link and the data transmission through a specific driver. The module

exports an SPP service which accepts a connection from an external device like a

personal computer. The adopted module was the PAN1540 (Panasonic, Germany), a

fully qualified Class II BT module able to transmit data with a throughput of up to

256 kbps. The internal battery has a capacity of 2300 mAh; the recharge process is

performed using an external AC adapter and charger certified for medical

applications.

Since the SD card protocol requires bulk communication, with timing constraints that

are not compatible with the requirements of the ADC communication, the MicroSD®

card was connected to the MCU using a second SPI bus.

Fig 22. Main schematic of the digital acquisition board

Firmware design

The firmware was developed using a Realview® IDE and compiler (ARM Holdings

plc) and was based on a free version of a commercial Real Time Operating System

(embOS, Segger, Germany). EmbOS® is a priority-controlled real time operating

system, designed to be used as foundation for the development of embedded real-

time applications. It is a zero interrupt latency, high-performance RTOS that has

been optimized for minimum memory consumption in both RAM and ROM, as well

as high speed and versatility. Its highly modular structure ensures that only those

functions that are needed are linked, keeping the ROM size very small. Tasks can


easily be created and safely communicate with each other using a complete palette of

communication mechanisms such as semaphores, mailboxes, and events.

EmbOS main features:

• Preemptive scheduling: Guarantees that of all tasks in READY state, the one

with the highest priority executes, except for situations where priority

inversion applies.

• Round-robin scheduling for tasks with identical priorities.

• Preemptions can be disabled for entire tasks or for sections of a program.

• Unlimited number of semaphores (limited only by available memory).

• Unlimited number of mailboxes (limited only by available memory).

• Size and number of messages can be freely defined.

• Unlimited number of software timers (limited only by available memory).

• Time resolution tick can be freely selected (default is 1ms).

• Full interrupt support: Most API functions can be used from within the

Interrupt Service Routines (ISRs).

The firmware was structured as a multitasking application for communication, signal

processing and acquisition. The following drivers were developed for the interface

with the external components:

• ADS7844 ADC driver for real-time data acquisition;

• SD card interface;

• FAT16 compliant file system for personal computer compatible data storage;

• Dynamic frequency switching for power saving;

• Serial communication driver for Bluetooth/ZigBee communication.

All the developed drivers exploit hardware FIFOs and OS event support in order to

minimize the waste of CPU time due to peripheral state polling.


Fig 23. Firmware structure

The firmware was composed of the following tasks:

Data acquisition task: the acquisition timing was generated by a high priority

interrupt routine activated by the RTC clock. An internal sample rate of 4096 Hz was

used. The ADC was activated using the “16 clocks per conversion” mode

in order to exploit the 16-bit word feature of the MCU’s SPI. One conversion for

each of the 8 inputs of the ADC was performed every cycle, and the SPI hardware FIFO

buffer was used in order to reduce the CPU load: a complete 8-channel sample

was acquired in 36 µs with about 10 µs of CPU time usage. A queue buffer provided by

the RTOS was used for the communication with the processing task.

Processing task: the processing task must be customized on the algorithm-specific

requirements, but the general structure is always preserved. The first block performs

a data low-pass filtering using a FIR filter and a decimation in order to represent the

signal with a 14-bit resolution and a 256 Hz sample frequency. Decimated data are then

forwarded to the following block and to the communication thread. The second part

performs feature extraction and signal classification.

Communication and storage task: this task handles all the operations

related to the communication with the computer and the storage on the internal memory.

For reasons which are related to the clock distribution of the STR7, dynamic clock switching

has to be disabled during bidirectional communication with the radio device.

Stimulation device

An external stimulation device was developed for those protocols, like SSVEP, which

require an external optical stimulation. The device is connected to the personal

computer by a USB port. The communication was implemented using a Serial Port

Profile compliant USB stack. Using a bidirectional protocol, the computer can turn

on and off up to eight LEDs and program the stimulation device for continuous

intermittent stimulation. Stimulation periods can be up to 1 second with a 256-step

resolution, and the stimulation duty cycle is 25%. The MCU that implements the USB stack

and the LED control is the same as that of the acquisition unit. The device is based on a

development board from Olimex ltd. (Plovdiv, Bulgaria) connected to a circuit for the

intensity calibration. Currently four LEDs are connected, but the firmware supports up

to eight LEDs.


Software modules

The following paragraphs will give a detailed description of all the software modules

which compose the framework. The modules were developed using free of charge

libraries released under the LGPL licence (GNU Lesser General Public License).

Instrument interface module

The hardware interface module (HIM) is a software application designed to assist the

researcher during the prototyping of a new BCI system. It provides a solid structure

for the acquisition, storage and visualization of the signal, for the communication

with the BCI application user interface and for the real-time execution of algorithms,

both in C and in the Matlab® environment. This software was written in the C++ language

using the cross-platform wxWidgets library.

Fig 24. Screenshot of the Hardware Interface module: A - Main window, B - Signal plot window,

C - Biofeedback plot



Fig 24 shows a screenshot of the HIM during a normal execution. This software

was designed for laboratory purposes and for use with a double monitor setup. The

main window (A) is used to select and connect the acquisition instrument, to activate

data storage and to select the kind of signal processing. It is also possible to insert

some operator-activated signal triggers for general purpose applications. The second

window (B) is used for online signal visualization: it supports simple spatial filters

and online time domain filtering. Visualization time domain filters can be

customized by saving the corresponding parameters in a file and registering them into

the application. The third window (C) is used to display algorithm parameters during the

execution.

The source code was structured in order to promote application-specific

customization using inheritance and polymorphism techniques: in order to

customize the HIM, the developer should know the structure of two base classes:

Instrument and Algorithm.

The instrument class

Fig 25. Current hierarchical tree of the instrument class

This class was created as a basic communication layer for the interaction with all the

components of the HIM. The basic version supports several kinds of instruments; some

are real, others are virtual:

• Kimera version II;

• Kimera version I;

• Other devices not studied for BCI use;

• An ideal signal simulator;

• A signal simulator which loads data from a file;

• An interactive file player which can load pieces of a dataset depending on the

operator’s needs;

• G.MOBIlab, a commercial EEG Bluetooth device from g.tec (Graz, Austria).

It is possible to add a new instrument by deriving a specific class from the basic

Instrument class and implementing the virtual methods which handle the

communication with the hardware.


The algorithm class

The HIM also provides a basic algorithm class which handles all the operations

related to data buffer management, performance monitoring and, if necessary, the

communication with Matlab®. In order to test a new DSP technique the researcher

has to focus only on writing the algorithm-specific code, even in the Matlab language.

This class implements an algorithm model which has the following characteristics:

• It has an initialization phase;

• It supports sample by sample operations (e.g. spatial and time domain filtering);

• It supports windowed processing (windows can overlap over time);

• Feature extraction and classification do not need to be simultaneous.

The model is quite general, thus most of the algorithms are compatible. The class

provides native support both to the c4m library and to Matlab: hybrid algorithms can

also be implemented. The following code snippet shows the method declarations of

the Algorithm class:

class Algorithm {
public:
    Algorithm(void);
    void OpenMatlab(void);
    void Open(int MLen = 2000, int SLen = 125);
    void Close(void);
    virtual ~Algorithm(void);
    void Restart(void);
    void PutIntoMatlab(char *VarName, c4mMatrix *pBlock);
    void GetFromMatlab(char *VarName, c4mMatrix *pBlock);
    void SetClassification(int Class);
    void SetFeedback(float *pFeedback, int MaxLen = FEEDBACK_LEN);
    int GetLastClassification(void);
    void GetFeedback(float *pFeedback, int MaxLen = FEEDBACK_LEN);
    void RegisterSocket(wxSocketBase *sock);
    Algostats GetAlgorithmStatistics(void);
    void ShiftMainWin(int Len);
    void ShiftSmallWin(int Len);
    virtual void PutSample(DataStruct *pData);
    virtual void SendConfiguration(void) {}
    virtual void Init(void) {}
    virtual void RunMainWin(void);
    virtual void RunSmallWin(void);

private:
    void StopSmallTimeCounter(void);
    void StartSmallTimeCounter(void);
    void StopMainTimeCounter(void);
    void StartMainTimeCounter(void);
    void StartCounter(TimerParams *pParams);
    void StopCounter(TimerParams *pParams);
};


In order to define a new algorithm, the inherited class must implement the following

virtual methods:

• PutSample: this function is called every time a sample is generated by the

instrument;

• Init: this function is called at the beginning of the acquisition. It should be

used for variable initialization;

• RunMainWin: this function is called every time the main window cursor

reaches the MainLenght parameter. It usually contains the classification code.

Window shifting is done by calling the ShiftMainWin method.

• RunSmallWin: similar to RunMainWin but with an independent window

with a possibly different dimension.

AEnima: the Graphic User interface Module

The user interface was developed in collaboration with another PhD student and is a

flexible tool designed to simplify the implementation of new operating

protocols or BCI based user applications. This module is an independent application

and was written using an OpenGL based graphical engine in order to provide a more

realistic and challenging experience to the user. A TCP/IP socket based layer

manages the communication with the hardware interface module: the two applications

can be executed on the same personal computer or on two different ones.

The following figure shows the structure of the application:

• the core of the system is the Irrlicht Engine: an open source high

performance realtime 3D engine written and usable in C++. It is cross-

platform, using D3D, OpenGL and its own software renderer;

• the protocol class implements the user application and the stimulation

protocol;

• the protocol dealer is a coordination object used for message dispatch,

handling the interaction between different user applications.


Fig 26. Structure of the AEnima module: the Protocol Dealer connects the Event Manager, Graphics Engine, LEDs, Configuration and Home automation blocks with the protocol classes (Screening, Training, Testing, User, MyHome and Game Protocol) and with the HIM via TCP/IP

The software was structured in order to achieve the best level of abstraction and to

guarantee the maximum flexibility for the implementation of new protocols. In order

to insert a new protocol, the developer has to derive a new class from the base class

ProtocolMngr. The following code snippet shows the prototype of the base

ProtocolMngr class:

class ProtocolMngr {
public:
    // BASIC MEMBERS
    ProtocolMngr(GraphicEngine *Engine, SoundSystem *Sound, SocketComm *pSocket,
                 LedDriver *myLed, DealProtocol *pParent);
    ~ProtocolMngr(void);
    int GetXState(void);
    void EvalProt(void);
    void ResetTime(void);
    void SetNextCall(int Millis);
    void SendBCIMessage(int Kind, int Value = 0, short int *pBuffer = 0, int BuffSize = 0);
    void setNextOpt(int kind, int delay, int p1 = 0, char *ptext = NULL);
    void HideCenter(void);
    void RotAnimator(char *NodeName, scene::ISceneNodeAnimator *anim);
    void creaTextPerif(void);
    void SetNamePerif(gui::IGUIStaticText *perif, char *nome1);
    void ViewFeedback(void);
    void HideFeedback(void);
    // CUSTOM MEMBERS
    virtual bool StartProt(bool SendSocketMsg = true);
    virtual bool StopProt(bool SendSocketMsg = true);
    virtual void myProtocol(int myTime) {}
    virtual void OnProtocolEvent();
    virtual void OnSocketEvent(BCIMessage *SocketMsg) {}
};

In order to define the new protocol, the inherited class must implement the following

functions:

myProtocol: this method is called by the protocol dealer and is used as a timer for

the protocol. This function is usually used for training and calibration protocols, when

the events are not generated by the classifier;

OnSocketEvent: this method is called every time a socket message arrives from the

TCP/IP connection with the HIM. Fig 27 shows the structure of the message.

Fig 27. Structure of a BCIMessage

The following messages are used for the interaction between AEnima and the HIM;

some messages are conceived to be sent from AEnima to the HIM, others from the

HIM to AEnima:

GRAPH_TRIG: sent from AEnima to the HIM in order to send a trigger label

which is recorded in the trigger channel; the value field contains the corresponding

flag;

CLASSIFICATION: sent from the HIM to AEnima when the algorithm generates

a classification; the value field contains the classification;

FEEDBACK: sent from the HIM to AEnima when the algorithm returns a new

biofeedback; the Buffer vector is used to store the values of the feedback to be displayed

to the BCI user;

START_ACQ and STOP_ACQ: sent from AEnima in order to start or stop the

acquisition, the processing and the data storage;

SET_FREQUENCIES: an SSVEP specific message sent by the HIM in order to

set the visual stimulation frequencies.


StartProt: used to initialize the internal states of the protocol and the graphical

objects involved in the protocol. The GraphicEngine class provides a list of basic

graphical objects; the new protocol can use those resources or define new specific

ones. The basic resources are:

• Skydomes: moving images in the background;

• Cue: a sphere at the centre of the screen;

• Feedback bars: rectangles near the sides of the screen whose height is changed

in order to implement the biofeedback;

• Feedback particles: a sort of highlight of the feedback bars that can be

activated as a reward when the feedback is above a certain threshold;

• Buttons: cubes with an image as texture which are used to compose

interactive menus.

Home automation system

A specific software module was implemented in order to provide an interaction layer

with a home automation system. A MyHome® gateway was provided by BTicino

spa (Erba, Italy) and a basic demonstrator was set up in our laboratory. The standard

physical communication layer of the used gateway is RS-232; in order to maximize

the ease of installation, a specific RS-232 to Bluetooth module was designed. The

application layer was implemented using the OpenWebNet® language proposed by

BTicino. Although the current demonstrator supports only a few lamps and some

devices, the software module supports all the options provided by the home

automation system. A library for the OpenWebNet language was developed, and a set

of software tools for the system configuration and functional demonstration is available.

The library was structured into two layers:

• an OpenWebNet layer for the management of the communication protocol with

the home automation system;

• a serial interface layer which acts as a hardware abstraction layer and

can be customized in order to assure portability to other platforms.

The library was written in C language in order to assure portability to any

platform.


Cooperating user interfaces: the dynamic keyboard

Since a principal motivation for the development of BCIs is to provide paralyzed

patients with independent communication tools, BCI-driven spelling devices are an

important topic in BCI research. Many groups implemented different BCI controlled

word processing applications [Kubler, 2001; Dornhege, 2007; Birbaumer, 1999], but

the development of optimized, “cooperating” user interfaces, which

help the user by reducing the number of commands necessary to enter a word, is

still considered necessary: “Although the proof-of-concept of BCI systems was given

decades ago several major challenges are still to be faced. One of those challenges is

to develop BCI applications which take the specific characteristics of BCI

communication into account. Apart from being prone to error and having a rather

uncontrolled variability in timing, its bandwidth is heavily unbalanced: BCI users

can perceive a high rate of information transfer from the display, but have a low-

bandwidth communication in their control actions, cf. Fig 28.” [Blankertz, 2007].

Fig 28. Asymmetry of BCI communication bandwidth


In this part of the work a four-command optimized virtual keyboard was developed.

Fig 29. Keyboard structure

The keyboard consisted of a six by six matrix of letters plus a moving cursor. The

possible commands were:

• move left;

• move right;

• select;

• delete.

Thanks to this configuration the matrix is completely reachable, and the direct

deletion command helps reducing the user frustration in case of error. The keyboard was

calibrated to be used by an Italian user, thus the selected symbols were the following

36 symbols:

Character Vector: CV = [abcdefghijklmnopqrstuvwxyz +^ àéèìòù|]

Numbers, mathematical operators and punctuation symbols were grouped under the ‘+’

and ‘^’ symbols, which were used as links to corresponding sub-keyboards.

The distribution of a generic text was represented by the Nearness Matrix (NM): a

square 36x36 matrix whose (k, i) element counts the number of times the character CVi

followed the character CVk. By normalizing each row by its total number of

occurrences, an estimate of the probability of CVk being followed by CVi is obtained.

A keyboard layout can be described by the Distance Matrix (DM), which represents

the number of movements necessary to move from the character CVi to the character CVk.

In order to find the character layout which reduces the number of commands needed to

enter a text, the following estimator should be minimized:

$$J = \sum_i \sum_k DM_{i,k} \cdot NM_{i,k}$$

Since the NM depends on the text, the minimization of J should be done by

associating the lowest values of DM to the highest values of NM: since the DM

element values can’t be changed independently, this optimization can be complex.

Three minimization criteria were evaluated:

• A static keyboard in which, after any selection, the cursor returned to the top

left position (SR);

• A dynamic keyboard in which, after the selection, the cursor returned to the

upper-left position and the letters were rearranged on the basis of the most

probable following letter (DK). A similar strategy was proposed for mobile

applications under the name Fluctuating Optimal Character Layout (FOCL)

[Bellman, 1998];

• A static keyboard in which, after the selection, the cursor was moved to the

most probable letter (SMP).

Static keyboard layout

Thanks to the adopted cursor return strategy, the synthesis of the optimal layout is

trivial and consists in disposing the letters according to their probability of occurrence

along the path shown in the following figure.

Fig 30. Layout composing strategy


Dynamic keyboard layout

Thanks to the adopted cursor return and layout rearrangement strategy, the synthesis of the

optimal layouts is similar to the previous one: for every symbol a dedicated keyboard

is designed on the basis of the corresponding line of the NM, following the layout

composing strategy shown in Fig 30.

Static keyboard with positioning on the most probable letter

In this case the optimal layout cannot be found with a direct solution, thus the use of

a search technique is advisable: a genetic algorithm was used in order to find an

approximation of the best solution.

• The mutation function was implemented as a permutation of two positions;

• The crossover function was a one point crossover.

Twenty simulations were executed using Matlab with the following parameters:

Population 40

Generations 2000

Elitism 2

Crossover fraction 80%

Selection Rule Roulette

Table 3. Genetic algorithm parameters

In most of the cases the fitness values reached a stable state in about 400

generations. All the 20 final best results gave fitness values within a range of 6%, thus

no relevant change is expected from increasing the number of generations or the

number of simulations. Fig 31 shows the typical evolution of the

population. The monotonicity of the best performance is assured by the use of

elitism.


Fig 31. Typical evolution of the population. The black line represents the best performance, the

blue dots the average performance of the population

Comparison

All the generated keyboards were based on a model extracted from an Italian text of

150,000 characters (A). The validation was performed using a 560,000 character

long Italian text (B) and a shorter English text of 12,500 characters (C).

Fig 32. Static keyboard layout

Fig 33. Static keyboard with positioning on the most probable letter

The following table shows the obtained results, expressed as the average number of

commands necessary to move from a generic letter to the following one:


                                                   A      B      C

Dynamic keyboard (DK)                            1.57   1.58   2.33

Static keyboard (SR)                             2.32   2.31   2.61

Static keyboard with positioning on the
most probable letter (SMP)                       2.35   2.35   2.86

Table 4. Comparison of different keyboard layouts

While SR and SMP are equivalent in terms of performance and usability, the

dynamic solution (DK) appears to be much more efficient (22% considering the selection

command), but the continuous reorganization of the keyboard could lead to a

performance drop caused by the letter search time; thus both SR and SMP can

be considered as good alternatives.

The reduced difference between the A and B columns excludes the presence of

overfitting, while the relevant difference with column C suggests that a customization

oriented to the user's vocabulary and language could lead to significantly better

performances. If a dynamic user-oriented auto reorganization had to be taken into

account, the DK and SR layouts should be preferred for the ease of layout update.


DEVELOPMENT OF A SSVEP BASED BCI SYSTEM

The visual evoked potential (VEP) is an electrical potential generated in the brain

and recorded from the occipital region of the scalp after the presentation of a visual

stimulus: when the stimulus repetition rate is higher than or equal to 6 Hz, the evoked

response appears as a periodic activity. This activity increases the power spectral

density (PSD) in correspondence to the integer multiples of the stimulus flashing

frequency: this particular type of brain response is known as steady-state visual

evoked potential.

The SSVEPs have the same fundamental frequency (first harmonic) as the

stimulating frequency, but usually they also include higher (Regan, 1989) and/or sub-

harmonics (Herrmann, 2001). Most SSVEP-based BCIs were implemented on the

basis of the first ([Kelly et al., 2005a], [Kelly et al., 2005b] and [Middendorf et al.,

2000]) or on the first and second harmonic detection ([Cheng et al., 2002], [Gao et

al., 2003] and [Lalor et al., 2004]).

In most SSVEP-BCI related publications the electroencephalogram (EEG) was

derived from electrode positions O1 and O2 (international 10–20 electrode system).

It was either derived bipolarly from these positions (Middendorf et al., 2000),

monopolarly (Kelly et al., 2005b) or bipolarly anterior/posterior to them (Muller-

Putz et al., 2005). Furthermore, in one work, high-density EEG was derived to study

an SSVEP-based BCI (Kelly et al., 2005a). Discrete Fourier transformation (DFT)

had been used in most publications dealing with SSVEP-based BCIs ([Cheng et al.,

2002], [Gao et al., 2003], [Lalor et al., 2004] and [McMillan et al., 1995]). Only in

some cases a lock-in analyzer system (LAS) had been used ([Middendorf et al.,

2000] and [Muller-Putz et al., 2005]).

The following chapter will describe the implementation of a four-command BCI based on the SSVEP protocol. A specifically designed stimulation system, connected to the computer by a USB port and driving four LEDs applied to the edges of an LCD screen, was adopted. The system was able to discriminate between five classes: RIGHT, LEFT, UP, DOWN, CENTRE.


The BCI algorithm was based on a supervised multiclass classifier using Regularized Linear Discriminant Analysis. The selected features were the ratio between the amplitude of a stimulus time-synchronous average of the signal and the estimated amplitude of the raw signal. A simplified surface Laplacian combined a reduced number of electrodes with a good signal-to-noise ratio. The adopted feature extraction technique implicitly takes into account all the harmonic components of the signal.

A specific user customization protocol was developed in order to retrieve the best

user specific calibration parameters, different control strategies were implemented

and evaluated.

According to the following figure, the recorded locations were O1, Oz, O2, P3, Pz, P4, T5 and T6, referred to the linked mastoids:

Fig 34. 10-20 reference system

In order to simplify the signal processing, the choice of the frequencies was constrained to the set of frequencies whose period can be represented by an integer number of samples at 256 Hz.
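This constraint means the admissible frequencies have the form f = 256/n with integer n. A quick enumeration (a Python sketch, with the 6-17 Hz screening range used below) shows that the frequencies reported later in this chapter, such as 8 Hz (32 samples) and about 9,14 Hz (28 samples), belong to this set.

```python
FS = 256  # sampling rate in Hz

def admissible_frequencies(fs=FS, f_min=6.0, f_max=17.0):
    # Frequencies whose period is an integer number of samples at fs:
    # f = fs / n, so the period is exactly n samples long.
    return {n: fs / n for n in range(1, fs + 1) if f_min <= fs / n <= f_max}

freqs = admissible_frequencies()
# e.g. n=32 -> 8.0 Hz, n=28 -> ~9.14 Hz, n=23 -> ~11.13 Hz
```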

The Protocols

The adopted SSVEP protocol was composed of the following phases; the first three are needed for the set-up and validation of the classifier, while the second group is for real free-will use:


• Screening phase: studied in order to understand the best user specific

stimulation frequencies;

• Training or calibration phase: studied in order to record a dataset for the

calibration of the classifier;

• Testing phase: studied for online evaluation of classifier performances;

• BCI-User: main menu for application selection;

• AstroBrainFight: studied to test a cursor like control;

• Home automation protocol: implemented to demonstrate the usability of the

home automation system using a BCI.

Screening protocol

The screening protocol is essential to reach the maximum performance of the SSVEP-based BCI. Since there was no empirical or literature-based evidence of which stimulation frequencies best elicit the SSVEP response, a specific protocol was developed in order to identify the best user-specific stimulation frequencies. The protocol consisted of a simple user interface without real-time feedback and was composed of two phases:

• a stimulation phase in which the user had to focus on the only active stimulator for eight seconds;

• a rest phase during which the user had to focus on the centre of the screen and the stimulation was deactivated for eight seconds.

Each resting phase was followed by a stimulation phase at increasing frequencies between 6 and 17 Hz in one-hertz steps. The whole session lasted about 3 minutes (184 seconds).


Fig 35. Screening protocol GUI (Active phase)

After the screening phase the recorded signal was processed with Matlab software in order to select the best stimulation frequencies. The screening application was composed of two GUIs: the first was used to get a global idea of the best recording locations, the second was used to perform a joint time-frequency analysis (JTFA) of the recorded signal.

Fig 36. Screening application

The main window can be used to get a general overview of which electrodes yield the best SNR over the different stimulation frequencies. The second window shows a JTFA of the recorded signal on the selected electrode and can be used as a confirmation of the frequency selection performed with the first part of the application.

This phase is usually performed only the first time the user approaches the BCI system.

Training or calibration protocol

Once the four stimulation frequencies are selected, the stimulators are programmed with the corresponding periods. The calibration consisted of a guided sequence of operations used to acquire a training set for the classifier. During this phase all the stimulators were always active and a symbol at the centre of the screen guided the user through the sequence shown in Fig 37.

Fig 37. Training sequence

During the active phases (left, right, up, down) the user had to focus his/her gaze on the corresponding stimulator, trying to minimize eye blinks and head movements; the 'NULL' phase was inserted in order to have a 'non-command' condition. Each phase lasted 20 seconds (140 seconds in total). Since the adopted biofeedback does not depend on a previous calibration, the operator can decide whether or not to activate the real-time feedback during this phase of the protocol. Depending on classifier performance and on user requirements, the calibration phase can be repeated when the user has gained familiarity with the system. The calibration usually proved to be portable across sessions.

In order to avoid user distraction, during the first training acquisition the biofeedback is usually turned off; in some cases the user asked for a second calibration of the classifier with active biofeedback. After the acquisition the classifier is calibrated using a Matlab application.


Fig 38. Main window of the training application

According to Fig 38, the main window allows the following operations:

a. Dataset loading and calibration storage;

b. Spatial filter selection;

c. Frequency selection and dataset editing;

d. Evaluation of the feature consistency;

e. Evaluation of the resulting classifier.

The dataset editing window allows the selection and deletion of portions of signal affected by artefacts or not containing an SSVEP response.

Testing protocol

This phase is used to verify the performance of the classifier and to let the user learn

how to control biofeedback. The session consists of a guided sequence completion

task. The user was asked to focus his attention on a particular light source while the


signal was processed and identified continuously by the online translation algorithm;

the switching from one stimulus to another occurred only after the system identified

a command related to the actual target source [Maggi, 2006]. The biofeedback was activated and the user was instructed that, in order to achieve a classification, he/she had to focus and concentrate on the LED indicated by the arrow, obtaining and maintaining for at least one second an increase in the length of the corresponding feedback bar.

Fig 39. The user during the testing phase

The sequence consisted of eight symbols; the classifier was activated four times per second and a classification was considered definitive when five consecutive identical classifications were produced. After a correct classification the stimulation LEDs were switched off and classification was inhibited for three seconds in order to avoid the occurrence of false positives due to the length of the signal elaboration window.

The structure of this validation protocol allowed recording a triggered signal with a structure similar to the training set; the two recordings could thus be merged in order to obtain a more general training set.


BCI-User protocol

Fig 40. Main menu

This protocol was structured as a main selection menu for the available applications.

Four textured cubes (buttons) represent four kinds of applications:

• Home automation;

• Communication applications;

• Gaming applications;

• Music players;

At the moment of writing, only the home automation and the gaming applications were implemented. A cursor-movement selection was implemented: the movement of the cursor was achieved by using the four LEDs as command arrows. The classification post-processing was the same used in the testing phase. Starting from the central position, the user could select the preferred application with two movements in the same direction. The GUI was designed with big icons in order to facilitate the control of the cursor position using peripheral vision.

AstroBrainFight protocol

In order to make the user experience more involving and entertaining, especially for young users, a gaming protocol was developed. The GUI was composed of two astrofighters, an appropriate sky dome and the feedback bars; the user had to move the red fighter in order to collide with the green one. After a collision one point was assigned to the player, the red fighter returned to the central position and the green one was placed in another pseudo-random position.


Fig 41. Astrobrainfight application

In this application the stimulation devices were always active and every form of classification post-processing was disabled. The red fighter moved one tick in the direction of the classification every quarter of a second; after six identical consecutive classifications the movement was accelerated to two ticks per classification. The game ended after 15 collisions and the total time was displayed; in order to promote motivation, users were informed that they were part of the AstroBrainFight hall of fame.

MyHome protocol

The MyHome protocol is used to demonstrate the flexibility of the system in interfacing with external components. Using the libraries illustrated in the previous chapter it was possible to create a button-menu-based application for the control of the home automation system. The first menu allows the selection between six submenus of commands:

• Lighting controls;

• Environments;

• Automations;

• Shutdown all;

• Exit – return to main menu;

• Scenarios.


Fig 42. Main menu of MyHome protocol

When one of the submenus is selected, the user gains access to the corresponding GUI with the related buttons; when the user selects an operation in the submenu, the protocol returns to the main menu.

In order to simplify user control, the cursor movement was constrained by the grid

shown in Fig 43.

Fig 43. Cursor movement grid


Fig 44. Planimetry of the acquisition room

A small home automation system was set up in the room where the sessions took place; the user could activate two devices and two independent lights behind the user's seat. Since the SSVEP protocol is based on external optical stimulation, and thus on the stimulus-related potential, the activation of the lights could have a potential effect on the SSVEP response. During this protocol a researcher verbally guided the user in the use of the application, so it was possible to verify the effect of environmental light on system usability.

Signal processing

The following figure shows the structure of the SSVEP classification algorithm. A signal pre-processing algorithm performs spatial and time-domain filtering, a feature extraction block extracts one feature for each stimulation frequency, and a multiclass Regularized Linear Discriminant Analysis (RLDA) performs the classification of the signal; the biofeedback is computed on the basis of the extracted features.


Fig 45. Structure of the algorithm

Signal pre-processing block

A spatial filtering block (SFB) was used in order to reduce the dimensionality of the eight acquired monopolar derivations. The SFB computes a custom derivation on the basis of the analysis of the calibration data-set. The spatial filtering is computed by performing a linear combination of the acquired channels:

F(t) = Σ_{i=1..8} M_i(t) · w_i

where F(t) is the result of the spatial filtering, M_i(t) is the i-th original monopolar derivation and w is a weight vector. It is possible to compute an approximation of a Common Average Reference (CAR) of the i-th electrode by setting the term w_i to 1 and the other weights to -1/7, or a simplified surface Laplacian by setting w_i to 1, setting the weights corresponding to the four surrounding electrodes to -1/4 and zeroing the remaining ones.

Fig 46 Most common spatial filters
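The two weight configurations just described can be written down directly. The sketch below (Python/NumPy rather than the embedded implementation) builds the CAR and simplified-Laplacian weight vectors for the eight-channel montage; the neighbour indices in the usage example are an assumption, since they depend on the actual electrode placement.

```python
import numpy as np

N_CH = 8

def car_weights(i, n=N_CH):
    # Approximate Common Average Reference: +1 on channel i,
    # -1/(n-1) on every other channel.
    w = np.full(n, -1.0 / (n - 1))
    w[i] = 1.0
    return w

def laplacian_weights(i, neighbours, n=N_CH):
    # Simplified surface Laplacian: +1 on channel i, -1/4 on the
    # four surrounding electrodes, 0 elsewhere.
    w = np.zeros(n)
    w[i] = 1.0
    w[list(neighbours)] = -0.25
    return w

def spatial_filter(M, w):
    # F(t) = sum_i w_i * M_i(t) for an (n_channels, n_samples) array M.
    return w @ M

# both filters cancel a signal that is common to all channels
common = np.ones((N_CH, 100))
car_out = spatial_filter(common, car_weights(0))
lap_out = spatial_filter(common, laplacian_weights(2, [0, 1, 3, 4]))
```

Because the weights sum to zero, any common-mode interference (such as mains pickup identical on all electrodes) is rejected.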

It was possible to put the time filtering block after the SFB since the spatial filtering block involves only linear operators: the reduction of dimensionality (up to eight-to-one) performed by this block strongly reduced the workload related to time-domain filtering. The applied filter weights corresponded to a 5th-order Butterworth band-pass filter ranging from 5 Hz to 45 Hz.

The filter was a C implementation of the following canonical representation:

a(1)*y(n) = b(1)*x(n) + b(2)*x(n-1) + ... + b(nb+1)*x(n-nb) - a(2)*y(n-1) - ... - a(na+1)*y(n-na)

where y is the filtered signal, x is the original signal, a is the autoregressive weight vector and b is the moving-average part.
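A direct-form evaluation of this difference equation can be sketched as follows (Python here for brevity; the firmware version is in C). The coefficients in the usage example are a simple illustrative filter, not the thesis's 5th-order Butterworth coefficients.

```python
def iir_filter(b, a, x):
    """Direct-form evaluation of
    a[0]*y[n] = sum_k b[k]*x[n-k] - sum_{k>=1} a[k]*y[n-k]."""
    y = []
    for n in range(len(x)):
        acc = sum(b[k] * x[n - k] for k in range(len(b)) if n - k >= 0)
        acc -= sum(a[k] * y[n - k] for k in range(1, len(a)) if n - k >= 0)
        y.append(acc / a[0])
    return y

# illustrative coefficients: a two-tap moving average and a leaky integrator
ma = iir_filter([0.5, 0.5], [1.0], [1.0, 1.0, 1.0, 1.0])
leaky = iir_filter([1.0], [1.0, -0.5], [1.0, 0.0, 0.0])
```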

Feature extraction block

The point of strength and originality of this system is the feature extraction block. The selected features are the ratio between the amplitude of a stimulus time-synchronous average of the signal and the estimated amplitude of the raw signal. For each frequency the algorithm extracts one feature. For each stimulus frequency, given Plength, the period of the specific stimulus measured in samples, and given a window of signal, the algorithm divides the window into several sub-windows of length equal to Plength and computes the accumulation window resulting from the average of the sub-windows.

acc(i) = (1 / SubWinNumber) · Σ_{w=0..SubWinNumber-1} Signal(i + w · StimPeriod)

where

SubWinNumber = WinLength / StimPeriod

The signal components which are synchronous to the stimulus frequency are thus enhanced, while the other components are attenuated. An amplitude estimation of the SSVEP is obtained by comparing the standard deviation of the accumulation window to that of the raw signal:

to the one of the raw signal:

)(

)(

Signalstd

accstdFeat =

The resulting estimation is a numerical value which theoretically ranges from 0, in case of totally destructive interference within the main window, to 1, when the signal has only one periodic component at the same period as the stimulation.
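The feature computation can be sketched end to end (Python/NumPy; the test signal and window length below are illustrative): a window is cut into sub-windows of one stimulation period, the sub-windows are averaged, and the two standard deviations are compared.

```python
import numpy as np

def ssvep_feature(signal, stim_period):
    """Ratio between the std of the stimulus-synchronous average and the
    std of the raw window; ranges from ~0 (no synchronous component)
    to 1 (purely periodic at the stimulation period)."""
    n_sub = len(signal) // stim_period            # complete sub-windows only
    sub = signal[: n_sub * stim_period].reshape(n_sub, stim_period)
    acc = sub.mean(axis=0)                        # accumulation window
    return float(np.std(acc) / np.std(signal))

fs = 256
t = np.arange(3 * fs) / fs                        # 3-second window
pure = np.sin(2 * np.pi * 8.0 * t)                # 8 Hz -> 32-sample period
feat_pure = ssvep_feature(pure, fs // 8)          # close to 1
rng = np.random.default_rng(0)
feat_noise = ssvep_feature(rng.standard_normal(3 * fs), fs // 8)  # small
```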

The main advantage of this property is that the resulting values don't need to be scaled on the basis of a previous training session. The following figure shows the plot of the 9,14 Hz feature extracted during the calibration recording.

Fig 47. Feature plot in time

The following paragraphs will discuss some aspects that should be taken into account during the set-up of a classifier:

• length of the evaluated window;

• frequency selectivity and resolution;

• harmonics selectivity;

• effect of the base EEG;

• stimulus frequency.

Window length affects frequency selectivity and resolution

A frequency selectivity graph was calculated by generating 20-second-long sinusoidal signals at increasing frequencies ranging from 0 Hz to 50 Hz in steps of 0.05 Hz, sampled at the same frequency as our EEG device (256 Hz). The mean value of the calculated feature is then used as an index of sensitivity at the given frequency. The following figure shows the graph calculated for the 8 Hz feature on a one-second window.


Fig 48. Selectivity plot 8hz

The plot shows the same response for every harmonic and a number of side lobes which depends on the length of the considered window and on the distance between the harmonics. The width of the main lobe is twice the inverse of the window length (in seconds), while the width of the other lobes is equal to the inverse of the window length. Thus the frequency resolution of the proposed feature can be considered:

F_res = 1 / WinLength

Fig 49. Frequency selectivity with 2,3,4 seconds windows


Fig 49 shows that the amplitude of the first side lobe is similar for each window length, but it decreases more rapidly for longer windows.
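The harmonic behaviour described above can be checked numerically. In this sketch (with illustrative parameters) the feature tuned to a 32-sample period (8 Hz) responds equally to 8 Hz and to its 16 Hz harmonic, and rejects an off-frequency 12 Hz sinusoid, whose phase flips by half a cycle between consecutive sub-windows.

```python
import numpy as np

FS = 256
WIN_S = 20                                       # 20-second test signals

def feature(signal, period):
    n = len(signal) // period
    acc = signal[: n * period].reshape(n, period).mean(axis=0)
    return float(np.std(acc) / np.std(signal))

def response(freq_hz, period=32):
    # mean feature value for a pure sinusoid at freq_hz
    t = np.arange(WIN_S * FS) / FS
    return feature(np.sin(2 * np.pi * freq_hz * t), period)

r_fund = response(8.0)       # fundamental: ~1
r_harm = response(16.0)      # second harmonic: also ~1
r_off = response(12.0)       # off-frequency: ~0
```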

Effect of the base EEG

In order to give a brief evaluation of the effect of the background EEG, the simulated sinusoidal signal was added to generic Gaussian white noise. The figure shows a reduction of the peak response of the feature and an increase of the baseline; thus the SNR is strongly decreased.

Fig 50. Frequency response to a sinusoidal signal with white noise

In order to exclude the effect of noise outside the SSVEP band, a general band-pass filter from 5 to 45 Hz was added to the pre-processing stage. Fig 51 shows an increase in the peak response and a decrease of the baseline at the in-band frequencies, leading to an increase of the SNR.


Fig 51. Comparison of the response with and without band-pass filter

Biofeedback generation

In most algorithms the biofeedback is generated by the classifier as an estimation of the reliability of the proposed classification, on the basis of a previous calibration of the algorithm. This approach aims at helping the user to generate a more reproducible signal. As a general rule, such classifiers tend to encourage the production of a signal similar to the one generalized from the training set. In case of a training set with low distinguishability between the classes, the user will tend to learn to produce a better signal by reducing the variability within each class, but not necessarily by increasing the distance between classes. The use of a calibration-independent parameter as biofeedback gives the user direct control of the parameter, thus increasing class distinguishability both by enhancing signal repeatability, in terms of reduction of the within-class variance, and by increasing the distance between classes.


Fig 52. Three different situations: (A) low separability due to high variance and low distance, (B) better separability due to variance reduction, (C) good separability due to high distance and low variance

The adopted feature has a time response which makes it suitable, with some modifications, for use as biofeedback. The frequency response in case of white noise superimposed on a simulated SSVEP response shows a baseline. In order to achieve an easier-to-understand biofeedback, an offset was subtracted from the feature. The feature baseline value depends on the corresponding frequency and thus on the number of ensemble-average windows that can be extracted from the main window. As a reasonable offset, the value resulting from Gaussian white noise was used:

white noise was used:

)(

)(

)(

)(

windowstdn

windowstd

windowstd

averagestdFeatureWN

•==

thus

nOffset

1=

00

1

)(

)(

=→<

−=

FeedbackFeedback

nwindowstd

averagestdFeedback


where n is the number of complete sub-windows contained in the main window. For each window, one biofeedback value per stimulation frequency was generated; in order to avoid confusing the user, only the one with the highest value was displayed. The use of feature-related values as biofeedback gives the user direct control of the classifier input and thus of its behaviour.
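The 1/√n baseline can be verified numerically; the sketch below (with illustrative window sizes) averages the feature over repeated white-noise windows and compares the result with 1/√n.

```python
import math
import numpy as np

def feature(signal, period):
    n = len(signal) // period
    acc = signal[: n * period].reshape(n, period).mean(axis=0)
    return float(np.std(acc) / np.std(signal))

PERIOD = 32                  # sub-window length in samples
N_SUB = 24                   # complete sub-windows (3 s at 256 Hz, 8 Hz stimulus)
rng = np.random.default_rng(0)
vals = [feature(rng.standard_normal(N_SUB * PERIOD), PERIOD)
        for _ in range(200)]
baseline = float(np.mean(vals))          # close to 1/sqrt(24) ~ 0.204

# offset-corrected feedback, clipped at zero as in the text
feedback = max(baseline - 1.0 / math.sqrt(N_SUB), 0.0)
```

For pure noise the offset-corrected feedback stays near zero, so the bars only grow when a genuinely synchronous component is present.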

Fig 53. Time course of the feedback on the basis of the evaluation window length

Fig 53 shows the time course of three biofeedback signals calculated on a screening dataset for three different lengths of the evaluation window. As expected, the trace corresponding to a 2-second window shows a faster response time to gaze onset and offset, but is more unstable than the others. The 4-second-window feedback is the most stable, but shows a considerable delay, especially after gaze offset; as a general setting, the three-second window was chosen as the best compromise between speed and stability.

Classification

The classifier consisted of a regularized linear discriminant analysis (RLDA) based on the modified sample covariance matrix method. The RLDA included a boosting algorithm based on a cyclic minimization of the classification error on the training set and an algorithm for outlier rejection.

The classification algorithm was conceived for use in an asynchronous protocol, thus the system was trained to identify five different classes referred to as LEFT, RIGHT, UP, DOWN and NULL for the non-stimulus or non-command class.


A classification post-processing stage was inserted in the protocols in order to achieve good usability.


RESULTS AND DISCUSSION

Power requirement and DSP performances

In the following paragraph the technological performances of the device will be

discussed. The aim of this activity was the characterisation of the device in order to

define the following parameters:

1. Power consumption;

2. Amount of allocated memory;

3. Amount of CPU time required by the processing;

4. Access time to the SD card using the file system;

5. Total EEG noise.

Test setup

The firmware configuration of the KimeraII unit was the most demanding in terms of

resources:

• Sample rate: 256Hz;

• Acquired EEG channels: 8;

• Classifier calibration on SD;

• Online feedback feature extraction ON;

• Online classification ON;

• Online signal streaming ON;

• SD card signal storing OFF.

Timings were measured using a digital oscilloscope connected to a general-purpose input/output pin used for debug purposes. Power consumption was measured using a 10 Ohm resistor in series with the power supply; using an oscilloscope in differential mode it is possible to monitor the power consumption by measuring the voltage drop caused by the resistor and applying Ohm's Law.
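The conversion from the measured voltage drop to current and power is a one-line application of Ohm's law. In the sketch below the supply voltage is an assumption (3,6 V, chosen because it is consistent with the mA/mW pairs reported in Table 5).

```python
R_SHUNT = 10.0     # ohms, series resistor of the test set-up
V_SUPPLY = 3.6     # volts; assumed battery voltage (80 mA -> 288 mW in Table 5)

def current_a(v_drop):
    # Ohm's law: I = V / R
    return v_drop / R_SHUNT

def power_w(v_drop):
    return V_SUPPLY * current_a(v_drop)

# a 0.8 V drop across the 10 ohm shunt corresponds to 80 mA and 288 mW
i = current_a(0.8)
p = power_w(0.8)
```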

Device total noise was measured by connecting the inputs of the amplifier to the reference and to the ground input of the amplifier. A ten-minute acquisition was performed using the KimeraII Bluetooth connection; thus the measured values can be considered representative of the performance during real operation. The total measured noise is composed of the sum of the following effects:

• Amplifier noise;

• Power supply noise caused by the digital devices;

• ADC noise.

The scope of this measurement is to give a broad reference for comparison with other devices; more complex techniques should be used in order to separate the different contributions.

Results

Power consumption

The following table shows the measured power consumption in the corresponding

working condition:


Condition                                              Current (mA)   Power (mW)
Bluetooth data acquisition with trigger reception*          80            288
Bluetooth data acquisition without trigger reception        60            216
SD card data storage without triggers                       15             54

Table 5. Power consumption in different configurations

More optimized power management techniques could be considered in order to further reduce power consumption, but the current performance can be considered satisfactory: in the most power-demanding configuration the battery allows 28 hours of continuous acquisition.

Amount of allocated memory

The total used memory was directly reported by the linker and was about 10 kByte. About 4 kByte were used by the operating system and by the data routing and communication routines; the rest was used by the algorithm.

* due to the STR7 clock distribution architecture, in order to receive triggers from the

serial interface the dynamic clock frequency switching must be deactivated.


CPU time

Fig 54. Time plot of the debug pin and measurement of elaboration time

Fig 55. Time plot of the debug pin and measurement of communication time

The signal elaboration CPU time was 18 ms for a 3-second window. Since the communication routine was blocking, a communication time of 48 ms should also be taken into account. Considering a time of 125 ms between one elaboration and the following, about 48% of the CPU time is still available. By implementing a non-blocking communication procedure the available CPU time would be 85%.
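The CPU budget can be reproduced directly from the measured timings (a simple arithmetic sketch):

```python
ELAB_MS = 18.0      # signal elaboration per 3-second window
COMM_MS = 48.0      # blocking communication routine
PERIOD_MS = 125.0   # time between consecutive elaborations

def free_cpu_fraction(blocking_comm=True):
    busy = ELAB_MS + (COMM_MS if blocking_comm else 0.0)
    return 1.0 - busy / PERIOD_MS

blocking = free_cpu_fraction(True)       # ~0.47 ("about 48%")
non_blocking = free_cpu_fraction(False)  # ~0.86 ("85%")
```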

SD card access time

Fig 56. SD card profiling

In order to assess the performance of the SD card driver and of the FAT16 file system, a custom function was used. The function opened a temporary file, wrote 256 blocks of 128 bytes each and closed the file. Fig 56 shows the time measurement of the writing part.

• File opening time: 50 ms;

• 32 kByte writing: 162 ms;

• File flush and closure time: 130 ms.

The corresponding calculated transfer rate is about 197 kBytes per second. The maximum clock of the STR7 SPI is 4 MHz, giving a theoretical transfer rate of 500 kBytes per second.
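The quoted figures can be reproduced from the measurements; in this sketch a kByte is taken as 1024 bytes, which is the convention that matches the 197 value.

```python
BYTES_WRITTEN = 256 * 128       # 256 blocks of 128 bytes = 32 kByte
WRITE_S = 0.162                 # measured writing time
OPEN_S, CLOSE_S = 0.050, 0.130  # file open and flush/close times

write_rate = BYTES_WRITTEN / WRITE_S / 1024.0            # ~197 kByte/s
effective_rate = BYTES_WRITTEN / (OPEN_S + WRITE_S + CLOSE_S) / 1024.0

# SPI upper bound: 4 MHz clock, 8 bits per byte
spi_limit = 4_000_000 / 8 / 1024.0                       # ~488 kByte/s
```

Note that once the fixed open and close overheads are included, the effective rate drops to roughly half of the raw writing rate.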

Total EEG noise

The measured total EEG noise is less than 100 nVpp. It was measured with a ten-minute acquisition from three electrodes (two inputs and GND) connected together.


BCI users and performances

The screening sequence was performed on 24 disabled patients in a clinical environment (La Nostra Famiglia, Bosisio Parini) using a medically certified EEG amplifier (Neuroscan - Compumedics, USA).

The following figure represents the Joint Time-Frequency Analysis (JTFA) of the signal acquired during the screening phase. The signal is Oz, digitally referenced to the common average of the other electrodes. The spectral density was referred to a baseline extracted from a non-stimulation period and an image filter was used in order to achieve a better contrast. The figure shows the analysis of three different subjects in order to better illustrate the different typologies of elicited signal.

Fig 57. Screening subject A – SSVEP response at all stimulation frequencies, presence of

harmonics


Fig 58. Screening subject B - SSVEP response only in the upper frequencies, no harmonics

Fig 59. Subject C - SSVEP response on some frequencies, some have harmonics, other not


The SSVEP can be identified by a power increase during the stimulation phases at increasing frequencies, according to the screening protocol. It is possible to notice that each subject has a very specific response to the stimulation. While some subjects generate an SSVEP at every stimulation frequency, mainly at the main stimulation frequency (C), others generate a complete three-harmonic wave (A). Some subjects (B) are able to generate an appreciable SSVEP only at the main stimulation frequency, while others show a more complex response in the second and higher harmonics. About 60% of the examined subjects were considered suitable for online use of the system.

An online evaluation of the SSVEP system was performed on ten healthy subjects

using our EEG device.

Testing procedure

The subject sat in front of the screen on an office chair. The distance from the screen was chosen by the user at his/her own pleasure. An electrode cap from G.Tec and golden electrodes were used. The electrodes were applied by a student and the recording sites were the following:

• O1, Oz, O2;

• P4, Pz, P5;

• T5, T6.


Fig 60. A subject during the acquisition

The reference was linked mastoids and the ground electrode was placed in the frontal region. The electrode-skin impedance was measured using external equipment. In 6 subjects out of 10 all the impedances were under 10 kOhm.

The standard testing procedure was composed of:

• One screening session;

• One calibration session without feedback;

• One testing session with feedback;

If the testing was successfully carried out:

• One AstroBrainGame with feedback session;

• One home automation usability test.

A detailed description of the protocols can be found in the previous chapter.

During the first part of the testing procedure the subject was verbally instructed about the meaning of the graphic interface. In order to increase the users' participation and involvement in the protocol, they were informed that a "hall of fame" of the best AstroBrainGame players was being formed. Some subjects enjoyed the system and asked for other BCI sessions after the standard one: in those cases the database was updated, but the score wasn't recorded in the results table.


Subject  SSVEP   Frequencies (Hz)            Test    Game    MyHome
                 L      R      U      D      time    time
HA       Yes     9,14   6,9    5,9    8      73      265     Used with lights
HB       Yes     8      9,10   9,8    11,1   59      206     Used with lights
HC       No
HD       No
HE       Yes     x      x      x      x                      Only screening
HF       Yes     8      11,1   12,2   14,2   80      205     Used with lights
HG       No
HI       Yes     9,14   9,8    11,13  16     65      210     Used with lights
HL       Yes     9,8    11,1   12,8   15,1   170*    461*    Used with lights
HM       Yes     8      9,8    11,1   12,2   64      295     Used with lights

Table 6. Subject performances using the SSVEP-based BCI

All the subjects were between 22 and 55 years old; 9 of them were male. Seven subjects out of ten generated an SSVEP and six of them were able to train a classifier and use the system (subject HE showed an SSVEP response, but had problems with the electrodes). Subject HL was able to use the system, but his test times were not considered in the statistics because he didn't completely understand the instructions.

The average time to complete the eight-symbol test was 68,2 seconds (std 8,2 s). The best performance was 59 seconds. Considering the time in which the lights are switched off, according to Wolpaw's definition [Kronegg, 2005] the resulting bit rate is:

• Mean bit rate: 21,8 bit per minute;

• Maximum bit rate: 27,4 bit per minute.
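These rates can be reproduced under the assumption that each of the eight selections conveys log2(4) = 2 bits (four commands, perfect accuracy) and that the 3-second LED-off period after each selection is excluded from the time, as stated above; this is a reading of the computation, not the thesis's exact script.

```python
import math

N_COMMANDS = 4       # LEFT, RIGHT, UP, DOWN
N_SELECTIONS = 8     # symbols in the test sequence
OFF_S = 3.0          # LEDs switched off after each correct classification

def bit_rate_per_min(total_time_s):
    # Wolpaw bit rate at perfect accuracy: log2(N) bits per selection,
    # computed over the time during which the stimulation was active.
    active_s = total_time_s - N_SELECTIONS * OFF_S
    bits = N_SELECTIONS * math.log2(N_COMMANDS)
    return bits / active_s * 60.0

mean_rate = bit_rate_per_min(68.2)   # ~21.7 bit/min
best_rate = bit_rate_per_min(59.0)   # ~27.4 bit/min
```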

During the AstroBrainFight protocol the average performance was 236 seconds (std 41,4) to reach 15 scores, which corresponds to about one collision every 15 seconds.


The obtained values are comparable to those found in the literature [Ming, 2007]. We have to point out that this speed test wasn't a measurement of the maximum transfer rate reached by a skilled user with a user-optimized classifier, but the transfer rate obtained by first-time BCI users. To give an idea of the effect of user training on system performance, we underline that subject HF obtained the best score in the AstroBrainFight protocol and the worst one in the eight-symbol test. All six subjects that accessed the second part of the protocol were able to control the home automation system even after having switched the lights on; thus the system proves robust to environmental light variation.

Most of the subjects (5 out of 6) reported that they were able to control the

biofeedback bars; thus we can conclude that the adopted biofeedback synthesis

algorithm and display strategy have a positive impact on the adaptation of the subject

to the protocol.

Subjects reported being annoyed by the external stimulation during the screening and

training phases, since they had to focus continuously on the stimulator; during

the other phases the frequent pauses reduced this effect.

A delay between gaze onset or offset and the algorithm response is present, due

to the use of a three-second analysis window: during the AstroBrainFight game three subjects

were able to recognize this effect and adopt a command anticipation strategy in

order to achieve a better result.
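
The origin of this delay can be made concrete with a toy model (illustrative only; the sampling rate, threshold and per-sample score are hypothetical, not the thesis classifier): a detector averaging a per-sample SSVEP score over a three-second window cannot respond until the window is dominated by post-onset samples, so its latency lies between zero and the full window length depending on the decision threshold.

```python
FS = 10            # assumed score rate: 10 per-sample scores per second
WINDOW = 3 * FS    # three-second analysis window, as in the protocol

def detector(scores, threshold=0.5):
    """Return the first index at which the windowed mean crosses the threshold."""
    for i in range(WINDOW, len(scores) + 1):
        if sum(scores[i - WINDOW:i]) / WINDOW > threshold:
            return i
    return None

# Gaze onset at t = 2 s: the per-sample score jumps from 0 to 1.
scores = [0.0] * (2 * FS) + [1.0] * (6 * FS)
onset_idx = detector(scores)
delay_s = onset_idx / FS - 2
print(round(delay_s, 2))  # 1.6 s here; approaches 3 s as threshold -> 1
```

Anticipating the command by roughly this delay, as three subjects spontaneously did, effectively compensates for the windowing latency.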

BCI Development experience

From the technological point of view, the framework appears to be well structured

and flexible. Ease of extension is provided for every activity involved in the

development of a BCI:

• Algorithms;

• Applications/protocols;

• Input devices;

• Stimulation devices.
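
The extension points listed above can be pictured as a plugin registry: a new algorithm, application or device registers itself under a name, and the engine instantiates it at run time without changes to the core. The sketch below is purely illustrative (the names `register`, `Algorithm` and `MeanPower` are hypothetical, not the actual framework API, which is written in C++):

```python
# Hypothetical plugin-style extension point, sketched in Python for brevity.
ALGORITHMS = {}

def register(name):
    """Class decorator adding an algorithm to the global registry."""
    def wrap(cls):
        ALGORITHMS[name] = cls
        return cls
    return wrap

class Algorithm:
    """Base interface every pluggable algorithm implements."""
    def process(self, samples):
        raise NotImplementedError

@register("mean_power")
class MeanPower(Algorithm):
    """Toy algorithm: mean squared amplitude of an EEG sample buffer."""
    def process(self, samples):
        return sum(s * s for s in samples) / len(samples)

def run(name, samples):
    """The engine looks a plugin up by name and runs it."""
    return ALGORITHMS[name]().process(samples)

print(run("mean_power", [1.0, -1.0, 2.0]))  # 2.0
```

A registry of this kind is what lets a newcomer contribute a new algorithm or protocol without touching, or even understanding, the rest of the system.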

The usability of the framework was demonstrated in the implementation of the

SSVEP-based system: during its development a Bioengineering master

thesis student without previous C++ programming experience was successfully

involved in the development of two new BCI-driven applications.


When Matlab® algorithms are provided, the implementation of a new testing

protocol usually takes about 4 hours for the user interface customization and a few

more hours for testing; thus the evaluation of a new operative protocol is not

constrained by technical issues. It was also possible to enable the framework for use with

commercial EEG equipment (G.Tec) in less than a working day.

It should be noted that the reported usability impressions come from the same

research team that carried out the development of the framework: a more objective

evaluation could be provided by external developers.


CONCLUSIONS AND FURTHER WORKS

The framework proved highly flexible, and its wide set of debugging instruments

dramatically simplified the debugging and testing of a new system. Starting from

previous experience and a set of algorithms for the SSVEP protocol, it was possible to

set up a complete BCI system in our new laboratory at CampusPoint@Lecco (Lecco,

Italy) in a few weeks. The resulting system proved highly usable: seven subjects out of

ten were able to operate a gaming interface and a home automation system.

The performance of the SSVEP-based BCI is comparable to that reported in the

literature, and the subjects were immediately able to use the BCI after the calibration of

the classifier. The adopted unsupervised biofeedback scaling gave good results in

terms of brain-machine mutual adaptation: many subjects reported that they

considered themselves able to control the amplitude of the feedback bars.

The adopted user interface was immersive, and the user was stimulated to reach

the maximum performance.

The most relevant aspect of the developed framework is the possibility for unskilled

developers (e.g. master thesis students) to develop and test their own work and to

actively help increase the number of instruments available in the framework.

The final goal is to make this fascinating technology available at home to any

potential user with the help of their family members. The next steps planned in order

to reach this objective are in the following directions:

• software platform distribution;

• application to disabled people;

• usability enhancement for people without technological background;

• availability of low cost acquisition devices.

In the immediate future the whole software platform will be made available free of charge to

other research groups, and further efforts will be made in documenting and

disseminating the obtained results in order to give better visibility to this project, in

the hope that the community will help complete the framework in all its specific

aspects.


A permanent BCI will be installed in a hospital environment in order to verify the

usability of the developed system by the final users, and of the framework by physicians and

therapists.

A feasibility study for the industrialization and the certification of the device is in

progress by our partner SXT - Sistemi per telemedicina.

Apart from the intrinsic challenges of EEG signal analysis, one of the main obstacles

precluding EEG-BCIs from being used in patients' daily lives is setup encumbrance.

Modern EEG practice, as part of the electrode application procedure known to

specialists as montage, requires the tedious application of conductive gel between

electrodes and scalp. The development of dry electrodes for EEG was not discussed in

this work, but it is currently considered a challenging issue by many groups: success

in this research field will mark a key point in the diffusion and usability of

non-invasive Brain Computer Interfaces.


APPENDIX A – SCHEMATICS OF THE GENERAL PURPOSE ACQUISITION BOARD

The following schematics are property of SXT – Sistemi per telemedicina S.R.L.



REFERENCES

Andreoni G., Beverina F., Palmas G., Silvoni S., Ventura G., Piccione F. BCI based on

SVEP: methodological basis. Biomedizinische Technik, Band 49, Ergänzungsband 1, 2004

pp. 33-34.

Andreoni G., Parini S., Maggi L., Piccini L., Panfili G., Torricelli A., “Human Machine

Interface for Healthcare and Rehabilitation”, (book) Advanced Computational Intelligence

Paradigms in Healthcare-2, (Book Series) Studies in Computational Intelligence, Volume

65/2007, June 07, CAP.7, pgs. 131-150, Springer Berlin / Heidelberg

Andreoni G., Piccini L., Maggi L.. Active customized control of thermal comfort,

International Encyclopedia of Ergonomics and Human Factors, 2.nd edition. CRC Press –

2006

Bauer G., Gerstenbrand F., and Rumpl E., “Variables of the locked-in syndrome,” J. Neurol.,

vol. 221, pp. 77–91, 1979.

Bellman, T., & MacKenzie, I. S. (1998), A probabilistic character layout strategy for mobile

text entry. Proceedings of Graphics Interface '98, pp. 168-176, Toronto: Canadian

Information Processing Society.

Beverina F., Giorgi F., Giove S., Piccione F., Silvoni S. (2004). P300 off-line detection: a

fuzzy-based support system. In A.Bonarini (Ed), Soft Computing Applications (pp. 155-

164). New York Springer-Verlag Inc.

Beverina F., Palmas G., Silvoni S., Piccione F. and Giove S. (2004). "User adaptive BCIs:

SSVEP and P300 based interfaces". PsychNology Journal. ISSN 1720-7525. Vol. 1 [4]. 331-

354

Beverina F., Silvoni S., Palmas G., Piccione F., Giorni F., Tonin P., Andreoni G.. P300-

based BCI: a real-time working environment to test HCI on healthy and tetraplegic subjects.

Biomedizinische Technik, Band 49, Ergänzungsband 1, 2004 pp. 35-36.

Birbaumer et al., 1999 N. Birbaumer, N. Ghanayim, T. Hinterberger, I. Iversen, B.

Kotchoubey, A. Kubler, J. Perelmouter, E. Taub and H. Flor, A spelling device for the

paralysed, Nature 398 (6725) (1999), pp. 297–298.

Blankertz B., Krauledat M., Dornhege G., Williamson J., Murray-Smith R., and Klaus-

Robert Müller. A note on brain actuated spelling with the Berlin Brain-Computer Interface.

In C. Stephanidis, editor, Universal Access in HCI, Part II, HCII 2007, volume 4555 of

LNCS, pages 759-768, Berlin Heidelberg, 2007. Springer.

Press W.H., Teukolsky S.A., Vetterling W.T., Flannery B.P., Numerical Recipes: The Art of

Scientific Computing, Third Edition, Cambridge University Press, 2007

Cheng et al., 2002 M. Cheng, S. Gao, S. Gao and D. Xu, Design and implementation of a

brain-computer interface with high transfer rates, IEEE Trans. Biomed. Eng. 49 (2002), pp.

1181–1186.


Chia L. G., “Locked-in syndrome with bilateral ventral midbrain infarcts,” Neurol., vol. 41,

pp. 445–446, 1991.

Donoghue J. P., “Connecting cortex to machines: Recent advances in brain interfaces,”

Nature Neurosci. Supp., vol. 5, pp. 1085–1088, 2002.

Dornhege G., Millán J., Hinterberger T., McFarland D., and Müller KR., Eds., Towards

Brain-Computer Interfacing, MIT Press, 2007

Duhamel, 1985 P. Duhamel, Implementation of ‘split radix’ FFT algorithms for complex,

real and real-symmetric data, IEEE Trans. Acoust. Speech Signal Process. 34 (2) (1985),

pp. 285–295.

Elizabeth M. Badley, “Enhancing the conceptual clarity of the activity and participation

components of the International Classification of Functioning, Disability, and Health”,

Journal of Social Science & Medicine, In Press, 2008

Euler and Kiessling, 1981 M. Euler and J. Kiessling, Frequency-following potentials in man

by lock-in technique, Electroencephalogr. Clin. Neurophysiol. 52 (1981), pp. 400–404.

Gao et al., 2003 X. Gao, D. Xu, M. Cheng and S. Gao, A BCI-based environmental

controller for the motion-disabled, IEEE Trans. Neural Syst. Rehabil. Eng. 11 (2003), pp.

137–140.

Herrmann, C.S., Human EEG responses to 1–100 Hz flicker: resonance phenomena in visual

cortex and their potential correlation to cognitive phenomena, Exp. Brain Res. 137 (2001),

pp. 346–353.

J. Leon-Carrion, P. van Eeckhout, M. d. R. Dominguez-Morales, and F. J. Perez-Santamaria,

“The locked-in syndrome: A syndrome looking for a therapy,” Brain Inj., vol. 16, pp. 571–

82, 2002.

Jette A., “Toward a Common Language for Function, Disability, and Health”, PHYSICAL

THERAPY, Vol. 86, No. 5, May 2006, pp. 726-734

Kelly et al., 2005a S.P. Kelly, E.C. Lalor, R.B. Reilly and J.J. Foxe, Visual spatial attention

tracking using high-density SSVEP data for independent brain-computer communication,

IEEE Trans. Neural Syst. Rehabil. Eng. 13 (2) (2005), pp. 172–178.

Kelly et al., 2005b S.P. Kelly, E.C. Lalor, C. Finucane, G. McDarby and R.B. Reilly, Visual

spatial attention control in an independent brain-computer interface, IEEE Trans. Biomed.

Eng. 52 (9) (2005), pp. 1588–1596

Kennedy PR, Bakay RAE, Moore MM, Adams K, Goldwaithe J. Direct control of a

computer from the human central nervous system. IEEE Trans Rehabil Eng 2000;8:198–202.

Krepki R., Curio G., Blankertz B., Müller K.R., "Berlin Brain–Computer Interface—The

HCI communication channel for discovery", Int. J. Human-Computer Studies 65 (2007)

460–477

Kronegg J., Voloshynovskiy S., Pun T., "Analysis of bit-rate definitions for Brain-Computer

Interfaces", Int. Conf. on Human-Computer Interaction 2005

Kübler A., Kotchoubey T., Kaiser J., Wolpaw J., and Birbaumer N., “Brain-computer

communication: Unlocking the locked in,” Psychological Bulletin, vol. 127, no. 3, pp. 358–

375, 2001.


Kübler A., Mushahwar V. K., Hochberg L. R., and Donoghue J. P., "BCI Meeting 2005—

Workshop on Clinical Issues and Applications", IEEE TRANSACTIONS ON NEURAL

SYSTEMS AND REHABILITATION ENGINEERING, VOL. 14, NO. 2, JUNE 2006

Lalor et al., 2004 Lalor EC, Kelly SP, Finucane C, Burke R, Reilly RB, McDarby G. Brain-

computer interface based on the steady-state VEP for immersive gaming control. In:

Proceedings of the 2nd International Brain-Computer Interface Workshop and Training

Course 2004, suppl., Biomed. Techn. (Berl.), 2004;49:63–64.

M. Middendorf, G.R. McMillan, G.L. Calhoun and K.S. Jones, Brain computer interfaces

based on the steady-state visual-evoked response, IEEE Trans. Neural Syst. Rehabil. Eng. 8

(2000), pp. 211–214.

Maggi L., Parini S., Piccini L. and Andreoni G., “A four-class BCI system based on the

SSVEP protocol”, Proceedings of IEEE EMBC 2006, August 30-September 3, 2006, New

York (USA)

Maggi L., Piccini L., Parini S., Andreoni G. and Panfili G., “Biosignal acquisition device: A

novel topology for wearable signal acquisition devices”, Proceedings of Biosignals 2008 -

2008, January 27-31, Funchal (Portugal)

Mason S.G., Birch G.E., "A General Framework for Brain–Computer Interface Design",

IEEE TRANSACTIONS ON NEURAL SYSTEMS AND REHABILITATION

ENGINEERING, VOL. 11, NO. 1, MARCH 2003

Mazzucco L., Parini S., Maggi L., Piccini L., Andreoni G., Arnone L., “A platform

independent framework for the development of real-time algorithms: application to the

SSVEP BCI protocol”, Proceedings of International BCI Workshop 2006, September 21-24,

2006, Graz (AU).

McMillan et al., 1995 G.R. McMillan, G.L. Calhoun, M.S. Middendorf, J.H. Schnurer, D.F.

Ingle and V.T. Nasman, Direct brain interface utilizing self-regulation of steady-state visual

evoked response (SSVER), Proceedings of the RESNA 18th Annual Conference (RESNA)

(1995).

Millan and Mourino, 2003 J.R. Millan and J. Mourino, Asynchronous BCI and local neural

classifiers: an overview of the adaptive brain interface project, IEEE Trans. Neural Syst.

Rehabil. Eng. 11 (2003), pp. 159–161.

Ming Cheng, Xiaorong Gao, Shangkai Gao “Design and Implementation of a Brain-

Computer Interface With High Transfer Rates”, Conference of the IEEE EMBS August

2007.

Muller-Putz et al., 2005 G.R. Muller-Putz, R. Scherer, C. Brauneis and G. Pfurtscheller,

Steady-state visual evoked potential (SSVEP)-based communication: impact of harmonic

frequency components, J. Neural Eng. 2 (2005), pp. 123–130.

Muller-Putz et al., 2006 G.R. Muller-Putz, R. Scherer, C. Neuper and G. Pfurtscheller,

Steady-state somatosensory evoked potentials: suitable brain signals for Brain-Computer

Interfaces?, IEEE Trans. Neural Syst. Rehabil. Eng. 14 (2006), pp. 30–37.

Parini S., Maggi S., Piccini L., Andreoni G., “Real-time feedback solution applied to the

motor imagery based BCI protocol”, Proceedings of International BCI Workshop 2006,

September 21-24, 2006, Graz (AU).


Pfurtscheller G, Neuper C, Schlögl A, Lugger K. Separability of EEG signals recorded

during right and left motor imagery using adaptive autoregressive parameters. IEEE Trans

Rehabil Eng 1998;6:316–325.

Pfurtscheller G, Neuper C. Motor imagery and direct brain-computer communication. Proc

IEEE 2001;89:1123–1134.

Pfurtscheller G. and Lopes da Silva, 1999 G. Pfurtscheller and F.H. Lopes da Silva, Event-

related EEG/MEG synchronization and desynchronization: basic principles, Clin.

Neurophysiol. 110 (1999), pp. 1842–1857.

Pfurtscheller G. and Neuper C., Motor imagery activates primary sensorimotor area in

humans, Neurosci. Lett. 239 (1997), pp. 65–68.

Pfurtscheller G. and Neuper C., Motor imagery and direct brain-computer communication,

Proc. IEEE 89 (2001), pp. 1123–1134.

Pfurtscheller G., Müller-Putz G. R., Schlögl A., Graimann B., Scherer R., Leeb R., Brunner

C., Keinrath C., Lee F., Townsend G., Vidaurre C., and Neuper C., “15 Years of BCI

Research at Graz University of Technology: Current Projects”, IEEE TRANSACTIONS ON

NEURAL SYSTEMS AND REHABILITATION ENGINEERING, VOL. 14, NO. 2, JUNE

2006

Piccini L., Maggi L., Parini S., Andreoni G., A Wearable Home BCI system: preliminary

results with SSVEP protocol, Proceedings of IEEE EMBC 2005, September 1-4, 2005,

Shanghai (China)

Piccione, F. Giorgi, P. Tonin, K. Priftis, S. Giove, S. Silvoni, G. Palmas, F. Beverina, “P300-

based brain computer interface: Reliability and performance in healthy and paralysed

participants”. Clinical Neurophysiology, Volume 117, Issue 3, Pages 531-537 F. (2006)

Regan, 1989 D. Regan, Human brain electrophysiology: evoked potentials and evoked

magnetic fields in science and medicine, Elsevier, Amsterdam (1989).

Schacham and Pratt, 1985 S.E. Schacham and H. Pratt, Detection and measurement of

steady-state evoked potentials in real-time using a lock-in amplifier, J. Neurosurg. 62 (1985),

pp. 935–938

Scherer R. et al, “Inside the Graz-BCI: rtsBCI“, Biomedizinische Technik, vol. 49, 2004, pp

81-82

Schalk G. et al, "BCI2000: A General-Purpose Brain-Computer Interface (BCI) System",

IEEE trans. Biomed. Eng. 2004; 51(11): 1034-1043

Sutter EE. The brain response interface: communication through visually induced electrical

brain responses. J Microcomput Appl 1992;15:31–45.

Vaughan T., et al, “Use of the ICF Model as a Clinical Problem-Solving Tool in Physical

Therapy and Rehabilitation Medicine", PHYSICAL THERAPY, Vol. 82, No. 11,

November 2002, pp. 1098-1107

Wolpaw J. R., “The Wadsworth BCI Research and Development Program: At Home With

BCI.” IEEE TRANSACTIONS ON NEURAL SYSTEMS AND REHABILITATION

ENGINEERING, VOL. 14, NO. 2, JUNE 2006


Wolpaw J. R., Birbaumer N., McFarland D., Pfurtscheller G., and Vaughan T., “Brain-

computer interfaces for communication and control,” Clinical Neurophysiology, vol. 113,

no. 6, pp. 767–791, 2002.

Wolpaw JR, McFarland DJ, Neat GW, Forneris CA. An EEG-based brain– computer

interface for cursor control. Electroenceph clin Neurophysiol 1991;78:252–259.

World Health Organization, “International Classification of Functioning, Disability, and

Health (ICF)”. ICF full version. Geneva, Switzerland:, 2001

