Two performers sit in front of a large projection screen where computer-generated events are projected in stereo.
A console, displayed on an LCD monitor in front of the performers, allows them to control the happening.
Three electrodes attached to the performers' heads transmit the brain's electric signals to an electroencephalograph, which analyses them and passes them on to a computer for further processing.
The EEG signals serve different purposes at different points in the performance.
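One common way to turn an EEG stream into control data is to measure power in the classical frequency bands. The sketch below is an illustrative assumption, not the performance's documented method: it extracts alpha- and beta-band power from a signal with a plain FFT, the kind of value that could then be mapped onto a sound or visual parameter.

```python
import numpy as np

def band_power(signal, fs, low, high):
    """Power of `signal` (1-D array sampled at `fs` Hz) in the [low, high] Hz band."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    mask = (freqs >= low) & (freqs <= high)
    return spectrum[mask].sum() / len(signal)

# Synthetic one-second EEG-like trace: a 10 Hz alpha rhythm plus noise
# (stands in for the amplified electrode signal).
fs = 256
t = np.arange(fs) / fs
eeg = np.sin(2 * np.pi * 10 * t) + 0.1 * np.random.randn(fs)

alpha = band_power(eeg, fs, 8, 13)    # alpha band (8-13 Hz)
beta = band_power(eeg, fs, 13, 30)    # beta band (13-30 Hz)
control = alpha / (alpha + beta)      # hypothetical control parameter in [0, 1]
```

The band boundaries and the alpha/beta ratio mapping are placeholders; any real-time version would also need windowing and artifact rejection.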
An eye-movement tracking system enables the performers to control visual entities in Virtual Reality in real time with their eye movements.
A digital camera is aimed at the operator’s eye. The operator wears a black band on his forehead onto which a white reference point is placed. The reference point allows the computer to calculate the distance and position of the pupil in relation to it. This way the computer knows where the operator is looking.
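The geometry described above reduces to a simple computation: take the pupil's position in the camera image, subtract the reference marker's position, and map the resulting offset onto screen coordinates. The sketch below assumes this linear mapping; the function names, scale factors, and pixel values are hypothetical, not taken from the actual system.

```python
def gaze_offset(pupil, reference):
    """Pupil displacement relative to the forehead reference point (pixels)."""
    return (pupil[0] - reference[0], pupil[1] - reference[1])

def map_to_screen(offset, scale_x, scale_y, screen_w, screen_h):
    """Linear map from pupil offset to a screen coordinate, clamped to the screen."""
    x = min(max(screen_w / 2 + offset[0] * scale_x, 0), screen_w)
    y = min(max(screen_h / 2 + offset[1] * scale_y, 0), screen_h)
    return (x, y)

# Pupil centre and reference marker as located in the camera image (pixels).
offset = gaze_offset((310, 220), (300, 200))        # pupil is 10 px right, 20 px down
cursor = map_to_screen(offset, 40, 40, 1920, 1080)  # gaze point on a 1920x1080 screen
```

A working system would first calibrate `scale_x`/`scale_y` by having the operator fixate known screen points; the marker subtraction is what makes the estimate tolerant of small head movements.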
The actions that the operators perform on the console are projected onto a lateral screen so that the audience can follow the event.
Sonic data generated by the main computer are sent to an amplifier and diffused into the real space through four loudspeakers.
The performance lasts for about 30 minutes.