This joint research project is developing new techniques and strategies for computer-assisted composition under real-time user control with non-standard human interface devices, for applications in electronic art and digital entertainment systems. The research team is designing and implementing real-time software, hardware and specialized human interfaces that will provide tools and resources for music, dance, theatre and installation artists, as well as for interactive kiosks, computer games, and internet/web information systems.

The outcome of the project will be a modular toolbox for real-time dynamic music generation that allows easy creation of software applications for the purposes described above. The toolbox will be highly flexible, supporting both trained musicians and the general public. Simply by patching together the desired music-generation modules, users can map musical parameters to gesture-driven interfaces and kinetic controllers, giving them an intuitive way to control and interact with the music.
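To make the patching idea concrete, here is a minimal sketch of how such a toolbox might be structured: small generator modules expose named parameters, and a mapper object binds a normalized controller reading (e.g. from a gesture sensor) to one of those parameters. All class names, parameter names, and value ranges below are illustrative assumptions, not part of the actual toolbox design.

```python
class Module:
    """Base class for a patchable music-generation module (hypothetical)."""
    def __init__(self):
        self.params = {}

    def set_param(self, name, value):
        self.params[name] = value


class ArpeggioGenerator(Module):
    """Toy generator: emits stacked major thirds above a root pitch.

    'root' (MIDI note number) and 'span' (number of notes) are its
    controllable parameters.
    """
    def __init__(self, root=60, span=3):
        super().__init__()
        self.params = {"root": root, "span": span}

    def tick(self):
        root, span = self.params["root"], self.params["span"]
        return [root + 4 * i for i in range(span)]


class GestureMapper:
    """Binds a normalized controller value (0.0-1.0) to a module parameter."""
    def __init__(self, module, param, lo, hi):
        self.module, self.param, self.lo, self.hi = module, param, lo, hi

    def update(self, value):
        # Scale the sensor reading into the parameter's range.
        scaled = int(self.lo + value * (self.hi - self.lo))
        self.module.set_param(self.param, scaled)


# Patch: a hand-height sensor reading drives the arpeggio's root pitch.
arp = ArpeggioGenerator()
mapper = GestureMapper(arp, "root", lo=48, hi=72)

mapper.update(0.5)   # mid-height gesture
print(arp.tick())    # → [60, 64, 68]
```

In a real-time system the `update` call would be fed continuously from the sensor stream, and `tick` would be driven by an audio or sequencer clock; the point of the sketch is only the separation between generation modules and the controller mappings patched onto them.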

Casa da Música and YDreams are pivotal partners, helping to keep the project focused on its overall goal: a software toolbox for real-time control and generation of music, usable by a broad range of users in applications meant to be engaging, entertaining and stimulating. The applications to be developed will be aimed at both (1) highly specialized users, targeting standard professional quality for use in products such as online/offline interactive marketing, computer-assisted performance and accompaniment, interactive installations, and computer games; and (2) non-specialized users, including people with disabilities, children and the elderly, for use in sound-based games, interactive music creation and cognitive sound stimulation.