Description of the challenges faced by the Tech Project
The project's technology, combining immersive video rendering, spatial audio rendering, large-space tracking, and advanced interaction, provides strong and challenging support for an artistic production aiming at a live performance. So far, we have offered public audiences experiences in which the visitor was immersed in interactive sound environments, either to produce sound feedback (http://people.irisa.fr/Ronan.Gaugne/factorymusic/) or to explore a musical environment (http://people.irisa.fr/Ronan.Gaugne/evoluson/). We now target a live-music project, as it is very challenging for our platform in terms of combining technologies (sound, video, and tracking) and opens new possibilities in terms of combining experiences (hearing, seeing, moving). It also raises VR challenges in interaction, rendering latency, and multimodal rendering.
Brief description of technology
Immersia is a large immersive virtual reality research facility of Inria/IRISA in Rennes, France. The platform, with full visual and sound immersion, is dedicated to real-time, multimodal (vision, sound, haptic, BCI) immersive interaction. The main immersive space has four screens: a front screen, two side screens, and a ground screen. The immersive structure is 9.6 m wide, 2.9 m deep, and 3.1 m high. Visual rendering is provided by 14 Barco F90-4K13 projectors driven by 7 PCs with 2×7 Nvidia Quadro P6000 GPUs, projected through glass screens for the front and side screens and onto the acrylic ground. An OptiTrack localization system of 16 cameras locates real objects within the whole environment and enables full-body tracking. Sound rendering is provided by a Yamaha processor linked either to Genelec speakers in a 10.2 format or to audio headsets. In addition, two Virtuose 6D35-45 haptic devices can add haptic feedback to experiments. The rendering software is mainly Unity and MiddleVR, but other rendering software such as OpenSceneGraph, Covise, Collaviz, OpenSG, and OpenMASK has also been successfully installed and used in Immersia.
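As an illustration of how tracking data can feed sound rendering on such a platform, the sketch below maps a tracked listener position to per-speaker gains using a simple inverse-distance attenuation law. This is a minimal Python sketch under assumed speaker coordinates and an assumed attenuation model, not the platform's actual audio pipeline (which runs on the Yamaha processor and tools such as Max 8).

```python
import math

# Hypothetical speaker positions (x, y, z) in metres; the real Immersia
# 10.2 layout is not specified here.
SPEAKERS = [(4.8, 0.0, 1.5), (-4.8, 0.0, 1.5)]

def distance_gain(listener, source, ref_dist=1.0, min_dist=0.1):
    """Inverse-distance attenuation: gain = ref_dist / max(d, min_dist).

    min_dist clamps the distance so the gain stays finite when the
    tracked listener walks right up to a speaker.
    """
    d = math.dist(listener, source)
    return ref_dist / max(d, min_dist)

def speaker_gains(listener, speakers=SPEAKERS):
    """Per-speaker gains for a tracked listener position, normalized so
    the nearest speaker plays at gain 1.0."""
    gains = [distance_gain(listener, s) for s in speakers]
    peak = max(gains)
    return [g / peak for g in gains]

# Example: a listener tracked at 2.4 m from the first speaker and
# 7.2 m from the second gets gains [1.0, 1/3].
print(speaker_gains((2.4, 0.0, 1.5)))
```

In a real deployment the listener position would come from the OptiTrack stream each frame and the gains would be sent to the audio processor; the attenuation law (inverse distance, inverse square, or a custom rolloff) is a design choice the artist and engineers would tune by ear.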
What the project is looking to gain from the collaboration and what kind of artist would be suitable
Music makes it possible to imagine new devices, either by augmenting existing musical instruments or by designing new ones. Combining them with immersive video rendering and full-body motion provides opportunities to enrich both the audience's experience and the artists' performances. We look forward to hosting a sound artist interested in integrating motion and the digital world into his or her work. The project is an opportunity to develop sound rendering, which is seldom integrated into VR projects and even less often directly associated with interaction. Furthermore, the project allows us to organize public events that give visibility to a platform dedicated to research, and more generally to VR.
Resources available to the artist
During the residency, the artist will have access to the Immersia platform, which comprises: a large experiment room (212 m²) housing the immersive structure; a project room (20-25 m²) with a smaller immersive structure (1.9 m × 2 m × 2.6 m); a platform team of four engineers; two offices able to host seven persons; and additional VR equipment such as head-mounted displays, graphics computers, a motion capture system, and computer and software resources (Unity 3D, Max 8, MiddleVR, Dante Via, Dante Virtual Soundcard, DME Designer…). Our institution is willing to host an artist for up to 9 months within the period from 1/1/2019 to 31/12/2020. This long period will allow the artist to learn the technologies involved in the platform and to coordinate different skills (video, motion capture, and scenography).