Tech Project

RIVAS - Real-time interactive visualization on panorama screens
Tags: VR, Visualization, UX/UI, Audio, Display, Immersion, Projection, Interaction
Real-time interactive visualization of artistic content on UHD resolution panorama screens and VR glasses
From Jan. 2, 2017 to Dec. 31, 2020
  • Interactive viewing of artwork in the TiME-Lab. Credits: Fraunhofer HHI
  • Tomorrow's Immersive Media Experience Lab – 180° ultra-high-resolution projection system and WFS spatial audio system. Credits: Fraunhofer HHI
  • Example of rendered artwork. Credits: J.H_D

Description of the challenges faced by the Tech Project

The challenges in this project result from the variety of supported platforms in combination with interactivity on the one hand, and from the aesthetic demands of the artists on the other. In particular, the following technical challenges arise:
  • Display of 360° content on various platforms (smartphones, tablets, VR glasses, 180° screens, 360° screens)
  • Display on different devices with different capabilities
  • Online rendering versus streaming
  • Touchless interaction in different environments (with VR glasses or in an immersive viewing room)
  • Haptic feedback
  • Generation of artistic 360° content in ultra-high resolution
  • Real-time versus non-real-time rendering
  • Joint visualization on and interaction between remote platforms (e.g. in Berlin and Poznan)
  • Joint visualization on VR glasses and on 180°/360° platforms, and interaction between them
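Several of these challenges come down to deciding which part of a 360° panorama a given device actually needs to receive. A minimal, purely illustrative sketch of that idea (the function and tiling scheme are assumptions for illustration, not part of the project's software):

```python
import math

def visible_tiles(yaw_deg, fov_deg, num_tiles):
    """Return the indices of the horizontal tiles of a 360° panorama that
    intersect a viewer's field of view (the core idea behind tile-based
    streaming). Tile i covers [i*360/num_tiles, (i+1)*360/num_tiles) degrees.
    """
    tile_width = 360.0 / num_tiles
    left = (yaw_deg - fov_deg / 2) % 360
    tiles = set()
    # Sample across the FOV at sub-tile spacing so no tile is skipped.
    steps = int(math.ceil(fov_deg / (tile_width / 2))) + 1
    for k in range(steps + 1):
        angle = (left + k * fov_deg / steps) % 360
        tiles.add(int(angle // tile_width) % num_tiles)
    return sorted(tiles)
```

For example, a viewer looking straight ahead (yaw 0°) with a 90° field of view on an 8-tile panorama needs only 3 of the 8 tiles, which is why viewport-dependent streaming saves so much bandwidth on smartphones and VR glasses compared to sending the full panorama.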

Brief description of technology

The objective of the research is the immersive display of 360° content, either on VR glasses or in an immersive viewing room such as our TiME-Lab. The Tomorrow's Immersive Media Experience (TiME) Lab at Fraunhofer Heinrich Hertz Institute consists of an ultra-high-resolution (2k x 7k) multi-projection system in 2D and 3D and a wave field synthesis audio system with 140 loudspeakers that allows the reproduction of 3D sound. The projection system comprises 7 HD projectors for 2D or 14 HD projectors for 3D images/video; in the 3D case, the Infitec system is used for view separation. 180° video panoramas can either be played back from files or rendered in real time by a local render engine. This video and audio installation provides a fully immersive viewing and hearing experience. When the system renders computer-generated content in real time, it can also be used interactively, allowing the user to interact via gestures or conventional input devices such as a keyboard or a 3D mouse. In addition, the same content can be streamed to VR glasses using different streaming technologies (e.g. tile-based streaming).

On the artistic side, it is planned to set up the subproject epiMimesis. EpiMimesis considers "All is Data", starting from the most elementary particles up to the most complex self-organizing systems. The main scientific inspiration for this subproject came from the latest research on the basic structure of life: RNA molecules from which, according to the RNA World hypothesis, the Big Web of Life evolved. RNA plays a key role in most cellular processes and in the translation of genetic information into proteins.
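The geometry of the TiME-Lab projection setup described above (7 HD projectors tiling a roughly 2k x 7k panorama over a 180° arc) can be sketched as a simple partitioning of pixel columns and viewing angles. This is an illustrative approximation only: real multi-projector systems also handle edge blending and geometric warping, which are omitted here, and the exact pixel counts are assumed round numbers.

```python
def projector_slices(pano_width_px, pano_height_px, num_projectors, arc_deg):
    """Split a cylindrical panorama into per-projector pixel columns and
    angular ranges (no blending/overlap regions in this sketch)."""
    slice_px = pano_width_px // num_projectors
    slice_deg = arc_deg / num_projectors
    slices = []
    for i in range(num_projectors):
        slices.append({
            "projector": i,
            "x_range": (i * slice_px, (i + 1) * slice_px),       # pixel columns
            "deg_range": (i * slice_deg, (i + 1) * slice_deg),   # viewing angles
            "resolution": (slice_px, pano_height_px),
        })
    return slices
```

With assumed values of 7168 x 2048 pixels over 180°, each of the 7 projectors would cover a 1024-pixel-wide column spanning about 25.7° of the arc, which is consistent with each projector driving roughly an HD-sized image.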
The examination of these basic molecular mechanisms not only affects our understanding of the evolution and eigen-dynamics of complex systems, but also provides the ability to design new molecular structures and functions of great importance for modern medicine and for the future well-being of society. All data and sets of big data can interact individually and with all environmental data as well. This subproject will therefore be based on an artistic interpretation of real scientific big data, aiming at an immersive artistic experience through virtual image, touch, and hearing of the processed data. The scientific big data will be provided by outstanding researchers from prominent European research institutions (LHC / CERN / Polish Academy of Sciences / IIMCB - International Institute of Molecular and Cell Biology / Adlershof HZB and others). The processing, visualization, hearing and feeling of the big data (3D, interactive VR, touchless interaction, AR, etc.) will be provided and supported by Fraunhofer HHI in cooperation with PSNC and their newest technologies. The artist, together with her team, will use and challenge these technologies in order to create the above-mentioned ambitious artwork. This will provide valuable feedback to the researchers in the "interactive TiME-Lab" project and help them develop new means of interactivity in immersive environments, while at the same time providing new means for distributing such content by enabling its visualization on widely used end devices such as smartphones, tablets and VR glasses.

What the project is looking to gain from the collaboration and what kind of artist would be suitable

The objective of the proposed project is twofold. On the one hand, new forms of interaction over large distances will be studied. To this end, it is planned to set up a high-speed link between the Poznan Supercomputing and Networking Center (PSNC) (http://www.man.poznan.pl/online/en/) and Fraunhofer HHI. PSNC has a 360° display system based on individual TV displays, so 360° content can be visualized simultaneously in both centers and novel interaction concepts over large distances can be investigated. On the other hand, the focus lies on the development and exploration of new types of content that can be viewed on the different platforms mentioned above and that will challenge the developed technologies. Immersive platforms (VR glasses and 180°/360° viewing rooms) have been developed to give the user an immersive viewing and hearing experience. Currently, most content is still 360° video, either from live cameras or from postproduction; however, interactive viewing of artistic content is regarded as an attractive application for this kind of immersive platform. The needs of artists with respect to portability of content and interaction possibilities have not yet been studied well enough, and therefore the technical requirements for visualizing artistic content interactively are not yet fully understood. In addition, the artistic content has to be made available at very high quality and resolution in order to fulfill the requirement for multi-platform support. It is expected that the planned cooperation with the external artist will help deliver such attractive content, enable new immersive and interactive experiences, and thus make the developed technologies attractive for larger communities.
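Joint visualization between two remote sites requires, at minimum, exchanging small interaction-state updates (e.g. the current viewing direction) over the link. A minimal sketch of such an exchange, with a latest-sequence-wins merge; the message fields and merge policy here are assumptions for illustration, not an existing protocol of the project:

```python
import json
import time

def make_state_msg(site, yaw, pitch, seq):
    """Encode one interaction update (viewer orientation) as a JSON message
    for exchange between remote sites (e.g. Berlin and Poznan).
    Field names are illustrative only."""
    return json.dumps({"site": site, "yaw": yaw, "pitch": pitch,
                       "seq": seq, "t": time.time()})

def merge_state(local, msg):
    """Apply a remote update to local state, keeping only updates with a
    newer sequence number (stale or reordered messages are ignored)."""
    update = json.loads(msg)
    if update["seq"] > local.get("seq", -1):
        local.update(update)
    return local
```

A sequence-number check like this is one simple way to keep both sites consistent when messages arrive out of order over a long-distance link; a real deployment would additionally need transport, clock handling, and conflict resolution between simultaneous users.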

Resources available to the artist

The artist will get access to the complete TiME-Lab and VR technologies. This means that the artist can either write her own application software or get support from local staff to display her interactive content. A wideband Internet connection will be made available in order to create a distributed immersive viewing experience, e.g. together with PSNC in Poznan. Travel expenses of up to €1,000 will be made available in order to coordinate the work between Poznan and Berlin.