FlyCamXR - Autonomous camera drones for interactive experiences in extended reality telepresence applications with enhanced naturalness
DAAD - Programmes for Project-Related Personal Exchange (PPP) from 2022 with India
Project start: 01.01.2023
Project duration: 24 months
Budget: €12,200
Abstract:
Immersive experiences, in which the user is fully absorbed in the medium, have been pursued in the past with several technologies: High Definition and Ultra High Definition (UHD) in the context of TV; Cinemascope, IMAX and Dolby Cinema in the context of cinema; and 360° video in the context of VR films. The emergence of VR/AR/MR devices, however, has created a completely new opportunity for delivering truly immersive experiences to end users. These experiences become interactive, enabling joint collaboration among people with virtually created objects or in virtual spaces.
FlyCamXR addresses the acquisition, virtualization and rendering of real environments and objects in order to enable natural collaboration in a shared space in XR telepresence scenarios. The project’s main goal is to enable natural joint communication among participants located in different places and to bring them together in a real, virtualized space. To accomplish this goal, the project defines the following specific research objectives:
- To derive the technology requirements from a deep understanding of user needs and expectations, of how users perceive naturalness, and of industry needs.
- To provide a realistic shared space that hosts the interactive experience and increases the naturalness of the communication. The objective is to virtualize a real environment as it is at the moment of the communication. To cope with real-time requirements, this includes improving offline 3D reconstruction of environments and exploiting affordable acquisition technologies to update the offline reconstructed environment in real time as the environment changes, in particular color-transfer approaches that propagate illumination changes (see the color-transfer sketch after this list). It further includes exploring affordable acquisition systems for indoor environments, exploiting drones/UAVs to acquire representations of outdoor and mobile environments, and developing autonomous navigation algorithms that take a social perspective into account. Finally, it includes developing artificial intelligence (AI) scene analysis and scene understanding technologies to detect and recognize objects and/or elements of the environment for information enrichment and later object placement in the rendering stage.
- To provide quality metrics for color-transfer results in realistic shared spaces. Since 3D reference models of the shared spaces do not exist in real scenarios, no-reference quality metrics for assessing the generated shared spaces need to be developed and validated (a simple ingredient of such a metric is sketched after this list).
- To demonstrate the proposed solution in different use cases in professional and private contexts, in order to gather evidence of its benefits in increasing the efficiency and naturalness of joint communication.
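The kind of color-transfer update referred to in the second objective can be illustrated with a minimal sketch based on Reinhard-style statistics matching in Lab space; the function name, the choice of OpenCV, and the use of a live capture as the color reference are assumptions made for illustration, not the project's actual pipeline.

```python
# Minimal sketch: Reinhard-style color transfer used to push the illumination
# of an offline-reconstructed texture towards a current live capture.
# (Illustrative only; names and library choice are assumptions.)
import cv2
import numpy as np

def transfer_illumination(stored_texture_bgr: np.ndarray,
                          live_capture_bgr: np.ndarray) -> np.ndarray:
    """Match the per-channel Lab mean/std of the stored texture to a live capture."""
    src = cv2.cvtColor(stored_texture_bgr, cv2.COLOR_BGR2LAB).astype(np.float32)
    ref = cv2.cvtColor(live_capture_bgr, cv2.COLOR_BGR2LAB).astype(np.float32)

    src_mean, src_std = src.mean(axis=(0, 1)), src.std(axis=(0, 1)) + 1e-6
    ref_mean, ref_std = ref.mean(axis=(0, 1)), ref.std(axis=(0, 1))

    # Shift and scale each Lab channel so the stored texture takes on the
    # color statistics (and hence the approximate illumination) of the capture.
    out = (src - src_mean) / src_std * ref_std + ref_mean
    out = np.clip(out, 0, 255).astype(np.uint8)
    return cv2.cvtColor(out, cv2.COLOR_LAB2BGR)
```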
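For the third objective, one simple ingredient that a no-reference assessment could draw on is the colorfulness measure of Hasler and Süsstrunk (2003); using it as a sanity check on color-transfer results is an assumption made here for illustration, and the validated metrics the project targets would go well beyond this.

```python
# Minimal sketch: Hasler & Süsstrunk colorfulness of an 8-bit BGR image,
# usable as one no-reference statistic on a color-transferred rendering.
# (Illustrative only; not the project's proposed metric.)
import numpy as np

def colorfulness(img_bgr: np.ndarray) -> float:
    """Higher values indicate a more colorful image."""
    b, g, r = [img_bgr[..., i].astype(np.float32) for i in range(3)]
    rg = r - g                      # red-green opponent channel
    yb = 0.5 * (r + g) - b          # yellow-blue opponent channel
    sigma = np.hypot(rg.std(), yb.std())
    mu = np.hypot(rg.mean(), yb.mean())
    return float(sigma + 0.3 * mu)
```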
Contact person: Prof. Dr. Sebastian Knorr
Contact details:
Tel.: +49 3641 205 739
E-Mail: sebastian.knorr@eah-jena.de
Web: Homepage