Learning Digital Humans in Motion

The project aims to enhance immersive telepresence by using natural language to reconstruct and animate photo-realistic digital humans for interactive communication in AR and VR environments.

Grant
€ 1.500.000
2025

Project details

Introduction

We are taking part in a revolution in how people perceive each other and how they communicate in digital space. Emerging technologies demonstrate how humans can be digitized so that their digital doubles (digital humans) can be viewed in mixed- or virtual-reality applications.

Importance of 3D Digital Humans

These 3D digital humans are of particular interest for immersive telepresence, since people can see each other in augmented reality (AR) or virtual reality (VR) and interact with each other. An immersive experience requires:

  • A high-quality level of appearance reproduction
  • The estimation of subtle motions of the human

Hardware Considerations

Besides these quality aspects, digital humans need to be reconstructed and driven with consumer-grade hardware (e.g., a webcam), as sketched in the example after the list below, to make them widely applicable for:

  1. Immersive telecommunication
  2. Virtual mirrors (e-commerce)
  3. Other entertainment purposes (e.g., computer games)
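
Purely as an illustration of this commodity-hardware setting (not part of the proposal itself), the following Python sketch grabs frames from a consumer webcam with OpenCV; the device index, frame limit, and the placeholder processing step are assumptions.

```python
# Minimal sketch of commodity-hardware capture: read frames from a consumer
# webcam with OpenCV. Real reconstruction/animation of a digital human would
# happen downstream and is only hinted at by the placeholder print.
import cv2

def capture_frames(device_index: int = 0, max_frames: int = 100):
    """Yield BGR frames from a webcam (device index 0 is an assumption)."""
    cap = cv2.VideoCapture(device_index)
    try:
        for _ in range(max_frames):
            ok, frame = cap.read()
            if not ok:
                break
            yield frame
    finally:
        cap.release()

if __name__ == "__main__":
    for i, frame in enumerate(capture_frames()):
        # A real pipeline would estimate pose and appearance here.
        print(f"frame {i}: {frame.shape[1]}x{frame.shape[0]} pixels")
```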

Research Focus

In this proposal, we concentrate on the computer vision and graphics aspects of capturing and rendering (realizing) motions and appearances of photo-realistic digital humans reconstructed with commodity hardware and ask the question:

Can we leverage natural language for the reconstruction, representation, and modelling of the appearance, motion, and interactions of digital humans?
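
As one hypothetical illustration of what leveraging natural language could look like (this is not the project's method), the sketch below embeds a prompt with a pretrained CLIP text encoder from the Hugging Face `transformers` library and feeds the embedding to a small, untrained motion decoder; the joint count, sequence length, and decoder architecture are assumptions.

```python
# Hypothetical text-conditioned motion decoder (illustration only).
import torch
import torch.nn as nn
from transformers import CLIPTokenizer, CLIPTextModel

NUM_JOINTS, NUM_FRAMES = 25, 30  # assumed upper-body rig size and clip length

tokenizer = CLIPTokenizer.from_pretrained("openai/clip-vit-base-patch32")
text_encoder = CLIPTextModel.from_pretrained("openai/clip-vit-base-patch32")

class TextToMotion(nn.Module):
    """Maps a text embedding to a short sequence of per-joint rotations."""
    def __init__(self, text_dim: int = 512):
        super().__init__()
        self.decoder = nn.Sequential(
            nn.Linear(text_dim, 1024), nn.ReLU(),
            nn.Linear(1024, NUM_FRAMES * NUM_JOINTS * 3),  # axis-angle per joint/frame
        )

    def forward(self, text_embedding: torch.Tensor) -> torch.Tensor:
        motion = self.decoder(text_embedding)
        return motion.view(-1, NUM_FRAMES, NUM_JOINTS, 3)

prompt = ["a person explains something, gesturing with both hands"]
tokens = tokenizer(prompt, return_tensors="pt", padding=True)
with torch.no_grad():
    text_emb = text_encoder(**tokens).pooler_output  # (1, 512)
    poses = TextToMotion()(text_emb)                 # (1, 30, 25, 3), untrained output
print(poses.shape)
```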

Motion and Interaction Analysis

As the face and gestures play an important role during conversations, we focus our research on the human upper body, aiming to capture and synthesize complete, contact-rich digital humans with life-like motions.

We will analyze how humans move and develop data-driven motion synthesis methods that learn to generate talking and listening motion, including interactions of the hand with the face.
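
As a minimal, hypothetical illustration of one such analysis step (not a method from the proposal), the sketch below flags frames in which a hand comes close to the face given 3D keypoints; the joint sets, the synthetic data, and the 2 cm contact threshold are assumptions.

```python
# Flag hand-face contact frames from 3D joint positions (synthetic data).
import numpy as np

def hand_face_contact(hand_xyz: np.ndarray, face_xyz: np.ndarray,
                      threshold_m: float = 0.02) -> np.ndarray:
    """Return a boolean array over frames: True where any hand joint comes
    within `threshold_m` of any face landmark.
    hand_xyz: (frames, hand_joints, 3); face_xyz: (frames, face_points, 3)."""
    diff = hand_xyz[:, :, None, :] - face_xyz[:, None, :, :]
    dists = np.linalg.norm(diff, axis=-1)        # (frames, hand_joints, face_points)
    return dists.min(axis=(1, 2)) < threshold_m

# Synthetic example: 100 frames, 21 hand joints, 68 face landmarks.
rng = np.random.default_rng(0)
hand = rng.normal(size=(100, 21, 3)) * 0.3
face = rng.normal(size=(100, 68, 3)) * 0.05 + np.array([0.0, 1.6, 0.0])
print(hand_face_contact(hand, face).sum(), "contact frames")
```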

Financial details & Timeline

Financial details

Grant amount: € 1.500.000
Total project budget: € 1.500.000

Timeline

Start date: 1-3-2025
End date: 28-2-2030
Grant year: 2025

Partners & Locations

Project partners

  • TECHNISCHE UNIVERSITAT DARMSTADT (lead partner)

Country(ies)

Germany

Similar projects within the European Research Council

ERC Consolid...

Learning to synthesize interactive 3D models

This project aims to automate the generation of interactive 3D models using deep learning to enhance virtual environments and applications in animation, robotics, and digital entertainment.

€ 2.000.000
ERC Starting...

SpatioTemporal Reconstruction of Interacting People for pErceiving Systems

The project aims to develop robust methods for inferring Human-Object Interactions from natural images/videos, enhancing intelligent systems to assist people in task completion.

€ 1.500.000
ERC Consolid...

Learning to Create Virtual Worlds

This project aims to develop advanced machine learning techniques for automatic generation of high-fidelity 3D content, enhancing immersive experiences across various applications.

€ 2.750.000
ERC Proof of...

A Robust, Real-time, and 3D Human Motion Capture System through Multi-Cameras and AI

Real-Move aims to develop a marker-less, real-time 3D human motion tracking system using multi-camera views and AI to enhance workplace safety and ergonomics, reducing costs and improving quality of life.

€ 150.000
ERC Consolid...

Computational Design of Multimodal Tactile Feedback within Immersive Environments

ADVHANDTURE aims to enhance virtual reality by developing innovative computational models for realistic multimodal tactile feedback, improving 3D interaction and user immersion.

€ 1.999.750

Similar projects from other schemes

Mkb-innovati...

Collaborating in a fully immersive VR environment

Four innovative organisations from the southern Netherlands are developing a multiplayer virtual reality system with natural interaction to improve applications in various sectors.

€ 197.860
Mkb-innovati...

Blended Learning for Dementia

Tinqwise and TMVRS are developing an innovative blended learning platform for mental healthcare, focused on social-cognitive skills through interactive VR experiences with realistic facial animations.

€ 185.360
Mkb-innovati...

Human-size holographic simulation AR display

The project investigates the feasibility of an affordable holographic AR display and accompanying 3D content creation software.

€ 20.000
Mkb-innovati...

2D to 3D mapping for autonomous real-time feedback (MARF)

360SportsIntelligence and Ludimos are developing 3D-mapping software for tracking athletes and generating digital twins, aimed at automated training and match registration.

€ 256.200
Mkb-innovati...

HoloCloud MR - the future of healthcare

HoloMoves is developing a Mixed Reality system with IoT integration to accelerate rehabilitation and give care providers insight into patient movements, with the goal of improved remote care.

€ 20.000