High-fidelity Collaborative Perception for Small Unmanned Aerial Vehicles

SkEyes aims to enhance UAV swarm autonomy and perception by integrating diverse sensors for effective navigation in challenging environments, enabling timely responses to critical situations.

Grant
€ 2.346.219
2023

Project details

Introduction

Over the past two decades, we have witnessed impressive advancements in robotics. Among the most disruptive developments was the demonstration of small Unmanned Aerial Vehicles (UAVs) equipped with onboard cameras conducting autonomous, vision-based flights without reliance on GPS. This sparked booming interest in a plethora of use-cases, such as:

  • Automation of drone delivery
  • Infrastructure inspection and maintenance

This led to the emergence of new algorithms, advanced sensors, as well as miniaturized, powerful chips, opening exciting opportunities for automating single- as well as multi-UAV navigation.

Current Challenges

Current solutions, however, greatly lack robustness and generality, struggling to perform outside highly controlled settings. Onboard perception constitutes the biggest impediment.

Having worked in this area for over a decade, we find it troubling that, despite dramatic progress, we still lack the technology to enable a UAV swarm to autonomously scan the seas for refugee dinghies or forest areas for wildfires, and to provide help in such dire situations.

Use-Case Complexity

While, in principle, the core technology is the same across all use-cases, adverse conditions such as:

  1. Wind
  2. Smoke
  3. Degraded illumination

render the latter use-cases extremely challenging, as they are time-critical and cannot be postponed until favorable conditions arise.

Proposed Solutions

Employing some of the most promising current sensors, such as lidar and advanced depth, thermal, event, and visual cameras, SkEyes proposes to address fundamental research questions to understand how to process and combine these heterogeneous sensing cues onboard a swarm of small UAVs.
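Combining such heterogeneous sensors is complicated by the fact that each stream samples at its own, unsynchronized rate. As an illustration only (the sensor names, rates, and helper functions below are hypothetical assumptions, not taken from the project), a minimal first step in fusing two streams is nearest-timestamp alignment, sketched here in Python:

```python
from bisect import bisect_left

def nearest_measurement(timestamps, values, query_t):
    """Return the value whose timestamp is closest to query_t.

    timestamps must be sorted ascending. This mimics the common first
    step of fusing sensor streams that sample at different rates.
    """
    i = bisect_left(timestamps, query_t)
    if i == 0:
        return values[0]
    if i == len(timestamps):
        return values[-1]
    before, after = timestamps[i - 1], timestamps[i]
    # Pick whichever neighbor is temporally closer to the query.
    return values[i] if after - query_t < query_t - before else values[i - 1]

def align_streams(reference_ts, stream):
    """For each reference timestamp (e.g. a fast sensor), pick the
    temporally nearest measurement from a slower stream."""
    ts, vals = zip(*stream)
    return [nearest_measurement(list(ts), list(vals), t) for t in reference_ts]

# Hypothetical example: a fast stream at 10 Hz, a slow stream at ~3 Hz.
fast_ts = [0.0, 0.1, 0.2, 0.3]
slow_stream = [(0.02, "T0"), (0.31, "T1")]
print(align_streams(fast_ts, slow_stream))  # ['T0', 'T0', 'T1', 'T1']
```

Real systems refine this with interpolation, motion compensation, and extrinsic calibration between sensors, but temporal association of this kind underlies most multi-sensor fusion pipelines.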

Goals

The goal is to achieve joint spatial understanding and scene awareness for effective autonomy in highly dynamic, realistic scenarios. Engaging such eyes in the sky, the focus is on robust, collaborative perception that enables intelligent UAV-swarm navigation, adaptable enough to complete a mission in challenging conditions at the push of a button.

Financial Details & Timeline

Financial details

Grant amount: € 2.346.219
Total project budget: € 2.346.219

Timeline

Start date: 1-10-2023
End date: 30-9-2028
Grant year: 2023

Partners & Locations

Project partners

  • UNIVERSITY OF CYPRUS (lead partner)

Country(ies)

Cyprus

Similar projects within the European Research Council

ERC Consolidator Grant

Real-Time Urban Mobility Management via Intelligent UAV-based Sensing

URANUS aims to enhance urban mobility management through real-time UAV-based traffic sensing, enabling intelligent monitoring and control of vehicular and pedestrian traffic.

€ 1.999.938
ERC Proof of Concept

Dynamical Recurrent Visual Perceiver

The project aims to develop DRVis, an algorithm that enhances computer vision tasks using low-resolution frames from moving cameras, targeting applications in smart agriculture and drone navigation.

€ 150.000
ERC Starting Grant

Omni-Supervised Learning for Dynamic Scene Understanding

This project aims to enhance dynamic scene understanding in autonomous vehicles by developing innovative machine learning models and methods for open-world object recognition from unlabeled video data.

€ 1.500.000
ERC Advanced Grant

Breaching the boundaries of safety and intelligence in autonomous systems with risk-based rationality

This project aims to develop a comprehensive risk-based autonomy framework for autonomous systems, enhancing safety and decision-making in marine environments through advanced modeling and human supervision.

€ 2.499.773
ERC Starting Grant

Smart E-skins for Life-like Soft Robot Perception

SELECT aims to develop advanced electronic skins for soft robots to enhance sensory perception and improve human-robot interactions through innovative machine learning techniques.

€ 1.486.463

Similar projects from other funding schemes

EIC Accelerator

GPS-free, beyond the visual line of sight navigation for logistics drones in urban environments

Sightec's platform enables autonomous navigation and inspection for drones using vision-based algorithms, allowing complex tasks without GPS, demonstrated through successful delivery flights.

€ 2.456.377
Mkb-innovatiestimulering (MIT)

Perception of Collaborative Robots

The project investigates the feasibility of techniques such as voice control and machine vision to make collaborative robots more aware of their surroundings for use in high-mix, low-volume production.

€ 20.000
Mkb-innovatiestimulering (MIT)

Sparrow

Mapture.ai and BrainCreators are developing a privacy-safe drone solution for automatically surveilling large areas, in which suspicious situations are automatically recognized and filtered.

€ 135.030
EIC Accelerator

On the edge AI-driven Autonomous Inspection Robots

Energy Robotics aims to develop an AI-driven software platform for mobile robots to autonomously perform inspections in dynamic industrial environments, enhancing operational efficiency across various sectors.

€ 2.499.999
Mkb-innovatiestimulering (MIT)

Drone Cropview Greenhouse

The project develops an autonomous drone that collects cultivation data in greenhouses using vision technology, in order to improve the efficiency and sustainability of greenhouse horticulture.

€ 104.213