High-fidelity Collaborative Perception for Small Unmanned Aerial Vehicles
SkEyes aims to enhance UAV swarm autonomy and perception by integrating diverse sensors for effective navigation in challenging environments, enabling timely responses to critical situations.
Project details
Introduction
Over the past two decades, we have witnessed impressive advances in robotics. Among the most disruptive developments was the demonstration of small Unmanned Aerial Vehicles (UAVs) equipped with onboard cameras conducting autonomous, vision-based flights without reliance on GPS. This sparked booming interest in a plethora of use cases, such as:
- Automation of drone delivery
- Infrastructure inspection and maintenance
This led to the emergence of new algorithms, advanced sensors, and miniaturized, powerful chips, opening exciting opportunities for automating both single- and multi-UAV navigation.
Current Challenges
Current solutions, however, lack robustness and generality, struggling to perform outside highly controlled settings. Onboard perception remains the biggest impediment.
Having worked in this area for over a decade, we find it troubling that, despite dramatic progress, we still lack the technology to enable a UAV swarm to autonomously scan the seas for refugee dinghies or forest areas for wildfires, and to provide help in such dire situations.
Use-Case Complexity
While, in principle, the core technology is the same across all use cases, adverse conditions such as:
- Wind
- Smoke
- Degraded illumination
render the latter use cases extremely challenging, as they are time-critical and cannot be postponed until favorable conditions arise.
Proposed Solutions
Employing some of the most promising sensors currently available, such as lidar and advanced depth, thermal, event, and visual cameras, SkEyes proposes to address fundamental research questions on how to process and combine these heterogeneous sensing cues onboard a swarm of small UAVs.
Goals
The goal is to achieve joint spatial understanding and scene awareness for effective autonomy in highly dynamic, realistic scenarios. Engaging such eyes in the sky, the focus is on robust, collaborative perception that enables intelligent UAV swarm navigation, adaptable enough to complete a mission in challenging conditions at the push of a button.
Financial details & Timeline
Financial details
| Grant amount | € 2.346.219 |
| Total project budget | € 2.346.219 |
Timeline
| Start date | 1-10-2023 |
| End date | 30-9-2028 |
| Grant year | 2023 |
Partners & Locations
Project partners
- UNIVERSITY OF CYPRUS (lead partner)
Similar projects within the European Research Council
| Project | Scheme | Amount | Year |
|---|---|---|---|
| Real-Time Urban Mobility Management via Intelligent UAV-based Sensing: URANUS aims to enhance urban mobility management through real-time UAV-based traffic sensing, enabling intelligent monitoring and control of vehicular and pedestrian traffic. | ERC Consolidator | € 1.999.938 | 2023 |
| Dynamical Recurrent Visual Perceiver: The project aims to develop DRVis, an algorithm that enhances computer vision tasks using low-resolution frames from moving cameras, targeting applications in smart agriculture and drone navigation. | ERC Proof of Concept | € 150.000 | 2022 |
| Omni-Supervised Learning for Dynamic Scene Understanding: This project aims to enhance dynamic scene understanding in autonomous vehicles by developing innovative machine learning models and methods for open-world object recognition from unlabeled video data. | ERC Starting | € 1.500.000 | 2023 |
| Breaching the boundaries of safety and intelligence in autonomous systems with risk-based rationality: This project aims to develop a comprehensive risk-based autonomy framework for autonomous systems, enhancing safety and decision-making in marine environments through advanced modeling and human supervision. | ERC Advanced | € 2.499.773 | 2025 |
| Smart E-skins for Life-like Soft Robot Perception: SELECT aims to develop advanced electronic skins for soft robots to enhance sensory perception and improve human-robot interactions through innovative machine learning techniques. | ERC Starting | € 1.486.463 | 2024 |
Similar projects from other schemes
| Project | Scheme | Amount | Year |
|---|---|---|---|
| GPS-free, beyond the visual line of sight navigation for logistics drones in urban environments: Sightec's platform enables autonomous navigation and inspection for drones using vision-based algorithms, allowing complex tasks without GPS, demonstrated through successful delivery flights. | EIC Accelerator | € 2.456.377 | 2022 |
| Perception of Collaborative Robots: The project investigates the feasibility of techniques such as voice control and machine vision to make collaborative robots more environment-aware for use in high-mix, low-volume production. | Mkb-innovatiestimulering | € 20.000 | 2020 |
| Sparrow: Mapture.ai and BrainCreators are developing a privacy-safe drone solution for automatically surveilling large areas, in which suspicious situations are automatically recognized and filtered out. | Mkb-innovatiestimulering | € 135.030 | 2022 |
| On the edge AI-driven Autonomous Inspection Robots: Energy Robotics aims to develop an AI-driven software platform for mobile robots to autonomously perform inspections in dynamic industrial environments, enhancing operational efficiency across various sectors. | EIC Accelerator | € 2.499.999 | 2023 |
| Drone Cropview Greenhouse: The project is developing an autonomous drone that uses vision technology to collect cultivation data in greenhouses, in order to improve the efficiency and sustainability of greenhouse horticulture. | Mkb-innovatiestimulering | € 104.213 | 2018 |