
ETIS-CELL Seminar – Sylvain Colomer

April 2, 2026 | 16h00 - 17h30

Speaker: Sylvain Colomer

Title: Visual perception across scales: from bio-inspired navigation to large-scale scene understanding

Abstract:

Visual perception, or computer vision, is a central component of intelligent and robotic systems. It plays a key role in a wide range of applications, including robotic navigation, remote sensing, scene understanding, medical imaging, and large-scale visual data analysis. Although visual data is rich in information, it is also highly complex: it is subject to challenges such as illumination variations, weather conditions, viewpoint and scale changes, domain shifts, and noise. Learning efficient and meaningful representations from such data is therefore a major challenge for intelligent systems, especially under constraints on performance, computational cost, and energy efficiency.

In this context, my research focuses on understanding how visual representations can be learned and exploited across different scales and application domains. In particular, I investigate how perception models can be designed to remain robust and adaptable when moving from local, embodied settings to large-scale visual understanding tasks. During my PhD, I developed bio-inspired approaches to visual navigation and localization, leveraging principles from neuroscience to design robust and adaptive models for decision-making in autonomous systems. These models aim to bridge perception and action through structured visual representations, with a particular emphasis on efficiency through simple, optimized monocular vision systems and lightweight computational models. Part of this work also focused on deploying these models on FPGA platforms, with the goal of designing efficient hardware implementations and exploring circuit architectures adapted to bio-inspired and neural computation.
More recently, my work has explored large-scale scene understanding in the context of remote sensing, using deep learning techniques for tasks such as instance segmentation and the analysis of hyperspectral aerial imagery, in particular for large-scale forest monitoring in Canada. This setting introduces new challenges related to scale, variability, and data complexity, and raises questions about deploying such systems directly on drones to enable efficient, adaptive forest monitoring. In this talk, I will highlight the connections between these research directions and show how they contribute to the design of more general, robust, and efficient visual perception systems. Finally, I will discuss perspectives on adaptive, energy-efficient, and deployable AI models capable of bridging embodied perception and large-scale visual understanding in real-world applications.

Bio: Sylvain Colomer is a researcher in computer vision and robotic navigation. His work focuses on designing robust and efficient visual models for real-world applications, from autonomous navigation to large-scale remote sensing. He obtained his PhD from CY Cergy Paris University, where he developed bio-inspired approaches for visual navigation on embedded systems. He then worked as a postdoctoral researcher at the University of Toronto on deep learning methods for hyperspectral image analysis and forest monitoring. His current research aims to bridge embodied perception and large-scale visual understanding through robust, efficient perception models that can operate in real-world conditions, from resource-constrained embedded systems to large-scale data analysis, with an emphasis on multimodality, generalization, and deployment.

Teams link: https://teams.microsoft.com/meet/37168326511389?p=dQSREX67PUox8rsHTo
Meeting ID: 371 683 265 113 89
Passcode: Ry6ZC3bU

Details

  • Date: April 2, 2026
  • Time: 16h00 - 17h30

Venue

  • Online

Organiser

  • Stéphane Zuckerman
  • Email: stephane.zuckerman@etis-lab.fr