From perception to action: How CERTH enables autonomous robotics in ARISE

As part of the Horizon Europe ARISE project, CERTH is developing the perception and navigation systems that enable the robot to understand and move through complex environments. These systems give the robot the ability to interpret its surroundings, make informed decisions, and operate autonomously in challenging conditions. Whether working in large outdoor PV farms or inside structured hydroponic greenhouses, our goal is to help the ARISE robot navigate safely and support essential inspection and manipulation activities.
To achieve this, we develop algorithms that allow the robot to recognise key elements in its environment, including solar panels, bolts, cracks, lettuce plants, and nearby human workers, using semantic segmentation, object detection, and pose estimation. By fusing information from RGB and depth cameras, LiDAR, and GPS, the robot creates highly accurate 2D and 3D maps along with a dense 3D reconstruction of its surroundings. These representations combine geometric structure with semantic information, providing the robot with a clear and consistent understanding of its environment that it can rely on even in dynamic, unstructured conditions.
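To give a flavour of the kind of fusion involved, the sketch below back-projects a depth image into a 3D point cloud with a pinhole camera model and attaches per-pixel semantic labels. It is a minimal illustration only: the intrinsics (`fx`, `fy`, `cx`, `cy`) and the integer label values are made up for the example and do not reflect the project's actual calibration or class list.

```python
import numpy as np

def depth_to_labeled_cloud(depth, labels, fx, fy, cx, cy):
    """Back-project a depth image into a 3D point cloud (pinhole model)
    and carry each pixel's semantic label along with its 3D point."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    sem = labels.reshape(-1)
    valid = points[:, 2] > 0  # drop pixels with no depth return
    return points[valid], sem[valid]

# Toy 2x2 depth map with hypothetical intrinsics and labels
depth = np.array([[1.0, 2.0], [0.0, 4.0]])
labels = np.array([[1, 2], [0, 3]])  # e.g. 1 = "solar_panel", 2 = "person"
pts, sem = depth_to_labeled_cloud(depth, labels, fx=500.0, fy=500.0, cx=1.0, cy=1.0)
print(pts.shape, sem.tolist())
```

The same labelled points can then be accumulated across frames into the semantic 2D/3D maps described above, with the labels letting the mapping layer treat, say, a panel surface differently from a person.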
Based on this environmental awareness, we also develop the robot’s localisation and navigation modules. These include a multimodal SLAM system for tracking the robot’s position, even in GPS-denied areas, and path planners that incorporate semantic cues to choose routes that are safe and efficient. We also work on human-aware navigation and traversability analysis, allowing the robot to adjust its movement based on ground conditions, narrow passages, and the presence of people. These elements help the robot move confidently and react appropriately to its surroundings.
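A path planner that incorporates semantic cues can be sketched as a graph search over a grid whose per-cell traversal costs come from semantic classes, so that routes through risky regions (e.g. near a worker) are penalised and obstacle cells are excluded outright. The A* implementation below is a minimal sketch under assumed costs; the grid, class-to-cost mapping, and 4-connected layout are illustrative, not the actual ARISE planner.

```python
import heapq

def semantic_astar(cost_map, start, goal):
    """A* over a 4-connected grid; each cell's cost reflects its semantic
    class (cheap clear ground, expensive near-person cells, impassable
    obstacles). Uses a Manhattan-distance heuristic."""
    rows, cols = len(cost_map), len(cost_map[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])
    open_set = [(h(start), 0.0, start, [start])]
    seen = set()
    while open_set:
        _, g, cur, path = heapq.heappop(open_set)
        if cur == goal:
            return path
        if cur in seen:
            continue
        seen.add(cur)
        r, c = cur
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                step = cost_map[nr][nc]
                if step == float('inf'):  # obstacle class: never enter
                    continue
                ng = g + step
                heapq.heappush(open_set, (ng + h((nr, nc)), ng, (nr, nc), path + [(nr, nc)]))
    return None  # goal unreachable

INF = float('inf')
# Hypothetical semantic costs: 1 = clear ground, 5 = near a worker, INF = obstacle
grid = [
    [1, 1,   5, 1],
    [1, INF, 5, 1],
    [1, INF, 1, 1],
    [1, 1,   1, 1],
]
path = semantic_astar(grid, (0, 0), (0, 3))
print(path)
```

With these numbers the planner cuts across the high-cost cells only when the detour around them would be even more expensive, which is the trade-off a semantic cost map is meant to encode.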
All of our systems are deployed and tested on real robot platforms through laboratory trials and field testing in solar farms and hydroponic facilities. Evaluating the perception, mapping, and navigation components directly in real-world environments, with natural lighting changes, weather variations, and uneven terrain, ensures that the technologies we develop are reliable and ready for practical deployment. This hands-on evaluation also allows us to continuously refine the algorithms and adapt the system to the real challenges that robots encounter in complex operational environments.
