Master's Thesis - Multi-modal traversability estimation for Autonomous Outdoor Navigation
Field of study: automation technology, electrical engineering, computer science, cybernetics, mechatronics, control engineering, software design, software engineering, technical computer science, or comparable.

In the Professional Service Robots - Outdoor research group, we develop autonomous mobile robots for a variety of outdoor applications such as agriculture, forestry, and logistics. The focus is on the development of an autonomous outdoor navigation solution as well as the robots' hardware.

For mobile robots operating in unstructured outdoor environments with unknown terrain conditions, an accurate representation of the environment is essential. For this purpose, data from multiple sensors must be interpreted and fused to reliably estimate the traversability of the surroundings. Terrain traversability can be evaluated by analyzing the geometry of elevation maps generated from LiDAR scans. This method alone, however, is inherently unable to capture semantic information such as surface properties, which could, for example, help prioritize paths along dirt roads over paths through dense vegetation. Semantic traversability scores can instead be extracted from the RGB images of a stereo camera using deep neural networks. However, relying solely on semantic traversability information makes the system sensitive to domain shifts and changing weather conditions.
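As a rough illustration of the geometric side of the problem, the sketch below scores each cell of an elevation map in [0, 1] from its local slope. The function name, cell size, and slope threshold are illustrative assumptions, not part of the thesis specification:

```python
import numpy as np

def geometric_traversability(elevation, cell_size=0.1, max_slope_deg=25.0):
    """Score each grid cell in [0, 1] from the local slope of an elevation map.

    elevation: 2D array of cell heights in metres (e.g. built from LiDAR scans).
    cell_size: grid resolution in metres (assumed value for illustration).
    max_slope_deg: slope at which a cell counts as untraversable (assumed).
    """
    # Gradient of the height field along both grid axes.
    dz_dy, dz_dx = np.gradient(elevation, cell_size)
    slope = np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))
    # Linear penalty: flat ground -> 1.0, terrain at/above max_slope_deg -> 0.0.
    return np.clip(1.0 - slope / max_slope_deg, 0.0, 1.0)

# Example: a flat patch next to a steep ramp.
z = np.zeros((4, 4))
z[:, 2:] = np.arange(2)[None, :] * 0.5  # 0.5 m rise over one 0.1 m cell
scores = geometric_traversability(z)
```

In practice this is only one of several geometric cues; step height and roughness are typically scored as well and combined per cell.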
Therefore, the objective of this thesis is to develop and test a real-time, tightly coupled traversability estimation algorithm that fuses information from both sensor modalities, thereby providing a more complete understanding of the environment as input for path planning.
Be part of change
In this thesis, you will design a traversability estimation algorithm that fuses geometric information derived from LiDAR elevation maps with semantic annotations inferred from RGB images. You will focus on fusing traversability scores generated by independent sensor modalities and evaluate the effectiveness of different fusion strategies, including early-stage and late-stage approaches.
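A late-stage fusion strategy, as mentioned above, combines traversability scores that each modality has already produced independently. The following minimal sketch shows one such rule; the confidence-based weighting and all names are assumptions for illustration, not the method to be developed:

```python
import numpy as np

def late_fusion(geometric, semantic, semantic_conf):
    """Late-stage fusion of per-cell traversability scores in [0, 1].

    geometric: scores from elevation-map geometry analysis.
    semantic: scores projected into the grid from image segmentation.
    semantic_conf: per-cell confidence in [0, 1] for the semantic scores,
        so the result can fall back to geometry under domain shift.
    The weighting rule below is an illustrative assumption: where the
    semantic channel is confident, take the conservative (minimum) score;
    where it is not, trust the geometry.
    """
    conservative = np.minimum(geometric, semantic)
    return (1.0 - semantic_conf) * geometric + semantic_conf * conservative

# Example: a geometrically flat cell flagged as vegetation (low semantic
# score, high confidence) and a rough cell with no usable semantics.
g = np.array([1.0, 0.5])
s = np.array([0.2, 1.0])
c = np.array([1.0, 0.0])
fused = late_fusion(g, s, c)
```

An early-stage approach would instead fuse raw or intermediate features (e.g. projecting image features into the elevation grid) before any score is computed, which is part of what the thesis compares.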
You will evaluate the accuracy and computational performance of your implementation in real-world scenarios, using both recorded data and live deployment on our mobile CURT robots to ensure real-time capability.
What you contribute
What we offer
We value and promote the diversity of our employees' skills and therefore welcome all applications, regardless of age, gender, nationality, ethnic and social origin, religion, ideology, disability, sexual orientation, and identity. Severely disabled persons are given preference in the event of equal suitability. Our tasks are diverse and adaptable: for applicants with disabilities, we work together to find solutions that make the best use of their abilities. The same applies if an applicant does not meet all the profile requirements due to a disability.
With its focus on developing key technologies that are vital for the future and enabling the commercial utilization of this work by business and industry, Fraunhofer plays a central role in the innovation process. As a pioneer and catalyst for groundbreaking developments and scientific excellence, Fraunhofer helps shape society now and in the future.
Ready for a change? Then apply now and make a difference! Once we have received your online application, you will receive an automatic confirmation of receipt. We will then get back to you as soon as possible and let you know what happens next.
Ms. Jennifer Leppich Recruiting +49 711 970-1415 jennifer.leppich@ipa.fraunhofer.de
Fraunhofer Institute for Manufacturing Engineering and Automation IPA
Requisition Number: 82321 Application Deadline:
Job Segment: Test Engineer, Manufacturing Engineer, Computer Science, Electrical Engineering, Bilingual, Engineering, Technology