· Design and integrate complex mechanical, electrical, and software components for autonomous robotic systems.
· Develop and implement advanced algorithms for navigation, obstacle avoidance, and coordinated mission execution.
· Create multi-modal perception systems using sensor fusion (e.g., LiDAR, cameras, IMUs) for environmental mapping and object recognition.
· Integrate flight controllers and autopilot systems (e.g., PX4) into system architectures to ensure seamless communication and control.
· Conduct system-wide debugging, integration testing, and performance tuning in both simulation (e.g., AirSim, Gazebo SITL with PX4) and real-world environments.
· Develop methodologies for fault detection, redundancy, and failure recovery to enhance system reliability.
· Optimize overall system performance and energy efficiency for extended operations under dynamic conditions.
· Collaborate with interdisciplinary teams (AI researchers, control engineers, hardware designers) to ensure cohesive end-to-end system functionality.
· Prototype, test, and iterate on novel autonomous capabilities in simulation and field environments.
Requirements
Required Qualifications:
· Master’s or PhD in Robotics, Mechanical Engineering, AI, or a closely related field.
· 3+ years of hands-on experience in autonomous systems development or equivalent R&D experience (PhD candidates with strong research records are encouraged to apply).
· Proficiency in C++ and Python; experience with ROS is a plus.
· Strong background in sensor fusion, SLAM, and multi-agent coordination.
· Demonstrated experience with flight controllers or autopilot systems (e.g., PX4) in robotic platforms is highly desirable.
Preferred Qualifications:
· Postdoctoral research experience in robotics, autonomous systems, or related fields.
· Experience with AI-driven decision-making and learning-based autonomy.
· Proficiency in simulation platforms (e.g., AirSim, Gazebo SITL with PX4, CoppeliaSim) and rapid prototyping.
· A strong publication record in robotics, AI, or autonomous systems research.