I am passionate about robot perception, localization, and autonomous navigation for unmanned vehicles. My research focuses on:
- SLAM & State Estimation – Visual-inertial and LiDAR-based localization.
- Learning-Enabled Autonomous Systems – AI-driven autonomy and decision-making.
- Sensor Fusion – Combining IMU, GNSS, cameras, and depth sensors.
- Embedded AI for Robotics – Optimizing computer vision models for resource-constrained platforms.
- Uncertainty Estimation & Safety – Ensuring robustness in dynamic environments.
- Robot Perception & Localization
- ROS 2 Development
- Machine Learning & Computer Vision for Robotics
- Photogrammetry & Multi-Spectral Imaging
- AI in Agriculture
- Developing perception & localization pipelines for autonomous agricultural UGVs, focusing on sensor fusion, SLAM, and real-time navigation.
- Optimizing machine learning and computer vision models for resource-constrained robotics, enhancing efficiency and deployment feasibility.
- Sharing insights on robotics, AI, and computer vision through my Medium articles.
I am seeking PhD opportunities in:
- Robot Perception & SLAM
- AI for autonomous navigation
- Sensor fusion for safety-critical systems
- Autonomous agricultural robots
- Precision farming
- LinkedIn: Sagar Kumar
- Email: [email protected]
- X (Twitter): @sagarcadet
- Medium: @sagarcadet
Let’s collaborate! If you're working on robot perception, SLAM, or AI-driven autonomy, I’d love to discuss potential research opportunities.
