Examples

Visualize real-world robotic data using Foxglove.

Click on an example multimodal dataset below to visualize it directly in Foxglove.

Argoverse 2 autonomous vehicle dataset

Autonomous Vehicle

Argoverse 2 is a large-scale, open-source dataset designed to advance autonomous vehicle perception, prediction, and mapping research.

Unitree humanoid retargeting dataset

Humanoid

The Unitree LAFAN1 Retargeting Dataset uses numerical optimization based on an inverse kinematics approach, ensuring that the retargeted motions respect end-effector pose constraints and joint position/velocity limits, which minimizes issues like foot slippage.

Autonomous surface vehicle marine dataset

Autonomous Surface Vehicle

The MIT Sea Grant AUV Lab Prodromos marine perception dataset includes recordings from a surface vessel navigating diverse maritime scenarios such as sailboat and kayak traffic, commercial shipping vessels, and bridges. A key feature of this dataset is the integration of AIS data with radar imagery, enabling vessel identification and tracking in real-world conditions.

Wayve autonomous driving dataset

Autonomous Vehicle

WayveScenes101 is a groundbreaking dataset tailored to propel advancements in novel view synthesis for autonomous driving. It comprises 101 diverse driving scenarios spanning urban, suburban, and highway environments, all captured under varying weather and lighting conditions.

nuScenes autonomous vehicle dataset

Autonomous Vehicle

The nuScenes dataset (pronounced /nuːsiːnz/) is a public large-scale dataset for autonomous driving developed by the team at Motional (formerly nuTonomy).

UZH-FPV drone racing dataset

Drone

The UZH-FPV drone racing dataset is the most aggressive visual-inertial odometry dataset to date, featuring high-speed 6DoF trajectories with large accelerations and rotations, with ground truth computed via state estimation.

Boston Dynamics Spot quadruped dataset

Quadruped

This short dataset validates the Boston Dynamics Spot robot, combining its URDF with onboard sensor data, including velocities recorded from onboard odometry and contact states (pressure readings for each foot).

Mobile robot SLAM dataset

Mobile Robot

SLAM (Simultaneous Localization and Mapping) is crucial for autonomous robotic systems, enabling efficient path planning and effective obstacle avoidance.

NASA Valkyrie humanoid robot dataset

Humanoid

The NASA Valkyrie humanoid was designed and built to be a robust, rugged, entirely electric humanoid robot with 44 DoF that could operate in degraded or damaged human-engineered environments.

Simulated UR5e robotic arm dataset

Autonomous Robotic Manipulation

Simulation plays a crucial role in robotics development by allowing developers to assess their algorithms and designs without physical prototypes. This dataset simulates a Universal Robots UR5e robotic arm using the physics-based Gazebo simulator.

Clearpath Husky agricultural robot dataset

Unmanned Ground Vehicle

Using a Clearpath A200 Husky mobile robot equipped with a Velodyne HDL-32 lidar sensor, an Xsens MTi-30 IMU, and wheel encoders, the research team overcomes the limitations of manual agricultural surveying, allowing them to cover far more ground.

SubPipe autonomous underwater vehicle dataset

Autonomous Underwater Vehicle

SubPipe is a submarine outfall pipeline dataset, acquired by the Light AUV developed by Oceanscan-MST, within the scope of Challenge Camp 1 of the H2020 REANMO project.

Hilti robot SLAM dataset

Mobile Robot

This dataset from the Hilti Robot SLAM challenge was captured with a robot-mounted sensor suite consisting of a Robosense BPearl lidar, an Xsens MTi-670 IMU, and four OAK-D cameras.

Treescope autonomous mobile robot dataset

Autonomous Mobile Robot

Treescope uses lidar, an IMU, GPS, and RGB-D and thermal cameras for agricultural semantic segmentation of tree trunks, with the goal of automating the process.

Agricultural unmanned ground vehicle dataset

Unmanned Ground Vehicle

This dataset comes from a multi-cue positioning system for agricultural robots, aimed at developing a self-localization method capable of robust pose estimation and accurate mapping while operating in farming environments.

KITTI-360 autonomous vehicle dataset

Autonomous Vehicle

KITTI-360 is a large-scale dataset for 3D scene reconstruction, object detection, and semantic segmentation in urban environments, encompassing 360° sensor data.

Handheld RGB-D indoor mapping dataset

Handheld

RGB-D cameras can be used to create dense 3D maps of indoor environments. These maps can be used for robot navigation, manipulation, and semantic mapping.
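The core step behind dense RGB-D mapping is back-projecting each depth pixel into a 3D point using the pinhole camera model. Below is a minimal sketch of that step; the camera intrinsics (fx, fy, cx, cy) and the tiny synthetic depth image are hypothetical values for illustration, not parameters from this dataset.

```python
import numpy as np

def depth_to_points(depth, fx, fy, cx, cy):
    """Back-project a depth image (meters) into camera-frame 3D points
    using the pinhole model: x = (u - cx) * z / fx, y = (v - cy) * z / fy."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.stack([x, y, z], axis=-1).reshape(-1, 3)

# Synthetic example: a flat wall 2 m away, seen as a 4x4 depth image,
# with made-up intrinsics (principal point at the image center).
depth = np.full((4, 4), 2.0)
points = depth_to_points(depth, fx=2.0, fy=2.0, cx=1.5, cy=1.5)
print(points.shape)  # one 3D point per pixel: (16, 3)
```

In a full mapping pipeline, point clouds like this from successive frames are transformed by the estimated camera poses and fused into a single dense model.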

Rellis-3D unmanned ground vehicle dataset

Unmanned Ground Vehicle

The Rellis-3D dataset was collected on the Rellis Campus of Texas A&M University and includes an Ouster OS1 64-beam lidar, Velodyne VLP-32C 32-beam lidar, and a Nerian SceneScan stereo depth camera.

Hilti handheld SLAM dataset

Handheld

The Hilti Handheld SLAM dataset was captured using a Sevensense Alphasense Core unit, which houses 5 cameras plus a high-end IMU, and a Hesai PandarXT-32 lidar sensor.
