This project implements a hand gesture-controlled vehicle system using fuzzy logic.
The car's movement is controlled by hand angles captured from the user's gestures.
- Start Detection: Show a thumbs-up gesture to start gesture control.
- Stop Detection: Open your palm fully facing the camera to stop gesture control.
- Turning the right hand to the right (positive angle) makes the car turn right.
- Turning the right hand to the left (negative angle) makes the car turn left.
- Holding the left hand vertically sets the car to the lowest speed.
- Tilting the left hand to either side increases the speed.
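As a rough illustration of the gesture rules above, the mapping from hand angles to car commands can be sketched as a pure function. All names, limits, and thresholds here are assumptions for illustration, not the project's actual code:

```python
def hand_angles_to_command(right_angle_deg, left_angle_deg):
    """Map hand angles (degrees) to a (steering, speed) command.

    Illustrative assumptions: steering is proportional to the right-hand
    angle (positive = right turn); speed grows with the left hand's tilt
    away from vertical (0 degrees = lowest speed).
    """
    MAX_STEER = 30.0   # assumed steering limit, degrees
    MIN_SPEED = 1.0    # assumed lowest speed (left hand vertical)
    MAX_SPEED = 5.0    # assumed top speed (left hand fully tilted)

    # Positive right-hand angle -> turn right; negative -> turn left.
    steering = max(-MAX_STEER, min(MAX_STEER, right_angle_deg))

    # Tilt magnitude of the left hand scales speed between MIN and MAX.
    tilt = min(abs(left_angle_deg), 90.0) / 90.0
    speed = MIN_SPEED + tilt * (MAX_SPEED - MIN_SPEED)
    return steering, speed
```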
- Implements the fuzzy control system.
- Input: right and left hand angles.
- Output: the car's steering angle and speed.
- Implements the linear control system.
- Draws the car and visualizes its movement on a 2D plane.
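A minimal sketch of what a fuzzy steering controller over the right-hand angle might look like, using triangular membership functions and a centroid-style defuzzification. The membership breakpoints and rule output centers here are illustrative assumptions, not the project's actual rule base:

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_steering(right_angle):
    """Fuzzify the right-hand angle into left/straight/right memberships,
    then defuzzify to a steering angle as a membership-weighted average
    of the rule output centers (all breakpoints assumed for illustration)."""
    mu_left     = tri(right_angle, -90, -45, 0)
    mu_straight = tri(right_angle, -30,   0, 30)
    mu_right    = tri(right_angle,   0,  45, 90)
    # Rule output centers (degrees): turn left, go straight, turn right.
    centers = (-30.0, 0.0, 30.0)
    mus = (mu_left, mu_straight, mu_right)
    total = sum(mus)
    if total == 0.0:
        return 0.0  # no rule fires: default to straight
    return sum(m * c for m, c in zip(mus, centers)) / total
```

The same pattern extends to the speed output by fuzzifying the left-hand tilt with its own membership sets.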
Installing Mediapipe 0.8.5 on the Jetson Nano
- Follow this guide to install Mediapipe with GPU support on the Jetson: [https://jetson-docs.com/libraries/mediapipe/overview]
- To build Mediapipe for CPU:
- First, install Bazel on your system; Mediapipe 0.8.5 specifically requires Bazel 3.7.2.
- Clone the Mediapipe repository [https://github.com/google-ai-edge/mediapipe] at tag 0.8.5:
git clone -b 0.8.5 https://github.com/google-ai-edge/mediapipe.git
- Enter the repository:
cd mediapipe
- Run this command to build and install the package:
python3 setup.py install
- Then create the wheel:
python3 setup.py bdist_wheel
- You now have a folder named dist; install the wheel from it:
pip install dist/filename.whl
To launch the gesture-controlled car simulator, run:
python hand_control_sim.py --input camera # Using camera
python hand_control_sim.py --input video --video_path inputs/short_straight.mp4 --controller linear # Using a pre-recorded video

To launch gesture control remotely, run:
python server_robot_receiver.py # Robot side
python client_camera_sender.py # Client side; change the IP to the server's IP

To evaluate GPU efficiency on the Jetson Nano:
python3 hand_control_efficiency.py

To evaluate CPU efficiency on the Jetson Nano:
python3 hand_control_CPU.py
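The efficiency scripts presumably time the per-frame hand-tracking and control step. A minimal sketch of that kind of measurement; `process_frame` here is a stand-in for the real Mediapipe pipeline, not the project's actual code:

```python
import time

def measure_fps(process_frame, n_frames=100):
    """Time n_frames calls to process_frame; return (mean latency in ms, FPS).

    process_frame is a stand-in for the hand-tracking + control step;
    the real efficiency scripts would run Mediapipe inference here.
    """
    start = time.perf_counter()
    for _ in range(n_frames):
        process_frame()
    elapsed = time.perf_counter() - start
    mean_latency_ms = elapsed / n_frames * 1000.0
    fps = n_frames / elapsed
    return mean_latency_ms, fps
```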