@@ -27,8 +27,8 @@ The vision YOLO script:
 
 ### Additional Dependencies
 
-<Tabs groupId="language">
-<TabItem value="js" label="JavaScript" default>
+<Tabs groupId="run" className="shell-tabs">
+<TabItem value="js" label="Run (JS)" default>
 
 The JavaScript version uses ONNX Runtime for inference:
 
@@ -38,7 +38,7 @@ bun install
 ```
 
 </TabItem>
-<TabItem value="python" label="Python">
+<TabItem value="python" label="Run (Python)">
 
 ```bash
 # Install vision dependencies
@@ -52,8 +52,8 @@ pip install ultralytics opencv-python numpy
 
 ## Running the Script
 
-<Tabs groupId="language">
-<TabItem value="js" label="JavaScript" default>
+<Tabs groupId="run" className="shell-tabs">
+<TabItem value="js" label="Run (JS)" default>
 
 ```bash
 bun robot:vision
@@ -62,7 +62,7 @@ bun src/vision_yolo.js
 ```
 
 </TabItem>
-<TabItem value="python" label="Python">
+<TabItem value="python" label="Run (Python)">
 
 ```bash
 uv run python src/vision_yolo.py
@@ -430,15 +430,15 @@ def on_image(msg):
 
 For simulation environments, `vision_colors.js` / color detection may work better since you can place distinctly colored objects in Gazebo:
 
-<Tabs groupId="language">
-<TabItem value="js" label="JavaScript" default>
+<Tabs groupId="run" className="shell-tabs">
+<TabItem value="js" label="Run (JS)" default>
 
 ```bash
 bun robot:vision:colors
 ```
 
 </TabItem>
-<TabItem value="python" label="Python">
+<TabItem value="python" label="Run (Python)">
 
 ```bash
 # Implement HSV-based color detection