ELEC-E7840 Smart Wearables -- Aalto University
Team: Saara (ML, sensors, docs), Alex (Prototyping, user testing, design), Jing (Circuit, ESP32, coordination)
Recognizes 5 activity categories (walking, stair climbing, sitting, sit-to-stand, standing) using textile-based pressure and stretch sensors on socks and knee pads. No IMUs -- textile sensors only.
cd ./04_Code/python
uv sync # Creates .venv and installs all dependencies
source .venv/bin/activate # Activate environment
# Or run scripts directly: uv run python script.py
See [[README_PYTHON_SETUP]] for UV installation and detailed instructions.
# Upload firmware
pio run -e xiao_esp32s3 -t upload
# Serial monitor
pio device monitor -e xiao_esp32s3
See [[PLATFORMIO_SETUP]] for VS Code integration and board setup.
6-sensor configuration on a single ESP32S3 (pins A0-A5):
ESP32S3 XIAO (all 6 sensors):
A0: L_P_Heel - Left heel pressure
A1: L_P_Ball - Left ball pressure
A2: L_S_Knee - Left knee stretch
A3: R_P_Heel - Right heel pressure
A4: R_P_Ball - Right ball pressure
A5: R_S_Knee - Right knee stretch
↓ Serial / WiFi / BLE ↓
Python collector.py (reads 6-column CSV)
↓
CSV: time_ms,L_P_Heel,L_P_Ball,L_S_Knee,R_P_Heel,R_P_Ball,R_S_Knee
↓
data_preprocessing.py → feature_extraction.py → train_model.py
↓
Random Forest (.joblib)
↓
real_time_classifier.py
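The preprocessing → features → training chain above operates on fixed-length windows of the 50 Hz, 6-channel stream. As a minimal sketch of the idea (window length, overlap, and the exact feature set are assumptions here; the project's real logic lives in feature_extraction.py):

```python
import numpy as np

SENSORS = ["L_P_Heel", "L_P_Ball", "L_S_Knee", "R_P_Heel", "R_P_Ball", "R_S_Knee"]

def window_features(samples, window=100, step=50):
    """Slide a window (2 s at 50 Hz with 50% overlap, assumed) over an
    (N, 6) array of ADC readings and emit per-channel mean/std/range."""
    feats = []
    for start in range(0, len(samples) - window + 1, step):
        w = samples[start:start + window]
        row = {}
        for i, name in enumerate(SENSORS):
            col = w[:, i]
            row[f"{name}_mean"] = float(col.mean())
            row[f"{name}_std"] = float(col.std())
            row[f"{name}_range"] = float(col.max() - col.min())
        feats.append(row)
    return feats
```

Each emitted row becomes one training example for the Random Forest.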
- Per leg: 2 piezoresistive pressure sensors (sock) + 1 stretch sensor (knee pad)
- Circuit: Voltage dividers with 10k resistors, 12-bit ADC (0-4095)
- Sampling: 50 Hz
- Communication: Serial/USB, WiFi (auto-connect saved networks, background retry), BLE
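With a 10 k fixed resistor in each divider and a 12-bit ADC, a raw reading can be mapped back to sensor resistance. A sketch, assuming the fixed resistor sits on the output side so that lower sensor resistance gives a higher reading (check circuit_diagram_v2 for the actual topology):

```python
ADC_MAX = 4095      # 12-bit ADC full scale
R_FIXED = 10_000    # 10 kΩ divider resistor

def sensor_resistance(adc: int, r_fixed: float = R_FIXED) -> float:
    """Invert the divider: adc = ADC_MAX * r_fixed / (r_fixed + r_sensor),
    so lower sensor resistance -> higher reading (matching the knee model:
    more stretch = lower R = higher ADC)."""
    if adc <= 0:
        return float("inf")  # open circuit / no contact
    return r_fixed * (ADC_MAX - adc) / adc
```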
Smart Socks/
├── 00_Planning/ Project plans, timeline, meeting notes
├── 01_Design/ Sensor placement, circuit diagrams
├── 02_Fabrication/ Prototype photos, build documentation
├── 03_Data/ Raw and processed sensor data
├── 04_Code/
│ ├── arduino/
│ │ ├── data_collection_leg/ Reference copy of main firmware
│ │ ├── calibration_all_sensors/ Calibration firmware (simple serial)
│ │ ├── ei_data_forwarder/ Edge Impulse data collection
│ │ └── ei_led_feedback/ LED feedback demo
│ ├── python/
│ │ ├── config.py Centralized configuration
│ │ ├── calibration_visualizer.py Real-time sensor visualization
│ │ ├── collector.py Data collection tool (--validate for live QA)
│ │ ├── data_preprocessing.py Data cleaning
│ │ ├── data_validation.py Data quality checks
│ │ ├── auto_validator.py Live validation during collection
│ │ ├── feature_extraction.py ML feature extraction
│ │ ├── feature_importance.py Feature importance for edge deployment
│ │ ├── train_model.py Random Forest training (--rejection support)
│ │ ├── unknown_class.py Unknown class detection/rejection
│ │ ├── real_time_classifier.py Live activity classification
│ │ ├── run_full_pipeline.py End-to-end ML automation
│ │ └── ...
│ ├── QUICKSTART.md Quick start for single ESP32
│ └── WIFI_CONFIGURATION.md WiFi/BLE/Hotspot setup
├── 05_Analysis/ ML results, confusion matrices
├── 06_Presentation/ Poster, slides, user testing materials
├── 07_References/ Papers, datasheets, fabrication research, video tutorials
├── src/ PlatformIO source (main firmware)
├── platformio.ini PlatformIO build configuration
├── PLATFORMIO_SETUP.md PlatformIO + VS Code setup
├── README_PYTHON_SETUP.md Python/UV environment setup
├── TROUBLESHOOTING.md Common issues and fixes
├── PROJECT_STATUS.md Current project status
└── WORK_DIARY.md Team meeting notes and progress
# Build and upload (source in src/main.ino)
pio run -t upload
# Serial monitor (115200 baud)
pio device monitor
Source lives in src/main.ino. To use the calibration firmware instead, copy 04_Code/arduino/calibration_all_sensors/calibration_all_sensors.ino to src/main.ino.
WiFi auto-connect: On boot, the firmware scans for saved networks and connects to the strongest match (3 attempts). If not found, proceeds and retries every 15s in the background (quick reconnect first, full scan after repeated failures). No AP mode. Edit src/credentials.h to add/remove networks. Serial always streams when a USB host is listening — no START command needed. Works on battery power without USB.
Serial port on macOS: /dev/cu.usbmodem2101 (find yours with pio device list).
cd 04_Code/python
# Record activity data
python collector.py --activity walking_forward --port /dev/cu.usbmodem2101
# Record with live data validation (checks quality every 5s)
python collector.py --activity walking_forward --port /dev/cu.usbmodem2101 --validate
# Calibration visualization (upload calibration firmware first)
python calibration_visualizer.py --port /dev/cu.usbmodem2101
# Controls: Q=quit, R=reset, S=save CSV, P=pause, C=record GIF
# Enter=toggle START/STOP, I=STATUS
cd 04_Code/python
# Full pipeline (recommended)
python run_full_pipeline.py --raw-data ../../03_Data/raw/ --output ../../05_Analysis/
# Full pipeline with unknown class rejection and feature importance analysis
python run_full_pipeline.py --raw-data ../../03_Data/raw/ --output ../../05_Analysis/ \
--with-rejection confidence --feature-importance
# Step-by-step
python data_preprocessing.py --input ../../03_Data/raw/ --output ../../03_Data/processed/
python feature_extraction.py --input ../../03_Data/processed/ --output ../../03_Data/features/
python train_model.py --features ../../03_Data/features/features_all.csv --output ../../05_Analysis/
python analysis_report.py --results-dir ../../05_Analysis/ --output ../../06_Presentation/report/
# Train with unknown class rejection
python train_model.py --features ../../03_Data/features/features_all.csv --output ../../05_Analysis/ \
--rejection confidence --rejection-threshold 0.6
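Confidence-threshold rejection labels a window "unknown" when the top class probability falls below the threshold. A minimal sketch of the idea (the project's actual implementation lives in unknown_class.py):

```python
def predict_with_rejection(model, X, threshold=0.6):
    """Predict classes, replacing low-confidence predictions with 'unknown'.

    Works with any scikit-learn classifier exposing predict_proba,
    e.g. the project's Random Forest.
    """
    proba = model.predict_proba(X)
    top = proba.argmax(axis=1)
    return [model.classes_[i] if p[i] >= threshold else "unknown"
            for i, p in zip(top, proba)]
```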
# Analyze feature importance for edge deployment
python feature_importance.py --model ../../05_Analysis/smart_socks_model.joblib --output ../../05_Analysis/feature_importance/
# Compare unknown class rejection methods
python unknown_class.py --compare --features ../../03_Data/features/features_all.csv --output ./comparison/
cd 04_Code/python
# Serial mode
python real_time_classifier.py --model ../../05_Analysis/smart_socks_model.joblib --port /dev/cu.usbmodem2101
# BLE mode
python real_time_classifier.py --model ../../05_Analysis/smart_socks_model.joblib --ble
cd 04_Code/python
python -m pytest tests/ -v
uv run black . && uv run flake8 . && uv run mypy .
CSV columns (merged): time_ms,L_P_Heel,L_P_Ball,L_S_Knee,R_P_Heel,R_P_Ball,R_S_Knee
File naming: <subject_id>_<activity>_<timestamp>.csv (e.g., S01_walking_forward_20260115_143022.csv)
Activity labels: walking_forward, walking_backward, stairs_up, stairs_down, sitting_floor, sitting_crossed, sit_to_stand, stand_to_sit, standing_upright, standing_lean_left, standing_lean_right
Subject split: S01-S06 training, S07-S09 testing (cross-subject validation)
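Cross-subject validation means no subject appears in both splits, so filtering on the subject ID from the file name is enough. A sketch (the subject_id key is an assumption about the feature-row format):

```python
TRAIN_SUBJECTS = {f"S{i:02d}" for i in range(1, 7)}   # S01-S06
TEST_SUBJECTS = {f"S{i:02d}" for i in range(7, 10)}   # S07-S09

def split_by_subject(rows):
    """Partition feature rows (dicts with a 'subject_id' key) into
    train/test sets with no subject overlap."""
    train = [r for r in rows if r["subject_id"] in TRAIN_SUBJECTS]
    test = [r for r in rows if r["subject_id"] in TEST_SUBJECTS]
    return train, test
```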
Serial protocol (115200 baud):
START / S # Start recording
STOP / X # Stop recording
STATUS # Check status
HELP / ? # Show commands
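A host-side sketch of this protocol: send a command line, then parse the comma-separated sample stream into named fields. The pyserial usage is commented out so the parsing part stays testable without hardware; the port name is the macOS example from above.

```python
COLUMNS = ["time_ms", "L_P_Heel", "L_P_Ball", "L_S_Knee",
           "R_P_Heel", "R_P_Ball", "R_S_Knee"]

def parse_sample(line: str):
    """Parse one streamed CSV line into {column: int}; return None for
    malformed or non-data lines (status messages, partial reads)."""
    parts = line.strip().split(",")
    if len(parts) != len(COLUMNS):
        return None
    try:
        return dict(zip(COLUMNS, (int(p) for p in parts)))
    except ValueError:
        return None

# Typical usage with pyserial (adjust the port to your device):
# import serial
# with serial.Serial("/dev/cu.usbmodem2101", 115200, timeout=1) as ser:
#     ser.write(b"START\n")
#     sample = parse_sample(ser.readline().decode("ascii", "ignore"))
```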
BLE: Service 4fafc201-...914b, Characteristic beb5483e-...26a8. Device: SmartSocks.
Target accuracy: >85% average, >80% per activity on held-out test subjects.
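Both targets can be checked directly from held-out predictions: overall accuracy plus per-activity accuracy (recall). A small sketch; scikit-learn's recall_score gives the same per-class numbers.

```python
from collections import Counter

def accuracy_report(y_true, y_pred):
    """Return (overall accuracy, {activity: per-class accuracy})."""
    overall = sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)
    totals, hits = Counter(y_true), Counter()
    for t, p in zip(y_true, y_pred):
        if t == p:
            hits[t] += 1
    per_class = {a: hits[a] / n for a, n in totals.items()}
    return overall, per_class

def meets_target(y_true, y_pred, avg=0.85, per_activity=0.80):
    """True if both the average and every per-activity target are met."""
    overall, per_class = accuracy_report(y_true, y_pred)
    return overall > avg and all(v > per_activity for v in per_class.values())
```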
- Fixed EeonTex resistance direction in sensor_manufacturing_research.md: resistance decreases with stretch (was incorrectly stated as increases). Added specific values from the Adafruit/Eeonyx datasheet (20kΩ → ~10kΩ).
- Fixed ADC readings for knee stretch sensor: corrected physical model (bent knee = more stretch = lower R = higher ADC) and added the voltage divider formula.
- Added stretch direction note: course vs. wale direction for EeonTex longevity.
- Added 10kΩ vs 18kΩ resistor note: explains choice vs. Amitrano et al. (2020).
- Added fallback links for a potentially dead YouTube tutorial in video_tutorials.md.
- Cross-linked all reference docs with Obsidian [[...]] wiki links and GitHub-compatible relative links, matching the convention in design docs. Connected sensor_manufacturing_research.md, video_tutorials.md, and REFERENCES.md to each other and to sensor_placement_v2, circuit_diagram_v2, and PROPOSALS.
- Removed CAL ON/OFF: Calibration mode was redundant — calibration firmware always streams, data collection firmware uses START/STOP. Removed from firmware (src/main.ino, reference copy), visualizer, and all documentation.
- Fixed GIF recording: Rewrote frame capture in calibration_visualizer.py:
  - Root cause: buffer_rgba() fails with 'no attribute renderer' if the canvas hasn't been drawn yet; the exception was silently caught, producing empty GIFs.
  - Fix: Use savefig() to a BytesIO buffer instead — it handles its own rendering internally.
  - Replaced background thread + imageio writer with in-memory PIL frames + synchronous save — eliminates race conditions that caused truncated files on quit.
  - Added 10 FPS throttle (was 20 FPS) — reduces GIF size; savefig overhead is ~12-16 ms per frame.
  - Removed imageio dependency — PIL handles GIF natively.
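The savefig-to-BytesIO capture pattern can be sketched as follows (the figure content is arbitrary; the point is render to an in-memory PNG per frame, collect PIL images, save once, synchronously, on quit):

```python
import io

import matplotlib
matplotlib.use("Agg")  # headless backend for this sketch
import matplotlib.pyplot as plt
from PIL import Image

def capture_frame(fig) -> Image.Image:
    """Render the figure via savefig() into memory. Unlike buffer_rgba(),
    savefig() triggers its own draw, so it works even before the canvas
    has ever been shown."""
    buf = io.BytesIO()
    fig.savefig(buf, format="png")
    buf.seek(0)
    return Image.open(buf).convert("RGB")

def save_gif(frames, path_or_buf, fps=10):
    """Single synchronous save of the collected frames; no writer thread."""
    frames[0].save(path_or_buf, format="GIF", save_all=True,
                   append_images=frames[1:], duration=int(1000 / fps), loop=0)
```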
- Unknown class detection (unknown_class.py): Confidence threshold and novelty detection (Isolation Forest) methods for rejecting non-target activities. Integrated via --rejection flag in train_model.py.
- Feature importance analysis (feature_importance.py): Identifies minimal feature set for edge deployment with 90/95/99% importance thresholds. Generates edge config JSON for reduced-feature inference.
- Automated data validation (auto_validator.py): Live validation during data collection via --validate flag in collector.py. Checks temporal jumps, stuck sensors, recording duration, and sample completeness in real time.
- Pipeline integration: run_full_pipeline.py gains --with-rejection and --feature-importance optional steps.
- Config additions: DATA_QUALITY now includes temporal_jump_threshold, live_validation_interval_s, min_recording_duration_s.
- Tests: 16 new tests across test_unknown_class.py, test_feature_importance.py, test_auto_validator.py (36 total, all passing).
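The 90/95/99% thresholds refer to cumulative importance mass: sort the Random Forest importances in descending order and keep the smallest prefix whose sum crosses the threshold. A sketch of that selection (the real version in feature_importance.py also emits the edge config JSON):

```python
import numpy as np

def select_features(names, importances, threshold=0.95):
    """Return the minimal feature subset whose cumulative (normalized)
    importance reaches `threshold`, most important first."""
    order = np.argsort(importances)[::-1]
    cum = np.cumsum(np.asarray(importances, dtype=float)[order])
    cum /= cum[-1]                       # normalize to sum to 1
    keep = int(np.searchsorted(cum, threshold)) + 1
    return [names[i] for i in order[:keep]]
```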
- Auto-connect WiFi: Firmware scans on boot, connects to strongest saved network, falls back to AP
- No more #define switching — just edit credentials.h to add networks
- Optional open network support (ALLOW_OPEN_NETWORKS = true)
- WiFi tested: Phone hotspot (Maximize Compatibility), home WiFi (TP-Link) — all working
- BLE tested: Discoverable from iPhone (nRF Connect), works concurrently with WiFi hotspot
- BLE initialized before WiFi to avoid shared-antenna conflicts on XIAO ESP32S3
- Note: BLE signal is weak on XIAO ESP32S3 (tiny PCB antenna shared with WiFi); not discoverable from macOS
- Web dashboard accessible over WiFi at device IP (e.g., http://192.168.8.167)
- WiFi auto-connects to saved networks via SAVED_NETWORKS[] in credentials.h
- Background retry every 15s if connection drops (quick reconnect, then full scan)
- Non-blocking serial command processing in firmware
- Calibration visualizer: added serial command sending (Enter=START/STOP, I=STATUS)
- Calibration visualizer: device info panel shows WiFi IP, BLE status, recording state
- Firmware workflow: swap src/main.ino between calibration and data collection firmware
- Unified firmware: Single data_collection_leg.ino reads all 6 sensors (A0-A5) on one ESP32
- Removed dual-ESP32 sync system (UDP, TRIGGER/MASTER/SLAVE commands)
- Removed LEG_ID build flag system and conditional compilation
- Replaced dual_collector.py with collector.py (single-port data collection)
- Simplified platformio.ini to 2 environments (xiao_esp32s3, calibration)
- Web dashboard shows all 6 sensors in 2x3 grid
- Updated all documentation for single-ESP32 architecture
- Added QUICKSTART.md, removed DUAL_ESP32_QUICKSTART.md
- Migrated from 10-sensor to 6-sensor design (2 pressure + 1 stretch per leg)
- Fixed 25 audit issues across Python, Arduino, docs, and build system (see [[AUDIT_GAPS_AND_FIXES]])
- Consolidated documentation: removed redundant files, merged wireless guides into [[WIFI_CONFIGURATION]]
- Removed deprecated v1 10-sensor code and designs
- Restored Obsidian wiki-links throughout documentation
- Updated SVG diagrams to match 6-sensor configuration
- All sensor names standardized: L_P_Heel, L_P_Ball, L_S_Knee, R_P_Heel, R_P_Ball, R_S_Knee
- Single ESP32 reads all 6 sensors (A0-A5)
- GIF recording in calibration_visualizer.py fully implemented
- Sensor fabrication code
- Data collection software
- BLE transmission capability
- Calibration curves (need sensor data)
- Live demo (need hardware)
- ML pipeline (feature extraction, training, evaluation)
- Real-time classification
- Cross-subject validation framework
- User testing materials (WEAR, SUS questionnaires)
- Analysis report generator
- Trained model with >85% accuracy (need data)
- User study results (5+ participants)
- Saara -- ML, sensors, documentation
- Alex -- Prototyping, user testing, design
- Jing -- Circuit, ESP32, coordination
Last updated: 2026-02-01