perf: eliminate idle CPU waste (socket events + adaptive FPS)#580

Open
lucletoffe wants to merge 2 commits into StreamController:main from lucletoffe:fix/hyprland-socket-events

Conversation


@lucletoffe lucletoffe commented Mar 17, 2026

Problem

StreamController idles at 20-30% CPU on Hyprland systems, caused by two independent tight loops:

1. Window grabber polling (Hyprland)

The Hyprland integration polls hyprctl activewindow -j every 200ms. Inside Flatpak, each poll spawns flatpak-spawn → hyprctl, causing sustained CPU on flatpak-session-helper and xdg-dbus-proxy.

2. Media player thread at 30 FPS unconditionally

MediaPlayerThread runs at 30 FPS even when there is no video background, no key videos, and no scrolling labels. On a 15-key Stream Deck, this iterates all keys 30x/sec in pure Python for zero visual benefit.

Solution

Commit 1: Hyprland IPC socket events

Replace polling with a direct connection to Hyprland's socket2 IPC event stream. Listen for activewindow>> events with a blocking socket read — zero CPU when idle.

  • Auto-detect socket path via HYPRLAND_INSTANCE_SIGNATURE
  • Automatic reconnection on socket close
  • Full fallback to legacy polling if socket unavailable
  • Add --filesystem=xdg-run/hypr:ro to Flatpak manifest
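The socket-based approach described above can be sketched roughly as follows. This is a minimal illustration, not the PR's actual code; the socket path layout (`$XDG_RUNTIME_DIR/hypr/$HYPRLAND_INSTANCE_SIGNATURE/.socket2.sock`) follows Hyprland's documented IPC convention, and the `on_change` callback is a hypothetical placeholder:

```python
import os
import socket


def hyprland_socket2_path() -> str:
    """Build the socket2 path from the environment, assuming the layout
    $XDG_RUNTIME_DIR/hypr/$HYPRLAND_INSTANCE_SIGNATURE/.socket2.sock."""
    signature = os.environ["HYPRLAND_INSTANCE_SIGNATURE"]
    runtime_dir = os.environ.get("XDG_RUNTIME_DIR", f"/run/user/{os.getuid()}")
    return os.path.join(runtime_dir, "hypr", signature, ".socket2.sock")


def listen_for_active_window(on_change) -> None:
    """Block on the event stream and call on_change(payload) for each
    'activewindow>>' event. The recv() blocks, so this uses no CPU
    while no events arrive."""
    with socket.socket(socket.AF_UNIX, socket.SOCK_STREAM) as sock:
        sock.connect(hyprland_socket2_path())
        buffer = b""
        while True:
            chunk = sock.recv(4096)
            if not chunk:  # socket closed; caller should reconnect
                break
            buffer += chunk
            while b"\n" in buffer:
                line, buffer = buffer.split(b"\n", 1)
                event = line.decode("utf-8", errors="replace")
                if event.startswith("activewindow>>"):
                    # payload is "class,title"
                    on_change(event[len("activewindow>>"):])
```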

Commit 2: Adaptive media player FPS

Detect whether any animated content exists (video, scroll labels, pending image tasks). When idle, throttle from 30 FPS to 2 FPS. Return to 30 FPS instantly when animation starts.
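The throttling decision can be sketched as a small pure function. Names here (`pick_fps`, `tick_interval`) are illustrative, not the PR's actual identifiers:

```python
IDLE_FPS = 2     # throttled tick rate when nothing animates
ACTIVE_FPS = 30  # full tick rate while any animated content exists


def pick_fps(has_video: bool, has_scroll_label: bool, pending_tasks: int) -> int:
    """Return the media-player tick rate: full speed only while a video,
    a scrolling label, or a queued image task needs rendering."""
    if has_video or has_scroll_label or pending_tasks > 0:
        return ACTIVE_FPS
    return IDLE_FPS


def tick_interval(fps: int) -> float:
    """Sleep duration between ticks (e.g. 500ms at the idle rate)."""
    return 1.0 / fps
```

Because the check runs every tick, the loop returns to 30 FPS on the very next iteration after animation starts, which is why responsiveness is preserved.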

Results (measured on Stream Deck MK.2, Hyprland, Flatpak)

| Metric | Before | After |
| --- | --- | --- |
| Total StreamController CPU (idle) | ~20-30% | ~4% |
| Media player thread ticks/min | ~108,000 | ~4,300 |
| flatpak-spawn processes/sec | ~5 | 0 |
| Window change latency | 0-200ms | <1ms |

Fixes #433
Fixes #457

feat(hyprland): use IPC socket events instead of polling for window changes

Replace the 200ms polling loop that spawns 'hyprctl activewindow -j' (via
flatpak-spawn + xdg-dbus-proxy when running in Flatpak) with a direct
connection to Hyprland's socket2 IPC event stream.

The old approach spawns ~5 processes/second even when no window changes occur,
causing 10-30% CPU usage in flatpak-session-helper and xdg-dbus-proxy
(see StreamController#433 and StreamController#457). The new approach uses a
blocking Unix socket read with zero CPU when idle.

Changes:
- Listen for 'activewindow>>' events on Hyprland's .socket2.sock
- Auto-detect socket path via HYPRLAND_INSTANCE_SIGNATURE env var
- Fallback to legacy polling if socket is unavailable
- Add --filesystem=xdg-run/hypr:ro to Flatpak manifest for socket access
- Reconnect automatically if the socket is closed (e.g. compositor restart)
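The reconnect-then-fallback behavior listed above could look roughly like the sketch below. All names (`run_with_reconnect`, `fallback_poll`, the retry counts) are hypothetical; the PR's actual logic may differ:

```python
import time


def run_with_reconnect(listen, fallback_poll, max_retries: int = 5):
    """Retry the blocking socket listener with exponential backoff.
    After repeated failures (socket missing or refused), fall back to
    the legacy polling path. Returns which path ended up being used."""
    delay = 0.1
    for _ in range(max_retries):
        try:
            listen()           # blocks until the socket closes
            return "socket"    # clean close, e.g. compositor restart
        except OSError:
            time.sleep(delay)  # socket unavailable: back off, retry
            delay = min(delay * 2, 2.0)
    fallback_poll()            # socket never came up: legacy 200ms polling
    return "polling"
```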

Fixes StreamController#433 (partially — eliminates the subprocess-spawning component)
Fixes StreamController#457 (eliminates the flatpak-spawn/xdg-dbus-proxy tight loop)
The MediaPlayerThread runs at 30 FPS unconditionally, iterating over all
keys/dials every ~33ms even when there is no video background, no key
videos, and no scrolling labels. On a 15-key Stream Deck MK.2, this
burns ~20% CPU in pure Python overhead for no visible benefit.

Add dynamic FPS: detect whether any animated content (video, scroll
labels) or pending image tasks exist. When idle, throttle to 2 FPS
(500ms sleep). Immediately return to 30 FPS when animation starts or
tasks are queued.

This preserves full responsiveness for video/animation while reducing
idle CPU from ~20% to ~1%.
@lucletoffe lucletoffe changed the title feat(hyprland): use IPC socket events instead of polling for window changes perf: eliminate idle CPU waste (socket events + adaptive FPS) Mar 17, 2026

Successfully merging this pull request may close these issues.

BUG: Notable increase in CPU usage
flatpak-session-helper high CPU usage and excessive window monitoring when StreamController running

1 participant