# slint-gstreamer-video

GPU-accelerated video player widget for Slint using wgpu.

This project is based on iced_video_player by jazzfool, adapted to work with Slint instead of Iced.

This library provides hardware-accelerated video playback for Slint applications by leveraging wgpu for GPU-based YUV-to-RGB color space conversion. It integrates with GStreamer for video decoding and supports custom pipelines for flexible video sources.
## Features

- **GPU-accelerated rendering**: NV12-to-RGBA conversion using wgpu shaders
- **Custom GStreamer pipelines**: flexible video sources (files, streams, cameras, etc.)
- **Cross-platform**: works on Windows (DirectX 12), macOS (Metal), and Linux (Vulkan)
- **Efficient texture sharing**: direct integration with Slint's wgpu renderer
## Prerequisites

**Linux (Debian/Ubuntu):**

```sh
sudo apt install libgstreamer1.0-dev libgstreamer-plugins-base1.0-dev \
    gstreamer1.0-plugins-good gstreamer1.0-plugins-bad gstreamer1.0-plugins-ugly
```

**macOS:**

```sh
brew install gstreamer gst-plugins-base gst-plugins-good gst-plugins-bad gst-plugins-ugly
```

**Windows:** download and install GStreamer from https://gstreamer.freedesktop.org/download/
## Installation

Add to your `Cargo.toml`:

```toml
[dependencies]
slint-gstreamer-video = { git = "https://github.com/Route-8/slint-gstreamer-video.git" }
slint = { version = "1.14", features = ["backend-winit", "unstable-wgpu-27"] }
gstreamer = "0.23"
gstreamer-app = "0.23"
```

## Usage

```rust
use gstreamer as gst;
use gstreamer::prelude::*;
use slint_gstreamer_video::{setup_slint_wgpu, Video, VideoPlayer};
use std::cell::RefCell;
use std::rc::Rc;
use std::sync::Arc;

slint::slint! {
    export component App inherits Window {
        in property <image> video_frame;
        Image {
            source: video_frame;
            width: 100%;
            height: 100%;
            image-fit: contain;
        }
    }
}

fn main() -> Result<(), Box<dyn std::error::Error>> {
    // 1. Initialize the wgpu backend BEFORE creating any Slint components
    setup_slint_wgpu()?;

    // 2. Initialize GStreamer
    gst::init()?;

    // 3. Create your GStreamer pipeline with an NV12 appsink
    let pipeline = gst::parse::launch(
        "videotestsrc ! videoconvert ! videoscale ! \
         appsink name=video_sink drop=true \
         caps=video/x-raw,format=NV12,pixel-aspect-ratio=1/1",
    )?
    .downcast::<gst::Pipeline>()
    .unwrap();

    let video_sink = pipeline
        .by_name("video_sink")
        .unwrap()
        .downcast::<gstreamer_app::AppSink>()
        .unwrap();

    // 4. Start playback: `Video` never changes pipeline state itself
    pipeline.set_state(gst::State::Playing)?;

    // 5. Create the Video wrapper
    let video = Arc::new(Video::new(pipeline, video_sink)?);

    // 6. Create the Slint app
    let app = App::new()?;

    // 7. Grab the wgpu device/queue via the rendering notifier
    let video_player: Rc<RefCell<Option<VideoPlayer>>> = Default::default();
    let video_clone = Arc::clone(&video);
    let player_ref = video_player.clone();
    app.window().set_rendering_notifier(move |state, api| {
        if let slint::RenderingState::RenderingSetup = state {
            if let slint::GraphicsAPI::WGPU27 { device, queue, .. } = api {
                *player_ref.borrow_mut() = Some(VideoPlayer::new(
                    Arc::clone(&video_clone),
                    device.clone(),
                    queue.clone(),
                ));
            }
        }
    })?;

    // 8. Frame update timer (~60 Hz)
    let app_weak = app.as_weak();
    let player_for_timer = video_player.clone();
    slint::Timer::default().start(
        slint::TimerMode::Repeated,
        std::time::Duration::from_millis(16),
        move || {
            if let (Some(app), Some(player)) =
                (app_weak.upgrade(), player_for_timer.borrow_mut().as_mut())
            {
                if let Some(image) = player.next_frame() {
                    app.set_video_frame(image);
                }
            }
        },
    );

    app.run()?;
    Ok(())
}
```

## Custom pipelines

To display video in both a native window and Slint simultaneously, split the stream with a `tee`:
```rust
let pipeline = gst::parse::launch(
    "videotestsrc ! tee name=t \
     t. ! queue ! autovideosink \
     t. ! queue ! videoconvert ! videoscale ! \
     appsink name=slint_video drop=true \
     caps=video/x-raw,format=NV12,pixel-aspect-ratio=1/1",
)?;
```

## Pipeline requirements

Your GStreamer pipeline must include an `appsink` element that outputs NV12 format:
```text
... ! videoconvert ! videoscale ! appsink name=my_sink drop=true caps=video/x-raw,format=NV12,pixel-aspect-ratio=1/1
```
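As a concrete illustration, a pipeline description for local-file playback that satisfies these constraints could be assembled like this. This is a sketch, not part of this crate's API: `file_pipeline` is a hypothetical helper, and `filesrc`/`decodebin` are standard GStreamer elements chosen here as an example source.

```rust
/// Build a GStreamer pipeline description that plays a local file into an
/// NV12 appsink. `decodebin` auto-plugs a demuxer/decoder for the container;
/// the trailing elements match the requirements listed above.
fn file_pipeline(path: &str) -> String {
    format!(
        "filesrc location={path} ! decodebin ! videoconvert ! videoscale ! \
         appsink name=video_sink drop=true \
         caps=video/x-raw,format=NV12,pixel-aspect-ratio=1/1"
    )
}

fn main() {
    // Prints the assembled pipeline description for a hypothetical file.
    println!("{}", file_pipeline("movie.mp4"));
}
```

The resulting string can be passed to `gst::parse::launch` exactly as in the usage example above.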
| Element | Purpose |
|---|---|
| `videoconvert` | Converts any input format to NV12 |
| `videoscale` | Normalizes the pixel aspect ratio |
| `appsink` | Extracts frames for rendering |
| `drop=true` | Prevents backpressure if rendering falls behind |
| `format=NV12` | Required format for the GPU shader |
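For context on why the shader expects NV12: the format stores a full-resolution Y (luma) plane followed by an interleaved half-resolution UV (chroma) plane, so a frame occupies 1.5 bytes per pixel. A small sketch of the layout arithmetic, using a hypothetical helper that is not part of this crate:

```rust
/// Byte layout of a tightly packed NV12 frame: a width*height Y plane
/// followed by an interleaved UV plane with 2x2 chroma subsampling.
/// Returns (y_plane_bytes, uv_plane_bytes, total_bytes). Assumes even
/// dimensions and no row padding (real buffers may carry stride padding).
fn nv12_layout(width: usize, height: usize) -> (usize, usize, usize) {
    let y = width * height;
    let uv = width * height / 2; // 2 bytes (U+V) per 2x2 pixel block
    (y, uv, y + uv)
}

fn main() {
    let (y, uv, total) = nv12_layout(1920, 1080);
    // A 1080p frame: 2,073,600 Y bytes + 1,036,800 UV bytes = 3,110,400 total.
    println!("Y={y} UV={uv} total={total}");
}
```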
## API

### `setup_slint_wgpu()`

Initializes Slint with the wgpu backend. Must be called before creating any Slint components.
### `Video`

Wraps a GStreamer pipeline and extracts NV12 frames.

**Important:** `Video` is non-destructive: it never modifies pipeline state. The caller is responsible for:

- starting the pipeline (`pipeline.set_state(gst::State::Playing)`)
- seeking, pausing, and stopping playback
- cleanup (`pipeline.set_state(gst::State::Null)`)
```rust
impl Video {
    fn new(pipeline: gst::Pipeline, video_sink: AppSink) -> Result<Self, Error>;
    fn size(&self) -> Option<(u32, u32)>; // None until the first frame is received
    fn framerate(&self) -> Option<f64>;   // None until the first frame is received
    fn duration(&self) -> Duration;
    fn eos(&self) -> bool;
}
```

### `VideoPlayer`

Bridges `Video` frames to Slint images via wgpu rendering.
```rust
impl VideoPlayer {
    fn new(video: Arc<Video>, device: wgpu::Device, queue: wgpu::Queue) -> Self;
    fn next_frame(&mut self) -> Option<slint::Image>;
    fn eos(&self) -> bool; // true once end-of-stream is reached
    fn video(&self) -> &Video;
    fn video_arc(&self) -> &Arc<Video>;
    fn renderer(&self) -> &VideoRenderer;
}
```

## Architecture

```text
GStreamer Pipeline
        |
        v
AppSink (NV12 frames)
        |
        v
Worker Thread (frame extraction)
        |
        v
Arc<Mutex<Frame>>
        |
        v
VideoPlayer::next_frame()
        |
        v
wgpu Pipeline (NV12 -> RGBA shader)
        |
        v
slint::Image::try_from(wgpu::Texture)
        |
        v
Slint UI
```
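The 16 ms timer in the usage example polls `next_frame()` at roughly 60 Hz regardless of the source rate. Once `Video::framerate()` reports a value, the polling interval could instead be matched to the source. A sketch under that assumption; `poll_interval` is a hypothetical helper, not part of this crate:

```rust
use std::time::Duration;

/// Choose a polling interval for the frame-update timer from the source
/// frame rate. Falls back to ~60 Hz while the rate is still unknown
/// (Video::framerate() returns None before the first frame).
fn poll_interval(framerate: Option<f64>) -> Duration {
    match framerate {
        Some(fps) if fps > 0.0 => Duration::from_secs_f64(1.0 / fps),
        _ => Duration::from_millis(16),
    }
}

fn main() {
    // An 8 fps source would be polled every 125 ms.
    println!("{:?}", poll_interval(Some(8.0)));
}
```

Because `drop=true` is set on the appsink, polling slightly faster than the source rate is harmless: stale frames are simply dropped rather than queued.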
## Examples

Run the included examples:

```sh
# Basic custom pipeline example
cargo run --example custom_pipeline

# 3x3 grid with 9 simultaneous video streams
cargo run --example video_grid

# Dynamic resolution changes during playback
cargo run --example dynamic_caps
```

## License

Licensed under either of
- Apache License, Version 2.0 (LICENSE-APACHE or http://www.apache.org/licenses/LICENSE-2.0)
- MIT license (LICENSE-MIT or http://opensource.org/licenses/MIT)
at your option.