# Geek Szitman SuperCamera

Rust implementation of the Geek Szitman SuperCamera endoscope viewer with PipeWire support and preparation for a V4L2 fallback.

## Features

- USB communication with the endoscope device
- **PipeWire virtual camera** - Create a discoverable virtual camera node
- **Stdout video output** - Pipe raw video frames to stdout for external processing
- UPP protocol implementation
- JPEG frame processing
- Modular architecture for maintainability

## PipeWire Virtual Camera

This project includes a **discoverable virtual camera** implementation using PipeWire that creates a `Video/Source` node in the PipeWire graph. The virtual camera can be used by any application that supports PipeWire video sources.

### How it works

The PipeWire backend:

1. Connects to the PipeWire daemon
2. Creates a `Stream` configured as a `Video/Source`
3. Registers the stream with proper metadata (node name, description, media class)
4. Becomes discoverable by other applications

### Usage

#### 1. Start the virtual camera

```bash
# Run the application (it creates the PipeWire virtual camera automatically)
cargo run --bin geek-szitman-supercamera

# Or specify the PipeWire backend explicitly
cargo run --bin geek-szitman-supercamera -- --backend pipewire
```

#### 2. Verify the virtual camera appears

```bash
# List all PipeWire Video/Source nodes
pw-dump | jq '.[] | select(.info.props["media.class"]=="Video/Source")'

# Or use pw-top to watch the graph
pw-top
```

#### 3. Use the virtual camera in applications

The virtual camera appears as "geek-szitman-supercamera" in applications such as:

- OBS Studio
- Cheese
- Google Meet
- Zoom
- Any other application that supports PipeWire video sources

#### 4. Test with GStreamer

```bash
# Get the node ID from pw-dump
NODE_ID=$(pw-dump | jq -r '.[] | select(.info.props["media.class"]=="Video/Source") | .id')

# Test the virtual camera
gst-launch-1.0 pipewiresrc target-object=$NODE_ID ! videoconvert ! autovideosink
```

### Configuration

The virtual camera is configured with the following defaults:

- **Node name**: `geek-szitman-supercamera`
- **Description**: `Geek Szitman SuperCamera - High-quality virtual camera for streaming and recording`
- **Media class**: `Video/Source`
- **Format**: RGB24 (for maximum compatibility)
- **Resolution**: 640x480 (configurable)
- **Framerate**: 30 FPS (configurable)

### Architecture

The PipeWire implementation follows the official Rust bindings (`pipewire = "0.8"`):

1. **MainLoop → Context → Core**: standard PipeWire connection pattern
2. **Stream creation**: creates a `Video/Source` stream with proper metadata
3. **Event handling**: responds to state changes and parameter requests
4. **Thread safety**: runs in a separate thread to avoid blocking the main application

A minimal sketch of this connection pattern is shown below.
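The following is an illustrative sketch of steps 1 and 2 only, not the project's actual backend code. It assumes the module layout of the `pipewire = "0.8"` crate; format negotiation (SPA POD parameters), stream connection, event callbacks, and frame delivery are all omitted.

```rust
// Illustrative sketch only -- not the repository's actual implementation.
// Assumes `pipewire = "0.8"` in Cargo.toml.
use pipewire as pw;
use pw::properties::properties;

fn main() -> Result<(), pw::Error> {
    pw::init();

    // 1. MainLoop -> Context -> Core: the standard PipeWire connection pattern.
    let mainloop = pw::main_loop::MainLoop::new(None)?;
    let context = pw::context::Context::new(&mainloop)?;
    let core = context.connect(None)?;

    // 2. Create a stream advertised as a discoverable Video/Source node.
    let _stream = pw::stream::Stream::new(
        &core,
        "geek-szitman-supercamera",
        properties! {
            *pw::keys::MEDIA_CLASS => "Video/Source",
            *pw::keys::MEDIA_TYPE => "Video",
            *pw::keys::MEDIA_CATEGORY => "Playback",
            *pw::keys::NODE_DESCRIPTION => "Geek Szitman SuperCamera",
        },
    )?;

    // 3./4. Event handling (add_local_listener), SPA POD format parameters,
    // stream.connect(), and actual frame delivery are omitted here; see the
    // pipewire-rs `video-src` example for a complete Video/Source node.

    // Blocks until the loop is quit; the real backend runs this on its own
    // thread so it does not block USB frame acquisition (point 4 above).
    mainloop.run();
    Ok(())
}
```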
### Troubleshooting

#### Virtual camera not appearing

1. Check that PipeWire is running:

   ```bash
   systemctl --user status pipewire
   ```

2. Watch the PipeWire daemon logs for errors:

   ```bash
   journalctl --user -f -u pipewire
   ```

3. Check the application's own logs for errors

#### Permission issues

Ensure your user has access to PipeWire:

```bash
# Add your user to the video group if needed
sudo usermod -a -G video $USER
```

#### Format compatibility

The virtual camera currently provides only the RGB24 format. If you need other formats (YUV420, MJPEG), the backend can be extended to support format negotiation.

## Stdout Video Output

The stdout backend writes raw video frames to stdout, allowing you to pipe the video stream to other tools such as PipeWire, FFmpeg, or custom video processing pipelines.

### How it works

The stdout backend:

1. Outputs raw video frames directly to stdout
2. Optionally includes frame metadata headers
3. Supports multiple header formats (Simple, JSON, Binary)
4. Can be piped to other tools for further processing

### Usage

#### 1. Basic stdout output

```bash
# Output raw JPEG frames to stdout
cargo run --bin geek-szitman-supercamera -- --backend stdout
```

#### 2. Pipe to a V4L2 loopback device

These pipelines decode the JPEG stream and write it to a V4L2 device; the target must be a v4l2loopback device (see the integration examples below).

```bash
# Pipe video output to a v4l2loopback device using ffmpeg
cargo run --bin geek-szitman-supercamera -- --backend stdout | \
  ffmpeg -f mjpeg -i pipe:0 -f v4l2 -pix_fmt yuv420p /dev/video0

# Or use GStreamer
cargo run --bin geek-szitman-supercamera -- --backend stdout | \
  gst-launch-1.0 fdsrc fd=0 ! jpegdec ! videoconvert ! v4l2sink device=/dev/video0
```

```bash
# Known-working end-to-end preview pipeline
RUST_LOG=off \
cargo run --release -- --backend stdout 2>/tmp/supercamera.log | \
  gst-launch-1.0 -v fdsrc do-timestamp=true ! \
    image/jpeg,framerate=30/1,width=640,height=480 ! \
    jpegparse ! jpegdec ! videoconvert ! videoscale method=lanczos ! \
    video/x-raw,width=1024,height=768 ! \
    queue max-size-buffers=1 leaky=downstream ! \
    fpsdisplaysink video-sink=waylandsink sync=false text-overlay=true
```

#### 3. Pipe to FFmpeg for recording

```bash
# Record video to a file
cargo run --bin geek-szitman-supercamera -- --backend stdout | \
  ffmpeg -f mjpeg -i pipe:0 -c:v libx264 -preset ultrafast -crf 23 output.mp4

# Stream to RTMP
cargo run --bin geek-szitman-supercamera -- --backend stdout | \
  ffmpeg -f mjpeg -i pipe:0 -c:v libx264 -preset ultrafast -crf 23 \
    -f flv rtmp://localhost/live/stream
```

#### 4. Custom video processing

```bash
# Process with a custom Python script
cargo run --bin geek-szitman-supercamera -- --backend stdout | \
  python3 process_video.py

# Process with a custom C++ application
cargo run --bin geek-szitman-supercamera -- --backend stdout | \
  ./video_processor
```

### Configuration Options

The stdout backend supports several configuration options.

#### Header formats

- **Simple**: `FRAME:size:timestamp\n` (human-readable)
- **JSON**: `{"frame": {"size": size, "timestamp": timestamp}}\n` (structured)
- **Binary**: 4-byte size + 8-byte timestamp (efficient)

#### Frame metadata

```bash
# Enable headers with the simple format
cargo run --bin geek-szitman-supercamera -- --backend stdout --config '{"include_headers": true, "header_format": "simple"}'

# Enable JSON headers for easier parsing
cargo run --bin geek-szitman-supercamera -- --backend stdout --config '{"include_headers": true, "header_format": "json"}'
```
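With the simple header format enabled, a downstream tool can frame the stream without scanning for JPEG boundaries. Below is a small, hypothetical Rust consumer for the `FRAME:size:timestamp\n` layout described above; it is not part of this repository, reads from stdin, and can sit at the end of any of the pipelines shown earlier. Adjust the parsing if the backend's header fields differ.

```rust
// Hypothetical standalone frame reader (not shipped with this project).
// Expects: "FRAME:<size>:<timestamp>\n" followed by <size> bytes of JPEG data.
use std::io::{self, BufRead, Read};

fn main() -> io::Result<()> {
    let stdin = io::stdin();
    let mut reader = stdin.lock();
    let mut line = String::new();

    loop {
        line.clear();
        // Read one "FRAME:size:timestamp" header line; 0 bytes means EOF.
        if reader.read_line(&mut line)? == 0 {
            break;
        }
        let mut parts = line.trim_end().split(':');
        match (parts.next(), parts.next(), parts.next()) {
            (Some("FRAME"), Some(size), Some(timestamp)) => {
                let size: usize = size
                    .parse()
                    .map_err(|e| io::Error::new(io::ErrorKind::InvalidData, e))?;
                // Read exactly `size` bytes of JPEG payload.
                let mut frame = vec![0u8; size];
                reader.read_exact(&mut frame)?;
                eprintln!("frame: {size} bytes, timestamp {timestamp}");
                // ... hand `frame` to a decoder or write it elsewhere ...
            }
            _ => eprintln!("unexpected header line: {:?}", line.trim_end()),
        }
    }
    Ok(())
}
```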
### Integration Examples

#### With v4l2loopback

```bash
# Load the v4l2loopback module and create a virtual video device
sudo modprobe v4l2loopback video_nr=10 card_label="SuperCamera" exclusive_caps=1

# Pipe video to the virtual device
cargo run --bin geek-szitman-supercamera -- --backend stdout | \
  ffmpeg -f mjpeg -i pipe:0 -f v4l2 -pix_fmt yuv420p /dev/video10
```

#### With OBS Studio

```bash
# Create a v4l2loopback device
sudo modprobe v4l2loopback video_nr=20 card_label="SuperCamera"

# Pipe video to the device
cargo run --bin geek-szitman-supercamera -- --backend stdout | \
  ffmpeg -f mjpeg -i pipe:0 -f v4l2 -pix_fmt yuv420p /dev/video20

# Add a "Video Capture Device" source in OBS and select /dev/video20
```

#### With custom video processing

```bash
# Example: extract I-frames for analysis
cargo run --bin geek-szitman-supercamera -- --backend stdout | \
  ffmpeg -f mjpeg -i pipe:0 -vf "select=eq(pict_type\,I)" -vsync vfr frame_%04d.jpg
```

### Advantages

- **Flexibility**: Pipe to any tool that accepts video input
- **Efficiency**: No intermediate video backend overhead
- **Compatibility**: Works with standard Unix tools and pipelines
- **Debugging**: Easy to inspect raw video data
- **Integration**: Simple to integrate with existing video workflows

### Use Cases

- **Video recording**: Pipe to FFmpeg for file output
- **Streaming**: Pipe to streaming services via FFmpeg
- **Video processing**: Pipe to custom video analysis tools
- **Debugging**: Inspect raw video data for troubleshooting
- **Integration**: Connect to existing video pipelines

## Building

```bash
# Build the project
cargo build

# Build with optimizations
cargo build --release

# Run tests
cargo test

# Check for issues
cargo clippy
```

## Dependencies

- **Rust**: 1.70+
- **PipeWire**: 0.3+ (with development headers)
- **System libraries**: `libpipewire-0.3-dev`

### Installing dependencies (Arch Linux)

```bash
sudo pacman -S pipewire pipewire-pulse pipewire-alsa pipewire-jack
sudo pacman -S base-devel pkg-config
```

### Installing dependencies (Ubuntu/Debian)

```bash
sudo apt install pipewire pipewire-pulse pipewire-alsa
sudo apt install libpipewire-0.3-dev pkg-config build-essential
```

## Development

### Project Structure

```
src/
├── video/
│   ├── pipewire.rs   # PipeWire virtual camera implementation
│   ├── v4l2.rs       # V4L2 backend (future use)
│   ├── stdout.rs     # Stdout video output backend
│   └── mod.rs        # Video backend abstraction
├── usb/              # USB communication
├── protocol/         # UPP protocol implementation
└── lib.rs            # Main library interface
```

### Adding new video formats

To add support for additional video formats:

1. Extend the `VideoFormat` enum in `src/video/mod.rs`
2. Update the PipeWire backend to handle format negotiation
3. Implement proper SPA POD creation for the new format

### Extending the virtual camera

The PipeWire backend is designed to be extensible:

- **Format negotiation**: Respond to client format requests
- **Buffer management**: Handle buffer allocation and deallocation
- **Frame pushing**: Implement actual video frame delivery
- **Metadata**: Add custom metadata support

## License

This project is licensed under the CC0-1.0 License - see the [LICENSE](LICENSE) file for details.

## Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

## Acknowledgments

- The PipeWire project for the excellent audio/video framework
- The Rust PipeWire bindings maintainers
- The original Geek Szitman SuperCamera project