A simple UDP-based video streaming tool for robotics applications where you need reliable video transmission over wireless networks.
If you've ever tried streaming camera data through ROS2's standard DDS transport over WiFi without additional configuration, you know it can be frustrating.
This tool captures video from your camera, compresses it to JPEG (if it isn't already), packages it into RTP packets, and sends it directly over UDP to a receiver. The receiver then publishes the reconstructed video to a standard ROS2 image topic, so the rest of your system doesn't know the difference.
The sender connects to any UVC-compatible camera using V4L2. It checks if your camera supports MJPEG natively - if so, great, we use that directly. If not, we grab YUYV frames and compress them to JPEG using TurboJPEG.
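As a rough illustration of that negotiation (not the tool's actual source), the V4L2 flow looks something like this: request MJPEG, and if the driver substitutes another format, fall back to YUYV:

```cpp
// Sketch only: MJPEG-vs-YUYV negotiation with raw V4L2 ioctls.
// Error handling is trimmed for brevity; resolution values are examples.
#include <fcntl.h>
#include <unistd.h>
#include <sys/ioctl.h>
#include <linux/videodev2.h>
#include <cstdio>

int main() {
    int fd = open("/dev/video0", O_RDWR | O_NONBLOCK);
    if (fd < 0) { perror("open"); return 1; }

    // Ask the driver for MJPEG first; if it refuses or substitutes another
    // format, fall back to YUYV (which the sender would then compress).
    v4l2_format fmt{};
    fmt.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    fmt.fmt.pix.width = 1280;
    fmt.fmt.pix.height = 720;
    fmt.fmt.pix.pixelformat = V4L2_PIX_FMT_MJPEG;
    fmt.fmt.pix.field = V4L2_FIELD_NONE;

    if (ioctl(fd, VIDIOC_S_FMT, &fmt) < 0 ||
        fmt.fmt.pix.pixelformat != V4L2_PIX_FMT_MJPEG) {
        fmt.fmt.pix.pixelformat = V4L2_PIX_FMT_YUYV;
        if (ioctl(fd, VIDIOC_S_FMT, &fmt) < 0) { perror("VIDIOC_S_FMT"); close(fd); return 1; }
        printf("Camera has no native MJPEG; using YUYV + TurboJPEG\n");
    } else {
        printf("Using native MJPEG from the camera\n");
    }

    close(fd);
    return 0;
}
```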
Each JPEG frame gets chunked into UDP packets (respecting MTU limits), wrapped with RTP headers for proper sequencing, and transmitted. The receiver collects these packets, reassembles the JPEG frames, decompresses them, and publishes them as standard ROS2 sensor_msgs/Image messages.
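To make the wire format concrete, here is a rough sketch of how a JPEG frame can be split into MTU-sized RTP packets. The 12-byte header layout follows the generic RTP header from RFC 3550, but the payload type, MTU budget, and function names are illustrative assumptions, not necessarily what the tool uses:

```cpp
// Sketch only: chunk one JPEG frame into RTP-framed UDP payloads.
// kMaxPayload and the payload type (26, JPEG) are illustrative choices.
#include <algorithm>
#include <cstdint>
#include <cstring>
#include <vector>
#include <arpa/inet.h>

constexpr size_t kMaxPayload = 1400;  // stay under a typical 1500-byte MTU

std::vector<std::vector<uint8_t>> packetize(const uint8_t *jpeg, size_t size,
                                            uint16_t &seq, uint32_t ts,
                                            uint32_t ssrc) {
    std::vector<std::vector<uint8_t>> packets;
    for (size_t off = 0; off < size; off += kMaxPayload) {
        size_t chunk = std::min(kMaxPayload, size - off);
        bool last = (off + chunk == size);

        std::vector<uint8_t> pkt(12 + chunk);
        pkt[0] = 0x80;                       // RTP version 2, no padding/extension
        pkt[1] = (last ? 0x80 : 0x00) | 26;  // marker bit on the last packet of a frame
        uint16_t s = htons(seq++);
        uint32_t t = htonl(ts), id = htonl(ssrc);
        std::memcpy(&pkt[2], &s, 2);         // sequence number for reordering/loss detection
        std::memcpy(&pkt[4], &t, 4);         // timestamp (same for all packets of one frame)
        std::memcpy(&pkt[8], &id, 4);        // SSRC stream identifier
        std::memcpy(&pkt[12], jpeg + off, chunk);
        packets.push_back(std::move(pkt));
    }
    return packets;
}
```

The receiver uses the sequence numbers to detect gaps and the marker bit to know when a full JPEG frame has arrived before decompressing it.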
Required dependencies:
```bash
sudo apt update
sudo apt install -y build-essential cmake pkg-config libturbojpeg0-dev linux-libc-dev
```

You'll also need ROS2 installed and sourced. This has been tested with ROS2 Humble, but I believe it should work with any ROS2 distro.
Optional (not needed in the current setup; OpenCV may be used in the future if I remove the ROS2 dependency and create a standalone receiver):
```bash
sudo apt install -y libopencv-dev
```

Create a workspace and build:
```bash
mkdir faststreaming_ws && cd faststreaming_ws
git clone <this repo>
cd FastVideoStreaming
mkdir build && cd build
cmake -DCMAKE_EXPORT_COMPILE_COMMANDS=ON -DWITH_ROS2=ON ..
make

# If you want debug output:
cmake -DCMAKE_EXPORT_COMPILE_COMMANDS=ON -DWITH_ROS2=ON -DENABLE_DEBUG=ON ..
make
```

This builds two executables: sender and receiver.
```bash
./sender --camera /dev/video0 --dst 192.168.1.100:5004 --width 1280 --height 720 --fps 30
```

Replace 192.168.1.100 with the IP address of your receiving machine.
```bash
./receiver --listen :5004 --topic /camera/image_raw
```

Now you can use the video stream like any other ROS2 camera:

```bash
ros2 topic echo /camera/image_raw
rviz2
```

NOTE: You may need to set the QoS reliability to 'Best Effort'.
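If you consume the topic from your own node, the standard ROS2 sensor-data QoS profile already uses best-effort reliability. Here's a minimal subscriber sketch, assuming the receiver publishes with a compatible best-effort profile; the node name and callback are illustrative:

```cpp
// Sketch: subscribing to the stream with a Best Effort QoS profile.
#include <rclcpp/rclcpp.hpp>
#include <sensor_msgs/msg/image.hpp>

int main(int argc, char **argv) {
  rclcpp::init(argc, argv);
  auto node = rclcpp::Node::make_shared("stream_consumer");

  // SensorDataQoS = best-effort reliability with a small keep-last history,
  // which is the usual choice for camera streams.
  auto sub = node->create_subscription<sensor_msgs::msg::Image>(
      "/camera/image_raw", rclcpp::SensorDataQoS(),
      [&](sensor_msgs::msg::Image::ConstSharedPtr msg) {
        RCLCPP_INFO(node->get_logger(), "frame %ux%u", msg->width, msg->height);
      });

  rclcpp::spin(node);
  rclcpp::shutdown();
  return 0;
}
```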
This is a practical tool, not a comprehensive video streaming solution. Current limitations:
- Limited camera support: Only works with UVC cameras right now. ZED cameras, RealSense, or other cameras that need special SDKs won't work.
- No H.264 encoding: Planned for a future release.
- No encryption or authentication: This is plain UDP. Don't use it over untrusted networks.
- No automatic bitrate/fps adaptation: If your network gets congested, you'll see packet drops. The tool won't automatically reduce quality or adjust the FPS according to packet drop rate.
- Linux only: Uses V4L2, so it's tied to Linux systems.
- No audio: Video only.
The current version includes several optimizations for low-latency streaming:
- Zero-copy networking: Uses scatter-gather I/O to avoid copying JPEG data multiple times (see the sketch after this list)
- Non-blocking sockets: Prevents the sender from freezing during network congestion
- Event-driven timing: Responds to camera readiness instead of artificial frame rate limiting
- Monotonic RTP timestamps: Prevents timestamp drift that causes A/V sync issues
- Direct decompression: Receiver decompresses JPEG directly into ROS message buffers
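In practice, the zero-copy point comes down to scatter-gather I/O: the RTP header and the JPEG slice are handed to the kernel as separate iovec entries in a single sendmsg() call, so the payload is never copied into a combined packet buffer. Below is a minimal sketch under that assumption; it is illustrative, not the tool's actual code:

```cpp
// Sketch only: scatter-gather send of one RTP packet, assuming a 12-byte
// RTP header in `hdr` and a slice of the JPEG buffer in `payload`/`len`.
#include <sys/socket.h>
#include <sys/uio.h>
#include <cstdint>
#include <cstddef>

ssize_t send_rtp_packet(int sock, const uint8_t hdr[12],
                        const uint8_t *payload, size_t len) {
    iovec iov[2];
    iov[0].iov_base = const_cast<uint8_t *>(hdr);      // RTP header buffer
    iov[0].iov_len = 12;
    iov[1].iov_base = const_cast<uint8_t *>(payload);  // JPEG slice, not copied
    iov[1].iov_len = len;

    msghdr msg{};
    msg.msg_iov = iov;
    msg.msg_iovlen = 2;

    // Socket is assumed connected and non-blocking; on EAGAIN the caller can
    // drop or retry the packet rather than stall the capture loop.
    return sendmsg(sock, &msg, 0);
}
```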
I'd like to add (contributions welcome):
- Adaptive bitrate: Automatically adjust quality based on network conditions
- SDK camera support: Integrate ZED, RealSense, and other specialized cameras
- Library API: Turn this into a proper library instead of just command line tools
- Network resilience: Better handling of packet loss and network changes
- Multi-stream support: Handle multiple cameras simultaneously
This started as a practical solution to a real problem I was having with robot video streaming. If you're facing similar issues, I'd love to hear about your use case or see improvements you've made.
The code is straightforward C++ - no fancy frameworks or complex abstractions. If you can read the CMakeLists.txt, you can probably contribute.
