Update 2022-04-03: Since I published this post, Raspbian has evolved into Raspberry Pi OS, currently at version Bullseye. There, the old camera stack was replaced by libcamera, and the old commands like raspivid and raspistill are now deprecated. If you want to check out the new commands, see Raspberry Pi Streaming update: Raspberry Pi OS.


The introduction of the Raspberry Pi High Quality Camera in 2020 was a perfect excuse for me to acquire new hardware. After some initial tests and use as a toy cam, I wanted to build a high quality streaming unit that I could place somewhere nice and watch the images at home. Of course you could buy something like that off the shelf. But who does that when you can go through the whole development process by yourself and end up with an inferior product at a much higher cost?

I had some initial problems getting the stream to work, but I found working setups for three media tools that I want to share.

Raspberry Pi HQ Cam Capabilities

The Raspberry Pi HQ Camera is a module for the Raspberry Pi that is connected via CSI and takes interchangeable lenses in C or CS mount. It uses the Sony IMX477 sensor with 12 MPix. There are lots of reviews around on Google, so I will focus on the aspects that are important for streaming. The sensor is capable of a 4k image at 60fps. Unfortunately, the connection to the Raspberry Pi only uses 2 CSI lanes, which limits the bandwidth and makes 4k with reasonable fps impossible. If you want to try 4k anyway, you should first read the debate on the limits of the module on the Raspberry Pi board. Also have a look at the respective kernel driver.

For me, the theoretical limit is irrelevant anyway. To reduce bandwidth I will use the built-in h264 hardware encoder, which can only do 1080p30.
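A quick back-of-the-envelope calculation shows why hardware encoding is non-negotiable here: uncompressed 1080p30 would saturate any home network link.

```shell
# Uncompressed 1080p30 in YUV 4:2:0 uses 12 bits per pixel:
awk 'BEGIN { printf "%.0f Mbit/s raw\n", 1920 * 1080 * 30 * 12 / 1e6 }'
```

Compare that to the roughly 15 Mbit/s the h264 encoder produces in the examples below.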

So it's more important to find out how to get the most out of 1080p. In particular, we should have a look at the sensor modes. Here is a table of the available modes according to the official documentation:

Mode  Size       Aspect Ratio  Frame Rates  FOV      Binning/Scaling
0     automatic selection
1     2028x1080  169:90        0.1-50fps    Partial  2x2 binned
2     2028x1520  4:3           0.1-50fps    Full     2x2 binned
3     4056x3040  4:3           0.005-10fps  Full     None
4     1332x990   74:55         50.1-120fps  Partial  2x2 binned

Depending on the tool and output resolution you use, the sensor will output a cropped image. In that case you hit one of the modes with partial FOV.
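You can check which formats, resolutions and frame rates the driver actually exposes with v4l2-ctl from the v4l-utils package (/dev/video0 is the usual device node for the camera on the old camera stack):

```shell
# List pixel formats with their supported frame sizes and intervals
v4l2-ctl -d /dev/video0 --list-formats-ext
```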

The camera module has a CS mount and ships with an adapter for C-mount lenses. With other adapters it's possible to use a wide selection of third-party lenses.

Streaming

I tested the HQ camera module on a Raspberry Pi 3B+ with the official 6mm lens. For everyone who thinks 6mm is not enough: due to the crop factor of 5.6 we end up with a moderate wide angle of 33.6mm in full-frame terms. This is because of the small sensor size of 1/2.3”.
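The equivalent focal length is simply the actual focal length multiplied by the crop factor:

```shell
# Full-frame equivalent = focal length x crop factor (~5.6 for a 1/2.3" sensor)
awk 'BEGIN { printf "%.1f mm\n", 6 * 5.6 }'
```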

My target was to stream from a headless Raspberry Pi to my MacBook on the same network. I used the raspivid, vlc and ffmpeg packages from the Raspi repositories and VLC on the MacBook.

Probably helpful hints

To make ffmpeg in particular work reliably, I had to increase the video RAM. Otherwise I would get strange error messages like this:

ffmpeg: ioctl(VIDIOC_STREAMON): Operation not permitted

When one of these errors occurred, I had to completely reboot the Raspi. Just reloading the kernel module for the camera didn't work.
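For reference, the reload that did not help looked like this (bcm2835-v4l2 being the V4L2 driver of the old camera stack):

```shell
# Unload and reload the camera driver -- this was NOT enough to recover
# from the VIDIOC_STREAMON error; only a full reboot cleared it.
sudo modprobe -r bcm2835-v4l2
sudo modprobe bcm2835-v4l2
```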

To increase the video RAM, add the following snippet to the boot configuration file:

/boot/config.txt
gpu_mem=256
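After a reboot you can verify that the new memory split is active:

```shell
# Query the memory assigned to the GPU (prints e.g. "gpu=256M")
vcgencmd get_mem gpu
```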

raspivid

Raspivid is the canonical choice for anything related to the Raspberry Pi cameras, and it has extensive documentation.

To start a stream, do this:

raspivid -t 0 -l -o tcp://0.0.0.0:3333

You will be able to access it with VLC using the following URL:

tcp/h264://tnglab.fritz.box:3333
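If you prefer ffplay on the viewing machine, the raw h264 elementary stream needs an explicit format hint (hostname as above, adjust to your network):

```shell
# raspivid emits raw h264 without a container, so tell ffplay the format
ffplay -f h264 tcp://tnglab.fritz.box:3333
```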

Raspivid has a lot of possible settings, including the video mode. In the following example we choose a 720p resolution in sensor mode 1, a bitrate of 15 Mbit/s, h264 encoding with level 4.2, profile high, quantization parameter 25, and flush the buffers to reduce latency.

raspivid -t 0 -l -md 1 -w 1280 -h 720 -b 15000000 -pf high -lev 4.2 -qp 25 -fl -fps 30 -o tcp://0.0.0.0:3333

Among the tools that I tested raspivid is the only one that can explicitly set the sensor mode.

That being said, even with a reduced resolution, bitrate and buffer flushing, I never got a fluid stream out of raspivid. I experienced a lot of stuttering and dropped frames. Maybe an upgrade to a Raspberry Pi 4 would help.

Apart from the dropped frames and intermittent stuttering, the overall lag was around 3.5 seconds.

cvlc

With VLC the lag was a bit worse at around 4 seconds, but I could achieve a fluid stream. VLC has its own HTTP streaming server built in. Below is the raspivid command line translated to VLC syntax:

cvlc v4l2:///dev/video0 --v4l2-fps 30 --v4l2-width 1280 --v4l2-height 720 --v4l2-chroma h264 --sout '#standard{access=http,mux=ts,dst=0.0.0.0:3333}'

You can then access it in VLC via this URL:

http://tnglab.fritz.box:3333

In VLC you cannot explicitly select the sensor mode. As stated earlier, this might lead to a cropped image depending on your choice of resolution and fps. However, for standard resolutions like 1080p30 or 720p30, the whole sensor is being used.
Ok, not quite the whole sensor: a few pixels on the sides will be missing. And of course there is the crop from 4:3 to 16:9.

Using v4l2-ctl we can modify the camera settings like ISO or color modes before or during the stream. Unfortunately it’s not possible to pass the parameters in the cvlc call.
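For example (control names vary between driver versions, so list them first):

```shell
# Show all controls the driver exposes, with ranges and current values
v4l2-ctl -d /dev/video0 --list-ctrls
# Then adjust one while the stream is running, e.g. rotate the image
v4l2-ctl -d /dev/video0 --set-ctrl=rotate=180
```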

ffmpeg

ffmpeg is the Swiss army knife of media tools, and of course it can also do streaming. Unfortunately, in my experience it does so at the same level as many other things: mediocre. It can stream, but it does not provide its own server; rather, it sends the stream to another location where it has to be processed. Alternatively, you can have it generate the files for an HLS stream, but then you have to host those files yourself. It can also send a fluid Full HD stream to a player, but somehow VLC can't cope with the stream, only ffmpeg's sister tool ffplay can.
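As a sketch of that HLS variant (the output directory hls/ and port 8080 are arbitrary choices of mine): ffmpeg writes the playlist and segment files, and any static file server can host them, e.g. Python's built-in one.

```shell
# Write an HLS playlist plus rolling 2-second segments to ./hls
mkdir -p hls
ffmpeg -f v4l2 -input_format h264 -video_size 1280x720 -framerate 30 \
       -i /dev/video0 -c:v copy -an \
       -f hls -hls_time 2 -hls_list_size 5 -hls_flags delete_segments \
       hls/stream.m3u8 &
# Serve the directory over HTTP (Python 3.7+ for --directory)
python3 -m http.server 8080 --directory hls
```

A player can then open http://tnglab.fritz.box:8080/stream.m3u8, at the cost of the extra latency inherent to HLS segmentation.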

FFmpeg MPEG-TS Stream

Anyway, starting the stream in MPEG-TS format works like this:

ffmpeg -v verbose -f v4l2 -input_format h264 -video_size 1920x1080 -framerate 30 -i /dev/video0 -c:v copy -an -f mpegts udp://10.42.2.5:3334

Where the IP address is that of my MacBook, the target of the stream.

And you can watch it using ffplay like this:

ffplay udp://tnglab.fritz.box:3334
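ffplay buffers quite a bit by default; these flags reduce the player-side share of the latency:

```shell
# Disable input buffering, signal low-delay decoding, drop late frames
ffplay -fflags nobuffer -flags low_delay -framedrop udp://tnglab.fritz.box:3334
```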

FFmpeg RTMP Stream

If you want to stream via RTMP instead use the following command line:

ffmpeg -f v4l2 -input_format h264 -video_size 1920x1080 -framerate 24 -i /dev/video0 -c:v copy -an -f flv rtmp://10.8.1.111:3334/live/stream

FFmpeg will most likely complain about missing timings; we can fix that with the -ts abs switch. Also, the stream won't really be real-time yet, but rather laggy. Here is an example with all the flags that I found helpful:

ffmpeg -v verbose -fflags nobuffer -hwaccel auto -f v4l2 -input_format h264 -ts abs -video_size 1920x1080 -framerate 24 -i /dev/video0 -c:v copy -an -rtmp_live 1 -tcp_nodelay 1 -f flv -flvflags +no_duration_filesize+no_sequence_end -bufsize 20M rtmp://10.8.1.111:3334/live/stream

The real power of ffmpeg is of course its vast collection of included filters, encoders, muxers and so on. I guess I will have to play around with them for a later post.
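As a small teaser of what filters could look like here, hedged heavily: filtering forces a re-encode, so -c:v copy has to go, and on a Pi 3 that is only feasible with the OMX hardware encoder. The sketch below assumes an ffmpeg build that ships both h264_omx and drawtext support, which was the case for the Raspbian packages of the time.

```shell
# Hypothetical filter example: burn the local time into the picture.
# Raw frames are pulled from the camera, filtered, and re-encoded in
# hardware via h264_omx instead of being passed through with -c:v copy.
ffmpeg -f v4l2 -video_size 1280x720 -framerate 30 -i /dev/video0 \
       -vf "drawtext=text='%{localtime}':x=10:y=10:fontcolor=white" \
       -c:v h264_omx -b:v 4M -an -f mpegts udp://10.42.2.5:3334
```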

Conclusion

As far as the streaming unit goes: I still haven't deployed it. The quality of the 6mm kit lens is far too poor for the effort. But I learned a lot about streaming and the Raspberry Pi HQ Cam, and maybe I can resolve the quality issues with a better lens some time in the future.

When it comes to the streaming solutions presented above, all of them are flawed. Just pick the one that suits your use case or combine them as necessary.