megi's PinePhone Development Log RSS

2022-05-29: Pinephone Pro camera pipeline testing app

I've been playing with the Pinephone Pro cameras a bit more and found a few issues. Things mostly work, though.

Here are some videos from my experiments:

The capture happens at the sensors' full resolution at ~30 fps. I had to bump up the exposure time of the selfie camera to 3 frames to get a reasonable image in the dark, but it can also do preview at 30 fps.

Some recent changes in my kernel

A bit of news about my camera app

Here is the --help output:

Usage: ppp-photo [--sensor selfie|world] [--focus <value>] [--flash]
                 [--exposure <value>] [--again <value>] [--dgain <value>]
                 [--mode <mode>] [--test <pattern>] [--delay-shot <seconds>]
                 [--format bayer|tiff|ppm|jpg|webp|png] [--output <path>]
                 [--quality <quality>] [--gui]

Sensor options:
  -s, --sensor <type>    Which sensor to use: selfie or world
      --focus <value>    Focus (values 0.0 (far) - 1.0 (near))
      --flash            Use flash LED during a shot
      --exposure <value> Exposure time (1.0 = time it takes to scan a frame)
      --again <value>    Analogue gain (1.0 = no gain)
      --dgain <value>    Digital gain (1.0 = no gain)
      --mode <mode>      Select a different sensor output mode (see below)
      --test <pattern>   Output sensor test pattern (values 1 - 4)

  Pinephone Pro Selfie Camera:
    - mode 1: 4032 x 3024
    - mode 2: 2104 x 1560
    - mode 3: 1048 x 780

  Pinephone Pro World Camera:
    - mode 1: 3264 x 2448
    - mode 2: 1632 x 1224

UI options:
      --delay-shot <seconds>
                         Wait <seconds> before taking a shot
      --gui              Run in interactive GUI mode
      --setup-only       Setup the media pipeline and print paths to
                         various device files for the selected sensor
                         without doing any capture (useful for scripting).

Output options:
  -o, --output <path>    Specify path where you want to store the pictures
                         (When using --gui mode a ".####" number will be
                         appended to the path)
  -f, --format <format>  Specify format of picture files:
                         - bayer - raw data from the sensor (in bayer format)
                         - tiff - raw TIFF6 data from the sensor (in bayer format)
                         - ppm - debayered uncompressed PPM image
                         - jpg - JPEG (requires ImageMagick)
                         - webp - WebP (requires ImageMagick)
                         - png - PNG (requires ImageMagick)
                         (Append .zst to bayer/ppm formats for Zstandard
                          compression)
      --quality <value>  Specify jpg/webp compression quality (values 1 - 100)

Misc options:
  -v, --verbose          Show details of what's going on.
  -h, --help             This help.

Pinephone Pro photo shooting tool 0.1
Written by Ondrej Jirman <>, 2022

The app is available as a pre-built static binary in this repository. The code is not available at this time.

You can use it to test the cameras on your phone before a proper end-user app is developed, either based on libcamera or by extending the Megapixels app.
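As background for the ppm output option: debayering reconstructs an RGB image from the single-channel color-filter-array data the sensor produces. Here's a minimal Python sketch of the idea using a naive 2×2 method; the actual CFA layout of these sensors and the algorithm the app uses are my assumptions, not details from this post.

```python
def debayer_rggb_2x2(raw, width, height):
    """Very naive debayer: collapse each RGGB 2x2 quad into one RGB pixel.

    raw is a flat sequence of single-channel samples, row-major.
    Output is a (width//2) x (height//2) list of RGB triples.  This
    halves the resolution; real debayering interpolates to keep full
    resolution.
    """
    out = []
    for y in range(0, height, 2):
        for x in range(0, width, 2):
            r = raw[y * width + x]
            g1 = raw[y * width + x + 1]
            g2 = raw[(y + 1) * width + x]
            b = raw[(y + 1) * width + x + 1]
            out.append((r, (g1 + g2) // 2, b))
    return out

# One 2x2 sensor tile: R=100, G=50/70, B=200 -> one RGB pixel
print(debayer_rggb_2x2([100, 50, 70, 200], 2, 2))  # [(100, 60, 200)]
```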

Power consumption

Sensors + ISP + image rotation consume an additional 850 mW for the world sensor and 600 mW for the selfie sensor. CPU use is <1% during the tests, because my test app is written in such a way that no processing happens on the CPU (zero-copy buffer sharing between ISP<->RGA and RGA<->DRM).

Hardware issues I've found

While trying out the flash LED, I noticed that flash mode is not really different from torch mode, and that torch mode consumes an excessive amount of power (1.5 W). The LED is rated for 150 mA, yet it's clearly driven at a much higher current (~3× the rated current in torch mode), which will degrade the LED and eventually kill it if it's used for more than brief periods of time.
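A quick sanity check of that estimate, assuming a typical white-LED forward voltage of about 3.3 V (my assumption; the post doesn't state it) and ignoring driver losses:

```python
# Rough sanity check of the torch-mode LED current.  The forward
# voltage is an assumed typical value, not a measurement from the post.
torch_power_w = 1.5          # measured torch-mode consumption
led_forward_voltage_v = 3.3  # assumed typical white-LED forward voltage
rated_current_ma = 150       # the LED's rated current

estimated_current_ma = torch_power_w / led_forward_voltage_v * 1000
print(f"~{estimated_current_ma:.0f} mA, "
      f"{estimated_current_ma / rated_current_ma:.1f}x the rating")
# → ~455 mA, 3.0x the rating
```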

The GPIO that should switch between flash and torch modes seems to have no effect.

Also, the driver chip used on the PPP is not the same as the one on the PP. It's somewhat compatible, except that it supports an extra feature: by toggling the enable pin a number of times, it's possible to select flash intensity and duration. (Not that it matters much, since flash mode doesn't work at all.)

That may be because the flash control signal is not connected, or is connected to a different GPIO.

The sensor driver for the selfie camera doesn't load calibration data from the OTP, because the data is probably not there. This is somewhat unfortunate.

Software issues I've found

I use the Rockchip RGA to rotate the picture from the camera to match the orientation of the screen, and to convert the pixel format from BGRX to an RGB format supported by Rockchip's VOP/DRM driver.

I noticed that the image on the display had the R and B components swapped, so I searched for the culprit, and it turned out to be the RGA driver. When converting between BGR and RGB it swaps the colors correctly, but when converting from BGRX to RGB it just drops the X byte without swapping the colors.
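In plain Python, the per-pixel logic of the correct conversion looks like this (the RGA of course does this on whole frames in hardware; this sketch is just to illustrate the channel swap the buggy path skips):

```python
def bgrx_to_rgb(buf: bytes) -> bytes:
    """Correct BGRX -> RGB: drop the X byte and swap B and R.

    The buggy RGA path effectively emitted buf[i:i+3] for each pixel,
    i.e. it dropped X without swapping, leaving red and blue exchanged.
    """
    out = bytearray()
    for i in range(0, len(buf), 4):
        b, g, r = buf[i], buf[i + 1], buf[i + 2]  # buf[i + 3] is X (padding)
        out += bytes((r, g, b))
    return bytes(out)

# One blue-ish BGRX pixel: B=200, G=10, R=30, X=0
pixel = bytes((200, 10, 30, 0))
print(tuple(bgrx_to_rgb(pixel)))  # (30, 10, 200) — R, G, B order
```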

Hardware JPEG encoding

I've noticed that Rockchip's Hantro-based HW video encoder can encode YUV422 to JPEG, and that the mainline driver already supports this. So I decided it may be a good idea to use HW encoding for JPEG output, because it may be faster and more energy-efficient than doing the same on the CPU.

I've measured the encoding speed for 4032×3024 images, and it is 84 ms per frame. For full HD resolution it is 14-15 ms. This makes it possible to encode FullHD MJPEG video in real time at 60 FPS, and to significantly speed up the encoding of full-resolution photos, saving them at a rate of ~10 photos per second continuously.
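Taking the midpoint of the 14-15 ms measurement, those timings translate into throughput like this:

```python
# Back-of-the-envelope throughput from the measured HW JPEG encode times.
full_res_ms = 84    # one 4032x3024 frame
full_hd_ms = 14.5   # midpoint of the measured 14-15 ms per 1920x1080 frame

full_hd_fps = 1000 / full_hd_ms
full_res_fps = 1000 / full_res_ms
print(f"FullHD: {full_hd_fps:.0f} fps, full-res: {full_res_fps:.1f} photos/s")
# → FullHD: 69 fps, full-res: 11.9 photos/s
```

So FullHD comfortably clears the 60 FPS bar, and full-resolution photos land around the ~10/s figure once some per-frame overhead is accounted for.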

HW encoding is roughly 10× faster than using ImageMagick to encode PPM files to JPEG.

You can test HW encoding speed using gstreamer:

strace -f -e ioctl -tt \
gst-launch-1.0 videotestsrc num-buffers=2 ! \
  video/x-raw,width=1920,height=1080 ! v4l2jpegenc ! filesink location=test.jpg

Next steps

Now that I have basic live preview from the sensor, I can start playing with the configuration of the ISP and dynamic updates of exposure/gains and other parameters based on statistics calculated by the ISP. The important parts to get right will be color calibration for both sensors and lens shade correction for the selfie sensor. The world sensor seems to have working lens shade correction.
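The dynamic exposure/gain updates amount to a feedback loop driven by the ISP's brightness statistics. A toy sketch of one controller step in Python; the target value, step size, and limits here are made-up numbers for illustration, not the real ISP interface:

```python
def auto_exposure_step(mean_luma, exposure, target=0.45, step=0.1,
                       min_exp=0.01, max_exp=3.0):
    """One iteration of a naive AE loop: nudge exposure toward the
    target mean brightness reported by the ISP statistics.  All the
    constants are made-up illustration values."""
    if mean_luma <= 0:
        return max_exp                      # pitch black: go to max exposure
    error = target / mean_luma              # >1 means underexposed
    new_exposure = exposure * (1 + step * (error - 1))
    return min(max(new_exposure, min_exp), max_exp)

# Toy simulation where the measured luma is proportional to exposure:
# the loop settles so that the scene brightness hits the target.
exp = 0.2
for _ in range(50):
    luma = 0.9 * exp                        # pretend scene response
    exp = auto_exposure_step(luma, exp)
print(round(0.9 * exp, 2))                  # → 0.45 (the target)
```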

Live preview is also very useful for debugging sensor controls and testing effects of various register values on sensor output. Seeing the effect of changes right away is extremely important for the ease of development and testing.

I intend to add some touch controls to the camera app's --gui mode, to manually control exposure/gain/focus/flash and various ISP options.

I'll try to integrate mp4/mjpeg recording into my app once I figure out the MP4 container format. :) And I'll definitely use the HW-accelerated JPEG encoder for encoding captured images at full resolution, because the speedup is very significant.

It should be possible to record MJPEG video in real time with very little CPU use, and thus at reasonable power consumption. I've already tested JPEG HW encoding in my app, so this should be rather straightforward.