
I'm working on a project that processes images in YUV420 format coming from a camera connected over Ethernet.

During development we don't have constant access to the camera, so I want to emulate its behaviour.

My idea is to take a video and convert it from MP4 to raw YUV with ffmpeg:

ffmpeg -i input.mp4 -pix_fmt yuv420p output.yuv
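
As a quick sanity check on the converted file (a hypothetical snippet; it assumes the video really is 960x540 yuv420p), the file size should be a whole multiple of width*height*3/2 bytes:

# In Python
import os

W, H = 960, 540
frame_size = W * H * 3 // 2              # 777600 bytes per yuv420p frame
n_bytes = os.path.getsize('output.yuv')
print(n_bytes / frame_size, 'frames')    # should be a whole number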

Then, stream the video in an infinite loop. This works:

ffmpeg -stream_loop -1 -f rawvideo -s 960x540 -r 30 -pix_fmt yuv420p -i output.yuv -f mpegts udp://127.0.0.1:23000
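
As a side note on the receiving end (just a sketch with assumed parameters, not something I have running): since each yuv420p frame at 960x540 is exactly 960*540*3/2 bytes, one option would be to let an ffmpeg child process turn the mpegts stream back into rawvideo and read it frame by frame:

# In Python
import subprocess
import numpy as np

W, H = 960, 540
FRAME_SIZE = W * H * 3 // 2  # yuv420p: 1.5 bytes per pixel

proc = subprocess.Popen(
    ['ffmpeg', '-i', 'udp://127.0.0.1:23000',
     '-f', 'rawvideo', '-pix_fmt', 'yuv420p', '-'],
    stdout=subprocess.PIPE)

while True:
    buf = proc.stdout.read(FRAME_SIZE)
    if len(buf) < FRAME_SIZE:
        break
    frame = np.frombuffer(buf, dtype=np.uint8).reshape(H * 3 // 2, W)
    # ... split into Y/U/V planes and process ...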

I managed to read (and show) the images with:

# In Python
import cv2

cap = cv2.VideoCapture('udp://127.0.0.1:23000', cv2.CAP_FFMPEG)
ret, frame = cap.read()  # frame.shape is (540, 960, 3)
...

However, the frames I get have shape (540, 960, 3), whereas I expected the YUV420 layout: either a single (540*3/2, 960) plane, or three separate planes of 540x960 (Y), 270x480 (U) and 270x480 (V).
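
To spell out the layout I mean (a minimal numpy sketch using the sizes above), a (540*3/2, 960) yuv420p frame splits into its planes like this:

# In Python
import numpy as np

W, H = 960, 540
raw = np.zeros((H * 3 // 2, W), dtype=np.uint8)   # placeholder yuv420p frame

y = raw[:H, :]                                    # (540, 960)
u = raw[H:H + H // 4, :].reshape(H // 2, W // 2)  # (270, 480)
v = raw[H + H // 4:, :].reshape(H // 2, W // 2)   # (270, 480)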

How can I get the "raw" format? I'm not sure whether it is ffmpeg or OpenCV that is converting the stream.

  • If you want the raw YUV data, you need to browse the VideoCapture docs for its "CONVERT_RGB" flag. Its name is an accident: it's not about RGB, it's about converting from the source colour space (YUV or whatever) into OpenCV's preferred BGR, or dumping the source data (after decompression, of course). See the first sketch after these comments. Commented Apr 11 at 18:03
  • @Ivan "How can I get the 'raw' format?" Just read the file's bytes into an array; each byte (integer) is part of the YUV values. What's the point, though? Is your processing a custom function that works only on the Y and UV values?
    – VC.One
    Commented Apr 11 at 19:13
  • @VC.One Just reading the file would give me the data I need, but I want to stream it to emulate the camera conditions. And yes, I need the three channels (Y, U, V) separated for my application.
    – Ivan
    Commented Apr 12 at 9:09
  • If you only care about playback speed, then just read from the file while keeping pace; see the second sketch after these comments. Commented Apr 12 at 9:51
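
Following the first comment, here is a sketch of disabling OpenCV's automatic conversion (whether this actually hands back the planar yuv420p buffer depends on the OpenCV version and the FFmpeg backend, so treat it as an experiment rather than a guaranteed API):

# In Python
import cv2

cap = cv2.VideoCapture('udp://127.0.0.1:23000', cv2.CAP_FFMPEG)
cap.set(cv2.CAP_PROP_CONVERT_RGB, 0)   # 0 = keep the source data, skip BGR conversion

ret, frame = cap.read()
if ret:
    print(frame.shape, frame.dtype)    # ideally (810, 960) uint8 for yuv420p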
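
And a sketch of the last suggestion, reading output.yuv directly while keeping pace with the 30 fps rate (resolution, rate and looping behaviour are taken from the ffmpeg commands above):

# In Python
import time
import numpy as np

W, H, FPS = 960, 540, 30
FRAME_SIZE = W * H * 3 // 2            # bytes per yuv420p frame

with open('output.yuv', 'rb') as f:
    while True:
        start = time.monotonic()
        buf = f.read(FRAME_SIZE)
        if len(buf) < FRAME_SIZE:
            f.seek(0)                  # loop the file like -stream_loop -1
            continue
        frame = np.frombuffer(buf, dtype=np.uint8).reshape(H * 3 // 2, W)
        # ... split into Y/U/V planes and process ...
        time.sleep(max(0, 1 / FPS - (time.monotonic() - start)))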
