Recently, I came across YUV420 image data whilst working with a hardware H264 compression card. The image data was planar, arranged as in the image below. This is the standard YUV420 planar format, with the U and V components at half the resolution of the Y component in each dimension.
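For a 768x576 frame that layout works out as a full-resolution Y plane followed by two quarter-size chroma planes in one contiguous buffer. A quick sketch of my own (not code from the card's SDK) to make the offsets concrete:

    #include <cstdint>

    // Hypothetical helper: map one contiguous YUV420 planar buffer onto its
    // three planes. Assumes an even width and height (e.g. 768x576).
    struct Yuv420Frame
    {
        const uint8_t* y;   // w * h bytes, full resolution
        const uint8_t* u;   // (w/2) * (h/2) bytes
        const uint8_t* v;   // (w/2) * (h/2) bytes
    };

    static Yuv420Frame MapYuv420(const uint8_t* buffer, int w, int h)
    {
        Yuv420Frame f;
        f.y = buffer;
        f.u = f.y + w * h;
        f.v = f.u + (w / 2) * (h / 2);
        return f;
    }

For 768x576 that comes to 768*576 + 2*(384*288) = 663,552 bytes in total - exactly the size of a 768x864 8-bit greyscale image, which matters later on.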
Now, under MS Windows you have to display images as bitmaps packed in RGB888 or RGBA8888 colour format. Even with OpenGL you basically need RGB or 8-bit greyscale images. I believe MacOS might be nicer (anybody?) and certainly supports YUV422, but under Windows you have a problem with YUV. DirectX might help out too (anybody?) - but I live in the OpenGL world here.
So how do we display YUV420 video in real-time?
The first thing I tried was a CPU conversion to RGB888, then transferring the RGB data to OpenGL for display. Easy enough to code in C++, and it took about an hour to optimise. But it still took about 8 ms per frame to convert (768x576 frames) and really loaded the CPU, which felt like a waste of clock cycles just to display an image.
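For reference, the CPU path was nothing exotic - just a per-pixel conversion loop along these lines. This is a rough sketch using the usual fixed-point BT.601 full-range coefficients, not the optimised code we actually ran:

    #include <algorithm>
    #include <cstdint>

    // Sketch of a plain YUV420 planar -> RGB888 conversion on the CPU.
    // Coefficients are BT.601 full-range scaled by 256 (an assumption here).
    static void Yuv420ToRgb888(const uint8_t* src, uint8_t* dst, int w, int h)
    {
        const uint8_t* yPlane = src;
        const uint8_t* uPlane = src + w * h;
        const uint8_t* vPlane = uPlane + (w / 2) * (h / 2);

        for (int row = 0; row < h; ++row)
        {
            for (int col = 0; col < w; ++col)
            {
                int y = yPlane[row * w + col];
                int u = uPlane[(row / 2) * (w / 2) + col / 2] - 128;
                int v = vPlane[(row / 2) * (w / 2) + col / 2] - 128;

                int r = y + ((359 * v) >> 8);            // + 1.402 * V
                int g = y - ((88 * u + 183 * v) >> 8);   // - 0.344 * U - 0.714 * V
                int b = y + ((454 * u) >> 8);            // + 1.772 * U

                uint8_t* px = dst + (row * w + col) * 3;
                px[0] = static_cast<uint8_t>(std::min(std::max(r, 0), 255));
                px[1] = static_cast<uint8_t>(std::min(std::max(g, 0), 255));
                px[2] = static_cast<uint8_t>(std::min(std::max(b, 0), 255));
            }
        }
    }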
The solution we ended up with was to transfer the raw YUV420 data as GL_LUMINANCE image data, essentially uploading the whole frame (as above) as if it were a 768x864 greyscale image. We then wrote a Cg fragment shader to do the YUV to RGB conversion and display on the graphics unit. This worked a treat, and even the Intel embedded graphics on the motherboard could handle the shader. It reduced the time to 1.4 ms per frame, without any CPU loading.
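The details of our DLL stay with us, but the idea is easy enough to sketch. The upload is just an ordinary greyscale texture (GL_NEAREST filtering so the planes never get blended into each other), something like:

    // Hypothetical upload: treat the whole planar buffer as one 768x864
    // 8-bit greyscale texture.
    glBindTexture(GL_TEXTURE_2D, yuvTexture);
    glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE, 768, 864, 0,
                 GL_LUMINANCE, GL_UNSIGNED_BYTE, yuv420Buffer);

The fragment shader then picks the Y, U and V samples back out of that one texture and applies the conversion matrix. The sketch below is my own reconstruction in Cg, not the shader shipped in the DLL; it assumes the chroma planes repack two half-width rows into each 768-wide texture row, uses BT.601 full-range coefficients, and ignores texture-origin details:

    // Hypothetical Cg fragment program: unpack a YUV420 planar frame that was
    // uploaded as a single 768x864 GL_LUMINANCE texture (Y plane in the first
    // 576 rows, then the U plane, then the V plane).
    float4 main(float2 uv : TEXCOORD0,
                uniform sampler2D yuvTex) : COLOR
    {
        const float W  = 768.0;   // frame width
        const float H  = 576.0;   // frame height
        const float TH = 864.0;   // texture height = H * 3/2

        // Integer pixel position in the visible image (uv runs 0..1 over 768x576).
        float x = floor(uv.x * W);
        float y = floor(uv.y * H);

        // Luma: straight lookup into the top H rows of the texture.
        float Y = tex2D(yuvTex, float2((x + 0.5) / W, (y + 0.5) / TH)).r;

        // Chroma: each plane is (W/2 x H/2), so two chroma rows share one
        // 768-wide texture row; work out the repacked row and column.
        float cx   = floor(x * 0.5);
        float cy   = floor(y * 0.5);
        float col  = fmod(cy, 2.0) * (W * 0.5) + cx;
        float rowU = H + floor(cy * 0.5);              // U plane starts at row 576
        float rowV = H + H * 0.25 + floor(cy * 0.5);   // V plane starts at row 720

        float U = tex2D(yuvTex, float2((col + 0.5) / W, (rowU + 0.5) / TH)).r - 0.5;
        float V = tex2D(yuvTex, float2((col + 0.5) / W, (rowV + 0.5) / TH)).r - 0.5;

        // BT.601 full-range YUV -> RGB.
        float3 rgb;
        rgb.r = Y + 1.402 * V;
        rgb.g = Y - 0.344 * U - 0.714 * V;
        rgb.b = Y + 1.772 * U;
        return float4(saturate(rgb), 1.0);
    }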
To finish up, we wrapped the entire functionality in a stand-alone DLL with just a few simple function calls. Now anybody here can display YUV420 images in a window without any CPU overhead and without having to worry about how it happens. NVidia Cg requires two additional DLLs to be supplied with the package, but that's it.
You can get the DLL from us at http://www.vision4ce.com