Marek is a Linux kernel and U-Boot contributor.
In his free time he is also an FPGA enthusiast, which is what led him to combine an FPGA with his signal processing knowledge to automate display output testing.


One of Marek’s tasks at work included verifying that what appears on the display of an embedded device corresponds to what is expected.
At first, it consisted of plugging actual displays into different devices and verifying the output visually.
Over time this proved cumbersome, and it also wore out the display connectors of the boards.
There had to be a better way to do it.

From there came the idea: the display output is, in the end, nothing more than data. What if we could verify that the data coming out of the board matches what is expected, without having to plug in a display every time?
If we get that, we could even automate the validation and use it in CI!

The first option that was considered was using a webcam to capture the output.
However, not only is a webcam inconvenient in terms of physical space, it has another problem: the image it captures is only an imperfect representation of the actual display output. In particular, the webcam image can be distorted (while the real signal is not).

Quickly, it was clear that capturing the actual image data from the board was needed.
For that, three options were considered initially: grabbing fbdev content and streaming it (e.g. over ethernet), using Weston’s RDP backend, or leveraging the CRC of frames from the DRM subsystem to do the verification.
However, all these options also had their drawbacks, namely that the data captured does not necessarily correspond to what the screen displays.
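To give an idea of the CRC option: the DRM subsystem can expose a per-frame CRC through debugfs, and verification then means checking that the CRC of the expected frame appears in the captured stream. A minimal sketch of the idea, using zlib.crc32 as a stand-in for the hardware CRC algorithm (which is device-specific, so this is illustrative only):

```python
# Sketch of CRC-based frame verification, in the spirit of the DRM option
# mentioned in the talk. Real DRM CRCs are produced by the display hardware
# and read via debugfs; zlib.crc32 here is a stand-in so the idea can be
# shown self-contained.
import zlib

def frame_crc(frame_bytes: bytes) -> int:
    """CRC of one raw frame (stand-in for the hardware-computed CRC)."""
    return zlib.crc32(frame_bytes) & 0xFFFFFFFF

def verify(expected_frame: bytes, reported_crcs: list[int]) -> bool:
    """True if the expected frame's CRC shows up in the captured CRC stream."""
    return frame_crc(expected_frame) in reported_crcs

reference = bytes(3 * 1920 * 1080)            # all-black RGB888 reference frame
stream = [0xDEADBEEF, frame_crc(reference)]   # pretend CRCs read once per vblank
print(verify(reference, stream))              # True: the frame was displayed
```

The drawback noted above applies: the CRC is computed over the frame the hardware scans out, not over what physically reaches the panel.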

The next parts of the presentation went into the details of specific display buses (DPI, FPD-Link, and finally MIPI DSI).
DPI (Display Parallel Interface), the first one covered, is the oldest and simplest of the three.

When looking at how it works, it becomes clear that the bus carries more data than what the screen displays!
For example, it contains synchronization signals and blanking intervals that are never visible on the screen.

So a question that arose is: what should be captured? Only the visible areas, or everything including margins?
Marek went for the latter option, as it allows for more complete CI testing.
A challenge with capturing all this data is that it can be pretty big.
For example, at 1920×1080 with 24 bits per pixel and a 60 Hz refresh rate, the data rate is about 500 MiB/s.
That is more than the bandwidth of Gigabit Ethernet!
Instead of Ethernet, another commonly available interface that supports such a high bandwidth was used: USB 3.0.
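The arithmetic behind those figures can be checked quickly. A small sketch, assuming the standard CEA-861 timings for 1080p60 (a 2200×1125 total frame, blanking included); the talk itself only quotes the rounded figure:

```python
# Back-of-the-envelope check of the DPI capture bandwidth. The 2200x1125
# total frame size is the standard CEA-861 timing for 1080p60; the talk
# only quotes the rounded ~500 MiB/s number.

def dpi_data_rate(h_total, v_total, bytes_per_pixel, refresh_hz):
    """Raw bus data rate in bytes per second."""
    return h_total * v_total * bytes_per_pixel * refresh_hz

visible = dpi_data_rate(1920, 1080, 3, 60)   # active pixels only
total   = dpi_data_rate(2200, 1125, 3, 60)   # with blanking (CEA-861 1080p60)
gigabit = 1_000_000_000 // 8                 # Gigabit Ethernet ceiling, bytes/s

print(f"visible only : {visible / 2**20:6.0f} MiB/s")
print(f"with blanking: {total / 2**20:6.0f} MiB/s")
print(f"GbE limit    : {gigabit / 2**20:6.0f} MiB/s")
```

Either way, the stream is several times what Gigabit Ethernet can carry, which is what pushed the design toward USB 3.0.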

The first approach Marek took was to feed the DPI data to an FPGA, which would in turn pass it to a bridge chip, whose output would finally be consumed by the UVC driver of a host PC doing the verification.
That turned out to be unsuccessful, as it required many workarounds and was overall quite complicated.

After some more investigation (and struggling with vendor tooling for a USB controller, before finding an easier solution), Marek realized it might actually be possible to give the DPI data to a bridge chip, without having to use an intermediate FPGA.
For that, he used fx3lafw, an open source firmware that turns a Cypress FX3 USB controller into a logic analyzer.

For it to work, fx3lafw needed to be changed to use an external clock (the DPI pixel clock).
fx3lafw being open source, Marek was able to patch it to behave the way he needed.

Fast-forward past a few details, and the setup could pipe the DPI data into GStreamer for manual verification, or even be used for automated verification!
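The exact pipeline was not given in the talk; as a hypothetical sketch, raw RGB888 frames arriving on stdin could be framed and displayed with GStreamer's rawvideoparse element along these lines:

```python
# Hypothetical GStreamer pipeline for eyeballing the captured DPI stream.
# The talk only says the data could be piped into gstreamer; the element
# chain below is an assumption: raw RGB888 frames arriving on stdin.

def viewer_pipeline(width: int, height: int, fps: int) -> str:
    """Build a gst-launch-1.0 command line for viewing a raw RGB capture."""
    elements = [
        "fdsrc fd=0",                                  # raw capture on stdin
        f"rawvideoparse width={width} height={height} "
        f"format=rgb framerate={fps}/1",               # frame the byte stream
        "videoconvert",
        "autovideosink",                               # window for a manual check
    ]
    return "gst-launch-1.0 " + " ! ".join(elements)

print(viewer_pipeline(1920, 1080, 60))
```

For automated verification, the sink end of such a pipeline would be replaced by something that compares frames against references instead of displaying them.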

Now that the DPI case had been tackled, it was time to look at the LVDS (FPD-Link) one.
LVDS is more or less a serialization of the DPI bus, and deserializer chips do exist.
The solution then becomes simple: put a deserializer chip in front, and the problem reduces to the already-solved DPI one.
Finally, the MIPI DSI case will be a more complex one, where using an FPGA will be mandatory (but Marek already has ideas on how to tackle it).

As a conclusion, the presentation Marek gave was a very good example of what can be achieved when leveraging open source software, firmware, and even hardware. What was once manual and tedious verification work can now be done remotely using affordable hardware, and can even be automated.