Created attachment 115725 [details]
Platform: Haswell, i7-4600U
Distro: Debian stretch/testing
Steps to reproduce:
1. g++ vaapi_jpeg_decoder.cpp -ljpeg -lva -lva-drm
2. ./a.out sample.jpg
Observed behavior:
- a.out's own memory usage stays almost flat.
- `free -m` shows system shared memory usage growing by several hundred MB per second until memory is exhausted.
- The system starts swapping and the OOM killer is eventually triggered.
- Valgrind detects no leak and no accumulating heap usage.
I tried to use libva to decode a stream of JPEG files. The decoder I wrote reuses the surface and other structures across JPEG files.
A cursory look at libdrm shows some complex memory-management infrastructure that is beyond my reach.
Created attachment 115726 [details]
I find that the leak disappears once I explicitly call vaDestroyBuffer() on every buffer I created.
However, the comment on vaRenderPicture() says "Buffers are automatically destroyed afterwards". The reproducer here followed this documentation and did not explicitly destroy its buffers.
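For reference, a minimal sketch of the workaround under discussion, assuming a display, context, surface, and parameter buffers have already been set up elsewhere (the function name and parameters are illustrative, not from the attached reproducer):

```c
#include <va/va.h>

/* Workaround sketch: render a picture, then explicitly destroy every
 * parameter buffer.  Without the vaDestroyBuffer() loop at the end,
 * shared memory grows on the affected driver, even though the
 * vaRenderPicture() documentation states the buffers are destroyed
 * automatically.  Error handling is abbreviated for clarity. */
static VAStatus render_and_free(VADisplay dpy, VAContextID ctx,
                                VASurfaceID surface,
                                VABufferID *buffers, int num_buffers)
{
    VAStatus status = vaBeginPicture(dpy, ctx, surface);
    if (status != VA_STATUS_SUCCESS)
        return status;

    status = vaRenderPicture(dpy, ctx, buffers, num_buffers);
    if (status != VA_STATUS_SUCCESS)
        return status;

    status = vaEndPicture(dpy, ctx);

    /* Explicit cleanup: this is the step that makes the leak disappear. */
    for (int i = 0; i < num_buffers; i++)
        vaDestroyBuffer(dpy, buffers[i]);

    return status;
}
```

This cannot run without a VA-API capable driver, so it is a sketch of the call sequence only, not a standalone test case.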
*** This bug has been marked as a duplicate of bug 75287 ***