
I am working on an OpenGL application that opens and reads frames from multiple video files, so the output plays three different videos at the same time. Right now the frames render out of sync. What are my options for getting synchronized frame rendering? After reading the frames, should I define a buffer for each video's frames, or would an FFmpeg filter function be an alternative? I am not sure whether it is practical to keep a buffer for every video frame sitting in the queue. I know a framebuffer is normally used for visual effects, but can it also be used for textures (images or video frames), so the texture updates very quickly instead of calling into another function to fetch pixel values?

I am stuck at this level.

What I have tried:

I tried a couple of things, such as the FFmpeg command-line approach.
For example:
<pre>ffmpeg -i "Background - Australia -" -i "Background - Wales -" -i "Rising" -i "Shield -" -i "Shield -" -filter_complex '[0:v][1:v]overlay=x=0:y=0[a],[a][2:v]overlay=x=0:y=0[b],[3:v]scale=-1:800[c],[4:v]scale=-1:800[d],[b][c]overlay=x=0:y=0[e],[e][d]overlay=x=W-w:y=0[f]' -map '[f]'

A framebuffer could be a possibility (I am not sure).
I am also trying to use the FFmpeg filter command from code, but I lack advice on how to do it.
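Whichever route is taken, the decoder and the renderer usually run at different speeds, so a small fixed-size ring buffer per video (decoder pushes, renderer pops) avoids buffering "every frame in the queue". The `FrameRing` class below is a hypothetical sketch of that idea; in the real application each slot would hold one frame's RGBA pixels, and the pop side would copy them into a texture or a mapped pixel buffer object.

```cpp
#include <array>
#include <cassert>
#include <cstddef>
#include <cstdint>
#include <utility>
#include <vector>

// Hypothetical fixed-size ring buffer of decoded frames, one per video.
// N bounds memory use: the decoder stalls when the ring is full instead
// of buffering the whole file.
template <std::size_t N>
class FrameRing {
    std::array<std::vector<uint8_t>, N> slots_;
    std::size_t head_ = 0;   // next slot to write
    std::size_t tail_ = 0;   // next slot to read
    std::size_t count_ = 0;
public:
    bool push(std::vector<uint8_t> pixels) {
        if (count_ == N) return false;       // full: decoder must wait
        slots_[head_] = std::move(pixels);
        head_ = (head_ + 1) % N;
        ++count_;
        return true;
    }
    bool pop(std::vector<uint8_t>& pixels) {
        if (count_ == 0) return false;       // empty: re-show last frame
        pixels = std::move(slots_[tail_]);
        tail_ = (tail_ + 1) % N;
        --count_;
        return true;
    }
    std::size_t size() const { return count_; }
};
```

A ring of three or four slots per video is usually enough: deep enough to absorb decode jitter, shallow enough that frames stay close to the playback clock.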

This content, along with any associated source code and files, is licensed under The Code Project Open License (CPOL)
