I need some advice regarding bitrates for video encoding.
I'll be recording some 1280x720@50fps videos from my CPC soon and encoding them into Blu-ray m2ts video files.
For these m2ts files I'm trying to find the optimal bitrate, i.e. one that is as low as possible without damaging the quality of the video.
The so-called Kush Gauge formula looks like this:
Bitrate (kbps) = Width * Height * Framerate * Motion factor * 0.07 / 1000
The motion factor can be 1, 2, or 4, depending on how much is happening in the video.
I don't know what the 0.07 in the Kush Gauge formula represents. Some kind of compression factor, maybe?
Anyway, for my video it'll then be:
1280 * 720 * 50 * 1 * 0.07 / 1000 = 3225.6 kbps
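Just to make it easy to play with the numbers, here's that calculation as a small Python sketch (the function name and the default 0.07 constant are my own; I'm only reproducing the formula above):

```python
# Kush Gauge estimate: width * height * fps * motion factor * 0.07 / 1000
def kush_gauge(width, height, fps, motion_factor, constant=0.07):
    """Return an estimated bitrate in kbps."""
    return width * height * fps * motion_factor * constant / 1000

print(kush_gauge(1280, 720, 50, 1))  # -> 3225.6 kbps
```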
I think it's safe to say that most games on the CPC don't have a whole lot of motion, though. Most of the screen doesn't change, so maybe a motion factor of 1 is too high. Maybe it should be 0.5 or even 0.25.
And sure enough: even a bitrate as low as 500 kbps seems to produce a decent result. It would be nice to find a mathematical formula that explains this, though.
For example, an explanation could be: "Movement only happens on about 25% of the screen, so the bitrate = 3225.6 * 0.25 = 806.4 kbps."
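In code, that adjustment would just be (the 25% coverage figure is purely my guess):

```python
kush_estimate = 3225.6   # kbps, from the Kush Gauge calculation above
motion_coverage = 0.25   # my rough guess for a typical CPC game
print(kush_estimate * motion_coverage)  # -> 806.4 kbps
```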
But where's the "bits per pixel" value in this formula? I'm assuming 24 bits are used per pixel to indicate a colour.
So one uncompressed full frame is 1280 * 720 pixels * 24 bpp / 1000 = 22118.4 kbits.
Applying the 0.07 factor (assuming that's the compression factor), I get that one compressed full frame requires roughly 1548.29 kbit of data.
With my limited knowledge, that logically means the bitrate shouldn't be lower than 1548.29 kbps, because the stream has to be able to deliver that much data for a single frame, right? (I'm thinking of keyframes here.)
In that case, 806.4 kbps isn't enough. I should probably stick with 1600 kbps or above.
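Here's that whole train of thought as a sketch, so people can check my reasoning (the assumption that a keyframe compresses by the same 0.07 factor is mine, and probably the weakest link):

```python
# One uncompressed 1280x720 frame at 24 bits per pixel, in kbit.
width, height, bpp = 1280, 720, 24
raw_frame_kbit = width * height * bpp / 1000        # 22118.4 kbit

# Guess: a keyframe compresses by the same 0.07 factor as in the Kush Gauge.
keyframe_kbit = raw_frame_kbit * 0.07               # ~1548.29 kbit

# My coverage-scaled bitrate estimate from above.
scaled_bitrate_kbps = 3225.6 * 0.25                 # 806.4 kbps

# If one second of bitrate must hold at least one full keyframe,
# the scaled estimate falls short of the keyframe size.
print(keyframe_kbit, scaled_bitrate_kbps, scaled_bitrate_kbps >= keyframe_kbit)
# -> 1548.288 806.4 False
```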
But all of this is pure speculation and guesswork.
Does anyone know more? Am I way off? Do you know of other calculation methods for finding a good bitrate?
Thanks