Finding The Right Bitrate for Packet Transmission With H264 Encoder

Hi everyone,

I have the following issue. I am trying to write a program that simulates sending a video file, encoded with an H.264 encoder, over an IP-based packet-switched network. The available bitrate is 100 kbps, and 10 in every 100 packets are corrupted due to errors or loss. Assume all packets have the same length and that corrupted packets have to be retransmitted. To encode the file, I would vary the QP and the FrameSkip (i.e. the frame rate).

I thought the encoder bitrate had to be at most 90% of the available bandwidth, since (1 - 10/100) = 0.9 and 0.9 * 100 kbps = 90 kbps, but supposedly this does not take the retransmission of lost packets into account, so my value of 90 kbps is not right. Could someone explain to me what the correct bitrate should be? Thanks in advance.
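For reference, this is roughly how I am computing the target at the moment (just a sketch; the variable names are placeholders I made up):

```cpp
#include <iostream>

int main()
{
    const double channel_bitrate_kbps = 100.0;  // available bandwidth
    const double packet_loss_rate     = 0.10;   // 10 of every 100 packets corrupted

    // my current reasoning: just subtract the lost fraction
    const double target_bitrate_kbps = channel_bitrate_kbps * (1.0 - packet_loss_rate);

    std::cout << "target encoder bitrate: " << target_bitrate_kbps << " kbps\n";
}
```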
Is this a true simulation or a hard-coded exercise?
The difference is that with random 10% packet loss you can lose 50 packets in a row and then transmit 100% successfully for a while, whereas a hard-coded engine can fail exactly 10 out of every group of 100, every time.
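Something like this is what I mean by the difference (just a rough sketch; the function names are made up):

```cpp
#include <iostream>
#include <random>

// True simulation: each packet is independently corrupted with probability p,
// so you can get long runs of failures followed by long runs of successes.
bool packet_lost_random(std::mt19937& rng, double p)
{
    std::bernoulli_distribution lost(p);
    return lost(rng);
}

// Hard-coded exercise: exactly 10 out of every 100 packets fail, like clockwork.
bool packet_lost_fixed(unsigned packet_index)
{
    return (packet_index % 100) < 10;   // first 10 of each block of 100 fail
}

int main()
{
    std::mt19937 rng(12345);
    int random_losses = 0, fixed_losses = 0;
    for (unsigned i = 0; i < 1000; ++i)
    {
        if (packet_lost_random(rng, 0.10)) ++random_losses;
        if (packet_lost_fixed(i))          ++fixed_losses;
    }
    std::cout << "random model lost " << random_losses << " of 1000 packets\n";
    std::cout << "fixed model lost  " << fixed_losses  << " of 1000 packets\n";
}
```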

I am not sure what is wanted here. Your bitrate should use 100% of the bandwidth, so 100 kbps unless directed otherwise (hogging the line may be bad behavior in real life). If you lose a packet, you can either drop a frame or encode a later frame at higher compression to make up for it, but that is kind of nonsense, because video frames compress *VERY* differently from each other and it is not how anyone would actually solve the problem. Video streamers send faster than the frame rate, so when they lose a packet or frame they have a buffer ahead of the problem and can keep showing video to the consumer while refilling the buffer. It is the old differential equation: you have a keg with a hole in the bottom and a hose in the top; how fast do you need to fill the keg to keep the flow from the hole steady? You need to be proactive, not reactive, so the user never knows there was an issue.
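If you want to play with the keg analogy, a toy buffer model might look something like this (all numbers arbitrary):

```cpp
#include <iostream>

int main()
{
    // Toy keg/buffer model: fill faster than the drain so a short interruption
    // in the fill (lost packets) does not let the buffer run dry.
    double buffer_seconds   = 0.0;   // video currently buffered, in seconds
    const double fill_rate  = 1.5;   // we receive 1.5 s of video per second of wall time
    const double drain_rate = 1.0;   // playback consumes 1 s of video per second

    for (int t = 0; t < 30; ++t)
    {
        bool outage = (t >= 10 && t < 13);        // pretend packets are lost for 3 s
        buffer_seconds += (outage ? 0.0 : fill_rate) - drain_rate;
        if (buffer_seconds < 0.0) buffer_seconds = 0.0;  // buffer ran dry -> visible glitch
        std::cout << "t=" << t << "  buffered=" << buffer_seconds << " s\n";
    }
}
```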

That said, I guess you are sort of buffering by sending lower quality and dropping frames, so that when you have a problem you are still on target to display without any glitches. But you still want to send at maximum bandwidth, not a reduced rate, right?! On the encoder side, you want to adjust packet-wise: produce 9 packets over the same amount of time in which you would otherwise have produced 10. The encoder then produces 9 packets while the network sends 10, because one is a repeat. That works if the loss is not adaptive, i.e. assuming a fixed failure rate rather than clusters of failures.
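Roughly what I mean, as a sketch (assuming a fixed 10% failure rate, not bursts):

```cpp
#include <iostream>

int main()
{
    // Out of every 10 packet slots on the wire, 1 slot is eaten by a
    // retransmission, so the encoder should only generate 9 new packets
    // in that same amount of time.
    const int    slots_per_window = 10;
    const double loss_rate        = 0.10;
    const int    retransmit_slots = static_cast<int>(slots_per_window * loss_rate + 0.5); // 1
    const int    new_packets      = slots_per_window - retransmit_slots;                  // 9

    std::cout << "encoder produces " << new_packets
              << " packets while the network transmits " << slots_per_window << '\n';
}
```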
"your bitrate should use 100% of the bandwidth"
But if 10% of packets are lost, then the usable bandwidth is not 100 kbps.

By my reckoning, 90 kbps is the right value. If you transmit 90,000 bits of payload, you'll have to retransmit about 9,000 bits, then 900, then 90, then 9. So your usable data was only 90 kilobits, but what actually went over the wire was 99,999 bits, essentially the full 100-kilobit capacity of the channel.
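If you want to check that in code, something like this reproduces the series (just a sketch):

```cpp
#include <iostream>

int main()
{
    const double loss_rate    = 0.10;
    const double useful_bits  = 90000.0;   // payload we actually want delivered each second
    double bits_to_send       = useful_bits;
    double total_bits_on_wire = 0.0;

    // Each round, 10% of what was just sent is corrupted and must be resent:
    // 90000 + 9000 + 900 + 90 + 9 + ...  ->  90000 / (1 - 0.1) = 100000
    while (bits_to_send >= 1.0)
    {
        total_bits_on_wire += bits_to_send;
        bits_to_send *= loss_rate;
    }

    std::cout << "useful bits per second: " << useful_bits        << '\n';
    std::cout << "bits actually sent:     " << total_bits_on_wire << '\n';
}
```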