Chaithanya
Joined: 19 Jan 2007 Posts: 3
Posted: Tue 23 Jan 2007, 8:11 Post subject: Testing the quality of the encoded video streams
Hi All,
I'm a new member of this group, and I'm also a beginner in the video domain.
I just need to know which video parameters we need to consider when judging the quality of encoded video streamed over TCP.
I have a video streamer which takes the analog video from a camera through a BNC cable. The digital video from the streamer is streamed over TCP/IP and can be viewed by opening the web client.
Now, if I have to test the quality of the video displayed in the client, which parameters do I have to consider, and how do I measure them?
Do I have to do these things manually, or is there any tool available to do them?
Is it possible to measure the video parameters by connecting an oscilloscope at the client end?
Can anyone please help me with this? Any help is very much appreciated.
RMN Site Admin
Joined: 04 Feb 2003 Posts: 587 Location: Lisboa, Portugal
Posted: Tue 23 Jan 2007, 18:00 Post subject:
The type of connection used to transfer the video makes no difference. What matters is the codec and the encoding settings (that includes the bitrate). For the same settings, it doesn't matter whether the file is going over a TCP connection or over a SATA, SCSI or SDI connection; the data remains the same.
Of course, some encoders that compress video "on the fly" will monitor the speed of the connection and adjust the compression in (near) real time. But in that situation the quality will depend on the line conditions anyway, so you can't judge what it will be like under different conditions (e.g., if you do your testing while the line is almost free, it might look great, but when the line gets busy it'll look much worse, or vice versa).
Either way, quality is a very subjective concept; it's not something you can define with a single number. For example, is it better to have 15 very sharp frames per second or 30 slightly blurry ones? Is it better to have great contrast but lots of noise, or no noise at all but poor contrast? Different codecs make different trade-offs, and some people might prefer one over another.
You can try compositing the original and compressed streams (using an editing or animation program) in "subtract" mode; that will give you an idea of how different they are (pixels that are identical will appear black, pixels that changed will appear in colour). Of course, it's still a matter of opinion which "differences" are more important (e.g., is it more important to have faithful gradients or faithful outlines, etc.).
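If you'd rather script that comparison than use an editing program, here is a rough sketch of the same "subtract" idea in Python with OpenCV. It assumes you have already captured both streams to files with the same resolution and frame alignment; the file names are only placeholders.

    # Rough sketch: per-frame absolute difference between a reference clip
    # and its compressed version. Assumes matching resolution and frame count.
    import cv2
    import numpy as np

    orig = cv2.VideoCapture("original.avi")    # reference clip (placeholder name)
    comp = cv2.VideoCapture("compressed.avi")  # encoded clip (placeholder name)

    frame_no = 0
    while True:
        ok_o, f_o = orig.read()
        ok_c, f_c = comp.read()
        if not (ok_o and ok_c):
            break  # stop when either clip runs out of frames

        # "Subtract" composite: identical pixels come out black,
        # changed pixels show up as coloured residue.
        diff = cv2.absdiff(f_o, f_c)

        # A crude single number per frame: mean absolute error over all pixels.
        print(f"frame {frame_no}: mean abs difference = {np.mean(diff):.2f}")

        # Optionally save the difference image for visual inspection.
        cv2.imwrite(f"diff_{frame_no:04d}.png", diff)
        frame_no += 1

    orig.release()
    comp.release()

Keep in mind this only tells you how much the pixels differ, not which differences a viewer will actually notice.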
All things considered, the best "system" out there to judge image quality is a human brain (preferably one with some experience in video compression, but judging quality is simpler than achieving it). Just look at the original image, then at the compressed image (if possible, flip between the two), and see what each codec or compression setting does.
Specifically, look at things such as contrast (some codecs turn white into light gray and black into dark gray), edge noise (blocks or "mosquitos" around the edges of moving objects), grain, banding (non-continuous gradients), overall sharpness, resolution, and frame rate.
RMN
~~~