9fingers
Established Member
Up to a point XY you are correct. Digital coding schemes are chosen to tolerate a degree of interference, especially the interference they cause themselves from digital signals in the adjacent channel. But beyond that point they fail in a digital, on/off manner as well.
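If it helps to see why, here's a toy Python sketch (a made-up 3-bit repetition code, nothing like the real DVB coding, just to show the shape of the curve). Decoded errors stay near zero while the channel is only mildly noisy, then shoot up once the noise passes what the code can absorb - hence the cliff rather than a gentle fade:

```python
import random

# Toy "digital cliff" demo: a 3-bit repetition code corrects any single bit
# flip per symbol, so decoding is essentially clean at low error rates and
# then collapses quickly once multiple flips per symbol become likely.

def send_bit(bit, flip_prob):
    """Transmit one source bit as three copies over a noisy channel, majority-vote the result."""
    received = [bit ^ (random.random() < flip_prob) for _ in range(3)]
    return 1 if sum(received) >= 2 else 0

def decoded_error_rate(flip_prob, trials=100_000):
    errors = sum(send_bit(0, flip_prob) != 0 for _ in range(trials))
    return errors / trials

for p in (0.001, 0.01, 0.05, 0.1, 0.2, 0.3):
    print(f"channel bit errors {p:>5.1%} -> decoded errors {decoded_error_rate(p):.3%}")
```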
You may recall from analogue TV days that poor signals caused an ever more 'snowy' picture, and even when the picture was not really viewable you could still see outlines and things moving about. On digital TV the picture stays 99.9% perfect as signal levels drop (or interference levels rise) and then collapses dramatically into coarse pixellation blocks as the digital decoder gives up and freezes. It tries to compare the previous good picture with the incoming data stream and gives a software estimate of what the next frame should be, but after a while the new picture is so far removed from the last decent one that it gives up until it can make sense of the data stream once more.
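Very roughly, the freeze-and-conceal behaviour looks something like this sketch (made-up logic, not any real decoder): keep showing an estimate based on the last good frame while the data won't decode, and give up once too many frames in a row have been missed.

```python
# Hypothetical limit on how long the decoder will keep guessing before it freezes.
MAX_CONCEALED = 3

def play(frames):
    """frames: decoded frames, with None where the incoming data could not be decoded."""
    last_good = None
    missed = 0
    for frame in frames:
        if frame is not None:
            last_good, missed = frame, 0
            yield frame
        elif last_good is not None and missed < MAX_CONCEALED:
            missed += 1
            yield last_good        # repeat/estimate from the previous good picture
        else:
            yield "FROZEN"         # too far removed from the last decent frame - wait for good data

print(list(play(["f1", "f2", None, None, None, None, "f7"])))
# ['f1', 'f2', 'f2', 'f2', 'f2', 'FROZEN', 'f7']
```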
hth
Bob