For most of us today, television static is a thing of the past.
With digital tuners, HDMI connections and flat screen TVs, even if there is interference,
we don't get a chance to see it, as the television circuitry boldly saves us from this ordeal.
Early TVs adopting digital tuners would actually spoof this snowy noise themselves, so as not
to disturb people, but otherwise we get a blank screen, maybe a message hovering about,
politely informing us of the lack of a coherent signal.
Thankfully the message never, ever, directly hits the corner.
If it did, well, I hate to imagine the consequences....
*EXPLOSION*
But people like me, and probably you, still have to deal with garbled screens of static
on a regular basis, especially for older consoles which only provide an RF output, and even
then, it's not uncommon to get quite a noisy picture.
But given we're using colour TVs for the most part, producing a colour image, why on earth
is the TV static only black and white?
Well, let's start with fundamentals.
Why do we even get this static fuzz?
Well, analogue televisions try to amplify any signals they receive, so if there isn't a
strong enough signal from a TV mast, or a modulator being fed into the TV's receiver
at the frequency it's currently tuned to, it will instead amplify whatever electromagnetic
signals it picks up.
These signals can originate from a number of sources, including residual radiation left
over from the Big Bang - the cosmic microwave background - man-made signals buzzing
around the air, atmospheric sources, but mainly, it's Johnson noise (not that kind).... essentially
thermal RF noise generated by the components of the TV itself.
Because of the random nature of these signals, your television receiver interprets them
as a garbled fuzz of noise, presented as a swarm of seemingly frantic ants on screen.
If it wasn't random you'd perceive patterns in this disarray, and of course an ordered,
stronger signal would pretty much drown out this noise altogether and present us with
a coherent image.
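As a purely illustrative sketch (the function name and frame size here are my own invention), noise with no underlying structure looks like this when treated as per-point brightness:

```python
import random

def static_frame(width=8, height=4, seed=None):
    """With no coherent broadcast present, the receiver's gain stages
    amplify thermal noise, so each point on screen is effectively an
    independent random brightness value -- the familiar 'snow'."""
    rng = random.Random(seed)
    # 0-255 luminance samples; there is no chrominance to speak of.
    return [[rng.randrange(256) for _ in range(width)] for _ in range(height)]

for row in static_frame(seed=1):
    print(" ".join(f"{v:3d}" for v in row))
```

Every sample is independent of its neighbours, which is exactly why no pattern ever settles on screen.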
The reason the colour of this snow isn't as random as the patterns themselves is due to
the way televisions and transmissions have evolved.
Let's look at a segment from a PAL video signal.
From left we have the end of a video scan line.
This constitutes a single line drawn across your television by an electron gun (on cathode
ray tube televisions at least).
We then have the front porch, this is here to prevent interference between individual
lines.
Next is the horizontal sync pulse, which signifies the start of the next scan line.
This is followed by the back porch, which restores the black levels and also leads onto
the colour burst.
Now this is the important part, as it effectively tells our television how to create the colour
image.
The colourburst synchronises a subcarrier signal containing the colour data.
The encoded format of which relies on the YUV colour space, providing chrominance data
to go with the luminance value.
The chrominance data carries blue and red colour-difference values, and because the luminance
figure is a weighted sum of red, green and blue, the green value can be derived from what remains.
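That green derivation can be shown as a small worked example; this is a sketch assuming the standard PAL luminance weights, and the function name is my own:

```python
def yuv_to_rgb(y, u, v):
    """Recover R, G, B from a PAL-style Y (luminance) plus U, V
    (scaled colour-difference signals). Coefficients are the standard
    BT.470 weights; note that green is never transmitted -- it falls
    out of the luminance equation Y = 0.299R + 0.587G + 0.114B."""
    r = y + v / 0.877                          # V carries 0.877 * (R - Y)
    b = y + u / 0.492                          # U carries 0.492 * (B - Y)
    g = (y - 0.299 * r - 0.114 * b) / 0.587    # solve the Y equation for G
    return r, g, b

# Round trip for pure green (R=0, G=1, B=0):
y = 0.587                     # luminance of pure green
u = 0.492 * (0.0 - y)         # scaled B - Y
v = 0.877 * (0.0 - y)         # scaled R - Y
print(yuv_to_rgb(y, u, v))    # approximately (0.0, 1.0, 0.0)
```

Only Y, U and V ever travel over the air; the TV reconstructs the three gun voltages at the receiving end.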
Given that monochrome televisions rely on the embedded luminance data to identify how
bright each part of the image is, delivering the colour information outside of each visible
scanline ensured compatibility with monochrome TV receivers, which simply ignored it - an
essential caveat during its introduction.
It also required far less bandwidth than using 3 separate signals to transmit Red, Green
and Blue components.
The colour subcarrier itself reduces bandwidth further by carrying the colour information
at a far lower resolution than the luminance.
We don't perceive any difference, as our eyes resolve fine detail in brightness far better
than in colour.
But the upshot is, until your TV receives and recognises this information, it's essentially
running in Black & White mode.
Each time it draws a line, it looks for the subsequent colourburst pattern, but doesn't
find it, and moves on.
It hasn't been given the information to create a colour image.
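That per-line decision can be sketched roughly as follows; everything here (names, the threshold, the crude amplitude estimate) is hypothetical, but it captures the fallback: no recognisable burst, no colour.

```python
import math
import random

def decode_line(samples, burst_window, min_burst_amplitude=0.1):
    """Rough sketch of the per-line decision an analogue colour decoder
    makes: look for a colour burst in the back porch; if nothing with
    enough amplitude is there (or it's just noise), keep the chroma
    path disabled and draw the line in black and white."""
    burst = samples[burst_window]
    # Crude amplitude estimate: half the peak-to-peak swing of the region.
    amplitude = (max(burst) - min(burst)) / 2 if burst else 0.0
    return "colour" if amplitude >= min_burst_amplitude else "monochrome"

# An untuned set: nothing but low-level noise where the burst should be.
noise_line = [random.uniform(-0.02, 0.02) for _ in range(64)]
print(decode_line(noise_line, slice(8, 20)))    # -> monochrome

# A tuned set: a 0.3-amplitude subcarrier burst sits in the back porch.
burst_line = noise_line[:]
for i in range(8, 20):
    burst_line[i] = 0.3 * math.sin(2 * math.pi * i / 4)
print(decode_line(burst_line, slice(8, 20)))    # -> colour
```

A real decoder does this with a phase-locked oscillator and a "colour killer" circuit rather than amplitude maths, but the outcome is the same: unrecognised burst, monochrome line.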
So although you may expect the random signals & fluctuations received & displayed on an
untuned television to be interpreted as a multitude of different colours, they're simply
not delivered in a fashion which the TV can decode into colour information.
At this point, the decoder is really still expecting a 1960s B&W film to be streamed
into living rooms.
So, what about the sound?
Well, in an untuned set, the sound is as random as the on-screen image, with the set amplifying
all those no-good signals floating about.
With a tuned channel, the sound data is held on a frequency with a fixed off-set.
This is why even if you can sometimes tune into a picture spilling over from its broadcast
frequency, it may still have no sound.
The receiver won't find the sound data until the picture is tuned to the correct frequency
and the off-set is matched up.
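As a concrete example of those fixed offsets (the offset figures are the real system values; the function itself is just an illustration):

```python
# Intercarrier sound: the FM sound carrier sits at a fixed frequency
# offset above the vision carrier, so the receiver can only locate it
# once the picture is tuned to the right place.
SOUND_OFFSET_MHZ = {
    "PAL-I":  6.0,   # UK and Ireland
    "PAL-BG": 5.5,   # much of Europe
    "NTSC-M": 4.5,   # North America and Japan
}

def sound_carrier(vision_carrier_mhz, system="PAL-I"):
    """Where the receiver expects the sound, given the tuned vision carrier."""
    return vision_carrier_mhz + SOUND_OFFSET_MHZ[system]

# UK UHF channel 21: vision carrier at 471.25 MHz
print(sound_carrier(471.25))            # -> 477.25
print(sound_carrier(61.25, "NTSC-M"))   # -> 65.75 (NTSC-M channel 3)
```

Tune the vision carrier a little off and the sound demodulator is looking 5.5 or 6 MHz above the wrong spot, hence picture without sound.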
Now, I've been talking about the PAL system here and although there are small differences
from TV protocol to TV protocol, the reason is very much the same.
NTSC for example, works on an almost identical principle, but with a different colour subcarrier
frequency.
This is why when we play an NTSC game or video over here, we get a black and white image....
again, the TV has no idea what colours to paint onto the image.
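For reference, the two subcarrier frequencies in question (the figures are the real system values; the snippet itself is just illustrative):

```python
# A decoder with its crystal and chroma circuitry built around one
# subcarrier frequency cannot lock to the other, so the colour
# information is simply never decoded.
PAL_SUBCARRIER_HZ = 4_433_618.75     # PAL B/G/I: ~4.43 MHz
NTSC_SUBCARRIER_HZ = 315e6 / 88      # NTSC-M: ~3.58 MHz (exactly 315/88 MHz)

print(f"PAL:  {PAL_SUBCARRIER_HZ / 1e6:.6f} MHz")
print(f"NTSC: {NTSC_SUBCARRIER_HZ / 1e6:.6f} MHz")
```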
Over in France, their SECAM protocol encoded the colour signals using FM rather than relying
on a colourburst sync.
But regardless, wherever you look, the decoder needs to find suitable colour information
to decode before a colour image can be presented to our eyes.