The TV will try to amplify and display any signal it receives. Without a station, it ends up amplifying random radio noise and tiny fluctuations in the amplifier circuits themselves.
The momentary signal strength is interpreted as the brightness of a spot that is rapidly scanned across the display. In this case the signal is random, so every spot on the screen gets a random brightness, changing every frame.
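As a toy illustration (this simulates the visual effect, not the receiver circuitry), here's a short Python sketch in which each pixel's brightness is drawn independently at random; redrawing every frame gives the familiar boiling "snow":

```python
import numpy as np

def static_frame(height=480, width=640, rng=np.random.default_rng()):
    """One frame of 'snow': every pixel gets an independent random
    brightness, just as a random signal level does at each point
    of the scan. 0.0 = black, 1.0 = white."""
    return rng.random((height, width))

# Successive frames are uncorrelated, which is why the snow
# appears to boil rather than hold a fixed pattern.
frame_a = static_frame()
frame_b = static_frame()
```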
Modern digital TVs won’t do this: with compressed video, recognizable data is needed to even attempt displaying a picture.
As for the sources of the radio noise, most of it comes from electrons being jostled by heat, and some from space (including the cosmic microwave background others have mentioned).
That electron jostling (thermal noise) is the reason the receivers on radio telescopes are cooled to insanely low temperatures, often with liquid helium.
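For a sense of scale, the thermal noise across a resistor follows the Johnson-Nyquist formula V_rms = sqrt(4·k·T·R·B). A quick Python check (the 50 Ω source and 6 MHz bandwidth are just illustrative values for an analog TV channel) shows why cooling helps:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def johnson_noise_vrms(resistance_ohms, temperature_k, bandwidth_hz):
    """RMS thermal (Johnson-Nyquist) noise voltage across a resistor."""
    return math.sqrt(4 * k_B * temperature_k * resistance_ohms * bandwidth_hz)

# 50-ohm input at room temperature (290 K) over a 6 MHz TV channel:
room = johnson_noise_vrms(50, 290, 6e6)  # ~2.2 microvolts
# The same input cooled to 4 K with liquid helium:
cold = johnson_noise_vrms(50, 4, 6e6)    # ~0.26 microvolts
print(f"290 K: {room * 1e6:.2f} uV, 4 K: {cold * 1e6:.2f} uV")
```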
This is the closest to the correct explanation. The reason televisions based on AM radio reception showed static is a circuit called the AGC (Automatic Gain Control), which worked like a robotic volume control. Its job is to keep the recovered video signal within a certain amplification range. As long as there was a carrier (the TV station was “on the air”), you’d see whatever the station broadcast. But when the station turned off its transmitter, the signal strength would fall and the AGC would crank up the amplification until what you saw was white noise, mostly due to the random motion of electrons in the electronic components. Cooling can minimize that noise, but it can’t be totally eliminated. Audio amplifiers often come with a “hiss” specification that tells you how much of this kind of noise to expect at normal operating temperature.
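Here's a toy sketch of that feedback behavior (the update rule and constants are made up for illustration, not a real AGC design): fed a steady carrier, the gain settles; fed only faint noise, the gain climbs until amplified noise fills the output, which is the static you saw.

```python
import numpy as np

def agc(samples, target_level=1.0, rate=0.01):
    """Toy automatic gain control: nudge the gain so the output
    stays near a target level. With only weak noise at the input,
    the gain keeps climbing until amplified noise fills the output."""
    gain, out = 1.0, []
    for s in samples:
        y = gain * s
        out.append(y)
        # Raise gain when the output is too quiet, lower it when too loud.
        gain *= 1.0 + rate * (target_level - abs(y))
    return np.array(out)

rng = np.random.default_rng(0)
carrier = np.sin(np.linspace(0, 200, 5000))    # station "on the air"
noise = 0.001 * rng.standard_normal(5000)      # transmitter off: only noise
# Both outputs settle near the target level -- the second is pure static.
print(agc(carrier).std(), agc(noise).std())
```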
BTW, modern digital TVs -will- show a noise picture if they lack a video muting function when no carrier is detected. I have an LG bought in 2019 that does this, and it’s hella annoying when I accidentally hit the input selection button on the remote, switching from HDMI to TV reception.