How NASA Broadcast Neil Armstrong Live from the Moon
Two-way voice communications, uplinked data, and downlinked telemetry were handled by ultra high frequency (UHF) and very high frequency (VHF) systems, while tracking was achieved with a C-band beacon on the spacecraft interrogated by ground-based radar. That arrangement worked on simpler missions, but Apollo would be going far beyond Earth orbit, with three men working in two spacecraft operating simultaneously and sending down live television images. NASA needed a new way to uplink and downlink more data.
The solution was called Unified S-band, or USB. It combined tracking, ranging, command, voice, and television data on a single S-band carrier through a single antenna. Voice and biomedical data were transmitted on a 1.25 MHz FM subcarrier, telemetry on a 1.024 MHz bi-phase modulated subcarrier, and both spacecraft, the command module and the lunar module, carried a pseudo-random ranging code on a phase-modulated S-band downlink: 2287.5 MHz for the CSM and 2282.5 MHz for the LM. In short, every type of information traveling between the ground and a Moon-bound spacecraft had its place. Except for the television broadcast.
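To make that downlink plan concrete, here is a minimal sketch of the USB allocation laid out as a plain data structure. The frequencies are the ones given above; the Python layout and field names are purely illustrative and not a NASA specification.

```python
# Sketch of the Unified S-band (USB) downlink plan described in the article.
# Frequencies come from the text; the structure and names are illustrative only.
USB_DOWNLINK = {
    "CSM": {
        "carrier_mhz": 2287.5,           # phase-modulated S-band downlink
        "subcarriers_mhz": {
            "voice_biomed_fm": 1.25,     # voice + biomedical data, FM subcarrier
            "telemetry_biphase": 1.024,  # telemetry, bi-phase modulated subcarrier
        },
        "ranging": "pseudo-random code on the carrier",
    },
    "LM": {
        "carrier_mhz": 2282.5,
        "subcarriers_mhz": {
            "voice_biomed_fm": 1.25,
            "telemetry_biphase": 1.024,
        },
        "ranging": "pseudo-random code on the carrier",
    },
}
```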
The right camera transmitted colour broadcasts from the Apollo 11 command module while the left camera broadcast the first live video of Apollo 11 astronauts walking on the Moon.
To free up space for a television downlink from the lunar module, NASA removed the ranging code and changed the modulation from phase to frequency, freeing 700 kHz of bandwidth on the USB signal for television. The problem was that this still wasn't enough for the standard broadcast camera of the day, which transmitted 525 scan lines at 30 frames per second and needed roughly 5 MHz of bandwidth. Instead, NASA would need a slow-scan camera optimized for a smaller format: 320 scan lines at 10 frames per second, which could be squeezed into just 500 kHz.
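As a rough back-of-envelope check on those numbers, video bandwidth scales approximately with the square of the line count (vertical and horizontal detail together) times the frame rate. The scaling rule and the snippet below are my own illustration, not NASA's analysis.

```python
# Back-of-envelope estimate: video bandwidth is roughly proportional to
# (scan lines)^2 * frame rate, treating horizontal detail as scaling with
# line count. Illustrative approximation only.

def rough_bandwidth_hz(lines: int, fps: float,
                       ref_lines: int = 525, ref_fps: float = 30.0,
                       ref_bandwidth_hz: float = 5e6) -> float:
    """Scale a reference bandwidth (525 lines, 30 fps, ~5 MHz) to another format."""
    return ref_bandwidth_hz * (lines / ref_lines) ** 2 * (fps / ref_fps)

# Apollo slow-scan format: 320 lines at 10 frames per second.
print(f"{rough_bandwidth_hz(320, 10) / 1e3:.0f} kHz")  # ~620 kHz, same order as the 500 kHz channel
```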
The signal was sent from the LM's antenna to the tracking stations at Goldstone in California, Honeysuckle Creek near Canberra, and the Parkes radio astronomy observatory in New South Wales, Australia. There, NASA used a scan converter to adapt the image to the broadcast-standard format of 525 scan lines at the higher 30 frames per second rate. The tracking stations then relayed the converted signal over microwave links, Intelsat communications satellites, and AT&T landlines to Mission Control in Houston, from which it was broadcast to the world. The conversion process left the image significantly degraded, but it was still live footage of man's first steps on the Moon.
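The 1969 conversion was done with analogue hardware, essentially by re-photographing a slow-scan display with a conventional camera. The sketch below is a loose modern digital analogue of that step, using simple line interpolation and frame repetition; the array shapes and NumPy approach are my own illustration of the idea, not the equipment NASA used.

```python
# A loose digital analogue of scan conversion: upsample 320-line frames to
# 525 lines by linear interpolation, and repeat frames to go from 10 fps to
# 30 fps. Purely illustrative; the 1969 converter was analogue hardware.
import numpy as np

def convert_frame(frame: np.ndarray, out_lines: int = 525) -> np.ndarray:
    """Interpolate a (lines, width) luminance frame to out_lines scan lines."""
    in_lines, width = frame.shape
    src = np.linspace(0, in_lines - 1, out_lines)  # fractional source rows
    converted = np.empty((out_lines, width))
    for col in range(width):
        converted[:, col] = np.interp(src, np.arange(in_lines), frame[:, col])
    return converted

def convert_stream(frames, in_fps: int = 10, out_fps: int = 30):
    """Yield each converted frame repeated out_fps // in_fps times (10 fps -> 30 fps)."""
    repeat = out_fps // in_fps
    for frame in frames:
        upscaled = convert_frame(frame)
        for _ in range(repeat):
            yield upscaled

# Example: one synthetic 320-line slow-scan frame becomes three 525-line frames.
slow_scan = [np.random.rand(320, 400)]
print(sum(1 for _ in convert_stream(slow_scan)), "broadcast frames")  # -> 3
```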