
Error correction coding was used for downlinked Voyager data, but why not for the uplink direction (if I'm correct in thinking it wasn't)? According to Edelson et al. 1979, "Voyager Telecommunications: The Broadcast from Jupiter" (Science, 204(4396), 913-921), the uplink bit error rate tolerance (1×10^-5) was "the most stringent BER threshold" of the Voyager design, yet I can find no mention of error correction being used on the uplink. Why not? Was it that the Voyager computers couldn't handle the computational load of decoding, so engineers relied instead on high transmit power and a low bit rate (16 bps) to keep the BER acceptably low? Sincere thanks for any insights or (if possible) citeable sources.
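To make the trade-off concrete, here is an illustrative back-of-the-envelope sketch (my own, not from the cited paper): for uncoded coherent BPSK over an AWGN channel, the textbook bit error rate is BER = ½·erfc(√(Eb/N0)), so one can compute how much Eb/N0 an uncoded link needs to reach the 1×10^-5 threshold. The modulation choice and the AWGN assumption are simplifications for illustration only.

```python
# Illustrative sketch (assumptions: uncoded coherent BPSK, AWGN channel):
# how much Eb/N0 is needed to meet a 1e-5 BER threshold without coding.
import math

def bpsk_ber(ebn0_db: float) -> float:
    """Theoretical BER for uncoded coherent BPSK on an AWGN channel."""
    ebn0 = 10 ** (ebn0_db / 10.0)          # convert dB to linear
    return 0.5 * math.erfc(math.sqrt(ebn0))

# Sweep Eb/N0 in 0.1 dB steps until the BER target is met.
target = 1e-5
ebn0_db = 0.0
while bpsk_ber(ebn0_db) > target:
    ebn0_db += 0.1

print(f"Uncoded BPSK needs about {ebn0_db:.1f} dB Eb/N0 for BER <= {target}")
```

This lands near the textbook figure of roughly 9.6 dB. The point of the sketch is that an uplink with a very low data rate (16 bps) and a powerful ground transmitter can buy that margin in raw Eb/N0, which is one plausible reason coding gain was less critical on the uplink than on the power-starved downlink.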

uhoh
Larry Gilman
    I deleted it because I answered when I was travelling and didn't have the resources to put in links, evidence, etc. I still don't, so I left it. These links have more info, though: https://space.stackexchange.com/q/54054/40489 and https://space.stackexchange.com/q/54055/40489 . The focus is on Voyager's encoding, but they cover your question too. – blobbymcblobby Oct 29 '23 at 00:51
  • @blobbymcblobby Thanks very much -- I'll look into those links. – Larry Gilman Oct 30 '23 at 22:55

0 Answers