You are generally correct.
Antenna tuning is primarily about maximizing radiated power at the transmit frequency. This matters because power from the transmitter's final amplifier stage that is neither radiated nor dissipated as ohmic loss reflects back into the amplifier, where it can cause (potentially catastrophic) heating of the components.
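To make that concrete, the fraction of power reflected back at the amplifier follows directly from the impedance mismatch. Here's a small Python sketch (the function names are my own, and I'm assuming the usual 50-ohm system) of the standard reflection-coefficient and SWR formulas:

```python
def reflection_coefficient(z_load, z0=50.0):
    """Voltage reflection coefficient of a load z_load seen from a
    line/source of characteristic impedance z0 (ohms, may be complex)."""
    return (z_load - z0) / (z_load + z0)

def reflected_power_fraction(z_load, z0=50.0):
    """Fraction of forward power reflected back toward the amplifier."""
    return abs(reflection_coefficient(z_load, z0)) ** 2

def swr(z_load, z0=50.0):
    """Standing wave ratio for the same mismatch."""
    g = abs(reflection_coefficient(z_load, z0))
    return (1 + g) / (1 - g)

# A matched 50-ohm antenna reflects nothing back at the finals...
print(reflected_power_fraction(50.0))                      # 0.0
# ...while a 100-ohm load on 50-ohm line reflects about 11% (SWR 2:1).
print(round(reflected_power_fraction(100.0), 3), swr(100.0))
```

With a random hunk of wire for an antenna the mismatch can be far worse than 2:1, which is why keying up full power into it is risky while receiving through it is harmless.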
This only matters, however, when significant power is being transmitted -- and when receiving only, a well-designed transceiver emits essentially no power (a tiny fraction of a milliwatt at most).
Therefore, antenna tuning matters very little in receive mode; all you care about is that the antenna delivers an RF voltage to the receiver terminals for the receiver to amplify, tune, detect, and decode into Morse or voice. While impedance matching the antenna (which is what you're really doing when you "tune" for a given frequency) will improve the received signal strength (at best, maybe double it), with most modern receiver designs it's unnecessary. It might help with something like a crystal set, or for extreme DX work, but the presumption here is that you just want to verify that the receive section of your freshly built radio works as it should.
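To put the "at best, maybe double it" figure in perspective: doubling the received power is a 3 dB gain, and matching can only recover the mismatch loss, which is small even for poor SWRs. A quick sketch (my own helper name, standard formula) of mismatch loss versus SWR:

```python
import math

def mismatch_loss_db(swr):
    """Receive signal lost to impedance mismatch, in dB, for a given SWR.
    Delivered power fraction = 1 - |Gamma|^2, with |Gamma| = (swr-1)/(swr+1)."""
    gamma = (swr - 1) / (swr + 1)
    delivered = 1 - gamma ** 2
    return -10 * math.log10(delivered)

# Even a fairly bad 3:1 mismatch costs only about 1.25 dB on receive --
# well under the 3 dB that "doubling" the signal would represent.
for s in (1.0, 2.0, 3.0, 5.0):
    print(f"SWR {s}:1 -> {mismatch_loss_db(s):.2f} dB loss")
```

A dB or two is usually invisible next to band noise and fading, which is why an unmatched antenna receives just fine.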
From my personal experience, my Heathkit SB-102 receives fine with an FM antenna from an old stereo on bands from 80 m up to 10 m (provided a signal is present), just as my old Hallicrafters S-120 multi-band receiver does. It wouldn't be a good idea to key up 100 W into that antenna on 80 m, however; I'd likely damage one of my 6146 tubes or the components connected to them.