Looking at https://gssc.esa.int/navipedia/index.php/GPS_Navigation_Message and other websites: in order for a receiver (one that has no other means of getting precise GPS time) to determine precise GPS time after it has determined its position, and consequently its distances from the satellites, it may look at the HOW word, which contains information about GPS time. (We can also assume the receiver has decoded clock corrections, ephemerides, etc., but here I'd like to focus on something much more basic.)
The ("legacy") NAV message is transmitted at around 50 bps, and let's say that translates to the receiver obtaining this HOW word every 6 seconds – because each subframe is 300 bits long, that means $\frac{300~b}{50~bps} = 6~s.$
As I understand it, the HOW contains a counter of 1.5-second periods since the start of the week (truncated: since the broadcast value only advances once per 6-second subframe, the last two bits aren't needed). Let's assume the satellites start broadcasting as soon as the week starts, i.e. at 0 s of the week. (We can assume the receiver knows when each week starts.)
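To make sure I'm reading this right, here is a minimal sketch (in Python) of how I understand the truncated TOW count in the HOW maps to seconds of the week; the bit-extraction from the raw subframe is omitted, and the function name is mine, not from any receiver library:

```python
SECONDS_PER_WEEK = 604_800
SUBFRAME_SECONDS = 6  # 300 bits / 50 bps

def tow_from_how(truncated_tow: int) -> int:
    """Convert the 17-bit truncated TOW count from the HOW into
    seconds of the GPS week.

    The full TOW count ticks in 1.5 s epochs (19 bits, 0..403199);
    the HOW drops the two LSBs because the value it carries is always
    a multiple of 4 (subframes start on 6 s boundaries), so one
    increment of the truncated count is 4 * 1.5 s = 6 s. Per my
    reading of the interface spec, the value refers to the start of
    the *next* subframe.
    """
    if not 0 <= truncated_tow < SECONDS_PER_WEEK // SUBFRAME_SECONDS:
        raise ValueError("truncated TOW count out of range")
    return truncated_tow * SUBFRAME_SECONDS

# e.g. a truncated count of 3 marks the subframe boundary at 18 s of week
assert tow_from_how(3) == 18
```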
My question is twofold, since I can't find this information anywhere:
1. Do satellites indeed start broadcasting as soon as the week begins (according to their internal clocks), meaning the first subframe is broadcast at exactly 0.0000000000 s of the week, the second at 6.0000000000 s, etc.?
2. Do satellites always, without exception, transmit HOW words at exactly those intervals, i.e. 0.0000000000 s, then 6.0000000000 s, etc.? Or can the HOW sequence sometimes read (translated to seconds) 0, 4.5, 12, 18, 22.5, etc. (as opposed to 0, 6, 12, 18, 24, ...), because that 4.5 was in fact 5.9997994899 s and the 22.5 was actually 23.9999593899 s? Or do they always broadcast 0, 6, 12, 18, 24, ... but don't actually start at 0.0000000000 s of the week, starting instead at, say, 0.0000030004 s, which means their next subframe goes out at 6.0000030004 s?
I can only imagine this GPS timing scheme working if satellites are programmed to send these subframes at exact x.0000000000 s intervals according to their internal clocks, and receivers are expected to trust that this is the case; a sketch of the time reconstruction this would allow follows.
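To spell out what I mean, here is a minimal sketch (again Python, with hypothetical names) of the reconstruction I imagine: if the subframe boundary really leaves the satellite at exactly the TOW instant (satellite clock corrections already applied), then the receiver, which already knows its range to the satellite, recovers precise GPS time despite the coarse counter:

```python
C = 299_792_458.0  # speed of light, m/s

def gps_time_at_reception(tow_seconds: float, range_m: float) -> float:
    """GPS time of week at the instant the receiver detects the
    subframe boundary: the transmit time stamped by the HOW plus the
    propagation delay over the (already solved-for) geometric range."""
    return tow_seconds + range_m / C

# e.g. a satellite ~20,200 km away: the boundary stamped 18 s arrives
# ~67.4 ms later, giving time resolution far below the counter's 1.5 s
print(gps_time_at_reception(18.0, 20_200_000.0))  # ~18.0674 s
```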
If satellites don't emit subframes at precisely x.0000000000 s, then in my view we cannot possibly determine precise GPS time, because the 1.5-second (in practice 6-second) counter that the HOW represents has too low a resolution (1.5 seconds, as opposed to, say, milliseconds or microseconds). Maybe this does happen, and because there are multiple satellites involved (4+), the receiver figures out that a value like 22.5 s, when the actual time is 23.9999593899 s, is too big an offset, because the three other satellites are showing 24 s; a toy version of that check is sketched below.
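Purely to illustrate the kind of sanity check I'm imagining (my guess, not a documented receiver algorithm):

```python
from statistics import median

def flag_outliers(boundary_times: list[float], tol: float = 0.1) -> list[int]:
    """Return indices of satellites whose implied subframe-boundary
    times differ from the consensus (median) by more than `tol` seconds."""
    consensus = median(boundary_times)
    return [i for i, t in enumerate(boundary_times)
            if abs(t - consensus) > tol]

# three satellites agree on ~24 s; one implies 22.5 s and gets flagged
print(flag_outliers([24.0000001, 23.9999998, 24.0000002, 22.5]))  # [3]
```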
A few things to note here, so they don't need to be covered in answers:

- I'm aware that there's an offset between GPS time and UTC, and between receiver and satellite clocks.
- I know the satellite clocks drift relative to GPS time, that due to relativistic effects of the satellites' velocity (and altitude) they're set to 'tick' at a slightly different frequency than ground-based atomic clocks, and that there are instrument errors.
- I know how receiver position is obtained.
- I know there's extra clock-correcting information embedded in the NAV message, precisely due to the drift, etc., of satellite clocks.