The type of undefined is its own primitive type, as opposed to null, which coerces like a number. Despite how the result may look, undefined is not treated as the string "undefined" in arithmetic: when it is coerced to a number it becomes NaN. (A string operand with + would produce concatenation, so "undefined" + 3 would actually give "undefined3", not NaN.)
So if you have a = undefined (which is exactly what a bare let a; declaration gives you), then a + 3 evaluates as NaN + 3, which is NaN.
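A minimal sketch of this, runnable in any JavaScript console (the variable name a just mirrors the one above):

```js
// An uninitialized variable holds undefined.
let a;                          // same as: let a = undefined;

// Under numeric conversion, undefined becomes NaN,
// so any arithmetic with it produces NaN.
console.log(Number(undefined)); // NaN
console.log(a + 3);             // NaN
```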
It is quite the contrary with null: null is a deliberate "no value" of its own type (historically meaning "no object"), and under numeric conversion it coerces to the number 0 instead. So null + 3 evaluates as 0 + 3, which is 3.
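And the same sketch with null:

```js
// null coerces to 0 under numeric conversion,
// so adding a number to it behaves like 0 + n.
let b = null;
console.log(Number(null)); // 0
console.log(b + 3);        // 3, i.e. 0 + 3
```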
Therefore, make sure you initialize your variables with values of the appropriate type before using them in expressions together with values of a specific type.
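For example (total and label are just illustrative names):

```js
// Initialize with a value of the intended type
// instead of leaving the variable undefined.
let total = 0;          // a number from the start
console.log(total + 3); // 3, no NaN involved

let label = "";         // a string from the start
console.log(label + 3); // "3", string concatenation as intended
```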