The first example may compile with warnings, e.g. in gcc:
```
main.c: In function ‘main’:
main.c:4:15: warning: multi-character character constant [-Wmultichar]
    4 |     char c2 = 'ab';
      |               ^~~~
main.c:4:15: warning: overflow in conversion from ‘int’ to ‘char’ changes value from ‘24930’ to ‘98’ [-Woverflow]
```
This is expected, since per the C standard:

> The value of an integer character constant containing more than one character (e.g., 'ab'), [...] is implementation-defined.
The warnings also reveal this implementation's behaviour: 24930 = 0x6162, where 0x61 is ASCII 'a' and 0x62 is ASCII 'b'. So gcc builds the constant as `('a' << 8) | 'b'`, and the conversion of that `int` to a `char` keeps only the least significant byte, 0x62, i.e. 'b' (98).
The behaviour of the second example is no surprise. Input is neither assignment nor initialisation; it is not C language behaviour at all, but system I/O behaviour. Input is buffered, and a request for a single character takes the first character from the first-in-first-out (FIFO) input queue. When you type 'a' followed by 'b', the first character is of course 'a'.
A second input request would retrieve the 'b', whereas in the initialisation example the 'b' is simply discarded by the conversion. The two cases are not comparable: the input is a FIFO queue of single characters, whereas 'ab' is a single `int` (in this implementation).