#include <stdio.h>
int main()
{
    char *a;
    char *temp = '55515';
    a = &temp;
    printf("%s ", a);
}
I expected the output to be 55515, but the actual output is 5155. Why?
'55515' is a multi-character constant, and a character constant has type int. Your platform has 32-bit ints, so the most significant byte (the leading '5') is discarded and the resulting value is (int)0x35353135. Initializing a char * with that int converts it to a pointer in an implementation-defined manner; on your platform the conversion evidently preserves the bit pattern. Your platform is also little-endian, so the pointer object temp is laid out in memory as
temp:
| 0x35 | 0x31 | 0x35 | 0x35 |                             (32-bit pointers)
or
| 0x35 | 0x31 | 0x35 | 0x35 | 0x00 | 0x00 | 0x00 | 0x00 | (64-bit pointers)
From the output alone it cannot be deduced whether you are on a 32-bit or a 64-bit platform; in both cases the first four bytes read as "5155".
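You can observe the first half of this yourself. Here is a minimal sketch; it assumes GCC/Clang-style multi-character-constant packing and a little-endian machine, so treat the expected output as illustrative, not guaranteed:

#include <stdio.h>

int main(void)
{
    /* Multi-character constants are implementation-defined. With GCC
       and Clang, '5515' packs as ('5'<<24)|('5'<<16)|('1'<<8)|'5',
       i.e. 0x35353135 - the same value '55515' collapses to once the
       leading character is discarded on a 32-bit int. */
    int v = '5515';
    printf("value: 0x%08x\n", (unsigned)v);   /* expected: 0x35353135 */

    /* On a little-endian machine the int is stored low byte first,
       so the bytes in memory read 0x35 0x31 0x35 0x35 - "5155". */
    unsigned char *p = (unsigned char *)&v;
    for (size_t i = 0; i < sizeof v; i++)
        printf("byte %zu: 0x%02x '%c'\n", i, p[i], p[i]);
    return 0;
}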
Now a = &temp; makes a point at the first byte of the pointer object temp itself, i.e. the byte 0x35, and printf then reads from that address as a string with %s.
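To make the aliasing concrete, here is a sketch that reproduces the accident with an explicit cast (so it compiles without the constraint violation) and then dumps the bytes of the pointer object itself; the output shown assumes a little-endian, 64-bit platform:

#include <stdio.h>

int main(void)
{
    /* The same bit pattern the buggy initializer produced, stored via
       an explicit cast; the conversion is implementation-defined. */
    char *temp = (char *)0x35353135;

    /* Walk the bytes of the pointer object, not what it points to.
       On a little-endian 64-bit platform this prints
       35 31 35 35 00 00 00 00 - "5155" followed by the zero bytes
       that let "%s" stop cleanly. */
    unsigned char *p = (unsigned char *)&temp;
    for (size_t i = 0; i < sizeof temp; i++)
        printf("%02x ", p[i]);
    printf("\n");
    return 0;
}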
Depending on your platform, the printf call has either merely implementation-defined behaviour or outright undefined behaviour; which one depends on whether your pointers are 32 or 64 bits wide. If they are 32 bits, the pointer object holds only the four bytes "5155" with no guaranteed terminating null byte, so %s reads past the end of the object, which is undefined behaviour. If they are 64 bits, the upper four bytes are zero, the string terminates within the object, and you are relying "only" on implementation-defined behaviour. All in all, not something that a strictly conforming program can rely on.
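For completeness, what was presumably intended is a string literal and a plain pointer assignment, which prints 55515 portably:

#include <stdio.h>

int main(void)
{
    char *a;
    char *temp = "55515";   /* double quotes: a string literal, not a character constant */
    a = temp;               /* temp already points at the string; no & needed */
    printf("%s\n", a);      /* prints 55515 */
    return 0;
}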