How big is a char? A short? An int? A long?
One byte, at least two bytes, at least two bytes, and at least four bytes. Other than that, don’t count on
anything.
A char is defined as being exactly one byte long, and a byte is at least eight bits. That’s easy.
A short is at least two bytes long. On some machines, with some compilers, it might be four bytes, or even longer.

An int is the “natural” size of an integer, as long as that’s at least two bytes long and at least as big as a short. On a 16-bit machine, an int is probably two bytes long; on a 32-bit machine, probably four. When 64-bit machines become common, their ints will probably be eight bytes long. The operative word is “probably.” For example, the original Motorola 68000 was a hybrid 16/32-bit machine, and one 68000 compiler generated either two-byte ints or four-byte ints, depending on a command-line option.

A long is at least as big as an int (and thus at least as big as a short), and it must be at least four bytes long. Compilers for 32-bit machines might make shorts, ints, and longs all be four bytes long; then again, they might not.
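If you want to know what your own compiler does, you can simply ask it. Here is a minimal sketch that prints the four sizes with sizeof; the cast to unsigned long keeps the printf calls portable on older compilers:

#include <stdio.h>

int main(void)
{
    /* sizeof yields the size in bytes; results vary by machine and compiler */
    printf("char:  %lu\n", (unsigned long) sizeof(char));   /* always 1 */
    printf("short: %lu\n", (unsigned long) sizeof(short));
    printf("int:   %lu\n", (unsigned long) sizeof(int));
    printf("long:  %lu\n", (unsigned long) sizeof(long));
    return 0;
}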
If you need some integral variable to be four bytes long, don’t assume that an int or a long will do. Instead,
have a typedef to some built-in type (one probably exists), and surround it with #ifdefs:
/* FOUR_BYTE_LONG is defined (by the makefile, say) on platforms
   where long is known to be four bytes */
#ifdef FOUR_BYTE_LONG
typedef long int4;
#endif
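If you would rather not maintain a macro like FOUR_BYTE_LONG by hand, a sketch of an alternative is to let the range constants in <limits.h> pick the type. This equates a 32-bit range with a four-byte representation, which holds on typical eight-bit-byte machines but is an assumption, not a guarantee. (On C99 and later compilers, int32_t from <stdint.h> serves the same purpose.)

#include <limits.h>

#if INT_MAX == 2147483647
typedef int int4;        /* int has a 32-bit range on this compiler */
#elif LONG_MAX == 2147483647
typedef long int4;       /* fall back to long where int is smaller */
#else
#error "no four-byte integral type found"
#endif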
You might use such a type if you need to write an integer variable as a stream of bytes, to a file or to a network, to be read by a different machine. (If you do, you should see the next FAQ as well.) If you need some integral variable to be two bytes long, you might be in trouble! There’s no guarantee such a beast exists. You can always squeeze a small value into a two-char array; see the next FAQ for details.
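As a preview of that trick, here is a minimal sketch that squeezes a value (assumed to fit in 16 bits) into a two-char array, high byte first, and pulls it back out:

#include <stdio.h>

int main(void)
{
    unsigned int value = 513;   /* 0x0201; must fit in 16 bits */
    unsigned char buf[2];
    unsigned int copy;

    buf[0] = (unsigned char) ((value >> 8) & 0xFF);   /* high byte */
    buf[1] = (unsigned char) (value & 0xFF);          /* low byte */

    /* Reassemble; this works regardless of the machine's byte order,
       because the shifts, not the memory layout, define the format. */
    copy = ((unsigned int) buf[0] << 8) | buf[1];

    printf("%u -> %02X %02X -> %u\n", value, buf[0], buf[1], copy);
    return 0;
}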
Cross Reference:
X.6: How are 16- and 32-bit numbers stored?
XV.5: What’s the difference between big-endian and little-endian machines?