Is there any difference between the int64_t and long data types in C? Does int64_t reduce the portability of the code?

Have a look at: C data types

As you can see, it all depends on the platform and the compiler, but you can rely on sizeof(long) to report the actual size in bytes on a given platform, and you can define your own typedefs, like:
#ifdef _MSC_VER
// Convenience aliases for the MSVC data model, where short is 16 bits,
// int is 32 bits and long long is 64 bits.
typedef unsigned char uchar;
typedef unsigned char byte;
typedef char sbyte;            // char is signed by default with MSVC
typedef short int16;
typedef unsigned short uint16;
typedef unsigned short ushort;
typedef int int32;
typedef unsigned int uint32;
typedef unsigned int uint;

typedef long long int64;
typedef unsigned long long uint64;

#endif
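
If you go down that route, it can be worth verifying the aliases at compile time, so that a change of compiler or data model is caught immediately. A minimal sketch, assuming a C++11 compiler (for static_assert) and the typedefs above being in scope:
C++
// Compile-time sanity check: the build fails right away if an alias
// does not have the width its name promises.
#include <climits>   // CHAR_BIT

#ifdef _MSC_VER
static_assert(sizeof(int16)  * CHAR_BIT == 16, "int16 must be 16 bits");
static_assert(sizeof(uint32) * CHAR_BIT == 32, "uint32 must be 32 bits");
static_assert(sizeof(int64)  * CHAR_BIT == 64, "int64 must be 64 bits");
#endif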


Best regards
Espen Harlinn
 
Comments
Sergey Alexandrovich Kryukov 25-Apr-13 18:44pm    
All correct, a 5.
—SA
Espen Harlinn 25-Apr-13 18:45pm    
Thank you, Sergey :-D
The size of some basic data types depends on the data model used by your compiler, so with different compilers the sizes of your integral types may differ. For a list of the data models compilers use (LP64, ILP64, LP32, ILP32, ...), check out these sites:
http://www.unix.org/version2/whatsnew/lp64_wp.html
http://msdn.microsoft.com/en-us/library/windows/desktop/aa384083%28v=vs.85%29.aspx
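To see which data model your own compiler uses, you can simply print a few sizes. A quick sketch (on LP64 systems, such as a typical 64-bit unix, long comes out as 8 bytes; on LLP64, i.e. 64-bit Windows, it stays at 4):
C++
#include <iostream>

int main()
{
    std::cout << "int:       " << sizeof(int)       << " bytes\n";
    std::cout << "long:      " << sizeof(long)      << " bytes\n";
    std::cout << "long long: " << sizeof(long long) << " bytes\n";
    std::cout << "void*:     " << sizeof(void*)     << " bytes\n";
}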

I usually define my own fixed size data types (int8 - int64 and uint8 - uint64) with the following technique:
C++
#ifdef _MSC_VER

// Using the Visual C++ specific __intN types
typedef signed __int8    int8;
typedef unsigned __int8  uint8;
typedef signed __int16   int16;
typedef unsigned __int16 uint16;
typedef signed __int32   int32;
typedef unsigned __int32 uint32;
typedef signed __int64   int64;
typedef unsigned __int64 uint64;

#else

// Using the standard fixed-width types from <stdint.h>
#include <stdint.h>
typedef int8_t   int8;
typedef uint8_t  uint8;
typedef int16_t  int16;
typedef uint16_t uint16;
typedef int32_t  int32;
typedef uint32_t uint32;
typedef int64_t  int64;
typedef uint64_t uint64;

#endif


Newer Visual C++ versions ship an <stdint.h> header just like most unix systems. It is a standard header containing int8_t and friends, so it is very portable.
Another small detail is that char, signed char, and unsigned char are three distinct types! char and signed char are not the same type, as many assume; an implicit conversion simply covers the difference in most cases. As proof, you can write three overloaded functions accepting these three different types!
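For example, the following sketch compiles cleanly, and each call resolves to a different overload:
C++
#include <iostream>

// Three distinct overloads: legal only because the three character
// types really are three different types.
void f(char)          { std::cout << "char\n"; }
void f(signed char)   { std::cout << "signed char\n"; }
void f(unsigned char) { std::cout << "unsigned char\n"; }

int main()
{
    char c = 'a';
    signed char sc = 'b';
    unsigned char uc = 'c';
    f(c);   // prints "char"
    f(sc);  // prints "signed char"
    f(uc);  // prints "unsigned char"
}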
 
A long on some systems is 32 bits (the same as an int), while int64_t is defined as a 64-bit integer on every system that provides it (typically the same as long long).

Portability may suffer when you use long, whereas int64_t looks like it was created precisely to increase portability.
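
As a sketch of what that buys you (assuming C++11 and the <cstdint>/<cinttypes> headers are available): the code below compiles and prints the same value whether long is 32 or 64 bits, because int64_t is exactly 64 bits wherever it is defined.
C++
#include <cstdint>
#include <cinttypes>
#include <cstdio>

int main()
{
    // 9,000,000,000 does not fit in a 32-bit long, but always fits in int64_t.
    std::int64_t big = INT64_C(9000000000);
    // PRId64 expands to the printf format specifier matching int64_t.
    std::printf("big = %" PRId64 "\n", big);
}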
 
 
Comments
Sergey Alexandrovich Kryukov 25-Apr-13 18:48pm    
Your statements about int64_t are of course correct, but, as far as I understand, the sizes of types like long long are, by the standard, implementation-dependent. So I would not be surprised if, on some systems, long long or even long were compiled as a 128-bit word; if not now, then in the near future...
—SA
Ron Beyer 25-Apr-13 19:20pm    
What I meant was that on a 32-bit system, where int and long are the same size, long long is a 64-bit integer. I'm really not sure whether long long or long is ever a 128-bit integer, but it is certainly plausible that it could be in the future.
H.Brydon 25-Apr-13 20:53pm    
Whether or not your statement is true now on certain systems, as SA says, you can't count on it forever or on all architectures. The sizes of int, long, and long long are implementation-dependent and up to the compiler writers to define for a particular platform.

Having said that, you can count on int32_t being exactly 32 bits and int64_t being exactly 64 bits on every platform where they are defined.
