Basically I want to ask: does the size of a data type (such as int) change with the language used, or not?

What I have tried:

I was not able to find the answer to this myself.
Posted
Updated 9-Aug-18 3:46am

Whooo! Nope. It's not just the language, it's the environment in which the language will be run.
For example, I've seen C compilers that generate code using a 16 bit int, a 32 bit int, and if I recall correctly I've seen a 128 bit integer type as well!

Modern languages - or at least frameworks - define a size for an int, normally 32 bits, as this allows for a good-sized range of -2,147,483,648 to 2,147,483,647. But if it isn't explicitly defined by the language, then it is completely implementation dependent.

So "no". It's not the same.
 
 
Google is your friend...
Integer (computer science) - Wikipedia
 
 
Quote:
Is the size of int the same for all languages, or does it vary with the language you use?

The size of int has evolved over time.
In the 80s, int was commonly 16 bits because processors were 8 or 16 bits;
now it is usually 32 bits, even on most 64-bit platforms.
The C standard has never fixed an exact size: it only requires that int hold at least the range -32767 to 32767, i.e. at least 16 bits.
So if you get a C program from the 70s or 80s, it may well assume a 16 bit int; the typical size has changed over time.
 
 

This content, along with any associated source code and files, is licensed under The Code Project Open License (CPOL)


