Question Hexadecimal vs integers

Badger101

Member
Joined
Dec 14, 2020
Messages
20
Programming Experience
Beginner
So I have a question: if I'm making a project with a lot of integer variables, is it better to write int num = 6; or int num = 0x0006;?
Is there a performance difference between the two? Will it run faster with hexadecimals, or slower?
 
It is not "vs", it's just a coding notation; decimal, hexadecimal and binary notation are all translated to the same int value by the compiler. You can see that in your code as well: it is just an int variable in all cases:
C#:
var i1 = 1;
var i2 = 0x1;
var i3 = 0b1;
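A quick way to convince yourself is to compare the literals directly; all three notations denote the exact same value, and printing an int always shows it in decimal regardless of how the literal was written (a minimal sketch):

```csharp
using System;

// All three literals denote the same int value; only the
// source-code notation differs. The compiled code is identical.
int fromDecimal = 6;
int fromHex = 0x0006;
int fromBinary = 0b0110;

Console.WriteLine(fromDecimal == fromHex);  // True
Console.WriteLine(fromHex == fromBinary);   // True
Console.WriteLine(fromHex);                 // 6 -- prints in decimal by default
```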
 
Thanks, I wasn't sure if there was a difference between them.
 
It's a case of hexadecimal vs decimal, not hexadecimal vs integer. As suggested, hexadecimal and decimal are just ways of displaying integers with different bases: 16 or 10.

In the vast majority of cases, you'll use decimal. Hexadecimal is most often used for constants for API calls. If you have a situation where hexadecimal notation adds clarity then use hexadecimal, but such cases will be rare.
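For instance, bit flags are one place where hexadecimal notation genuinely adds clarity, since each hex digit maps to exactly four bits (a hypothetical sketch, not taken from the thread above; the FilePermission enum is made up for illustration):

```csharp
using System;

// Hex literals line up with the underlying bit pattern,
// so each flag's bit position is obvious at a glance.
var perms = FilePermission.Read | FilePermission.Write;
Console.WriteLine(perms);       // Read, Write
Console.WriteLine((int)perms);  // 3

[Flags]
enum FilePermission
{
    None    = 0x0,
    Read    = 0x1,  // bit 0
    Write   = 0x2,  // bit 1
    Execute = 0x4,  // bit 2
}
```

Writing the same constants in decimal (1, 2, 4, 8, 16, ...) works identically at runtime; hex just makes the power-of-two pattern easier to read and extend.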
 