Discussion:
16-bit int
Göran Steen
2012-08-09 11:26:17 UTC
Permalink
Hi!

I use gcc version 4.3.4, where int is a 32-bit type. Is it possible to set up the compiler to compile int as a 16-bit type?

With best regards / Med vänlig hälsning / Mit freundlichen Grüßen / Saudações

Göran Steen
Senior Software Developer
EIS Mälardalen
 
EIS by Semcon AB
Patentgatan 8
112 67 STOCKHOLM
Sweden
 
Phone 08-56290697
Mobile 073-6840004
E-mail ***@eis.semcon.com
 
www.semcon.com
www.facebook.com/semcon
David Brown
2012-08-09 13:37:05 UTC
Permalink
Post by Göran Steen
Hi!
I use gcc version 4.3.4, where int is 32-bit variables. Is it
possible to setup compiler to compile int as 16-bit variables?
With best regards / Med vänlig hälsning / Mit freundlichen Grüßen / Saudações
Göran Steen
The size of an int depends on the target - gcc supports dozens of
targets. Most have 32-bit ints, but some have 16-bit ints and at least
one has a compile-time option to support 8-bit ints (though that goes
against C standards, and is deprecated on current builds). There are
probably also targets with 64-bit ints.

So step one in asking for help here is to tell us your target.

Step two is to tell us what you are hoping to achieve. Almost
certainly, there is no way to change the int size - and even if there
happens to be a command-line switch for the given target, it is probably
not a good idea (you'll get in a horrible mess with library
compatibility, for example). And even if it is possible, it is highly
unlikely to be advantageous. Tell us what you really want to achieve
here, and people can give you advice towards that.

mvh.,

David
Vincent Lefevre
2012-08-09 18:52:10 UTC
Permalink
The size of an int depends on the target - gcc supports dozens of targets.
Most have 32-bit ints, but some have 16-bit ints and at least one has a
compile-time option to support 8-bit ints (though that goes against C
standards, and is deprecated on current builds). There are probably also
targets with 64-bit ints.
Couldn't the list of supported targets, with type size and other
similar information, be available on some web page?
--
Vincent Lefèvre <***@vinc17.net> - Web: <http://www.vinc17.net/>
100% accessible validated (X)HTML - Blog: <http://www.vinc17.net/blog/>
Work: CR INRIA - computer arithmetic / AriC project (LIP, ENS-Lyon)
David Brown
2012-08-09 19:58:36 UTC
Permalink
Post by Vincent Lefevre
The size of an int depends on the target - gcc supports dozens of targets.
Most have 32-bit ints, but some have 16-bit ints and at least one has a
compile-time option to support 8-bit ints (though that goes against C
standards, and is deprecated on current builds). There are probably also
targets with 64-bit ints.
Couldn't the list of supported targets, with type size and other
similar information, be available on some web page?
There is probably some list in the gcc internals documents, and of
course you could see it in the source configuration files if you want to
go there. But you can get a fair idea from the manual on the
target-specific options, as every target will have a few options of
their own:

<http://gcc.gnu.org/onlinedocs/gcc/Submodel-Options.html>

This doesn't list the sizes of int or other target-specific information.
However, usually you know what target processor you are going to use,
and usually you know the basic size of the processor (if not, then you
really should find out before starting to use it!).

There are also a number of out-of-tree gcc ports.
Vincent Lefevre
2012-08-09 22:38:52 UTC
Permalink
There is probably some list in the gcc internals documents, and of course
you could see it in the source configuration files if you want to go there.
But you can get a fair idea from the manual on the target-specific options,
<http://gcc.gnu.org/onlinedocs/gcc/Submodel-Options.html>
This doesn't list the sizes of int or other target-specific information.
However, usually you know what target processor you are going to use, and
usually you know the basic size of the processor (if not, then you really
should find out before starting to use it!).
There are many processors I don't know. What I'm interested in is
information like: is there any target that has some given int type
size?
There are also a number of out-of-tree gcc ports.
Similar information on them would be interesting too.
--
Vincent Lefèvre <***@vinc17.net> - Web: <http://www.vinc17.net/>
100% accessible validated (X)HTML - Blog: <http://www.vinc17.net/blog/>
Work: CR INRIA - computer arithmetic / AriC project (LIP, ENS-Lyon)
Ian Lance Taylor
2012-08-09 22:51:44 UTC
Permalink
Post by Vincent Lefevre
There are many processors I don't know. What I'm interested in is
information like: is there any target that has some given int type
size?
In the GCC source code:

grep INT_TYPE_SIZE gcc/config/*/*.h
Post by Vincent Lefevre
Post by David Brown
There are also a number of out-of-tree gcc ports.
Similar information on them would be interesting too.
Pretty hard to come by, though. Many out-of-tree ports are
out-of-tree for a reason.

Ian
Jose-Marcio Martins da Cruz
2012-08-10 06:52:47 UTC
Permalink
Post by Ian Lance Taylor
Post by Vincent Lefevre
There are many processors I don't know. What I'm interested in is
information like: is there any target that has some given int type
size?
grep INT_TYPE_SIZE gcc/config/*/*.h
Post by Vincent Lefevre
Post by David Brown
There are also a number of out-of-tree gcc ports.
Similar information on them would be interesting too.
Pretty hard to come by, though. Many out-of-tree ports are
out-of-tree for a reason.
Usually, a regular compiler user just needs to know the size of int on the particular
architecture he's working on, not on every processor/OS existing in the wild. So the
simplest approach is just to run a program like:

#include <stdio.h>

int main(void)
{
    /* sizeof yields a size_t, so use the %zu format (C99), not %d */
    printf("sizeof int = %zu\n", sizeof(int));
    return 0;
}


If someone needs, for some valid reason, integer variables that are 16 bits wide, the
good practice for writing clean, portable and maintainable code is to use the "int16_t"
integer type instead of plain "int".

JM
--
Envoyé de ma machine à écrire.
---------------------------------------------------------------
Vincent Lefevre
2012-08-11 10:14:31 UTC
Permalink
Post by Ian Lance Taylor
Post by Vincent Lefevre
There are many processors I don't know. What I'm interested in is
information like: is there any target that has some given int type
size?
grep INT_TYPE_SIZE gcc/config/*/*.h
Thanks. Actually

grep 'define[[:space:]]*INT_TYPE_SIZE' gcc/config/*/*.h

to get exactly what I need. And

grep 'define[[:space:]]*LONG_TYPE_SIZE' gcc/config/*/*.h
grep 'define[[:space:]]*LONG_LONG_TYPE_SIZE' gcc/config/*/*.h

for the long type and long long type respectively. So, I can see that
one just has: 16/32/64, 32/32/64 and 32/64/64.
--
Vincent Lefèvre <***@vinc17.net> - Web: <http://www.vinc17.net/>
100% accessible validated (X)HTML - Blog: <http://www.vinc17.net/blog/>
Work: CR INRIA - computer arithmetic / AriC project (LIP, ENS-Lyon)
Georg-Johann Lay
2012-08-10 06:45:43 UTC
Permalink
Post by David Brown
Post by Göran Steen
I use gcc version 4.3.4, where int is 32-bit variables. Is it
possible to setup compiler to compile int as 16-bit variables?
The m68k port has 32-bit int; -mshort turns it into 16 bits.
Post by David Brown
The size of an int depends on the target - gcc supports dozens of
targets. Most have 32-bit ints, but some have 16-bit ints and at least
one has a compile-time option to support 8-bit ints (though that goes
against C standards, and is deprecated on current builds). There are
probably also targets with 64-bit ints.
You mean the avr port? int is 16 bits wide and -mint8 implements the
non-standard 8-bit int. That option was broken for quite some time,
see PR46261. But that PR is fixed now.

Where in the release notes did you read about deprecation of -mint8?

Johann
David Brown
2012-08-10 08:07:40 UTC
Permalink
Post by Georg-Johann Lay
Post by David Brown
Post by Göran Steen
I use gcc version 4.3.4, where int is 32-bit variables. Is it
possible to setup compiler to compile int as 16-bit variables?
The m68k port has 32-bit int; -mshort turns it into 16 bits.
Post by David Brown
The size of an int depends on the target - gcc supports dozens of
targets. Most have 32-bit ints, but some have 16-bit ints and at
least one has a compile-time option to support 8-bit ints (though that
goes against C standards, and is deprecated on current builds). There
are probably also targets with 64-bit ints.
You mean the avr port? int is 16 bits wide and -mint8 implements the
non-standard 8-bit int. That option was broken for quite some time,
see PR46261. But that PR is fixed now.
Where in the release notes did you read about deprecation of -mint8?
I thought -mint8 had been broken for so long, and that there were no
plans to fix it or support it in avrlibc, especially as the main reason
for its existence (inefficient code generated for uint8_t and int8_t
with normal 16-bit ints) is much less of a problem with newer versions
of avr-gcc.

I guess I incorrectly extrapolated from this that the option was
deprecated - I certainly hadn't noticed that the PR had been fixed.

Thanks for the correction.
Paulo J. Matos
2012-08-09 17:14:03 UTC
Permalink
Post by Göran Steen
Hi!
I use gcc version 4.3.4, where int is 32-bit variables. Is it possible to setup compiler to compile int as 16-bit variables?
With best regards / Med vänlig hälsning / Mit freundlichen Grüßen / Saudações
David answered your question as it came across... however, if I
re-interpret your question as wanting an int type that is 16 bits, then
you can try int16_t from stdint.h.

Cheers,
--
PMatos
Göran Steen
2012-08-10 07:15:46 UTC
Permalink
Thank you for your response.

I want to make sure that my code snippets, if they are compiled and run on a machine with 16-bit int, still work. I don't have access to such a machine, so I want to compile and run them with 16-bit int on my machine, which defaults to 32-bit int. The sizes of the intermediate results are especially interesting: what happens if they are truncated to 16 bits?

There will not be any problem with library compatibility.

BR /göran

-----Original Message-----
From: David Brown [mailto:***@westcontrol.com]
Sent: den 9 augusti 2012 15:37
To: Göran Steen
Cc: gcc-***@gcc.gnu.org
Subject: Re: 16-bit int
Post by Göran Steen
Hi!
I use gcc version 4.3.4, where int is 32-bit variables. Is it possible
to setup compiler to compile int as 16-bit variables?
With best regards / Med vänlig hälsning / Mit freundlichen Grüßen /
Saudações
Göran Steen
The size of an int depends on the target - gcc supports dozens of targets. Most have 32-bit ints, but some have 16-bit ints and at least one has a compile-time option to support 8-bit ints (though that goes against C standards, and is deprecated on current builds). There are probably also targets with 64-bit ints.

So step one in asking for help here is to tell us your target.

Step two is to tell us what you are hoping to achieve. Almost certainly, there is no way to change the int size - and even if there happens to be a command-line switch for the given target, it is probably not a good idea (you'll get in a horrible mess with library compatibility, for example). And even if it is possible, it is highly unlikely to be advantageous. Tell us what you really want to achieve here, and people can give you advice towards that.

mvh.,

David
David Brown
2012-08-10 08:12:05 UTC
Permalink
Post by Göran Steen
Thank you for your response.
I want to make sure that my code snippets, if they are compiled and
run on a machine with 16-bit int, still works. I don't have access to
such a machine, so I want to compile and run them with 16-bit int on
my machine that defaults to 32-bit int. Especially the intermediate
results' size are interesting. What happens if they are truncated to
16-bit?
The answer here is simple - #include <stdint.h>, and use types like
int_fast16_t. If intermediary results might need 32 bits, cast to
int_fast32_t as needed. On a target with 32-bit ints, both these types
will normally be 32-bit (though int_fast16_t could theoretically be
16-bit), and the cast will do nothing. On a target with 16-bit ints,
you will get 16-bit and 32-bit ints respectively.

This will give you optimal code for all sizes of target ints, while also
being correct on each target.

mvh.,

David
Post by Göran Steen
There will not be any problem with library compatibility.
BR /göran
-----Original Message----- From: David Brown
Post by Göran Steen
Hi!
I use gcc version 4.3.4, where int is 32-bit variables. Is it
possible to setup compiler to compile int as 16-bit variables?
With best regards / Med vänlig hälsning / Mit freundlichen Grüßen
/ Saudações
Göran Steen
The size of an int depends on the target - gcc supports dozens of
targets. Most have 32-bit ints, but some have 16-bit ints and at
least one has a compile-time option to support 8-bit ints (though
that goes against C standards, and is deprecated on current builds).
There are probably also targets with 64-bit ints.
So step one in asking for help here is to tell us your target.
Step two is to tell us what you are hoping to achieve. Almost
certainly, there is no way to change the int size - and even if there
happens to be a command-line switch for the given target, it is
probably not a good idea (you'll get in a horrible mess with library
compatibility, for example). And even if it is possible, it is
highly unlikely to be advantageous. Tell us what you really want to
achieve here, and people can give you advice towards that.
mvh.,
David
Göran Steen
2012-08-10 08:35:20 UTC
Permalink
Thank you for your response.

Sorry, this will not help. I want to test the snippets, not change them. Besides, I don't know if the compiler that will be used for the 16-bit machine will support any of the types you suggest, so I only want to use standard types.

BR /göran

-----Original Message-----
From: David Brown [mailto:***@westcontrol.com]
Sent: den 10 augusti 2012 10:12
To: Göran Steen
Cc: gcc-***@gcc.gnu.org
Subject: Re: 16-bit int
Post by Göran Steen
Thank you for your response.
I want to make sure that my code snippets, if they are compiled and
run on a machine with 16-bit int, still works. I don't have access to
such a machine, so I want to compile and run them with 16-bit int on
my machine that defaults to 32-bit int. Especially the intermediate
results' size are interesting. What happens if they are truncated to
16-bit?
The answer here is simple - #include <stdint.h>, and use types like int_fast16_t. If intermediary results might need 32 bits, cast to int_fast32_t as needed. On a target with 32-bit ints, both these types will normally be 32-bit (though int_fast16_t could theoretically be 16-bit), and the cast will do nothing. On a target with 16-bit ints, you will get 16-bit and 32-bit ints respectively.

This will give you optimal code for all sizes of target ints, while also being correct on each target.

mvh.,

David
Post by Göran Steen
There will not be any problem with library compatibility.
BR /göran
-----Original Message----- From: David Brown
Post by Göran Steen
Hi!
I use gcc version 4.3.4, where int is 32-bit variables. Is it
possible to setup compiler to compile int as 16-bit variables?
With best regards / Med vänlig hälsning / Mit freundlichen Grüßen /
Saudações
Göran Steen
The size of an int depends on the target - gcc supports dozens of
targets. Most have 32-bit ints, but some have 16-bit ints and at
least one has a compile-time option to support 8-bit ints (though that
goes against C standards, and is deprecated on current builds).
There are probably also targets with 64-bit ints.
So step one in asking for help here is to tell us your target.
Step two is to tell us what you are hoping to achieve. Almost
certainly, there is no way to change the int size - and even if there
happens to be a command-line switch for the given target, it is
probably not a good idea (you'll get in a horrible mess with library
compatibility, for example). And even if it is possible, it is highly
unlikely to be advantageous. Tell us what you really want to achieve
here, and people can give you advice towards that.
mvh.,
David
David Brown
2012-08-10 08:45:15 UTC
Permalink
Could you please not top-post? It makes it very difficult to follow the
order of the thread. Remember that some people see these emails in
isolation, rather than threaded in an email client. I've re-ordered the
mail below.
Post by Göran Steen
-----Original Message----- From: David Brown
Post by Göran Steen
Thank you for your response.
I want to make sure that my code snippets, if they are compiled
and run on a machine with 16-bit int, still works. I don't have
access to such a machine, so I want to compile and run them with
16-bit int on my machine that defaults to 32-bit int. Especially
the intermediate results' size are interesting. What happens if
they are truncated to 16-bit?
The answer here is simple - #include <stdint.h>, and use types like
int_fast16_t. If intermediary results might need 32 bits, cast to
int_fast32_t as needed. On a target with 32-bit ints, both these
types will normally be 32-bit (though int_fast16_t could
theoretically be 16-bit), and the cast will do nothing. On a target
with 16-bit ints, you will get 16-bit and 32-bit ints respectively.
This will give you optimal code for all sizes of target ints, while
also being correct on each target.
mvh.,
David
Thank you for your response.
Sorry, this will not help. I want to test the snippets, not change
them. Besides, I don't know if the compiler that will be used for the
16-bit machine will support any of the types you suggest, so I only
want to use standard types.
BR /göran
If you need to test the snippets on different targets, then you need to
have compilers and targets available. You still haven't said which
target(s) you are using, but unless it is one like the M68k then there
are no compiler switches to help you.

Note that there is no point in testing anything until you have done what
you can to make sure the code is correctly written. Otherwise your
tests may work on some targets by coincidence, but not on others, or may
be sub-optimal on some targets.

Any C99 compiler - and almost all pre-C99 compilers - will have
<stdint.h>. I don't know of any compiler less than 15 years old that
doesn't come with a <stdint.h>, and many people using such compilers
have written their own <stdint.h>.

The int_fast16_t and int_fast32_t types (and the unsigned versions) are
mandatory in <stdint.h>, so you can take it for granted that /all/
compilers support them. This is unlike the fixed-size types (like
int16_t) that will be defined if and only if the target supports types
of exactly that size (some architectures don't support the smaller types).

mvh.,

David
Göran Steen
2012-08-10 09:05:13 UTC
Permalink
-----Original Message-----
From: David Brown [mailto:***@westcontrol.com]
Sent: den 10 augusti 2012 10:45
To: Göran Steen
Cc: gcc-***@gcc.gnu.org
Subject: Re: 16-bit int


Could you please not top-post? It makes it very difficult to follow the order of the thread. Remember that some people see these emails in isolation, rather than threaded in an email client. I've re-ordered the mail below.
Post by Göran Steen
-----Original Message----- From: David Brown
Post by Göran Steen
Thank you for your response.
I want to make sure that my code snippets, if they are compiled and
run on a machine with 16-bit int, still works. I don't have access to
such a machine, so I want to compile and run them with 16-bit int on
my machine that defaults to 32-bit int. Especially the intermediate
results' size are interesting. What happens if they are truncated to
16-bit?
The answer here is simple - #include <stdint.h>, and use types like
int_fast16_t. If intermediary results might need 32 bits, cast to
int_fast32_t as needed. On a target with 32-bit ints, both these
types will normally be 32-bit (though int_fast16_t could theoretically
be 16-bit), and the cast will do nothing. On a target with 16-bit
ints, you will get 16-bit and 32-bit ints respectively.
This will give you optimal code for all sizes of target ints, while
also being correct on each target.
mvh.,
David
Thank you for your response.
Sorry, this will not help. I want to test the snippets, not change them. Besides, I don't know if the compiler that will be used for the 16-bit machine will support any of the types you suggest, so I only want to use standard types.
BR /göran
If you need to test the snippets on different targets, then you need to have compilers and targets available. You still haven't said which
target(s) you are using, but unless it is one like the M68k then there are no compiler switches to help you.

Note that there is no point in testing anything until you have done what you can to make sure the code is correctly written. Otherwise your tests may work on some targets by coincidence, but not on others, or may be sub-optimal on some targets.

Any C99 compiler - and almost all pre-C99 compilers - will have <stdint.h>. I don't know of any compiler less than 15 years old that doesn't come with a <stdint.h>, and many people using such compilers have written their own <stdint.h>.

The int_fast16_t and int_fast32_t types (and the unsigned versions) are mandatory in <stdint.h>, so you can take it for granted that /all/ compilers support them. This is unlike the fixed-size types (like
int16_t) that will be defined if and only if the target supports types of exactly that size (some architectures don't support the smaller types).

mvh.,

David

Thank you for your response.

I want to test the snippets with 16-bit int, without having a target system or compiler available. You wrote that there is no compiler switch to help me. Thank you for that answer.

BR /göran
David Brown
2012-08-10 09:10:18 UTC
Permalink
Post by Göran Steen
-----Original Message----- From: David Brown
Could you please not top-post? It makes it very difficult to follow
the order of the thread. Remember that some people see these emails
in isolation, rather than threaded in an email client. I've
re-ordered the mail below.
Post by Göran Steen
-----Original Message----- From: David Brown
Post by Göran Steen
Thank you for your response.
I want to make sure that my code snippets, if they are compiled
and run on a machine with 16-bit int, still works. I don't have
access to such a machine, so I want to compile and run them with
16-bit int on my machine that defaults to 32-bit int. Especially
the intermediate results' size are interesting. What happens if
they are truncated to 16-bit?
The answer here is simple - #include <stdint.h>, and use types
like int_fast16_t. If intermediary results might need 32 bits,
cast to int_fast32_t as needed. On a target with 32-bit ints, both
these types will normally be 32-bit (though int_fast16_t could
theoretically be 16-bit), and the cast will do nothing. On a
target with 16-bit ints, you will get 16-bit and 32-bit ints
respectively.
This will give you optimal code for all sizes of target ints,
while also being correct on each target.
mvh.,
David
Thank you for your response.
Sorry, this will not help. I want to test the snippets, not change
them. Besides, I don't know if the compiler that will be used for
the 16-bit machine will support any of the types you suggest, so
I only want to use standard types.
BR /göran
If you need to test the snippets on different targets, then you need
to have compilers and targets available. You still haven't said
which target(s) you are using, but unless it is one like the M68k
then there are no compiler switches to help you.
Note that there is no point in testing anything until you have done
what you can to make sure the code is correctly written. Otherwise
your tests may work on some targets by coincidence, but not on
others, or may be sub-optimal on some targets.
Any C99 compiler - and almost all pre-C99 compilers - will have
<stdint.h>. I don't know of any compiler less than 15 years old that
doesn't come with a <stdint.h>, and many people using such compilers
have written their own <stdint.h>.
The int_fast16_t and int_fast32_t types (and the unsigned versions)
are mandatory in <stdint.h>, so you can take it for granted that
/all/ compilers support them. This is unlike the fixed-size types
(like int16_t) that will be defined if and only if the target
supports types of exactly that size (some architectures don't support
the smaller types).
mvh.,
David
Thank you for your response.
I want to test the snippets with 16-bit int, without having target
system or compiler available. You wrote that there is no compiler
switch to help me. Thank you for that answer.
BR /göran
Note that even though gcc doesn't have such a switch for many targets,
there may be other 16-bit C compilers available that you can test with.
You /still/ haven't given any information about your systems - either
the targets you are aiming for, or the systems you are using for
testing. But I am guessing (from your email headers) that you are using
some sort of Windows system - it is quite possible that you can get a
16-bit DOS compiler to run for your testing.

You can also test by adding some "int16_t" casts into your code - that
will cause values to be truncated explicitly to 16 bits.

I know you don't want to change the code, but there is no way around
this - if you want to write code that is portable across different
integer sizes, you have to write code in the correct way. Bad code that
happens to test okay on one system is still bad code.

And you really should find some way to test the code on real targets -
otherwise you are just guessing.

mvh.,

David
Vincent Lefevre
2012-08-11 10:47:26 UTC
Permalink
Post by David Brown
Any C99 compiler - and almost all pre-C99 compilers - will have
<stdint.h>. I don't know of any compiler less than 15 years old that
doesn't come with a <stdint.h>, and many people using such compilers
have written their own <stdint.h>.
I thought that Microsoft's compiler didn't have <stdint.h>.
I've also heard that <inttypes.h> is more common.
Post by David Brown
The int_fast16_t and int_fast32_t types (and the unsigned versions)
are mandatory in <stdint.h>, so you can take it for granted that
/all/ compilers support them. This is unlike the fixed-size types
(like int16_t) that will be defined if and only if the target
supports types of exactly that size (some architectures don't
support the smaller types).
But int_fast16_t is useless for testing whether code can be affected
by 16-bit truncation, except on platforms where int_fast16_t really
is a 16-bit type. For such tests, int16_t is necessary. Now the user
may want to know which targets provide this type.
--
Vincent Lefèvre <***@vinc17.net> - Web: <http://www.vinc17.net/>
100% accessible validated (X)HTML - Blog: <http://www.vinc17.net/blog/>
Work: CR INRIA - computer arithmetic / AriC project (LIP, ENS-Lyon)
Ian Lance Taylor
2012-08-11 15:32:05 UTC
Permalink
Post by Vincent Lefevre
Post by David Brown
Any C99 compiler - and almost all pre-C99 compilers - will have
<stdint.h>. I don't know of any compiler less than 15 years old that
doesn't come with a <stdint.h>, and many people using such compilers
have written their own <stdint.h>.
I thought that Microsoft's compiler didn't have <stdint.h>.
I've also heard that <inttypes.h> is more common.
It's true that <inttypes.h> is older than <stdint.h>. But as Göran
said, <stdint.h> is in C99, a standard that was released 13 years ago.
I have to assume that even Microsoft compilers support <stdint.h> by
now.

Ian
Tim Prince
2012-08-11 16:28:25 UTC
Permalink
Post by Ian Lance Taylor
Post by Vincent Lefevre
Post by David Brown
Any C99 compiler - and almost all pre-C99 compilers - will have
<stdint.h>. I don't know of any compiler less than 15 years old that
doesn't come with a <stdint.h>, and many people using such compilers
have written their own <stdint.h>.
I thought that Microsoft's compiler didn't have <stdint.h>.
I've also heard that <inttypes.h> is more common.
It's true that <inttypes.h> is older than <stdint.h>. But as Göran
said, <stdint.h> is in C99, a standard that was released 13 years ago.
I have to assume that even Microsoft compilers support <stdint.h> by
now.
Ian
I haven't been able to find an authoritative source, but someone said
recently that Microsoft had stated they would some day support those C99
features which are common with C++ (VS2010 doesn't, and VS2012 isn't
widely available, but I don't hold my breath).
Microsoft never committed to bring C99 support to the level of any other
specified compiler, nor did other vendors of Microsoft-compatible
compilers commit to the same level as on linux, for example.
For example, I haven't found a way to engage the __restrict extension of
MSVC to emulate (even to the documented extent) the corresponding
feature of C99.
VS2010 has stdint.h (at least at current service pack level), but not
inttypes.h. VS2008 doesn't, and it's still under full support and in
wide professional use. As previously mentioned, some use open source
replacements, even though that is antithetical to the Microsoft proposition.
--
Tim Prince
David Brown
2012-08-13 09:25:29 UTC
Permalink
Post by Vincent Lefevre
Post by David Brown
Any C99 compiler - and almost all pre-C99 compilers - will have
<stdint.h>. I don't know of any compiler less than 15 years old that
doesn't come with a <stdint.h>, and many people using such compilers
have written their own <stdint.h>.
I thought that Microsoft's compiler didn't have <stdint.h>.
I've also heard that <inttypes.h> is more common.
Post by David Brown
The int_fast16_t and int_fast32_t types (and the unsigned versions)
are mandatory in <stdint.h>, so you can take it for granted that
/all/ compilers support them. This is unlike the fixed-size types
(like int16_t) that will be defined if and only if the target
supports types of exactly that size (some architectures don't
support the smaller types).
But int_fast16_t is useless to test whether code can be affected
by 16-bit truncation on platforms for which int_fast16_t is really
a 16-bit type. For tests, int16_t is necessary. Now the user may
want to know what targets provide this type.
The use of "int_fast16_t" was to get correct and optimal code, whether
the target is 16-bit or 32-bit. You are right that this will not test
whether the code will work on 16-bit targets if it is compiled on a
32-bit target - using int16_t will help more for that (but it won't give
guarantees, unless you are absolutely sure there are no hidden int
promotions).

Jonathan Wakely
2012-08-10 09:21:40 UTC
Permalink
Post by Göran Steen
Thank you for your response.
Sorry, this will not help. I want to test the snippets, not change them. Besides, I don't know if the compiler that will be used for the 16-bit machine will support any of the types you suggest, so I only want to use standard types.
Those are standard types.

But the bottom line is that on most targets you can't change the size
of int. You still haven't said what target you're using, but chances
are it's not possible. You could look into running a simulator for a
target with 16-bit int instead.
Vincent Lefevre
2012-08-11 10:32:57 UTC
Permalink
Post by David Brown
Post by Göran Steen
I want to make sure that my code snippets, if they are compiled and
run on a machine with 16-bit int, still works.
This is what I wish to do too.
Post by David Brown
Post by Göran Steen
I don't have access to such a machine, so I want to compile and run
them with 16-bit int on my machine that defaults to 32-bit int.
Note that if your code snippets use functions from the C library, this
won't work.
Post by David Brown
Post by Göran Steen
Especially the intermediate results' size are interesting. What
happens if they are truncated to 16-bit?
The answer here is simple - #include <stdint.h>, and use types like
int_fast16_t.
Actually you need to use your *own* type, say my_int16_t, and typedef
it to int_fast16_t for the normal use of your code when <stdint.h> is
available, to int for the normal use of your code when <stdint.h> is
not available, and to int16_t (not int_fast16_t) for the tests.

Now if you also want to test code with calls to the C library, you'll
need to find a target with 16-bit ints to test everything.
--
Vincent Lefèvre <***@vinc17.net> - Web: <http://www.vinc17.net/>
100% accessible validated (X)HTML - Blog: <http://www.vinc17.net/blog/>
Work: CR INRIA - computer arithmetic / AriC project (LIP, ENS-Lyon)