Discussion:
signed/unsigned char
Milena Constantino Caires
2005-07-20 14:45:55 UTC
I would like to know why the following statements do not generate an
error during compilation, but do cause a segmentation fault and
core dump when the program runs:


char* c = new char(1024);

or

u_char* c = new u_char(1024);

I know that the correct form of this memory allocation is "[1024]" and not
"(1024)", but sometimes this mistake happens.

Isn't the compiler supposed to complain about it?

Thanks in advance,
Milena
Eljay Love-Jensen
2005-07-20 14:57:40 UTC
Hi Milena,
Post by Milena Constantino Caires
char* c = new char(1024);
This allocates a single character, initialized with the value 1024. (Since 1024 is bigger than what a char can hold on most systems, the value will be truncated.)
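
A minimal sketch of what that allocation actually stores (the printed result is an assumption, since converting 1024 to a plain char is implementation-defined where char is signed):

#include <iostream>

int main() {
    char* c = new char(1024);                // ONE char, initialized from 1024
    // 1024 is 0x400; only the low byte fits in an 8-bit char,
    // so this typically prints 0.
    std::cout << static_cast<int>(*c) << std::endl;
    delete c;                                // single object: plain delete
}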
Post by Milena Constantino Caires
I know that the correct form of this memory allocation is "[1024]" and not "(1024)", but sometimes this mistake happens.
Both forms are correct.

You can allocate an array of char.

You can allocate a single char.
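
To put the two forms side by side (variable names are just for illustration):

char* one  = new char(1024);    // a single char, initialized from the value 1024
char* many = new char[1024];    // an array of 1024 uninitialized chars

delete   one;                   // single object
delete[] many;                  // array allocation requires delete[]

The segmentation fault most likely comes from the rest of the program treating the single-char allocation as a 1024-byte buffer and writing past it.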
Post by Milena Constantino Caires
Isn't the compiler supposed to complain about it?
No, there is nothing to complain about. You are allowed to allocate a single object.

HTH,
--Eljay
r***@bubblescope.net
2005-07-20 14:56:28 UTC
Post by Milena Constantino Caires
I would like to know why the following statements do not generate an
error during compilation, but do cause a segmentation fault and
char* c = new char(1024);
or
u_char* c = new u_char(1024);
I know that the correct form of this memory allocation is "[1024]" and not
"(1024)", but sometimes this mistake happens.
Isn't the compiler supposed to complain about it?
Because what you have written asks for a single new char whose value is
initialised to 1024.

To convince yourself this is true, try for example:
#include <iostream>

int main() {
    char* c = new char(65);        // one char, initialized to 65
    std::cout << *c << std::endl;  // prints the character whose code is 65
    delete c;                      // single object, so plain delete
}

This should almost certainly print A, unless you are on a very strange
machine (65 is the ASCII value of 'A').

I'll admit I'm slightly surprised that the compiler doesn't warn you that
1024 is out of range for a char (on the other hand, I believe the code is
still valid if the char is unsigned).

Unfortunately I don't know of a good way to catch these kinds of errors,
other than, of course, a tool like valgrind.
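
As a rough illustration of the kind of mistake valgrind catches here (a hypothetical reduction of your situation):

char* buf = new char(1024);   // allocates ONE char, not 1024 of them
buf[100] = 'x';               // heap overflow: valgrind reports an invalid write
delete buf;

Running that under valgrind flags the out-of-bounds write immediately, whereas the bare program may only crash much later, or not at all.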

Chris
