0 or Null in C++?

If I created a pointer, which value should I assign it in C++?

className *obj=0;

or
className *obj=NULL;

Please give some comments

Thanks!
Use 0. The "NULL" macro is not type-safe; if you feel that you must use "null", make it a const int instead of the C-style "#define". Also see "The C++ Programming Language" for Stroustrup's argument against the usage of "NULL".
Use whichever you like best. In C++, NULL is zero.

In some older C compilers, NULL is variously defined to some weird things, so you have to be more careful with it. You won't have that problem in C++.

I use NULL to be explicit that I'm dealing with a pointer, and no other reason.

Stroustrup doesn't like macros, and he (as someone who is concerned with how people think about language) doesn't like the idea that some people will believe that NULL is anything but an integer zero (especially when they are right because of old compilers).

In C++, NULL is zero, so there is no difference except how you like it. See http://www.research.att.com/~bs/bs_faq2.html#null


Keep in mind that when you assign or compare a pointer to zero, there is some special magic that occurs behind the scenes to use the correct bit pattern for the given pointer (which may not actually be zero). This is one of the reasons why things like #define NULL (void*)0 are evil -- if you compare a char* to NULL, that magic has been explicitly (and probably unknowingly) turned off, and an invalid result may happen. Just to be extra clear:

(my_char_ptr == 0) != (my_char_ptr == (void*)0)

Many people are lulled into a false sense of security because on PCs and some other popular hardware a NULL pointer actually is zero, but that is not always the case. For that reason, Stroustrup has introduced the nullptr keyword in C++0x.


In short, as long as NULL is
#define NULL 0
then use whichever you like best.
I was hoping one of you would respond, because I wanted your thoughts. I always use 0, probably because I come from the 16-bit segmented memory DOS days where there were near and far pointers, and NULL far pointers were anything but 0 ... as I recall a far pointer was considered NULL if the offset was zero regardless of the segment?


Nah, it was still all zero. The x86 segmented addressing scheme is very unusual (a lot of people abhor it as an abomination).

A 20-bit address was composed of two 16-bit values (segment:offset) summed as:

(segment << 4) + offset

The offset may be zero (or NULL), meaning that you are referring to the beginning of a memory segment (with a granularity of one paragraph, or 16 bytes). However, an x86 NULL pointer is actually zero in both segment and offset --making it point to the hardware interrupt address table (located in the first 1024 bytes of memory). [The interrupts are for both hardware and software events --it is just that the location of the table itself is a hardware requirement.]

The memory itself is protected, so special instructions are required to modify it --meaning attempts to dereference a NULL pointer will fail (and generate an interrupt :-P ).

The 0:0 nature of PC pointers has caused many programmers (I was one of them) to assume that NULL pointers always have a cleared bit pattern. This is not always the case on other hardware. On other hardware, the bit pattern for a NULL int* may differ from the bit pattern for a NULL char*. In fact, pointers don't even have to be the same size. This was the case in DOS, since a FAR pointer was a pointer that included both the segment and offset parts, and a NEAR pointer was a pointer that only included the offset part (and was reliant on the current segment context [CS, DS, ES, or SS, depending on what you are dereferencing] to be used properly).

Gets messy fast, doesn't it?

What this means is that you cannot use things like memset() to initialize structures containing pointers. For example, given:
typedef enum { Neuter, Male, Female } Sex_t;

typedef struct
  {
  char* name;
  int   age;
  Sex_t gender;
  }
  Person_t;

Then the following is wrong:
  {
  Person_t john;
  memset( &john, 0, sizeof( Person_t ) );

  // Any code that follows may fail.
  }

The char* also needs proper initialization:
  {
  Person_t john;
  memset( &john, 0, sizeof( Person_t ) );
  john.name = 0;  // or NULL, whichever you like

  // Alright, everything is now good to go.
  }


:-)
I always use 0. It's shorter.

I check for valid pointers like this:
if( ptr ) ...

I never did much C/C++ programming in DOS. Mostly just Turbo Pascal, QuickBasic, and Assembler. Part of it was because my first C/C++ experience was with a 32-bit flat memory model and I was hooked immediately, so I could never go back to C/C++ in DOS even though I had Turbo C++ at the time.

Been programming in unix/C++ ever since Linux/slackware became available.
Yeah, I got my start with good old GWBASIC, and hit its limitations pretty quickly. It was such a pain to write a zillion assembly routines just to do normal stuff...

I really got hooked when my dad bought me a copy of TP4. I read the manual so much I broke the spine in several spots. :-P

I used C when playing with the graphics hardware, just because TP4 still only had the inline keyword (directly listing opcode numbers) instead of the friendlier asm, whereas in C I could use the assembly mnemonics. I never much cared for C though --C++ is way cooler.

I got into Win32 programming with Delphi (well, Win3.x with D1, then I bought myself a copy of D5 and haven't looked back). Along the way I picked up a lot of other stuff, both the normal and the weird and unusual.

I picked up Qt when I installed Red Hat 8 and 9 (Caldera swore me off Linux for a couple of years). I still program mostly in Win32, but for cross-platform development I'll switch back and forth quite a lot. (And I'll tend to use Tcl/Tk. LOL.)

XD
I remember having to remove comments to save memory in some BASIC programs :)
And I remember hitting the 128K limit for QuickBASIC programs too. That was why I switched to Turbo Pascal. I was doing assembler at that time too - playing around with TSRs, device drivers, etc.

I started to learn a little MFC a few years back (after it was dead anyway) for the sake of being able to do GUI programs without having to write the GUI, but I stopped because I prefer Qt anyway (yes, there is quite a bit of anti-microsoft sentiment in me... I use Linux as much as possible).
XD, this thread has become a discussion topic! I actually started with C++, although I only really got as far as cout and cin hehe. I went to TI-BASIC and made a couple of games, although it started sucking due to lack of nameable variables/functions...so I came back to C++. I mainly work on Windows (have only sampled Linux in a tech class).

I was thinking of installing Linux on my external HD and using it to play around, so I was wondering if anyone had a suggestion as to which one I should play around with first?
Puppy Linux.


I think it is a good thing to learn as much as you can about all kinds of programming systems and methodologies. I know my abilities improved significantly after I learned to think in terms of functional programming (Scheme and Haskell, mostly. I really like PLT Scheme. I'm not a big fan of Haskell yet).

Using different hardware taught me the danger of assuming stuff about NULL.
I used to run RedHat/Fedora, but after I built my last PC, the kernel used in FC9 didn't support my motherboard so I switched to Mandriva. I think Mandriva is a little better in terms of user interface -- it seems more consistent than FC, but then FC's main goal seems to be to inundate the user with as many applications that do the same things as possible.
Ah, I didn't read closely enough. (I was thinking of something you could install on a USB stick.)

I like Kubuntu quite a lot. It is a Debian system. Kubuntu comes with KDE. Ubuntu comes with Gnome. (But you aren't limited to either.)

Red Hat ticked me off royally. The only thing I want back is the nice BlueCurve window decorations (to use with my Crystal theme). [I'm still having trouble configuring for the latest QuartiCurve sources --particularly as I'm not keen to upgrade to KDE 4 just yet.]
KDE 4's new plasma thingie is a big thumbs down, IMHO. I had all kinds of problems trying to run it on my old computer before it kicked the (bit) bucket.
I do like BlueCurve though.
