Deleting variables

Hello! I have heard that when a C++ program terminates, all the memory is freed, even heap-allocated memory, but I have also heard that for any memory you allocate by calling "new", you must call "delete" when you are done with it. So do I need to call "delete"?
For long-running programs it is a problem if you don't free things when you are done with them, because the program will use more and more memory until it eventually runs out.
Oh ok, I get it. I guess it also prevents issues such as dangling pointers.
Please describe "dangling pointer". (It is somewhat opposite to a leak.)



Do I call new or delete in this program?
#include <vector>
#include <iostream>

int main() {
  std::vector<int> example { 42, 7, 13 };
  for ( auto x : example ) std::cout << x << '\n';
}


How about in this?
#include <iostream>
#include <memory>
 
int main()
{
    std::unique_ptr<int> answer( new int {42} );
    std::cout << *answer << '\n';
}


One more?
#include <iostream>
#include <memory>
#include <string>
 
int main()
{
    using std::string;
    using std::make_unique;
    auto text = make_unique<string>( "It was a dark and stormy night" );
    std::cout << *text << '\n';
}



The point is that the C++ Standard Library provides the means to manage memory without huge fuss.

"when a C++ program terminates, all the memory is freed"


I thought it appropriate to be a bit critical about this slight misconception. I offer this from the perspective of an old hand who lately has taken up the task of helping students learn this technology, and I've discovered that some misconceptions assumed early in study last way too long into a career. They are best "nipped in the bud". This is one of them.

Yes, the overall effect is that memory is freed, but this isn't specific to a C++ program; it actually has nothing to do with the language, or any language.

It is a feature of the modern operating system, and it is not entirely universal or reliable.

If a C++, C, or any other program fails to clean up after itself, leaving lots of memory allocated, it is the operating system, when it releases the process, that reclaims the RAM. Modern operating systems may also close files and release locks.

You may think, at first, that I'm being overly critical, and I understand. However, even on modern operating systems there can be leftovers of various types, such as certain kinds of memory-mapped files, named mutexes, and file-locking entries, which can remain if not explicitly cleared. It isn't just memory that must be cleaned up; it is any resource the application acquired while it was running.

Some of these leaks are similar to leaving junk temporary files behind that clutter up a directory.

It wasn't always so, either, and still may not be in modern times depending on the platform. Linux, UNIX and Windows (as was OS/2) are good at this cleanup (but not perfect). However, there are modern embedded systems with such primitive environments that they may not be able to do this for you.

Among my recent "teaching" (maybe better called "tutoring") efforts, I've taught programming for robotic controllers used in high school competitions. Some of these controllers were developed around 2008, have barely 32K of RAM, and run a 75 MHz CPU. They can only run one program at a time. It is fortunately difficult to even allocate memory in their limited C dialect, but that's an example of a machine that can't clean up RAM a program fails to free. Their solution is to reset everything (even power cycle) instead.

It is quite a bit more than merely a good habit to ensure applications release RAM, and every other resource they acquire. Many will point out the obvious: a long-running program with a leak will consume more and more RAM. But what was written as a stand-alone, single-run application can be turned into something you'll run thousands of times by a means as simple as changing "int main" to "int proc1" and calling that "proc1" in a loop. That may be a simple way of timing a benchmark, or the single-run application may be turned into an animated application that "never stops" (until the user demands it).

Yes, every new in C++ must be countered by a matching delete (there are advanced exceptions you should probably ignore for now). In modern C++ we use smart pointers to handle the delete for us, and even the "new" may be performed by a matching function like make_unique or make_shared.

The streams (from iostream, fstream, etc.) close their files when they go out of scope.

Modern GUI frameworks release their device contexts, brushes, pens, icons, and bitmaps in the same basic way.

That subject is called RAII: Resource Acquisition Is Initialization. The entire notion of resource leaks (not just memory leaks) is addressed by RAII. In C++ this means every such resource is handled by a class instance (an object). The resource (the R in the acronym) is acquired (the A) when that object is initialized (the I's), so that the resource is automatically released (deleted) when the object's destructor executes. It is a C++ staple (and is not handled as well by other languages).





If you are new to pointers, or paranoid (you probably should be), then after delete, set the pointer to null (unless this is the end of a function and the pointer is local). This gives an easier-to-find error if you try to use the pointer again, makes any extra deletes safe, and is just a little bit of self-defense.

Performance-wise, pointers take a bit of time to new and delete. Try to do this as infrequently as possible (this is true for the hidden pointers inside classes and the STL as well), and try to avoid reallocating just to change the size of things.

Generally, try to avoid raw dynamic memory as much as you can; we have tools that manage it better. If you must use it, prefer the smarter pointer options over the basic C-style one.
Topic archived. No new replies allowed.