Problem reading 10 000 binary files

Hi
Windows 7, 64-bit, Visual Studio 2010.
I have a problem reading a large number of binary files, processing them, and storing them under a new name. The program and routines work fine for 505 files, but after reading 506 files the program refuses to read the next one. I have 16 GB of memory and have tried closing all other programs and restarting the PC; it always stops after 506 files (512 files would be easier to understand, in a way...).

Here is my code. I have tried many things without success. This is only the part of the loop that stops. The test if (myfile.is_open()) returns false for some reason. I can restart the process with the file that did not open, and then it stops again after another 506 files.


char * tfiBlock;

// ios::ate opens at the end, so tellg() immediately gives the file size.
ifstream myfile (OrigFilename, ios::in|ios::binary|ios::ate);
if (myfile.is_open())
{
    streamsize lengde = myfile.tellg();  // file size in bytes
    tfiBlock = new char [lengde];
    myfile.seekg (0, ios::beg);          // back to the beginning before reading
    myfile.read (tfiBlock, lengde);
    myfile.close();
    result = 0;
}
else
{
    result = -1;
    printf("cannot open %s \n", OrigFilename);
    tfiBlock = NULL;
}

Clean up procedure:
delete[] tfiBlock;
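As an aside (not from the thread): the same read can be written without a naked new/delete by using std::vector, so the buffer is freed automatically even if a later step fails. A minimal sketch under that assumption — read_whole_file and the ok flag are illustrative names, not the poster's API:

```cpp
#include <fstream>
#include <string>
#include <vector>

// Read an entire binary file into a vector. RAII: no new/delete to pair up,
// and the stream's destructor closes the handle even on the error path.
// Sets ok=false and returns an empty vector if the file cannot be opened.
std::vector<char> read_whole_file(const std::string& name, bool& ok)
{
    std::ifstream in(name, std::ios::in | std::ios::binary | std::ios::ate);
    ok = false;
    std::vector<char> buf;
    if (!in.is_open())
        return buf;                        // caller checks ok

    std::streamsize size = in.tellg();     // ios::ate -> position == file size
    buf.resize(static_cast<std::size_t>(size));
    in.seekg(0, std::ios::beg);
    in.read(buf.data(), size);
    ok = in.good() || size == 0;
    return buf;
}
```

The vector's destructor replaces the separate clean-up procedure entirely.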

Are there any limits to how many files can be opened, or is it maybe something to be set in the compiler?
I hope someone has a suggestion.
Are there any limits to how many files that can be opened,


Assuming you are closing the 1st file before opening the 2nd, then no, there is no limit on the total. There might be a limit on how many files you can have open at once (though I doubt you would be reaching it).
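One way to narrow this down (a suggestion beyond the thread): on most implementations the failed low-level open sets errno, so printing strerror(errno) right after is_open() returns false will often name the cause — EMFILE, "Too many open files", would confirm a handle limit. A small diagnostic sketch, with try_open as a made-up helper name:

```cpp
#include <cerrno>
#include <cstdio>
#include <cstring>
#include <fstream>

// Attempt to open a file; on failure, report why. Note: the C++ standard
// does not guarantee ifstream sets errno, but the underlying open() call
// does on common implementations, so this usually gives a useful message.
bool try_open(const char* name)
{
    errno = 0;
    std::ifstream f(name, std::ios::in | std::ios::binary);
    if (!f.is_open())
    {
        std::printf("cannot open %s: %s\n", name, std::strerror(errno));
        return false;
    }
    return true;
}
```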


Anyway, I don't see anything horribly wrong with your code; at least nothing that would cause the behavior you're seeing.

Can you make a small program that reproduces the problem and upload the full source of it somewhere so we can try it out?
Hi
Yes, I tested with a small program and had no problem reading more than 1500 files, so the error message was fooling me. I have a feeling that something is filling the stack and that the limit is reached when I have read 506 files. I will go through the program ten more times to find out why. When I find out, I will post the result back here.
Use a different stream object per file (instead of close/open/close... on the same stream)

Loop:
{
      std::ifstream my_file( ... ) ; // open for input

      // use file

      // let the destructor of my_file close it
}

You may need to increase a resource: the number of file handles in the system. Even though your program is closing the handles, the system may not be keeping up in the background.

See: http://www.ehow.com/how_6103386_increase-file-handles.html for details.
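If a handle limit is the suspect, a quick experiment is to open and close one file a few thousand times with a fresh stream each iteration: if every open succeeds, the close/reopen pattern itself is innocent and handles are more likely leaking elsewhere. (On the MSVC runtime, note also that the _setmaxstdio documentation gives a default cap of 512 simultaneously open stdio streams, suspiciously close to 506.) A portable sketch, with count_successful_opens as an illustrative helper:

```cpp
#include <fstream>

// Stress test: open and close the same file many times, using a fresh
// ifstream each iteration. If each open leaked a handle, failures would
// start near the per-process limit; if all succeed, close/reopen is fine.
int count_successful_opens(const char* name, int attempts)
{
    int ok = 0;
    for (int i = 0; i < attempts; ++i)
    {
        std::ifstream f(name, std::ios::in | std::ios::binary);
        if (f.is_open())
            ++ok;
        // f's destructor closes the handle here, every iteration
    }
    return ok;
}
```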
Thanks a lot for several good suggestions that I will use in the future.

The problem seems to be of a strange kind. I am compiling with the Release configuration and have put in several printf() statements to check what is going wrong; they are normally commented out. My belief was that memory leaks caused the problem, which turned out to be wrong.

I made the printf() statements operational again and the program worked excellently. I do not understand why. With all the printf() statements it was too slow, so I commented them out again and recompiled.

Now the program runs nicely and smoothly without stopping. The ONLY thing I changed was adding printf() and then removing printf() again.

This was all done in VS 2010. I am moving on to VS 2013 and will check what happens there.
