Segmentation fault (core dumped) when reading in a large number of files.

I have a simple function that reads .dat files of various sizes into double arrays. I call the function a large number of times. When I call the function to create one more large array, the program crashes with the error: Segmentation fault (core dumped).

No compilation errors are reported. Interestingly, the code runs fine on Linux (on the same machine) and reports the error only when I use Cygwin (same g++ command).

When I came across this problem with Cygwin, I noticed that commenting out the line where I fill one of the larger arrays prevents the segmentation fault. This makes me think that there is some sort of memory leak. I am a C++ dummy and any help would be appreciated.

// Main

#include <algorithm>
#include <string>
#include <vector>
#include <iostream>
#include <string.h>
#include <fstream>
#include <dirent.h>
#include <stdio.h>
#include <sstream>
#include "get_double_variable.h"

using namespace std;

int main(){
    double signal[20432];
    get_double_variable(signal, "Draws_set.dat");
    double mu_0[20432];
    get_double_variable(mu_0, "mu0.dat");
    double itas[10];
    get_double_variable(itas, "itas.dat");
    double productivity[10];
    get_double_variable(productivity, "productivity.dat");
    double Pp[10];
    get_double_variable(Pp, "Pp.dat");
    double Pitas[10];
    get_double_variable(Pitas, "Pitas.dat");
    double capital[1277];
    get_double_variable(capital, "capital.dat");
    double wage[1277];
    get_double_variable(wage, "wage.dat");
    double start_month[1277];
    get_double_variable(start_month, "start_year.dat");
    double m[67984];
    get_double_variable(m, "m.dat");
    double prices[67984];
    get_double_variable(prices, "ln_price_fpym2");
    double sales[67984];
    get_double_variable(sales, "ln_sales_fpym.dat");
    double shipments[67984];
    get_double_variable(shipments, "shipments.dat");
    // For example, if I comment out the call that fills shipments (or any other
    // sufficiently large array, or group of arrays), the program runs just fine.
    double firm_loc[1277];
    get_double_variable(firm_loc, "firm_loc.dat");
    double firm_in_prod_loc[1277];
    get_double_variable(firm_in_prod_loc, "firm_in_prod_loc.dat");
    double prod_loc[4960];
    get_double_variable(prod_loc, "loc_prod.dat");
    return 0;
}

// Function that reads the data
#include <fstream>
#include <iostream>
#include <string>

using namespace std;

void get_double_variable(double variable[], std::string filename){
    double value;
    ifstream fileIN(filename);
    if (fileIN.is_open()){
        int j = 0;
        // Read whitespace-separated doubles until the stream is exhausted.
        // Note: nothing checks j against the size of the caller's array.
        while (fileIN >> value){
            variable[j] = value;
            if (j % 100 == 0){
                cout << "variable " << filename << " [" << j << "]=" << variable[j] << '\n';
            }
            j = j + 1;
        }
        fileIN.close();
        fileIN.clear();
        cout << "Successfully read file " << filename << endl;
    } else {
        cout << "Unable to open file " << filename << endl;
        fileIN.clear();
    }
    return;
}


Thank you for your help in advance!
Hi,

You could try allocating the memory dynamically, rather than on the stack, which has a limited size.

Look up smart pointers in the reference and use them.

Hope this helps a bit. :+)
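
For example, something like this (just a sketch: it reuses the array sizes and file names from your main() and assumes get_double_variable keeps its double[] / std::string signature):

// Sketch: allocate the big buffers on the heap instead of the stack.
#include <memory>
#include <vector>
#include <string>

void get_double_variable(double variable[], std::string filename); // your existing reader

int main(){
    // Option 1: std::vector owns the heap memory and also knows its size.
    std::vector<double> signal(20432);
    get_double_variable(signal.data(), "Draws_set.dat");

    // Option 2: a smart pointer to a heap array, freed automatically when it goes out of scope.
    std::unique_ptr<double[]> shipments(new double[67984]);
    get_double_variable(shipments.get(), "shipments.dat");

    return 0;
}

std::vector is usually the easier of the two, since it can also replace the hard-coded sizes once you resize it to whatever the file actually contains.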
You have roughly 2.5 MB of data in your arrays. Add the call stack and other variables to the mix, and you will be well over that. The stack is usually only a few MB (Linux typically defaults to around 8 MB, while Cygwin's default is considerably smaller, which is why the same build survives on Linux), so it is easy to get a stack overflow here.
So many of those large arrays are the same size. I'd guess you should have records (structs and classes) that manage that data.
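
Something along these lines, perhaps (the field names come from your arrays; which fields actually belong together is just a guess):

// Sketch: group the same-sized arrays into records and store them in vectors (heap-allocated).
#include <vector>

struct Observation {   // one element per entry of the 67984-sized arrays
    double m;
    double price;      // assumed to correspond to ln_price_fpym2
    double sales;
    double shipments;
};

struct Firm {          // one element per entry of the 1277-sized arrays
    double capital;
    double wage;
    double start_month;
    double location;   // assumed to correspond to firm_loc
};

int main(){
    std::vector<Observation> observations(67984);
    std::vector<Firm> firms(1277);
    // ... fill each record by reading the corresponding file into the matching field ...
    return 0;
}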
Thank you everyone for the help! Indeed, allocating the larger arrays dynamically has solved the issue!