Reading Large Files

I've been trying to figure out a way to read files that are larger than a computer's system memory. I've sort of been able to do it with the bastardization of code below, which reads character by character into a char array.
In the process it overwrites the same memory, but it is horribly slow and leaves out parts of files. So basically, how do I read large files the right way?
#include <iostream>
#include <vector>
#include <string>
#include <fstream>
#include <stdio.h> 
#include <stdlib.h>
#include <iomanip>
#include <windows.h>
#include <cstdlib>
#include <thread>

using namespace std;
/*======================================================*/
	string *fileName = new string("tldr");
	char data[36];
	int filePos(0); // The pos of the file
	int tmSize(0); // The total size of the file	
	
	int split(32);
	char buff;
	int DNum(0);
/*======================================================*/



int getFileSize(std::string filename) // path to file
{
    FILE *p_file = NULL;
    p_file = fopen(filename.c_str(),"rb");
    fseek(p_file,0,SEEK_END);
    int size = ftell(p_file);
    fclose(p_file);
    return size;
}

void fs()
{
	tmSize = getFileSize(*fileName);
	int AX(0);
	ifstream fileIn;
	fileIn.open(*fileName, ios::in | ios::binary);
	int n1,n2,n3;
	n1 = tmSize / 32;

	// Does the processing
	while(filePos != tmSize)
	{
		fileIn.seekg(filePos,ios_base::beg);
		buff = fileIn.get();
		// To take into account small files
		if(tmSize < 32)
		{
			int Count(0);
			char MT[40];
			if(Count != tmSize)
			{
				MT[Count] = buff;
				cout << MT[Count];// << endl;
				Count++;
			}
		}
		// Anything larger than 32
		else
		{
			if(AX != split)
			{
				data[AX] = buff;
				AX++;
				if(AX == split)
				{
					
					AX = 0;
				}
			}
			
		}
		filePos++;
	}
	int tz(0);
	filePos = filePos - 12;
	
	while(tz != 2)
	{
		fileIn.seekg(filePos,ios_base::beg);
		buff = fileIn.get();
		data[tz] = buff;
		tz++;
		filePos++;
	}

	fileIn.close();
}

int main ()
{
	fs();
	cout << tmSize << endl;
	system("pause");
}
Why do you have to read the file? I.e., what operations do you have to perform on the data?
Things like encryption and maybe even data transmission.
Why character by character? Why not buffer by buffer?

http://www.cplusplus.com/reference/istream/istream/read/

Andy
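
For instance, here is a minimal sketch of the buffer-by-buffer idea using istream::read. The 64 KB buffer size and the process() callback are placeholders I've made up, not anything from the original code:

#include <fstream>
#include <iostream>
#include <vector>

// Placeholder for whatever you need to do with each chunk
// (encryption, transmission, ...).
void process(const char* chunk, std::streamsize len)
{
    std::cout.write(chunk, len);
}

int main()
{
    std::ifstream fileIn("tldr", std::ios::in | std::ios::binary);
    if (!fileIn)
        return 1;

    std::vector<char> buffer(64 * 1024); // 64 KB at a time; pick any size you like

    // Only one buffer's worth of the file is in memory at any moment.
    while (fileIn)
    {
        fileIn.read(buffer.data(), buffer.size());
        std::streamsize got = fileIn.gcount(); // may be short on the last chunk
        if (got > 0)
            process(buffer.data(), got);
    }
}

That gets rid of both the per-character seekg and the per-character get(), which is where most of the slowness comes from.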
You don't need to seek before reading each character.

Also, if the file is 100,000,032 bytes in size, then it appears to me that the loop reads the first 100,000,000 bytes into the data array only to discard them; only the last 32 bytes are kept. If that's the case, then just seek to end-32 and read the data you want.

Finally, to get the size of the file, see if your system supports stat(). It will probably be faster than opening and closing the file.
An example?
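A rough sketch of both suggestions above, assuming the platform provides a POSIX-style stat() in <sys/stat.h> (the "tldr" filename is just the one from the original post, and the check that the file is at least 32 bytes is an addition):

#include <fstream>
#include <iostream>
#include <sys/stat.h>

int main()
{
    const char* fileName = "tldr";

    // Get the size without opening, seeking and closing the file.
    struct stat st;
    if (stat(fileName, &st) != 0 || st.st_size < 32)
        return 1;
    std::cout << "size: " << st.st_size << '\n';

    // Jump straight to the last 32 bytes and read them in one call,
    // instead of walking the whole file character by character.
    char data[32];
    std::ifstream fileIn(fileName, std::ios::in | std::ios::binary);
    fileIn.seekg(-32, std::ios_base::end);
    fileIn.read(data, sizeof data);

    std::cout.write(data, fileIn.gcount());
    std::cout << '\n';
}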