char array and reading files

I am trying to read a file using ifstream and a char* variable as part of an assignment.
The goal is to read a file, encrypt it, and save the outcome (which is another char* variable) to another file. I am doing it in Visual C++ 2010.

I first tried reading the whole file into a char array. With a small text file it worked, but with a big one it naturally failed because of the large amount of data.

My second option was to read it word by word with the following code:

char buf[255];
char outB[255];
ifstream is;
ofstream osf;
// open the files for reading/writing
while (!is.eof()) {
    is.getline(buf, sizeof(buf), ' ');
    process(buf, outB, key);
    osf << outB;
}
// close the files


process() is a simple Caesar cipher.
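For reference, a minimal sketch of what such a function might look like (the real signature and key type aren't shown in the post, so this is only an assumption):

// hypothetical sketch of a Caesar-cipher step over a C string;
// assumes key is in the range 0-25
void process(const char* in, char* out, int key)
{
    int i = 0;
    for (; in[i] != '\0'; ++i) {
        char c = in[i];
        if (c >= 'a' && c <= 'z')
            out[i] = 'a' + (c - 'a' + key) % 26;   // shift lowercase letters
        else if (c >= 'A' && c <= 'Z')
            out[i] = 'A' + (c - 'A' + key) % 26;   // shift uppercase letters
        else
            out[i] = c;                            // leave everything else alone
    }
    out[i] = '\0';                                 // keep the output null-terminated
}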

With this implementation I read the file word by word, cipher each word, and write it to the ofstream. I have two problems with it:

1. If a line in the file is very long (more than 80+ characters), the program hangs.
2. If the line length is normal, the space character between words gets cut off.

How can I get that space character back, and how can I prevent the hang on long lines?

I know you can implement this with std::string, but I think the point of the assignment is to use char* variables.

Hi there,

My guess is that at some point the stream's failbit gets set (getline() sets it when the input doesn't fit into the buffer), and because you're only checking is.eof(), your program loops forever. Try while(is.good()) instead. For more information: http://www.cplusplus.com/reference/istream/istream/getline/

If you actually need the spaces, I would suggest using getline() without its delimiter argument:
is.getline(buf, sizeof(buf));
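
Putting both suggestions together, a rough sketch of the loop could look like this (untested; the file names and the key value are only placeholders, and it reuses your buffers and your process() call):

// assumes the usual #include <fstream> and using namespace std;
// as in the rest of your program
char buf[255];
char outB[255];
int key = 3;                          // placeholder shift value
ifstream is("input.txt");             // placeholder file names
ofstream osf("output.txt");

// getline() returns the stream itself, so the loop stops as soon as
// eofbit or failbit is set, instead of spinning forever on a failed read
while (is.getline(buf, sizeof(buf))) {
    process(buf, outB, key);
    osf << outB << '\n';              // getline() strips the newline, so put it back
}

// note: a line longer than 254 characters still sets failbit and ends the
// loop early; a larger buffer would be needed to handle such lines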

Hope that helps; please do let us know if you have any further questions.

All the best,
NwN