Is this a safe way of creating a binary file from a vector?

Hello everybody,
I made a little program to create a binary file from the elements in a vector.
I'd like to know if it's safe to do it the way I did (writing all the elements at once), or if I should write them to the file one by one.

#include <cstdint>
#include <fstream>
#include <vector>

int main(){
  std::vector<uint8_t> vec(400);
  std::ofstream file("file.data", std::ios::binary);
  for (size_t n=0; n<= vec.size(); n++){
    vec[n] = static_cast<size_t> (120);
  }
  file.write(reinterpret_cast<char*> (&vec[0]), vec.size() * sizeof(uint8_t));

  return 0;
}


I ask this because I don't know whether a bigger vector would be allocated in one contiguous block of memory, so I don't really know what would happen.

Thanks in advance :)
As long as you are using a C++03 or later compiler, the standard guarantees that a vector's elements are stored in a contiguous block.

What is the purpose of that static_cast in your loop? Your vector is a vector of uint8_t, not of size_t. Since size_t is probably larger than uint8_t, you're explicitly upcasting to size_t and then implicitly downcasting back to uint8_t.

You also have an out-of-bounds error in that loop. Remember that vectors, much like arrays, start at zero and stop at size - 1.

Lastly, shouldn't you be using sizeof(vec[0]) in the write() as well?
You are right, I fixed it a little. The static_cast from size_t really was an error: I changed the code and forgot to remove it. Thanks for the help.

#include <cstdint>
#include <fstream>
#include <vector>

int main(){
  std::vector<uint8_t> vec(400);
  std::ofstream file("file.data", std::ios::binary);
  for (size_t n=0; n< vec.size(); n++){
    vec[n] = 120;
  }
  file.write(reinterpret_cast<char*> (&vec[0]), vec.size() * sizeof(vec[0]));

  return 0;
}