Including *.cpp files

When I include a *.cpp file from another C++ source file, can I interpret that as "compile this C++ file before continuing"? Or does it treat the source like a header file? I would be so happy if it behaved just like a normal C++ file. That would make my task a lot easier.
It does still compile both C++ files, but there is a problem with that. The include directive simply copies and pastes the contents of the file at that spot. So, if you include Bar.cpp in Foo.cpp, the source of Bar is actually compiled twice. And once the linker starts combining your object files, it picks up multiple definitions of whatever you had in Bar.cpp and raises a few errors.

It's generally a better idea to use headers to declare class and function names, as multiple declarations don't cause the compiler to raise errors, while multiple definitions do.
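Something like this, just as a sketch (reusing the Foo/Bar names from above):

// Bar.hpp
void bar(); // declaration only -- safe to include from many files

// Bar.cpp
#include "Bar.hpp"
void bar() {} // the single definition

// Foo.cpp
#include "Bar.hpp" // include the header, not Bar.cpp
int main() { bar(); }

Compile both sources and let the linker resolve the call:
$ g++ Foo.cpp Bar.cpp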
You should never include a .cpp file (or anything that is not a header).
If you want to compile a file, pass it to the compiler.
If you both #include and compile a source file, you'll get multiple definition errors.

When you #include a file, its contents are copied verbatim at the place of inclusion.
@Bazzy

I generally separate my template classes into header / implementation files and just #include the cpp/ipp file at the end of the header file (effectively separating declaration from definition). Is this generally considered bad practice?
@benjelly
The compiler does not deal with header (.h) or implementation (.cpp) files. It deals with translation units. A translation unit is produced from a source file that is fed into the preprocessor, by including whatever is necessary and by expanding all macros as necessary. So, a foo.cpp is not actually compiled; it is fed to the preprocessor. The preprocessor can access as many files on your hard drive as it needs in order to include them and produce the translation unit. Then the compiler steps back into the scene and translates this translation unit into machine code. As I said, from the point of view of the compiler, no header or implementation files exist, just translation units. How you got them sewn together is of no importance.
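If you want to look at the translation unit yourself, most toolchains can stop after preprocessing. With GCC, for example (foo.cpp standing for any source file):

$ g++ -E foo.cpp -o foo.ii

The resulting foo.ii is the translation unit: every header pasted in, every macro expanded. That is what the compiler proper actually sees.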

Regards
So it almost doesn't care about the file type... well great. That ruined my idea. Wait, maybe there's some secret sauce waiting behind precompiled headers. Anyone care to explain how to turn some source into a precompiled header with Code::Blocks and MinGW?
Most compilers decide that files with a .cpp, .cxx, etc. extension have to be compiled, that .h, .hpp, etc. are headers to be precompiled, and that all other files are to be ignored. You can change this behaviour, but it's not recommended.

GCC Manual wrote:
C++ source files conventionally use one of the suffixes .C, .cc, .cpp, .CPP, .c++, .cp, or .cxx; C++ header files often use .hh, .hpp, .H, or (for shared template code) .tcc; and preprocessed C++ files use the suffix .ii. GCC recognizes files with these names and compiles them as C++ programs even if you call the compiler the same way as for compiling C programs (usually with the name gcc).


@Luc Lieber
It is bad practice if you use .cpp or another source-file extension (see above). It's fine if you use some other extension like .tcc
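For illustration, the pattern could look like this (a sketch only; the names are invented):

// stack.hpp
#ifndef STACK_HPP
#define STACK_HPP
template <typename T>
class Stack
{
public:
    void push(const T& value);
};
#include "stack.tcc" // pull in the definitions at the end of the header
#endif

// stack.tcc
template <typename T>
void Stack<T>::push(const T& value)
{
    // ... implementation here ...
}

Since stack.tcc is only ever included from stack.hpp and never passed to the compiler directly, nothing is compiled twice and there are no multiple-definition errors.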
The compiler doesn't know about the .cpp/... and .h/... conventions. In the quote above, the compiler vendor uses the extension to recognize the source language, but that is because the GCC toolchain is capable of compiling several source languages, as we all know. This doesn't have to be the case in general, and the standard enforces nothing in this direction.

Also, as I said, the compiler doesn't care about the raw implementation file (.cpp) at all. It cares about the translation unit that the preprocessor assembles from it. The preprocessor is usually integrated with the compiler in the same executable these days, so it appears that we are passing foo.cpp to "the compiler", but we are not. Even the preprocessor portion of the process does not discriminate between header and implementation files when it goes about its business.

It is a different question what makes a good naming convention. Also, while the toolchain is unlikely to enforce any hard rules, IDEs tend to be influenced by the extension. I think the make utility is also indifferent.

I will be interested to hear about precompiled headers myself. Are they a standard feature or vendor-specific extensions? What is compiled inside a precompiled header, say, if it contains templates? I am unfortunately short on info there.

Regards
I don't think you understand why including .cpp files is bad. Here is an example:
// a.cpp
void f() {}
// b.cpp
#include "a.cpp"
int main()
{
  f();
}
$ g++ -g a.cpp b.cpp
/tmp/cceHGKV1.o: In function `f()':
~/a.cpp:2: multiple definition of `f()'
/tmp/ccfEncSY.o:~/a.cpp:2: first defined here
collect2: ld returned 1 exit status
If a.cpp is renamed to a.hpp (and the #include in b.cpp updated to match):
$ g++ -g a.hpp b.cpp
(compiles and links fine)
This looks like a vendor-specific choice. In truth, though, I don't really understand what is happening. Are .hpp files ignored by (this particular) compiler because their extension is recognized as unsuitable for compilation? And, although this is another issue entirely, why on earth would you feed ordinary header files (even if they have the .cpp extension) to the compiler? Precompiled headers?
In other words, Bazzy's example violates the One Definition Rule.
Well, this indeed seems to be related to precompiled headers. Apparently gcc uses the extension to decide when to treat a file as a header that needs to be precompiled and when to treat it as an implementation file. It seems this can be overridden with the '-x' option, but without it, the first case will violate the one definition rule.
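For the record, with GCC the override would look something like this (reusing a.hpp and b.cpp from the example above):

$ g++ -x c++ a.hpp b.cpp

Now a.hpp should be compiled as an ordinary C++ source file despite its extension, and the multiple-definition error should come back.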

Regarding precompiled headers: after a little reading, I am not sure that they are implemented consistently across vendors. In all cases the purpose is the same, namely smaller compilation times, but it seems that there are two approaches. One approach is described in this Wikipedia article:
http://en.wikipedia.org/wiki/Precompiled_header

According to it (IIUC), the header file is parsed and stored in some internal format. This eliminates the parsing step for that particular header file as long as it doesn't change. Other header files included by the precompiled header might change, but this will not trigger re-parsing.

The gcc approach though seems to be different:
http://gcc.gnu.org/onlinedocs/gcc/Precompiled-Headers.html

I don't know how it is implemented there, but considering the stringent conditions, it seems as if the compiler simply dumps its internal state in some manner. I am only speculating, though.
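For what it's worth, the basic GCC workflow from that page seems to be: compile the header by itself, which dumps the compiler state into a .gch file next to it (the header name here is invented):

$ g++ big_header.hpp

Afterwards, whenever GCC sees #include "big_header.hpp" and finds a valid big_header.hpp.gch (compiled with compatible options), it loads the dumped state instead of re-parsing the header.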

Regards
Okay, the problem is about making my library extremely modular and dynamic. I also want it to be easy to compile everything into one executable, without importing all of the needed *.cpp files into the "project". But then again, it needs to be modular and dynamic. Any solutions?
makefiles?
I don't have much experience with libraries. And I don't really understand the "modular and dynamic" part.

Supposedly, you have to build a .lib file, and since some aspects of those happen to be compiler-dependent (like name mangling and such), you will probably have to distribute your library in source form. What you will probably provide:

a) header files that declare the functions, variables, etc. that the library provides and that also define the relevant classes/structures

b) the build script that will generate the .lib files from the library sources, for which you can use makefiles as Bazzy pointed out (a rough sketch follows below)
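A minimal sketch of b) with the GCC toolchain (file names invented; on Windows toolchains the archive would be a .lib rather than a .a):

$ g++ -c mylib.cpp -o mylib.o
$ ar rcs libmylib.a mylib.o

A user of the library then includes the headers from a) and links against the archive:

$ g++ user_code.cpp -L. -lmylib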

If your library provides templates, then "modular and dynamic" makes more sense. Templates are entirely header-based, and no .lib file is used. Consequently, the way you partition the headers by functionality will affect the compilation times that your users experience. And those users will not need to import anything. With .lib files, though, I am not sure that any partitioning will have an effect on compilation times or anything else.

That's just in theory. Let's see what some other forum members would say, because this is not something I've done.

Regards
Hmm. Makefiles?

Well, I have had an idea cooking in my mind this week. I'm thinking about making my own kind of IDE, one that builds on top of the compiler by introducing a whole new file format for C/C++, which will even require some small additions to the language. Source could be quickly parsed and translated into the actual C++ that needs to be compiled.

I'm thinking of something very radical...

All of your work would be contained in a single file, which is browsed like another filesystem... Of course you should still be able to export sections of your work as plain *.cpp and *.hpp files, or "code branch" files.

But gee, it even needs OpenGL to display the files, because the arrangement would only be sensible to view in 3D. Think of an ideal-volume (spherical) coordinate system. Your files (code modules) show in a web, floating at different spherical-coordinates...

Yum.

Edit: Maybe I should explain this "file system" concept a little more and why it is needed... Over to OpenOffice Writer! I'll be back with my ideas.
Hmm... no one cares? I guess I won't post the document. Anyway, I'm annoyed by Code::Blocks, although it is much less annoying than Visual Studio Ultimate. I am going to start this IDE ASAP.
It's not that no one cares, but it sounds complicated. And a bit subjective. It is a bit like... using airplanes for personal transportation. Otherwise, radical changes to the C++ mechanics are IMO welcome. Honestly, though, I wish you success with the endeavor.
So the links of the web represent the equivalent of #include directives and so forth? Or is it not that simple ;)
A hierarchy:

IO
- Bitmap
  - PNG
  - JPEG
  - BMP
- Sound
  - WAV
  - OGG
  - MP3
- Mesh
  - 3DS
  - OBJ
  - X

Now some of these formats may require other libraries, for decompression etc., but not all require the same kind. How would I fit the extra libraries into this hierarchy? It would work best with a web, since webs are much more "organic" and require fewer rules.