GLUT and SDL

I'm working on a project and I have a question: will GLUT and SDL work together? GLUT as the rendering and SDL as the GUI.
Why would you use both? They can each perform both of those roles.
Thank you for your answer
@chrisname does GLUT do both 3D and 2D graphics?
For starters... GLUT is ancient and you shouldn't use it. You should use FreeGLUT, which is the more modern replacement.

Likewise, SDL 1.x is ancient and you shouldn't use it either. SDL 2.0 is preferred.

Though I personally prefer SFML over either SDL or FreeGLUT, but whatever.



Secondly... FreeGLUT (or GLUT) doesn't do any graphics. It's a window managing lib designed to work with OpenGL. OpenGL is what does the graphics.

OpenGL can be used with FreeGLUT, SDL, or even SFML, and is fully capable of 2D or 3D graphics.
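
To illustrate the split, here's a rough sketch (window size and title are just placeholders): FreeGLUT only opens the window and runs the event loop, and everything that actually draws is an OpenGL call.

#include <GL/freeglut.h>

static void display()
{
    glClearColor(0.1f, 0.1f, 0.1f, 1.0f);   // OpenGL does the graphics...
    glClear(GL_COLOR_BUFFER_BIT);
    glutSwapBuffers();                        // ...FreeGLUT just manages the window
}

int main(int argc, char** argv)
{
    glutInit(&argc, argv);                    // FreeGLUT: window/context management
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGBA);
    glutInitWindowSize(800, 600);
    glutCreateWindow("FreeGLUT window, OpenGL rendering");
    glutDisplayFunc(display);
    glutMainLoop();                           // hand control over to FreeGLUT's event loop
}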


EDIT:

Also... speaking of ancient... if you are using old OpenGL functions like glBegin(), glMatrixMode(), etc. -- that too is something you shouldn't do.
closed account (N36fSL3A)
if you are using old OpenGL functions like glBegin(), glMatrixMode(), etc. -- that too is something you shouldn't do.

Why not? I find them perfectly fine and I like them better than that other crap. (What's the point of using the others?)
Why not?


They've been deprecated, which means they're being phased out.

I don't even think newer versions of OpenGL have them at all.

What's the point of using the others?


Aside from being more modern and working with more modern OpenGL features...

The others (glBindVertexArray, glDrawElements, glDrawArrays, etc.) keep as much data in GPU memory as possible, to minimize transferring data from CPU->GPU (which is extremely slow).

With glBegin/glEnd, every vertex must go over the bus every time it's drawn.
With the glMatrix functions, every matrix change must also get moved over.

With modern OpenGL it's all on the GPU from the get-go. It lets the GPU do more of the work, and less time is wasted on CPU<->GPU communication.
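
Rough sketch of the difference, assuming a GL context (and, for the modern path, an extension loader like GLEW and a bound shader) is already set up:

#include <GL/glew.h>

// Old immediate mode: every vertex crosses the CPU->GPU bus every frame.
void drawTriangleImmediate()
{
    glBegin(GL_TRIANGLES);
        glVertex3f(-0.5f, -0.5f, 0.0f);
        glVertex3f( 0.5f, -0.5f, 0.0f);
        glVertex3f( 0.0f,  0.5f, 0.0f);
    glEnd();
}

// Modern OpenGL: upload once at startup; the data then lives in GPU memory.
GLuint vbo = 0;

void uploadTriangleOnce()
{
    const GLfloat verts[] = { -0.5f, -0.5f, 0.0f,
                               0.5f, -0.5f, 0.0f,
                               0.0f,  0.5f, 0.0f };
    glGenBuffers(1, &vbo);
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    glBufferData(GL_ARRAY_BUFFER, sizeof(verts), verts, GL_STATIC_DRAW); // one-time transfer
}

// Each frame only issues a draw call; no vertex data is re-sent.
void drawTriangleModern()
{
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    glEnableVertexAttribArray(0);
    glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 0, nullptr);
    glDrawArrays(GL_TRIANGLES, 0, 3);
}
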
So overall I should not use the Code::Blocks GLUT library?

And how do I use FreeGLUT in Code::Blocks?
You have to install it, but you should use SDL or SFML instead - you have to install those too; I recommend SFML, which you can find at http://sfml-dev.org
Does SFML do 3D graphics?
closed account (o1vk4iN6)
They've been deprecated, which means they're being phased out.


They aren't being phased out; they were already removed. They were deprecated for only one version (3.0) and then removed in 3.1.

Why not? I find them perfectly fine and I like them better than that other crap. (What's the point of using the others?)


So yeah, if you want to use any new features of OpenGL you'll have to let go of those functions.

They are slow, for one thing; as Disch said, passing memory from CPU to GPU is slow. If you are passing a 100,000+ vertex model every frame, that's going to be slow. One method I've seen of reducing the CPU-to-GPU overhead is to render every object that uses the same shader at the same time, so that shader initialization (shader constants included) doesn't need to happen again for another instance.
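
A hypothetical sketch of that batching idea (the Object struct and names are made up, and it assumes a loader like GLEW is already initialized):

#include <GL/glew.h>
#include <map>
#include <vector>

struct Object { GLuint shader; /* VAO, uniforms, etc. */ };

void drawScene(const std::vector<Object>& objects)
{
    // Bucket objects by the shader they use (a real engine would keep this
    // sorted ahead of time instead of rebuilding it every frame).
    std::map<GLuint, std::vector<const Object*>> byShader;
    for (const Object& obj : objects)
        byShader[obj.shader].push_back(&obj);

    for (const auto& bucket : byShader)
    {
        glUseProgram(bucket.first);      // shader setup happens once per group
        for (const Object* obj : bucket.second)
        {
            // set per-object uniforms, bind its VAO, issue the draw call...
        }
    }
}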

It was removed because, at the end of the day, it was never used in production code for that reason. I mean, I like it for debugging purposes; it was a fast, simple way to do it, and I kind of wish they would implement a new solution for that. It also got rid of GLSL built-ins such as gl_ProjectionMatrix, so you have to handle passing your own matrices to the shader.
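
For example, a sketch of passing your own projection matrix via a uniform -- the uniform name "projection" and the use of GLM here are just illustrative choices, not anything the spec requires:

#include <GL/glew.h>
#include <glm/glm.hpp>
#include <glm/gtc/matrix_transform.hpp>
#include <glm/gtc/type_ptr.hpp>

void setProjection(GLuint program, float aspect)
{
    // Build the matrix yourself on the CPU...
    glm::mat4 proj = glm::perspective(glm::radians(60.0f), aspect, 0.1f, 100.0f);

    // ...and hand it to the shader, which declares "uniform mat4 projection;"
    glUseProgram(program);
    GLint loc = glGetUniformLocation(program, "projection");
    glUniformMatrix4fv(loc, 1, GL_FALSE, glm::value_ptr(proj));
}
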
Does SFML do 3D graphics?


No, but it sets up a GL context, so you can use it just as a window manager for getting OpenGL set up... then you can use OpenGL for 3D graphics.
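
Something like this (a rough sketch; the window size and title are placeholders):

#include <SFML/Window.hpp>
#include <SFML/OpenGL.hpp>

int main()
{
    // SFML only provides the window and the GL context...
    sf::Window window(sf::VideoMode(800, 600), "SFML window, OpenGL rendering");

    while (window.isOpen())
    {
        sf::Event event;
        while (window.pollEvent(event))
            if (event.type == sf::Event::Closed)
                window.close();

        // ...the 3D drawing itself is raw OpenGL.
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
        // your own OpenGL drawing goes here
        window.display();   // swap buffers
    }
}
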
^ Which is just as much as what GLUT, SDL, and anything else do.
Correct.
It's phased out in newer versions of OpenGL, but older versions still exist on your computer; you just need to target them.

For some 3D graphics programs, it seems like it would not be a bad idea to have fallback modes.

If you want it to work with Ivy Bridge Intel integrated graphics, you should be able to target or fall back to 3.0 and be safe. If you want it to be supported on a VM, then you need to target or fall back to 2.1, and use all of those ancient functions.

Finally, the new Haswell integrated graphics will support version 4.0 on Linux.

So if you're writing something like a GUI application, then you might be better off with version 2.1, or at least falling back to 2.1, because performance would not be an issue, and you would want it to be supported on a VM.

Games generally don't work on VMs anyway, so you might as well only target or fall back to 3.0 if you're trying to make it well supported.
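
A hypothetical fallback sketch using SDL2 (whether an unavailable version fails outright or gets promoted by the driver varies, so treat this as illustrative only; the window is assumed to have been created with SDL_WINDOW_OPENGL):

#include <SDL.h>

SDL_GLContext createContextWithFallback(SDL_Window* window)
{
    // Ask for a 3.0 context first...
    SDL_GL_SetAttribute(SDL_GL_CONTEXT_MAJOR_VERSION, 3);
    SDL_GL_SetAttribute(SDL_GL_CONTEXT_MINOR_VERSION, 0);
    SDL_GLContext ctx = SDL_GL_CreateContext(window);

    if (!ctx)   // the driver (e.g. inside a VM) couldn't give us 3.0
    {
        // ...fall back to 2.1, where the old fixed-function calls still exist.
        SDL_GL_SetAttribute(SDL_GL_CONTEXT_MAJOR_VERSION, 2);
        SDL_GL_SetAttribute(SDL_GL_CONTEXT_MINOR_VERSION, 1);
        ctx = SDL_GL_CreateContext(window);
    }
    return ctx;
}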