Image dims using Unofficial OpenGL SDK

I am trying to get the dimensions of an image in an image set using the Unofficial OpenGL SDK, but I am doing something wrong with the GetDimensions() call. Here is the code I am using to load the image and create a texture using the glimg library:

#include <stdlib.h>
#include <stdio.h>
#include <memory>
#include <glload/gll.hpp>
#include <glimg/glimg.h>
#include <glimg/ImageSet.h>
#include <glimg/ImageFormat.h>
#include <glimg/TextureGeneratorExceptions.h>
#include <GL/glfw.h>

using namespace glimg;

int main()
{
    int     width, height;
    int     frame = 0;
    bool    running = true;

    glfwInit();

    if( !glfwOpenWindow( 640, 480, 0, 0, 0, 0, 0, 0, GLFW_FULLSCREEN ) )
    {
        glfwTerminate();
        return 0;
    }

    glfwEnable( GLFW_MOUSE_CURSOR );
    glfwSetWindowTitle("GLFW Application");

//////////////////////////////////////////////////////////////////////////////////
    if(glload::LoadFunctions() == glload::LS_LOAD_FAILED)
    {
        printf("glload failure.\n");
        running = false;
        return 0;
    }

    GLuint theTexture = 0;

    try
    {
        std::auto_ptr<glimg::ImageSet> pImgSet(glimg::loaders::stb::LoadFromFile("TestImage.png"));
        theTexture = glimg::CreateTexture(pImgSet.get(), 0);
    }
    catch(glimg::loaders::stb::StbLoaderException &e)
    {
        printf("Image file loading failed.\n");
    }
    catch(glimg::TextureGenerationException &e)
    {
        printf("Texture creation failed.\n");
    }

    glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT );
    glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT );
    glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST );
    glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST );

    printf("Texture creation successful.\n");

    glimg::Dimensions dims = ImageSet->glimg::ImageSet::GetDimensions();

////////////////////////////////////////////////////////////////////////////////

    while(running)
    {
    frame++;

        glfwGetWindowSize( &width, &height );
        height = height > 0 ? height : 1;

        glViewport( 0, 0, width, height );

        glClearColor( 0.0f, 0.0f, 0.0f, 0.0f );
        glClear( GL_COLOR_BUFFER_BIT );

        glMatrixMode(GL_PROJECTION);
        glLoadIdentity();
        glOrtho(0.0,width,height,0.0,-1.0,1.0);
        glMatrixMode(GL_MODELVIEW);

        glEnable(GL_TEXTURE_2D);
        glBindTexture (GL_TEXTURE_2D, 1);

        glBegin (GL_QUADS);
        glTexCoord2f (-1.0f, 1.0f); glVertex3i (0,0,0);
        glTexCoord2f (0.0f, 1.0f); glVertex3i (23,0,0);
        glTexCoord2f (0.0f, 0.0f); glVertex3i (23,29,0);
        glTexCoord2f (-1.0f, 0.0f); glVertex3i (0,29,0);
        glEnd ();

        glfwSwapBuffers();

        // exit if ESC was pressed or window was closed
        running = !glfwGetKey(GLFW_KEY_ESC) && glfwGetWindowParam( GLFW_OPENED);
    }

    glfwTerminate();

    return 0;
}


The program breaks on this line:

glimg::Dimensions dims = ImageSet->glimg::ImageSet::GetDimensions();

and returns this error:

||=== TextureTest, Debug ===|
C:\C++\TextureTest\main.cpp||In function 'int main()':|
C:\C++\TextureTest\main.cpp|62|error: expected primary-expression before '->' token|
C:\C++\TextureTest\main.cpp|62|warning: unused variable 'dims' [-Wunused-variable]|
||=== Build finished: 1 errors, 1 warnings (0 minutes, 0 seconds) ===|


I can't seem to figure out what is causing this. I tried posting this on an OpenGL forum, but it was deemed a C++ problem. I have also read through the documentation on the SDK's website, but I wasn't able to find out how to fix this. I'm sorry if I posted this in the wrong forum; I saw another OpenGL post here, so I thought this was the place to post it.
You need to use an instance of the ImageSet class and call the GetDimensions() method on it. Maybe like this (not tested):

std::auto_ptr<glimg::ImageSet> pImgSet;

try
    {
        pImgSet = glimg::loaders::stb::LoadFromFile("TestImage.png");
        theTexture = glimg::CreateTexture(pImgSet.get(), 0);
    }
    catch(glimg::loaders::stb::StbLoaderException &e)
    {
        printf("Image file loading failed.\n");
    }
    catch(glimg::TextureGenerationException &e)
    {
        printf("Texture creation failed.\n");
    }

    glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT );
    glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT );
    glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST );
    glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST );

    printf("Texture creation successful.\n");

    glimg::Dimensions dims = pImgSet->GetDimensions();

//////////////////////////////////////////////////////////////////////////////// 
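The underlying C++ point can be shown without glimg: a member function must be called on an object, not on the type name, which is why the compiler rejects `ImageSet->GetDimensions()` with "expected primary-expression before '->' token". A minimal self-contained illustration (the Dimensions and ImageSet types here are simplified stand-ins, not the real glimg classes):

```cpp
// Stand-ins for the glimg types, just to illustrate the language rule.
struct Dimensions { int width; int height; };

struct ImageSet {
    Dimensions GetDimensions() const { return {640, 480}; }
};

Dimensions demo()
{
    ImageSet img;                    // an actual instance of the class
    // ImageSet->GetDimensions();    // error: ImageSet names a type, not an object
    return img.GetDimensions();      // OK: call the method on the instance
}
```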
I tried doing that, but it returned an error saying that pImgSet isn't declared in that scope.
I derped a bit there... Anyway, I fixed it by defining the ImageSet outside of the try block, which I somehow missed when I read your example. Thank you for the help.
The compiler is returning

C:\C++\TextureTest\main.cpp||In function 'int main()':|
C:\C++\TextureTest\main.cpp|46|error: no match for 'operator=' in 
'pImgSet = glimg::loaders::stb::LoadFromFile((* & std::basic_string<char>(((const char*)"TestImage.png"), (*(const std::allocator<char>*)(& std::allocator<char>())))))'|
C:\C++\TextureTest\main.cpp|46|note:   mismatched types 'std::auto_ptr<_Tp1>' and 'glimg::ImageSet*'|
||=== Build finished: 1 errors, 0 warnings (0 minutes, 0 seconds) ===|


when I build the example code you wrote. I tried commenting out the try and just creating the ImageSet and texture as in the code I posted, and everything ran fine. I also tried commenting out the try and creating the image and texture as you posted in your example code, and the compiler returned a similar error to the one above.

I want to keep the try statement to catch image loading errors. How would I do that while creating the ImageSet outside of it?
Not using std::auto_ptr ...
glimg::ImageSet* pImgSet = NULL;

try
    {
        pImgSet = glimg::loaders::stb::LoadFromFile("TestImage.png");
        theTexture = glimg::CreateTexture(pImgSet.get(), 0);
    }
    catch(glimg::loaders::stb::StbLoaderException &e)
    {
        printf("Image file loading failed.\n");
    }
    catch(glimg::TextureGenerationException &e)
    {
        printf("Texture creation failed.\n");
    }

    glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT );
    glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT );
    glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST );
    glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST );

    printf("Texture creation successful.\n");
    glimg::Dimensions dims;
    if (pImgSet) {
        dims = pImgSet->GetDimensions();
    }
That works, except it seems the .get() in the texture creation call is no longer needed. What is the std::auto_ptr for, then? Thank you for all the help.
It is a "smart pointer"; for more information, read here:
http://www.cplusplus.com/reference/memory/auto_ptr/
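For readers finding this thread later: std::auto_ptr was deprecated in C++11 and removed in C++17; std::unique_ptr is the modern replacement. The pattern the thread converges on, declaring the handle before the try block, assigning inside it, and checking it before use, can be sketched like this. The LoadFromFile and ImageSet here are hypothetical stand-ins so the sketch compiles on its own; they are not the real glimg API:

```cpp
#include <cstdio>
#include <memory>
#include <stdexcept>

// Minimal stand-ins so the sketch is self-contained; in the real program
// these would be glimg::ImageSet and glimg::loaders::stb::LoadFromFile.
struct Dimensions { int width; int height; };

struct ImageSet {
    Dimensions dims;
    Dimensions GetDimensions() const { return dims; }
};

// Hypothetical loader: returns an owning raw pointer, as the glimg loader
// does, or throws on failure (the 'fail' flag simulates a missing file).
ImageSet* LoadFromFile(const char* path, bool fail)
{
    if (fail) throw std::runtime_error(path);
    return new ImageSet{ {640, 480} };
}

// Declare the smart pointer before the try block so it stays in scope
// afterwards; use reset() to adopt the raw pointer the loader returns.
Dimensions loadDims(const char* path, bool fail)
{
    std::unique_ptr<ImageSet> pImgSet;
    try {
        pImgSet.reset(LoadFromFile(path, fail));
    } catch (std::exception&) {
        std::printf("Image file loading failed.\n");
    }

    // Guard against a failed load before dereferencing.
    if (pImgSet)
        return pImgSet->GetDimensions();
    return {0, 0};
}
```

Note that reset() is also why the earlier "no match for 'operator='" error appeared: auto_ptr has no assignment operator taking a raw pointer, but it does provide reset(), so `pImgSet.reset(glimg::loaders::stb::LoadFromFile("TestImage.png"));` would likewise have worked inside the try block.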