Problem allocating a 1 GB OpenGL 3D texture


I'm doing some work on volume rendering. I want to allocate a 3D luminance texture of 1024×1024×1024 uchar, but it always fails.

By calling glGetError() after glTexImage3D(...), I get error code 1285, which is GL_OUT_OF_MEMORY.

However, my card is an NVIDIA Quadro 4800 with 1536 MB of memory, which is larger than the texture above (1 GB). The driver version is 296.88, and GLEW is at its latest version, 1.8.

The code that allocates the texture is as follows:

glBindTexture(GL_TEXTURE_3D, texVoxels);
glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
// Allocate and upload the 1024x1024x1024 single-channel volume.
glTexImage3D(GL_TEXTURE_3D, 0, GL_LUMINANCE, volumeSize.x, volumeSize.y, volumeSize.z, 0, GL_LUMINANCE, GL_UNSIGNED_BYTE, voxels);

// glGetError() returns a GLenum; 1285 == 0x0505 == GL_OUT_OF_MEMORY.
GLenum err = glGetError();
printf("%u\n", err);

P.S. Calling glGetIntegerv(GL_MAX_3D_TEXTURE_SIZE, dim) returns 2048, so the hardware should be able to handle a 1024³ uchar texture.
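A GL_PROXY_TEXTURE_3D probe can also test this exact size/format combination without allocating anything. A minimal sketch (not code from my original program, and note that proxies only validate format and dimension limits; implementations are not required to check actual free memory):

// Issue the same call against the proxy target; nothing is allocated.
glTexImage3D(GL_PROXY_TEXTURE_3D, 0, GL_LUMINANCE8,
             1024, 1024, 1024, 0,
             GL_LUMINANCE, GL_UNSIGNED_BYTE, NULL);

// If the implementation rejects the combination, the proxy's width reads back as 0.
GLint width = 0;
glGetTexLevelParameteriv(GL_PROXY_TEXTURE_3D, 0, GL_TEXTURE_WIDTH, &width);
if (width == 0)
    printf("1024^3 GL_LUMINANCE8 texture not supported\n");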

P.P.S. By the way, I also do some work in CUDA, and there is no problem when I allocate a 2048³ uchar texture there.
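For reference, here is a minimal sketch of that kind of allocation through the CUDA runtime API (a reconstruction using cudaMalloc3D, not the code I actually ran):

#include <cuda_runtime.h>
#include <stdio.h>

int main(void)
{
    // Request 2048^3 one-byte voxels as a single pitched 3D block.
    // For cudaMalloc3D the extent width is given in bytes.
    cudaExtent extent = make_cudaExtent(2048, 2048, 2048);
    cudaPitchedPtr vol;
    cudaError_t err = cudaMalloc3D(&vol, extent);
    printf("cudaMalloc3D: %s\n", cudaGetErrorString(err));
    if (err == cudaSuccess)
        cudaFree(vol.ptr);
    return 0;
}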

What is the problem with this OpenGL 3D texture?

There are 2 answers

rtrobin (accepted answer)

Thanks to everyone who answered my question. I eventually found my mistake.

I had been compiling the code in Win32 (32-bit) mode. When I switched to compiling and running in x64 mode, it finally works, and it does so with either GL_LUMINANCE or GL_LUMINANCE8.

It seems that the 32-bit build cannot address a texture allocation much larger than roughly 600 MB.
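A trivial way to confirm at run time which mode a build is in (my addition, purely for illustration):

// Prints 4 for a Win32 (32-bit) build and 8 for an x64 build.
printf("pointer size: %u bytes\n", (unsigned)sizeof(void *));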

Here is the link I eventually found, which describes the same problem I ran into: http://www.opengl.org/discussion_boards/showthread.php/177442-Out-of-memory-problem?highlight=memory

Nicol Bolas

What is the problem with this OpenGL 3D texture?

You exceeded memory limitations.

Just because your GPU's memory is 1.5GB doesn't mean that you can allocate a 1GB chunk as a single texture. Your OpenGL implementation has the right to refuse to do so on memory grounds, regardless of how much GPU memory your card has.

And if it wants to reject this, there's nothing you can do about it.

The best you can do is give your image a sized internal format. That is, use GL_LUMINANCE8 rather than the unsized GL_LUMINANCE. Or better yet, use GL_R8. There is no guarantee that any of these will work.
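A sketch of the GL_R8 route, with a texture swizzle so shaders still see the value broadcast across RGB the way GL_LUMINANCE behaved (the swizzle requires GL 3.3 or ARB_texture_swizzle; texVoxels, volumeSize, and voxels are the asker's variables):

GLint swizzle[4] = { GL_RED, GL_RED, GL_RED, GL_ONE };

glBindTexture(GL_TEXTURE_3D, texVoxels);
glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
// Broadcast R into RGB so existing luminance-style shaders keep working.
glTexParameteriv(GL_TEXTURE_3D, GL_TEXTURE_SWIZZLE_RGBA, swizzle);
// Sized one-channel internal format instead of the unsized GL_LUMINANCE.
glTexImage3D(GL_TEXTURE_3D, 0, GL_R8,
             volumeSize.x, volumeSize.y, volumeSize.z, 0,
             GL_RED, GL_UNSIGNED_BYTE, voxels);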