I'm doing some work on volume rendering. I want to allocate a 3D luminance texture of 1024x1024x1024 uchar, but the allocation always fails.
By adding glGetError() after glTexImage3D(...), I get the error code 1285, which means "Out of memory".
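For reference, glGetError() returns a GLenum, and 1285 is 0x0505, which is GL_OUT_OF_MEMORY in gl.h. A small sketch of how the raw number can be printed more readably (not part of my original code):

GLenum err = glGetError();
if (err == GL_OUT_OF_MEMORY)            /* 0x0505 == 1285 */
    printf("GL_OUT_OF_MEMORY (0x%04X)\n", err);
else if (err != GL_NO_ERROR)
    printf("GL error 0x%04X\n", err);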
However, my card is an NVIDIA Quadro 4800 with 1536 MB of memory, which is larger than the texture size above (1 GB). The driver version is 296.88, and I'm using the latest GLEW, version 1.8.
My code for allocating the texture is as follows:
glBindTexture(GL_TEXTURE_3D, texVoxels);
glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexImage3D(GL_TEXTURE_3D, 0, GL_LUMINANCE, volumeSize.x, volumeSize.y, volumeSize.z, 0, GL_LUMINANCE, GL_UNSIGNED_BYTE, voxels);
int err = glGetError();
printf("%d\n", err);
PS. By using *glGetIntegerv(GL_MAX_3D_TEXTURE_SIZE, dim)*, I get 2048, which means my hardware should be able to allocate a 1024³ uchar texture.
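Note that GL_MAX_3D_TEXTURE_SIZE only reports the dimension limit, not whether a particular texture will fit. A minimal sketch of the proxy-texture dry run that asks the driver whether it would accept this specific size and format (it is still not a guarantee the real allocation succeeds; volumeSize is the same variable as above, and this is not code from my project):

/* Describe the texture to GL_PROXY_TEXTURE_3D instead of GL_TEXTURE_3D;
   no storage is allocated, the driver only records whether it could accept it. */
glTexImage3D(GL_PROXY_TEXTURE_3D, 0, GL_LUMINANCE,
             volumeSize.x, volumeSize.y, volumeSize.z, 0,
             GL_LUMINANCE, GL_UNSIGNED_BYTE, NULL);

/* If the request was rejected, the proxy's width reads back as 0. */
GLint proxyWidth = 0;
glGetTexLevelParameteriv(GL_PROXY_TEXTURE_3D, 0, GL_TEXTURE_WIDTH, &proxyWidth);
if (proxyWidth == 0)
    printf("Driver rejected a %dx%dx%d GL_LUMINANCE 3D texture\n",
           volumeSize.x, volumeSize.y, volumeSize.z);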
PPS. By the way, I also do some work in CUDA, and there is no problem when I allocate a 2048³ uchar texture there.
What is wrong with this OpenGL 3D texture?
Thanks to everyone who answered my question. I eventually found my mistake.
I had been compiling the code in Win32 mode. When I switched to compiling and running in x64 mode, it finally WORKS!! And it works regardless of whether I use GL_LUMINANCE or GL_LUMINANCE8.
It seems that in a Win32 build the process cannot address a video memory allocation larger than roughly 600 MB, presumably because the 32-bit address space is exhausted.
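If anyone wants to guard against accidentally building the wrong configuration, a trivial compile-time check (hypothetical, not something that was in my project) is:

/* Refuse to build a 32-bit configuration; only the x64 build had enough
   address space for the 1 GB texture upload on my machine. */
static_assert(sizeof(void *) == 8,
              "Build as x64: the Win32 build ran out of address space for the 1 GB 3D texture.");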
Here is the link I found that explains the same problem I encountered: http://www.opengl.org/discussion_boards/showthread.php/177442-Out-of-memory-problem?highlight=memory