My DirectX 11 C++ engine uses uint16_t (unsigned short) for the vertex index buffer, and everything was working well.
My models have since grown to more than 64k indices.
I changed all references to my index buffer from uint16_t to uint32_t, and rendering broke.
My variable declarations are:
ID3D11Buffer *IndexBuffer; //DirectX Index Buffer
vector<uint32_t> primitiveIndices; //Vector array of indices, formerly uint16_t
I finally changed the line
Context->IASetIndexBuffer(IndexBuffer, DXGI_FORMAT_R16_UINT, 0);
to
Context->IASetIndexBuffer(IndexBuffer, DXGI_FORMAT_R8G8B8A8_UINT, 0);
This was done to allow 32-bit indices; however, it fails to render. I have also updated the
D3D11_BUFFER_DESC::ByteWidth
accordingly.
Any advice welcome.
What exactly do you think is the meaning of
DXGI_FORMAT_R8G8B8A8_UINT
as an index buffer format? If you check the documentation, you will find there are only two valid formats that IASetIndexBuffer() will accept. If your indices are std::uint32_t, then the corresponding DXGI format to use would be DXGI_FORMAT_R32_UINT.
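For reference, here is a minimal sketch of what the 32-bit setup could look like. The Device, Context, IndexBuffer and primitiveIndices names are taken from your question; the helper function itself and the usage flags are just illustrative assumptions:

#include <d3d11.h>
#include <vector>
#include <cstdint>

// Sketch: create and bind a 32-bit index buffer.
// Assumes Device and Context are valid and primitiveIndices now holds uint32_t values.
HRESULT CreateAndBindIndexBuffer(ID3D11Device* Device,
                                 ID3D11DeviceContext* Context,
                                 const std::vector<uint32_t>& primitiveIndices,
                                 ID3D11Buffer** IndexBuffer)
{
    D3D11_BUFFER_DESC ibd = {};
    ibd.Usage = D3D11_USAGE_DEFAULT;
    // ByteWidth must reflect the 32-bit element size.
    ibd.ByteWidth = static_cast<UINT>(primitiveIndices.size() * sizeof(uint32_t));
    ibd.BindFlags = D3D11_BIND_INDEX_BUFFER;

    D3D11_SUBRESOURCE_DATA initData = {};
    initData.pSysMem = primitiveIndices.data();

    HRESULT hr = Device->CreateBuffer(&ibd, &initData, IndexBuffer);
    if (FAILED(hr))
        return hr;

    // The format passed here must match the element type stored in the buffer:
    // uint16_t -> DXGI_FORMAT_R16_UINT, uint32_t -> DXGI_FORMAT_R32_UINT.
    Context->IASetIndexBuffer(*IndexBuffer, DXGI_FORMAT_R32_UINT, 0);
    return S_OK;
}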
Apart from that, I highly recommend creating your device with the debug layer enabled and looking at the debug output when debugging…
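As a rough sketch (the helper name and driver-type choice are just illustrative), enabling the debug layer only requires passing D3D11_CREATE_DEVICE_DEBUG at device creation; with it enabled, an invalid argument such as DXGI_FORMAT_R8G8B8A8_UINT on IASetIndexBuffer() is reported in the debug output:

#include <d3d11.h>

// Sketch: create the device with the debug layer so API misuse is reported.
// Requires the D3D11 SDK layers / Graphics Tools to be installed.
HRESULT CreateDebugDevice(ID3D11Device** Device, ID3D11DeviceContext** Context)
{
    UINT flags = 0;
#if defined(_DEBUG)
    flags |= D3D11_CREATE_DEVICE_DEBUG;   // enable validation messages in debug builds
#endif

    D3D_FEATURE_LEVEL featureLevel = {};
    return D3D11CreateDevice(
        nullptr,                    // default adapter
        D3D_DRIVER_TYPE_HARDWARE,
        nullptr,                    // no software rasterizer module
        flags,
        nullptr, 0,                 // default feature level list
        D3D11_SDK_VERSION,
        Device,
        &featureLevel,
        Context);
}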