Why does AGAL's createVertexBuffer method have a data32PerVertex parameter with a max value of 64?


I don't understand this value of 64. From what I understand, we have at most 8 vertex attribute registers, each 128 bits wide (4 data32), so we cannot access more than 32 data32. Am I wrong? What are the other 32 data32 that we can store in a vertex for?

Thanks


There are 3 answers

nikitablack

You can access this data with setVertexBufferAt and its bufferOffset parameter. But you're right: it's not possible to use ALL of that data at once, since only 8 vertex attribute registers are available.
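As a minimal sketch (assuming a hypothetical interleaved layout of 3 position floats followed by 2 uv floats, and that context3D, vertexData and numVertices already exist in scope), binding two attribute registers to different slices of one buffer via bufferOffset looks like this:

    // Hypothetical layout: [x, y, z, u, v] = 5 data32 per vertex
    var buffer:VertexBuffer3D = context3D.createVertexBuffer(numVertices, 5);
    buffer.uploadFromVector(vertexData, 0, numVertices);

    // va0 <- position: read 3 floats starting at data32 offset 0
    context3D.setVertexBufferAt(0, buffer, 0, Context3DVertexBufferFormat.FLOAT_3);
    // va1 <- uv: read 2 floats starting at data32 offset 3
    context3D.setVertexBufferAt(1, buffer, 3, Context3DVertexBufferFormat.FLOAT_2);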

starmole

This is related to the Direct3D caps value MaxStreamStride, which is typically 256 bytes. Since each data32 is 4 bytes, that gives 256 / 4 = 64 data32 per vertex. But seriously, what do you need that much stream data for?

Eloims

There are two points to consider.

  • The first is that having 8 registers of four data32 each does not mean you can use all 32 data32, because you will waste space on padding.

  • The second is that most of the time in a video game, the image you see is rendered in multiple passes that each use different data from the vertex. Hence the need to put more data in each vertex than a single shader can handle.

Imagine a contrived scenario where you want to render a model with up to 10 bones per vertex. (This is purely theoretical: Flash has a hardcoded limit of roughly 200 AGAL instructions, and 10 bones per vertex is totally useless anyway unless you are modelling an octopus.)

In each vertex you will have a lot of data, like this (see the buffer sketch after the list). It totals 3*3 + 12*2 = 33 data32 per vertex:

3 data32 for position
3 data32 for normal
3 data32 for tangent
2 data32 for boneData1
2 data32 for boneData2
2 data32 for boneData3
2 data32 for boneData4
2 data32 for boneData5
2 data32 for boneData6
2 data32 for boneData7
2 data32 for boneData8
2 data32 for boneData9
2 data32 for boneData10
2 data32 for texture_uv
2 data32 for lightmap_uv
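To make the layout concrete, here is a sketch of creating and filling such a buffer; the data32 offsets are implied by the list above, and context3D, vertexData and numVertices are assumed to exist:

    // 33 data32 per vertex, well under the 64 limit
    var buffer:VertexBuffer3D = context3D.createVertexBuffer(numVertices, 33);
    // data32 offsets within one vertex:
    //   0 position(3)   3 normal(3)   6 tangent(3)
    //   9..28 boneData1..boneData10 (2 each)
    //   29 texture_uv(2)   31 lightmap_uv(2)
    buffer.uploadFromVector(vertexData, 0, numVertices);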

In a typical deferred shading scenario you will do the following:

1 - Render a "view space normal map" into a texture. You will then write a shader that needs to use: position, normal, tangent, and all bone data.

So the shader will use 29 data32. But all 8 registers will be full, because you need to pad the position, normal and tangent to 4 data32 each (va0 position, va1 normal, va2 tangent, va3-va7 bone data).

You will waste space on va0.w, va1.w and va2.w.
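A sketch of the bindings for this pass, using the offsets assumed in the buffer sketch above; each FLOAT_4 register packs two 2-data32 bone entries, which is how 20 data32 of bone data fit in 5 registers:

    // Pass 1: view space normal map
    context3D.setVertexBufferAt(0, buffer, 0,  Context3DVertexBufferFormat.FLOAT_3); // position, va0.w wasted
    context3D.setVertexBufferAt(1, buffer, 3,  Context3DVertexBufferFormat.FLOAT_3); // normal,   va1.w wasted
    context3D.setVertexBufferAt(2, buffer, 6,  Context3DVertexBufferFormat.FLOAT_3); // tangent,  va2.w wasted
    context3D.setVertexBufferAt(3, buffer, 9,  Context3DVertexBufferFormat.FLOAT_4); // boneData1 + boneData2
    context3D.setVertexBufferAt(4, buffer, 13, Context3DVertexBufferFormat.FLOAT_4); // boneData3 + boneData4
    context3D.setVertexBufferAt(5, buffer, 17, Context3DVertexBufferFormat.FLOAT_4); // boneData5 + boneData6
    context3D.setVertexBufferAt(6, buffer, 21, Context3DVertexBufferFormat.FLOAT_4); // boneData7 + boneData8
    context3D.setVertexBufferAt(7, buffer, 25, Context3DVertexBufferFormat.FLOAT_4); // boneData9 + boneData10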

2 - Render a "view space depth map" into a texture. You will then write a shader that needs to use: position and all bone data.

So the shader will use 23 data32.
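Under the same assumed layout, this pass uses bufferOffset to read the bone data while skipping over the normal and tangent entirely; a sketch:

    // Pass 2: view space depth map, only 6 registers in use
    context3D.setVertexBufferAt(0, buffer, 0,  Context3DVertexBufferFormat.FLOAT_3); // position
    context3D.setVertexBufferAt(1, buffer, 9,  Context3DVertexBufferFormat.FLOAT_4); // boneData1 + boneData2
    context3D.setVertexBufferAt(2, buffer, 13, Context3DVertexBufferFormat.FLOAT_4); // boneData3 + boneData4
    context3D.setVertexBufferAt(3, buffer, 17, Context3DVertexBufferFormat.FLOAT_4); // boneData5 + boneData6
    context3D.setVertexBufferAt(4, buffer, 21, Context3DVertexBufferFormat.FLOAT_4); // boneData7 + boneData8
    context3D.setVertexBufferAt(5, buffer, 25, Context3DVertexBufferFormat.FLOAT_4); // boneData9 + boneData10
    // Clear registers left over from the previous pass
    context3D.setVertexBufferAt(6, null);
    context3D.setVertexBufferAt(7, null);

Passes 3 and 4 below bind the same way, adding texture_uv (offset 29) or lightmap_uv (offset 31) as a FLOAT_2.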

3 - Render a "view space diffuse map" into a texture. You will then write a shader that needs to use: position, texture_uv and all bone data.

So the shader will use 25 data32.

4 - Render a "view space light map" into a texture. You will then write a shader that needs to use: position, lightmap_uv and all bone data.

So the shader will use 25 data32.

5 - Finally, composite and do deferred lighting to build your final image.

All 33 data32 have been used. No shader used more than 8 registers at the same time.