I'm trying to create (and fill) an ID3D11Buffer in C++ and pass a pointer to the buffer's memory via interop to C#, so that I can read from that buffer using SlimDX; no CPU-side copies of the buffer should be involved.
Here's my code so far.
The function export:
extern "C" { __declspec(dllexport) HRESULT getValues(int* debug, ID3D11Buffer* bufOut); }
and the function:
ID3D11Device* d3d11Device;
ID3D11Buffer* triangleVertBuffer;
ID3D11DeviceContext* d3d11DevCon;
HRESULT hr;
HRESULT hr2;

struct Vertex // overloaded vertex structure
{
    Vertex() {}
    Vertex(float x, float y, float z)
        : pos(x, y, z) {}

    XMFLOAT3 pos;
};

HRESULT getValues(int* debug, ID3D11Buffer* bufOut) {
    // Create the vertex data
    Vertex v[] =
    {
        Vertex(0.0f, 0.5f, 0.5f),
        Vertex(0.5f, -0.5f, 0.5f),
        Vertex(-0.5f, -0.5f, 0.5f),
    };

    D3D_FEATURE_LEVEL FeatureLevelsRequested = D3D_FEATURE_LEVEL_11_0;
    UINT numLevelsRequested = 1;
    D3D_FEATURE_LEVEL FeatureLevelsSupported;

    // Create the device
    hr = D3D11CreateDevice(NULL, D3D_DRIVER_TYPE_HARDWARE, NULL,
        D3D11_CREATE_DEVICE_DEBUG, &FeatureLevelsRequested, numLevelsRequested,
        D3D11_SDK_VERSION, &d3d11Device, &FeatureLevelsSupported, &d3d11DevCon);
    debug[0] = hr;

    // Describe and create the vertex buffer
    D3D11_BUFFER_DESC vertexBufferDesc;
    ZeroMemory(&vertexBufferDesc, sizeof(vertexBufferDesc));
    vertexBufferDesc.Usage = D3D11_USAGE_DEFAULT;
    vertexBufferDesc.ByteWidth = sizeof(Vertex) * 3;
    vertexBufferDesc.BindFlags = D3D11_BIND_VERTEX_BUFFER;
    vertexBufferDesc.CPUAccessFlags = 0;
    vertexBufferDesc.MiscFlags = 0;

    D3D11_SUBRESOURCE_DATA vertexBufferData;
    ZeroMemory(&vertexBufferData, sizeof(vertexBufferData));
    vertexBufferData.pSysMem = v;

    hr2 = d3d11Device->CreateBuffer(&vertexBufferDesc, &vertexBufferData, &triangleVertBuffer);
    bufOut = triangleVertBuffer;
    debug[1] = hr2;
    return hr;
}
In C#, I import the function like this:
[System.Runtime.InteropServices.DllImport("CreateID3D11Buffer.dll", CallingConvention = CallingConvention.StdCall)]
private static unsafe extern UInt32 getValues([In, Out] int[] debug, [In, Out] IntPtr bufIn);
And then call it like this:
int[] debug = new int[2];
IntPtr bufPtr; // = IntPtr.Zero;
UInt32 hr = getValues(debug, bufPtr);
DataStream ds = new DataStream(bufPtr, sizeof(Vector3) * 3, true, false);
So far, so good... now to my problems (bear with me, I'm still a C++ beginner and intermediate in C#):
- It doesn't work: in C#, the last line (the DataStream constructor) throws an exception, because bufPtr, the IntPtr to the unmanaged memory, is null. So apparently something is wrong with either my P/Invoke or my C++ code. Therefore, I looked at the HRESULTs of the DX11 functions in C++:
- HRESULT: the return value hr of my imported function in C# equals 0, but both elements of the debug array equal -1! On the one hand, I don't understand how to interpret negative values here; on the other hand, the return value hr should at least equal debug[0] (I'm aware their types differ)? So I don't know what to make of this, but it indicates there's a problem with the function D3D11CreateDevice() and then (understandably) also with d3d11Device->CreateBuffer().
- I'm not sure I understand SlimDX's DataStream constructor correctly: does the IntPtr to unmanaged memory point to graphics memory, or is this a function for reading from unmanaged CPU memory? Unfortunately, I can't find any examples using the IntPtr overload. Here's the doc: https://slimdx.org/docs/html/M_SlimDX_DataStream__ctor_2.htm
So, am I doing this all wrong? :) If so, please point me in the right direction.
NOTE: most of the C++ code is extracted from https://www.braynzarsoft.net/viewtutorial/q16390-4-begin-drawing
Thank you for any help/hints, been trying to figure this out for days.
ID3D11Buffer* is a pointer. If you want to get a pointer back, you have to pass a pointer to a pointer, so declare your C++ function as getValues(int* debug, ID3D11Buffer** bufOut) and assign with *bufOut = triangleVertBuffer;. On the C# side, declare the parameter as out IntPtr bufOut and call it with getValues(debug, out bufPtr).
With that change, this part should work.