Save ARFaceGeometry to OBJ file

In an iOS ARKit app, I've been trying to save the ARFaceGeometry data to an OBJ file. I followed the explanation here: How to make a 3D model from AVDepthData?. However, the OBJ isn't created correctly. Here's what I have:

    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let faceAnchor = anchor as? ARFaceAnchor else { return }
        currentFaceAnchor = faceAnchor

        // If this is the first time with this anchor, get the controller to create content.
        // Otherwise (switching content), will change content when setting `selectedVirtualContent`.
        if node.childNodes.isEmpty, let contentNode = selectedContentController.renderer(renderer, nodeFor: faceAnchor) {
            node.addChildNode(contentNode)
        }

        // https://stackoverflow.com/questions/52953590/how-to-make-a-3d-model-from-avdepthdata
        let geometry = faceAnchor.geometry        
        let allocator = MDLMeshBufferDataAllocator()
        let vertices = allocator.newBuffer(with: Data(fromArray: geometry.vertices), type: .vertex)
        let textureCoordinates = allocator.newBuffer(with: Data(fromArray: geometry.textureCoordinates), type: .vertex)
        let triangleIndices = allocator.newBuffer(with: Data(fromArray: geometry.triangleIndices), type: .index)
        let submesh = MDLSubmesh(indexBuffer: triangleIndices, indexCount: geometry.triangleIndices.count, indexType: .uInt16, geometryType: .triangles, material: MDLMaterial(name: "mat1", scatteringFunction: MDLPhysicallyPlausibleScatteringFunction()))

        let vertexDescriptor = MDLVertexDescriptor()
        // Attributes
        vertexDescriptor.addOrReplaceAttribute(MDLVertexAttribute(name: MDLVertexAttributePosition, format: .float3, offset: 0, bufferIndex: 0))
        vertexDescriptor.addOrReplaceAttribute(MDLVertexAttribute(name: MDLVertexAttributeNormal, format: .float3, offset: MemoryLayout<float3>.stride, bufferIndex: 0))
        vertexDescriptor.addOrReplaceAttribute(MDLVertexAttribute(name: MDLVertexAttributeTextureCoordinate, format: .float2, offset: MemoryLayout<float3>.stride + MemoryLayout<float3>.stride, bufferIndex: 0))
        // Layouts
        vertexDescriptor.layouts.add(MDLVertexBufferLayout(stride: MemoryLayout<float3>.stride + MemoryLayout<float3>.stride + MemoryLayout<float2>.stride))

        let mdlMesh = MDLMesh(vertexBuffers: [vertices, textureCoordinates], vertexCount: geometry.vertices.count, descriptor: vertexDescriptor, submeshes: [submesh])
        mdlMesh.addNormals(withAttributeNamed: MDLVertexAttributeNormal, creaseThreshold: 0.5)
        let asset = MDLAsset(bufferAllocator: allocator)
        asset.add(mdlMesh)

        let documentsPath = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask).first!
        let exportUrl = documentsPath.appendingPathComponent("face.obj")
        try! asset.export(to: exportUrl)
    }
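
For reference, Data(fromArray:) is not a standard Foundation initializer; it's the small helper extension from the linked answer, roughly:

    import Foundation

    extension Data {
        /// Copies the raw bytes of an array of trivial values
        /// (e.g. SIMD3<Float>, Int16) into a Data value.
        init<T>(fromArray values: [T]) {
            self = values.withUnsafeBufferPointer { Data(buffer: $0) }
        }
    }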

The resulting OBJ file looks like this:

    # Apple ModelIO OBJ File: face
    mtllib face.mtl
    g
    v -0.000128156 -0.0277879 0.0575149
    vn 0 0 0
    vt -9.36008e-05 -0.0242016
    usemtl material_1
    f 1/1/1 1/1/1 1/1/1
    f 1/1/1 1/1/1 1/1/1
    f 1/1/1 1/1/1 1/1/1
    ... and many more lines

I would expect many more vertices, and the index values look wrong.

1 Answer

warrenm (accepted answer)

The core issue is that your vertex data isn't described correctly. When you provide a vertex descriptor to Model I/O while constructing a mesh, it represents the layout the data actually has, not your desired layout. You're supplying two vertex buffers, but your vertex descriptor describes an interleaved data layout with only one vertex buffer.

The easiest way to remedy this is to fix the vertex descriptor to reflect the data you're providing:

    let vertexDescriptor = MDLVertexDescriptor()
    // Attributes
    vertexDescriptor.attributes[0] = MDLVertexAttribute(name: MDLVertexAttributePosition,
                                                        format: .float3,
                                                        offset: 0,
                                                        bufferIndex: 0)
    vertexDescriptor.attributes[1] = MDLVertexAttribute(name: MDLVertexAttributeTextureCoordinate,
                                                        format: .float2,
                                                        offset: 0,
                                                        bufferIndex: 1)
    // Layouts
    vertexDescriptor.layouts[0] = MDLVertexBufferLayout(stride: MemoryLayout<float3>.stride)
    vertexDescriptor.layouts[1] = MDLVertexBufferLayout(stride: MemoryLayout<float2>.stride)

When you later call addNormals(...), Model I/O will allocate the necessary space and update the vertex descriptor to reflect the new data. Since you're not rendering from the data and are instead immediately exporting it, the internal layout it chooses for the normals isn't important.
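
For completeness, here is roughly how the corrected descriptor slots into the rest of your code. This is only a sketch: it assumes the same geometry (the ARFaceAnchor's geometry) and the Data(fromArray:) helper from your question, and it leaves the material nil since one isn't needed just to export the geometry:

    let allocator = MDLMeshBufferDataAllocator()
    let vertices = allocator.newBuffer(with: Data(fromArray: geometry.vertices), type: .vertex)
    let textureCoordinates = allocator.newBuffer(with: Data(fromArray: geometry.textureCoordinates), type: .vertex)
    let triangleIndices = allocator.newBuffer(with: Data(fromArray: geometry.triangleIndices), type: .index)

    // A material is optional for a plain geometry export, so pass nil here.
    let submesh = MDLSubmesh(indexBuffer: triangleIndices,
                             indexCount: geometry.triangleIndices.count,
                             indexType: .uInt16,
                             geometryType: .triangles,
                             material: nil)

    // Buffer 0 carries positions, buffer 1 carries texture coordinates,
    // matching the two-buffer descriptor defined above.
    let mdlMesh = MDLMesh(vertexBuffers: [vertices, textureCoordinates],
                          vertexCount: geometry.vertices.count,
                          descriptor: vertexDescriptor,
                          submeshes: [submesh])

    // Model I/O grows the storage and updates the descriptor to hold the generated normals.
    mdlMesh.addNormals(withAttributeNamed: MDLVertexAttributeNormal, creaseThreshold: 0.5)

    let asset = MDLAsset(bufferAllocator: allocator)
    asset.add(mdlMesh)

    let documentsPath = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask).first!
    let exportUrl = documentsPath.appendingPathComponent("face.obj")

    // The delegate callback isn't throwing, so handle the export error locally.
    do {
        try asset.export(to: exportUrl)
    } catch {
        print("OBJ export failed: \(error)")
    }

The important part is that the descriptor handed to MDLMesh describes exactly the two buffers being supplied; everything related to normals is added afterwards by addNormals(...).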

(Screenshot: a correctly exported ARKit face mesh)