Error-proofing my Logic App JSON schemas for cyber incident handling: how to proceed?


So people,

I am building an Azure Logic App that retrieves information related to a security incident through an API. The idea is then to parse this JSON and post it into an Azure Log Analytics workspace for further handling. This Logic App would be fully automated.

One issue is gathering the needed data to send. So the data from the API is in such a format:

```json
{
    "data": [
        {
            "property1": "value1",
            "property2": "value2",
            "property3": "value3"
        }
    ],
    "extra data": [
        {
            "edata1": "evalue1",
            "edata2": "evalue2",
            "edata3": "evalue3"
        }
    ]
}
```

I initially used a sample of the API response to generate the schema in the Logic App, but later ran into various issues, as the fields differ slightly and don't always contain all the properties (in this example, "extra data"). The data is then null or arrives in a slightly different format.
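To make the problem concrete: whatever mechanism is used, the parsing step has to tolerate sections that are absent or null. A minimal sketch of that defensive extraction, in plain Python rather than Logic App expressions (the function name and the empty-list default are my own illustration, not part of the Logic App):

```python
def extract_sections(response: dict) -> dict:
    """Return both sections of the API response, substituting an empty
    list when a section is missing or explicitly null, so downstream
    steps never fail on an absent key."""
    return {
        "data": response.get("data") or [],
        "extra data": response.get("extra data") or [],
    }

# A response without "extra data" no longer causes an error:
print(extract_sections({"data": [{"property1": "value1"}]}))
# → {'data': [{'property1': 'value1'}], 'extra data': []}
```

In a Logic App the analogous trick is wrapping the field access so a missing section falls back to a safe default instead of failing the run.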

For the Log Analytics table, I have the names of all the possible properties, which I intend to populate the log table with.

As I wish to send all the data from the API response, I have not been able to figure out an error-proof way of doing this, since the inputs differ slightly.

The second part is then to send the data to the log table. I am using the Azure Log Analytics connector in Logic Apps. The format needs to be:

```json
{
    "property1": "value1",
    "property2": "value2",
    "property3": "value3"
}
```

As these will be dynamic values, each value comes from the property of the same name in the API response. I would like the payload to be dynamic as well, so that it doesn't send columns with the literal text "null" to the log table.
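One way to express the intended behaviour: build the record from only the properties that actually have a value. This is a language-neutral sketch in plain Python (the function name is my own; inside a Logic App you would achieve the same effect with expressions or an inline-code step rather than this exact code):

```python
def build_log_record(record: dict) -> dict:
    # Keep only properties that actually have a value, so no column
    # containing the literal text "null" is created in the log table.
    return {k: v for k, v in record.items() if v is not None}

print(build_log_record(
    {"property1": "value1", "property2": None, "property3": "value3"}
))
# → {'property1': 'value1', 'property3': 'value3'}
```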

How would you go forward with this? Thanks in advance!

Provided a sample from the API response to generate the schema for JSON parsing.

For log forwarding, I simply tested by using static values from an actual run.

1 answer

Answer by RithwikBojja:

> but later ran into different issues, as the fields slightly differ and don't always contain all the properties (in this example the "extra data")

Then you need to change your schema: remove the required properties from the schema, like below:

Parse_Json Schema:

```json
{
    "type": "object",
    "properties": {
        "data": {
            "type": "array",
            "items": {
                "type": "object",
                "properties": {
                    "property1": {
                        "type": "string"
                    },
                    "property2": {
                        "type": "string"
                    },
                    "property3": {
                        "type": "string"
                    }
                }
            }
        },
        "extra data": {
            "type": "array",
            "items": {
                "type": "object",
                "properties": {
                    "edata1": {
                        "type": "string"
                    },
                    "edata2": {
                        "type": "string"
                    },
                    "edata3": {
                        "type": "string"
                    }
                }
            }
        }
    }
}
```
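With no `required` list, a payload that omits "extra data" still parses successfully. Purely as an illustration (plain Python mimicking the leniency of the schema above, not the Logic Apps runtime or a real JSON Schema validator):

```python
def accepts(payload: dict) -> bool:
    # Every top-level section that is present must be a list of objects;
    # absent sections are simply skipped, because no key is "required".
    for section in ("data", "extra data"):
        if section in payload:
            if not isinstance(payload[section], list):
                return False
            if not all(isinstance(item, dict) for item in payload[section]):
                return False
    return True

print(accepts({"data": [{"property1": "v1"}]}))
# → True: a missing "extra data" section is fine
```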

> send the data to the log table.


Output:

[screenshot of the run output]