I've got some JSON data coming into an IoT Hub, which then triggers a function to un-nest the data.

The function sends this data to an Event Hub, and then the data is supposed to be ingested by Azure Data Explorer according to the mapping I've set up.

The problem is that no data makes it to Data Explorer; the only way it will receive data with a mapping is when the source is an Event Hub that receives its data through custom routing.

Is it possible to ingest data into Data Explorer by way of IoT Hub -> function -> Event Hub?

EDIT:

The function being used to un-nest the data and forward it to another event hub:

module.exports = async function (context, eventHubMessages) {

    // Collect all un-nested objects first; assigning to the output binding
    // inside the loop would overwrite it on every iteration, so only the
    // last object would actually be sent.
    var outputMessages = [];

    // receive messages from IoT Hub
    eventHubMessages.forEach((message, index) => {
        var devicename = message.deviceName;
        // the timestamp arrives under two different spellings; pick whichever is set
        var timestamp = (message.timestamp == null) ? message.timeStamp : message.timestamp;
        //context.log("Message: " + JSON.stringify(message));
        if (message.tags != null) {
            message.tags.forEach((tag, index) => {
                // for each tag, create a new flat object
                var newObject = {
                                 "name": tag.Name,
                                 "value": tag.Value,
                                 "eventenqueuedutctime": timestamp,
                                 "devicename": devicename
                                };
                outputMessages.push(newObject);
                context.log("Queued object: " + JSON.stringify(newObject));
            });
        }
    });

    // output all message objects to the 'splitmessage-dev' event hub in one batch
    context.bindings.outputEventHubMessage = outputMessages;
};

I can confirm the other event hub is receiving this data (checked with another function that prints the incoming messages).

The mapping looks like this:

'testTableMap' '[{"column":"name", "path":"$.name"}, 
{"column":"value", "path":"$.value"}, 
{"column":"eventenqueuedutctime", "path":"$.eventenqueuedutctime"},
{"column":"devicename", "path":"$.devicename"}]'
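As a quick local sanity check, every `path` in the mapping must resolve to a key in the JSON objects the function emits. A minimal sketch (the sample event values here are made up for illustration):

```javascript
// Sanity check: every "path" in the ingestion mapping must resolve to a
// top-level key in the JSON objects the function emits. The paths here are
// all simple "$.field" references, so a plain property lookup is enough.
const mapping = [
    { column: "name", path: "$.name" },
    { column: "value", path: "$.value" },
    { column: "eventenqueuedutctime", path: "$.eventenqueuedutctime" },
    { column: "devicename", path: "$.devicename" }
];

// Made-up sample of what one un-nested event might look like.
const sampleEvent = {
    name: "temperature",
    value: "21.5",
    eventenqueuedutctime: "2019-05-21T08:00:00Z",
    devicename: "device-01"
};

const missing = mapping
    .map((m) => m.path.replace("$.", ""))
    .filter((field) => !(field in sampleEvent));

console.log(missing.length === 0 ? "mapping matches" : "missing: " + missing.join(", "));
```

If this prints missing fields, the ingestion would produce nulls (or failures) for those columns even though the pipeline itself is wired up correctly.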

1 Answer

Answer by Daniel Dubovski:

It should be possible to ingest data with the setup you provided.

Since data isn't being ingested, you should probably try and diagnose where it gets stuck.

I'll provide some general guidelines that should help with debugging.

Data doesn't reach the EventHub that is configured as the Data Source for ADX

You can check the EventHub monitoring to verify that events are flowing. I would also recommend reading the data from the EventHub directly, just to make sure it looks like you would expect.

This doesn't seem like the case here, as you have clarified that you do see events flowing into EventHub and you can read them successfully via another function.

Table / Mapping aren't configured properly

Try and ingest the data from the configured EventHub manually

// create table 
// I am assuming strings because other types can cause format errors, 
// so I would try strings before other specific types (dynamic, datetime)
.create table IngestionTest (name: string, value: string, eventenqueuedutctime: string, devicename: string)

// add mapping
.create table IngestionTest ingestion json mapping 'testTableMap' '[{"column":"name", "path":"$.name"}, {"column":"value", "path":"$.value"}, {"column":"eventenqueuedutctime", "path":"$.eventenqueuedutctime"},{"column":"devicename", "path":"$.devicename"}]'

// ingest data manually - replace with your actual data
.ingest inline into table IngestionTest with (jsonMappingReference='testTableMap') <| '{ "name" : "test", "value": { "some":"nested", "value": true}, "eventenqueuedutctime" : "2016-01-14", "devicename": "surface" }'

If the above doesn't work, you can try to understand why by listing the ingestion errors:

.show ingestion failures

If the above worked, you can try and play around with other data types.

Data Source monitoring

One more thing you can check via the Azure Portal is the metrics for the ADX cluster.

Try going to your cluster in the Azure Portal; in the Metrics tab you can pick two metrics that can help you troubleshoot:

Events Processed - Gives you a sense of how many events ADX managed to read from the EventHub. This basically lets you know that the Data Source is configured properly. If you see no events, I would suggest setting up a new source.

Ingestion Result - Gives you a count of ingestion statuses (success / failure), which can also help diagnose failures.