jq - create empty array and add objects to it


I am working on a bash script (using jq for JSON parsing) that needs to:

  1. Make multiple curl calls (each response has the same structure but different values), apply some logic/filters, and then collate all the responses into one final JSON array of objects.
  2. Loop through this final JSON array and write it to a CSV in a predefined format.

I have searched for both requirements but could not find anything concrete. Please advise. The highlighted steps below (in ***) are the points where I need help.

Sample Flow :

create empty FINAL array

for (eachService in serviceList)
       a. CURL <service_response> returning JSON array of objects
       b. use jq filters to parse JSON response, apply some logic and modify elements in response as needed
       c. ***add this JSON array to FINAL array***

***LOOP through the FINAL array, one object at a time, and write each to CSV.***
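The accumulation step above can be sketched in bash, with the curl calls stubbed out as hard-coded JSON for illustration (a real script would substitute `curl "$service"` for the literals):

```shell
#!/usr/bin/env bash
# Accumulate each (simulated) service response into one FINAL array.
final='[]'
for response in '[{"id":"123","calls":4}]' '[{"id":"456","calls":22}]'; do
    # Step c: append this response array to the FINAL array.
    final=$(jq -n --argjson a "$final" --argjson b "$response" '$a + $b')
done
echo "$final" | jq -c .
```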

Sample Data :

CURL Response 1 (ex: $curl1):
[
  {
    "id":"123",
    "startDate": "2016-12-09T00:00:00Z",
    "calls":4
  },
  {
    "id":"456",
    "startDate": "2016-12-09T00:00:00Z",
    "calls":22
  }
]

CURL Response 2 (ex: $curl2):
[
  {
    "id":"789",
    "startDate": "2016-12-09T00:00:00Z",
    "calls":8
  },
  {
    "id":"147",
    "startDate": "2016-12-09T00:00:00Z",
    "calls":10
  }
]

NEEDED OUTPUT ($final): 
[
  {
    "id":"123",
    "startDate": "2016-12-09T00:00:00Z",
    "calls":4
  },
  {
    "id":"456",
    "startDate": "2016-12-09T00:00:00Z",
    "calls":22
  },
  {
    "id":"789",
    "startDate": "2016-12-09T00:00:00Z",
    "calls":8
  },
  {
    "id":"147",
    "startDate": "2016-12-09T00:00:00Z",
    "calls":10
  }
]
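If the two responses are captured as separate JSON documents, the combined array above can be produced by slurping both inputs and flattening with `add` (shown here with trimmed sample objects):

```shell
# -s wraps all inputs in one array: [[...],[...]]; add concatenates them.
printf '%s\n' '[{"id":"123"}]' '[{"id":"789"}]' | jq -cs add
```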

There are 3 answers

Answer by hek2mgl (accepted):

jq can deal with multiple input arrays. You can pipe the whole output of the loop to it:

for service in "${services[@]}" ; do    # assumes services is a bash array
    curl "$service/path"
done | jq -r '.[] | [.id, .startDate, .calls] | @csv'

Note that the CSV transformation is handled by jq's @csv filter.
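For example, applied to one object from the first sample response, this filter produces one CSV row per object (string fields are quoted by @csv, numbers are not):

```shell
echo '[{"id":"123","startDate":"2016-12-09T00:00:00Z","calls":4}]' \
  | jq -r '.[] | [.id, .startDate, .calls] | @csv'
```

which prints `"123","2016-12-09T00:00:00Z",4`.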

Answer by janos:

You don't need a "final" array. You can process the individual JSON output of each curl call, and parse each with the same jq script, converting JSON input to CSV output. Something along the lines of:

for url; do
    curl "$url" | jq 'filter-and-extract-csv-columns'
done > output.csv

Notice the redirection of the entire loop to output.csv.

This kind of streamlined processing is possible because the CSV format is flat, without a surrounding context as in XML or JSON: the output of multiple computations can simply be concatenated.
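A concrete version of that loop, with the placeholder filter filled in by a hypothetical column extraction and each curl call simulated by an echo of sample data:

```shell
# Each iteration emits CSV rows directly; the loop's combined output
# is valid CSV because rows can simply be concatenated.
for payload in '[{"id":"789","calls":8}]' '[{"id":"147","calls":10}]'; do
    echo "$payload" | jq -r '.[] | [.id, .calls] | @csv'
done
```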

Answer by peak:

As @hek2mgl pointed out, it should be possible to invoke jq just once. If the input is sufficiently uniform (admittedly, maybe a big "if"), you could even avoid having to name the fields explicitly, e.g.:

$ for service in "${services[@]}" ; do    # assumes services is a bash array
    curl "$service/path"
  done | jq -sr 'add[] | [.[]] | @csv'

Output:

"123","2016-12-09T00:00:00Z",4
"456","2016-12-09T00:00:00Z",22
"789","2016-12-09T00:00:00Z",8
"147","2016-12-09T00:00:00Z",10

Note that using -s allows you to perform arbitrary computations on all the inputs, e.g. counting them.
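For instance, counting all objects across several input arrays (a small illustration, using inline JSON in place of the curl output):

```shell
# -s slurps every input array into one array of arrays;
# add flattens them, and length counts the combined objects.
printf '%s\n' '[{"id":"123"},{"id":"456"}]' '[{"id":"789"}]' \
  | jq -s 'add | length'
```

which prints `3`.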