For example, I have the following response returned from a GET service. I want to evaluate the URLs URL2.com and URL7.com further in the same process and store the results. What would be a good approach to do this? I know how to use UTL_HTTP to make a GET call, but in this case should I do a GET call for the 'parent' URL and then further GET calls for all of the required child requests?
{
  "items": [
    {
      "PersonId": 1,
      "PersonNumber": "2",
      "CorrespondenceLanguage": null,
      "BloodType": null,
      "DateOfBirth": "1970-03-11",
      "DateOfDeath": null,
      "CountryOfBirth": null,
      "RegionOfBirth": null,
      "TownOfBirth": null,
      "ApplicantNumber": null,
      "CreatedBy": "",
      "CreationDate": "2018-06-21T03:46:31.499+00:00",
      "LastUpdatedBy": "",
      "LastUpdateDate": "2021-05-25T10:52:05.751+00:00",
      "links": [
        {
          "rel": "self",
          "href": "URL1.com",
          "name": "workers",
          "kind": "item",
          "properties": {
            "changeIndicator": "X"
          }
        },
        {
          "rel": "canonical",
          "href": "URL2.com",
          "name": "workers",
          "kind": "item"
        },
        {
          "rel": "child",
          "href": "URL3.com",
          "name": "addresses",
          "kind": "collection"
        }
      ]
    },
    {
      "PersonId": 2,
      "PersonNumber": "1",
      "CorrespondenceLanguage": null,
      "BloodType": null,
      "DateOfBirth": "1969-07-04",
      "DateOfDeath": null,
      "CountryOfBirth": null,
      "RegionOfBirth": null,
      "TownOfBirth": null,
      "ApplicantNumber": null,
      "CreatedBy": "X.X",
      "CreationDate": "2020-03-06T14:05:09+00:00",
      "LastUpdatedBy": "",
      "LastUpdateDate": "2021-04-26T23:01:31.867+00:00",
      "links": [
        {
          "rel": "self",
          "href": "URL5.com",
          "name": "workers",
          "kind": "item",
          "properties": {
            "changeIndicator": "X"
          }
        },
        {
          "rel": "canonical",
          "href": "URL6.com",
          "name": "workers",
          "kind": "item"
        },
        {
          "rel": "child",
          "href": "URL7.com",
          "name": "addresses",
          "kind": "collection"
        }
      ]
    }
  ],
  "count": 25,
  "hasMore": true,
  "limit": 25,
  "offset": 0,
  "links": [
    {
      "rel": "self",
      "href": "URL8",
      "name": "workers",
      "kind": "collection"
    }
  ]
}
To evaluate URLs returned in a HATEOAS-style response using Oracle PL/SQL, a straightforward approach is exactly what you describe: one GET call for the parent URL, then further GET calls for the links you need. Concretely:

1. Make a GET call to the 'parent' (collection) URL with UTL_HTTP and read the response body into a CLOB.
2. Parse the JSON and pull the href values out of each item's "links" array, filtering on the rel (and/or name) attributes you care about, for example rel = 'canonical' or rel = 'child'. JSON_TABLE is a convenient way to do this.
3. Loop over the extracted URLs, issue a further GET call for each one, and store the result.
Here's a sketch of a PL/SQL script to get you started. It assumes a recent Oracle release (12.2 or later) for JSON_TABLE over a CLOB, a network ACL (and wallet, for HTTPS) configured for UTL_HTTP, and that the parent URL and the child_responses table are placeholders you would replace with your own:
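DECLARE
  -- Placeholder for the 'parent' collection URL (replace with your real endpoint)
  l_parent_url VARCHAR2(400) := 'https://host/hcmRestApi/resources/latest/workers';
  l_response   CLOB;
  l_child      CLOB;

  -- Helper: issue a GET request and return the whole response body as a CLOB
  FUNCTION http_get (p_url IN VARCHAR2) RETURN CLOB IS
    l_req    utl_http.req;
    l_resp   utl_http.resp;
    l_buffer VARCHAR2(32767);
    l_body   CLOB;
  BEGIN
    dbms_lob.createtemporary(l_body, TRUE);
    -- For HTTPS endpoints call utl_http.set_wallet first;
    -- add any authentication headers with utl_http.set_header as needed
    l_req  := utl_http.begin_request(p_url, 'GET', 'HTTP/1.1');
    l_resp := utl_http.get_response(l_req);
    BEGIN
      LOOP
        utl_http.read_text(l_resp, l_buffer, 32767);
        dbms_lob.writeappend(l_body, LENGTH(l_buffer), l_buffer);
      END LOOP;
    EXCEPTION
      WHEN utl_http.end_of_body THEN
        utl_http.end_response(l_resp);
    END;
    RETURN l_body;
  END http_get;
BEGIN
  -- 1. GET the parent collection URL
  l_response := http_get(l_parent_url);

  -- 2. Extract the hrefs of interest from each item's "links" array;
  --    adjust the WHERE clause to the rel/name values you actually need
  FOR r IN (
    SELECT jt.rel, jt.name, jt.href
    FROM   json_table(l_response, '$.items[*].links[*]'
             COLUMNS (
               rel  VARCHAR2(30)   PATH '$.rel',
               name VARCHAR2(100)  PATH '$.name',
               href VARCHAR2(1000) PATH '$.href'
             )) jt
    WHERE  jt.rel IN ('canonical', 'child')
  )
  LOOP
    -- 3. GET each extracted URL and store the result
    --    (child_responses is a placeholder table; see below)
    l_child := http_get(r.href);

    INSERT INTO child_responses (rel, name, href, response_body)
    VALUES (r.rel, r.name, r.href, l_child);

    dbms_lob.freetemporary(l_child);
  END LOOP;

  COMMIT;
END;
/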
This script parses the JSON response, extracts the URLs from the "links" arrays, and makes a further GET request for each URL you're interested in, storing the body in a table. You'll still need to add error handling (HTTP status checks, timeouts, closing the request on failure) and whatever further processing of the child responses your requirements call for. Note also that the sample response has "hasMore": true, so to process all workers you would repeat the parent call with an increased offset (or follow the paging links) until hasMore is false.
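The child_responses table referenced above is only a hypothetical staging table; a minimal definition, assuming you simply want to keep the raw response bodies as CLOBs, might look like this:

-- Hypothetical staging table for the fetched canonical/child responses
CREATE TABLE child_responses (
  id            NUMBER GENERATED ALWAYS AS IDENTITY PRIMARY KEY,
  rel           VARCHAR2(30),
  name          VARCHAR2(100),
  href          VARCHAR2(1000),
  response_body CLOB,
  fetched_at    TIMESTAMP DEFAULT SYSTIMESTAMP
);

Storing the raw CLOB keeps the load step simple; you can then parse the address details out of response_body with JSON_TABLE in a separate step, or replace the CLOB column with relational columns once you know exactly which fields you need.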