We are in the process of migrating code from a SQL Server database to a Postgres 16 database.
We have a sample file called 'temprj.json', which is defined as follows:
temprj.json
{
"ChannelReadings": [
{
"ReadingsDto": [
{
"Si": 22.771737,
"Raw": 0,
"Conversion": 0,
"TimeStamp": "2023-01-24T12:57:43"
},
{
"Si": 22.734136,
"Raw": 0,
"Conversion": 0,
"TimeStamp": "2023-01-24T13:02:43"
},
{
"Si": 22.680228,
"Raw": 0,
"Conversion": 0,
"TimeStamp": "2023-01-24T13:07:43"
},
{
"Si": 22.619847,
"Raw": 0,
"Conversion": 0,
"TimeStamp": "2023-01-24T13:12:43"
},
{
"Si": 22.549753,
"Raw": 0,
"Conversion": 0,
"TimeStamp": "2023-01-24T13:17:43"
},
{
"Si": 22.486253,
"Raw": 0,
"Conversion": 0,
"TimeStamp": "2023-01-24T13:22:43"
},
{
"Si": 22.416321,
"Raw": 0,
"Conversion": 0,
"TimeStamp": "2023-01-24T13:27:43"
},
{
"Si": 22.339979,
"Raw": 0,
"Conversion": 0,
"TimeStamp": "2023-01-24T13:32:43"
},
{
"Si": 22.262115,
"Raw": 0,
"Conversion": 0,
"TimeStamp": "2023-01-24T13:37:43"
},
{
"Si": 22.197308,
"Raw": 0,
"Conversion": 0,
"TimeStamp": "2023-01-24T13:42:43"
},
{
"Si": 22.145514,
"Raw": 0,
"Conversion": 0,
"TimeStamp": "2023-01-24T13:47:43"
},
{
"Si": 22.105082,
"Raw": 0,
"Conversion": 0,
"TimeStamp": "2023-01-24T13:52:43"
},
{
"Si": 22.056599,
"Raw": 0,
"Conversion": 0,
"TimeStamp": "2023-01-24T13:57:43"
},
{
"Si": 22.0243,
"Raw": 0,
"Conversion": 0,
"TimeStamp": "2023-01-24T14:02:43"
},
{
"Si": 21.996859,
"Raw": 0,
"Conversion": 0,
"TimeStamp": "2023-01-24T14:07:43"
}
],
"ChannelId": 11
},
{
"ReadingsDto": [
{
"Si": 47.67,
"Raw": 0,
"Conversion": 0,
"TimeStamp": "2023-01-24T12:57:43"
},
{
"Si": 47.22,
"Raw": 0,
"Conversion": 0,
"TimeStamp": "2023-01-24T13:02:43"
},
{
"Si": 47.6,
"Raw": 0,
"Conversion": 0,
"TimeStamp": "2023-01-24T13:07:43"
},
{
"Si": 47.5,
"Raw": 0,
"Conversion": 0,
"TimeStamp": "2023-01-24T13:12:43"
},
{
"Si": 47.64,
"Raw": 0,
"Conversion": 0,
"TimeStamp": "2023-01-24T13:17:43"
},
{
"Si": 47.71,
"Raw": 0,
"Conversion": 0,
"TimeStamp": "2023-01-24T13:22:43"
},
{
"Si": 47.85,
"Raw": 0,
"Conversion": 0,
"TimeStamp": "2023-01-24T13:27:43"
},
{
"Si": 48.04,
"Raw": 0,
"Conversion": 0,
"TimeStamp": "2023-01-24T13:32:43"
},
{
"Si": 48.08,
"Raw": 0,
"Conversion": 0,
"TimeStamp": "2023-01-24T13:37:43"
},
{
"Si": 48.18,
"Raw": 0,
"Conversion": 0,
"TimeStamp": "2023-01-24T13:42:43"
},
{
"Si": 48.28,
"Raw": 0,
"Conversion": 0,
"TimeStamp": "2023-01-24T13:47:43"
},
{
"Si": 48.37,
"Raw": 0,
"Conversion": 0,
"TimeStamp": "2023-01-24T13:52:43"
},
{
"Si": 48.34,
"Raw": 0,
"Conversion": 0,
"TimeStamp": "2023-01-24T13:57:43"
},
{
"Si": 48.4,
"Raw": 0,
"Conversion": 0,
"TimeStamp": "2023-01-24T14:02:43"
},
{
"Si": 48.45,
"Raw": 0,
"Conversion": 0,
"TimeStamp": "2023-01-24T14:07:43"
}
],
"ChannelId": 14
},
{
"ReadingsDto": [
{
"Si": 11.088512570249659,
"Raw": 0,
"Conversion": 0,
"TimeStamp": "2023-01-24T12:57:43"
},
{
"Si": 10.911263312482156,
"Raw": 0,
"Conversion": 0,
"TimeStamp": "2023-01-24T13:02:43"
},
{
"Si": 10.982728529559426,
"Raw": 0,
"Conversion": 0,
"TimeStamp": "2023-01-24T13:07:43"
},
{
"Si": 10.89588156979372,
"Raw": 0,
"Conversion": 0,
"TimeStamp": "2023-01-24T13:12:43"
},
{
"Si": 10.876124721972,
"Raw": 0,
"Conversion": 0,
"TimeStamp": "2023-01-24T13:17:43"
},
{
"Si": 10.840173572226611,
"Raw": 0,
"Conversion": 0,
"TimeStamp": "2023-01-24T13:22:43"
},
{
"Si": 10.820314510971443,
"Raw": 0,
"Conversion": 0,
"TimeStamp": "2023-01-24T13:27:43"
},
{
"Si": 10.810096243819933,
"Raw": 0,
"Conversion": 0,
"TimeStamp": "2023-01-24T13:32:43"
},
{
"Si": 10.751362653255747,
"Raw": 0,
"Conversion": 0,
"TimeStamp": "2023-01-24T13:37:43"
},
{
"Si": 10.723286252634733,
"Raw": 0,
"Conversion": 0,
"TimeStamp": "2023-01-24T13:42:43"
},
{
"Si": 10.707037296936024,
"Raw": 0,
"Conversion": 0,
"TimeStamp": "2023-01-24T13:47:43"
},
{
"Si": 10.698006273047567,
"Raw": 0,
"Conversion": 0,
"TimeStamp": "2023-01-24T13:52:43"
},
{
"Si": 10.644282198132391,
"Raw": 0,
"Conversion": 0,
"TimeStamp": "2023-01-24T13:57:43"
},
{
"Si": 10.63333677772909,
"Raw": 0,
"Conversion": 0,
"TimeStamp": "2023-01-24T14:02:43"
},
{
"Si": 10.623712097975587,
"Raw": 0,
"Conversion": 0,
"TimeStamp": "2023-01-24T14:07:43"
}
],
"ChannelId": 17
},
{
"ReadingsDto": [
{
"Si": 2.893605,
"Raw": 0,
"Conversion": 0,
"TimeStamp": "2023-01-24T13:07:43"
}
],
"ChannelId": 12
},
{
"ReadingsDto": [
{
"Si": 3.294233,
"Raw": 0,
"Conversion": 0,
"TimeStamp": "2023-01-24T13:07:43"
}
],
"ChannelId": 13
},
{
"ReadingsDto": [
{
"Si": 3.294233,
"Raw": 0,
"Conversion": 0,
"TimeStamp": "2023-01-24T13:07:43"
}
],
"ChannelId": 16
}
],
"DeviceSerialNumber": "894339",
"RestartPointerNo": 5514732,
"NewDownloadTable": false,
"DataHashDto": "5Mckxoq42EeLHmLnimXv6A=="
}
The JSON in this file can be accessed directly using the following T-SQL code:
SELECT
d.DeviceSerialNumber,
c.channelid,
r.[Si],
r.[Raw],
r.[TimeStamp]
FROM OPENROWSET(BULK 'C:\ChrisDev\Readings\temprj.json',
SINGLE_CLOB) AS J
OUTER APPLY OPENJSON(BulkColumn)
WITH
(
DeviceSerialNumber NVARCHAR(10) '$.DeviceSerialNumber',
ChannelReadings NVARCHAR(MAX) '$.ChannelReadings' AS JSON
) d
OUTER APPLY OPENJSON(d.ChannelReadings)
WITH
(
ChannelId INT '$.ChannelId',
Readings NVARCHAR(MAX) '$.ReadingsDto' AS JSON
) c
OUTER APPLY OPENJSON(c.Readings)
WITH (
Si DECIMAL(10, 2) '$.Si',
[Raw] INT '$.Raw',
[TimeStamp] DATETIME '$.TimeStamp'
) r
Is there a Postgres equivalent of SELECT ... FROM OPENROWSET where I can directly reference the contents of my JSON file?
Or do I have to go the route of using the COPY command to load my JSON into a staging table and then extract the JSON directly from there?
What you are looking for is called a Foreign Data Wrapper (FDW). Specifically, you will want a file-based one that understands JSON. The wiki has a list of all available extensions (or at least those that people have recorded). There is a JSON file wrapper listed there, but I've never used it, so I can't say how easy it is to set up. Also, you are on Windows, so you probably don't have suitable developer tools to compile anything that needs compiling.
So, unless this is going to be a regular thing you want to automate, or the files in question are massive, I'd just import the whole file into a single jsonb value and extract what you want inside the database.
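A minimal sketch of that approach (the table name, column name, and file path are mine, not anything standard): load the document into a one-column jsonb staging table, then unnest the nested arrays with LATERAL jsonb_array_elements, which plays the role of the chained OPENJSON ... WITH calls in the T-SQL above:

```sql
-- Staging table holding the whole document as a single jsonb value
CREATE TABLE readings_staging (doc jsonb);

-- In psql you can read the file client-side into a variable and insert it,
-- which sidesteps server-side file permissions entirely
-- (on Windows, replace `cat ...` with `type ...`):
-- \set content `cat /path/to/temprj.json`
-- INSERT INTO readings_staging VALUES (:'content'::jsonb);

-- Equivalent of the T-SQL query: one row per reading per channel
SELECT d.doc ->> 'DeviceSerialNumber'      AS device_serial_number,
       (c.ch ->> 'ChannelId')::int         AS channel_id,
       (r.rd ->> 'Si')::numeric(10,2)      AS si,
       (r.rd ->> 'Raw')::int               AS raw,
       (r.rd ->> 'TimeStamp')::timestamp   AS time_stamp
FROM   readings_staging d
CROSS  JOIN LATERAL jsonb_array_elements(d.doc -> 'ChannelReadings') AS c(ch)
CROSS  JOIN LATERAL jsonb_array_elements(c.ch  -> 'ReadingsDto')     AS r(rd);
```

Note that plain COPY won't load a pretty-printed multi-line JSON file into a single row, which is why the psql variable trick (or minifying the file first) is used here.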
Note too that the PostgreSQL server process almost certainly won't have permission to read files from your user account (which is a good thing). If you want to access the files server-side, you'll need to put them somewhere more globally accessible on the filesystem first, or change their ownership.
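If you do place the file where the server can read it, pg_read_file lets you skip the staging table entirely. It requires superuser rights or membership in the pg_read_server_files role, and the path below is only illustrative:

```sql
-- Read the file server-side and cast it to jsonb in one step
-- (the postgres OS user must be able to read the file at this path)
WITH j AS (
    SELECT pg_read_file('/tmp/temprj.json')::jsonb AS doc
)
SELECT j.doc ->> 'DeviceSerialNumber'    AS device_serial_number,
       (c.ch ->> 'ChannelId')::int       AS channel_id,
       (r.rd ->> 'Si')::numeric          AS si,
       (r.rd ->> 'Raw')::int             AS raw,
       (r.rd ->> 'TimeStamp')::timestamp AS time_stamp
FROM   j
CROSS  JOIN LATERAL jsonb_array_elements(j.doc -> 'ChannelReadings') AS c(ch)
CROSS  JOIN LATERAL jsonb_array_elements(c.ch  -> 'ReadingsDto')     AS r(rd);
```

This is the closest one-statement analogue to OPENROWSET(BULK ...), but because of the permission requirements it is better suited to ad-hoc use than to application code.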