I'm trying to load tables from SAP Data Services to Snowflake through an S3 bucket (staging the files in S3 is required for bulk loading).
I can't get the output files in the S3 bucket formatted correctly. I have problems with line breaks (the rows are not separated into lines) and with dates (extra precision), and I'll probably have problems with commas if any text field contains one (the field separator is currently a comma).
I've seen the possibility of writing the file to the S3 bucket as JSON with a nested schema, but if I do that I don't know how to run the COPY INTO from Snowflake.
This project is a migration: I'm replacing the old database with Snowflake. The jobs in SAP DS are already created, and the idea is to change only the destination, not the information flow.
If someone could give me some help, that would be awesome. Thanks.
You can use a table with a single column of type VARIANT to load the JSON file.
Here is an example:
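A minimal sketch follows; the table, file format, and stage names, the bucket URL, and the credentials are all placeholders, so adjust them to your environment:

```sql
-- Target table: one VARIANT column holds each JSON record
CREATE OR REPLACE TABLE my_json_table (v VARIANT);

-- JSON file format; STRIP_OUTER_ARRAY splits a top-level array into one row per element
CREATE OR REPLACE FILE FORMAT my_json_format
  TYPE = 'JSON'
  STRIP_OUTER_ARRAY = TRUE;

-- External stage pointing at the S3 bucket that Data Services writes to
CREATE OR REPLACE STAGE my_s3_stage
  URL = 's3://my-bucket/exports/'
  CREDENTIALS = (AWS_KEY_ID = '<key>' AWS_SECRET_KEY = '<secret>')
  FILE_FORMAT = my_json_format;

-- Bulk load every staged file into the VARIANT column
COPY INTO my_json_table
  FROM @my_s3_stage
  FILE_FORMAT = (FORMAT_NAME = 'my_json_format');
```

After loading, nested attributes can be read with path notation and a cast, e.g. `v:customer.name::STRING`.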
For more information, have a look here.
You can also query a staged JSON file directly, as in the example below:
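For illustration, assume the stage above holds a hypothetical file `sample.json` containing `{"name": "John", "age": 30}`:

```sql
-- Query the staged file directly; $1 is the whole JSON record as a VARIANT
SELECT $1
FROM @my_s3_stage/sample.json (FILE_FORMAT => 'my_json_format');
```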
I get:
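(assuming the hypothetical `sample.json` above; the whole record comes back as one VARIANT value, with keys in alphabetical order)

```
$1
------------------------------
{ "age": 30, "name": "John" }
```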
I can also do:
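Continuing the same hypothetical example, individual fields can be extracted with path notation and cast to SQL types:

```sql
-- Pull out single attributes and cast them to relational columns
SELECT $1:name::STRING AS name,
       $1:age::NUMBER  AS age
FROM @my_s3_stage/sample.json (FILE_FORMAT => 'my_json_format');
```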
And I get:
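(again for the hypothetical `sample.json` above)

```
NAME | AGE
-----+----
John |  30
```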