Is it always necessary to convert a Glue DynamicFrame to a Spark DataFrame before writing to Snowflake? I haven't found any other way anywhere. For 20 million records, this conversion takes most of the job's time, while the write itself takes only about 2 minutes.
Has anyone written a DynamicFrame directly to Snowflake? I also learned that AWS Glue doesn't support a JDBC connection to Snowflake, so I'm passing the connection details as job parameters.
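For context, here is a minimal sketch of the pattern described above: building the Spark Snowflake connector options from Glue job parameters and writing the DynamicFrame's underlying DataFrame. The parameter names (`sf_url`, `sf_user`, etc.) and the table name are hypothetical; the `net.snowflake.spark.snowflake` format string and the `sfURL`/`sfUser`/... option names come from the Snowflake Spark connector. Note that `toDF()` itself is a lazy transformation, so time attributed to "conversion" may actually be upstream work that Spark defers until the write action runs.

```python
def snowflake_options(args):
    """Map Glue job parameters (e.g. the dict returned by
    getResolvedOptions) to the option names the Spark Snowflake
    connector expects. Parameter names here are assumptions."""
    return {
        "sfURL": args["sf_url"],
        "sfUser": args["sf_user"],
        "sfPassword": args["sf_password"],
        "sfDatabase": args["sf_database"],
        "sfSchema": args["sf_schema"],
        "sfWarehouse": args["sf_warehouse"],
    }

# Inside the Glue job (not runnable outside a Glue environment):
#
# from awsglue.utils import getResolvedOptions
# import sys
#
# args = getResolvedOptions(sys.argv, [
#     "sf_url", "sf_user", "sf_password",
#     "sf_database", "sf_schema", "sf_warehouse",
# ])
# (dyf.toDF()                                   # lazy, not a data copy
#     .write.format("net.snowflake.spark.snowflake")
#     .options(**snowflake_options(args))
#     .option("dbtable", "MY_TABLE")            # hypothetical table
#     .mode("append")
#     .save())
```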