I'm trying to do some data exploration on a dataset I have in BigQuery. The table I want to import is 11 million rows. Here are the script and the output:
# bigrquery is the package that provides query_exec()
library(bigrquery)

# Creating a variable for our BQ project space
project_id <- 'project space'

# Query (actual SQL omitted)
Step1 <-
"
insertquery
"

# Executing the query from the variable above
Step1_df <- query_exec(Step1, project = project_id, use_legacy_sql = FALSE, max_pages = Inf, page_size = 99000)
Error:
Error in curl::curl_fetch_memory(url, handle = handle) :
Operation was aborted by an application callback
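From what I can tell, query_exec() comes from bigrquery's old pre-1.0 interface, which is now deprecated, so one thing I was planning to try is the current API, which I understand downloads large results in chunks. A rough sketch of what I have in mind (the query is still a placeholder, and I'm assuming the default page size is sensible):

# Sketch using the current bigrquery interface; 'insertquery' is a placeholder
library(bigrquery)

bq_auth()  # interactive OAuth; a service-account token should also work

# Run the query server-side; this returns a handle to the result table,
# not the data itself
tb <- bq_project_query(project_id, "insertquery")

# Download the result; leaving page_size unset lets bigrquery pick a
# chunk size, and n_max = Inf pulls all 11M rows
Step1_df <- bq_table_download(tb, n_max = Inf)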
Is there a different BigQuery library I can use? I'm also looking to speed up the download time.
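I also saw that bigrquery ships a DBI backend, so if switching interfaces counts, something like this is what I had in mind (again just a sketch; 'project space' and 'insertquery' are the same placeholders as above):

# Sketch using bigrquery's DBI backend
library(DBI)
library(bigrquery)

con <- dbConnect(
  bigrquery::bigquery(),
  project = "project space",   # project that owns the data
  billing = "project space"    # project billed for the query
)

Step1_df <- dbGetQuery(con, "insertquery", n = Inf)
dbDisconnect(con)

Would either of these be noticeably faster for a table this size, or is there a better option entirely?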