I put the following file in a local Databricks volume named livelandingtest:
/Volumes/seary_test/default/livelandingtest/TIOBE/TIOBE-2018-09.csv
I also have this code that uses Auto Loader to ingest any files inside the livelandingtest/TIOBE directory:
-- https://www.databricks.com/discover/pages/getting-started-with-delta-live-tables
CREATE LIVE TABLE TIOBE_Live_Test_Bronze
COMMENT "Auto Loader Test"
TBLPROPERTIES ("quality" = "bronze")
AS
SELECT * FROM cloud_files(
  "dbfs:/Volumes/seary_test/default/livelandingtest/TIOBE/*",
  "csv",
  map("cloudFiles.inferColumnTypes", "true")
);
When I run the above code as a Delta Live Tables Workflow, I get this error:
java.lang.IllegalArgumentException: Cannot access the UC Volume path from this location. Path was /Volumes/seary_test/default/livelandingtest/TIOBE/
What syntax should I use in my FROM clause to access the CSV files in my local volume /Volumes/seary_test/default/livelandingtest?
The problem wasn't necessarily the path. It turned out I hadn't chosen the right settings when creating my Delta Live Tables (DLT) pipeline. Once I selected my seary_test catalog under Destination storage options and added the STREAMING keyword to the CREATE statement in the code above, my DLT pipeline was able to start up.
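For reference, this is the working statement, assuming nothing else changed besides adding STREAMING to the CREATE line:
-- Same query as above, with CREATE STREAMING LIVE TABLE instead of CREATE LIVE TABLE
CREATE STREAMING LIVE TABLE TIOBE_Live_Test_Bronze
COMMENT "Auto Loader Test"
TBLPROPERTIES ("quality" = "bronze")
AS
SELECT * FROM cloud_files(
  "dbfs:/Volumes/seary_test/default/livelandingtest/TIOBE/*",
  "csv",
  map("cloudFiles.inferColumnTypes", "true")
);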
Since I'm new to DLT, I'm not completely sure why the STREAMING keyword was needed; I guess it has something to do with my use of the cloud_files function, which reads files as a streaming source. However, the key factor for accessing the /Volumes/seary_test/default/livelandingtest/TIOBE/* path seems to have been selecting my seary_test catalog under the Destination section when creating the pipeline via the web GUI.