I run the following task in Azure DevOps, and it always hangs waiting for input. Why isn't my bash script supplying the values automatically?
databricksUrl=https://...
databricksToken=*****
databricks configure --token << EOF
$(databricksUrl)
$(databricksToken)
EOF
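The hang is most likely because, in plain Bash, `$(databricksUrl)` is command substitution: it tries to *run* `databricksUrl` as a command rather than expand the variable, so the heredoc feeds nothing useful to the prompts. A minimal sketch of the corrected heredoc, using hypothetical placeholder values for the URL and token:

```shell
# Bash variable expansion uses $name, not $(name); $(name) would try to
# execute `name` as a command. Values below are hypothetical placeholders.
databricksUrl="https://example.azuredatabricks.net"
databricksToken="dapiXXXXXXXXXXXXXXXXXXXXXXXX"

# With an unquoted EOF delimiter, the variables expand inside the heredoc.
# The guard just lets this sketch run on machines without the CLI installed.
if command -v databricks >/dev/null; then
  databricks configure --token <<EOF
$databricksUrl
$databricksToken
EOF
fi
```

If these values come from Azure DevOps pipeline variables instead of Bash assignments, the `$(var)` macro syntax is replaced by the agent before the script runs, which is a separate mechanism from Bash expansion.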
Try the inline Bash script below to authenticate with Azure Databricks without variables:
databricks configure --token <<EOF
https://centralus.azuredatabricks.net
dapXXXXXXXXXXXXXXXXXXXXXX467
EOF
Or try the inline Bash script below to authenticate with Azure Databricks using variables (note `$adburl`, not `$(adburl)`):
adburl="https://centralus.azuredatabricks.net"
token=dapXXXXXXXXXXXXXXXXXXXXXXXXX467
databricks configure --token <<EOF
$adburl
$token
EOF
This authenticates with Azure Databricks successfully.
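As a further alternative worth verifying against the CLI docs: the legacy databricks-cli also reads its host and token from environment variables, which avoids the interactive configure step entirely. A sketch with hypothetical placeholder values:

```shell
# Assumption to verify: the legacy databricks-cli honors DATABRICKS_HOST and
# DATABRICKS_TOKEN from the environment, so no interactive configure is needed.
# Hypothetical placeholder values:
export DATABRICKS_HOST="https://example.azuredatabricks.net"
export DATABRICKS_TOKEN="dapiXXXXXXXXXXXXXXXXXXXXXXXX"

# Any subsequent CLI call would then authenticate via these variables, e.g.:
# databricks workspace ls /
```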
OR
You can use the DevOps for Azure Databricks extension.
This extension brings a set of tasks to operationalize the build, test, and deployment of Databricks Jobs and Notebooks.
Once the DevOps for Azure Databricks extension is installed, you can use the Configure Databricks CLI task directly by clicking Add tasks.
There are two solutions for Databricks CLI > 0.11.0:

1. Write ~/.databrickscfg (note: no dot before "cfg") directly, in the form:

echo "[DEFAULT]
host = $url
token = $token" > ~/.databrickscfg

2. Use --host and --token-file to specify the host and token:

echo $token > token-file
databricks configure --host $url --token-file token-file
rm -f token-file
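Solution 1 above can be sketched as follows; printf avoids echo's portability quirks with embedded newlines, and the URL and token values are hypothetical placeholders:

```shell
# Write the CLI config file directly (the legacy CLI reads ~/.databrickscfg).
# Hypothetical placeholder values:
url="https://example.azuredatabricks.net"
token="dapiXXXXXXXXXXXXXXXXXXXXXXXX"

printf '[DEFAULT]\nhost = %s\ntoken = %s\n' "$url" "$token" > "$HOME/.databrickscfg"
```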
I struggled with this for days. Two things I've seen: when you have your token file in place (in my Windows case %USERPROFILE%\.databrickscfg) and you execute

databricks configure --host https://centralus.azuredatabricks.net --token-file .databrickscfg

(replacing https://centralus.azuredatabricks.net with your actual URL):

1. It doesn't give a success status message.
2. It actually changes the contents of your token file: it replaces token = dapi.... with token = [DEFAULT].

I'm deploying my solution in Azure Batch on remote nodes with a Start Task. So what I had to do (using application packages) was zip the .databrickscfg file and let Batch install it on the nodes, then run

databricks configure --host https://centralus.azuredatabricks.net --token-file .databrickscfg

then xcopy /Y the .databrickscfg file from the %AZ_BATCH_APP_PACKAGE_% location to the named user's working directory %USERPROFILE%. Then run any databricks commands.
I have the same problem. My release pipeline worked fine in the past, and now it gets stuck at the authentication step even though no changes were made.

UPDATE: The problem was caused by the new version of databricks-cli (0.12.0). Using version 0.11.0 solves the problem:
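A pinned install step in an Azure DevOps YAML pipeline might look like the following sketch (the displayName and step placement are assumptions, not from the original answer):

```yaml
steps:
  - script: pip install databricks-cli==0.11.0
    displayName: 'Pin databricks-cli to 0.11.0'
```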