ODBC driver not working - Azure Databricks Unity Catalog enabled


I am trying to connect to SQL Server from Databricks using pyodbc and am facing the issue below:

import pyodbc
sql_server_url='server_url'
sqlDB='db-dev'
SQL_DB_USER='user'
SQL_DB_PASS='123'
conn = pyodbc.connect('DRIVER={ODBC Driver 17 for SQL Server};SERVER=' + sql_server_url +';DATABASE=' + sqlDB + ';UID='+SQL_DB_USER+';PWD='+SQL_DB_PASS)

Error: ('01000', "[01000] [unixODBC][Driver Manager]Can't open lib 'ODBC Driver 17 for SQL Server' : file not found (0) (SQLDriverConnect)")
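This error means unixODBC could not find a driver registered under the name in the DRIVER= clause, not that the connection details are wrong. As a side note, the connection string can be assembled from its parts, which makes the driver-name dependency explicit; a minimal sketch (function and variable names are illustrative, not from pyodbc):

```python
# Build a pyodbc connection string from parts. The DRIVER value must exactly
# match a driver name registered with unixODBC in odbcinst.ini, otherwise
# SQLDriverConnect fails with "Can't open lib ... file not found".
def build_conn_str(server, database, user, password,
                   driver="ODBC Driver 17 for SQL Server"):
    parts = {
        "DRIVER": "{" + driver + "}",
        "SERVER": server,
        "DATABASE": database,
        "UID": user,
        "PWD": password,
    }
    return ";".join(f"{key}={value}" for key, value in parts.items())
```

The string this produces is equivalent to the concatenation in the question, so the fix has to be installing/registering the driver rather than changing the string.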

So I tried running the bash commands below:

%sh
curl https://packages.microsoft.com/keys/microsoft.asc | apt-key add -
curl https://packages.microsoft.com/config/ubuntu/16.04/prod.list > /etc/apt/sources.list.d/mssql-release.list
sudo apt-get update
sudo ACCEPT_EULA=Y apt-get -q -y install msodbcsql17

E: This command can only be used by root.
/bin/bash: line 1: /etc/apt/sources.list.d/mssql-release.list: Permission denied
sudo: a terminal is required to read the password; either use the -S option to read from standard input or configure an askpass helper
sudo: a terminal is required to read the password; either use the -S option to read from standard input or configure an askpass helper

The same code works on a standard Databricks cluster but not with Unity Catalog enabled on runtime version 11.3. I have super admin access, yet when I run this command it says permission denied.

Can anyone suggest how to install these ODBC drivers?


There are 3 answers

2
Pratik Lad On

The issue might be the improper use of sudo in the command. I tried the code below, and it installed successfully for me.

Sample Code:

%sh
curl https://packages.microsoft.com/keys/microsoft.asc | apt-key add -
curl https://packages.microsoft.com/config/ubuntu/16.04/prod.list > /etc/apt/sources.list.d/mssql-release.list
apt-get update
ACCEPT_EULA=Y apt-get install msodbcsql17


Also refer to this similar thread, which addresses the same issue.

0
Noble Gaels On

I just ran into this issue myself. What was interesting was when I created the cluster via UI and ran my init script which installed MSODBC, it worked fine. When I created the cluster through our DevOps release via API, the init script failed with this issue.

This led me to believe permissions were at play within UC. I found that an init script on a UC-enabled cluster runs with the permissions of the user who CREATED the cluster. Packages, on the other hand (whl, Maven, etc.), use the permissions of the user who is RUNNING the cluster.

Since I was creating the cluster via API in a release, the cluster was getting created using a service principal. I gave that service principal read volume/execute/modify permissions on the schema containing the volume which holds the init script, and use catalog/use schema/modify/execute on the catalog. After assigning these permissions to the service principal that creates the cluster via API, the init script ran successfully.
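The grants described above can be scripted. A hedged sketch that generates Unity Catalog GRANT statements for the cluster-creating service principal (the catalog/schema/volume names are placeholders, and the exact privilege set you need may differ from this minimal one — adjust to your metastore):

```python
def init_script_grants(catalog, schema, volume, principal):
    """Generate UC GRANT statements letting `principal` (the identity that
    creates the cluster) read the volume that holds the init script."""
    fq_schema = f"{catalog}.{schema}"
    fq_volume = f"{fq_schema}.{volume}"
    return [
        f"GRANT USE CATALOG ON CATALOG {catalog} TO `{principal}`",
        f"GRANT USE SCHEMA ON SCHEMA {fq_schema} TO `{principal}`",
        f"GRANT READ VOLUME ON VOLUME {fq_volume} TO `{principal}`",
    ]
```

Each statement can then be executed from a notebook, e.g. via `spark.sql(stmt)`.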

0
Aneesh On

Cluster permissions work differently when you are using a Shared access mode cluster in a Unity Catalog-enabled workspace.

The pyodbc install .sh script works normally (you can install the pyodbc driver from a notebook using a .sh script) on a cluster with Access mode: Single User.

It has issues with Access mode: Shared. We cannot install the pyodbc .sh from a notebook.

Because shared access mode is configured for multiple users, permission to run root commands is restricted.

To install anything that needs root, use an init script instead. The commands in an init script run as the root user.
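A minimal init script for this case might look like the following (a sketch: since init scripts already run as root, no sudo is needed; the Ubuntu version in the repo path is an assumption — match it to your runtime's base image):

```shell
#!/bin/bash
# Cluster init script: install the Microsoft ODBC driver for SQL Server.
set -e
curl -fsSL https://packages.microsoft.com/keys/microsoft.asc | apt-key add -
curl -fsSL https://packages.microsoft.com/config/ubuntu/20.04/prod.list \
  > /etc/apt/sources.list.d/mssql-release.list
apt-get update
ACCEPT_EULA=Y apt-get -q -y install msodbcsql17
```
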

(Image: cluster init script location)

If you are running an init script, you need to allow that init script location in Catalog Explorer, as shown in the image below.

(Image: metastore allowed init script location)

Create a text file on your local machine with the .sh extension to serve as the init script, then import the file from local to the volume/abfss location.

When you create the .sh file, please make sure it is Unix-compatible. Please use the below link for reference.