I am trying to list all the tables in an MSSQL database and load them into a pandas DataFrame with the correct dtypes, but neither of the two approaches I have tried works as intended.
Approach 1
# Imports
import pandas as pd
from sqlalchemy import create_engine

# Creating Engine (the driver name must be URL-encoded, hence the '+' signs)
server = "myserver"
database = "mydatabase"
driver = "ODBC+Driver+13+for+SQL+Server"
cnx = create_engine(f"mssql+pyodbc://{server}/{database}?trusted_connection=yes&driver={driver}")

# Fetching the information_schema.tables view
# (engine.execute is SQLAlchemy 1.x style; 2.x needs a Connection and text())
cursor = cnx.execute("SELECT * FROM information_schema.tables;")
table_names_df = pd.DataFrame(cursor.fetchall(), columns=cursor.keys())
My problem here is that I do not get the column types, so everything is imported as object. Is there not some metadata that I can extract and use to set the dtypes?
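What I am picturing is something like the sketch below: read the declared types from INFORMATION_SCHEMA.COLUMNS and translate them into pandas dtypes. The table name dbo.mytable and the SQL-Server-to-pandas type map are just my own placeholders and guesses, not a complete solution:

# Sketch only: the type map below is a hand-rolled, incomplete guess
import pandas as pd
from sqlalchemy import create_engine

cnx = create_engine(
    "mssql+pyodbc://myserver/mydatabase"
    "?trusted_connection=yes&driver=ODBC+Driver+13+for+SQL+Server"
)

# Rough mapping from SQL Server type names to pandas dtypes
SQL_TO_PANDAS = {
    "int": "int64",
    "bigint": "int64",
    "smallint": "int64",
    "bit": "bool",
    "float": "float64",
    "real": "float64",
    "datetime": "datetime64[ns]",
    "datetime2": "datetime64[ns]",
}

# Read the declared type of every column once, then filter per table
columns_meta = pd.read_sql_query(
    "SELECT TABLE_SCHEMA, TABLE_NAME, COLUMN_NAME, DATA_TYPE"
    " FROM INFORMATION_SCHEMA.COLUMNS",
    cnx,
)

def dtypes_for(schema, table):
    # Build a {column: pandas dtype} dict for one table,
    # leaving any type I have not mapped as object
    meta = columns_meta[
        (columns_meta.TABLE_SCHEMA == schema)
        & (columns_meta.TABLE_NAME == table)
    ]
    return {
        row.COLUMN_NAME: SQL_TO_PANDAS[row.DATA_TYPE]
        for row in meta.itertuples()
        if row.DATA_TYPE in SQL_TO_PANDAS
    }

# Fetch the data as usual, then cast using the looked-up dtypes
df = pd.read_sql_query("SELECT * FROM dbo.mytable", cnx)
df = df.astype(dtypes_for("dbo", "mytable"))

Maintaining that type map by hand feels wrong, though, so I would hope there is a built-in way.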
Approach 2
Here I just do the import with pandas, since read_sql_table is supposed to translate the column types for us from the table metadata.
# Pandas does the magic!
pd.read_sql_table(table_name='tables',
                  schema='information_schema',
                  con=cnx)
Well, this approach somehow does not work for the information_schema views...
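The obvious fallback is pd.read_sql_query, which does execute against the view, though the dtypes it infers come from the returned values rather than the declared column types, so it does not really answer my question:

# Workaround: query the view directly instead of reflecting it as a table
import pandas as pd
tables_df = pd.read_sql_query("SELECT * FROM information_schema.tables", cnx)
print(tables_df.dtypes)  # still mostly object for the string columns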