I can connect to SQLite from Google Colab by uploading the database file and running:
import sqlite3

con = sqlite3.connect('new.db')
cur = con.cursor()
cur.execute("SELECT * FROM page_log")  # page_log is a table in new.db
cur.fetchone()
This works because SQLite stores the whole database in a single file, which can simply be uploaded. For Postgres the situation is less clear; at least, I could not figure out how to do it. I have checked this and tried:
%sql postgres://postgres:[email protected]/postgres
but it did not work. The error message was:
Connection info needed in SQLAlchemy format, example: postgresql://username:[email protected]/dbname or an existing connection: dict_keys()
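In case it narrows things down, this first error only concerns the URL format: ipython-sql expects a SQLAlchemy-style URL, and recent SQLAlchemy versions accept only the `postgresql://` dialect name, not the older `postgres://` alias. A minimal sketch of building such a URL (the user, password, host, and database name below are hypothetical placeholders, not values from my actual setup):

```python
# Build a SQLAlchemy-style connection URL with the "postgresql" dialect.
# All credentials here are hypothetical placeholders.
user, password, host, port, dbname = "postgres", "1234", "localhost", 5432, "postgres"
url = f"postgresql://{user}:{password}@{host}:{port}/{dbname}"
print(url)
```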
I tried this as well:
from sqlalchemy import create_engine
from sqlalchemy.orm import scoped_session, sessionmaker

engine = create_engine("postgres://postgres:[email protected]:5432")
db = scoped_session(sessionmaker(bind=engine))
db.execute("SELECT * FROM page_log").fetchall()
OperationalError: (psycopg2.OperationalError) could not connect to server: Connection refused Is the server running on host "localhost" (::1) and accepting TCP/IP connections on port 5432?
Finally, this did not work either:
import psycopg2

db = psycopg2.connect(host='localhost', port=5432, user='postgres',
                      password='1234', dbname='postgres')
The error message was:
OperationalError: could not connect to server: Connection refused Is the server running on host "localhost" (::1) and accepting TCP/IP connections on port 5432?
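My understanding of this error (an assumption on my part) is that "Connection refused" means nothing is listening on port 5432 inside the Colab VM, since Colab machines do not come with a Postgres server running. If it helps, this is the kind of setup sketch I have seen suggested for running Postgres inside the VM itself, though I have not gotten it working end-to-end; in a Colab cell each line would be prefixed with `!`:

```shell
# Sketch: install and start a Postgres server inside the Colab VM so
# that "localhost:5432" has something listening. Password is the
# placeholder value used in my connection attempts above.
apt-get -qq update
apt-get -qq -y install postgresql
service postgresql start
sudo -u postgres psql -c "ALTER USER postgres PASSWORD '1234';"
```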
I could not find a solution. I'd appreciate your help!