vnijs closed this issue 5 years ago
@psychemedia should be able to help you, previous blog post on the topic https://blog.ouseful.info/2019/02/04/running-a-postgresql-server-in-a-mybinder-container/
Thanks for the link @betatim. I did see that blog post before but I'm not using binder and was hoping to see some more examples that I could use as a starting point to get a separate postgres (10) DB setup for each student on our (ubuntu 18.04) server. I didn't see an (easy) path to do that from the content of the blog post. I have a setup that works locally (see below) but there is obviously a problem because I cannot get the connection to work on the server running jupyterhub. If you (or @psychemedia) have any more suggestions or examples that would be very useful. Thanks
Error:
OperationalError: (psycopg2.OperationalError) could not connect to server: Connection refused
Is the server running on host "127.0.0.1" and accepting
TCP/IP connections on port 5432?
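A "Connection refused" on 127.0.0.1:5432 usually means nothing is listening on that address *inside the environment the notebook runs in* — in a JupyterHub-spawned container, 127.0.0.1 is the notebook container itself, not the host. A quick standard-library check (a diagnostic sketch, not part of the original setup):

```python
import socket

def port_open(host, port, timeout=2.0):
    """Return True if a TCP connection to (host, port) succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # covers ConnectionRefusedError and timeouts
        return False

# Is Postgres listening where the notebook expects it?
print(port_open("127.0.0.1", 5432))
```

If this prints `False` from inside the notebook, the problem is network placement (which container Postgres runs in), not the SQLAlchemy code.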
Start the docker container:
docker volume create --name=pg_data
docker run --rm -p 8888:8888 -p 5432:5432 \
-e NB_USER=0 -e NB_UID=1002 -e NB_GID=1002 \
-v ~:/home/jovyan \
-v pg_data:/var/lib/postgresql/10/main \
vnijs/rsm-jupyterhub:latest
Python code to connect; this works locally:
from sqlalchemy import create_engine
import pandas as pd
## connecting to the rsm-docker database
engine = create_engine('postgresql://jovyan:postgres@127.0.0.1:5432/rsm-docker')
engine.execute("CREATE TABLE IF NOT EXISTS films (title text, director text, year text)")
engine.execute("INSERT INTO films (title, director, year) VALUES ('Doctor Strange', 'Scott Derrickson', '2016')")
pd.read_sql_query('SELECT * FROM films', con=engine)
Source:
@vnijs This post embeds an example of how to use Docker Compose to link a Jupyter notebook to Postgres: docker-compose file, notebook fragment.
Docker Compose can create a network connecting two containers, letting each reach the other by name.
For example:
# Based on: https://github.com/khezen/compose-postgres
version: '3.5'
services:
  postgres:
    container_name: postgres_container
    image: postgres
    environment:
      POSTGRES_USER: ${POSTGRES_USER:-postgres}
      POSTGRES_PASSWORD: ${POSTGRES_PASSWORD:-changeme}
      PGDATA: /data/postgres
    volumes:
      - postgres:/data/postgres
    # ports:
    #   - "5432:5432"
    networks:
      - jnet
    restart: unless-stopped
  jupyter:
    container_name: jupyter_notebook_container
    # image: jupyter/minimal-notebook
    build:
      context: .
      dockerfile: Dockerfile-jupyter
    environment:
      JUPYTER_TOKEN: "$JUPYTER_TOKEN"
    ports:
      - "80:8888"
    volumes:
      # A file needs to be mounted against a file, not a directory
      - ./DatabaseConnections.ipynb:/home/jovyan/work/DatabaseConnections.ipynb
    networks:
      - jnet
    restart: unless-stopped
networks:
  jnet:
    driver: bridge
volumes:
  postgres:
and use the following connection string in a notebook in the Jupyter container: postgresql://postgres:changeme@postgres_container/postgres
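To make the naming point concrete, here is a small standard-library breakdown of that connection string (the values are the defaults from the compose file above). Note that the host part is the *container name*, which Docker's embedded DNS resolves on the shared `jnet` network — not 127.0.0.1, which from inside the Jupyter container would point back at the Jupyter container itself:

```python
from urllib.parse import urlsplit

url = "postgresql://postgres:changeme@postgres_container/postgres"
parts = urlsplit(url)
print(parts.hostname)           # postgres_container (resolved by Docker DNS)
print(parts.username)           # postgres
print(parts.path.lstrip("/"))   # database name: postgres
```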
Thanks @psychemedia. Have you seen anyone using an approach with postgres in the same docker container as jupyter (lab) and then managing docker spawning through jupyterhub? I have several approaches now that work well locally, but none (yet) where each student can connect to their own postgres instance on a server (with jupyterhub). I thought it might be similar to what you do with binder but perhaps not. Any suggestions?
@vnijs just because you aren't using Binder doesn't mean you can't use the tools that build the images for binder :) You can use http://repo2docker.readthedocs.io/ locally to build the image and then use that in your JupyterHub. repo2docker is what Binder uses so you can have all the same special files -> maybe try out Tony's blog post recipe?
I still don't have a full answer, but it is apparently possible to do this with jupyterhub. Our campus IT figured out how to make it work, but I don't have all the details unfortunately. Will close.
Do you have any examples of starting up postgres using a traitlet? Tried the below but no luck so far. Postgres is set up in the same container as jupyter (lab and hub).
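For the same-container setup, one hedged possibility (an untested sketch — the spawner class, image, and wrapper-script name below are assumptions, not a confirmed recipe) is to point DockerSpawner's command at a wrapper that starts Postgres before exec'ing the single-user server, so each spawned container gets its own Postgres instance:

```python
# jupyterhub_config.py -- a sketch, not a tested recipe. Assumes DockerSpawner
# and an image (e.g. vnijs/rsm-jupyterhub) that bundles Postgres with the
# single-user server; the wrapper-script path is hypothetical.
c.JupyterHub.spawner_class = "dockerspawner.DockerSpawner"
c.DockerSpawner.image = "vnijs/rsm-jupyterhub:latest"

# Override the container command so Postgres comes up before the notebook
# server. The (hypothetical) script would do roughly:
#   service postgresql start && exec jupyterhub-singleuser "$@"
c.DockerSpawner.cmd = ["/usr/local/bin/start-pg-then-singleuser.sh"]
```

Because the spawner launches one container per user, this would give every student an isolated Postgres without any port publishing on the host.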