I will add comments on this issue over time to record what setup has been done.
This can be collected into a Kubernetes configuration for Postgres at some point, once we have that running in our production environment.
- create postgres user airflow_user
I used the [Airflow docs](https://airflow.apache.org/docs/apache-airflow/stable/howto/set-up-database.html#setting-up-a-postgresql-database) as a guide:
```sql
CREATE DATABASE airflow_db;
CREATE USER airflow_user WITH PASSWORD 'airflow_pass';
GRANT ALL PRIVILEGES ON DATABASE airflow_db TO airflow_user;
ALTER ROLE airflow_user SET search_path = airflow;
```
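One thing to double-check: the `ALTER ROLE` above points `search_path` at a schema named `airflow`, but nothing here creates that schema (the Airflow docs use `public` in this step). If a dedicated schema is intended, it would need to be created first; a minimal sketch, assuming the schema name `airflow`:

```sql
-- Assumption: a dedicated "airflow" schema is intended
-- (otherwise keep search_path = public as in the Airflow docs).
-- Run while connected to airflow_db.
CREATE SCHEMA IF NOT EXISTS airflow AUTHORIZATION airflow_user;
```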
- allow other servers to connect by adding a rule to /etc/postgresql/10/main/pg_hba.conf
My first attempt is this line:
```
host    airflow_db    airflow_user    192.168.86.12/30    scram-sha-256
```
It may be possible to tighten this to a local-only connection later.
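Note that remote connections also require Postgres to listen on the network interface; by default it only listens on localhost. A minimal sketch, assuming a Debian-style install and a hypothetical server address of 192.168.86.10:

```
# /etc/postgresql/10/main/postgresql.conf
listen_addresses = 'localhost,192.168.86.10'   # default is 'localhost' only
```

pg_hba.conf changes take effect with `sudo systemctl reload postgresql`; changing `listen_addresses` needs a full restart.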
Pending Questions
How do we configure the SQLAlchemy connection string that Airflow uses so that the user and password are not stored in-line?
Possible answer
Which parts of Airflow connect to Postgres? Do we need to allow connections from non-local workers? If all connections can stay local, we can add a Linux user matching airflow_user and have Postgres trust that the Unix domain socket can verify the identity. That would avoid having to put a password into the connection string.
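A minimal sketch of that approach, assuming the scheduler and webserver run on the database host as a Linux user named airflow_user (peer auth requires the OS username to match the role, or a pg_ident.conf mapping):

```
# /etc/postgresql/10/main/pg_hba.conf: local socket connections use peer auth
local   airflow_db    airflow_user    peer

# airflow.cfg ([database] section in Airflow 2.3+, [core] in older releases); no password in the URL
sql_alchemy_conn = postgresql+psycopg2://airflow_user@/airflow_db?host=/var/run/postgresql
```

If remote workers do turn out to need direct database access, Airflow's `sql_alchemy_conn_cmd` option can read the connection string from a command instead of keeping the password in airflow.cfg.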