CNLearn FastAPI 2 - App Settings and PostgreSQL Connection, Alembic Migrations and Authentication (Maybe)
So what will we do today? -> “Use Pydantic for the app settings, connect to the PostgreSQL database (asynchronously and synchronously), use Alembic, and deal with authentication. Yes, we’ll do authentication before we create the words/characters table.” Yes, I quoted myself from the last post. Actually, the Alembic migrations and authentication will have to wait until the next post. Too much otherwise.
App Settings
So we are going to use Pydantic for settings management. We will first create a settings folder in the app directory. In there, let’s create a base.py file with the following content:
import secrets
from typing import Any, Dict, List, Optional

from pydantic import AnyHttpUrl, BaseSettings, PostgresDsn, validator


class Settings(BaseSettings):
    # Application settings
    SECRET_KEY: str = secrets.token_urlsafe(32)
    API_V1_STR: str = "/api/v1"
    SERVER_NAME: str
    SERVER_HOST: AnyHttpUrl
    BACKEND_CORS_ORIGINS: List[AnyHttpUrl] = []
    APP_NAME: str
    VERSION: str

    # PostgreSQL database settings
    POSTGRES_USER: str
    POSTGRES_PASSWORD: str
    POSTGRES_SERVER: str
    POSTGRES_DB: str
    POSTGRES_PORT: int
    SQLALCHEMY_POSTGRES_URI: Optional[PostgresDsn] = None

    # if the last one is None, let's build it ourselves
    @validator('SQLALCHEMY_POSTGRES_URI', pre=True)
    def create_postgres_uri(cls, v: Optional[str], values: Dict[str, Any]) -> Any:
        if isinstance(v, str):
            return v
        return PostgresDsn.build(
            scheme="postgresql",
            user=values.get("POSTGRES_USER"),
            password=values.get("POSTGRES_PASSWORD"),
            host=values.get("POSTGRES_SERVER"),
            path=f"/{values.get('POSTGRES_DB') or ''}",
            port=f"{values.get('POSTGRES_PORT')}",
        )

    class Config:
        case_sensitive = True


class DevSettings(Settings):
    class Config:
        env_file = '.dev.env'
        env_file_encoding = 'utf-8'


class StageSettings(Settings):
    class Config:
        env_file = '.stage.env'
        env_file_encoding = 'utf-8'


class ProdSettings(Settings):
    class Config:
        env_file = '.prod.env'
        env_file_encoding = 'utf-8'


settings = DevSettings()
We will add to it in just a little while, but let’s first see what we have. We have a base Settings class that all the other ones will inherit from. In there, we have a SECRET_KEY generated by the Python secrets module, an API_V1_STR which is the prefix that will be added to our API, a SERVER_NAME string, a SERVER_HOST HTTP URL and a list of BACKEND_CORS_ORIGINS. We also have an APP_NAME string and a VERSION string.

The more interesting settings are the database settings. We have the expected strings: user, password, server, db and port, but also an optional SQLALCHEMY_POSTGRES_URI. If the user passes in that last field, it will be used as is. If not, the Pydantic validator will build it. Please keep in mind that any validator is a class method, so the first argument is cls. The second argument is the field value we are validating -> leave it as v. The third argument is a values dictionary containing any previously-validated fields (e.g. POSTGRES_USER, POSTGRES_PASSWORD, POSTGRES_SERVER, etc.). The @validator decorator is applied to the ‘SQLALCHEMY_POSTGRES_URI’ field with a pre=True keyword argument, which causes it to be called before any other validation. So what does the validator function do in this case? Well, if v (here v refers to the value of SQLALCHEMY_POSTGRES_URI) is a string (i.e. it was passed by the user), it simply returns that. Otherwise, it builds a PostgresDsn field (listed, but not described, here). To gain a better understanding of it, let’s look at the Pydantic source code for it here. We have the following:
class PostgresDsn(AnyUrl):
    allowed_schemes = {'postgres', 'postgresql'}
    user_required = True
Ok, so it has a set of allowed schemes (for the URI string). This part is actually slightly annoying at the moment because it means only the default psycopg2 driver currently works. The reason it’s annoying is that I will be using an async driver for SQLAlchemy to connect to PostgreSQL. I say annoying, but really it’s a minor inconvenience. I went to the project on GitHub and did a quick search, and it seems someone asked the same thing a few weeks ago; there are a few discussions and pull requests linked to this (1, 2, 3, 4). If there weren’t, I would have volunteered to do it :) open source is great everyone!!! But anyway, you’ll see my “fix” in a bit. Just keep in mind that in a few weeks or months I will change the settings slightly.
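To make the limitation concrete, here’s a quick check (a hypothetical snippet of mine, assuming pydantic v1’s PostgresDsn) showing an async driver scheme being rejected:

from pydantic import BaseModel, PostgresDsn, ValidationError


class DatabaseConfig(BaseModel):
    dsn: PostgresDsn


try:
    # the async driver scheme is not in allowed_schemes, so this fails
    DatabaseConfig(dsn="postgresql+asyncpg://postgres:postgres@localhost:5432/postgres")
except ValidationError as error:
    print(error)  # URL scheme not permitted

# the plain scheme validates fine
print(DatabaseConfig(dsn="postgresql://postgres:postgres@localhost:5432/postgres").dsn)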
Back to PostgresDsn. It inherits from AnyUrl, the relevant parts of which are:
...
def __init__(
    self,
    url: str,
    *,
    scheme: str,
    user: Optional[str] = None,
    password: Optional[str] = None,
    host: str,
    tld: Optional[str] = None,
    host_type: str = 'domain',
    port: Optional[str] = None,
    path: Optional[str] = None,
    query: Optional[str] = None,
    fragment: Optional[str] = None,
) -> None:
...
Now some of these won’t be applicable to PostgresDsn. The ones that we are actually using in our validator are scheme, user, password, host, path and port. I am sure that soon enough there will be a driver field too, in which case we will use that as well. So that’s what we have in settings for now.
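To see what the validator actually produces, here’s a small sanity check of my own (assuming pydantic v1’s keyword-only build signature), using the same values we’ll put in the .dev.env file below:

from pydantic import PostgresDsn

uri = PostgresDsn.build(
    scheme="postgresql",
    user="postgres",
    password="postgres",
    host="localhost",
    port="5432",
    path="/postgres",
)
print(uri)  # postgresql://postgres:postgres@localhost:5432/postgres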
Then I have a few different classes inheriting from Settings: DevSettings, StageSettings, ProdSettings. What’s different about them is the .env file they read from. In the root directory of the project, I have a .dev.env file, so the DevSettings class has an env_file = '.dev.env' attribute in its Config. Finally, since we are currently developing, I create an instance of DevSettings and save it as the settings object, which I’ll then import and use in other parts of the project. What does the .dev.env file look like? Keep in mind that this file won’t appear on the repo on GitHub. Please don’t commit your settings unless there’s nothing in there that you care about. In this case I really could, but I want to set a good example. It looks like:
# APP SETTINGS
APP_NAME=CNLearn
VERSION=0.0.5
# DEVELOPMENT SETTINGS
SERVER_NAME=localhost
SERVER_HOST=http://127.0.0.1:8000
POSTGRES_USER=postgres
POSTGRES_PASSWORD=postgres
POSTGRES_SERVER=localhost
POSTGRES_PORT=5432
POSTGRES_DB=postgres
BACKEND_CORS_ORIGINS=["http://localhost:8000","http://127.0.0.1:8000"]
When the app starts, the settings are loaded via that settings/base.py file. For each field, Pydantic first checks the actual environment variables; any values it can’t find there, it reads from the .dev.env file in the root directory.
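If you want to convince yourself of that ordering, here’s a small, hypothetical experiment: set a real environment variable and watch it win over the .dev.env value (assuming the DevSettings class from above is importable as shown):

import os

# a real environment variable takes priority over the env_file value
os.environ["APP_NAME"] = "CNLearn-FromEnv"

from app.settings.base import DevSettings

print(DevSettings().APP_NAME)  # CNLearn-FromEnv, not the .dev.env value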
Using the settings module in our app
The first place we use it is in the main server.py file:
from fastapi import FastAPI
from starlette.middleware.cors import CORSMiddleware

from app.api.v1.api import api_router
from app.settings.base import settings


def create_application():
    """
    Creates and returns a FastAPI instance.
    """
    app = FastAPI(
        title=settings.APP_NAME,
        version=settings.VERSION,
    )
    # change CORS settings
    app.add_middleware(
        CORSMiddleware,
        allow_origins=settings.BACKEND_CORS_ORIGINS,
        allow_credentials=True,
        allow_methods=["*"],
        allow_headers=["*"],
    )
    app.include_router(api_router, prefix=settings.API_V1_STR)
    return app
Notice that we first import the settings object. Then we use its fields in a few places including APP_NAME, VERSION, BACKEND_CORS_ORIGINS (we’ll come back to this at some point) and API_V1_STR.
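For completeness, here’s a hypothetical entry point showing how the factory might be served locally with uvicorn (the app.server module path is my assumption; adjust it to wherever your create_application lives):

import uvicorn

from app.server import create_application

app = create_application()

if __name__ == "__main__":
    # serve the app on the host and port we configured in .dev.env
    uvicorn.run(app, host="127.0.0.1", port=8000)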
Dealing with Database Connections
Ok, first of all, I am using async SQLAlchemy, which is still in beta. This is a fun project for me, so I don’t mind having to rewrite some parts later on, and I definitely expect it to break. Isn’t it nice when the disclaimer is at the beginning and not at the end of the post? Also, while I did say that, in the next post we will also create a synchronous connection to the database for the sake of Alembic (until it gets async support). There’s a lot of information on async SQLAlchemy here. Ok, so in the db folder, let’s create a connections.py file and add the following code:
import logging

from fastapi import FastAPI
from sqlalchemy.ext.asyncio import AsyncSession, create_async_engine
from sqlalchemy.orm import close_all_sessions, sessionmaker

from app.settings.base import settings

logger = logging.getLogger(__name__)


async def open_postgres_database_connection(app: FastAPI) -> None:
    # the URI is configured in the settings module; swap in the asyncpg driver
    ASYNC_URI: str = settings.SQLALCHEMY_POSTGRES_URI.replace(
        "postgresql", "postgresql+asyncpg", 1
    )
    engine = create_async_engine(ASYNC_URI, echo=True)
    async_session = sessionmaker(
        engine, expire_on_commit=False, class_=AsyncSession
    )
    try:
        app.state._db = async_session
        logger.info("You have connected to the database")
    except Exception as e:
        logger.warning("ERROR OCCURRED WHILE CONNECTING TO THE DATABASE")
        logger.warning(e)
        logger.warning("ERROR OCCURRED WHILE CONNECTING TO THE DATABASE")


async def close_postgres_database_connection(app: FastAPI) -> None:
    try:
        close_all_sessions()
        logger.info("DB connection closed")
    except Exception as e:
        logger.warning("ERROR OCCURRED WHILE CLOSING ALL THE SESSIONS")
        logger.warning(e)
        logger.warning("ERROR OCCURRED WHILE CLOSING ALL THE SESSIONS")
Ok, let’s go through it. I have two functions: one to create the connection to the database and one to close any remaining sessions. Remember how I said you can’t yet use an async driver in the PostgresDsn? Well, I create an ASYNC_URI string by taking the SQLALCHEMY_POSTGRES_URI string from our settings and replacing (only the first occurrence of) “postgresql” with “postgresql+asyncpg”. Now it works. At some point this “fix” won’t be necessary and I’ll update the code accordingly. Then I create an async engine using the (drum roll please) create_async_engine function from SQLAlchemy, and an async_session sessionmaker. Please note that the documentation doesn’t yet cover the AsyncSession part well, so everything is based on this. I then set the async_session as a private _db variable on our app’s state. If everything goes well, i.e. the connection is made, we will see “You have connected to the database”. In the other function, I use the close_all_sessions() function.
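We’ll wire sessions up properly as FastAPI dependencies later, but just to sketch how the sessionmaker stored on app.state might eventually be used (a hypothetical snippet of mine, based on SQLAlchemy 1.4’s async API):

from fastapi import FastAPI
from sqlalchemy import text


async def ping_database(app: FastAPI) -> None:
    # app.state._db holds the sessionmaker we created at startup
    async_session = app.state._db
    async with async_session() as session:
        result = await session.execute(text("SELECT 1"))
        print(result.scalar())  # 1 if the connection works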
Starlette Event Handlers
But why did we create these two functions? Note that we are not actually creating any sessions yet; we will do that using FastAPI dependencies in just a little bit. We created these two functions so that we can run them when the app starts up and shuts down. In order to do so, let’s create a tasks folder where, for now, we will have two files: startup.py and shutdown.py. The startup file will contain tasks that run when our application launches; the shutdown file will contain tasks that run when the application is shutting down. There is some information on event handlers in the FastAPI documentation here, but most of it comes from Starlette’s documentation on Events.
Let’s see our startup.py events:
from typing import Callable

from fastapi import FastAPI

from app.db.connections import open_postgres_database_connection


def database_start_app_handler(app: FastAPI) -> Callable:
    async def start_db() -> None:
        await open_postgres_database_connection(app)
    return start_db
Since the event handlers must return a Callable, we wrapped our open_postgres_database_connection function in a function. Similarly, in shutdown.py we have:
from typing import Callable

from fastapi import FastAPI

from app.db.connections import close_postgres_database_connection


def database_stop_app_handler(app: FastAPI) -> Callable:
    async def stop_db() -> None:
        await close_postgres_database_connection(app)
    return stop_db
Ok, we now have two event handlers. How do we run them at startup and shutdown? Let’s go back to our server.py file:
...
from app.tasks.startup import database_start_app_handler
from app.tasks.shutdown import database_stop_app_handler


def create_application():
    """
    Creates and returns a FastAPI instance.
    """
    app = FastAPI(
        title=settings.APP_NAME,
        version=settings.VERSION,
    )
    app.add_event_handler("startup", database_start_app_handler(app))
    app.add_event_handler("shutdown", database_stop_app_handler(app))
    # change CORS settings
    app.add_middleware(
        CORSMiddleware,
        allow_origins=settings.BACKEND_CORS_ORIGINS,
        allow_credentials=True,
        allow_methods=["*"],
        allow_headers=["*"],
    )
    app.include_router(api_router, prefix=settings.API_V1_STR)
    return app
We include them by using the add_event_handler method of our app instance. Now, when our app starts up or shuts down, the respective handlers are called. Ok, we’re almost there. Actually, we are there for today. I don’t want to create the DB dependencies and inject them into our routers just yet, because we don’t have either the database tables or the respective API routes. We’ll get there :) So in the next post we’ll look at creating an authentication system and using Alembic to create the necessary migrations. And we should really start testing it…
The commit for this post is available here.