Hooking FastAPI Endpoints up to a Postgres Database


Welcome to Part 3 of Up and Running with FastAPI. If you missed part 2, you can find it here.

This series is focused on building a full-stack application with the FastAPI framework. The app allows users to post requests to have their residence cleaned, and other users can select a cleaning project for a given hourly rate.


Last time we left off, we set up our PostgreSQL database and configured migrations with Alembic and SQLAlchemy. In this post, we’ll add CRUD endpoints to our cleanings router and hook them up to the database. Additionally, we’ll create a cleanings repository that will interface with our db and execute any related SQL queries. FastAPI depends on Pydantic for data validation, so we’ll also need to use that library to construct models for our cleanings resource.

We’ll begin at the data level and work our way outward.

FastAPI and Pydantic

For each resource, we’ll describe how it should look and behave at various stages of our application. FastAPI explains how to use Pydantic in their docs:

You declare the ‘shape’ of the data as classes with attributes.

And each attribute has a type.

Then you create an instance of that class with some values and it will validate the values, convert them to the appropriate type (if that’s the case) and give you an object with all the data.

These Pydantic models should live in their own directory and be namespaced according to the resource. We’ll also create a core.py file to store any common logic shared across all models.

In the core.py file, add the following code:

models/core.py
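
The code itself isn’t reproduced here, so below is a minimal sketch of what this file likely contains, based on the descriptions that follow. The CoreModel and IDModelMixin names match the ones discussed in this section; everything else is filler.

```python
from pydantic import BaseModel


class CoreModel(BaseModel):
    """
    Any common logic shared by all models goes here.
    """
    pass


class IDModelMixin(BaseModel):
    # no default value, so Pydantic treats `id` as required
    id: int
```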

From the outset, these look like simple Python dataclasses, and for the most part they are. For those who are less familiar, dataclasses are just classes that contain primarily data. For example, the two class definitions below (adapted from the python docs) are essentially identical - save for a few minor differences.
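
The original snippet isn’t reproduced here, but the comparison from the python docs looks roughly like this - a @dataclass and the equivalent hand-written class:

```python
from dataclasses import dataclass


@dataclass
class InventoryItem:
    name: str
    unit_price: float
    quantity_on_hand: int = 0


# Roughly what the decorator generates for us behind the scenes
class ManualInventoryItem:
    def __init__(self, name: str, unit_price: float, quantity_on_hand: int = 0) -> None:
        self.name = name
        self.unit_price = unit_price
        self.quantity_on_hand = quantity_on_hand
```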

For anyone unfamiliar with the * syntax in function declarations, the FastAPI docs do a nice job of explaining its use. Any parameter that appears after the * must be passed as a keyword argument, even though it doesn’t require a default value.
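
As a quick, hypothetical illustration of that syntax:

```python
def describe_cleaning(name: str, *, price: float, dusting: bool = False) -> str:
    # `price` and `dusting` can only be supplied as keyword arguments
    return f"{name}: ${price}, dusting={dusting}"


describe_cleaning("Deep clean", price=29.99)   # works
# describe_cleaning("Deep clean", 29.99)       # TypeError - positional argument not allowed
```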

Pydantic’s BaseModel builds on dataclasses by offering additional functionality around data validation and data type coercion. The Pydantic docs tell us:

You can think of models as similar to types in strictly typed languages, or as the requirements of a single endpoint in an API. Untrusted data can be passed to a model, and after parsing and validation pydantic guarantees that the fields of the resultant model instance will conform to the field types defined on the model.

Almost every new model we create will inherit from CoreModel, enabling us to share logic across models at any point down the road. This feature will come in handy later on.

The IDModelMixin class will be used for all resources coming out of the database. By not providing a default value for the id attribute, we’re telling Pydantic that this field is required for all new instances. Since we used the int type declaration, the Pydantic docs tell us that “Strings, bytes or floats will be coerced to ints if possible; otherwise an exception will be raised.”

Let’s make a few more Pydantic models to get a feel for this.

Update the cleaning.py file:

models/cleaning.py
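
Again, the original code isn’t reproduced here, so this is a hedged sketch: the class names match those discussed below, but the specific fields, enum values, and the app.models.core import path are illustrative assumptions.

```python
from enum import Enum
from typing import Optional

from app.models.core import CoreModel, IDModelMixin  # assumed import path


class CleaningType(str, Enum):
    # illustrative values - an enum restricted to exactly three options
    dust_up = "dust_up"
    spot_clean = "spot_clean"
    full_clean = "full_clean"


class CleaningBase(CoreModel):
    """
    All common characteristics of our Cleaning resource.
    """
    name: Optional[str]
    description: Optional[str]
    price: Optional[float]
    cleaning_type: Optional[CleaningType] = "spot_clean"


class CleaningCreate(CleaningBase):
    name: str
    price: float


class CleaningUpdate(CleaningBase):
    cleaning_type: Optional[CleaningType]


class CleaningInDB(IDModelMixin, CleaningBase):
    name: str
    price: float
    cleaning_type: CleaningType


class CleaningPublic(IDModelMixin, CleaningBase):
    pass
```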

Our CleaningBase model inherits from CoreModel, while CleaningCreate, CleaningUpdate, CleaningInDB, and CleaningPublic all inherit from CleaningBase. We use the Optional type declaration to specify that any attribute not passed in when creating the model instance will be set to None. To provide a clearer example, check out these two Pydantic classes that are functionally identical.
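
For instance (a sketch, not the post’s original snippet), these two declarations behave identically in Pydantic v1 - a field annotated as Optional with no explicit default is given a default of None:

```python
from typing import Optional

from pydantic import BaseModel


class ProfileA(BaseModel):
    bio: Optional[str]


class ProfileB(BaseModel):
    bio: Optional[str] = None


assert ProfileA().bio is None
assert ProfileB().bio is None
```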

We’ve also defined a custom CleaningType class that can only take 3 values. Any time we want to restrict the number of valid inputs to an explicit set, we’ll import Enum and create a sub-class that inherits from our data type of choice (str in this case) and from Enum.

The 5 model types we just defined demonstrate a pattern that will be used for almost every resource: a Base model holding the attributes common to all variations, a Create model for validating POST request bodies, an Update model where every field is optional, an InDB model representing records coming out of the database, and a Public model shaping what we send back to clients.

With our models in place, we are now ready to craft a database interface accordingly.

Repositories

The purpose of a repository is to serve as a layer of abstraction on top of database actions. Each repository encapsulates database functionality corresponding to a particular resource. In doing so, we decouple persistence logic from our application logic. The Repository Pattern, introduced in Domain Driven Design back in 2004, has a number of popular implementations. For Phresh, we’ll stick to a slightly customized approach and treat repositories as a stand-in for a more traditional ORM.

We’ll start by setting up a base repository, and then we’ll move on to our cleanings repository.

Create base.py and cleanings.py files in the repositories directory.

Our BaseRepository will be a simple class needed only to keep a reference to our database connection. In the future we can add functionality for common db actions, but we’ll keep it lightweight for now. In base.py add the following:

db/repositories/base.py
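
A sketch consistent with that description; the Database type comes from the databases package.

```python
from databases import Database


class BaseRepository:
    def __init__(self, db: Database) -> None:
        # keep a reference to the database connection for subclasses to use
        self.db = db
```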

Nothing special is happening here, and that’s fine. In the repositories/cleanings.py file, add this:

db/repositories/cleanings.py
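
Here’s a hedged reconstruction of what that file could look like; the column names and import paths are assumptions that mirror the model sketch above.

```python
from app.db.repositories.base import BaseRepository  # assumed import path
from app.models.cleaning import CleaningCreate, CleaningInDB  # assumed import path


CREATE_CLEANING_QUERY = """
    INSERT INTO cleanings (name, description, price, cleaning_type)
    VALUES (:name, :description, :price, :cleaning_type)
    RETURNING id, name, description, price, cleaning_type;
"""


class CleaningsRepository(BaseRepository):
    """
    All database actions associated with the Cleaning resource.
    """

    async def create_cleaning(self, *, new_cleaning: CleaningCreate) -> CleaningInDB:
        # the keys of this dict line up with the :query_arg placeholders above
        query_values = new_cleaning.dict()
        cleaning = await self.db.fetch_one(query=CREATE_CLEANING_QUERY, values=query_values)
        return CleaningInDB(**cleaning)
```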

Now we’re getting somewhere. We’re importing our BaseRepository and a few models associated with the Cleaning resource. We’ve also written our first SQL query, complete with the :query_arg style that the databases package expects. One of the benefits of using the Repository pattern is that we get the flexibility of pure SQL, with the clean interface of an ORM.

The CleaningsRepository that we’ve created inherits from our BaseRepository and defines a single method meant to insert a new cleaning into our postgres database. Notice how our create_cleaning method expects a new_cleaning argument that is type annotated using the CleaningCreate model. We first call the .dict() method on our model to convert it to a dictionary. There’s quite a bit we can do when exporting Pydantic models in this way, so curious readers should feel free to read up on it in the docs.

In a nutshell, here’s what happens when we create a Pydantic model and call .dict() on it.

Pydantic parses and validates our inputs, fills in default values, and converts the model into a dictionary. Each of the keys in this dictionary maps neatly to the query arguments in our CREATE_CLEANING_QUERY SQL query, meaning most of the work is done for us.
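
As a quick illustration (with made-up values, and assuming the model sketch from earlier):

```python
from app.models.cleaning import CleaningCreate  # assumed import path

new_cleaning = CleaningCreate(name="Clean my garage", price="49.99")  # note the string price
print(new_cleaning.dict())
# roughly: {'name': 'Clean my garage', 'description': None, 'price': 49.99, 'cleaning_type': 'spot_clean'}
```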

We then create the cleaning by using the db.fetch_one method that the databases package provides us, passing in the query we want to execute, and the associated query args. Finally, we take the database record that is returned from our SQL query and unpack it using the **cleaning syntax. What this does is take each key-value pair in the record and turn them into keyword arguments for our CleaningInDB instance.

So, where is the CleaningsRepository getting its database reference seen in self.db.fetch_one? Well, we haven’t handled that yet. Every repository is going to need access to our postgres database, and fortunately FastAPI provides a clean and powerful dependency injection system that handles this setup relatively seamlessly.

Dependencies in FastAPI Endpoints

The FastAPI docs tell us:

“Dependency Injection” means, in programming, that there is a way for your code (in this case, your path operation functions) to declare things that it requires to work and use: “dependencies”.

Every API endpoint we define is going to need access to our database - making our postgres db the perfect candidate for a dependency.

Those unfamiliar with FastAPI are strongly encouraged to read the dependencies docs linked above, as they go into explicit detail on the dependency injection system - which we will not do here.

When ready, make a new dependencies directory inside of api, and create 2 new files.

Inside of our new database.py file, add the following:

api/dependencies/database.py
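
A sketch of those two dependencies follows. Exactly where the connection lives on the app (here, request.app.state._db) depends on how the startup event handler from the previous post stored it, so treat that attribute - and the import paths - as assumptions.

```python
from typing import Callable, Type

from databases import Database
from fastapi import Depends
from starlette.requests import Request

from app.db.repositories.base import BaseRepository  # assumed import path


def get_database(request: Request) -> Database:
    # assumes the startup handler stashed the connection on app.state._db
    return request.app.state._db


def get_repository(Repo_type: Type[BaseRepository]) -> Callable:
    def get_repo(db: Database = Depends(get_database)) -> BaseRepository:
        # hand the db connection to whichever repository class was requested
        return Repo_type(db)

    return get_repo
```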

We have two dependencies here: get_database and get_repository. FastAPI dependencies are just functions - or more accurately Callables - that are declared as parameters of our API endpoint functions.

They probably look pretty funky to the uninitiated, so don’t worry about understanding this all quite yet. It’ll make more sense once we actually get to using them.

Head into the api/routes/cleanings.py file and update it accordingly.

api/routes/cleanings.py
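
Here’s a sketch of the route as described below; the decorator arguments match the ones discussed in this section, while the import paths (and the omitted dummy route from before) are assumptions.

```python
from fastapi import APIRouter, Body, Depends
from starlette.status import HTTP_201_CREATED

from app.api.dependencies.database import get_repository  # assumed import path
from app.db.repositories.cleanings import CleaningsRepository  # assumed import path
from app.models.cleaning import CleaningCreate, CleaningPublic  # assumed import path


router = APIRouter()


@router.post(
    "/",
    response_model=CleaningPublic,
    name="cleanings:create-cleaning",
    status_code=HTTP_201_CREATED,
)
async def create_new_cleaning(
    new_cleaning: CleaningCreate = Body(..., embed=True),
    cleanings_repo: CleaningsRepository = Depends(get_repository(CleaningsRepository)),
) -> CleaningPublic:
    created_cleaning = await cleanings_repo.create_cleaning(new_cleaning=new_cleaning)
    return created_cleaning
```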

Things are starting to heat up now. Our new POST route is significantly more fleshed out than the dummy route - and for good reason.

While we break down what’s actually going on, let’s pull up the OpenAPI docs that FastAPI generates for our new endpoint. Head to localhost:8000/docs and check out the automatic, interactive API documentation provided to us for free.


Let’s click on our newly minted POST route - it should be a big green button.

FastAPI reminds us in the request body docs:

A “request body” is data sent by the client to your API. A “response body” is the data your API sends to the client.

We’re shown both here - exactly what’s required in the body for a successful request, and the kind of responses we can expect to see for validation errors or successful execution. The success response schema comes from response_model=CleaningPublic, while the success status comes from status_code=HTTP_201_CREATED. And the route’s name comes from declaring name="cleanings:create-cleaning".

All of this is standard FastAPI stuff that comes from our declarations in the POST route decorator. FastAPI handles all of this for us!

api/routes/cleanings.py

The first of the two parameters in our create_new_cleaning function, new_cleaning, is annotated with the CleaningCreate type. By default, FastAPI expects the model’s fields at the top level of the POST request body. If we instead want it to expect JSON with a new_cleaning key and the model’s contents nested inside of it, we use the special Body parameter with embed=True as the parameter’s default.

By simply specifying the CleaningCreate Python type declaration for new_cleaning, FastAPI will:

- read the body of the request as JSON
- convert the corresponding types where needed
- validate the data and return a clear error when it’s invalid
- hand us the received data in the new_cleaning parameter
- generate JSON Schema definitions for the model and use them in the interactive docs

That’s a lot of functionality for very little code.

api/routes/cleanings.py

The second of the two parameters - cleanings_repo - is our database interface, and the route’s only dependency. We’re setting its default to Depends(get_repository(CleaningsRepository)) so that we have access to it in our route and can create the new cleaning whenever the function is executed. FastAPI automatically validates and converts the created_cleaning to an instance of our CleaningPublic model, and sends the appropriate JSON as a response.

api/routes/cleanings.py

Getting back to dependencies - FastAPI understands that our route depends on the CleaningsRepository returned by the get_repository function. Recall how we defined that function in our api/dependencies/database.py file. It probably makes sense to return to that file, now that we have a better understanding of its purpose.

api/dependencies/database.py

In the get_repository function, we declare a single Repo_type parameter and return another function called get_repo. The get_repo function has its own dependency - db - declared as its single parameter. This is known in FastAPI as a sub-dependency, and it depends on the get_database function, which grabs a reference to the database connection we established in our app’s startup event handler.

The Request object comes directly from the Starlette framework, and FastAPI handles passing it along for us. Read more about how to use the Request object directly here.

We then pass that database reference to our CleaningsRepository and let the repo interface with the postgres db as needed.

Testing Our Route

The OpenAPI docs also provide us with a ‘Try it out’ button that allows us to execute any request with some dummy data. We click on the route, edit the JSON with whatever data we want to send to our API, hit the execute button, and it should issue a POST request against the route we just created.

If we forgot to migrate our database, we’ll get a 500 Internal Server error thrown from asyncpg.exceptions.UndefinedTableError. The hint that follows it - relation "cleanings" does not exist - is letting us know that we need to run our migrations to create the cleanings table in our database. See the end of the last post for instructions on how to do that.
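
If your setup mirrors the previous post, the fix is something along these lines (the server service name is an assumption about your docker-compose file):

```bash
# run alembic inside the container where it's installed
docker-compose exec server alembic upgrade head
```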

Let’s open up psql in our db container and make sure things worked according to plan.
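
Assuming the db service name and default credentials from the previous post’s docker-compose setup (both are assumptions here), that might look like:

```bash
docker-compose exec db psql -h localhost -U postgres --dbname=postgres
```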

Once connected, we’ll query our database for all cleaning records.
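
For example:

```sql
SELECT * FROM cleanings;
```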

Lovely! We’ve created a cleaning and persisted it in our database. And just like that, we’ve completed our first API endpoint and tested it using FastAPI’s interactive docs.

Wrapping Up and Resources

This is a pattern we’ll employ for every resource. Structure our database, design our resource repository, and build our Pydantic models. Afterwards, we flesh out our endpoints and integrate our models and db interface, testing pieces of it along the way.

Having finished a lot of the grunt work needed to build our app’s foundation, we’ll spend the next few posts adding features and functionality. We’ll also establish some development best practices by implementing proper testing protocols using pytest and Docker. Stay tuned.

Github Repo

All code up to this point can be found here:


Previous Post

Configuring a PostgreSQL DB with your Dockerized FastAPI App

Next Post

Testing FastAPI Endpoints with Docker and Pytest