Recipe API project for learning Django REST Framework and more
Following along with Build a Backend REST API with Python & Django
My own creation - trying to add a frontend in a JavaScript framework
docker build .
docker-compose build
for running commands in the backend app (check against the backend folder configuration):
docker-compose run --rm backend sh -c "python manage.py collectstatic"
for running linting (untested):
docker-compose run --rm backend sh -c "flake8"
for unit testing: using django test suite
docker-compose run --rm backend sh -c "python manage.py test"
docker-compose run --rm backend sh -c "django-admin startproject app ."
docker compose -f compose.yml -f compose.cicd.yml run --rm backend sh -c "python manage.py startapp <new app name>"
First build a node container with only
FROM node:20
WORKDIR /app
in the Dockerfile.
Then build the container with docker build -t frontend --progress=plain .
in the frontend directory.
Then add svelte:
docker-compose -f compose.yml -f compose.dev.yml run --rm frontend sh -c "npm create svelte@4.2.0"
Add svelte node adapter in frontend directory: docker run -it -w /app --mount type=bind,src="$(pwd)",target=/app frontend npm i -D @sveltejs/adapter-node
or through the compose file in the root directory of the project: docker-compose -f compose.yml -f compose.dev.yml run --rm frontend npm i -D @sveltejs/adapter-node
or, after docker-compose -f compose.yml -f compose.dev.yml up -d:
docker-compose run <service-name> npm i -D <...>
Build the container with docker build -t frontend --progress=plain .
and run it with docker run -p 3000:3000 frontend
docker compose -f compose.yml -f compose.cicd.yml run --rm frontend sh -c "npm update"
docker-compose -f compose.yml -f compose.dev.yml up -d
based on How to configure Docker with two containers: Frontend and Backend
Find actions in Github Marketplace
docker-compose is preinstalled on GitHub's Ubuntu runners!
- unittest library
- Django adds: test client, simulated authentication, temporary database
- Django REST Framework adds: API test client
add tests.py (a module) OR a tests/ directory (prefix every module inside with tests_).
The tests directory must contain __init__.py
Test database: django creates a separate database for tests (and destroys it after the test run)
Test classes:
- SimpleTestCase (without database)
- TestCase (database included)
prefix test-methods with test_
First write the test and then add the implementation
- override or change behavior of dependencies
- avoid unintended side effects
- isolate code being tested
benefits:
- avoid relying on external services: they might not be available at test time
- avoid unintended consequences: like sending an email for every test or overloading external services in another way
Example:
- user registers -> gets welcome email. "Mock" the sending of the email.
- check database and then sleep for a while
Use unittest.mock
- MagicMock/Mock: replace real objects
- patch: overrides code for tests
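The welcome-email example above can be sketched with unittest.mock; the function names (register_user, send_welcome_email) are made up for illustration:

```python
from unittest import mock

# Hypothetical app code: these names are invented for the example.
def send_welcome_email(address):
    raise RuntimeError("would contact a real mail server")

def register_user(address):
    # ... store the user in the database, then notify them ...
    send_welcome_email(address)
    return {"email": address}

# In a test, patch the dependency so no real email is sent.
with mock.patch(f"{__name__}.send_welcome_email") as fake_send:
    user = register_user("new@example.com")
```

The rest of register_user still runs normally; afterwards fake_send records that (and how) the email function was called, so the test can assert on it.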
Django rest framework API client
- based on Django's TestClient
- make requests
- check result
- override authentication
from rest_framework.test import APIClient
check status code: self.assertEqual(res.status_code, 200) (status_code is an integer, not a string)
Number of tests run not correct:
- missing __init__.py in the tests/ directory
- indentation of test cases
- missing test_ prefix for test method names (methods without the prefix can be used as helper functions in the test file)
Import error when running tests ("Is this module globally installed?"): both a tests/ directory and a tests.py module exist in the app
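A minimal plain-unittest sketch of the prefix rule: the loader only collects methods whose names start with test, so un-prefixed methods can serve as helpers (class and method names below are made up):

```python
import io
import unittest

class RecipeHelperTests(unittest.TestCase):
    # Helper method: no test prefix, so the loader does NOT collect it.
    def make_amount(self):
        return 42

    # Collected: the name starts with test.
    def test_amount_is_positive(self):
        self.assertGreater(self.make_amount(), 0)

suite = unittest.TestLoader().loadTestsFromTestCase(RecipeHelperTests)
collected = suite.countTestCases()  # only the test_-prefixed method counts
result = unittest.TextTestRunner(stream=io.StringIO()).run(suite)
```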
How to Do Test-Driven Development with Svelte and Vitest – A Project-Based Tutorial
configure vitest in vite.config.js
or override it with vitest.config.js
PostgreSQL
- popular open source
- free
- integrates well with Django
using Docker Compose for configuration,
- so it's defined with the project and other developers can also use it
- persistent data using volumes
- handles network configuration
- handles environment variable configuration
- Docker compose adds another service: database
- backend needs to depends_on the database -> still needs handling of the database race condition
Volumes in docker compose:
- persist data
- maps directory in container to local machine
- using a named volume (defined at the top level of the compose file)
- configure django: how to connect to database
- install database adaptor dependencies
- update python requirements to include the postgres adaptor
Django needs to know:
- database engine
- hostname / ip of database's host
- port (usually 5432)
- database name
- credentials: username and password
Everything goes into settings.py, which pulls everything from environment variables
environment variables
- can be easily passed to docker,
- used in prod and dev
- single place to configure project
- easy to do in Python code: pull environment variables with os.environ.get('...')
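A sketch of what such a settings.py fragment could look like; the DB_* variable names and the defaults are assumptions for illustration, not mandated by Django:

```python
import os

# Defaults are set only so this demo runs standalone; in the project the
# values come from the compose file's environment section.
os.environ.setdefault("DB_HOST", "db")
os.environ.setdefault("DB_NAME", "devdb")
os.environ.setdefault("DB_USER", "devuser")
os.environ.setdefault("DB_PASS", "changeme")

DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.postgresql",
        "HOST": os.environ.get("DB_HOST"),
        "PORT": os.environ.get("DB_PORT", "5432"),  # postgres default port
        "NAME": os.environ.get("DB_NAME"),
        "USER": os.environ.get("DB_USER"),
        "PASSWORD": os.environ.get("DB_PASS"),
    }
}
```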
package psycopg2: package needed for django to connect to database
- most popular postgres adaptor for python
- supported officially by django
installation options for psycopg2:
- psycopg2-binary: easiest to install, but not optimized for production
- psycopg2: compiled from source (e.g. via pip), needs build dependencies but is optimized for the machine - easy with docker
dependencies for psycopg2:
- C compiler
- python3-dev
- libpq-dev
in Alpine these are called:
- postgresql-client
- build-base
- postgresql-dev
- musl-dev
best practice in docker: clean-up dependencies after install in Dockerfile (keep at minimum and light weight)!
Configuring database in django in settings.py file: Reference
docker compose has depends_on, which only waits until the service has started, not until the service is actually ready. Solution: make Django wait for the db, using a custom Django management command.
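The wait-for-db idea can be sketched as a retry loop. This is plain Python, not the project's actual management command: the check callable stands in for Django's real connection test and is injectable so the loop is testable.

```python
import time

def wait_for_db(check, retries=10, delay=1.0):
    """Call `check` until it succeeds or `retries` attempts are used up."""
    for attempt in range(1, retries + 1):
        try:
            check()
            return attempt           # connected on this attempt
        except ConnectionError:
            time.sleep(delay)        # db not accepting connections yet
    raise RuntimeError(f"database unavailable after {retries} retries")

# Simulate a database that only accepts connections on the third try.
calls = {"n": 0}
def flaky_check():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("connection refused")

needed = wait_for_db(flaky_check, delay=0)
```

In the real command the check would be Django's database connection (catching its OperationalError instead of ConnectionError), and the command runs before migrate in the compose startup sequence.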
Adding a new python app for awaiting the database:
docker compose -f compose.yml -f compose.cicd.yml run --rm backend sh -c "python manage.py startapp core"
The Django ORM is an abstraction layer for data that lets Django handle database structure and changes.
- Handles table generation, adding columns, and so on.
- also allows switching databases (within reason): for example from postgres to MySQL
- define models (done through programming)
- generate migration file (handled by ORM automatically)
- setup the database (handled by ORM automatically)
- store data (handled by ORM automatically)
each model
- maps to a table
- contains
- name
- fields (represent columns)
- metadata (relationships between tables)
- custom python logic (e.g. execute code on every save or validation)
Model example:
class Ingredient(models.Model):
    name = models.CharField(max_length=255)
    user = models.ForeignKey(
        settings.AUTH_USER_MODEL,
        on_delete=models.CASCADE,
    )
enable the app in settings.py and run python manage.py makemigrations:
- checks existing database
- looks for differences of models
- adds new migrations
- at first time running: initial setup of database
afterwards apply the migrations to database by running python manage.py migrate
Always run the migrations after wait_for_db
Using default django user model. Django has built in user authentication
Framework for basic features:
- registration
- login
- auth
Integrates with Django admin
Default Django user model is foundation of django auth system
Default user model
- uses username instead of email
- not easy to customize
Therefore create a custom user model
based on AbstractBaseUser and PermissionsMixin
Create custom manager used for CLI integration
Afterwards set AUTH_USER_MODEL in settings.py
Create and run migrations
AbstractBaseUser: provides features for authentication but does not include fields.
PermissionsMixin: support for the Django permission system; includes fields and methods.
Common issue
- Running migrations before setting custom model: not good. Therefore clear them! ALWAYS set user model first before running migrations.
- Typos in config
- Indentation in manager or model
user fields:
- email (EmailField)
- name (CharField)
- is_active (BooleanField)
- is_staff (BooleanField) -> if true, allows login to django admin
user model manager:
- used to manage objects
- custom logic for creating objects
- hash passwords
- used by Django CLI
- create superuser
BaseUserManager
- base class for managing users
- useful helper methods:
  - normalize_email for storing emails consistently
- methods to define:
  - create_user: called when creating a user
  - create_superuser: used by the CLI to create a superuser (= admin)
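As far as I know, normalize_email keeps the local part as typed and lowercases only the domain part, so the same address is always stored the same way. A plain-Python sketch mirroring that behavior:

```python
def normalize_email(email):
    """Lowercase the domain part of an email address, keep the local part."""
    email = email or ""
    try:
        local, domain = email.strip().rsplit("@", 1)
    except ValueError:
        return email       # no @ present: return unchanged
    return local + "@" + domain.lower()

normalized = normalize_email("Alice@Example.COM")  # "Alice@example.com"
```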
in core/models.py
Make migrations:
docker compose -f compose.yml -f compose.dev.yml run --rm backend sh -c "python manage.py makemigrations"
Apply migrations:
docker compose -f compose.yml -f compose.dev.yml run --rm backend sh -c "python manage.py wait_for_db && python manage.py migrate"
=> fails, as migrations ran previously with standard Django user model.
Therefore clearing data for database:
docker compose down
docker volume rm receipe-app-api_dev-db-data
After clearing the database, it works:
docker compose -f compose.yml -f compose.dev.yml run --rm backend sh -c "python manage.py wait_for_db && python manage.py migrate"
in command line:
docker compose -f compose.yml -f compose.dev.yml run --rm backend sh -c "python manage.py createsuperuser"
- Graphical User Interface for models
- Create, Read, Update, Delete (CRUD)
- very little coding required
- needs to be enabled per model in admin.py: admin.site.register()
Customizing:
- through a class based on ModelAdmin or UserAdmin
- override / set class variables
- changing the list of objects:
  - ordering: changing the order in which items appear
  - list_display: fields to appear in the list
- add/update page:
  - fieldsets: control the layout of the page
  - readonly_fields
- add page:
  - add_fieldsets: fields displayed only on the add page
Understanding the structure of a SvelteKit project
Purpose:
- designed for developers
- developers need to know how to use it
- API is only as good as its documentation
What to document:
- everything needed to use the API
- available endpoints
- supported methods, like GET, POST, PUT, PATCH, DELETE
- Format of payloads (input)
- parameters (e.g. filtering)
- Post JSON format
- Format of responses (outputs)
- Response JSON format
- Authentication process
Options for documentation:
- Manual (drawback: manual updates & risk of outdated documentation)
- word document
- markdown
- automated
- use metadata from code (comments)
- generate documentation pages
Here:
- Tools for making documentation seamless
- Add documentation for our API, including graphical interface to run tests
Docs in DRF
- auto generate docs (with third party library)
- many tools available
- here: drf-spectacular
- generates a schema
- which allows a browsable web interface
- can make test requests
- handle auth
How it works:
- generate a schema file
- parse the schema in a GUI
- OpenAPI schema: standard for describing APIs
- popular in industry
- supported by most API documentation tools - here using Swagger
- uses popular formats: YAML/JSON
- download and run in local or public Swagger instance
- serve swagger with API (makes it interactive)
Functionality:
- User registration
- creating an auth token
- viewing and updating user profile
Endpoints:
user/create - POST: register a new user
user/token - POST: create a new token; payload contains username and password, returns the token
user/me - GET: view profile; PUT/PATCH: update user profile
docker compose -f compose.yml -f compose.dev.yml run --rm backend sh -c "python manage.py startapp user"
remove from the generated app:
- the migrations directory
- admin.py
- models.py
as all of this is managed in the core app, and also remove
- tests.py (use a tests subdirectory instead)
- Basic (basic HTTP authentication: username + password with every request) => bad, because client needs to store username and password
- Token (generate a token from username and password and send that token with every request made to the backend)
- JSON Web Token (JWT):
- similar to token authentication
- use an access and refresh token
- (only refresh token needs user authorization)
- especially relevant for systems with millions of users to reduce the number of authorization requests
- session authentication
- use cookies (common way to use authentication)
Here: using token authentication due to
- balance between simplicity and security
- supported out of the box by Django REST framework
- well supported by most clients
- only required to store token on disk or in memory
- create a token from endpoint (that accepts username+password, creates a new token in database and returns that token to client)
- client stores token in
- session storage
- local storage
- cookie
- client side database (for desktop applications)
- client includes token in HTTP headers
- supported out of the box by django REST framework
- simple to use
- supported by all clients (as long as client can store the token)
- avoid sending username / password each time
- token needs to be secure on client
- requires a database lookup for each request
Logout:
- happens entirely on the client side
- simply delete the token
Why don't we have a logout API?
- unreliable: no guarantee, that the user can reach the logout API
- losing internet connection
- clearing session
- uninstalling client side mobile app
Session authentication is probably more useful moving forward, as token authentication is stateless. Token authentication is useful for registering apps to APIs, but not for user session handling.
Using a registered user (can be registered through a POST to /api/user/create/), a POST request to /api/user/token/ returns a token. In the frontend app, save this token somewhere (localStorage) and resend it whenever necessary to access the API. Then click Authorize in the Swagger UI and use it under tokenAuth. Put the word Token before the token and mind the space between the word Token and the actual token.
For session authentication this would be the cookie instead.
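Building that header in client code might look like this (the token value is invented):

```python
# The header value is the scheme word "Token", one space, then the token.
token = "9944b09199c62bcf9418ad846dd0e4bbdfc6ee4b"
headers = {"Authorization": "Token " + token}
```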
PUT: replaces the entire object
PATCH: only updates specific fields
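The difference can be illustrated with plain dicts (the recipe fields are invented for the example, not the project's actual schema):

```python
stored = {"title": "Pancakes", "time_minutes": 20, "price": "4.50"}

# PUT: the payload replaces the entire stored object.
put_payload = {"title": "Crepes", "time_minutes": 15, "price": "5.00"}
after_put = dict(put_payload)

# PATCH: only the supplied fields change; everything else survives.
patch_payload = {"time_minutes": 25}
after_patch = {**stored, **patch_payload}
```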
Install following the Tailwind CSS installation guide and add Tailwind Elements following its installation guide
including test driven development, wherever possible.
Svelte lacks the possibility to programmatically fill slots, so the route guard is not tested,
but the route guard is implemented as a component, with session handling (token storage) through a cookie
Features:
- create,
- list,
- view detail,
- update, and
- delete a recipe
Endpoints:
/recipes/
- GET: list all recipes
- POST: create a recipe
/recipes/<recipe_id>
- GET: view details of a recipe
- PUT/PATCH: update a recipe
- DELETE: delete a recipe
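A hypothetical request payload and detail URL for these endpoints; the field names are illustrative - the real schema is defined by the serializer.

```python
import json

payload = {
    "title": "Tomato soup",
    "time_minutes": 30,
    "price": "5.50",
    "description": "Simple weeknight soup.",
}
body = json.dumps(payload)  # what a client would POST to /recipes/

# Detail endpoints embed the recipe id in the path.
recipe_id = 7
detail_url = f"/recipes/{recipe_id}"
```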
A view
- handles a request made to a URL
- Django uses functions
- Django Rest Framework (DRF) builds on top of that and adds classes
- reusable logic (reusable from DRF)
- override behavior
- DRF also supports function based views (via decorators)
- both APIView and ViewSet are DRF base classes
- already used here for the authentication endpoints
- focused around HTTP methods
- class methods for HTTP methods
- get, post, put, patch, delete (the method names on the class are lowercase)
- provide flexibility over URLs and logic
- useful for non CRUD APIs
- Create, Read, Update, Delete
- Bespoke logic (e.g. auth, jobs, calls to external APIs)
- focused around actions
- retrieve, list, update, partial update, destroy
- map to django models
- use routers to generate URLs automatically
- Great for CRUD operations on models
all models are in core app
after registering the model in the app with admin.site.register(models.Recipe), run migrations
docker compose -f compose.yml -f compose.dev.yml run --rm backend sh -c "python manage.py makemigrations"
For testing purposes, this is enough, because the django test runner applies all migrations before running the tests.
In development, the compose.dev.yml
file runs the migrations - but for production the migration command is missing!
docker compose -f compose.yml -f compose.dev.yml run --rm backend sh -c "python manage.py startapp recipe"
In the recipe directory it creates a subdirectory migrations (not needed) and the files:
- __init__.py
- admin.py (not needed)
- apps.py
- models.py (not needed - core app used instead)
- tests.py (not needed)
- views.py
And enable the app in settings.py under INSTALLED_APPS