This is the backend API for Ashes.live, a fan-developed deckbuilder and community site for the card game Ashes Reborn.
You must install the following to run the Ashes.live API locally:

* Docker (Docker Desktop on Windows or macOS; Docker Engine with Docker Compose on Linux)
* `make`
That's it! For local development, all other code is executed in Docker via Make using the standard 3 Musketeers pattern.
Please note: in order to run Docker Desktop on Windows you will need a recent copy of Windows 10 with WSL 2 enabled. Because WSL 2 runs faster when files are living under the Linux filesystem, you will probably want to clone this repo into your Linux filesystem, install `make` under your Linux distro (if necessary), and then execute your make commands from the WSL command line (accessible via `wsl` in PowerShell, or by opening the Linux terminal directly).
This means that on Windows you are typically running `make` commands in a WSL command line instead of standard Windows cmd or Powershell. However, if for whatever reason you do want to install `make` on Windows, this is an easy way: run `choco install make` in an elevated command prompt (requires Chocolatey).

After installing the dependencies above:
1. Duplicate `.env.example` and name the copy `.env` in your root directory
2. Customize `POSTGRES_PASSWORD` and `SECRET_KEY` in `.env` (you can update other values if you wish; they aren't required to run locally, though)
3. Run `make` from the root project directory

This will build your main Docker container and display the available commands you can execute with `make`.
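If you want quick values for `POSTGRES_PASSWORD` and `SECRET_KEY`, a snippet along these lines will generate reasonably strong random strings (just a convenience suggestion; any sufficiently random values work):

```python
# Print random values suitable for pasting into .env
import secrets

print("POSTGRES_PASSWORD=" + secrets.token_urlsafe(32))
print("SECRET_KEY=" + secrets.token_urlsafe(64))
```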
Now that you have a functional API stack, you can run `make data` to create some example testing data in your database (note: this may fail if you have never run `make run` or `make db` prior to `make data`, because the Postgres database must be initialized first).
Please note: if you do not use the example data, you will need to install the extension `pgcrypto` before running any migrations (via the SQL `create extension pgcrypto;`).
At this point, you can execute `make run` to start a local development server and view your site's API documentation at http://localhost:8000.

From within the API docs, you can query the API directly and inspect its output. If you need to authenticate, use the email [email protected] as the username with the password `changeme` to log in as IsaacBot#30000. You must not make your API public without changing this password.
If you are running a local development server to work on the front-end application, you're done!
If you wish to contribute to the API, read on!
You can use Visual Studio Code to develop directly within the Docker container, allowing you direct access to the Python environment (which means linting, access to Python tools, working code analysis for free, and bash shell access without needing to run a make command). To do so:
1. Execute `make run` to launch the API container
2. Attach Visual Studio Code to the running API container (`asheslive:dev`). You can find explicit instructions for this in the Visual Studio Code documentation
3. Configure your attached container with the following settings:

```json
{
    "workspaceFolder": "/code",
    "settings": {
        "terminal.integrated.shell.linux": "/bin/bash",
        "python.pythonPath": "/usr/local/bin/python3.8",
        "python.linting.pylintEnabled": true,
        "python.linting.enabled": true,
        "editor.formatOnSave": true,
        "python.formatting.provider": "black",
        "editor.wordWrapColumn": 88
    },
    "remoteUser": "root",
    "extensions": [
        "EditorConfig.EditorConfig",
        "ms-python.python"
    ]
}
```
You will need to start the API prior to launching VSCode in order for it to attach automatically. (I am looking into ways to improve this workflow, but short-term this is the easiest way to get things working consistently without requiring rebuilding the API with every poetry change.)
Please note: you must run your make commands in an external shell! The VSCode Terminal in your attached container window will provide you access to the equivalent of `make shell`, but running the standard make commands there will result in Docker-in-Docker, which is not desirable in this instance.
You can use PyCharm to develop directly within the Docker container, allowing you access to the Python environment (which means linting, access to Python tools, etc.). To do so:
1. Run `make run` to ensure the local stack is running
2. Add a Docker Compose-based remote Python interpreter for the project, selecting `api` under the "Service" dropdown

You will now have auto-completion, automatic imports, and code navigation capabilities in PyCharm. To enable local debugging:

1. Create a new Run/Debug Configuration (for instance, named `Local`)
2. Set `uvicorn` as the "Module name"
3. Set `api.main:app --reload --host 0.0.0.0 --port 8000` as the "Parameters"

This project is configured to use `isort` and `black` for import and code formatting, respectively. You can trigger formatting across the full project using `make format`, or you can also set up automatic formatting on a per-file basis within PyCharm by adding a File Watcher with the following settings:
* File type: `Python`
* Scope: `Project Files`
* Program: `make` (macOS/Linux) or `wsl` (Windows)
* Arguments: `format FILEPATH=$FilePathRelativeToProjectRoot$` (macOS/Linux) or `make format FILENAME="$UnixSeparators($FilePathRelativeToProjectRoot$)$"` (Windows)
* Output paths to refresh: `$FilePath$`
* Working directory: `$ProjectFileDir$`
If automatic formatting is behaving too slowly for your tastes, you can optionally install `isort` and `black` in your local environment and configure them that way.
The Ashes.live API uses the FastAPI framework to handle view logic, and SQLAlchemy for models and database interaction. Pydantic is used for modeling and validating endpoint input and output. Pytest is used for testing.
The primary entrypoint for the application is `api/main.py`. This file defines the FastAPI app and attaches all site routers. Site modules are organized as follows:

* `api/views`: Route view functions, typically organized by base URL segment. Start here to trace a code path for a given endpoint.
* `api/models`: Data models used to persist to and represent info from the database
* `api/schemas`: Pydantic models used to validate and model endpoint input/output
* `api/tests`: Integration tests (with some unit tests where integration testing is not feasible)
* `api/services`: Functions for performing "business logic"; e.g. creating and modifying models, shared queries that span model relationships, etc.
* `api/utils`: Utility functions for doing a single small thing (placed here because multiple endpoints leverage the function, or to make testing easier)

Services and utility functions are quite similar. Generally speaking, if it's working with simple data, it's a utility. If it's manipulating models, it's probably a service.
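To make the layering concrete, here is a purely illustrative sketch of how a schema, a service, and a view typically relate (the module, route, and function names below are hypothetical, not actual Ashes.live code):

```python
from fastapi import APIRouter
from pydantic import BaseModel


# api/schemas: Pydantic model describing endpoint output
class CardOut(BaseModel):
    stub: str
    name: str


# api/services: "business logic"; in the real app this would typically
# query SQLAlchemy models via the database session
def load_card(stub: str) -> dict:
    return {"stub": stub, "name": stub.replace("-", " ").title()}


# api/views: route view function; routers like this are attached to the
# FastAPI app in api/main.py
router = APIRouter()


@router.get("/cards/{stub}", response_model=CardOut)
def read_card(stub: str):
    return load_card(stub)
```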
You will likely leverage the following files, as well:

* `api/db.py`: Convenience access to SQLAlchemy objects and methods
* `api/depends.py`: View dependencies (e.g. to allow endpoints access to the logged-in user)
* `api/environment.py`: Exports the `settings` object for access to environment settings

I am shooting to maintain 100% code coverage. When you submit a PR, I will expect you to
include tests that fully cover your code. You can view line-by-line coverage information by executing `make test` and then loading `htmlcov/index.html` into your favorite browser.
Note that full code coverage simply means the tests must exercise all possible logic paths in your code. However, if you check the `api/tests` folder you will find that most existing tests are integration tests; they set up a scenario, query a single endpoint, and check that the status code is correct (typically no other information is verified). Tests do not need to exhaustively cover every eventuality; they simply need to ensure that all code paths are functional and appear to be working as expected.
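For a sense of what that integration style looks like, here is a minimal hypothetical sketch (the `client` fixture name and the assertions are assumptions; see `api/tests` for the project's actual fixtures and conventions):

```python
# Exercise a single endpoint and verify the status code, mirroring the
# style of most existing tests ("client" is assumed to be a test client
# fixture provided by the test suite).
def test_list_cards(client):
    response = client.get("/v2/cards")
    assert response.status_code == 200
```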
Testing performs queries against an actual database, and every individual test starts with an empty slate (there is no pre-existing data, and data does not persist between tests).
In some instances, you may need to write unit tests instead (for instance, user badge generation logic does this). This will typically come up when you need to verify error handling within a service or utility function for failure states that are not possible to trigger externally.
Migrations are handled by Alembic. To create a new migration:

1. Update or add model classes in `api/models`. If you add a new model class, make sure to hoist the class to the root module in `api/models/__init__.py` or else it will not be detected by Alembic!
2. Enter the container shell with `make shell`
3. Run `alembic revision --autogenerate -m "Short description here"`
4. Find the new revision in `migrations/versions`; verify the contents and remove the "autogenerated" comments.
5. Run `make migrate` to update your local database!

You can find documentation for Alembic here: https://alembic.sqlalchemy.org/en/latest/
Make sure that your model classes all inherit from `api.db.AlchemyBase`! This is what allows SQLAlchemy and Alembic to map the class to a table definition.
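As a purely illustrative sketch (the class and column names below are made up, not part of the codebase), a new model might look roughly like this:

```python
# Hypothetical model: it inherits from api.db.AlchemyBase and must be
# hoisted into api/models/__init__.py so Alembic's autogenerate detects it.
from sqlalchemy import Column, Integer, String

from api.db import AlchemyBase


class ExampleWidget(AlchemyBase):
    __tablename__ = "example_widget"

    id = Column(Integer, primary_key=True)
    name = Column(String(255), nullable=False)
```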
The Ashes.live API uses Poetry for dependency management. To install a new dependency from outside of the container:
```sh
$ make shell
root@<container>:/code$ poetry add DEPENDENCY
```
(If you are developing within Visual Studio Code, you can open the built-in terminal and skip the `make shell` command.)
Then commit changes to your updated `poetry.lock` and `pyproject.toml`. Please see the Poetry docs for other available commands.
You may wish to shut down your container, run `make build`, and relaunch it to ensure that newly added dependencies are available. If you pull down code and stuff starts failing in weird ways, you probably need to run `make build` and `make migrate`.
Please note: `make shell` will log you into the Docker container as the root user!
This is unfortunately necessary to allow Poetry to function properly (I haven't found a
good way yet to install initial dependencies as a non-root account and have them work,
which means the shell has to be root in order to properly calculate the dependency graph).
The underlying Dockerfile uses the following tools, pinned to specific release versions:
In order to update these tools, you must update their pinned versions in `Dockerfile` and (for Poetry) in `pyproject.toml`, then rebuild your API container using `make build`.
Ashes.live is currently set up for deployment to Render.com. To deploy a copy of the site:

* Use `/bin/bash -c cd /code && /gunicorn.sh` as the Docker command
* Use `/health-check` as the health check path
* Define your environment variables (see `.env.example` for the full list), making sure the environment is set to `production`

That's it!
Please note that the `.env` file is not populated in your production images. The `.env` file works locally because Docker Compose automatically loads its contents as environment variables, but when running in production mode Pydantic is not capable of reading an `.env` file with the current setup (which is why you must define your environment variables one-by-one in the Render control panel).
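For context, this is roughly how that kind of Pydantic settings object behaves (a minimal sketch assuming Pydantic v1-style `BaseSettings`; the field names are illustrative and do not necessarily match `api/environment.py`):

```python
# Each field is populated from an environment variable of the same name
# (case-insensitive). Without an env_file configured, values must come from
# real environment variables, e.g. ones set in the Render control panel.
from pydantic import BaseSettings


class Settings(BaseSettings):
    env: str = "development"
    postgres_password: str = ""
    secret_key: str = ""


settings = Settings()
```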
I need to upgrade to SQLAlchemy 1.4, ensure that everything is compatible with it, and then create a task for migrating all SQLAlchemy calls to 2.0-style as a future update.
This is tentative planning for fully-fledged Red Rains support (that is, the ability to customize, share, and track all aspects of Red Rains campaigns).
Campaigns are shared setups for playing a three-game Red Rains match; for example, there are four official campaigns that ship with the initial box.
A campaign dictates the Chimera's starting level and deck and may optionally dictate what preconstructed decks are available for the player to deckbuild with. Players choose a campaign and create a deck for a campaign attempt (they can attempt a campaign as many times as they'd like, and each attempt will serve as a reminder of exactly what deck they played and how they modified it). Campaign attempts can be published to the player's profile page.
Only necessary to set this if restricting the decks that can be selected for play/deckbuilding at a given stage (otherwise allows access to the player's full collection).
E.g. this URL dies: https://api.ashes.live/v2/cards?limit=101
Per the title; if I implement this, it might be a good time to revisit the dice ANY/ALL logic and replace it with something more useful.
Obviously should still respect the "can I play these" filter, but it would make sense to include conjured alterations alongside alterations instead of just alongside conjurations.
It would be very handy to be able to organize decks (either one's own, or possibly arbitrary published decks across the site) into collections with a shareable URL.
Initial public release.