jdkent/example_celery_app
deciphering celery best practices
Flask Library App
A Flask application with Celery for background tasks, PostgreSQL for storage, and Redis as a broker. Manages holders and books, including a library holder.
Flask and Celery Entrypoints & App Structure
Flask Initialization
The Flask app is created in app/__init__.py using a factory pattern:
```python
from flask import Flask

def create_app():
    app = Flask(__name__)
    # Register blueprints
    return app

app = create_app()
```

The `app` object is the main Flask application.
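As a sketch of what blueprint registration inside the factory might look like (the blueprint and route below are illustrative, not taken from the repo):

```python
from flask import Flask, Blueprint, jsonify

# Hypothetical blueprint; the repo's actual blueprints and routes may differ.
holders_bp = Blueprint("holders", __name__)

@holders_bp.route("/holders")
def list_holders():
    # Placeholder response; the real view would query the database.
    return jsonify([])

def create_app():
    app = Flask(__name__)
    app.register_blueprint(holders_bp)
    return app
```

Registering blueprints inside `create_app()` keeps the factory the single place where the application is assembled, which also makes it easy to build a fresh app per test.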
Celery Initialization
Celery is initialized in app/celery.py:
```python
from celery import Celery
from app.config import Config

celery = Celery(
    "my_example_celery_app",
    broker=Config.CELERY_BROKER_URL,
    backend=Config.CELERY_RESULT_BACKEND,
)
celery.conf.update(
    broker_url=Config.CELERY_BROKER_URL,
    result_backend=Config.CELERY_RESULT_BACKEND,
)
```

The `celery` object is referenced for background task processing.
Docker Compose Entrypoints
- Flask:
  - Service: `web`
  - Entrypoint: `flask run --host=0.0.0.0 --reload`
- Celery:
  - Service: `worker`
  - Entrypoint: `celery -A app.celery worker --loglevel=info --concurrency=1`
Required Environment Variables
Both services use the following variables (see .env):
`FLASK_APP`, `FLASK_ENV`, `SECRET_KEY`, `POSTGRES_DB`, `POSTGRES_USER`, `POSTGRES_PASSWORD`, `POSTGRES_HOST`, `POSTGRES_PORT`, `REDIS_URL`, `CELERY_BROKER_URL`, `CELERY_RESULT_BACKEND`
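A minimal sketch of how `app/config.py` might expose these variables. The defaults and the derived `DATABASE_URL` are assumptions for illustration, not the repo's actual code:

```python
import os

class Config:
    """Hypothetical sketch of app/config.py; names match the .env variables above."""
    FLASK_APP = os.getenv("FLASK_APP", "app")
    FLASK_ENV = os.getenv("FLASK_ENV", "development")
    SECRET_KEY = os.getenv("SECRET_KEY", "change-me")
    POSTGRES_DB = os.getenv("POSTGRES_DB", "mydb")
    POSTGRES_USER = os.getenv("POSTGRES_USER", "myuser")
    POSTGRES_PASSWORD = os.getenv("POSTGRES_PASSWORD", "")
    POSTGRES_HOST = os.getenv("POSTGRES_HOST", "db")
    POSTGRES_PORT = os.getenv("POSTGRES_PORT", "5432")
    REDIS_URL = os.getenv("REDIS_URL", "redis://redis:6379/0")
    CELERY_BROKER_URL = os.getenv("CELERY_BROKER_URL", REDIS_URL)
    CELERY_RESULT_BACKEND = os.getenv("CELERY_RESULT_BACKEND", REDIS_URL)

    # Assumed helper: one connection URL assembled from the pieces above.
    DATABASE_URL = (
        f"postgresql://{POSTGRES_USER}:{POSTGRES_PASSWORD}"
        f"@{POSTGRES_HOST}:{POSTGRES_PORT}/{POSTGRES_DB}"
    )
```

Centralizing the variables in one `Config` class is what lets both the Flask app and the Celery worker import identical settings.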
Running with Docker Compose
Build and start all services (Flask, Celery worker, PostgreSQL, Redis):
```shell
docker-compose up --build
```

To run only the Flask app or Celery worker:

```shell
docker-compose run --rm web
docker-compose run --rm worker
```

Setup
Prerequisites
- Docker & Docker Compose
Clone the Repository
```shell
git clone <repo-url>
cd my_example_celery_app
```

Environment Variables
Edit .env as needed:
`FLASK_APP`, `FLASK_ENV`, `SECRET_KEY`, `POSTGRES_DB`, `POSTGRES_USER`, `POSTGRES_PASSWORD`, `POSTGRES_HOST`, `POSTGRES_PORT`, `REDIS_URL`, `CELERY_BROKER_URL`, `CELERY_RESULT_BACKEND`
Build and Start Services
```shell
docker-compose up --build
```

- Flask app: http://localhost:5000
- PostgreSQL: localhost:5432
- Redis: localhost:6379
Database Initialization
The app uses SQLAlchemy. All database configuration and session creation should be imported from app/config.py, which is the single source of truth for database connectivity.
To initialize the database tables:
1. Enter the web container:

   ```shell
   docker-compose exec web bash
   ```

2. Run a Python shell:

   ```shell
   python
   ```

3. Initialize the tables:

   ```python
   from app.models import Base
   from app.config import Config  # Config is the single source of truth for the engine
   Base.metadata.create_all(Config.engine)
   ```
Alternatively, run the sample data script (see below).
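The `Base` imported above comes from `app/models.py`. A hedged sketch of what such models might look like (table and column names here are assumptions, not the repo's actual schema):

```python
from sqlalchemy import Column, ForeignKey, Integer, String, create_engine
from sqlalchemy.orm import declarative_base

Base = declarative_base()

class Holder(Base):
    """A holder of books; the library itself is a Holder named "Library"."""
    __tablename__ = "holders"
    id = Column(Integer, primary_key=True)
    name = Column(String, nullable=False)

class Book(Base):
    """A book, optionally held (checked out) by a holder."""
    __tablename__ = "books"
    id = Column(Integer, primary_key=True)
    title = Column(String, nullable=False)
    holder_id = Column(Integer, ForeignKey("holders.id"))
```

`Base.metadata.create_all(...)` can then create these tables against any engine, which in the app is the shared `Config.engine`.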
Populating Sample Data
After initializing the database, populate it with example holders and books:
```shell
docker-compose exec web python app/sample_data.py
```

Running Flask and Celery
- Flask server: started automatically by Docker Compose.
- Celery worker: started automatically by Docker Compose.
API Usage
Holders
- `GET /holders`: List holders
- `POST /holders`: Create holder
- `GET /holders/<id>`: Get holder by ID
Books
- `GET /books`: List books
- `POST /books`: Create book
- `GET /books/<id>`: Get book by ID
Tasks
- `POST /checkout`: Check out a book (runs a Celery task)
- `POST /return`: Return a book (runs a Celery task)
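The checkout endpoint hands the state change to a Celery task. Stripped of Flask and the database, the transition that task enforces might look like this pure-Python sketch (function and field names are assumptions):

```python
def checkout_book(books, holders, book_id, holder_id):
    """Move a book from the "Library" holder to another holder.

    `books` and `holders` are plain dicts standing in for database rows
    keyed by id; the real task would do this with SQLAlchemy models
    inside a session, then commit.
    """
    book = books[book_id]
    current = holders.get(book["holder_id"])
    if current is not None and current["name"] != "Library":
        # Someone other than the library already holds it.
        raise ValueError("book is already checked out")
    book["holder_id"] = holder_id
    return book
```

Modeling the library as just another Holder keeps checkout and return symmetric: both are a change of `holder_id`.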
Testing
Run tests inside the web container:
```shell
docker-compose exec web pytest
```

Notes
- All database configuration and session creation should be imported from `app/config.py`. Do not initialize `DATABASE_URL`, `engine`, or `Session` locally; always import them from `app.config`.
- The library itself is represented as a Holder with the name "Library".
- See `app/sample_data.py` for example data population.
Running Tests with PostgreSQL
1. Ensure Docker and Docker Compose are running.

2. Create the test database (only once):

   ```shell
   docker compose exec db psql -U myuser -d mydb -c "CREATE DATABASE test_mydb OWNER myuser ENCODING 'UTF8';"
   ```

3. Run tests using Docker Compose:

   ```shell
   docker compose run web pytest
   ```
Tests will use the test_mydb PostgreSQL database, matching production configuration. No in-memory databases are used.