This guide covers deploying the Skincare Allergy Filter application to production environments.
How to install and set up your project:
Note: This project uses uv for dependency management, which provides fast, reliable installs with lockfile support.
git clone https://github.com/RJChoe/Skincare-Filter-Web-Application.git
cd Skincare-Filter-Web-Application
# macOS/Linux
curl -LsSf https://astral.sh/uv/install.sh | sh
# Windows (PowerShell)
powershell -ExecutionPolicy ByPass -c "irm https://astral.sh/uv/install.ps1 | iex"
# Or via pip
pip install uv
Windows users: If the installation script fails, ensure PowerShell execution policy allows scripts. Run as Administrator if needed, or check where.exe uv to verify the installation path. Alternatively, use WSL2 for a Unix-like environment.
Note: uv sync may prune undeclared tools (pip/uv) inside the venv; this is expected behavior.
uv python install 3.13
uv python pin 3.13
This creates/updates .python-version which CI uses to enforce consistency (see CI workflow).
Verify the pin:
# Windows
type .python-version
# macOS/Linux
cat .python-version
Output should show 3.13.
Note: If .python-version doesn’t exist, run uv python pin 3.13 to generate it. This file ensures CI and local development use the same Python version.
# Create venv
uv venv
# Activate on Windows (PowerShell)
.venv\Scripts\Activate.ps1
# Activate on Windows (CMD)
.venv\Scripts\activate.bat
# Activate on macOS/Linux
source .venv/bin/activate
Note: All commands in this README use uv run, which automatically uses the venv without manual activation. Activation is optional but shown for reference.
uv sync --group dev
uv run python manage.py makemigrations allergies users
uv run python manage.py migrate
uv run python manage.py runserver
Note: requirements.txt and requirements-dev.txt are not committed —
generate them on demand from uv.lock:
uv export --no-hashes --format requirements-txt -o requirements.txt
uv export --no-hashes --format requirements-txt --group dev -o requirements-dev.txt
uv.lock is the source of truth for all dependency resolution. The --no-hashes flag ensures cross-platform compatibility.
This project uses PEP 735 dependency groups for organized development dependencies:
- `test` - Testing tools (pytest, pytest-cov, coverage)
- `lint` - Code formatting and linting (ruff, pre-commit)
- `type-check` - Type checking tools (mypy, django-stubs)
- `security` - Security scanning (bandit, safety)
- `dev` - Full development environment (includes all groups above)

Adding dependencies:
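These groups live in `[dependency-groups]` tables in `pyproject.toml` (PEP 735). A hypothetical sketch of how they might look — illustrative entries, not the project's exact pins:

```toml
[dependency-groups]
test = ["pytest", "pytest-cov", "coverage"]
lint = ["ruff", "pre-commit"]
type-check = ["mypy", "django-stubs"]
security = ["bandit", "safety"]
# dev aggregates the other groups via PEP 735 group includes
dev = [
  { include-group = "test" },
  { include-group = "lint" },
  { include-group = "type-check" },
  { include-group = "security" },
]
```

The `include-group` form is what lets `uv sync --group dev` pull in every other group without duplicating pins.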
# Add a runtime dependency
uv add package-name
# Add a dependency to a specific group
uv add --group test pytest-mock
uv add --group lint pylint
uv add --group type-check types-requests
uv add --group security semgrep
Installing specific groups:
# Install only test dependencies
uv sync --group test
# Install multiple groups
uv sync --group test --group lint
# Install full dev environment
uv sync --group dev
Updating dependencies:
# Update all dependencies
uv lock --upgrade
Quick checks to confirm your environment:
# Check Python version (should be 3.13)
uv run python -V
# Check Django version (should be 6.0)
uv run python -c "import django; print(django.get_version())"
# Verify uv installation
uv --version
Before deploying to production, ensure:
- `uv run pytest` passes
- `uv run bandit -r allergies users skincare_project` passes
- `uv run safety scan --non-interactive` passes
- `.env` file configured with production values
- `DEBUG = False` in production environment
- `SECRET_KEY` generated
- `ALLOWED_HOSTS` configured correctly
- `SAFETY_API_KEY` GitHub secret configured (see CI/CD Secrets)

This project’s CI enforces consistency between local development and automated testing:
- `.python-version` matches the active interpreter (see workflow)
- `uv run` for consistent Python/tool resolution
- requirements.txt files stay in sync with uv.lock
- `safety scan` requires a `SAFETY_API_KEY` repository secret — without it the static-analysis job fails and blocks the merge. See CI/CD Secrets for setup.

Run the same checks locally:
# Verify Python version matches .python-version
uv run python --version
# Run pre-commit (same as CI lint job)
uv run pre-commit run --all-files
# Run tests (same as CI test job)
uv run pytest
# Validate lockfile sync
uv lock --check
Before committing: Ensure .python-version exists (uv python pin 3.13) and pre-commit hooks are installed (uv run pre-commit install). CI will fail if Python versions mismatch or requirements files drift from uv.lock.
The CI workflow uses GitHub Actions secrets for external service integrations. Unlike Variables, secrets are masked (***) in all log output, preventing accidental key exposure to anyone with read access to the repository.
| Secret | Required | Purpose |
|---|---|---|
| `SAFETY_API_KEY` | ✅ Yes | Authenticates `safety scan` for vulnerability checks |
| `CODECOV_TOKEN` | ⚠️ Recommended | Uploads coverage reports to Codecov |
`SAFETY_API_KEY`: Register or log in at safety.pyup.io, then navigate to Account Settings → API Keys → New API Key.

| State | CI behaviour |
|---|---|
| Secret set and valid | ✅ Scan runs authenticated; results appear in step summary |
| Secret missing or expired | ❌ Annotation step exits with code 1; static-analysis job fails; merge blocked via branch protection |
Automate testing and coverage reporting on pull requests to maintain code quality.
Note: The example below shows a simplified workflow for learning purposes. This project’s actual CI workflow is .github/workflows/ci.yml, which includes 5 specialized jobs (dependencies, lint, test, type-check, security) for comprehensive quality enforcement.
First, create the workflow directory (if it doesn’t exist):
# Windows: New-Item -ItemType Directory -Force -Path .github\workflows
mkdir -p .github/workflows
Example simplified workflow (.github/workflows/test.yml):
name: Tests

on:
  pull_request:
    branches:
      - main
      - develop

jobs:
  build:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        python-version: ['3.13']

    steps:
      - uses: actions/checkout@v5

      # uv must be installed before `uv sync` can run
      - name: Install uv
        uses: astral-sh/setup-uv@v5

      - name: Set up Python ${{ matrix.python-version }}
        uses: actions/setup-python@v5
        with:
          python-version: ${{ matrix.python-version }}

      - name: Install dependencies
        run: |
          uv sync --group test

      - name: Run migrations
        run: |
          uv run python manage.py migrate

      - name: Run tests with coverage
        run: |
          uv run pytest

      - name: Upload coverage to Codecov
        uses: codecov/codecov-action@v5
        with:
          token: ${{ secrets.CODECOV_TOKEN }}
          file: ./coverage.xml
          flags: unittests
          name: codecov-umbrella
          fail_ci_if_error: true
This workflow:

- Runs on pull requests targeting the `main` and `develop` branches
- Tests against Python 3.13 (matching `pyproject.toml`)
- Installs dependencies, runs migrations and tests, and uploads coverage to Codecov

Enforce quality standards by requiring all checks to pass before merging.
Setup in GitHub repository settings:
- Add a branch protection rule for `main` (repeat for `develop`)
- Require the `ci-success` status check (from GitHub Actions workflow — aggregates all jobs)
- Require the `codecov/project` status check (from Codecov integration)

Admin Override Process: If emergency merges are needed despite failed checks, repository admins can override protection. This requires:
This setup ensures coverage drops and test failures block merges, maintaining code quality standards.
Track and visualize coverage trends across commits and pull requests.
Step-by-step setup:
- Sign in to Codecov with GitHub and select `RJChoe/Skincare-Filter-Web-Application` from your repository list
- Copy the upload token and add it to the repository secrets as `CODECOV_TOKEN`

Configure coverage thresholds (optional):
Create .codecov.yml in project root:
coverage:
  range: 75..100          # Coverage color coding (red at 75%, green at 100%)
  status:
    project:
      default:
        target: 75%       # Target coverage percentage
        threshold: 5%     # Allow coverage to drop 5% before failing
    patch:
      default:
        target: 75%       # New code should have 75% coverage

comment:
  layout: "header, diff, files"
  behavior: default

ignore:
  - "*/migrations/*"
  - "*/tests/*"
This configuration targets 75% coverage for both the whole project and newly added code, tolerates a 5% drop before failing, and excludes migrations and tests from coverage.
For more advanced configuration, see the Codecov YAML reference documentation.
uv add django-environ
Create a .env file on your production server (never commit this):
# Production .env
DEBUG=False
SECRET_KEY=your-production-secret-key-here
ALLOWED_HOSTS=yourdomain.com,www.yourdomain.com
DATABASE_URL=postgres://username:password@hostname:5432/database_name
# Optional: Security settings
SECURE_SSL_REDIRECT=True
SESSION_COOKIE_SECURE=True
CSRF_COOKIE_SECURE=True
uv run python -c "from django.core.management.utils import get_random_secret_key; print(get_random_secret_key())"
Important: Use a different SECRET_KEY for each environment (development, staging, production).
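If Django isn't installed on the machine where you generate the key, the standard library produces a comparably strong random string. A sketch (the Django helper shown above remains the canonical option):

```python
import secrets

# 50 random bytes, URL-safe base64 encoded (~67 characters)
key = secrets.token_urlsafe(50)
print(key)
```

Store the result in the production `.env` as `SECRET_KEY`; never reuse it across environments.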
Configure based on your hosting provider:
# Heroku
ALLOWED_HOSTS=yourapp.herokuapp.com
# DigitalOcean
ALLOWED_HOSTS=your-droplet-ip,yourdomain.com,www.yourdomain.com
# AWS EC2
ALLOWED_HOSTS=ec2-xx-xxx-xxx-xx.compute-1.amazonaws.com,yourdomain.com
# Render
ALLOWED_HOSTS=yourapp.onrender.com,yourdomain.com
# Multiple domains
ALLOWED_HOSTS=example.com,www.example.com,api.example.com
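Django expects `ALLOWED_HOSTS` to be a Python list, so the comma-separated `.env` value must be split when read from the environment. django-environ's `env.list()` does this for you; the sketch below shows the equivalent stdlib parsing (the example value is hypothetical):

```python
import os

# Hypothetical value, as it would appear in .env
os.environ["ALLOWED_HOSTS"] = "example.com,www.example.com,api.example.com"

# Split on commas, trim whitespace, drop empty entries
ALLOWED_HOSTS = [
    h.strip()
    for h in os.environ.get("ALLOWED_HOSTS", "").split(",")
    if h.strip()
]
print(ALLOWED_HOSTS)  # → ['example.com', 'www.example.com', 'api.example.com']
```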
# Ubuntu/Debian
sudo apt update
sudo apt install postgresql postgresql-contrib
# macOS
brew install postgresql
# Windows
# Download from https://www.postgresql.org/download/windows/
# Connect to PostgreSQL
sudo -u postgres psql
# Create database
CREATE DATABASE skincare_filter_db;
# Create user with password
CREATE USER skincare_user WITH PASSWORD 'secure_password_here';
# Grant privileges
GRANT ALL PRIVILEGES ON DATABASE skincare_filter_db TO skincare_user;
# Exit
\q
# .env
DATABASE_URL=postgres://skincare_user:secure_password_here@localhost:5432/skincare_filter_db
# With connection pooling (recommended)
DATABASE_URL=postgres://skincare_user:secure_password_here@localhost:5432/skincare_filter_db?conn_max_age=600
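`DATABASE_URL` is just a URL that libraries such as django-environ or dj-database-url decompose into Django's `DATABASES` dict. A stdlib sketch of that decomposition (illustrative credentials only, not a replacement for those libraries):

```python
from urllib.parse import urlsplit

# Hypothetical connection string matching the .env example above
url = urlsplit(
    "postgres://skincare_user:secure_password_here@localhost:5432/skincare_filter_db"
)

DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.postgresql",
        "NAME": url.path.lstrip("/"),   # path component minus leading slash
        "USER": url.username,
        "PASSWORD": url.password,
        "HOST": url.hostname,
        "PORT": url.port,
    }
}
print(DATABASES["default"]["NAME"])  # → skincare_filter_db
```

Note that real parsers also handle percent-encoded passwords and query parameters like `conn_max_age`, which this sketch ignores.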
uv add psycopg2-binary
uv run python manage.py migrate
Django should not serve static files in production. Use a web server or CDN.
Update settings.py:
# settings.py
STATIC_ROOT = BASE_DIR / 'staticfiles'
STATIC_URL = '/static/'
uv run python manage.py collectstatic --noinput
# /etc/nginx/sites-available/skincare_filter
server {
    listen 80;
    server_name yourdomain.com www.yourdomain.com;

    location /static/ {
        alias /path/to/your/project/staticfiles/;
    }

    location / {
        proxy_pass http://127.0.0.1:8000;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
    }
}
uv add whitenoise
Update settings.py:
MIDDLEWARE = [
    'django.middleware.security.SecurityMiddleware',
    'whitenoise.middleware.WhiteNoiseMiddleware',  # Add this, directly after SecurityMiddleware
    # ... other middleware
]

# Django 4.2+ configures static file storage via STORAGES;
# the old STATICFILES_STORAGE setting was removed in Django 5.1:
STORAGES = {
    "default": {
        "BACKEND": "django.core.files.storage.FileSystemStorage",
    },
    "staticfiles": {
        "BACKEND": "whitenoise.storage.CompressedManifestStaticFilesStorage",
    },
}
Django’s development server is not suitable for production. Use Gunicorn or uWSGI.
uv add gunicorn
uv run gunicorn skincare_project.wsgi:application --bind 0.0.0.0:8000
Create gunicorn_config.py:
# gunicorn_config.py
bind = "0.0.0.0:8000"
workers = 3 # (2 x $num_cores) + 1
worker_class = "sync"
worker_connections = 1000
max_requests = 1000
max_requests_jitter = 50
timeout = 30
keepalive = 2
errorlog = "-"
accesslog = "-"
loglevel = "info"
uv run gunicorn skincare_project.wsgi:application -c gunicorn_config.py
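The `workers = 3` value above hard-codes the `(2 × cores) + 1` rule of thumb for a single-core machine. Since `gunicorn_config.py` is plain Python, the count can be computed at startup instead — a sketch, assuming CPU count is the right thing to scale on for your workload:

```python
# gunicorn_config.py (dynamic worker count)
import multiprocessing

# Gunicorn's suggested formula: (2 x num_cores) + 1
workers = multiprocessing.cpu_count() * 2 + 1
```

For memory-bound applications you may want fewer workers than the formula suggests; measure under load before settling on a number.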
Create /etc/systemd/system/skincare_filter.service:
[Unit]
Description=Skincare Filter Gunicorn Daemon
After=network.target
[Service]
User=www-data
Group=www-data
WorkingDirectory=/path/to/your/project
Environment="PATH=/path/to/your/project/.venv/bin"
ExecStart=/path/to/your/project/.venv/bin/gunicorn \
skincare_project.wsgi:application \
-c gunicorn_config.py
[Install]
WantedBy=multi-user.target
Enable and start:
sudo systemctl enable skincare_filter
sudo systemctl start skincare_filter
sudo systemctl status skincare_filter
uv add uwsgi
Create uwsgi.ini:
[uwsgi]
module = skincare_project.wsgi:application
master = true
processes = 4
socket = /tmp/skincare_filter.sock
chmod-socket = 666
vacuum = true
die-on-term = true
uv run uwsgi --ini uwsgi.ini
# macOS
brew tap heroku/brew && brew install heroku
# Windows
# Download from https://devcenter.heroku.com/articles/heroku-cli
heroku create your-app-name
heroku addons:create heroku-postgresql:mini
heroku config:set DEBUG=False
heroku config:set SECRET_KEY="your-secret-key"
heroku config:set ALLOWED_HOSTS=your-app-name.herokuapp.com
Procfile:

web: gunicorn skincare_project.wsgi --log-file -
git push heroku main
heroku run python manage.py migrate
heroku run python manage.py collectstatic --noinput
app.yaml:

name: skincare-filter
services:
  - name: web
    github:
      repo: RJChoe/Skincare-Filter-Web-Application
      branch: main
    build_command: uv sync && uv run python manage.py collectstatic --noinput
    run_command: uv run gunicorn skincare_project.wsgi:application
    envs:
      - key: DEBUG
        value: "False"
      - key: SECRET_KEY
        type: SECRET
      - key: ALLOWED_HOSTS
        value: "${APP_DOMAIN}"
databases:
  - name: skincare-db
    engine: PG
    version: "14"
Deploy via the DigitalOcean dashboard or the doctl CLI.

Launch EC2 Instance (Ubuntu 22.04 LTS recommended)
SSH into Instance:
ssh -i your-key.pem ubuntu@your-ec2-ip
sudo apt update
sudo apt install python3.13 python3.13-venv postgresql nginx
curl -LsSf https://astral.sh/uv/install.sh | sh
git clone https://github.com/RJChoe/Skincare-Filter-Web-Application.git
cd Skincare-Filter-Web-Application
Set Up Application (follow environment configuration above)
Configure Nginx (see Static Files section)
Set Up SSL with Let’s Encrypt:
sudo apt install certbot python3-certbot-nginx
sudo certbot --nginx -d yourdomain.com -d www.yourdomain.com
render.yaml:

services:
  - type: web
    name: skincare-filter
    env: python
    buildCommand: "uv sync && uv run python manage.py collectstatic --noinput && uv run python manage.py migrate"
    startCommand: "uv run gunicorn skincare_project.wsgi:application"
    envVars:
      - key: DEBUG
        value: false
      - key: SECRET_KEY
        generateValue: true
      - key: PYTHON_VERSION
        value: 3.13
      - key: DATABASE_URL
        fromDatabase:
          name: skincare-db
          property: connectionString

databases:
  - name: skincare-db
    plan: starter
Connect GitHub Repository in Render Dashboard
Deploy (automatic on push to main)
# Check application is running
curl https://yourdomain.com
# Check admin access
curl https://yourdomain.com/admin/
# Check static files
curl https://yourdomain.com/static/css/style.css
uv run python manage.py createsuperuser
Consider using:
# PostgreSQL backup
pg_dump -U skincare_user skincare_filter_db > backup_$(date +%Y%m%d).sql
# Automated daily backups (cron)
0 2 * * * pg_dump -U skincare_user skincare_filter_db > /backups/backup_$(date +\%Y\%m\%d).sql
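Daily dumps accumulate quickly, so it helps to prune anything older than a retention window. A sketch using only the standard library (the `/backups` path and 14-day window are assumptions; adjust them to match your cron job):

```python
import time
from pathlib import Path


def prune_backups(backup_dir: Path, keep_days: int = 14) -> list[Path]:
    """Delete backup_*.sql files older than keep_days; return what was removed."""
    cutoff = time.time() - keep_days * 86400
    removed = []
    for dump in backup_dir.glob("backup_*.sql"):
        if dump.stat().st_mtime < cutoff:
            dump.unlink()
            removed.append(dump)
    return removed


# Example (hypothetical path): prune_backups(Path("/backups"), keep_days=14)
```

This can run from the same crontab as the dump itself, a few minutes after it.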
Update settings.py:
LOGGING = {
    'version': 1,
    'disable_existing_loggers': False,
    'handlers': {
        'file': {
            'level': 'ERROR',
            'class': 'logging.FileHandler',
            'filename': BASE_DIR / 'logs' / 'django.log',
        },
    },
    'loggers': {
        'django': {
            'handlers': ['file'],
            'level': 'ERROR',
            'propagate': True,
        },
    },
}
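`logging.FileHandler` does not create the `logs/` directory, so Django fails at startup if it's missing. One option (a sketch) is to create it in `settings.py` just before the `LOGGING` dict:

```python
from pathlib import Path

# In a real settings.py BASE_DIR is already defined;
# it is recomputed here only to keep the example self-contained.
BASE_DIR = Path(__file__).resolve().parent

# Ensure the log directory exists before LOGGING is evaluated
(BASE_DIR / "logs").mkdir(parents=True, exist_ok=True)
```

Alternatively, create the directory during deployment (e.g. in your build script) so settings stay side-effect free.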
# Verify STATIC_ROOT
uv run python manage.py findstatic style.css
# Recollect static files
uv run python manage.py collectstatic --clear --noinput
# Test database connection
uv run python manage.py dbshell
# Verify DATABASE_URL format
echo $DATABASE_URL
# Fix file permissions
sudo chown -R www-data:www-data /path/to/project
sudo chmod -R 755 /path/to/project
# Check logs
sudo journalctl -u skincare_filter -n 50
# Test WSGI application
uv run python manage.py check --deploy
Important: CI enforces .python-version matching (see workflow). Always run uv python pin 3.13 after cloning to ensure alignment.
This project uses a .python-version file to pin Python 3.13 for local development. uv automatically detects and uses this version.
# List all installed Python versions managed by uv
uv python list
# Example output:
# cpython-3.13.0-windows-x86_64-none <-- active
# cpython-3.12.1-windows-x86_64-none
The active version (marked with <--) should be 3.13.x to match .python-version.
If you don’t have Python 3.13 installed:
# Install python 3.13 (uv will download and manage it)
uv python install 3.13
# Verify installation
uv python list
If you need to explicitly pin or change the Python version:
# Pin to python 3.13 (updates .python-version)
uv python pin 3.13
# Verify the pin
cat .python-version # Should output: 3.13
If your virtual environment is using the wrong Python version:
# Remove existing venv
Remove-Item -Recurse -Force .venv # Windows PowerShell
rm -rf .venv # macOS/Linux
# Create new venv with correct Python version
uv venv --python 3.13
# Reinstall dependencies
uv sync --group dev
Issue: python --version shows wrong version despite .python-version
Solution: The .python-version file is for uv’s Python management. To ensure consistency:
- Use `uv run python` instead of `python` directly
- Or activate the venv: `.venv\Scripts\Activate.ps1` (Windows) / `source .venv/bin/activate` (Linux/macOS)

Issue: Pre-commit hook fails with version mismatch
Solution: Your system Python differs from the pinned version. Either:
- Use `uv run` commands, which automatically use the correct version
- Or recreate the venv with `uv venv --python 3.13` and reinstall dependencies

As your application grows:
For deployment issues or questions, open an issue on GitHub.