Chapter 52: Environment Management and Project Structure
Two things trip up almost every beginner Python developer:
- "Why does my code work on my machine but not on my teammate's?"
- "I installed something and now everything is broken."
Both problems come from the same root cause: mixing packages across projects. The solution is virtual environments — isolated, per-project package environments that share your base Python interpreter.
This chapter covers virtual environments in depth, the modern tools that make them easier, how to handle secrets safely, and how to structure a professional Python project.
The Problem — Why You Need Virtual Environments
Python packages are installed globally by default. This causes conflicts:
Project A needs: requests==2.28
Project B needs: requests==2.32
Python global: requests==2.28 <- Project B breaks
Virtual environments solve this by giving each project its own isolated copy of Python and its packages.
venv — Python's Built-in Solution
venv ships with Python. No installation needed.
# Create a virtual environment
python -m venv .venv
# Activate it
# Windows (PowerShell):
.\.venv\Scripts\Activate.ps1
# Windows (Command Prompt):
.\.venv\Scripts\activate.bat
# Mac / Linux:
source .venv/bin/activate
# Your prompt changes to show the active environment:
# (.venv) user@machine:~/project$
# Install packages — goes into .venv, not global Python
pip install requests pandas fastapi
# See what's installed
pip list
pip freeze # with version pins (for requirements.txt)
# Deactivate when done
deactivate
Always add .venv (or venv, env) to your .gitignore. Never commit the virtual environment folder.
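If you're ever unsure which environment is active, you can ask Python itself: inside a venv, sys.prefix points at the venv directory, while sys.base_prefix still points at the base interpreter. A quick check:

```python
import sys

# sys.prefix: root of the active environment.
# sys.base_prefix: the base interpreter the venv was created from.
# Outside any virtual environment, the two are identical.
in_venv = sys.prefix != sys.base_prefix

print("interpreter:", sys.executable)
print("active venv" if in_venv else "global Python")
```

This is handy when debugging "it installed into the wrong place" problems.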
Creating requirements.txt
pip freeze > requirements.txt
This captures the exact versions of every installed package:
fastapi==0.110.0
pydantic==2.6.4
requests==2.31.0
uvicorn==0.28.0
Someone else reproduces your environment with:
python -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt
The problem with pip freeze
pip freeze includes indirect dependencies (packages that your packages depend on). This is fine for deployment but noisy for development.
The cleaner approach: separate your direct dependencies (what you need) from indirect ones (what they need):
requirements.in <- only what YOU need
requirements.txt <- complete frozen list (generated)
pip-tools — Clean Dependency Management
pip install pip-tools
requirements.in — your direct dependencies:
fastapi
uvicorn[standard]
pydantic-settings
sqlalchemy>=2.0
Generate a pinned requirements.txt:
pip-compile requirements.in
This produces a fully pinned requirements.txt, with comments showing where each pin comes from (pass --generate-hashes if you also want package hashes):
# This file was autogenerated by pip-compile
fastapi==0.110.0
# via -r requirements.in
uvicorn==0.28.0
# via -r requirements.in
...
Install exactly these versions:
pip-sync requirements.txt
pip-sync also removes packages that aren't in requirements.txt — your environment stays clean.
Keep both files in Git. Regenerate requirements.txt when you add a dependency:
pip-compile requirements.in
pip-sync requirements.txt
uv — The Modern Fast Alternative
uv is a next-generation Python package manager written in Rust. It's 10-100x faster than pip and handles virtual environments, dependency resolution, and more.
# Install uv (replaces pip in new projects)
pip install uv
# Create a project (creates pyproject.toml + .venv automatically)
uv init my-project
cd my-project
# Add dependencies
uv add fastapi uvicorn pydantic-settings
# Add dev dependencies
uv add --dev pytest ruff mypy
# Install everything
uv sync
# Run a command inside the environment
uv run python main.py
uv run pytest
# Remove a dependency
uv remove requests
# Update the lockfile to the latest compatible versions (then uv sync to apply)
uv lock --upgrade
uv stores dependencies in pyproject.toml and locks them in uv.lock. Both go in Git.
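For reference, the pyproject.toml that uv maintains for the commands above might look roughly like this (version numbers are illustrative, not pinned recommendations):

```toml
[project]
name = "my-project"
version = "0.1.0"
requires-python = ">=3.12"
dependencies = [
    "fastapi>=0.110.0",
    "uvicorn>=0.28.0",
    "pydantic-settings>=2.2.0",
]

# Dev-only tools, installed by `uv sync` but excluded from production installs
[dependency-groups]
dev = [
    "pytest>=8.0.0",
    "ruff>=0.3.0",
    "mypy>=1.9.0",
]
```

The exact pinned versions live in uv.lock; pyproject.toml only records your intent.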
For new projects in 2026, uv is the recommended choice. For existing projects with requirements.txt, stick with pip or pip-tools.
poetry — Dependencies + Packaging in One
Poetry is another popular tool that combines dependency management and packaging:
pip install poetry
poetry new my-project # scaffold a new project
cd my-project
poetry add fastapi # add to [tool.poetry.dependencies]
poetry add --group dev pytest # add to dev dependencies (--dev is the deprecated spelling)
poetry install # install all dependencies
poetry run python main.py # run inside the environment
poetry shell # activate the environment (a separate plugin in Poetry 2.x)
poetry update # update to latest compatible versions
poetry show # list installed packages
poetry build # build a wheel for publishing
poetry publish # publish to PyPI
Poetry is opinionated and self-contained. If you're building a library to publish on PyPI, Poetry is a solid choice.
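The pyproject.toml sections that Poetry manages look roughly like this sketch (names and versions are illustrative):

```toml
[tool.poetry]
name = "my-project"
version = "0.1.0"
description = ""
authors = ["Your Name <you@example.com>"]

[tool.poetry.dependencies]
python = "^3.12"
fastapi = "^0.110.0"

[tool.poetry.group.dev.dependencies]
pytest = "^8.0.0"

[build-system]
requires = ["poetry-core"]
build-backend = "poetry.core.masonry.api"
```

Exact resolved versions go into poetry.lock, which also belongs in Git.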
Choosing a Tool
| Need | Recommended tool |
|---|---|
| Quick personal project | venv + pip |
| Team project, simple deps | venv + pip-tools |
| New project, want speed | uv |
| Library for PyPI | uv or Poetry |
| Complex monorepo | uv workspaces |
Professional Project Structure
This is the layout I recommend for any non-trivial Python project:
my-project/
├── src/
│ └── myapp/
│ ├── __init__.py
│ ├── main.py
│ ├── config.py
│ ├── models/
│ │ ├── __init__.py
│ │ └── user.py
│ ├── services/
│ │ ├── __init__.py
│ │ └── auth.py
│ └── utils/
│ ├── __init__.py
│ └── helpers.py
├── tests/
│ ├── conftest.py
│ ├── unit/
│ │ ├── test_models.py
│ │ └── test_utils.py
│ └── integration/
│ └── test_api.py
├── docs/
│ ├── index.md
│ └── api.md
├── scripts/
│ ├── seed_db.py
│ └── generate_report.py
├── .github/
│ └── workflows/
│ └── ci.yml
├── .env.example <- template for environment variables
├── .env <- your actual secrets (in .gitignore)
├── .gitignore
├── pyproject.toml
├── README.md
└── LICENSE
Key principles:
- src/ layout — prevents accidental imports of your code during testing
- Separate tests/ — outside src/, with unit/ and integration/ subdirectories
- scripts/ — one-off utility scripts (seed database, generate reports)
- docs/ — documentation lives close to the code
- .env.example — checked into Git; shows what variables are needed without exposing values
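With the src/ layout, the build backend needs to be told where the package lives. A minimal pyproject.toml sketch using setuptools (project name is a placeholder):

```toml
[build-system]
requires = ["setuptools>=68"]
build-backend = "setuptools.build_meta"

[project]
name = "myapp"
version = "0.1.0"
requires-python = ">=3.11"

# Look for importable packages under src/ instead of the project root
[tool.setuptools.packages.find]
where = ["src"]
```

After pip install -e ., "import myapp" works everywhere — including in tests — without sys.path hacks.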
Environment Variables and Secrets
Never hardcode credentials, API keys, database URLs, or any secret in your code. Not even in comments.
.env files
pip install python-dotenv
.env — your actual secrets (never commit this):
DATABASE_URL=postgresql://user:password@localhost/mydb
SECRET_KEY=super-secret-key-here
DEBUG=false
SMTP_HOST=smtp.gmail.com
SMTP_PORT=587
SMTP_USER=myapp@gmail.com
SMTP_PASSWORD=app-password-here
STRIPE_KEY=sk_live_abc123
.env.example — the template (commit this):
DATABASE_URL=postgresql://user:password@localhost/mydb
SECRET_KEY=change-this-to-a-random-string
DEBUG=false
SMTP_HOST=smtp.gmail.com
SMTP_PORT=587
SMTP_USER=
SMTP_PASSWORD=
STRIPE_KEY=
Load them in Python:
from dotenv import load_dotenv
import os
load_dotenv() # loads .env into os.environ
db_url = os.environ["DATABASE_URL"] # raises KeyError if missing
debug = os.getenv("DEBUG", "false") == "true" # with default
secret_key = os.environ.get("SECRET_KEY", "")
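If you want to fail fast when required variables are missing — rather than hitting a KeyError deep inside your app — a small stdlib-only check is enough. The variable names here are illustrative:

```python
import os

def missing_env(names: list[str]) -> list[str]:
    """Return the required variable names that are missing or empty."""
    return [name for name in names if not os.environ.get(name)]

# Demo: set one of two required variables, then check.
os.environ["DATABASE_URL"] = "postgresql://user:password@localhost/mydb"
os.environ.pop("SECRET_KEY", None)

print(missing_env(["DATABASE_URL", "SECRET_KEY"]))  # ['SECRET_KEY']
```

At startup you would raise SystemExit with one clear message listing everything in that result, instead of crashing on the first missing variable.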
pydantic-settings — Validated Configuration
For serious projects, use pydantic-settings. It validates environment variables against a schema and provides autocomplete:
# config.py
from functools import lru_cache
from pydantic_settings import BaseSettings, SettingsConfigDict
class Settings(BaseSettings):
# App
app_name: str = "My App"
debug: bool = False
log_level: str = "INFO"
# Database
database_url: str
# Auth
secret_key: str
token_expiry_hours: int = 24
# Email
smtp_host: str = "smtp.gmail.com"
smtp_port: int = 587
smtp_user: str = ""
smtp_password: str = ""
model_config = SettingsConfigDict(
env_file=".env",
env_file_encoding="utf-8",
case_sensitive=False,
)
@lru_cache
def get_settings() -> Settings:
"""Return a cached Settings instance."""
return Settings()
Use it anywhere in your app:
from config import get_settings
settings = get_settings()
print(settings.database_url) # type: str — validated and typed
print(settings.debug) # type: bool — "false" -> False automatically
If DATABASE_URL or SECRET_KEY is missing from the environment, pydantic raises a clear error at startup — not buried in a cryptic crash later.
Never Do This
# BAD — hardcoded secret
API_KEY = "sk-abc123xyz"
db = connect("postgresql://admin:password123@prod-db.example.com/mydb")
# BAD — secret in a comment
# password: hunter2
# BAD — secret in a test
def test_something():
token = "eyJhbGci..." # a real JWT from production
Production Secrets
For production, use your hosting platform's secret management:
- Render / Railway / Heroku -> Environment variables in the dashboard
- AWS -> AWS Secrets Manager or Parameter Store
- GCP -> Secret Manager
- GitHub Actions -> Repository secrets (Settings -> Secrets -> Actions)
# .github/workflows/ci.yml — access secrets in CI
env:
DATABASE_URL: ${{ secrets.DATABASE_URL }}
SECRET_KEY: ${{ secrets.SECRET_KEY }}
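A minimal complete workflow around that env block might look like this sketch (step layout and action versions are assumptions, adjust to your project):

```yaml
# .github/workflows/ci.yml
name: CI
on: [push, pull_request]

jobs:
  test:
    runs-on: ubuntu-latest
    env:
      DATABASE_URL: ${{ secrets.DATABASE_URL }}
      SECRET_KEY: ${{ secrets.SECRET_KEY }}
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.12"
      - run: pip install -r requirements.txt
      - run: ruff check src/ tests/
      - run: pytest tests/
```

The secrets never appear in logs; GitHub masks them automatically.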
Managing Multiple Python Versions
Different projects need different Python versions. Use pyenv to manage them:
# Install pyenv (Mac/Linux)
curl https://pyenv.run | bash
# List available versions
pyenv install --list | grep "3\."
# Install specific versions
pyenv install 3.11.9
pyenv install 3.12.3
# Set global default
pyenv global 3.12.3
# Set version for a specific project
cd my-project
pyenv local 3.11.9 # creates .python-version file
python --version # 3.11.9
# List installed versions
pyenv versions
On Windows, use the Python Launcher:
py -3.11 -m venv .venv # create venv with Python 3.11
py -3.12 -m venv .venv # create venv with Python 3.12
A Complete Project Setup Script
Put this in a Makefile or justfile to automate setup:
# Makefile
.PHONY: setup test lint typecheck clean
setup:
python -m venv .venv
.venv/bin/pip install -e ".[dev]"
cp -n .env.example .env || true
test:
.venv/bin/pytest tests/ -v --cov=src --cov-report=term-missing
lint:
.venv/bin/ruff check src/ tests/
.venv/bin/ruff format --check src/ tests/
typecheck:
.venv/bin/mypy src/
clean:
rm -rf .venv __pycache__ .pytest_cache .mypy_cache dist/ build/
find . -name "*.pyc" -delete
Run everything with:
make setup # create venv, install deps
make test # run tests with coverage
make lint # check code style
make typecheck # run mypy
The Checklist for Every New Python Project
□ Create a virtual environment (.venv)
□ Add .venv to .gitignore
□ Create pyproject.toml (or requirements.in + requirements.txt)
□ Add .env.example with all required variable names
□ Add .env to .gitignore
□ Set up ruff for linting (pyproject.toml [tool.ruff])
□ Set up mypy for type checking (pyproject.toml [tool.mypy])
□ Set up pytest (pyproject.toml [tool.pytest.ini_options])
□ Create .github/workflows/ci.yml
□ Create README.md with: what it does, how to install, how to run
□ git init + first commit
What You Learned in This Chapter
- Virtual environments isolate packages per project. python -m venv .venv creates one. Always add .venv to .gitignore.
- pip freeze > requirements.txt pins all packages. pip install -r requirements.txt reproduces the environment.
- pip-tools separates your direct deps (requirements.in) from the full pinned lock (requirements.txt). pip-compile generates, pip-sync installs.
- uv is the modern, fast alternative: uv add, uv sync, uv run. Ideal for new projects.
- Poetry combines dependency management and packaging in one tool.
- Professional projects use src/ layout, separate tests/ with unit/ and integration/, a scripts/ folder, and .env.example.
- Secrets live in .env (never committed). Load them with python-dotenv or pydantic-settings.
- pydantic-settings validates env vars against a schema and gives you typed, autocompleted config.
- pyenv manages multiple Python versions. pyenv local 3.11.9 pins a version to a project.
What's Next?
Chapter 53 covers Logging — replacing scattered print() statements with a structured logging system that tells you exactly what your application is doing in production, without wading through noise.