Background
Have you run into these challenges: traditional Django and Flask feeling inadequate for building microservices? API performance that can't keep up? Painful documentation maintenance?
Recently, while refactoring an old project, I faced these exact issues. The project was originally a monolithic application written in Flask. As the business grew, the code became increasingly difficult to maintain, and performance became a bottleneck. After research and practice, I found FastAPI to be an excellent choice. Today, I'd like to share my experiences using FastAPI to build microservices.
Features
FastAPI has "fast" in its name, but just how fast is it? Let's look at some data:
Under the same hardware conditions (8-core CPU, 16 GB RAM), stress testing a simple GET endpoint:

- Flask: 7,000 requests/second
- Django: 5,000 requests/second
- FastAPI: 14,000 requests/second
This performance improvement is remarkable. The main reasons behind this are FastAPI's features:
First is async support. Remember the async/await syntax introduced in Python 3.5? FastAPI is built on this asynchronous mechanism, which lets a single worker interleave many IO-bound requests instead of blocking on each one. (To be precise, async buys IO concurrency, not CPU parallelism; to use all your cores you still run multiple worker processes behind uvicorn or gunicorn.) I remember the first time I rewrote an IO-intensive endpoint from synchronous to asynchronous: its response time dropped from 500ms to under 100ms.
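To make that concrete, here's a small stdlib-only sketch (no FastAPI involved) of why async helps IO-bound work: three simulated 0.1-second network calls finish in roughly 0.1 seconds total when awaited concurrently, instead of 0.3 seconds back to back.

```python
import asyncio
import time

async def fetch(i: int) -> str:
    # Simulate an IO-bound call (a database query, an HTTP request, ...)
    await asyncio.sleep(0.1)
    return f"result-{i}"

async def main() -> list:
    start = time.perf_counter()
    # The three "calls" overlap instead of running sequentially
    results = await asyncio.gather(*(fetch(i) for i in range(3)))
    elapsed = time.perf_counter() - start
    print(f"{len(results)} calls in {elapsed:.2f}s")  # ~0.10s, not 0.30s
    return results

if __name__ == "__main__":
    asyncio.run(main())
```

The same principle is what lets an async FastAPI worker keep serving other requests while one request waits on the network.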
Second is type hints. Python 3.6+'s type annotations are fully utilized in FastAPI. For example, look at this code:
```python
from typing import Optional

from fastapi import FastAPI
from pydantic import BaseModel

class Item(BaseModel):
    name: str
    price: float
    is_offer: Optional[bool] = None  # `bool = None` is a type error; use Optional

app = FastAPI()

@app.post("/items/")
async def create_item(item: Item):
    return item
```
See how, through simple type declarations, FastAPI automatically:

- Validates request data types
- Converts data formats
- Generates API documentation
- Provides code completion
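FastAPI delegates all of this to Pydantic. Purely to illustrate the idea of declaration-driven validation, here's a toy stdlib sketch of the coercion step (the real library does far more: nested models, optional fields, detailed per-field error reports):

```python
from dataclasses import dataclass
from typing import get_type_hints

@dataclass
class Item:
    name: str
    price: float

def validate(payload: dict, model: type):
    """Coerce and validate a JSON-like dict against a model's type hints."""
    hints = get_type_hints(model)
    coerced = {}
    for field, expected in hints.items():
        if field not in payload:
            raise ValueError(f"missing field: {field}")
        try:
            coerced[field] = expected(payload[field])  # e.g. float("35.4")
        except (TypeError, ValueError):
            raise ValueError(f"{field} is not a valid {expected.__name__}")
    return model(**coerced)

item = validate({"name": "hammer", "price": "35.4"}, Item)
print(item)  # Item(name='hammer', price=35.4)
```

The point: once types live in declarations rather than ad-hoc checks, validation, conversion, and documentation can all be derived from one source of truth.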
This reminds me of using Flask before, where you either had to write lots of data validation code or nervously handle various runtime errors. Now with the protection of the type system, code robustness has improved significantly.
Practice
After all this theory, let's see how to use FastAPI to build microservices in real projects. I'll share some practical experience using an order system as an example.
First, the project structure:
```text
order_service/
├── app/
│   ├── api/
│   │   ├── __init__.py
│   │   ├── orders.py
│   │   └── items.py
│   ├── core/
│   │   ├── __init__.py
│   │   └── config.py
│   ├── db/
│   │   ├── __init__.py
│   │   └── session.py
│   ├── models/
│   │   ├── __init__.py
│   │   └── order.py
│   └── schemas/
│       ├── __init__.py
│       └── order.py
├── tests/
│   └── test_api.py
├── alembic/
│   └── versions/
├── Dockerfile
└── docker-compose.yml
```
Looks familiar, right? Indeed, it borrows from Django's project layout but is more suited for microservice architecture. We clearly separate different functional modules: api handles routing and controller logic, models defines data models, and schemas handles data validation and serialization.
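The core/config.py module in a layout like this typically centralizes settings read from the environment. The original doesn't show its contents, so here is one hedged, stdlib-only sketch (the setting names and defaults are illustrative; many real projects use Pydantic's settings support instead):

```python
import os
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Settings:
    # Names and defaults here are illustrative, not from the original project
    service_name: str = "order-service"
    database_url: str = field(
        default_factory=lambda: os.environ.get(
            "DATABASE_URL", "postgresql://localhost/orders"
        )
    )
    debug: bool = field(
        default_factory=lambda: os.environ.get("DEBUG", "0") == "1"
    )

settings = Settings()
```

Keeping configuration in one frozen object makes it easy to inject into tests and prevents scattered `os.environ` lookups across the codebase.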
Now let's look at the core order processing logic:
```python
from typing import List

from fastapi import APIRouter, Depends, HTTPException
from sqlalchemy.orm import Session

from app.db.session import get_db
from app.models.order import Order
from app.schemas.order import OrderCreate, OrderResponse

router = APIRouter()

@router.post("/orders/", response_model=OrderResponse)
async def create_order(
    order: OrderCreate,
    db: Session = Depends(get_db)
):
    # Check inventory
    if not await check_inventory(order.items):
        raise HTTPException(status_code=400, detail="Insufficient inventory")

    # Create the order
    db_order = Order(
        user_id=order.user_id,
        total_amount=order.total_amount,
        status="pending"
    )
    db.add(db_order)
    db.commit()
    db.refresh(db_order)

    # Send a notification asynchronously
    await notify_order_created(db_order.id)
    return db_order
```
This code demonstrates several important FastAPI features:
- Dependency injection: Handling database sessions elegantly through Depends
- Request validation: OrderCreate schema automatically validates request data
- Response model: response_model ensures return data format
- Async processing: async/await handles time-consuming operations
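The get_db dependency used above is conventionally a generator that yields a session and guarantees cleanup after the response. The original doesn't show it, so here's a sketch of the pattern with a stand-in session class (in the real project it would yield a SQLAlchemy session from a session factory):

```python
from typing import Iterator

class FakeSession:
    """Stand-in for a SQLAlchemy session, just to show the lifecycle."""
    def __init__(self) -> None:
        self.closed = False
    def close(self) -> None:
        self.closed = True

def get_db() -> Iterator[FakeSession]:
    # FastAPI runs the generator up to `yield`, injects the yielded value
    # into the handler, then resumes it after the response for cleanup.
    db = FakeSession()
    try:
        yield db
    finally:
        db.close()

# Roughly what FastAPI does behind the scenes:
gen = get_db()
session = next(gen)    # dependency resolved; handler runs with `session`
gen.close()            # request finished; the finally block runs
print(session.closed)  # True
```

This is why handlers never need explicit try/finally around the session: cleanup is owned by the dependency, not the endpoint.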
You might have noticed the check_inventory and notify_order_created async functions. In real projects, these are cross-service calls implemented using aiohttp:
```python
import aiohttp
from typing import List

async def check_inventory(items: List[dict]) -> bool:
    async with aiohttp.ClientSession() as session:
        async with session.post(
            "http://inventory-service/check",
            json={"items": items}
        ) as response:
            result = await response.json()
            return result["available"]

async def notify_order_created(order_id: int):
    async with aiohttp.ClientSession() as session:
        await session.post(
            "http://notification-service/notify",
            json={"order_id": order_id}
        )
```
Speaking of cross-service calls, we must mention service discovery and load balancing. In our project, we use Consul for service registration and discovery:
```python
from consul import Consul
from fastapi import FastAPI

app = FastAPI()
consul_client = Consul(host="consul", port=8500)

# get_host_ip() is a project helper that returns this host's IP address
consul_client.agent.service.register(
    "order-service",
    service_id=f"order-{get_host_ip()}",
    port=8000,
    tags=["api"]
)

@app.on_event("shutdown")
async def shutdown_event():
    # Deregister the service on shutdown
    consul_client.agent.service.deregister(
        service_id=f"order-{get_host_ip()}"
    )
```
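Once instances are registered, callers still need to pick one. The simplest client-side strategy is round-robin over the discovered instances; a minimal sketch (the addresses are hard-coded for illustration; in practice the list would come from a Consul health query and be refreshed periodically):

```python
import itertools

class RoundRobinBalancer:
    """Cycle through the healthy instances returned by service discovery."""
    def __init__(self, instances: list) -> None:
        self._cycle = itertools.cycle(instances)

    def next_url(self) -> str:
        return next(self._cycle)

balancer = RoundRobinBalancer([
    "http://10.0.0.1:8000",  # illustrative addresses
    "http://10.0.0.2:8000",
])
print(balancer.next_url())  # http://10.0.0.1:8000
print(balancer.next_url())  # http://10.0.0.2:8000
print(balancer.next_url())  # back to http://10.0.0.1:8000
```

Many teams skip client-side balancing entirely and let Consul's DNS interface or a sidecar proxy handle it; the sketch just shows what that layer is doing.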
To ensure service reliability, we also implemented circuit breaking and rate limiting mechanisms:
```python
import aiohttp
from fastapi import FastAPI
from slowapi import Limiter, _rate_limit_exceeded_handler
from slowapi.errors import RateLimitExceeded
from slowapi.util import get_remote_address
from circuitbreaker import circuit

app = FastAPI()
limiter = Limiter(key_func=get_remote_address)
app.state.limiter = limiter
# Without this handler, rate-limit violations surface as unhandled exceptions
app.add_exception_handler(RateLimitExceeded, _rate_limit_exceeded_handler)

@circuit(failure_threshold=5, recovery_timeout=60)
async def call_external_service():
    async with aiohttp.ClientSession() as session:
        async with session.get("http://external-service") as response:
            return await response.json()
```
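The @circuit decorator hides a small state machine. To make its semantics concrete, here is a minimal synchronous sketch of the same idea: the breaker opens after failure_threshold consecutive failures and fails fast until recovery_timeout elapses, at which point one trial call is allowed through.

```python
import time
from typing import Optional

class CircuitBreaker:
    """Minimal circuit breaker: closed -> open after N consecutive
    failures, half-open again once recovery_timeout has elapsed."""
    def __init__(self, failure_threshold: int = 5, recovery_timeout: float = 60.0):
        self.failure_threshold = failure_threshold
        self.recovery_timeout = recovery_timeout
        self.failures = 0
        self.opened_at: Optional[float] = None

    def call(self, func, *args, **kwargs):
        if self.opened_at is not None:
            if time.monotonic() - self.opened_at < self.recovery_timeout:
                raise RuntimeError("circuit open: failing fast")
            self.opened_at = None  # half-open: allow one trial call
        try:
            result = func(*args, **kwargs)
        except Exception:
            self.failures += 1
            if self.failures >= self.failure_threshold:
                self.opened_at = time.monotonic()
            raise
        self.failures = 0  # success resets the counter
        return result

breaker = CircuitBreaker(failure_threshold=2, recovery_timeout=60.0)

def boom():
    raise ConnectionError("downstream unavailable")

for _ in range(2):  # two real failures trip the breaker
    try:
        breaker.call(boom)
    except ConnectionError:
        pass

try:
    breaker.call(boom)  # now fails fast without calling boom() at all
except RuntimeError as e:
    print(e)  # circuit open: failing fast
```

The real circuitbreaker library adds per-exception filtering and decorator syntax on top of exactly this state machine.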
Monitoring is also an important part of microservice architecture. We use Prometheus and Grafana to collect and visualize metrics:
```python
from prometheus_client import Counter
from prometheus_fastapi_instrumentator import Instrumentator

# Expose default HTTP metrics (latency, request counts) at /metrics
Instrumentator().instrument(app).expose(app)

# A custom business metric, labeled by order status
order_counter = Counter(
    'order_total',
    'Total number of orders',
    ['status']
)

@app.post("/orders/")
async def create_order(order: OrderCreate):
    # ... Order processing logic ...
    order_counter.labels(status="success").inc()
```
Summary
After this period of practice, I believe FastAPI is indeed an ideal choice for building Python microservices. It not only provides excellent performance and development experience but also integrates well with various microservice ecosystem components.
Of course, choosing a technology stack is just a small part of microservice architecture. The real challenges are how to define service boundaries, handle distributed transactions, ensure data consistency, etc. We'll discuss these topics another time.
What do you think about FastAPI? What experiences have you had using it? Feel free to share your thoughts in the comments.
Extended Thoughts
At this point, you might ask: is FastAPI suitable for all projects? The answer is no.
If your project is a traditional content management system with lots of backend management features, Django might be a better choice. Django's admin interface, form handling, authentication, and authorization features are ready to use and can greatly improve development efficiency.
If your project is small and doesn't need so many features, Flask is perfectly adequate. Flask's simplicity and directness are actually its advantages.
There are no absolute right or wrong choices in technology selection; the key is to decide based on actual project requirements. As the Chinese saying goes, "To do a good job, one must first sharpen one's tools." Choosing the right tools lets you achieve twice the result with half the effort.
So here's the question: how do you choose your technology stack in your projects? Feel free to share your experience.