From 30 Seconds to 200ms: Optimizing a Django Admin Dashboard
The sales team was frustrated. Every time they opened the customer dashboard, they had to wait over 30 seconds for it to load. Coffee breaks became synonymous with "waiting for the CRM to load." Something had to change.
When a mid-sized e-commerce company reached out to me, their Django Admin-based CRM had become unusable. What started as a quick internal tool for 50 customers had grown to manage 10,000+ customers, each with multiple orders, support tickets, and interactions. The admin panel that once loaded instantly now took 30+ seconds and sometimes timed out completely.
The Problem: Death by a Thousand Queries
The first step in any performance optimization is understanding what's actually happening. I opened Django Debug Toolbar and watched in horror as the page fired off 847 database queries for a simple list view.
The culprit? Classic N+1 query problems everywhere. For each customer in the list, Django was making separate queries to fetch:
- The customer's latest order
- Their total number of orders
- Their assigned sales representative
- Their account status
- Their support ticket count
Here's what the original ModelAdmin looked like:
```python
# admin.py - Before optimization
from django.contrib import admin

from .models import Customer


@admin.register(Customer)
class CustomerAdmin(admin.ModelAdmin):
    list_display = [
        'email',
        'get_latest_order',
        'get_order_count',
        'get_sales_rep',
        'get_ticket_count',
        'account_status',
    ]

    def get_latest_order(self, obj):
        latest = obj.orders.order_by('-created_at').first()
        return latest.order_number if latest else 'No orders'
    get_latest_order.short_description = 'Latest Order'

    def get_order_count(self, obj):
        return obj.orders.count()
    get_order_count.short_description = 'Orders'

    def get_sales_rep(self, obj):
        return obj.sales_rep.full_name if obj.sales_rep else 'Unassigned'
    get_sales_rep.short_description = 'Sales Rep'

    def get_ticket_count(self, obj):
        return obj.support_tickets.filter(status='open').count()
    get_ticket_count.short_description = 'Open Tickets'
```
Looks innocent enough, right? But when you have 100 customers displayed per page, that's 100 queries just for orders, 100 for order counts, 100 for sales reps, and 100 for ticket counts. Add in the base queries, and you've got a performance disaster.
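The query explosion is easy to reproduce outside Django. Here's a standalone sketch using stdlib sqlite3 with a toy schema (table and column names are illustrative, not the client's actual models) that counts the statements each approach issues:

```python
import sqlite3

# Toy schema standing in for the CRM models (names are illustrative).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customer (id INTEGER PRIMARY KEY, email TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER,
                         created_at TEXT);
""")
conn.executemany("INSERT INTO customer VALUES (?, ?)",
                 [(i, f"user{i}@example.com") for i in range(100)])
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                 [(i, i % 100, "2024-01-01") for i in range(500)])

# N+1 pattern: one query for the list, then one COUNT per customer.
queries = 0
customers = conn.execute("SELECT id FROM customer").fetchall()
queries += 1
for (cid,) in customers:
    conn.execute("SELECT COUNT(*) FROM orders WHERE customer_id = ?", (cid,))
    queries += 1
print(queries)  # 101 queries for 100 customers

# Annotated pattern: one query with a GROUP BY does the same work.
rows = conn.execute("""
    SELECT c.id, COUNT(o.id)
    FROM customer c LEFT JOIN orders o ON o.customer_id = c.id
    GROUP BY c.id
""").fetchall()
print(len(rows))  # 100 rows from a single query
```

Multiply that per-row pattern by four computed columns and you get the hundreds of queries Debug Toolbar reported.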
The Approach: Systematic Optimization
I followed a systematic approach to tackle this:
- Measure first - use Django Debug Toolbar and query logging
- Optimize the queryset - use `select_related()` and `prefetch_related()`
- Add database indexes - speed up frequent lookups
- Implement caching - cache expensive computed values
- Measure again - verify improvements
Step 1: Optimize the Queryset with get_queryset()
The key to solving N+1 queries is to override get_queryset() and tell Django to fetch related data upfront using select_related() and prefetch_related().
```python
# admin.py - After optimization
from django.contrib import admin
from django.db.models import Count, Prefetch, Q

from .models import Customer, Order


@admin.register(Customer)
class CustomerAdmin(admin.ModelAdmin):
    list_display = [
        'email',
        'get_latest_order',
        'order_count',
        'get_sales_rep',
        'open_ticket_count',
        'account_status',
    ]
    list_select_related = ['sales_rep']  # Join the sales_rep table

    def get_queryset(self, request):
        qs = super().get_queryset(request)
        # Use select_related for forward foreign keys (one-to-one or many-to-one)
        qs = qs.select_related('sales_rep')
        # Use prefetch_related for reverse foreign keys (one-to-many)
        # and annotations for counts.
        # Note: slicing a Prefetch queryset requires Django 4.2+.
        qs = qs.prefetch_related(
            Prefetch(
                'orders',
                queryset=Order.objects.order_by('-created_at')[:1],
                to_attr='latest_order_cached',
            ),
        ).annotate(
            order_count=Count('orders', distinct=True),
            open_ticket_count=Count(
                'support_tickets',
                filter=Q(support_tickets__status='open'),
                distinct=True,
            ),
        )
        return qs

    def get_latest_order(self, obj):
        # Access the prefetched list instead of hitting the database
        if hasattr(obj, 'latest_order_cached') and obj.latest_order_cached:
            return obj.latest_order_cached[0].order_number
        return 'No orders'
    get_latest_order.short_description = 'Latest Order'

    def get_sales_rep(self, obj):
        # No extra query - already loaded via select_related
        return obj.sales_rep.full_name if obj.sales_rep else 'Unassigned'
    get_sales_rep.short_description = 'Sales Rep'
    get_sales_rep.admin_order_field = 'sales_rep__full_name'

    # These read the annotated fields - no extra queries!
    def order_count(self, obj):
        return obj.order_count
    order_count.short_description = 'Orders'
    order_count.admin_order_field = 'order_count'

    def open_ticket_count(self, obj):
        return obj.open_ticket_count
    open_ticket_count.short_description = 'Open Tickets'
    open_ticket_count.admin_order_field = 'open_ticket_count'
```
Key techniques used:

- `select_related('sales_rep')` - performs a SQL JOIN to fetch the sales rep in the same query
- `annotate(order_count=Count('orders'))` - computes the count in the database, not in Python
- `Prefetch()` - fetches the related orders in a single extra query with custom filtering
- `to_attr='latest_order_cached'` - stores the prefetched data in a custom attribute
This reduced queries from 847 to 12 - a 98.6% reduction!
Step 2: Add Strategic Database Indexes
Even with optimized queries, the database needs indexes to find data quickly. I analyzed the queries being run and added indexes on frequently filtered and sorted columns.
```python
# models.py
from django.db import models

# Illustrative choices - the real project had its own status list
STATUS_CHOICES = [
    ('active', 'Active'),
    ('inactive', 'Inactive'),
    ('churned', 'Churned'),
]


class Customer(models.Model):
    email = models.EmailField(unique=True)  # unique=True already creates an index
    sales_rep = models.ForeignKey(
        'accounts.User',
        on_delete=models.SET_NULL,
        null=True,
        related_name='customers',
        # ForeignKey columns are indexed by default
    )
    account_status = models.CharField(
        max_length=20,
        choices=STATUS_CHOICES,
        db_index=True,  # Frequently filtered
    )
    created_at = models.DateTimeField(auto_now_add=True, db_index=True)

    class Meta:
        indexes = [
            # Composite indexes for common query patterns
            models.Index(fields=['account_status', '-created_at']),
            models.Index(fields=['sales_rep', 'account_status']),
        ]


class Order(models.Model):
    customer = models.ForeignKey(
        Customer,
        on_delete=models.CASCADE,
        related_name='orders',
    )
    order_number = models.CharField(max_length=20, unique=True)
    created_at = models.DateTimeField(auto_now_add=True, db_index=True)

    class Meta:
        indexes = [
            # Index for "latest order per customer" queries
            models.Index(fields=['customer', '-created_at']),
        ]
```
After adding these indexes, the query execution time dropped by another 40%.
Step 3: Implement Caching for Expensive Operations
Some computed values don't change frequently. I added Redis caching for particularly expensive operations:
```python
# utils.py
from django.core.cache import cache

from .models import Customer


def get_customer_stats(customer_id):
    """Cache customer statistics for 5 minutes."""
    cache_key = f'customer_stats_{customer_id}'
    stats = cache.get(cache_key)
    if stats is None:
        # Expensive computation here
        customer = Customer.objects.get(id=customer_id)
        stats = {
            'lifetime_value': customer.calculate_lifetime_value(),
            'avg_order_value': customer.calculate_avg_order_value(),
            'last_interaction': customer.get_last_interaction_date(),
        }
        cache.set(cache_key, stats, 300)  # Cache for 5 minutes
    return stats
```
For the detail pages (which weren't as problematic but could be better), I used cached_property:
```python
# models.py
from django.db import models
from django.db.models import Sum
from django.utils.functional import cached_property


class Customer(models.Model):
    # ... fields ...

    @cached_property
    def lifetime_value(self):
        """Cached for the lifetime of the instance in memory."""
        return self.orders.aggregate(
            total=Sum('total_amount')
        )['total'] or 0
```
Step 4: Use list_select_related and Raw SQL When Needed
For the admin list view, Django provides list_select_related which is a shorthand:
```python
class CustomerAdmin(admin.ModelAdmin):
    list_select_related = ['sales_rep', 'sales_rep__manager']
```

For extremely complex queries, raw SQL (via `Manager.raw()`) is sometimes necessary; the older `.extra()` API is discouraged in modern Django. Still, I found that 95% of the time, proper use of the ORM was sufficient and more maintainable.
The Results: Dramatic Improvement
Before optimization:
- Page load time: 30-45 seconds
- Database queries: 847 queries
- Peak memory usage: 450MB
- User complaints: Daily
After optimization:
- Page load time: 200-400ms (99% improvement)
- Database queries: 12 queries (98.6% reduction)
- Peak memory usage: 85MB (81% reduction)
- User complaints: Zero
The sales team could finally use the CRM without taking coffee breaks. Loading 100 customers now took less time than loading a single customer before.
Key Takeaways and Lessons Learned
1. Always Measure First
Don't guess where the bottleneck is. Use Django Debug Toolbar in development and django-silk or similar in production. You can't optimize what you don't measure.
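For reference, a typical development-only setup for django-debug-toolbar looks like the fragment below (a sketch based on the package's usual installation steps; check the current docs for your version, and remember to include `debug_toolbar.urls` in your URLconf as well):

```python
# settings.py - development only (a typical django-debug-toolbar setup)
DEBUG = True

INSTALLED_APPS = [
    # ... your apps ...
    'debug_toolbar',
]

MIDDLEWARE = [
    # The toolbar middleware should come as early as possible
    'debug_toolbar.middleware.DebugToolbarMiddleware',
    # ... the rest of your middleware ...
]

INTERNAL_IPS = ['127.0.0.1']  # the toolbar only renders for these IPs
```

The SQL panel shows every query with its duration and a stack trace, which is how the 847-query count surfaced in the first place.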
2. Understand select_related vs prefetch_related
- `select_related()` - use for ForeignKey and OneToOneField (SQL JOIN)
- `prefetch_related()` - use for ManyToManyField and reverse ForeignKey (separate queries + Python join)
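The "separate queries + Python join" half is worth seeing concretely. This stdlib sqlite3 sketch (schema is illustrative) does by hand what `prefetch_related()` does under the hood: one query for the parents, one `IN (...)` query for all children, then a join in Python - two queries no matter how many customers there are:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customer (id INTEGER PRIMARY KEY);
    CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER);
""")
conn.executemany("INSERT INTO customer VALUES (?)", [(i,) for i in range(50)])
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [(i, i % 50) for i in range(200)])

# Query 1: the parent rows.
customer_ids = [row[0] for row in conn.execute("SELECT id FROM customer")]

# Query 2: all related rows at once via IN (...).
placeholders = ",".join("?" * len(customer_ids))
orders_by_customer = {cid: [] for cid in customer_ids}
for oid, cid in conn.execute(
        f"SELECT id, customer_id FROM orders "
        f"WHERE customer_id IN ({placeholders})", customer_ids):
    orders_by_customer[cid].append(oid)  # the "Python join"

print(len(orders_by_customer[0]))  # 4 orders for customer 0
```

`select_related()` instead folds the related table into the original query with a JOIN, which only works when each row has at most one related object - hence the forward-FK/one-to-one restriction.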
3. Annotations are Your Friend
Computing counts and aggregates in the database is almost always faster than doing it in Python:

```python
from django.db.models import Sum

# Bad - loads every order into Python just to add up one column
total = sum(order.amount for order in customer.orders.all())

# Good - computed in the database, returns a single row
total = customer.orders.aggregate(Sum('amount'))['amount__sum']
```
4. Index Strategically
Don't index everything - indexes have a cost for writes. Index columns that are:
- Used in WHERE clauses frequently
- Used in ORDER BY
- Foreign keys that are queried often
- Used in JOIN conditions
5. Cache Wisely
Not everything needs caching. Cache data that:
- Is expensive to compute
- Doesn't change frequently
- Is accessed often

Use cache invalidation strategies to keep data fresh.
6. Use Pagination
Even with perfect queries, loading thousands of records is slow. Django Admin has built-in pagination - use it:
```python
class CustomerAdmin(admin.ModelAdmin):
    list_per_page = 100  # Default is 100, but be explicit
```
7. Monitor in Production
Set up query monitoring in production. I use django-silk and set up alerts when query counts exceed thresholds:
```python
# settings.py
SILKY_MAX_RECORDED_REQUESTS = 10000
SILKY_INTERCEPT_PERCENT = 10  # Only profile 10% of requests
```
When to Use Each Optimization Technique
Use select_related() when:
- Following ForeignKey relationships
- You need the related object's data immediately
- The relationship is one-to-one or many-to-one
Use prefetch_related() when:
- Accessing reverse ForeignKey (one-to-many)
- Working with ManyToMany relationships
- You need multiple related objects
Use annotate() when:
- Computing counts or aggregates
- You need the value in list views
- The computation can be done in SQL
Use caching when:
- The computation is expensive (>100ms)
- The data doesn't change often
- You have a cache invalidation strategy
Conclusion
Performance optimization isn't about prematurely optimizing everything - it's about identifying bottlenecks and systematically addressing them. The Django ORM gives you powerful tools like select_related(), prefetch_related(), and annotate() that, when used correctly, can turn a sluggish admin panel into a snappy user experience.
The key is understanding your data access patterns and ensuring Django fetches exactly what you need, when you need it, in as few queries as possible.
If your Django Admin (or any Django view) is running slowly, start with Django Debug Toolbar, count your queries, and work through the optimization techniques above. You'll be amazed at how much performance you can squeeze out with just a few strategic changes.
And remember: a fast CRM means happy sales teams, and happy sales teams close more deals.