Why Is My Web Application Slow? Understanding CDNs, Caching, and Latency
Your users are far from your servers, and every request travels thousands of miles. Learn how CDNs reduce latency, when they're worth implementing, and when simpler fixes work better.
Tags: Web Optimization, Tech, Infrastructure, Latency
Matteo Aurelio Arellano
1/29/2026 · 11 min read
As a Product Manager or business stakeholder, you might feel frustrated that your application is not performing as fast as it should for your end users. You start thinking about several different strategies, perhaps starting with the basics:
Am I loading heavy images, text, font styles or any other UI/front-end component that slows my application down?
You might run a Chrome Lighthouse audit (a free tool built into Chrome that analyzes your site's performance) and get a clear report showing exactly where performance is suffering.
Alternatively, you start thinking about location, network conditions, and other user-dependent factors to understand how to serve your customers better with your tech product.
Then someone from your engineering team comes to you as a decision-maker and mentions: "Perhaps we should implement a caching strategy."
You've probably read that term before. "Oh yeah, that's basically creating copies of my static assets on servers around the world, near where my real users are, right?"
And yes. You're completely right. One caching strategy is to serve your static assets—your web content that stays constant for all users—using what is known as a Content Delivery Network (CDN).
But should you use a CDN? In which cases is this appropriate and when is it not? In this article, we'll dive into that question from an engineering perspective while avoiding as much technical jargon as possible.
The truth is that a CDN adds complexity to your web application and your overall system. It becomes another layer to debug, a place where content can go stale, and it still costs resources to maintain. Sometimes that complexity pays off. Sometimes it doesn't.
Let's figure out which situation you're in.
First, Let's Understand What Actually Makes Your Application "Slow"
Before deciding on a CDN, you need to understand why web applications feel slow in the first place. There are two main culprits:
1. Physical Distance Creates Latency
When a user in Tokyo requests your website, and your server lives in Virginia, that request has to travel across the Pacific Ocean, literally through undersea fiber optic cables. Light itself takes time to travel that distance.
This round-trip time (the request going to Virginia, the server processing it, and the response coming back to Tokyo) is called latency. For a user in Tokyo hitting a Virginia server, you're looking at roughly 150-200 milliseconds just for the data to travel back and forth. And that's before your server even does any work.
Now multiply that by every single file your application needs to load: the HTML structure, CSS styling, JavaScript code, images, fonts. A typical web page might need 50+ separate files. If each one requires a round trip to Virginia, your Tokyo user is waiting a long time.
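To make the multiplication concrete, here's a rough back-of-envelope estimate. It assumes six parallel browser connections and that each batch of requests costs one full round trip; real browsers use HTTP/2 multiplexing and other tricks, so treat this as a lower bound on travel time, not a precise model.

```python
import math

def estimated_load_seconds(num_files: int, rtt_ms: float, parallel_connections: int = 6) -> float:
    """Rough lower bound: each batch of parallel requests costs one full round trip."""
    round_trips = math.ceil(num_files / parallel_connections)
    return round_trips * rtt_ms / 1000.0

# 50 files from Tokyo to a Virginia origin (~180 ms RTT) vs a nearby edge (~20 ms RTT)
far_away = estimated_load_seconds(50, 180)  # 1.62 s spent purely on travel time
nearby = estimated_load_seconds(50, 20)     # 0.18 s
```

Even before the server does any work, the distant origin costs roughly nine times more waiting than a nearby server would.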
2. Your Server Does the Same Work Repeatedly
Every time someone visits your homepage, your server has to:
Find the HTML file
Find the CSS file
Find the JavaScript file
Find all the images
Send all of these back to the user
If 10,000 people visit your homepage today, your server does this exact same work 10,000 times—even though the files haven't changed. That's wasteful, and under heavy load, your server can get overwhelmed and slow down (or crash entirely).
How a CDN Solves These Problems
A CDN is essentially a network of servers spread across the world. These servers are called edge servers or edge nodes—"edge" because they sit at the edge of the network, close to your end users rather than centralized in one location.
Here's how it works:
First request: A user in Tokyo visits your site. The CDN checks: "Do I have a copy of this file stored nearby?" If not, it fetches the file from your main server (called the origin server) in Virginia, delivers it to the user, and stores a copy at the Tokyo edge node.
Subsequent requests: The next user in Tokyo (or nearby in Asia) visits your site. The CDN already has the files cached at the Tokyo edge node. It serves them directly—no round trip to Virginia needed.
The result: Your Tokyo users get files from a server that might be 20 milliseconds away instead of 200 milliseconds away. Your origin server in Virginia only handles the first request; the CDN handles the rest.
This is why CDNs can be the difference between a successful product launch and your site going down. When your product gets featured on TechCrunch or Hacker News, thousands of users hit your site simultaneously. Without a CDN, every single request hammers your origin server. With a CDN, most requests are served from cached copies at edge nodes around the world, and your origin server barely notices the traffic spike.
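The cache-miss/cache-hit flow above can be sketched in a few lines. This is a minimal illustration, not a real CDN: `origin_fetch` stands in for the round trip back to Virginia, and the names are made up for the example.

```python
class EdgeNode:
    """Toy model of a CDN edge node: serve from local cache, fetch from origin on a miss."""

    def __init__(self, origin_fetch):
        self._cache = {}
        self._origin_fetch = origin_fetch
        self.origin_requests = 0  # how many times we had to bother the origin

    def get(self, path: str) -> bytes:
        if path not in self._cache:          # cache miss: one trip to the origin
            self.origin_requests += 1
            self._cache[path] = self._origin_fetch(path)
        return self._cache[path]             # cache hit: served locally

tokyo = EdgeNode(origin_fetch=lambda path: b"<contents of " + path.encode() + b">")
for _ in range(10_000):                      # 10,000 Tokyo visitors request the same logo
    tokyo.get("/images/logo.png")
# The origin was contacted exactly once; the edge served the other 9,999 requests.
```

That single origin request versus 10,000 direct hits is the whole value proposition during a traffic spike.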
The First Question: Where Are Your Users Located?
Before implementing a CDN, you need data. Open your analytics tool (Google Analytics, Mixpanel, Amplitude, or whatever you use) and look at the geographic distribution of your users.
Scenario A: 85% of your users are in the United States, and your servers are in the US. A CDN provides marginal benefit. The latency difference between serving from your origin in Virginia to a user in San Francisco versus serving from a CDN edge in San Francisco is small, maybe 30-50 milliseconds. Noticeable? Barely. Worth the added complexity? Probably not.
Scenario B: You have significant traffic from Europe, Asia, and South America, but your servers are only in the US. Now we're talking about 150-300 milliseconds of latency saved per request. For a page that loads 50 assets, that's the difference between a 2-second load time and a 5-second load time. Users notice. Conversion rates drop. A CDN makes sense here.
How to check this in practice:
Google Analytics: Go to Audience → Geo → Location
Vercel/Netlify dashboards show request origins
Cloudflare (if you're already using it for DNS) shows geographic traffic distribution
Decision point: If more than 30% of your users are geographically far from your servers, a CDN will provide meaningful performance improvements.
The Second Question: What Type of Content Are You Serving?
CDNs cache content. This is important to understand: if your content is different for every user, there's nothing to cache.
Static Content (CDN-friendly)
These are files that are identical for every user:
Your logo image
Your CSS stylesheet (the file that controls colors, fonts, and layout)
Your JavaScript bundle (the code that makes your site interactive)
Marketing page content
Product images in an e-commerce catalog
These are perfect for CDNs. Cache them worldwide, serve them fast.
Dynamic Content (Not CDN-friendly)
These are responses that are personalized or change frequently:
A user's dashboard showing their specific data
A social media feed customized to their interests
Shopping cart contents
Real-time pricing or inventory
Anything that requires the user to be logged in
A CDN sitting in front of dynamic content just adds latency without providing any caching benefit. The request still has to reach your origin server because only your origin knows what data to show that specific user.
Real-world example: Think about Netflix. When you open the Netflix app:
The Netflix logo, the UI components, the JavaScript code that makes the app work, these are static. They're the same for every user and can be cached on CDNs worldwide.
Your personalized "Continue Watching" row, your recommendations, your watch history, these are dynamic. They must come from Netflix's servers because only Netflix's servers know what you've been watching.
Netflix uses CDNs aggressively for static content (and video streaming, which is a special case), but your personalized homepage data still comes from their origin servers.
Decision point: Estimate what percentage of your application's requests are for static, cacheable content. If it's less than 40%, a CDN's impact will be limited to speeding up your initial page load: helpful, but not transformative.
The Hardest Part: Cache Invalidation
There's a famous quote in computer science: "There are only two hard things in computer science: cache invalidation and naming things."
Here's the problem. When you push an update to your application—a bug fix, a new feature, a design change—that update exists on your origin server immediately. But the CDN's edge nodes around the world still have the old version cached. Your users might see outdated (stale) content until the cache refreshes.
Let me give you a concrete example:
The Bug Fix Scenario
Your engineering team discovers a critical bug in your checkout flow: the "Place Order" button doesn't work on Safari. They push a fix at 2:00 PM.
Your origin server now has the fixed code
But the CDN edge node in London still has the broken code cached
Your customer in London clicks "Place Order" and nothing happens
They abandon their cart. You lose the sale.
This is cache invalidation: making sure the old, broken version gets replaced with the new, fixed version across all edge nodes worldwide.
Your Three Options for Handling Updates
Option 1: Wait for TTL Expiration
TTL stands for "Time To Live"—it's how long you tell the CDN to keep a cached copy before checking for a new version.
How it works: You configure your CDN with a TTL of, say, 1 hour. After 1 hour, the cached copy "expires," and the next request triggers the edge node to fetch a fresh copy from your origin.
Pros: Simple. No extra work when you deploy.
Cons: If you set TTL to 1 hour and push a critical bug fix, some users see the bug for up to 1 hour. If you set TTL to 5 minutes, you get fresher content but more requests hit your origin (reducing the CDN's benefit).
Best for: Content that updates on a predictable schedule and isn't time-sensitive. Blog posts, marketing pages, documentation.
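The TTL mechanic is simple enough to show in code. Here's a minimal sketch of TTL-based expiry: a cached entry is reused until its age exceeds the TTL, then the next request goes back to the origin. The injectable clock is just there so the example can simulate an hour passing.

```python
import time

class TTLCache:
    """Toy TTL cache: entries expire ttl_seconds after they were stored."""

    def __init__(self, ttl_seconds: float, now=time.monotonic):
        self.ttl = ttl_seconds
        self._now = now           # injectable clock, handy for simulating time
        self._entries = {}        # path -> (value, stored_at)

    def get(self, path, fetch_from_origin):
        entry = self._entries.get(path)
        if entry is not None:
            value, stored_at = entry
            if self._now() - stored_at < self.ttl:
                return value                      # still fresh: serve cached copy
        value = fetch_from_origin(path)           # missing or expired: refetch
        self._entries[path] = (value, self._now())
        return value

clock = [0.0]
cache = TTLCache(ttl_seconds=3600, now=lambda: clock[0])
origin_hits = []
fetch = lambda p: origin_hits.append(p) or "file contents"

cache.get("/app.js", fetch)   # miss: fetched from origin
cache.get("/app.js", fetch)   # hit: served from cache, origin untouched
clock[0] = 3601.0             # an hour passes
cache.get("/app.js", fetch)   # expired: fetched from origin again
# origin_hits now contains exactly two entries
```

Notice the trade-off baked into `ttl_seconds`: a bigger number means fewer origin hits but longer windows of possibly stale content.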
Option 2: Purge the Cache
Most CDN providers (Cloudflare, AWS CloudFront, Fastly, Akamai) offer an API to "purge" or "invalidate" cached content. You tell the CDN: "Delete the cached copy of checkout.js from all edge nodes."
How it works: Your deployment pipeline (the automated process that pushes code to production) includes a step that calls the CDN's purge API after deploying new code.
Pros: Fresh content available within seconds to minutes after deployment.
Cons: Purge propagation isn't instant; it takes time to reach all 200+ edge nodes. Sometimes specific nodes fail to purge. Debugging "why is this one user seeing old content?" becomes tricky. Also, CDN providers often rate-limit purge requests (you can only purge so many times per hour).
Best for: Applications with occasional updates where freshness matters. SaaS products that deploy once or twice a day.
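As a concrete sketch, here's roughly what the purge step in a deploy pipeline looks like against Cloudflare's cache-purge endpoint. The zone ID and API token are placeholders you'd load from your CI secrets, and other providers (CloudFront, Fastly) expose similar invalidation APIs with their own request shapes; check your provider's docs for the exact format.

```python
import json
import urllib.request

def build_purge_request(zone_id: str, api_token: str, urls: list) -> urllib.request.Request:
    """Build (but don't send) a Cloudflare purge-by-URL request."""
    return urllib.request.Request(
        url=f"https://api.cloudflare.com/client/v4/zones/{zone_id}/purge_cache",
        data=json.dumps({"files": urls}).encode(),
        headers={
            "Authorization": f"Bearer {api_token}",  # scoped API token from CI secrets
            "Content-Type": "application/json",
        },
        method="POST",
    )

# In your deploy pipeline, after pushing the fixed code:
req = build_purge_request("YOUR_ZONE_ID", "YOUR_API_TOKEN",
                          ["https://yourapp.com/js/checkout.js"])
# urllib.request.urlopen(req) would actually send it; left out in this sketch.
```

The key design point: the purge call lives in the pipeline, not in someone's head, so freshness doesn't depend on an engineer remembering to click a button.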
Option 3: Cache-Busting URLs (Also Called Versioned Assets)
This is the most reliable approach for critical files like JavaScript and CSS.
How it works: Instead of naming your file app.js, you include a unique identifier in the filename that changes whenever the file changes. For example:
Version 1: app.d8e2f1.js
Version 2 (after your bug fix): app.a3b9c4.js
The d8e2f1 part is called a content hash—it's a short code generated from the file's contents. If even one character in the file changes, the hash changes, which means the filename changes.
Why this works: When you deploy the bug fix, your HTML starts requesting app.a3b9c4.js instead of app.d8e2f1.js. The CDN has never seen this filename before—it's a completely new file as far as the CDN is concerned. So it fetches the fresh version from your origin. The old app.d8e2f1.js just sits unused in the cache until it naturally expires.
Pros: Bulletproof. No cache invalidation needed because you're serving a "new" file. Works instantly, globally.
Cons: Requires build tooling to generate the hashes automatically. Your deployment process needs to update all references to these files. Slightly more complex infrastructure.
Best for: Production applications where you cannot afford stale code. This is what most modern deployment platforms (Vercel, Netlify, AWS Amplify) do by default.
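To demystify the content hash, here's how a build tool derives a hashed filename: hash the file's bytes and keep a short prefix. The 6-character length matches the `app.d8e2f1.js` example above; real tools typically use longer prefixes, and the exact hash function varies by tool.

```python
import hashlib

def hashed_filename(name: str, contents: bytes, length: int = 6) -> str:
    """Return name with a short content-hash segment inserted before the extension."""
    stem, _, ext = name.rpartition(".")
    digest = hashlib.sha256(contents).hexdigest()[:length]
    return f"{stem}.{digest}.{ext}"

v1 = hashed_filename("app.js", b"console.log('v1');")
v2 = hashed_filename("app.js", b"console.log('v2');")  # one character changed
# v1 and v2 have different names, so the CDN treats the fixed file as brand new.
```

Because even a one-character change produces a different filename, the CDN can never accidentally serve the old code under the new name.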
My Recommendation
For most growing companies, use Option 3 (cache-busting URLs) for JavaScript and CSS, combined with Option 1 (reasonable TTLs) for images and other media. If you're on a platform like Vercel or Netlify, this is handled for you automatically.
When a CDN Creates More Problems Than It Solves
After understanding how CDNs work, let's be clear about when they're not the right choice.
Your Application Is Mostly Dynamic
If you're building a B2B SaaS dashboard where every screen shows user-specific data, a CDN helps only with your initial page shell (HTML, CSS, JS). The actual data—which is most of what users care about—can't be cached.
You'll add complexity, cost, and debugging overhead for limited benefit.
You Update Content Very Frequently
If your business model involves real-time content updates—live sports scores, stock prices, breaking news—aggressive caching works against you. You'll spend more time managing cache invalidation than you save in performance.
You're Small and Regional
A local business app, an internal company tool, or an early-stage startup with users in one country doesn't need global content distribution. Your single-region server handles the load fine. Focus your engineering effort elsewhere.
Your Team Lacks DevOps Experience
A CDN is another system to configure, monitor, and debug. If your team is already stretched thin, adding infrastructure complexity isn't wise. Consider using a platform (Vercel, Netlify, Cloudflare Pages) that includes CDN functionality without requiring you to manage it separately.
Practical Setup: Making Your Application "CDN-Ready"
Even if you don't implement a CDN today, you can architect your application so adding one later is painless. Here's what that means in practice:
1. Separate Static Assets from Dynamic API Routes
Structure your application so static files (images, CSS, JavaScript) are served from a different path or subdomain than your API endpoints.
Example:
Static assets: static.yourapp.com/images/logo.png
API requests: api.yourapp.com/users/profile
This separation lets you put a CDN in front of only your static assets without affecting your dynamic API, which shouldn't be cached anyway.
2. Set Proper Cache-Control Headers
Cache-Control is an HTTP header that tells browsers and CDNs how long they can cache a response. Your server sends this header with every response.
Example header: Cache-Control: public, max-age=31536000
This tells caches: "This file is public (not user-specific), and you can cache it for 31,536,000 seconds (one year)."
For different content types, you'd set different values:
Static assets with content-hashed names: max-age=31536000 (1 year)—safe because the filename changes when content changes
HTML pages: max-age=0 or no-cache—always fetch fresh to ensure users get the latest version
API responses: Usually no-store—don't cache at all because data is user-specific
Modern frameworks (Next.js, Nuxt, Rails, Django) have sensible defaults, but it's worth auditing what headers your application actually sends.
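The per-content-type policy above translates naturally into a small routing rule. This is an illustrative sketch, not any particular framework's API: the path conventions and the hashed-filename heuristic are assumptions you'd adapt to your own routing.

```python
def is_content_hashed(path: str) -> bool:
    """Crude heuristic: app.d8e2f1.js has at least two dots in its filename."""
    return path.rsplit("/", 1)[-1].count(".") >= 2

def cache_control_for(path: str) -> str:
    """Map a request path to the Cache-Control header its response should carry."""
    if path.startswith("/api/"):
        return "no-store"                       # user-specific data: never cache
    if path.endswith((".js", ".css")) and is_content_hashed(path):
        return "public, max-age=31536000"       # safe: filename changes with content
    if path.endswith(".html") or path == "/":
        return "no-cache"                       # always revalidate the HTML shell
    return "public, max-age=86400"              # images and other media: one day

cache_control_for("/static/app.d8e2f1.js")  # "public, max-age=31536000"
cache_control_for("/api/users/profile")     # "no-store"
```

Auditing your app against a table like this is usually a one-afternoon task with an outsized payoff.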
3. Use a Build Process That Generates Content-Hashed Filenames
If you're using any modern JavaScript framework (React, Vue, Angular, Svelte), your build tool (Webpack, Vite, Parcel) can automatically generate content-hashed filenames.
Before build: You write code in app.js
After build: The tool outputs app.d8e2f1.js and updates all references automatically
If you're not using a build process (maybe you're writing vanilla HTML/CSS/JS), consider adding one. The caching benefits alone are worth it.
Quick Wins Before You Implement a CDN
Sometimes the biggest performance gains come from simpler optimizations:
Compress your images: Tools or built-in framework optimizers can reduce image sizes by 50-80% with no visible quality loss.
Enable Gzip/Brotli compression: Most servers can compress text-based files (HTML, CSS, JS) before sending them. This is usually a single configuration line.
Lazy-load images below the fold: Don't load images users can't see yet. Modern browsers support this natively with loading="lazy".
Audit your JavaScript bundle: Large JS files are often the biggest performance bottleneck. Tools like Webpack Bundle Analyzer show you what's taking up space.
These optimizations are free, low-risk, and often provide more impact than a CDN for early-stage applications.
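If you've never seen compression's impact firsthand, it's easy to demonstrate. The snippet below gzips a blob of repetitive markup, the kind of redundancy real HTML, CSS, and JS are full of; the exact ratio on your assets will vary, but text compresses dramatically.

```python
import gzip

# Repetitive text-based content, standing in for a typical HTML payload
html = b"<div class='card'><h2>Title</h2><p>Body text</p></div>" * 200

compressed = gzip.compress(html)
ratio = len(compressed) / len(html)
# For markup this repetitive, the compressed size is a small fraction of the original.
```

That's bandwidth your users never have to download, regardless of whether a CDN is involved.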
Final Thoughts + Bonus Decision Framework
A CDN is a powerful tool, but it's not a magic solution. It solves specific problems (geographic latency, origin server load, traffic spike resilience) while introducing new ones (cache invalidation, debugging complexity, additional cost).
The best engineering decisions come from understanding the trade-offs clearly—not from following "best practices" blindly.
If you're evaluating whether a CDN makes sense for your product, start with the data: where are your users, what content are you serving, and what problems are you actually experiencing? The answer will become clear.
Questions to think about:
Ask yourself these five questions:
1. Are 30%+ of my users far from my servers?
Yes → CDN helps significantly
No → CDN provides marginal benefit
2. Is 50%+ of my traffic for static, cacheable content?
Yes → CDN helps significantly
No → CDN impact is limited
3. Do I have unpredictable traffic spikes?
Yes → CDN provides resilience
No → Less urgent
4. Does my team have DevOps capacity?
Yes → Manageable complexity
No → Consider managed platforms instead
5. Do I update production multiple times per day?
Yes → Cache invalidation becomes a daily concern
No → Standard TTL approach works fine
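For fun, the framework above fits in a few lines of code. One interpretation baked into this sketch (my assumption, not something the article states): question 5 is an operational concern rather than a benefit signal, so it's left out of the vote and only the first four questions count toward the 3-of-5 threshold.

```python
def should_use_cdn(users_far: bool, mostly_static: bool,
                   spiky_traffic: bool, devops_capacity: bool) -> bool:
    """Count the 'CDN helps' signals; three or more favorable answers tips the decision."""
    votes = sum([users_far, mostly_static, spiky_traffic, devops_capacity])
    return votes >= 3

# Global audience, static-heavy site, launch-day spikes, but a stretched team:
should_use_cdn(True, True, True, False)   # True: the benefits outweigh the complexity
# Regional B2B dashboard, mostly dynamic, steady traffic:
should_use_cdn(True, False, False, False) # False: skip the CDN for now
```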
If you answered "CDN helps" to 3+ questions: Implement a CDN. The performance and resilience benefits justify the added complexity.
If you answered "No" or "limited benefit" to 3+ questions: Skip the CDN for now. Focus on optimizing your origin server, reducing asset sizes, and using a simple hosting platform that includes basic caching.
Thank you.
If you require technical consulting, please reach out at matteo.aurelio@foresightfintelligence.com and I'll be happy to support.
