
Varnish Cache — High-Performance HTTP Reverse Proxy and Accelerator

An open-source HTTP reverse proxy and caching engine designed to accelerate web applications by serving content from memory at high throughput.

Introduction

Varnish Cache is an open-source HTTP accelerator designed for content-heavy dynamic websites and APIs. It sits between clients and your web server, caching responses in memory and serving them at speeds that typically reach hundreds of thousands of requests per second. Varnish is used by major media, e-commerce, and content platforms to reduce backend load and improve response times.

What Varnish Does

  • Caches HTTP responses in memory and serves them without hitting the backend server
  • Processes requests through a programmable VCL (Varnish Configuration Language) pipeline
  • Handles cache invalidation via purge, ban, and soft-purge mechanisms
  • Load balances traffic across multiple backend servers with health checks
  • Streams responses to clients and handles gzip compression/decompression on the fly

Architecture Overview

Varnish is written in C and maps its cache storage into virtual memory, letting the OS manage what stays in RAM versus disk. Incoming requests pass through a state machine defined in VCL, which is compiled to native C code and loaded as a shared object for maximum performance. A pool of worker threads serves requests, one thread per active request, while idle connections are parked in an event-driven waiter. The shared memory log (VSL) records all transactions in a ring buffer, enabling real-time analytics without disk I/O overhead.
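The VCL state machine exposes hooks as subroutines that run at each stage of the request. A minimal sketch of the two most common hooks (the caching policy shown is illustrative, and a backend named default is assumed to be defined elsewhere in the file):

```vcl
vcl 4.1;

# Assumes a backend "default" is declared elsewhere in this VCL.

sub vcl_recv {
    # Runs when a request arrives: decide whether it is cacheable.
    if (req.method != "GET" && req.method != "HEAD") {
        return (pass);     # never cache non-idempotent methods
    }
    return (hash);         # look the request up in the cache
}

sub vcl_backend_response {
    # Runs after a backend fetch: decide how long to keep the object.
    set beresp.ttl = 5m;   # illustrative TTL, not a recommendation
    return (deliver);
}
```

Each `return (...)` transitions the state machine to its next state; omitting a subroutine falls through to Varnish's built-in default behavior.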

Self-Hosting & Configuration

  • Install from the official Varnish repository to get the latest stable release
  • Write VCL rules to define caching behavior: which URLs to cache, TTLs, and backend selection
  • Deploy in front of Nginx, Apache, or application servers on port 80 or via a load balancer
  • Use varnishlog and varnishstat for real-time traffic analysis and hit-rate monitoring
  • Configure cache size with the -s flag (e.g., -s malloc,2G for 2 GB in-memory cache)
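Putting the steps above together, a minimal deployment in front of a web server might look like this (the backend address and paths are illustrative assumptions, not values from the source):

```vcl
vcl 4.1;

# default.vcl — Varnish proxies to a web server such as Nginx,
# here assumed to listen on localhost:8080.
backend default {
    .host = "127.0.0.1";
    .port = "8080";
}
```

You would then start the daemon with something like `varnishd -a :80 -f /etc/varnish/default.vcl -s malloc,2G`, which listens on port 80, loads this VCL, and allocates a 2 GB in-memory cache.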

Key Features

  • VCL configuration language compiles to C for sub-millisecond request processing
  • Grace mode serves stale content while fetching fresh responses from the backend
  • Edge Side Includes (ESI) for caching page fragments with different TTLs
  • Built-in load balancing with round-robin, random, and health-check-based directors
  • VMODs (Varnish Modules) extend functionality with custom C plugins
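Grace mode from the list above is configured per object in VCL. A hedged sketch, with illustrative TTL and grace values:

```vcl
vcl 4.1;

# Assumes a backend "default" is declared elsewhere in this VCL.

sub vcl_backend_response {
    set beresp.ttl = 1m;     # serve the object as fresh for one minute
    set beresp.grace = 1h;   # then serve it stale for up to an hour
                             # while a background fetch refreshes it
}
```

During the grace window, clients get an immediate (stale) response instead of waiting on a slow or unhealthy backend.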

Comparison with Similar Tools

  • Nginx — general-purpose reverse proxy with caching; Varnish offers more flexible cache logic via VCL
  • Squid — forward and reverse proxy; Varnish is purpose-built for HTTP acceleration with higher throughput
  • CDN services (Cloudflare, Fastly) — managed edge caching; Varnish runs on your own infrastructure (Fastly is built on Varnish)
  • Redis/Memcached — application-level key-value caches; Varnish caches complete HTTP responses transparently

FAQ

Q: When should I use Varnish over Nginx caching? A: Use Varnish when you need complex caching logic (varying on cookies, ESI, grace mode) or must sustain very high request rates at high cache-hit ratios. For simpler setups, Nginx proxy_cache may suffice.

Q: Does Varnish support HTTPS? A: Varnish speaks plain HTTP only. Place a TLS terminator such as Hitch or Nginx in front of Varnish for HTTPS.

Q: How do I invalidate cached content? A: Use HTTP PURGE requests for specific URLs, or bans for pattern-based invalidation (e.g., ban all pages matching a regex).
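Handling PURGE requests requires a small amount of VCL, since purging is disabled by default. A minimal sketch, restricting purges to trusted clients (the ACL address is illustrative):

```vcl
vcl 4.1;

# Assumes a backend "default" is declared elsewhere in this VCL.

# Only allow purge requests from trusted hosts.
acl purgers {
    "127.0.0.1";
}

sub vcl_recv {
    if (req.method == "PURGE") {
        if (!client.ip ~ purgers) {
            return (synth(403, "Forbidden"));
        }
        return (purge);   # drop the cached object for this URL
    }
}
```

Pattern-based bans are issued at runtime instead, e.g. via the management CLI: `varnishadm ban 'req.url ~ ^/articles/'` marks every cached object under /articles/ for invalidation.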

Q: What cache hit rate should I expect? A: Well-configured sites typically achieve 85-95% cache hit rates, meaning only 5-15% of requests reach the backend.
