Boost Nginx Performance: Practical OpenResty Guide for Blacklists, Rate Limiting, A/B Testing & Monitoring
This article presents a hands‑on guide to using OpenResty—Lua‑enhanced Nginx—for implementing static and dynamic blacklists, fine‑grained rate limiting, A/B testing via upstream selection, and real‑time service quality monitoring, all with production‑ready code examples.
OpenResty is a high‑performance web platform based on Nginx and Lua, integrating many Lua libraries and third‑party modules. It allows developers to write Lua scripts that run inside Nginx, achieving high concurrency with low resource usage.
Blacklist
Three methods are presented for implementing blacklists.
Static blacklist
Configure a Lua table directly in the access_by_lua* phase.
location /lua {
default_type 'text/html';
access_by_lua_file /path/to/access.lua;
content_by_lua 'ngx.say("hello world")';
}
-- Lua example
local blacklist = {
["10.10.76.111"] = true,
["10.10.76.112"] = true,
["10.10.76.113"] = true,
}
local ip = ngx.var.remote_addr
if blacklist[ip] then
return ngx.exit(ngx.HTTP_FORBIDDEN)
end

Dynamic blacklist (1) – Redis per request
Store the blacklist in Redis and query it for each request.
local redis = require "resty.redis"
local red = redis:new()
red:set_timeout(100)
local ok, err = red:connect("127.0.0.1", 6379)
if not ok then
    ngx.log(ngx.ERR, "failed to connect to redis: ", err)
    return
end
local ip = ngx.var.remote_addr
-- a single membership check is cheaper than fetching the whole set
local res, err = red:sismember("blacklist", ip)
if res == 1 then
    return ngx.exit(ngx.HTTP_FORBIDDEN)
end

Dynamic blacklist (2) – Shared memory with periodic sync
Use lua_shared_dict to keep the blacklist in Nginx shared memory and a timer to refresh it from Redis.
-- init_redis_blacklist.lua (run from init_worker_by_lua_file)
local redis = require "resty.redis"
local dict = ngx.shared.blacklist

local function update_blacklist(premature)
    if premature then return end  -- the worker is shutting down
    local red = redis:new()
    red:set_timeout(100)
    local ok, err = red:connect("127.0.0.1", 6379)
    if not ok then ngx.log(ngx.ERR, "redis connect failed: ", err) return end
    local list, err = red:smembers("blacklist")
    if not list then ngx.log(ngx.ERR, "smembers failed: ", err) return end
    dict:flush_all()
    for _, k in ipairs(list) do
        dict:set(k, true)
    end
    red:set_keepalive(10000, 100)
end

-- ngx.timer.at fires only once; ngx.timer.every re-arms automatically
ngx.timer.every(5, update_blacklist)

Rate Limiting
Two Lua modules are demonstrated.
lua‑resty‑limit‑traffic
Configure a shared dict and use resty.limit.req to limit request rate per location.
http {
lua_shared_dict location_limit_req_store 1m;
server {
listen 2019;
location /limit/traffic {
access_by_lua_file "/path/to/limit_traffic.lua";
default_type 'text/html';
content_by_lua 'ngx.say("hello 2019")';
}
}
}
-- limit_traffic.lua
local limit_req = require "resty.limit.req"
local json = require "cjson"
local rate = 1
local burst = 1
local function do_limit()
ngx.header.content_type = "application/json;charset=utf8"
ngx.say(json.encode({message="Too Fast"}))
return ngx.exit(ngx.HTTP_OK)
end
local function location_limit_traffic()
local lim, err = limit_req.new("location_limit_req_store", rate, burst)
if not lim then ngx.log(ngx.ERR, "init failed: ", err); return false end
local delay, err = lim:incoming("location_limit_key", true)
if not delay then
    if err == "rejected" then return true end
    ngx.log(ngx.ERR, "failed to limit request: ", err)
    return false
end
if delay > 0 then ngx.sleep(delay) end
return false
end
local reject = location_limit_traffic()
if reject then do_limit() end

lua‑resty‑redis‑ratelimit (cross‑machine)
Store counters in Redis so that rate limiting works across multiple Nginx instances.
local ratelimit = require "resty.redis.ratelimit"
local json = require "cjson"
local redis = {host="127.0.0.1", port=6379, timeout=0.02}
local rate = "1r/s"
local burst = 0
local duration = 1
local function user_rate_limit()
local limit_key = ngx.var.arg_unique_id
if not limit_key then return false end
local lim, err = ratelimit.new("user-rate", rate, burst, duration)
if not lim then ngx.log(ngx.ERR, "failed to instantiate: ", err); return false end
local delay, err = lim:incoming(limit_key, redis)
if not delay and err == "rejected" then return true end
if delay and delay >= 0.001 then ngx.sleep(delay) end
return false
end
local reject = user_rate_limit()
if reject then
ngx.header.content_type = "application/json;charset=utf8"
ngx.say(json.encode({message="Too Fast"}))
return ngx.exit(ngx.HTTP_OK)
end

A/B Testing
Use set_by_lua_file to select an upstream based on a request parameter.
http {
upstream pool_1 { server 0.0.0.0:2020; }
upstream pool_2 { server 0.0.0.0:2021; }
server {
listen 2019;
location /select/upstream/according/cid {
set_by_lua_file $selected_upstream "/path/to/select_upstream_by_cid.lua" "pool_1" "pool_2";
if ($selected_upstream = "") {
proxy_pass http://pool_1;
}
proxy_pass http://$selected_upstream;
}
}
}
-- select_upstream_by_cid.lua
local first_upstream = ngx.arg[1]
local second_upstream = ngx.arg[2]
local cid = ngx.var.arg_cid
if not cid then return "" end
local id = tonumber(cid)
if not id then return "" end
if id % 2 == 0 then
return first_upstream
end
return second_upstream

Service Quality Monitoring
Collect request count, request time, error count, upstream count and upstream response time using ngx.shared dict and log_by_lua.
-- nginx_metric.lua (simplified)
local metric = require "metric"
local dict = ngx.shared.nginx_metric
local prefix = ngx.var.proxy_host or ""
local m = metric:new(dict, "|", prefix, 86400)
m:record()

The metric module records request count, request time, HTTP error status, upstream request count and total upstream response time, storing them in shared memory for later retrieval.
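The metric module's code is not included in the source; the sketch below is one plausible implementation, assuming the `new(dict, sep, prefix, expire)` constructor and `record()` method used above (the method names, key layout, and counter names are assumptions):

```lua
-- metric.lua (hypothetical sketch; the real module is not shown in the source)
local _M = {}
local mt = { __index = _M }

-- called as metric:new(dict, "|", prefix, 86400), so self arrives first
function _M.new(_, dict, sep, prefix, expire)
    return setmetatable({dict = dict, sep = sep, prefix = prefix, expire = expire}, mt)
end

local function incr(self, name, value)
    local key = self.prefix .. self.sep .. name
    -- shdict:incr with an init value creates missing keys atomically
    local newval, err = self.dict:incr(key, value, 0, self.expire)
    if not newval then
        ngx.log(ngx.ERR, "failed to incr ", key, ": ", err)
    end
end

function _M.record(self)
    incr(self, "request_count", 1)
    incr(self, "request_time", tonumber(ngx.var.request_time) or 0)
    if tonumber(ngx.var.status) >= 400 then
        incr(self, "error_count", 1)
    end
    -- multi-upstream responses yield a comma-separated list, which
    -- tonumber rejects; they are simply skipped here for brevity
    local upstream_time = tonumber(ngx.var.upstream_response_time)
    if upstream_time then
        incr(self, "upstream_count", 1)
        incr(self, "upstream_response_time", upstream_time)
    end
end

return _M
```

Each counter is stored under `prefix|name` in the shared dict, which matches the split on "|" performed by the output script.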
-- nginx_metric_output.lua
local json = require "cjson"
local dict = ngx.shared.nginx_metric
local keys = dict:get_keys()
local result = {}
for _, k in ipairs(keys) do
    local v = dict:get(k)
    -- plain find: treat "|" as a literal separator, not a pattern
    local s, e = string.find(k, "|", 1, true)
    if s and tonumber(v) then
        local up = string.sub(k, 1, s - 1)
        local name = string.sub(k, e + 1)
        result[up] = result[up] or {}
        result[up][name] = (result[up][name] or 0) + v
    end
end
ngx.say(json.encode(result))

These examples demonstrate how OpenResty can simplify development of high‑performance web services by providing Lua‑based APIs for access control, rate limiting, A/B testing and real‑time monitoring.
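The nginx wiring for the two monitoring scripts is not shown in the source; a minimal sketch (the file paths, the 10m dict size, and the /nginx_metric location are all assumptions) could be:

```nginx
http {
    # shared zone read by nginx_metric.lua and nginx_metric_output.lua
    lua_shared_dict nginx_metric 10m;
    server {
        listen 2019;
        location / {
            proxy_pass http://pool_1;  # any upstream to be measured
            # runs after the response is sent, off the request's critical path
            log_by_lua_file /path/to/nginx_metric.lua;
        }
        location /nginx_metric {
            # dumps the aggregated counters as JSON
            content_by_lua_file /path/to/nginx_metric_output.lua;
        }
    }
}
```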
This article has been distilled and summarized from source material, then republished for learning and reference.
Sohu Smart Platform Tech Team