A caching proxy server with rate limiting and usage tracking.

## Features

- Request caching with Redis
- Rate limiting per user
- Usage tracking
- Bearer token authentication
- Configurable cache TTL
## Prerequisites

- Node.js (v14 or higher)
- Redis server
## Installation

- Clone the repository
- Install dependencies:

  ```bash
  npm install
  ```

- Copy `.env.example` to `.env` and configure your environment variables:

  ```bash
  cp .env.example .env
  ```
## Configuration

Edit the `.env` file to configure:

- `REDIS_URL`: Redis connection URL
- `CACHE_TTL`: Cache time-to-live in seconds
- `RATE_LIMIT_WINDOW`: Rate limit window in seconds
- `RATE_LIMIT_MAX_REQUESTS`: Maximum requests per window
- `PORT`: Server port
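As an illustration, a filled-in `.env` might look like the following. The values are placeholders, not defaults shipped with the project:

```
REDIS_URL=redis://localhost:6379
CACHE_TTL=300
RATE_LIMIT_WINDOW=60
RATE_LIMIT_MAX_REQUESTS=100
PORT=3000
```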
## Running the Server

Start the server:

```bash
npm start
```

For development with auto-reload:

```bash
npm run dev
```
## API

### POST /proxy

Forwards requests to target APIs with caching.

Headers:

- `Authorization`: Bearer token for authentication
- `Target-URL`: The URL to forward the request to

Example:

```bash
curl -X POST http://localhost:3000/proxy \
  -H "Authorization: Bearer your-token" \
  -H "Target-URL: https://api.example.com/data" \
  -H "Content-Type: application/json" \
  -d '{"key": "value"}'
```
### GET /usage

Get current usage statistics.

Headers:

- `Authorization`: Bearer token for authentication

Example:

```bash
curl http://localhost:3000/usage \
  -H "Authorization: Bearer your-token"
```

Response:

```json
{
  "usage": 42,
  "rateLimit": 100,
  "remaining": 58
}
```
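The response fields relate in the obvious way: `remaining` is the configured rate limit minus usage in the current window. A sketch, using the field names from the example response (clamping at zero is an assumption, not verified against the server):

```javascript
// Compute remaining requests from current usage and the configured limit.
// Clamping at zero (an assumption) covers the case where usage has
// already reached or exceeded the limit.
function remainingRequests(usage, rateLimit) {
  return Math.max(0, rateLimit - usage);
}

console.log(remainingRequests(42, 100)); // 58, matching the example response
```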
## Error Handling

The server returns appropriate HTTP status codes:

- `400 Bad Request`: missing required headers
- `401 Unauthorized`: invalid or missing token
- `429 Too Many Requests`: rate limit exceeded
- `500 Internal Server Error`
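A client can branch on these codes; for 429 in particular, backing off before retrying is the usual pattern. A hypothetical client-side handler (both functions are illustrations, not part of this project):

```javascript
// Map the proxy's documented status codes to human-readable messages.
function describeStatus(status) {
  switch (status) {
    case 400: return "Bad Request: missing required headers";
    case 401: return "Unauthorized: invalid or missing token";
    case 429: return "Too Many Requests: rate limit exceeded";
    case 500: return "Internal Server Error";
    default:  return `Unexpected status ${status}`;
  }
}

// Hypothetical retry policy: only rate limiting and server errors are
// transient; 400 and 401 indicate a request that must be fixed first.
function shouldRetry(status) {
  return status === 429 || status === 500;
}

console.log(describeStatus(429)); // "Too Many Requests: rate limit exceeded"
console.log(shouldRetry(400));    // false
```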
## License

ISC