# Proxy Deployment

Complete deployment guide for the Productify Proxy component (a custom Caddy build with Productify plugins).
## Deployment Overview

The Proxy uses a custom Caddy build with:

- OAuth2/OIDC authentication (Caddy Security plugin)
- Productify custom plugins for Manager integration
- Automatic HTTPS with Let's Encrypt
- Prometheus metrics on port 2112
## Build Custom Caddy

The Proxy requires custom Caddy plugins that must be built from source:

```bash
cd proxy
docker build -t ghcr.io/productifyfw/proxy:latest .
```

The Dockerfile compiles:

- Caddy Security plugin (OAuth2/OIDC)
- Productify authentication plugins
- Productify middleware plugins
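If you prefer to build outside Docker, the same plugin set can in principle be assembled with `xcaddy`. This is a sketch only: the Caddy Security module path is the upstream one, but the Productify plugin paths below are placeholders; check the Dockerfile for the real module paths.

```bash
# Sketch: Productify module paths are placeholders, adjust to the paths used in the Dockerfile.
xcaddy build \
  --with github.com/greenpau/caddy-security \
  --with github.com/productifyfw/proxy/plugins/auth \
  --with github.com/productifyfw/proxy/plugins/middleware
```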
## Docker Deployment

### Basic Deployment

```bash
docker run -d \
  --name productify-proxy \
  -p 80:80 \
  -p 443:443 \
  -p 2112:2112 \
  -v $PWD/Caddyfile:/etc/caddy/Caddyfile \
  -v caddy_data:/data \
  -v caddy_config:/config \
  ghcr.io/productifyfw/proxy:latest
```

### Docker Compose
```yaml
version: "3.8"

services:
  proxy:
    build: ./proxy
    ports:
      - "80:80"
      - "443:443"
      - "2112:2112" # Prometheus metrics
    volumes:
      - ./Caddyfile:/etc/caddy/Caddyfile:ro
      - caddy_data:/data
      - caddy_config:/config
    environment:
      - MANAGER_URL=http://manager:8080
    depends_on:
      - manager
    restart: unless-stopped

  manager:
    image: ghcr.io/productifyfw/manager:latest
    ports:
      - "8080:8080"
    environment:
      - DATABASE_URL=postgresql://...
    restart: unless-stopped

volumes:
  caddy_data:
  caddy_config:
```

## Nomad Deployment
### Job Specification

```hcl
job "productify-proxy" {
  datacenters = ["dc1"]
  type        = "service"

  group "proxy" {
    count = 2

    network {
      port "http" {
        static = 80
        to     = 80
      }
      port "https" {
        static = 443
        to     = 443
      }
      port "metrics" {
        to = 2112
      }
    }

    service {
      name = "productify-proxy"
      port = "https"
      tags = ["proxy", "https"]

      check {
        type     = "tcp"
        port     = "https"
        interval = "10s"
        timeout  = "2s"
      }
    }

    service {
      name = "productify-proxy-metrics"
      port = "metrics"
      tags = ["metrics"]
    }

    task "caddy" {
      driver = "docker"

      config {
        image = "ghcr.io/productifyfw/proxy:latest"
        ports = ["http", "https", "metrics"]
        volumes = [
          "local/Caddyfile:/etc/caddy/Caddyfile",
        ]
      }

      template {
        data = <<EOF
{
    email admin@example.com
    admin off

    security {
        oauth identity provider generic {
            realm generic
            driver generic
            client_id ab098fe1-9bc0-4780-81c6-2ea17f49a3cb
            client_secret pU3ZrOyPTtd4A3ex16dBzBTlDrlxqfpU
            scopes openid email profile
            base_auth_url http://pocketid.localhost
            metadata_url http://pocketid.localhost/.well-known/openid-configuration
        }

        authentication portal pocketportal {
            crypto default token lifetime 3600
            enable identity provider generic
            cookie insecure off
            transform user {
                match realm generic
                action add role user
            }
        }

        authorization policy pocketpolicy {
            set auth url /auth/oauth2/generic
            allow roles user
            validate bearer header
            inject headers with claims
            enable strip token
        }
    }

    productify {
        manager http://172.17.0.1:8080
        token supersecrettoken
    }
}

# Manager UI/API
manager.example.com {
    @auth {
        path /auth/*
    }
    route @auth {
        authenticate with pocketportal
    }
    route /* {
        authorize with pocketpolicy
        reverse_proxy http://172.17.0.1:8080
    }
}

# Applications
app.example.com {
    @auth {
        path /auth/*
    }
    route @auth {
        productify_before_auth with 123
        authenticate with pocketportal
    }
    route /* {
        authorize with pocketpolicy
        productify with 123
        reverse_proxy http://app-backend:8080
    }
}
EOF

        destination = "local/Caddyfile"
      }

      resources {
        cpu    = 200
        memory = 128
      }
    }
  }
}
```

## Configuration
### Caddyfile Template

See Caddyfile Configuration for complete examples.
### Environment Variables

Required:

- `ACME_EMAIL` - Email for Let's Encrypt
- `OAUTH_CLIENT_ID` - OAuth2 client ID
- `OAUTH_CLIENT_SECRET` - OAuth2 client secret
- `OAUTH_BASE_URL` - Identity provider base URL
- `OAUTH_METADATA_URL` - OIDC metadata URL
- `MANAGER_URL` - Productify Manager API URL
- `MANAGER_TOKEN` - Manager authentication token

Application-specific:

- `APP_ID` - Application ID from Manager
- `APP_DOMAIN` - Application domain name
- `APP_BACKEND` - Backend service URL
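As a sketch, the variables can be passed through a Compose `environment` block; the values below are placeholders, and how the Caddyfile consumes them (for example via `{$VAR}` placeholders) depends on your template:

```yaml
services:
  proxy:
    build: ./proxy
    environment:
      - ACME_EMAIL=admin@example.com               # placeholder
      - OAUTH_CLIENT_ID=your-client-id             # placeholder
      - OAUTH_CLIENT_SECRET=your-client-secret     # placeholder
      - OAUTH_BASE_URL=https://idp.example.com
      - OAUTH_METADATA_URL=https://idp.example.com/.well-known/openid-configuration
      - MANAGER_URL=http://manager:8080
      - MANAGER_TOKEN=change-me
      - APP_ID=123
      - APP_DOMAIN=app.example.com
      - APP_BACKEND=http://app-backend:8080
```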
## TLS Certificates

### Let's Encrypt (Automatic)

```nginx
{
    email admin@example.com
}

app.example.com {
    # Automatic certificate
    reverse_proxy backend:8080
}
```

### Custom Certificates
```nginx
app.example.com {
    tls /certs/cert.pem /certs/key.pem
    reverse_proxy backend:8080
}
```

Mount certificates:

```yaml
volumes:
  - ./certs:/certs:ro
```

## Monitoring
### Prometheus Metrics

The proxy exposes metrics on port 2112:

```bash
curl http://proxy:2112/metrics
```

### Prometheus Scrape Config
```yaml
scrape_configs:
  - job_name: "productify-proxy"
    static_configs:
      - targets: ["proxy:2112"]
```

### Available Metrics

- `pfy_authentication_awaiting_users` - Users in the authentication flow
- `pfy_total_requests` - Total requests per application
- `pfy_response_time_seconds` - Response time distribution
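As an illustration of how these metrics might be used, the following Prometheus recording rules compute a request rate and an approximate 95th-percentile response time. This is a sketch only: it assumes `pfy_total_requests` is a counter and `pfy_response_time_seconds` is a histogram (hence the `_bucket` suffix); verify against the actual `/metrics` output.

```yaml
groups:
  - name: productify-proxy
    rules:
      # Request rate over the last 5 minutes (assumes a counter)
      - record: productify:request_rate_5m
        expr: rate(pfy_total_requests[5m])
      # Approximate p95 response time (assumes a histogram with _bucket series)
      - record: productify:response_time_p95_5m
        expr: histogram_quantile(0.95, rate(pfy_response_time_seconds_bucket[5m]))
```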
## Health Checks

### TCP Health Check

```hcl
check {
  type     = "tcp"
  port     = "https"
  interval = "10s"
  timeout  = "2s"
}
```

### HTTP Health Check (Admin API)
Enable the admin API for health checks:

```nginx
{
    admin :2019
}
```

```hcl
check {
  type     = "http"
  port     = "admin"
  path     = "/health"
  interval = "10s"
  timeout  = "2s"
}
```

## Troubleshooting
### Build Failures

Check:

- Go version (1.25.4 required)
- All module dependencies available
- Network access to the Go module proxy

Rebuild:

```bash
docker build --no-cache -t ghcr.io/productifyfw/proxy:latest ./proxy
```

### Authentication Not Working
Verify:

- Identity provider is accessible from the proxy
- OAuth2 client credentials are correct
- Metadata URL returns a valid OIDC configuration
- Cookie settings match the environment

Test the OIDC metadata:

```bash
curl https://idp.example.com/.well-known/openid-configuration
```

### Manager Integration Errors
Check logs:

```bash
docker logs productify-proxy
```

Verify:

- Manager URL is accessible from the proxy
- Manager token is valid
- Application ID exists in the Manager database

Test Manager connectivity:

```bash
curl -H "Authorization: Bearer TOKEN" http://manager:8080/api/applications/123
```

### Certificate Errors
Let's Encrypt:

- DNS points to the proxy
- Ports 80/443 are publicly accessible
- Valid email configured
- Not hitting rate limits (see the staging-CA sketch below)
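To rule out rate limits while testing, one option (an illustrative sketch, not part of the shipped configuration) is to point Caddy at the Let's Encrypt staging CA via the `acme_ca` global option; staging certificates are not browser-trusted, but the limits are far more generous:

```nginx
{
    email admin@example.com
    # Staging CA: untrusted test certificates, but much higher rate limits
    acme_ca https://acme-staging-v02.api.letsencrypt.org/directory
}
```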
Custom certificates:

- Certificate files exist and are readable
- Certificate matches the domain
- Private key is not encrypted
## High Availability

### Multiple Instances

Deploy multiple proxy instances behind a load balancer:

```hcl
group "proxy" {
  count = 3 # Run 3 instances
  # ... rest of config
}
```

### Load Balancer Configuration
Use an external load balancer (e.g., HAProxy, nginx) to distribute traffic:

```
Internet → Load Balancer → Proxy Instances
```
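As a rough sketch (not a tested configuration), an nginx TCP passthrough using the `stream` module keeps TLS termination on the proxy instances; the upstream addresses are placeholders:

```nginx
# Top-level nginx.conf context (requires the stream module); addresses are placeholders.
stream {
    upstream productify_proxies {
        server 10.0.0.11:443;
        server 10.0.0.12:443;
    }

    server {
        listen 443;
        proxy_pass productify_proxies;
    }
}
```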
### Shared State

Caddy stores certificate state in `/data`. For multiple instances:

- Use a shared volume (NFS, EFS) - see the sketch below
- Or use Caddy clustering (requires custom setup)
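For example, in a Docker Compose deployment the `caddy_data` volume could be backed by NFS so all instances share certificate storage. This is a sketch only; the NFS server address and export path are placeholders:

```yaml
volumes:
  caddy_data:
    driver: local
    driver_opts:
      type: nfs
      o: "addr=10.0.0.5,rw,nfsvers=4"   # placeholder NFS server
      device: ":/exports/caddy-data"    # placeholder export path
```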
## Security Considerations

- Disable the admin API in production (`admin off`)
- Use HTTPS for all production domains
- Secure the Manager token - use secrets management
- Cookie security - set `cookie insecure off` in production
- Rate limiting - configure at the load balancer level
- Security headers - add to the Caddyfile (see the sketch below)
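As an illustrative sketch (the header set is a suggestion, not a Productify default), security headers can be added per site with Caddy's `header` directive:

```nginx
app.example.com {
    header {
        Strict-Transport-Security "max-age=31536000; includeSubDomains"
        X-Content-Type-Options "nosniff"
        X-Frame-Options "DENY"
        Referrer-Policy "strict-origin-when-cross-origin"
    }
    # ... existing authentication and reverse_proxy routes ...
}
```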