Sourcehut Web Server
Sourcehut is a collection of services that require, at minimum, a reverse proxy in front of them. While this is an inconvenience for small deployments, it also means that Sourcehut scales readily for critical or commercial deployments.
Meta
To support Stripe billing, special alterations are made to the upstream configuration of Content Security Policy.
```nginx
server {
    listen 80;
    gzip on;
    gzip_types text/css text/html;
    server_name meta.example.com;

    location / {
        proxy_pass http://localhost:5000;
        include headers.conf;
        add_header Content-Security-Policy "default-src 'none'; style-src 'self' 'unsafe-inline'; img-src 'self' data:; script-src 'self' 'unsafe-inline' *.stripe.com *.stripe.network; frame-src *.stripe.com *.stripe.network" always;
        include web.conf;
    }

    location /register {
        proxy_pass http://localhost:5000;
        add_header Content-Security-Policy "default-src 'none'; style-src 'self' 'unsafe-inline'; img-src 'self' data:; script-src 'self' 'unsafe-inline' *.stripe.com *.stripe.network; frame-src *.stripe.com *.stripe.network" always;
    }

    location /query {
        proxy_pass http://localhost:5100;
        include graphql.conf;
    }
}
```
Optionally, serve static content directly.
```nginx
location /static {
    root /usr/lib/python3.9/site-packages/metasrht;
    expires 30d;
}
```
There are a few configuration files included here and in other site configurations.
headers.conf contains universal headers, some important and others as an homage.
```nginx
add_header X-Clacks-Overhead "GNU Terry Pratchett";
add_header Strict-Transport-Security "max-age=31536000; includeSubDomains; preload" always;
add_header Permissions-Policy interest-cohort=();
```
web.conf contains standard directives for proper proxying.
```nginx
real_ip_header X-Forwarded-For;
real_ip_recursive on;
proxy_set_header Host $host;
proxy_set_header X-Forwarded-Proto https;
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
```
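Note that the real_ip_* directives (from ngx_http_realip_module) only take effect for connections arriving from addresses trusted via set_real_ip_from. If another proxy or load balancer sits in front of nginx, declare its address; a minimal sketch, assuming the front proxy is on the loopback interface (adjust the address to your deployment):

```nginx
# Trust X-Forwarded-For only when it comes from the front proxy
# (127.0.0.1 is an assumption; use your proxy's real address).
set_real_ip_from 127.0.0.1;
```

Without a matching set_real_ip_from, real_ip_header is silently ignored and the client address seen by the backends is that of the proxy itself.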
graphql.conf enables the GraphQL API and the playground (i.e. git.example.com/graphql).
```nginx
real_ip_header X-Forwarded-For;
real_ip_recursive on;
proxy_set_header Host $host;
proxy_set_header X-Forwarded-Proto https;
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;

if ($request_method = 'OPTIONS') {
    add_header 'Access-Control-Allow-Origin' '*';
    add_header 'Access-Control-Allow-Methods' 'GET, POST, OPTIONS';
    add_header 'Access-Control-Allow-Headers' 'User-Agent,X-Requested-With,If-Modified-Since,Cache-Control,Content-Type,Range';
    add_header 'Access-Control-Max-Age' 1728000;
    add_header 'Content-Type' 'text/plain; charset=utf-8';
    add_header 'Content-Length' 0;
    return 204;
}
add_header 'Access-Control-Allow-Origin' '*';
add_header 'Access-Control-Allow-Methods' 'GET, POST, OPTIONS';
add_header 'Access-Control-Allow-Headers' 'User-Agent,X-Requested-With,If-Modified-Since,Cache-Control,Content-Type,Range';
add_header 'Access-Control-Expose-Headers' 'Content-Length,Content-Range';
```
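Once deployed, the CORS preflight behavior can be spot-checked from the command line (meta.example.com is the placeholder host used above); given the return 204 in the configuration, a preflight OPTIONS request should come back as 204 No Content with the Access-Control-* headers set:

```
curl -i -X OPTIONS \
     -H 'Origin: https://example.org' \
     -H 'Access-Control-Request-Method: POST' \
     http://meta.example.com/query
```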
Git
```nginx
server {
    listen 80;
    gzip on;
    gzip_types text/css text/html;
    server_name git.example.com;
    client_max_body_size 100M;

    location / {
        proxy_pass http://localhost:5001;
        include headers.conf;
        add_header Content-Security-Policy "default-src 'none'; style-src 'self' 'unsafe-inline'; img-src * data:; script-src 'self' 'unsafe-inline'" always;
        include web.conf;
    }

    location /query {
        proxy_pass http://localhost:5101;
        include graphql.conf;
    }
}
```
Optionally, serve static content directly.
```nginx
location /static {
    root /usr/lib/python3.9/site-packages/gitsrht;
    expires 30d;
}
```
Cloning
Sourcehut itself does not support cloning git repositories over HTTP(S). Upstream's intention is to delegate this role to an external web server, which can serve clones with durable performance.
At minimum, the /authorize location must be configured (to ensure the privacy of private repositories) and git(1)-related URIs must be served by git-http-backend.
```nginx
location = /authorize {
    proxy_pass http://localhost:5001;
    proxy_pass_request_body off;
    proxy_set_header Content-Length "";
    proxy_set_header X-Original-URI $request_uri;
}

location ~ ^/([^/]+)/([^/]+)/(HEAD|info/refs|objects/info/.*|git-upload-pack).*$ {
    auth_request /authorize;
    root /var/lib/git;
    fastcgi_pass localhost:9000;
    fastcgi_param SCRIPT_FILENAME /usr/libexec/git-core/git-http-backend;
    fastcgi_param PATH_INFO $uri;
    fastcgi_param GIT_PROJECT_ROOT $document_root;
    fastcgi_read_timeout 500s;
    include fastcgi_params;
    gzip off;
}
```
The appropriate fastcgi_pass value will depend on the FastCGI deployment. Upstream's recommendation is to use spawn-fcgi and fcgiwrap with a Unix socket at unix:/run/fcgiwrap/fcgiwrap.sock.
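That recommendation can be realized by launching fcgiwrap under spawn-fcgi; a sketch of a manual invocation (the user and group names and the fcgiwrap binary path are assumptions that vary by distribution):

```
# Start fcgiwrap listening on a Unix socket readable by nginx.
# The http user/group and /usr/bin/fcgiwrap path are assumed.
spawn-fcgi -s /run/fcgiwrap/fcgiwrap.sock -M 660 -u http -g http -- /usr/bin/fcgiwrap
```

With this in place, the git location above would use fastcgi_pass unix:/run/fcgiwrap/fcgiwrap.sock; instead of localhost:9000. On most distributions a packaged fcgiwrap service unit achieves the same result.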
Note that some Linux distributions provide git-http-backend in a split package, such as git-daemon.
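With the backend in place, anonymous cloning of a public repository can be verified end to end (the host and repository below are placeholders following Sourcehut's ~user/repo URL scheme, which the location regex above is written to match):

```
git clone https://git.example.com/~user/repo
```

A successful clone confirms both the /authorize check and the git-http-backend FastCGI path; a private repository should instead prompt for credentials or be refused.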
Todo
```nginx
server {
    listen 80;
    gzip on;
    gzip_types text/css text/html;
    server_name todo.example.com;
    client_max_body_size 100M;

    location / {
        proxy_pass http://localhost:5003;
        include headers.conf;
        add_header Content-Security-Policy "default-src 'none'; style-src 'self' 'unsafe-inline'; img-src * data:; script-src 'self' 'unsafe-inline'" always;
        include web.conf;
    }

    location /query {
        proxy_pass http://localhost:5103;
        include graphql.conf;
    }
}
```
Optionally, serve static content directly.
```nginx
location /static {
    root /usr/lib/python3.9/site-packages/todosrht;
    expires 30d;
}
```
Administration
Avoiding Timeout Errors
On weaker hardware, some Sourcehut services can take too long to reply. If faced with recurring timeout errors, try:
```nginx
location /any/path/with/proxying {
    proxy_pass http://localhost:1234;
    proxy_read_timeout 300s;
    proxy_connect_timeout 75s;
    ...
}
```
Web Crawling
Upstream strongly recommends posting a web crawler policy on each server.
```nginx
location = /robots.txt {
    root /var/www;
}
```
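As a starting point, a conservative policy might permit crawling of human-facing pages while excluding expensive, dynamically generated endpoints; a hypothetical /var/www/robots.txt (the disallowed path is illustrative, not upstream's policy):

```
# Illustrative crawler policy; tune to your deployment.
User-agent: *
Disallow: /query
Crawl-delay: 10
```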
See the upstream robots.txt at https://git.sr.ht/~sircmpwn/sr.ht-nginx/tree/master/item/robots.txt.