Sourcehut Web Server
Sourcehut is a collection of services that, at minimum, require a reverse proxy in front of them. While this is an inconvenience for small deployments, it also means that Sourcehut scales readily for critical or commercial deployments.
Setup
Upstream's recommendation is to use NGINX. The official site's configuration is available at https://git.sr.ht/~sircmpwn/sr.ht-nginx.
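For example, the upstream configuration can be fetched with git and adapted locally; the destination path below is only illustrative.

# Fetch the upstream nginx configuration to use as a starting point.
# The checkout location is arbitrary.
git clone https://git.sr.ht/~sircmpwn/sr.ht-nginx /etc/nginx/sr.ht-nginx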
Core Site
To support Stripe billing, special alterations are made to the upstream Content-Security-Policy configuration.
server {
    listen 80;
    gzip on;
    gzip_types text/css text/html;
    server_name meta.example.com;

    location / {
        proxy_pass http://localhost:5000;
        include headers.conf;
        add_header Content-Security-Policy "default-src 'none'; style-src 'self' 'unsafe-inline'; img-src 'self' data:; script-src 'self' 'unsafe-inline' *.stripe.com *.stripe.network; frame-src *.stripe.com *.stripe.network" always;
        include web.conf;
    }

    location /register {
        proxy_pass http://localhost:5000;
        add_header Content-Security-Policy "default-src 'none'; style-src 'self' 'unsafe-inline'; img-src 'self' data:; script-src 'self' 'unsafe-inline' *.stripe.com *.stripe.network; frame-src *.stripe.com *.stripe.network" always;
    }

    location /query {
        proxy_pass http://localhost:5100;
        include graphql.conf;
    }
}
Optionally, serve static content directly.
location /static { root /usr/lib/python3.9/site-packages/metasrht; expires 30d; }
Git Site
server {
    listen 80;
    gzip on;
    gzip_types text/css text/html;
    server_name git.example.com;
    client_max_body_size 100M;

    location / {
        proxy_pass http://localhost:5001;
        include headers.conf;
        add_header Content-Security-Policy "default-src 'none'; style-src 'self' 'unsafe-inline'; img-src * data:; script-src 'self' 'unsafe-inline'" always;
        include web.conf;
    }

    location /query {
        proxy_pass http://localhost:5101;
        include graphql.conf;
    }
}
Optionally, serve static content directly.
location /static { root /usr/lib/python3.9/site-packages/gitsrht; expires 30d; }
Cloning
Sourcehut itself does not support cloning git repositories over HTTP(S). Upstream's intention is to leave this role to an external web server, which can serve clones with durable performance.
At minimum, the /authorize location must be configured (to ensure the privacy of private repositories) and git(1)-related URIs must be served by git-http-backend.
location = /authorize {
    proxy_pass http://localhost:5001;
    proxy_pass_request_body off;
    proxy_set_header Content-Length "";
    proxy_set_header X-Original-URI $request_uri;
}

location ~ ^/([^/]+)/([^/]+)/(HEAD|info/refs|objects/info/.*|git-upload-pack).*$ {
    auth_request /authorize;
    root /var/lib/git;
    fastcgi_pass localhost:9000;
    fastcgi_param SCRIPT_FILENAME /usr/libexec/git-core/git-http-backend;
    fastcgi_param PATH_INFO $uri;
    fastcgi_param GIT_PROJECT_ROOT $document_root;
    fastcgi_read_timeout 500s;
    include fastcgi_params;
    gzip off;
}
The appropriate fastcgi_pass value will depend on the FastCGI deployment. Upstream's recommendation is to use spawn-fcgi and fcgiwrap with a Unix socket at unix:/run/fcgiwrap/fcgiwrap.sock.
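If the recommended spawn-fcgi and fcgiwrap setup is used, for example, the fastcgi_pass directive above points at the Unix socket instead of a TCP backend.

# Hand git-http-backend requests to fcgiwrap over its Unix socket
# (path as recommended by upstream).
fastcgi_pass unix:/run/fcgiwrap/fcgiwrap.sock;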
Note that some Linux distributions provide git-http-backend in a split package, such as git-daemon.
Bug Tracker Site
server {
    listen 80;
    gzip on;
    gzip_types text/css text/html;
    server_name todo.example.com;
    client_max_body_size 100M;

    location / {
        proxy_pass http://localhost:5003;
        include headers.conf;
        add_header Content-Security-Policy "default-src 'none'; style-src 'self' 'unsafe-inline'; img-src * data:; script-src 'self' 'unsafe-inline'" always;
        include web.conf;
    }

    location /query {
        proxy_pass http://localhost:5103;
        include graphql.conf;
    }
}
Optionally, serve static content directly.
location /static { root /usr/lib/python3.9/site-packages/todosrht; expires 30d; }
Administration
Encryption
As a general rule, to re-configure these sites for HTTPS, insert the following directives at minimum.
server {
    ...
    listen 443 ssl http2;
    ssl_certificate /path/to/cert;
    ssl_certificate_key /path/to/key;
    ...
}
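The plain-HTTP server block can then be reduced to a redirect. This is a common companion pattern rather than an upstream requirement; the server_name follows the meta.example.com example above.

# Redirect plain-HTTP requests to the HTTPS site.
server {
    listen 80;
    server_name meta.example.com;
    return 301 https://$host$request_uri;
}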
Avoiding Timeout Errors
On weaker hardware, some Sourcehut services can take too long to reply. If faced with recurring timeout errors, try:
location /any/path/with/proxying {
    proxy_pass http://localhost:1234;
    proxy_read_timeout 300s;
    proxy_connect_timeout 75s;
    ...
}
Web Crawling
Upstream strongly recommends posting a web crawler policy on each server.
location = /robots.txt { root /var/www; }
See the upstream robots.txt at https://git.sr.ht/~sircmpwn/sr.ht-nginx/tree/master/item/robots.txt.