at://nekomimi.pet/sh.tangled.repo.pull/3m5htcylipq22
Record JSON
{
"$type": "sh.tangled.repo.pull",
"createdAt": "2025-11-13T07:40:34Z",
"patch": "From 9a8033813d927f3db006be4ffd5be1216c781b0b Mon Sep 17 00:00:00 2001\nFrom: \"@nekomimi.pet\" \u003cmeowskulls@nekomimi.pet\u003e\nDate: Wed, 12 Nov 2025 17:28:10 -0500\nSubject: [PATCH 1/6] init support for redirects file\n\n---\n README.md | 32 +-\n hosting-service/EXAMPLE.md | 123 -------\n hosting-service/example-_redirects | 134 +++++++\n hosting-service/src/lib/redirects.test.ts | 215 +++++++++++\n hosting-service/src/lib/redirects.ts | 413 ++++++++++++++++++++++\n hosting-service/src/server.ts | 174 ++++++++-\n 6 files changed, 961 insertions(+), 130 deletions(-)\n delete mode 100644 hosting-service/EXAMPLE.md\n create mode 100644 hosting-service/example-_redirects\n create mode 100644 hosting-service/src/lib/redirects.test.ts\n create mode 100644 hosting-service/src/lib/redirects.ts\n\ndiff --git a/README.md b/README.md\nindex 0271aa3..211f12d 100644\n--- a/README.md\n+++ b/README.md\n@@ -50,10 +50,40 @@ cd cli\n cargo build\n ```\n \n+## Features\n+\n+### URL Redirects and Rewrites\n+\n+The hosting service supports Netlify-style `_redirects` files for managing URLs. Place a `_redirects` file in your site root to enable:\n+\n+- **301/302 Redirects**: Permanent and temporary URL redirects\n+- **200 Rewrites**: Serve different content without changing the URL\n+- **404 Custom Pages**: Custom error pages for specific paths\n+- **Splats \u0026 Placeholders**: Dynamic path matching (`/blog/:year/:month/:day`, `/news/*`)\n+- **Query Parameter Matching**: Redirect based on URL parameters\n+- **Conditional Redirects**: Route by country, language, or cookie presence\n+- **Force Redirects**: Override existing files with redirects\n+\n+Example `_redirects`:\n+```\n+# Single-page app routing (React, Vue, etc.)\n+/* /index.html 200\n+\n+# Simple redirects\n+/home /\n+/old-blog/* /blog/:splat\n+\n+# API proxy\n+/api/* https://api.example.com/:splat 200\n+\n+# Country-based routing\n+/ /us/ 302 Country=us\n+/ /uk/ 302 Country=gb\n+```\n+\n ## Limits\n \n - Max file size: 100MB (PDS limit)\n-- Max site size: 300MB\n - Max files: 2000\n \n ## Tech Stack\ndiff --git a/hosting-service/EXAMPLE.md b/hosting-service/EXAMPLE.md\ndeleted file mode 100644\nindex adf7cd9..0000000\n--- a/hosting-service/EXAMPLE.md\n+++ /dev/null\n@@ -1,123 +0,0 @@\n-# HTML Path Rewriting Example\n-\n-This document demonstrates how HTML path rewriting works when serving sites via the `/s/:identifier/:site/*` route.\n-\n-## Problem\n-\n-When you create a static site with absolute paths like `/style.css` or `/images/logo.png`, these paths work fine when served from the root domain. 
However, when served from a subdirectory like `/s/alice.bsky.social/mysite/`, these absolute paths break because they resolve to the server root instead of the site root.\n-\n-## Solution\n-\n-The hosting service automatically rewrites absolute paths in HTML files to work correctly in the subdirectory context.\n-\n-## Example\n-\n-**Original HTML file (index.html):**\n-```html\n-\u003c!DOCTYPE html\u003e\n-\u003chtml\u003e\n-\u003chead\u003e\n- \u003cmeta charset=\"UTF-8\"\u003e\n- \u003ctitle\u003eMy Site\u003c/title\u003e\n- \u003clink rel=\"stylesheet\" href=\"/style.css\"\u003e\n- \u003clink rel=\"icon\" href=\"/favicon.ico\"\u003e\n- \u003cscript src=\"/app.js\"\u003e\u003c/script\u003e\n-\u003c/head\u003e\n-\u003cbody\u003e\n- \u003cheader\u003e\n- \u003cimg src=\"/images/logo.png\" alt=\"Logo\"\u003e\n- \u003cnav\u003e\n- \u003ca href=\"/\"\u003eHome\u003c/a\u003e\n- \u003ca href=\"/about\"\u003eAbout\u003c/a\u003e\n- \u003ca href=\"/contact\"\u003eContact\u003c/a\u003e\n- \u003c/nav\u003e\n- \u003c/header\u003e\n-\n- \u003cmain\u003e\n- \u003ch1\u003eWelcome\u003c/h1\u003e\n- \u003cimg src=\"/images/hero.jpg\"\n- srcset=\"/images/hero.jpg 1x, /images/hero@2x.jpg 2x\"\n- alt=\"Hero\"\u003e\n-\n- \u003cform action=\"/submit\" method=\"post\"\u003e\n- \u003cinput type=\"text\" name=\"email\"\u003e\n- \u003cbutton\u003eSubmit\u003c/button\u003e\n- \u003c/form\u003e\n- \u003c/main\u003e\n-\n- \u003cfooter\u003e\n- \u003ca href=\"https://example.com\"\u003eExternal Link\u003c/a\u003e\n- \u003ca href=\"#top\"\u003eBack to Top\u003c/a\u003e\n- \u003c/footer\u003e\n-\u003c/body\u003e\n-\u003c/html\u003e\n-```\n-\n-**When accessed via `/s/alice.bsky.social/mysite/`, the HTML is rewritten to:**\n-```html\n-\u003c!DOCTYPE html\u003e\n-\u003chtml\u003e\n-\u003chead\u003e\n- \u003cmeta charset=\"UTF-8\"\u003e\n- \u003ctitle\u003eMy Site\u003c/title\u003e\n- \u003clink rel=\"stylesheet\" href=\"/s/alice.bsky.social/mysite/style.css\"\u003e\n- \u003clink rel=\"icon\" href=\"/s/alice.bsky.social/mysite/favicon.ico\"\u003e\n- \u003cscript src=\"/s/alice.bsky.social/mysite/app.js\"\u003e\u003c/script\u003e\n-\u003c/head\u003e\n-\u003cbody\u003e\n- \u003cheader\u003e\n- \u003cimg src=\"/s/alice.bsky.social/mysite/images/logo.png\" alt=\"Logo\"\u003e\n- \u003cnav\u003e\n- \u003ca href=\"/s/alice.bsky.social/mysite/\"\u003eHome\u003c/a\u003e\n- \u003ca href=\"/s/alice.bsky.social/mysite/about\"\u003eAbout\u003c/a\u003e\n- \u003ca href=\"/s/alice.bsky.social/mysite/contact\"\u003eContact\u003c/a\u003e\n- \u003c/nav\u003e\n- \u003c/header\u003e\n-\n- \u003cmain\u003e\n- \u003ch1\u003eWelcome\u003c/h1\u003e\n- \u003cimg src=\"/s/alice.bsky.social/mysite/images/hero.jpg\"\n- srcset=\"/s/alice.bsky.social/mysite/images/hero.jpg 1x, /s/alice.bsky.social/mysite/images/hero@2x.jpg 2x\"\n- alt=\"Hero\"\u003e\n-\n- \u003cform action=\"/s/alice.bsky.social/mysite/submit\" method=\"post\"\u003e\n- \u003cinput type=\"text\" name=\"email\"\u003e\n- \u003cbutton\u003eSubmit\u003c/button\u003e\n- \u003c/form\u003e\n- \u003c/main\u003e\n-\n- \u003cfooter\u003e\n- \u003ca href=\"https://example.com\"\u003eExternal Link\u003c/a\u003e\n- \u003ca href=\"#top\"\u003eBack to Top\u003c/a\u003e\n- \u003c/footer\u003e\n-\u003c/body\u003e\n-\u003c/html\u003e\n-```\n-\n-## What's Preserved\n-\n-Notice that:\n-- ✅ Absolute paths are rewritten: `/style.css` → `/s/alice.bsky.social/mysite/style.css`\n-- ✅ External URLs are preserved: `https://example.com` stays the same\n-- ✅ Anchors are preserved: `#top` stays the same\n-- ✅ The 
rewriting is safe and won't break your site\n-\n-## Supported Attributes\n-\n-The rewriter handles these HTML attributes:\n-- `src` - images, scripts, iframes, videos, audio\n-- `href` - links, stylesheets\n-- `action` - forms\n-- `data` - objects\n-- `poster` - video posters\n-- `srcset` - responsive images\n-\n-## Testing Your Site\n-\n-To test if your site works with path rewriting:\n-\n-1. Upload your site to your PDS as a `place.wisp.fs` record\n-2. Access it via: `https://hosting.wisp.place/s/YOUR_HANDLE/SITE_NAME/`\n-3. Check that all resources load correctly\n-\n-If you're using relative paths already (like `./style.css` or `../images/logo.png`), they'll work without any rewriting.\ndiff --git a/hosting-service/example-_redirects b/hosting-service/example-_redirects\nnew file mode 100644\nindex 0000000..901c201\n--- /dev/null\n+++ b/hosting-service/example-_redirects\n@@ -0,0 +1,134 @@\n+# Example _redirects file for Wisp hosting\n+# Place this file in the root directory of your site as \"_redirects\"\n+# Lines starting with # are comments\n+\n+# ===================================\n+# SIMPLE REDIRECTS\n+# ===================================\n+\n+# Redirect home page\n+# /home /\n+\n+# Redirect old URLs to new ones\n+# /old-blog /blog\n+# /about-us /about\n+\n+# ===================================\n+# SPLAT REDIRECTS (WILDCARDS)\n+# ===================================\n+\n+# Redirect entire directories\n+# /news/* /blog/:splat\n+# /old-site/* /new-site/:splat\n+\n+# ===================================\n+# PLACEHOLDER REDIRECTS\n+# ===================================\n+\n+# Restructure blog URLs\n+# /blog/:year/:month/:day/:slug /posts/:year-:month-:day/:slug\n+\n+# Capture multiple parameters\n+# /products/:category/:id /shop/:category/item/:id\n+\n+# ===================================\n+# STATUS CODES\n+# ===================================\n+\n+# Permanent redirect (301) - default if not specified\n+# /permanent-move /new-location 301\n+\n+# Temporary redirect (302)\n+# /temp-redirect /temp-location 302\n+\n+# Rewrite (200) - serves different content, URL stays the same\n+# /api/* /functions/:splat 200\n+\n+# Custom 404 page\n+# /shop/* /shop-closed.html 404\n+\n+# ===================================\n+# FORCE REDIRECTS\n+# ===================================\n+\n+# Force redirect even if file exists (note the ! 
after status code)\n+# /override-file /other-file.html 200!\n+\n+# ===================================\n+# CONDITIONAL REDIRECTS\n+# ===================================\n+\n+# Country-based redirects (ISO 3166-1 alpha-2 codes)\n+# / /us/ 302 Country=us\n+# / /uk/ 302 Country=gb\n+# / /anz/ 302 Country=au,nz\n+\n+# Language-based redirects\n+# /products /en/products 301 Language=en\n+# /products /de/products 301 Language=de\n+# /products /fr/products 301 Language=fr\n+\n+# Cookie-based redirects (checks if cookie exists)\n+# /* /legacy/:splat 200 Cookie=is_legacy\n+\n+# ===================================\n+# QUERY PARAMETERS\n+# ===================================\n+\n+# Match specific query parameters\n+# /store id=:id /blog/:id 301\n+\n+# Multiple parameters\n+# /search q=:query category=:cat /find/:cat/:query 301\n+\n+# ===================================\n+# DOMAIN-LEVEL REDIRECTS\n+# ===================================\n+\n+# Redirect to different domain (must include protocol)\n+# /external https://example.com/path\n+\n+# Redirect entire subdomain\n+# http://blog.example.com/* https://example.com/blog/:splat 301!\n+# https://blog.example.com/* https://example.com/blog/:splat 301!\n+\n+# ===================================\n+# COMMON PATTERNS\n+# ===================================\n+\n+# Remove .html extensions\n+# /page.html /page\n+\n+# Add trailing slash\n+# /about /about/\n+\n+# Single-page app fallback (serve index.html for all paths)\n+# /* /index.html 200\n+\n+# API proxy\n+# /api/* https://api.example.com/:splat 200\n+\n+# ===================================\n+# CUSTOM ERROR PAGES\n+# ===================================\n+\n+# Language-specific 404 pages\n+# /en/* /en/404.html 404\n+# /de/* /de/404.html 404\n+\n+# Section-specific 404 pages\n+# /shop/* /shop/not-found.html 404\n+# /blog/* /blog/404.html 404\n+\n+# ===================================\n+# NOTES\n+# ===================================\n+# \n+# - Rules are processed in order (first match wins)\n+# - More specific rules should come before general ones\n+# - Splats (*) can only be used at the end of a path\n+# - Query parameters are automatically preserved for 200, 301, 302\n+# - Trailing slashes are normalized (/ and no / are treated the same)\n+# - Default status code is 301 if not specified\n+#\n+\ndiff --git a/hosting-service/src/lib/redirects.test.ts b/hosting-service/src/lib/redirects.test.ts\nnew file mode 100644\nindex 0000000..f61d5a3\n--- /dev/null\n+++ b/hosting-service/src/lib/redirects.test.ts\n@@ -0,0 +1,215 @@\n+import { describe, it, expect } from 'bun:test'\n+import { parseRedirectsFile, matchRedirectRule } from './redirects';\n+\n+describe('parseRedirectsFile', () =\u003e {\n+ it('should parse simple redirects', () =\u003e {\n+ const content = `\n+# Comment line\n+/old-path /new-path\n+/home / 301\n+`;\n+ const rules = parseRedirectsFile(content);\n+ expect(rules).toHaveLength(2);\n+ expect(rules[0]).toMatchObject({\n+ from: '/old-path',\n+ to: '/new-path',\n+ status: 301,\n+ force: false,\n+ });\n+ expect(rules[1]).toMatchObject({\n+ from: '/home',\n+ to: '/',\n+ status: 301,\n+ force: false,\n+ });\n+ });\n+\n+ it('should parse redirects with different status codes', () =\u003e {\n+ const content = `\n+/temp-redirect /target 302\n+/rewrite /content 200\n+/not-found /404 404\n+`;\n+ const rules = parseRedirectsFile(content);\n+ expect(rules).toHaveLength(3);\n+ expect(rules[0]?.status).toBe(302);\n+ expect(rules[1]?.status).toBe(200);\n+ expect(rules[2]?.status).toBe(404);\n+ });\n+\n+ it('should 
parse force redirects', () =\u003e {\n+ const content = `/force-path /target 301!`;\n+ const rules = parseRedirectsFile(content);\n+ expect(rules[0]?.force).toBe(true);\n+ expect(rules[0]?.status).toBe(301);\n+ });\n+\n+ it('should parse splat redirects', () =\u003e {\n+ const content = `/news/* /blog/:splat`;\n+ const rules = parseRedirectsFile(content);\n+ expect(rules[0]?.from).toBe('/news/*');\n+ expect(rules[0]?.to).toBe('/blog/:splat');\n+ });\n+\n+ it('should parse placeholder redirects', () =\u003e {\n+ const content = `/blog/:year/:month/:day /posts/:year-:month-:day`;\n+ const rules = parseRedirectsFile(content);\n+ expect(rules[0]?.from).toBe('/blog/:year/:month/:day');\n+ expect(rules[0]?.to).toBe('/posts/:year-:month-:day');\n+ });\n+\n+ it('should parse country-based redirects', () =\u003e {\n+ const content = `/ /anz 302 Country=au,nz`;\n+ const rules = parseRedirectsFile(content);\n+ expect(rules[0]?.conditions?.country).toEqual(['au', 'nz']);\n+ });\n+\n+ it('should parse language-based redirects', () =\u003e {\n+ const content = `/products /en/products 301 Language=en`;\n+ const rules = parseRedirectsFile(content);\n+ expect(rules[0]?.conditions?.language).toEqual(['en']);\n+ });\n+\n+ it('should parse cookie-based redirects', () =\u003e {\n+ const content = `/* /legacy/:splat 200 Cookie=is_legacy,my_cookie`;\n+ const rules = parseRedirectsFile(content);\n+ expect(rules[0]?.conditions?.cookie).toEqual(['is_legacy', 'my_cookie']);\n+ });\n+});\n+\n+describe('matchRedirectRule', () =\u003e {\n+ it('should match exact paths', () =\u003e {\n+ const rules = parseRedirectsFile('/old-path /new-path');\n+ const match = matchRedirectRule('/old-path', rules);\n+ expect(match).toBeTruthy();\n+ expect(match?.targetPath).toBe('/new-path');\n+ expect(match?.status).toBe(301);\n+ });\n+\n+ it('should match paths with trailing slash', () =\u003e {\n+ const rules = parseRedirectsFile('/old-path /new-path');\n+ const match = matchRedirectRule('/old-path/', rules);\n+ expect(match).toBeTruthy();\n+ expect(match?.targetPath).toBe('/new-path');\n+ });\n+\n+ it('should match splat patterns', () =\u003e {\n+ const rules = parseRedirectsFile('/news/* /blog/:splat');\n+ const match = matchRedirectRule('/news/2024/01/15/my-post', rules);\n+ expect(match).toBeTruthy();\n+ expect(match?.targetPath).toBe('/blog/2024/01/15/my-post');\n+ });\n+\n+ it('should match placeholder patterns', () =\u003e {\n+ const rules = parseRedirectsFile('/blog/:year/:month/:day /posts/:year-:month-:day');\n+ const match = matchRedirectRule('/blog/2024/01/15', rules);\n+ expect(match).toBeTruthy();\n+ expect(match?.targetPath).toBe('/posts/2024-01-15');\n+ });\n+\n+ it('should preserve query strings for 301/302 redirects', () =\u003e {\n+ const rules = parseRedirectsFile('/old /new 301');\n+ const match = matchRedirectRule('/old', rules, {\n+ queryParams: { foo: 'bar', baz: 'qux' },\n+ });\n+ expect(match?.targetPath).toContain('?');\n+ expect(match?.targetPath).toContain('foo=bar');\n+ expect(match?.targetPath).toContain('baz=qux');\n+ });\n+\n+ it('should match based on query parameters', () =\u003e {\n+ const rules = parseRedirectsFile('/store id=:id /blog/:id 301');\n+ const match = matchRedirectRule('/store', rules, {\n+ queryParams: { id: 'my-post' },\n+ });\n+ expect(match).toBeTruthy();\n+ expect(match?.targetPath).toContain('/blog/my-post');\n+ });\n+\n+ it('should not match when query params are missing', () =\u003e {\n+ const rules = parseRedirectsFile('/store id=:id /blog/:id 301');\n+ const match = 
matchRedirectRule('/store', rules, {\n+ queryParams: {},\n+ });\n+ expect(match).toBeNull();\n+ });\n+\n+ it('should match based on country header', () =\u003e {\n+ const rules = parseRedirectsFile('/ /aus 302 Country=au');\n+ const match = matchRedirectRule('/', rules, {\n+ headers: { 'cf-ipcountry': 'AU' },\n+ });\n+ expect(match).toBeTruthy();\n+ expect(match?.targetPath).toBe('/aus');\n+ });\n+\n+ it('should not match wrong country', () =\u003e {\n+ const rules = parseRedirectsFile('/ /aus 302 Country=au');\n+ const match = matchRedirectRule('/', rules, {\n+ headers: { 'cf-ipcountry': 'US' },\n+ });\n+ expect(match).toBeNull();\n+ });\n+\n+ it('should match based on language header', () =\u003e {\n+ const rules = parseRedirectsFile('/products /en/products 301 Language=en');\n+ const match = matchRedirectRule('/products', rules, {\n+ headers: { 'accept-language': 'en-US,en;q=0.9' },\n+ });\n+ expect(match).toBeTruthy();\n+ expect(match?.targetPath).toBe('/en/products');\n+ });\n+\n+ it('should match based on cookie presence', () =\u003e {\n+ const rules = parseRedirectsFile('/* /legacy/:splat 200 Cookie=is_legacy');\n+ const match = matchRedirectRule('/some-path', rules, {\n+ cookies: { is_legacy: 'true' },\n+ });\n+ expect(match).toBeTruthy();\n+ expect(match?.targetPath).toBe('/legacy/some-path');\n+ });\n+\n+ it('should return first matching rule', () =\u003e {\n+ const content = `\n+/path /first\n+/path /second\n+`;\n+ const rules = parseRedirectsFile(content);\n+ const match = matchRedirectRule('/path', rules);\n+ expect(match?.targetPath).toBe('/first');\n+ });\n+\n+ it('should match more specific rules before general ones', () =\u003e {\n+ const content = `\n+/jobs/customer-ninja /careers/support\n+/jobs/* /careers/:splat\n+`;\n+ const rules = parseRedirectsFile(content);\n+ \n+ const match1 = matchRedirectRule('/jobs/customer-ninja', rules);\n+ expect(match1?.targetPath).toBe('/careers/support');\n+ \n+ const match2 = matchRedirectRule('/jobs/developer', rules);\n+ expect(match2?.targetPath).toBe('/careers/developer');\n+ });\n+\n+ it('should handle SPA routing pattern', () =\u003e {\n+ const rules = parseRedirectsFile('/* /index.html 200');\n+ \n+ // Should match any path\n+ const match1 = matchRedirectRule('/about', rules);\n+ expect(match1).toBeTruthy();\n+ expect(match1?.targetPath).toBe('/index.html');\n+ expect(match1?.status).toBe(200);\n+ \n+ const match2 = matchRedirectRule('/users/123/profile', rules);\n+ expect(match2).toBeTruthy();\n+ expect(match2?.targetPath).toBe('/index.html');\n+ expect(match2?.status).toBe(200);\n+ \n+ const match3 = matchRedirectRule('/', rules);\n+ expect(match3).toBeTruthy();\n+ expect(match3?.targetPath).toBe('/index.html');\n+ });\n+});\n+\ndiff --git a/hosting-service/src/lib/redirects.ts b/hosting-service/src/lib/redirects.ts\nnew file mode 100644\nindex 0000000..f3c5273\n--- /dev/null\n+++ b/hosting-service/src/lib/redirects.ts\n@@ -0,0 +1,413 @@\n+import { readFile } from 'fs/promises';\n+import { existsSync } from 'fs';\n+\n+export interface RedirectRule {\n+ from: string;\n+ to: string;\n+ status: number;\n+ force: boolean;\n+ conditions?: {\n+ country?: string[];\n+ language?: string[];\n+ role?: string[];\n+ cookie?: string[];\n+ };\n+ // For pattern matching\n+ fromPattern?: RegExp;\n+ fromParams?: string[]; // Named parameters from the pattern\n+ queryParams?: Record\u003cstring, string\u003e; // Expected query parameters\n+}\n+\n+export interface RedirectMatch {\n+ rule: RedirectRule;\n+ targetPath: string;\n+ status: 
number;\n+}\n+\n+/**\n+ * Parse a _redirects file into an array of redirect rules\n+ */\n+export function parseRedirectsFile(content: string): RedirectRule[] {\n+ const lines = content.split('\\n');\n+ const rules: RedirectRule[] = [];\n+\n+ for (let lineNum = 0; lineNum \u003c lines.length; lineNum++) {\n+ const lineRaw = lines[lineNum];\n+ if (!lineRaw) continue;\n+ \n+ const line = lineRaw.trim();\n+ \n+ // Skip empty lines and comments\n+ if (!line || line.startsWith('#')) {\n+ continue;\n+ }\n+\n+ try {\n+ const rule = parseRedirectLine(line);\n+ if (rule \u0026\u0026 rule.fromPattern) {\n+ rules.push(rule);\n+ }\n+ } catch (err) {\n+ console.warn(`Failed to parse redirect rule on line ${lineNum + 1}: ${line}`, err);\n+ }\n+ }\n+\n+ return rules;\n+}\n+\n+/**\n+ * Parse a single redirect rule line\n+ * Format: /from [query_params] /to [status] [conditions]\n+ */\n+function parseRedirectLine(line: string): RedirectRule | null {\n+ // Split by whitespace, but respect quoted strings (though not commonly used)\n+ const parts = line.split(/\\s+/);\n+ \n+ if (parts.length \u003c 2) {\n+ return null;\n+ }\n+\n+ let idx = 0;\n+ const from = parts[idx++];\n+ \n+ if (!from) {\n+ return null;\n+ }\n+ \n+ let status = 301; // Default status\n+ let force = false;\n+ const conditions: NonNullable\u003cRedirectRule['conditions']\u003e = {};\n+ const queryParams: Record\u003cstring, string\u003e = {};\n+ \n+ // Parse query parameters that come before the destination path\n+ // They look like: key=:value (and don't start with /)\n+ while (idx \u003c parts.length) {\n+ const part = parts[idx];\n+ if (!part) {\n+ idx++;\n+ continue;\n+ }\n+ \n+ // If it starts with / or http, it's the destination path\n+ if (part.startsWith('/') || part.startsWith('http://') || part.startsWith('https://')) {\n+ break;\n+ }\n+ \n+ // If it contains = and comes before the destination, it's a query param\n+ if (part.includes('=')) {\n+ const splitIndex = part.indexOf('=');\n+ const key = part.slice(0, splitIndex);\n+ const value = part.slice(splitIndex + 1);\n+ \n+ if (key \u0026\u0026 value) {\n+ queryParams[key] = value;\n+ }\n+ idx++;\n+ } else {\n+ // Not a query param, must be destination or something else\n+ break;\n+ }\n+ }\n+ \n+ // Next part should be the destination\n+ if (idx \u003e= parts.length) {\n+ return null;\n+ }\n+ \n+ const to = parts[idx++];\n+ if (!to) {\n+ return null;\n+ }\n+\n+ // Parse remaining parts for status code and conditions\n+ for (let i = idx; i \u003c parts.length; i++) {\n+ const part = parts[i];\n+ \n+ if (!part) continue;\n+ \n+ // Check for status code (with optional ! 
for force)\n+ if (/^\\d+!?$/.test(part)) {\n+ if (part.endsWith('!')) {\n+ force = true;\n+ status = parseInt(part.slice(0, -1));\n+ } else {\n+ status = parseInt(part);\n+ }\n+ continue;\n+ }\n+\n+ // Check for condition parameters (Country=, Language=, Role=, Cookie=)\n+ if (part.includes('=')) {\n+ const splitIndex = part.indexOf('=');\n+ const key = part.slice(0, splitIndex);\n+ const value = part.slice(splitIndex + 1);\n+ \n+ if (!key || !value) continue;\n+ \n+ const keyLower = key.toLowerCase();\n+ \n+ if (keyLower === 'country') {\n+ conditions.country = value.split(',').map(v =\u003e v.trim().toLowerCase());\n+ } else if (keyLower === 'language') {\n+ conditions.language = value.split(',').map(v =\u003e v.trim().toLowerCase());\n+ } else if (keyLower === 'role') {\n+ conditions.role = value.split(',').map(v =\u003e v.trim());\n+ } else if (keyLower === 'cookie') {\n+ conditions.cookie = value.split(',').map(v =\u003e v.trim().toLowerCase());\n+ }\n+ }\n+ }\n+\n+ // Parse the 'from' pattern\n+ const { pattern, params } = convertPathToRegex(from);\n+\n+ return {\n+ from,\n+ to,\n+ status,\n+ force,\n+ conditions: Object.keys(conditions).length \u003e 0 ? conditions : undefined,\n+ queryParams: Object.keys(queryParams).length \u003e 0 ? queryParams : undefined,\n+ fromPattern: pattern,\n+ fromParams: params,\n+ };\n+}\n+\n+/**\n+ * Convert a path pattern with placeholders and splats to a regex\n+ * Examples:\n+ * /blog/:year/:month/:day -\u003e captures year, month, day\n+ * /news/* -\u003e captures splat\n+ */\n+function convertPathToRegex(pattern: string): { pattern: RegExp; params: string[] } {\n+ const params: string[] = [];\n+ let regexStr = '^';\n+ \n+ // Split by query string if present\n+ const pathPart = pattern.split('?')[0] || pattern;\n+ \n+ // Escape special regex characters except * and :\n+ let escaped = pathPart.replace(/[.+^${}()|[\\]\\\\]/g, '\\\\$\u0026');\n+ \n+ // Replace :param with named capture groups\n+ escaped = escaped.replace(/:([a-zA-Z_][a-zA-Z0-9_]*)/g, (match, paramName) =\u003e {\n+ params.push(paramName);\n+ // Match path segment (everything except / and ?)\n+ return '([^/?]+)';\n+ });\n+ \n+ // Replace * with splat capture (matches everything including /)\n+ if (escaped.includes('*')) {\n+ escaped = escaped.replace(/\\*/g, '(.*)');\n+ params.push('splat');\n+ }\n+ \n+ regexStr += escaped;\n+ \n+ // Make trailing slash optional\n+ if (!regexStr.endsWith('.*')) {\n+ regexStr += '/?';\n+ }\n+ \n+ regexStr += '$';\n+ \n+ return {\n+ pattern: new RegExp(regexStr),\n+ params,\n+ };\n+}\n+\n+/**\n+ * Match a request path against redirect rules\n+ */\n+export function matchRedirectRule(\n+ requestPath: string,\n+ rules: RedirectRule[],\n+ context?: {\n+ queryParams?: Record\u003cstring, string\u003e;\n+ headers?: Record\u003cstring, string\u003e;\n+ cookies?: Record\u003cstring, string\u003e;\n+ }\n+): RedirectMatch | null {\n+ // Normalize path: ensure leading slash, remove trailing slash (except for root)\n+ let normalizedPath = requestPath.startsWith('/') ? 
requestPath : `/${requestPath}`;\n+ \n+ for (const rule of rules) {\n+ // Check query parameter conditions first (if any)\n+ if (rule.queryParams) {\n+ // If rule requires query params but none provided, skip this rule\n+ if (!context?.queryParams) {\n+ continue;\n+ }\n+ \n+ const queryMatches = Object.entries(rule.queryParams).every(([key, value]) =\u003e {\n+ const actualValue = context.queryParams?.[key];\n+ return actualValue !== undefined;\n+ });\n+ \n+ if (!queryMatches) {\n+ continue;\n+ }\n+ }\n+\n+ // Check conditional redirects (country, language, role, cookie)\n+ if (rule.conditions) {\n+ if (rule.conditions.country \u0026\u0026 context?.headers) {\n+ const cfCountry = context.headers['cf-ipcountry'];\n+ const xCountry = context.headers['x-country'];\n+ const country = (cfCountry?.toLowerCase() || xCountry?.toLowerCase());\n+ if (!country || !rule.conditions.country.includes(country)) {\n+ continue;\n+ }\n+ }\n+\n+ if (rule.conditions.language \u0026\u0026 context?.headers) {\n+ const acceptLang = context.headers['accept-language'];\n+ if (!acceptLang) {\n+ continue;\n+ }\n+ // Parse accept-language header (simplified)\n+ const langs = acceptLang.split(',').map(l =\u003e {\n+ const langPart = l.split(';')[0];\n+ return langPart ? langPart.trim().toLowerCase() : '';\n+ }).filter(l =\u003e l !== '');\n+ const hasMatch = rule.conditions.language.some(lang =\u003e \n+ langs.some(l =\u003e l === lang || l.startsWith(lang + '-'))\n+ );\n+ if (!hasMatch) {\n+ continue;\n+ }\n+ }\n+\n+ if (rule.conditions.cookie \u0026\u0026 context?.cookies) {\n+ const hasCookie = rule.conditions.cookie.some(cookieName =\u003e \n+ context.cookies \u0026\u0026 cookieName in context.cookies\n+ );\n+ if (!hasCookie) {\n+ continue;\n+ }\n+ }\n+\n+ // Role-based redirects would need JWT verification - skip for now\n+ if (rule.conditions.role) {\n+ continue;\n+ }\n+ }\n+\n+ // Match the path pattern\n+ const match = rule.fromPattern?.exec(normalizedPath);\n+ if (!match) {\n+ continue;\n+ }\n+\n+ // Build the target path by replacing placeholders\n+ let targetPath = rule.to;\n+ \n+ // Replace captured parameters\n+ if (rule.fromParams \u0026\u0026 match.length \u003e 1) {\n+ for (let i = 0; i \u003c rule.fromParams.length; i++) {\n+ const paramName = rule.fromParams[i];\n+ const paramValue = match[i + 1];\n+ \n+ if (!paramName || !paramValue) continue;\n+ \n+ if (paramName === 'splat') {\n+ targetPath = targetPath.replace(':splat', paramValue);\n+ } else {\n+ targetPath = targetPath.replace(`:${paramName}`, paramValue);\n+ }\n+ }\n+ }\n+\n+ // Handle query parameter replacements\n+ if (rule.queryParams \u0026\u0026 context?.queryParams) {\n+ for (const [key, placeholder] of Object.entries(rule.queryParams)) {\n+ const actualValue = context.queryParams[key];\n+ if (actualValue \u0026\u0026 placeholder \u0026\u0026 placeholder.startsWith(':')) {\n+ const paramName = placeholder.slice(1);\n+ if (paramName) {\n+ targetPath = targetPath.replace(`:${paramName}`, actualValue);\n+ }\n+ }\n+ }\n+ }\n+\n+ // Preserve query string for 200, 301, 302 redirects (unless target already has one)\n+ if ([200, 301, 302].includes(rule.status) \u0026\u0026 context?.queryParams \u0026\u0026 !targetPath.includes('?')) {\n+ const queryString = Object.entries(context.queryParams)\n+ .map(([k, v]) =\u003e `${encodeURIComponent(k)}=${encodeURIComponent(v)}`)\n+ .join('\u0026');\n+ if (queryString) {\n+ targetPath += `?${queryString}`;\n+ }\n+ }\n+\n+ return {\n+ rule,\n+ targetPath,\n+ status: rule.status,\n+ };\n+ }\n+\n+ return 
null;\n+}\n+\n+/**\n+ * Load redirect rules from a cached site\n+ */\n+export async function loadRedirectRules(did: string, rkey: string): Promise\u003cRedirectRule[]\u003e {\n+ const CACHE_DIR = process.env.CACHE_DIR || './cache/sites';\n+ const redirectsPath = `${CACHE_DIR}/${did}/${rkey}/_redirects`;\n+ \n+ if (!existsSync(redirectsPath)) {\n+ return [];\n+ }\n+\n+ try {\n+ const content = await readFile(redirectsPath, 'utf-8');\n+ return parseRedirectsFile(content);\n+ } catch (err) {\n+ console.error('Failed to load _redirects file', err);\n+ return [];\n+ }\n+}\n+\n+/**\n+ * Parse cookies from Cookie header\n+ */\n+export function parseCookies(cookieHeader?: string): Record\u003cstring, string\u003e {\n+ if (!cookieHeader) return {};\n+ \n+ const cookies: Record\u003cstring, string\u003e = {};\n+ const parts = cookieHeader.split(';');\n+ \n+ for (const part of parts) {\n+ const [key, ...valueParts] = part.split('=');\n+ if (key \u0026\u0026 valueParts.length \u003e 0) {\n+ cookies[key.trim()] = valueParts.join('=').trim();\n+ }\n+ }\n+ \n+ return cookies;\n+}\n+\n+/**\n+ * Parse query string into object\n+ */\n+export function parseQueryString(url: string): Record\u003cstring, string\u003e {\n+ const queryStart = url.indexOf('?');\n+ if (queryStart === -1) return {};\n+ \n+ const queryString = url.slice(queryStart + 1);\n+ const params: Record\u003cstring, string\u003e = {};\n+ \n+ for (const pair of queryString.split('\u0026')) {\n+ const [key, value] = pair.split('=');\n+ if (key) {\n+ params[decodeURIComponent(key)] = value ? decodeURIComponent(value) : '';\n+ }\n+ }\n+ \n+ return params;\n+}\n+\ndiff --git a/hosting-service/src/server.ts b/hosting-service/src/server.ts\nindex 45971c1..a76a0c8 100644\n--- a/hosting-service/src/server.ts\n+++ b/hosting-service/src/server.ts\n@@ -7,6 +7,7 @@ import { readFile, access } from 'fs/promises';\n import { lookup } from 'mime-types';\n import { logger, observabilityMiddleware, observabilityErrorHandler, logCollector, errorTracker, metricsCollector } from './lib/observability';\n import { fileCache, metadataCache, rewrittenHtmlCache, getCacheKey, type FileMetadata } from './lib/cache';\n+import { loadRedirectRules, matchRedirectRule, parseCookies, parseQueryString, type RedirectRule } from './lib/redirects';\n \n const BASE_HOST = process.env.BASE_HOST || 'wisp.place';\n \n@@ -35,8 +36,85 @@ async function fileExists(path: string): Promise\u003cboolean\u003e {\n }\n }\n \n+// Cache for redirect rules (per site)\n+const redirectRulesCache = new Map\u003cstring, RedirectRule[]\u003e();\n+\n+/**\n+ * Clear redirect rules cache for a specific site\n+ * Should be called when a site is updated/recached\n+ */\n+export function clearRedirectRulesCache(did: string, rkey: string) {\n+ const cacheKey = `${did}:${rkey}`;\n+ redirectRulesCache.delete(cacheKey);\n+}\n+\n // Helper to serve files from cache\n-async function serveFromCache(did: string, rkey: string, filePath: string) {\n+async function serveFromCache(\n+ did: string, \n+ rkey: string, \n+ filePath: string,\n+ fullUrl?: string,\n+ headers?: Record\u003cstring, string\u003e\n+) {\n+ // Check for redirect rules first\n+ const redirectCacheKey = `${did}:${rkey}`;\n+ let redirectRules = redirectRulesCache.get(redirectCacheKey);\n+ \n+ if (redirectRules === undefined) {\n+ // Load rules for the first time\n+ redirectRules = await loadRedirectRules(did, rkey);\n+ redirectRulesCache.set(redirectCacheKey, redirectRules);\n+ }\n+\n+ // Apply redirect rules if any exist\n+ if (redirectRules.length 
\u003e 0) {\n+ const requestPath = '/' + (filePath || '');\n+ const queryParams = fullUrl ? parseQueryString(fullUrl) : {};\n+ const cookies = parseCookies(headers?.['cookie']);\n+ \n+ const redirectMatch = matchRedirectRule(requestPath, redirectRules, {\n+ queryParams,\n+ headers,\n+ cookies,\n+ });\n+\n+ if (redirectMatch) {\n+ const { targetPath, status } = redirectMatch;\n+ \n+ // Handle different status codes\n+ if (status === 200) {\n+ // Rewrite: serve different content but keep URL the same\n+ // Remove leading slash for internal path resolution\n+ const rewritePath = targetPath.startsWith('/') ? targetPath.slice(1) : targetPath;\n+ return serveFileInternal(did, rkey, rewritePath);\n+ } else if (status === 301 || status === 302) {\n+ // External redirect: change the URL\n+ return new Response(null, {\n+ status,\n+ headers: {\n+ 'Location': targetPath,\n+ 'Cache-Control': status === 301 ? 'public, max-age=31536000' : 'public, max-age=0',\n+ },\n+ });\n+ } else if (status === 404) {\n+ // Custom 404 page\n+ const custom404Path = targetPath.startsWith('/') ? targetPath.slice(1) : targetPath;\n+ const response = await serveFileInternal(did, rkey, custom404Path);\n+ // Override status to 404\n+ return new Response(response.body, {\n+ status: 404,\n+ headers: response.headers,\n+ });\n+ }\n+ }\n+ }\n+\n+ // No redirect matched, serve normally\n+ return serveFileInternal(did, rkey, filePath);\n+}\n+\n+// Internal function to serve a file (used by both normal serving and rewrites)\n+async function serveFileInternal(did: string, rkey: string, filePath: string) {\n // Default to index.html if path is empty or ends with /\n let requestPath = filePath || 'index.html';\n if (requestPath.endsWith('/')) {\n@@ -138,8 +216,74 @@ async function serveFromCacheWithRewrite(\n did: string,\n rkey: string,\n filePath: string,\n- basePath: string\n+ basePath: string,\n+ fullUrl?: string,\n+ headers?: Record\u003cstring, string\u003e\n ) {\n+ // Check for redirect rules first\n+ const redirectCacheKey = `${did}:${rkey}`;\n+ let redirectRules = redirectRulesCache.get(redirectCacheKey);\n+ \n+ if (redirectRules === undefined) {\n+ // Load rules for the first time\n+ redirectRules = await loadRedirectRules(did, rkey);\n+ redirectRulesCache.set(redirectCacheKey, redirectRules);\n+ }\n+\n+ // Apply redirect rules if any exist\n+ if (redirectRules.length \u003e 0) {\n+ const requestPath = '/' + (filePath || '');\n+ const queryParams = fullUrl ? parseQueryString(fullUrl) : {};\n+ const cookies = parseCookies(headers?.['cookie']);\n+ \n+ const redirectMatch = matchRedirectRule(requestPath, redirectRules, {\n+ queryParams,\n+ headers,\n+ cookies,\n+ });\n+\n+ if (redirectMatch) {\n+ const { targetPath, status } = redirectMatch;\n+ \n+ // Handle different status codes\n+ if (status === 200) {\n+ // Rewrite: serve different content but keep URL the same\n+ const rewritePath = targetPath.startsWith('/') ? targetPath.slice(1) : targetPath;\n+ return serveFileInternalWithRewrite(did, rkey, rewritePath, basePath);\n+ } else if (status === 301 || status === 302) {\n+ // External redirect: change the URL\n+ // For sites.wisp.place, we need to adjust the target path to include the base path\n+ // unless it's an absolute URL\n+ let redirectTarget = targetPath;\n+ if (!targetPath.startsWith('http://') \u0026\u0026 !targetPath.startsWith('https://')) {\n+ redirectTarget = basePath + (targetPath.startsWith('/') ? 
targetPath.slice(1) : targetPath);\n+ }\n+ return new Response(null, {\n+ status,\n+ headers: {\n+ 'Location': redirectTarget,\n+ 'Cache-Control': status === 301 ? 'public, max-age=31536000' : 'public, max-age=0',\n+ },\n+ });\n+ } else if (status === 404) {\n+ // Custom 404 page\n+ const custom404Path = targetPath.startsWith('/') ? targetPath.slice(1) : targetPath;\n+ const response = await serveFileInternalWithRewrite(did, rkey, custom404Path, basePath);\n+ // Override status to 404\n+ return new Response(response.body, {\n+ status: 404,\n+ headers: response.headers,\n+ });\n+ }\n+ }\n+ }\n+\n+ // No redirect matched, serve normally\n+ return serveFileInternalWithRewrite(did, rkey, filePath, basePath);\n+}\n+\n+// Internal function to serve a file with rewriting\n+async function serveFileInternalWithRewrite(did: string, rkey: string, filePath: string, basePath: string) {\n // Default to index.html if path is empty or ends with /\n let requestPath = filePath || 'index.html';\n if (requestPath.endsWith('/')) {\n@@ -317,6 +461,8 @@ async function ensureSiteCached(did: string, rkey: string): Promise\u003cboolean\u003e {\n \n try {\n await downloadAndCacheSite(did, rkey, siteData.record, pdsEndpoint, siteData.cid);\n+ // Clear redirect rules cache since the site was updated\n+ clearRedirectRulesCache(did, rkey);\n logger.info('Site cached successfully', { did, rkey });\n return true;\n } catch (err) {\n@@ -384,7 +530,11 @@ app.get('/*', async (c) =\u003e {\n \n // Serve with HTML path rewriting to handle absolute paths\n const basePath = `/${identifier}/${site}/`;\n- return serveFromCacheWithRewrite(did, site, filePath, basePath);\n+ const headers: Record\u003cstring, string\u003e = {};\n+ c.req.raw.headers.forEach((value, key) =\u003e {\n+ headers[key.toLowerCase()] = value;\n+ });\n+ return serveFromCacheWithRewrite(did, site, filePath, basePath, c.req.url, headers);\n }\n \n // Check if this is a DNS hash subdomain\n@@ -420,7 +570,11 @@ app.get('/*', async (c) =\u003e {\n return c.text('Site not found', 404);\n }\n \n- return serveFromCache(customDomain.did, rkey, path);\n+ const headers: Record\u003cstring, string\u003e = {};\n+ c.req.raw.headers.forEach((value, key) =\u003e {\n+ headers[key.toLowerCase()] = value;\n+ });\n+ return serveFromCache(customDomain.did, rkey, path, c.req.url, headers);\n }\n \n // Route 2: Registered subdomains - /*.wisp.place/*\n@@ -444,7 +598,11 @@ app.get('/*', async (c) =\u003e {\n return c.text('Site not found', 404);\n }\n \n- return serveFromCache(domainInfo.did, rkey, path);\n+ const headers: Record\u003cstring, string\u003e = {};\n+ c.req.raw.headers.forEach((value, key) =\u003e {\n+ headers[key.toLowerCase()] = value;\n+ });\n+ return serveFromCache(domainInfo.did, rkey, path, c.req.url, headers);\n }\n \n // Route 1: Custom domains - /*\n@@ -467,7 +625,11 @@ app.get('/*', async (c) =\u003e {\n return c.text('Site not found', 404);\n }\n \n- return serveFromCache(customDomain.did, rkey, path);\n+ const headers: Record\u003cstring, string\u003e = {};\n+ c.req.raw.headers.forEach((value, key) =\u003e {\n+ headers[key.toLowerCase()] = value;\n+ });\n+ return serveFromCache(customDomain.did, rkey, path, c.req.url, headers);\n });\n \n // Internal observability endpoints (for admin panel)\n-- \n2.50.1 (Apple Git-155)\n\n\nFrom f1f70b3b22ddf300959c8855fb721e139b9ec8a6 Mon Sep 17 00:00:00 2001\nFrom: \"@nekomimi.pet\" \u003cmeowskulls@nekomimi.pet\u003e\nDate: Wed, 12 Nov 2025 18:33:31 -0500\nSubject: [PATCH 2/6] Add support for existing blob reuse in 
deployment process\n\n---\n cli/.gitignore | 1 +\n cli/Cargo.lock | 3 +\n cli/Cargo.toml | 3 +\n cli/src/blob_map.rs | 92 +++++++++++++++++++++++++\n cli/src/cid.rs | 66 ++++++++++++++++++\n cli/src/main.rs | 159 +++++++++++++++++++++++++++++++++-----------\n 6 files changed, 286 insertions(+), 38 deletions(-)\n create mode 100644 cli/src/blob_map.rs\n create mode 100644 cli/src/cid.rs\n\ndiff --git a/cli/.gitignore b/cli/.gitignore\nindex fcd9e40..15fe010 100644\n--- a/cli/.gitignore\n+++ b/cli/.gitignore\n@@ -1,3 +1,4 @@\n+test/\n .DS_STORE\n jacquard/\n binaries/\ndiff --git a/cli/Cargo.lock b/cli/Cargo.lock\nindex 4b0ba8b..a100cf6 100644\n--- a/cli/Cargo.lock\n+++ b/cli/Cargo.lock\n@@ -4385,10 +4385,13 @@ dependencies = [\n \"jacquard-oauth\",\n \"miette\",\n \"mime_guess\",\n+ \"multibase\",\n+ \"multihash\",\n \"reqwest\",\n \"rustversion\",\n \"serde\",\n \"serde_json\",\n+ \"sha2\",\n \"shellexpand\",\n \"tokio\",\n \"walkdir\",\ndiff --git a/cli/Cargo.toml b/cli/Cargo.toml\nindex 99493fb..6e0d1e2 100644\n--- a/cli/Cargo.toml\n+++ b/cli/Cargo.toml\n@@ -30,3 +30,6 @@ walkdir = \"2.5\"\n mime_guess = \"2.0\"\n bytes = \"1.10\"\n futures = \"0.3.31\"\n+multihash = \"0.19.3\"\n+multibase = \"0.9\"\n+sha2 = \"0.10\"\ndiff --git a/cli/src/blob_map.rs b/cli/src/blob_map.rs\nnew file mode 100644\nindex 0000000..93c86bd\n--- /dev/null\n+++ b/cli/src/blob_map.rs\n@@ -0,0 +1,92 @@\n+use jacquard_common::types::blob::BlobRef;\n+use jacquard_common::IntoStatic;\n+use std::collections::HashMap;\n+\n+use crate::place_wisp::fs::{Directory, EntryNode};\n+\n+/// Extract blob information from a directory tree\n+/// Returns a map of file paths to their blob refs and CIDs\n+/// \n+/// This mirrors the TypeScript implementation in src/lib/wisp-utils.ts lines 275-302\n+pub fn extract_blob_map(\n+ directory: \u0026Directory,\n+) -\u003e HashMap\u003cString, (BlobRef\u003c'static\u003e, String)\u003e {\n+ extract_blob_map_recursive(directory, String::new())\n+}\n+\n+fn extract_blob_map_recursive(\n+ directory: \u0026Directory,\n+ current_path: String,\n+) -\u003e HashMap\u003cString, (BlobRef\u003c'static\u003e, String)\u003e {\n+ let mut blob_map = HashMap::new();\n+ \n+ for entry in \u0026directory.entries {\n+ let full_path = if current_path.is_empty() {\n+ entry.name.to_string()\n+ } else {\n+ format!(\"{}/{}\", current_path, entry.name)\n+ };\n+ \n+ match \u0026entry.node {\n+ EntryNode::File(file_node) =\u003e {\n+ // Extract CID from blob ref\n+ // BlobRef is an enum with Blob variant, which has a ref field (CidLink)\n+ let blob_ref = \u0026file_node.blob;\n+ let cid_string = blob_ref.blob().r#ref.to_string();\n+ \n+ // Store both normalized and full paths\n+ // Normalize by removing base folder prefix (e.g., \"cobblemon/index.html\" -\u003e \"index.html\")\n+ let normalized_path = normalize_path(\u0026full_path);\n+ \n+ blob_map.insert(\n+ normalized_path.clone(),\n+ (blob_ref.clone().into_static(), cid_string.clone())\n+ );\n+ \n+ // Also store the full path for matching\n+ if normalized_path != full_path {\n+ blob_map.insert(\n+ full_path,\n+ (blob_ref.clone().into_static(), cid_string)\n+ );\n+ }\n+ }\n+ EntryNode::Directory(subdir) =\u003e {\n+ let sub_map = extract_blob_map_recursive(subdir, full_path);\n+ blob_map.extend(sub_map);\n+ }\n+ EntryNode::Unknown(_) =\u003e {\n+ // Skip unknown node types\n+ }\n+ }\n+ }\n+ \n+ blob_map\n+}\n+\n+/// Normalize file path by removing base folder prefix\n+/// Example: \"cobblemon/index.html\" -\u003e \"index.html\"\n+/// \n+/// Mirrors TypeScript 
implementation at src/routes/wisp.ts line 291\n+pub fn normalize_path(path: \u0026str) -\u003e String {\n+ // Remove base folder prefix (everything before first /)\n+ if let Some(idx) = path.find('/') {\n+ path[idx + 1..].to_string()\n+ } else {\n+ path.to_string()\n+ }\n+}\n+\n+#[cfg(test)]\n+mod tests {\n+ use super::*;\n+\n+ #[test]\n+ fn test_normalize_path() {\n+ assert_eq!(normalize_path(\"index.html\"), \"index.html\");\n+ assert_eq!(normalize_path(\"cobblemon/index.html\"), \"index.html\");\n+ assert_eq!(normalize_path(\"folder/subfolder/file.txt\"), \"subfolder/file.txt\");\n+ assert_eq!(normalize_path(\"a/b/c/d.txt\"), \"b/c/d.txt\");\n+ }\n+}\n+\ndiff --git a/cli/src/cid.rs b/cli/src/cid.rs\nnew file mode 100644\nindex 0000000..5190d30\n--- /dev/null\n+++ b/cli/src/cid.rs\n@@ -0,0 +1,66 @@\n+use jacquard_common::types::cid::IpldCid;\n+use sha2::{Digest, Sha256};\n+\n+/// Compute CID (Content Identifier) for blob content\n+/// Uses the same algorithm as AT Protocol: CIDv1 with raw codec (0x55) and SHA-256\n+/// \n+/// CRITICAL: This must be called on BASE64-ENCODED GZIPPED content, not just gzipped content\n+/// \n+/// Based on @atproto/common/src/ipld.ts sha256RawToCid implementation\n+pub fn compute_cid(content: \u0026[u8]) -\u003e String {\n+ // Use node crypto to compute sha256 hash (same as AT Protocol)\n+ let hash = Sha256::digest(content);\n+ \n+ // Create multihash (code 0x12 = sha2-256)\n+ let multihash = multihash::Multihash::wrap(0x12, \u0026hash)\n+ .expect(\"SHA-256 hash should always fit in multihash\");\n+ \n+ // Create CIDv1 with raw codec (0x55)\n+ let cid = IpldCid::new_v1(0x55, multihash);\n+ \n+ // Convert to base32 string representation\n+ cid.to_string_of_base(multibase::Base::Base32Lower)\n+ .unwrap_or_else(|_| cid.to_string())\n+}\n+\n+#[cfg(test)]\n+mod tests {\n+ use super::*;\n+ use base64::Engine;\n+\n+ #[test]\n+ fn test_compute_cid() {\n+ // Test with a simple string: \"hello\"\n+ let content = b\"hello\";\n+ let cid = compute_cid(content);\n+ \n+ // CID should start with 'baf' for raw codec base32\n+ assert!(cid.starts_with(\"baf\"));\n+ }\n+\n+ #[test]\n+ fn test_compute_cid_base64_encoded() {\n+ // Simulate the actual use case: gzipped then base64 encoded\n+ use flate2::write::GzEncoder;\n+ use flate2::Compression;\n+ use std::io::Write;\n+ \n+ let original = b\"hello world\";\n+ \n+ // Gzip compress\n+ let mut encoder = GzEncoder::new(Vec::new(), Compression::default());\n+ encoder.write_all(original).unwrap();\n+ let gzipped = encoder.finish().unwrap();\n+ \n+ // Base64 encode the gzipped data\n+ let base64_bytes = base64::prelude::BASE64_STANDARD.encode(\u0026gzipped).into_bytes();\n+ \n+ // Compute CID on the base64 bytes\n+ let cid = compute_cid(\u0026base64_bytes);\n+ \n+ // Should be a valid CID\n+ assert!(cid.starts_with(\"baf\"));\n+ assert!(cid.len() \u003e 10);\n+ }\n+}\n+\ndiff --git a/cli/src/main.rs b/cli/src/main.rs\nindex cfeb908..db0e7cf 100644\n--- a/cli/src/main.rs\n+++ b/cli/src/main.rs\n@@ -1,9 +1,11 @@\n mod builder_types;\n mod place_wisp;\n+mod cid;\n+mod blob_map;\n \n use clap::Parser;\n use jacquard::CowStr;\n-use jacquard::client::{Agent, FileAuthStore, AgentSessionExt, MemoryCredentialSession};\n+use jacquard::client::{Agent, FileAuthStore, AgentSessionExt, MemoryCredentialSession, AgentSession};\n use jacquard::oauth::client::OAuthClient;\n use jacquard::oauth::loopback::LoopbackConfig;\n use jacquard::prelude::IdentityResolver;\n@@ -11,6 +13,7 @@ use jacquard_common::types::string::{Datetime, Rkey, RecordKey};\n use 
jacquard_common::types::blob::MimeType;\n use miette::IntoDiagnostic;\n use std::path::{Path, PathBuf};\n+use std::collections::HashMap;\n use flate2::Compression;\n use flate2::write::GzEncoder;\n use std::io::Write;\n@@ -107,17 +110,56 @@ async fn deploy_site(\n \n println!(\"Deploying site '{}'...\", site_name);\n \n- // Build directory tree\n- let root_dir = build_directory(agent, \u0026path).await?;\n+ // Try to fetch existing manifest for incremental updates\n+ let existing_blob_map: HashMap\u003cString, (jacquard_common::types::blob::BlobRef\u003c'static\u003e, String)\u003e = {\n+ use jacquard_common::types::string::AtUri;\n+ \n+ // Get the DID for this session\n+ let session_info = agent.session_info().await;\n+ if let Some((did, _)) = session_info {\n+ // Construct the AT URI for the record\n+ let uri_string = format!(\"at://{}/place.wisp.fs/{}\", did, site_name);\n+ if let Ok(uri) = AtUri::new(\u0026uri_string) {\n+ match agent.get_record::\u003cFs\u003e(\u0026uri).await {\n+ Ok(response) =\u003e {\n+ match response.into_output() {\n+ Ok(record_output) =\u003e {\n+ let existing_manifest = record_output.value;\n+ let blob_map = blob_map::extract_blob_map(\u0026existing_manifest.root);\n+ println!(\"Found existing manifest with {} files, checking for changes...\", blob_map.len());\n+ blob_map\n+ }\n+ Err(_) =\u003e {\n+ println!(\"No existing manifest found, uploading all files...\");\n+ HashMap::new()\n+ }\n+ }\n+ }\n+ Err(_) =\u003e {\n+ // Record doesn't exist yet - this is a new site\n+ println!(\"No existing manifest found, uploading all files...\");\n+ HashMap::new()\n+ }\n+ }\n+ } else {\n+ println!(\"No existing manifest found (invalid URI), uploading all files...\");\n+ HashMap::new()\n+ }\n+ } else {\n+ println!(\"No existing manifest found (could not get DID), uploading all files...\");\n+ HashMap::new()\n+ }\n+ };\n \n- // Count total files\n- let file_count = count_files(\u0026root_dir);\n+ // Build directory tree\n+ let (root_dir, total_files, reused_count) = build_directory(agent, \u0026path, \u0026existing_blob_map).await?;\n+ let uploaded_count = total_files - reused_count;\n \n // Create the Fs record\n let fs_record = Fs::new()\n .site(CowStr::from(site_name.clone()))\n .root(root_dir)\n- .file_count(file_count as i64)\n+ .file_count(total_files as i64)\n .created_at(Datetime::now())\n .build();\n \n@@ -132,8 +174,9 @@ async fn deploy_site(\n .and_then(|s| s.split('/').next())\n .ok_or_else(|| miette::miette!(\"Failed to parse DID from URI\"))?;\n \n- println!(\"Deployed site '{}': {}\", site_name, output.uri);\n- println!(\"Available at: https://sites.wisp.place/{}/{}\", did, site_name);\n+ println!(\"\\n✓ Deployed site '{}': {}\", site_name, output.uri);\n+ println!(\" Total files: {} ({} reused, {} uploaded)\", total_files, reused_count, uploaded_count);\n+ println!(\" Available at: https://sites.wisp.place/{}/{}\", did, site_name);\n \n Ok(())\n }\n@@ -142,7 +185,8 @@ async fn deploy_site(\n fn build_directory\u003c'a\u003e(\n agent: \u0026'a Agent\u003cimpl jacquard::client::AgentSession + IdentityResolver + 'a\u003e,\n dir_path: \u0026'a Path,\n-) -\u003e std::pin::Pin\u003cBox\u003cdyn std::future::Future\u003cOutput = miette::Result\u003cDirectory\u003c'static\u003e\u003e\u003e + 'a\u003e\u003e\n+ existing_blobs: \u0026'a HashMap\u003cString, (jacquard_common::types::blob::BlobRef\u003c'static\u003e, String)\u003e,\n+) -\u003e std::pin::Pin\u003cBox\u003cdyn std::future::Future\u003cOutput = miette::Result\u003c(Directory\u003c'static\u003e, usize, 
usize)\u003e\u003e + 'a\u003e\u003e\n {\n Box::pin(async move {\n // Collect all directory entries first\n@@ -177,46 +221,66 @@ fn build_directory\u003c'a\u003e(\n }\n \n // Process files concurrently with a limit of 5\n- let file_entries: Vec\u003cEntry\u003e = stream::iter(file_tasks)\n+ let file_results: Vec\u003c(Entry\u003c'static\u003e, bool)\u003e = stream::iter(file_tasks)\n .map(|(name, path)| async move {\n- let file_node = process_file(agent, \u0026path).await?;\n- Ok::\u003c_, miette::Report\u003e(Entry::new()\n+ let (file_node, reused) = process_file(agent, \u0026path, \u0026name, existing_blobs).await?;\n+ let entry = Entry::new()\n .name(CowStr::from(name))\n .node(EntryNode::File(Box::new(file_node)))\n- .build())\n+ .build();\n+ Ok::\u003c_, miette::Report\u003e((entry, reused))\n })\n .buffer_unordered(5)\n .collect::\u003cVec\u003c_\u003e\u003e()\n .await\n .into_iter()\n .collect::\u003cmiette::Result\u003cVec\u003c_\u003e\u003e\u003e()?;\n+ \n+ let mut file_entries = Vec::new();\n+ let mut reused_count = 0;\n+ let mut total_files = 0;\n+ \n+ for (entry, reused) in file_results {\n+ file_entries.push(entry);\n+ total_files += 1;\n+ if reused {\n+ reused_count += 1;\n+ }\n+ }\n \n // Process directories recursively (sequentially to avoid too much nesting)\n let mut dir_entries = Vec::new();\n for (name, path) in dir_tasks {\n- let subdir = build_directory(agent, \u0026path).await?;\n+ let (subdir, sub_total, sub_reused) = build_directory(agent, \u0026path, existing_blobs).await?;\n dir_entries.push(Entry::new()\n .name(CowStr::from(name))\n .node(EntryNode::Directory(Box::new(subdir)))\n .build());\n+ total_files += sub_total;\n+ reused_count += sub_reused;\n }\n \n // Combine file and directory entries\n let mut entries = file_entries;\n entries.extend(dir_entries);\n \n- Ok(Directory::new()\n+ let directory = Directory::new()\n .r#type(CowStr::from(\"directory\"))\n .entries(entries)\n- .build())\n+ .build();\n+ \n+ Ok((directory, total_files, reused_count))\n })\n }\n \n-/// Process a single file: gzip -\u003e base64 -\u003e upload blob\n+/// Process a single file: gzip -\u003e base64 -\u003e upload blob (or reuse existing)\n+/// Returns (File, reused: bool)\n async fn process_file(\n agent: \u0026Agent\u003cimpl jacquard::client::AgentSession + IdentityResolver\u003e,\n file_path: \u0026Path,\n-) -\u003e miette::Result\u003cFile\u003c'static\u003e\u003e\n+ file_name: \u0026str,\n+ existing_blobs: \u0026HashMap\u003cString, (jacquard_common::types::blob::BlobRef\u003c'static\u003e, String)\u003e,\n+) -\u003e miette::Result\u003c(File\u003c'static\u003e, bool)\u003e\n {\n // Read file\n let file_data = std::fs::read(file_path).into_diagnostic()?;\n@@ -234,30 +298,49 @@ async fn process_file(\n // Base64 encode the gzipped data\n let base64_bytes = base64::prelude::BASE64_STANDARD.encode(\u0026gzipped).into_bytes();\n \n- // Upload blob as octet-stream\n+ // Compute CID for this file (CRITICAL: on base64-encoded gzipped content)\n+ let file_cid = cid::compute_cid(\u0026base64_bytes);\n+ \n+ // Normalize the file path for comparison\n+ let normalized_path = blob_map::normalize_path(file_name);\n+ \n+ // Check if we have an existing blob with the same CID\n+ let existing_blob = existing_blobs.get(\u0026normalized_path)\n+ .or_else(|| existing_blobs.get(file_name));\n+ \n+ if let Some((existing_blob_ref, existing_cid)) = existing_blob {\n+ if existing_cid == \u0026file_cid {\n+ // CIDs match - reuse existing blob\n+ println!(\" ✓ Reusing blob for {} (CID: {})\", file_name, 
file_cid);\n+ return Ok((\n+ File::new()\n+ .r#type(CowStr::from(\"file\"))\n+ .blob(existing_blob_ref.clone())\n+ .encoding(CowStr::from(\"gzip\"))\n+ .mime_type(CowStr::from(original_mime))\n+ .base64(true)\n+ .build(),\n+ true\n+ ));\n+ }\n+ }\n+ \n+ // File is new or changed - upload it\n+ println!(\" ↑ Uploading {} ({} bytes, CID: {})\", file_name, base64_bytes.len(), file_cid);\n let blob = agent.upload_blob(\n base64_bytes,\n MimeType::new_static(\"application/octet-stream\"),\n ).await?;\n \n- Ok(File::new()\n- .r#type(CowStr::from(\"file\"))\n- .blob(blob)\n- .encoding(CowStr::from(\"gzip\"))\n- .mime_type(CowStr::from(original_mime))\n- .base64(true)\n- .build())\n+ Ok((\n+ File::new()\n+ .r#type(CowStr::from(\"file\"))\n+ .blob(blob)\n+ .encoding(CowStr::from(\"gzip\"))\n+ .mime_type(CowStr::from(original_mime))\n+ .base64(true)\n+ .build(),\n+ false\n+ ))\n }\n \n-/// Count total files in a directory tree\n-fn count_files(dir: \u0026Directory) -\u003e usize {\n- let mut count = 0;\n- for entry in \u0026dir.entries {\n- match \u0026entry.node {\n- EntryNode::File(_) =\u003e count += 1,\n- EntryNode::Directory(subdir) =\u003e count += count_files(subdir),\n- _ =\u003e {} // Unknown variants\n- }\n- }\n- count\n-}\n-- \n2.50.1 (Apple Git-155)\n\n\nFrom 56b1ef45ccab3ffc0112d4895e2c31a0954d0199 Mon Sep 17 00:00:00 2001\nFrom: \"@nekomimi.pet\" \u003cmeowskulls@nekomimi.pet\u003e\nDate: Wed, 12 Nov 2025 20:28:44 -0500\nSubject: [PATCH 3/6] dont normalize paths when comparing CIDs\n\n---\n cli/src/blob_map.rs | 23 ++++++++---------------\n cli/src/main.rs | 37 ++++++++++++++++++++++++-------------\n 2 files changed, 32 insertions(+), 28 deletions(-)\n\ndiff --git a/cli/src/blob_map.rs b/cli/src/blob_map.rs\nindex 93c86bd..de5f211 100644\n--- a/cli/src/blob_map.rs\n+++ b/cli/src/blob_map.rs\n@@ -34,22 +34,11 @@ fn extract_blob_map_recursive(\n let blob_ref = \u0026file_node.blob;\n let cid_string = blob_ref.blob().r#ref.to_string();\n \n- // Store both normalized and full paths\n- // Normalize by removing base folder prefix (e.g., \"cobblemon/index.html\" -\u003e \"index.html\")\n- let normalized_path = normalize_path(\u0026full_path);\n- \n+ // Store with full path (mirrors TypeScript implementation)\n blob_map.insert(\n- normalized_path.clone(),\n- (blob_ref.clone().into_static(), cid_string.clone())\n+ full_path,\n+ (blob_ref.clone().into_static(), cid_string)\n );\n- \n- // Also store the full path for matching\n- if normalized_path != full_path {\n- blob_map.insert(\n- full_path,\n- (blob_ref.clone().into_static(), cid_string)\n- );\n- }\n }\n EntryNode::Directory(subdir) =\u003e {\n let sub_map = extract_blob_map_recursive(subdir, full_path);\n@@ -67,7 +56,11 @@ fn extract_blob_map_recursive(\n /// Normalize file path by removing base folder prefix\n /// Example: \"cobblemon/index.html\" -\u003e \"index.html\"\n /// \n-/// Mirrors TypeScript implementation at src/routes/wisp.ts line 291\n+/// Note: This function is kept for reference but is no longer used in production code.\n+/// The TypeScript server has a similar normalization (src/routes/wisp.ts line 291) to handle\n+/// uploads that include a base folder prefix, but our CLI doesn't need this since we\n+/// track full paths consistently.\n+#[allow(dead_code)]\n pub fn normalize_path(path: \u0026str) -\u003e String {\n // Remove base folder prefix (everything before first /)\n if let Some(idx) = path.find('/') {\ndiff --git a/cli/src/main.rs b/cli/src/main.rs\nindex db0e7cf..8db65f6 100644\n--- a/cli/src/main.rs\n+++ 
b/cli/src/main.rs\n@@ -152,7 +152,7 @@ async fn deploy_site(\n };\n \n // Build directory tree\n- let (root_dir, total_files, reused_count) = build_directory(agent, \u0026path, \u0026existing_blob_map).await?;\n+ let (root_dir, total_files, reused_count) = build_directory(agent, \u0026path, \u0026existing_blob_map, String::new()).await?;\n let uploaded_count = total_files - reused_count;\n \n // Create the Fs record\n@@ -182,10 +182,12 @@ async fn deploy_site(\n }\n \n /// Recursively build a Directory from a filesystem path\n+/// current_path is the path from the root of the site (e.g., \"\" for root, \"config\" for config dir)\n fn build_directory\u003c'a\u003e(\n agent: \u0026'a Agent\u003cimpl jacquard::client::AgentSession + IdentityResolver + 'a\u003e,\n dir_path: \u0026'a Path,\n existing_blobs: \u0026'a HashMap\u003cString, (jacquard_common::types::blob::BlobRef\u003c'static\u003e, String)\u003e,\n+ current_path: String,\n ) -\u003e std::pin::Pin\u003cBox\u003cdyn std::future::Future\u003cOutput = miette::Result\u003c(Directory\u003c'static\u003e, usize, usize)\u003e\u003e + 'a\u003e\u003e\n {\n Box::pin(async move {\n@@ -214,7 +216,13 @@ fn build_directory\u003c'a\u003e(\n let metadata = entry.metadata().into_diagnostic()?;\n \n if metadata.is_file() {\n- file_tasks.push((name_str, path));\n+ // Construct full path for this file (for blob map lookup)\n+ let full_path = if current_path.is_empty() {\n+ name_str.clone()\n+ } else {\n+ format!(\"{}/{}\", current_path, name_str)\n+ };\n+ file_tasks.push((name_str, path, full_path));\n } else if metadata.is_dir() {\n dir_tasks.push((name_str, path));\n }\n@@ -222,8 +230,8 @@ fn build_directory\u003c'a\u003e(\n \n // Process files concurrently with a limit of 5\n let file_results: Vec\u003c(Entry\u003c'static\u003e, bool)\u003e = stream::iter(file_tasks)\n- .map(|(name, path)| async move {\n- let (file_node, reused) = process_file(agent, \u0026path, \u0026name, existing_blobs).await?;\n+ .map(|(name, path, full_path)| async move {\n+ let (file_node, reused) = process_file(agent, \u0026path, \u0026full_path, existing_blobs).await?;\n let entry = Entry::new()\n .name(CowStr::from(name))\n .node(EntryNode::File(Box::new(file_node)))\n@@ -251,7 +259,13 @@ fn build_directory\u003c'a\u003e(\n // Process directories recursively (sequentially to avoid too much nesting)\n let mut dir_entries = Vec::new();\n for (name, path) in dir_tasks {\n- let (subdir, sub_total, sub_reused) = build_directory(agent, \u0026path, existing_blobs).await?;\n+ // Construct full path for subdirectory\n+ let subdir_path = if current_path.is_empty() {\n+ name.clone()\n+ } else {\n+ format!(\"{}/{}\", current_path, name)\n+ };\n+ let (subdir, sub_total, sub_reused) = build_directory(agent, \u0026path, existing_blobs, subdir_path).await?;\n dir_entries.push(Entry::new()\n .name(CowStr::from(name))\n .node(EntryNode::Directory(Box::new(subdir)))\n@@ -275,10 +289,11 @@ fn build_directory\u003c'a\u003e(\n \n /// Process a single file: gzip -\u003e base64 -\u003e upload blob (or reuse existing)\n /// Returns (File, reused: bool)\n+/// file_path_key is the full path from the site root (e.g., \"config/file.json\") for blob map lookup\n async fn process_file(\n agent: \u0026Agent\u003cimpl jacquard::client::AgentSession + IdentityResolver\u003e,\n file_path: \u0026Path,\n- file_name: \u0026str,\n+ file_path_key: \u0026str,\n existing_blobs: \u0026HashMap\u003cString, (jacquard_common::types::blob::BlobRef\u003c'static\u003e, String)\u003e,\n ) -\u003e 
miette::Result\u003c(File\u003c'static\u003e, bool)\u003e\n {\n@@ -301,17 +316,13 @@ async fn process_file(\n // Compute CID for this file (CRITICAL: on base64-encoded gzipped content)\n let file_cid = cid::compute_cid(\u0026base64_bytes);\n \n- // Normalize the file path for comparison\n- let normalized_path = blob_map::normalize_path(file_name);\n- \n // Check if we have an existing blob with the same CID\n- let existing_blob = existing_blobs.get(\u0026normalized_path)\n- .or_else(|| existing_blobs.get(file_name));\n+ let existing_blob = existing_blobs.get(file_path_key);\n \n if let Some((existing_blob_ref, existing_cid)) = existing_blob {\n if existing_cid == \u0026file_cid {\n // CIDs match - reuse existing blob\n- println!(\" ✓ Reusing blob for {} (CID: {})\", file_name, file_cid);\n+ println!(\" ✓ Reusing blob for {} (CID: {})\", file_path_key, file_cid);\n return Ok((\n File::new()\n .r#type(CowStr::from(\"file\"))\n@@ -326,7 +337,7 @@ async fn process_file(\n }\n \n // File is new or changed - upload it\n- println!(\" ↑ Uploading {} ({} bytes, CID: {})\", file_name, base64_bytes.len(), file_cid);\n+ println!(\" ↑ Uploading {} ({} bytes, CID: {})\", file_path_key, base64_bytes.len(), file_cid);\n let blob = agent.upload_blob(\n base64_bytes,\n MimeType::new_static(\"application/octet-stream\"),\n-- \n2.50.1 (Apple Git-155)\n\n\nFrom 38b1c4c6f7cc6e8f298ef3af629d2761b5f3b908 Mon Sep 17 00:00:00 2001\nFrom: \"@nekomimi.pet\" \u003cmeowskulls@nekomimi.pet\u003e\nDate: Wed, 12 Nov 2025 23:57:22 -0500\nSubject: [PATCH 4/6] add pull and serve to cli\n\n---\n cli/Cargo.lock | 560 +++++++++++++++++++++++++++++++++++++++++++-\n cli/Cargo.toml | 8 +-\n cli/src/download.rs | 71 ++++++\n cli/src/main.rs | 125 ++++++++--\n cli/src/metadata.rs | 46 ++++\n cli/src/pull.rs | 305 ++++++++++++++++++++++++\n cli/src/serve.rs | 202 ++++++++++++++++\n 7 files changed, 1295 insertions(+), 22 deletions(-)\n create mode 100644 cli/src/download.rs\n create mode 100644 cli/src/metadata.rs\n create mode 100644 cli/src/pull.rs\n create mode 100644 cli/src/serve.rs\n\ndiff --git a/cli/Cargo.lock b/cli/Cargo.lock\nindex a100cf6..5fa5a99 100644\n--- a/cli/Cargo.lock\n+++ b/cli/Cargo.lock\n@@ -173,6 +173,61 @@ version = \"1.5.0\"\n source = \"registry+https://github.com/rust-lang/crates.io-index\"\n checksum = \"c08606f8c3cbf4ce6ec8e28fb0014a2c086708fe954eaa885384a6165172e7e8\"\n \n+[[package]]\n+name = \"axum\"\n+version = \"0.7.9\"\n+source = \"registry+https://github.com/rust-lang/crates.io-index\"\n+checksum = \"edca88bc138befd0323b20752846e6587272d3b03b0343c8ea28a6f819e6e71f\"\n+dependencies = [\n+ \"async-trait\",\n+ \"axum-core\",\n+ \"bytes\",\n+ \"futures-util\",\n+ \"http\",\n+ \"http-body\",\n+ \"http-body-util\",\n+ \"hyper\",\n+ \"hyper-util\",\n+ \"itoa\",\n+ \"matchit\",\n+ \"memchr\",\n+ \"mime\",\n+ \"percent-encoding\",\n+ \"pin-project-lite\",\n+ \"rustversion\",\n+ \"serde\",\n+ \"serde_json\",\n+ \"serde_path_to_error\",\n+ \"serde_urlencoded\",\n+ \"sync_wrapper\",\n+ \"tokio\",\n+ \"tower 0.5.2\",\n+ \"tower-layer\",\n+ \"tower-service\",\n+ \"tracing\",\n+]\n+\n+[[package]]\n+name = \"axum-core\"\n+version = \"0.4.5\"\n+source = \"registry+https://github.com/rust-lang/crates.io-index\"\n+checksum = \"09f2bd6146b97ae3359fa0cc6d6b376d9539582c7b4220f041a33ec24c226199\"\n+dependencies = [\n+ \"async-trait\",\n+ \"bytes\",\n+ \"futures-util\",\n+ \"http\",\n+ \"http-body\",\n+ \"http-body-util\",\n+ \"mime\",\n+ \"pin-project-lite\",\n+ \"rustversion\",\n+ \"sync_wrapper\",\n+ 
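Dropping the normalized-path fallback matters because `normalize_path` strips everything before the first `/`, so two different files can collapse onto one key and silently compare against the wrong CID. A small demonstration of the collision (the function body mirrors the one kept behind `#[allow(dead_code)]` above):

```rust
fn normalize_path(path: &str) -> String {
    match path.find('/') {
        Some(idx) => path[idx + 1..].to_string(),
        None => path.to_string(),
    }
}

fn main() {
    // Distinct files, identical normalized key -> wrong CID comparison.
    assert_eq!(normalize_path("blog/index.html"), "index.html");
    assert_eq!(normalize_path("docs/index.html"), "index.html");
}
```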
\"tower-layer\",\n+ \"tower-service\",\n+ \"tracing\",\n+]\n+\n [[package]]\n name = \"backtrace\"\n version = \"0.3.76\"\n@@ -347,6 +402,12 @@ version = \"3.19.0\"\n source = \"registry+https://github.com/rust-lang/crates.io-index\"\n checksum = \"46c5e41b57b8bba42a04676d81cb89e9ee8e859a1a66f80a5a72e1cb76b34d43\"\n \n+[[package]]\n+name = \"byteorder\"\n+version = \"1.5.0\"\n+source = \"registry+https://github.com/rust-lang/crates.io-index\"\n+checksum = \"1fd0f2584146f6f2ef48085050886acf353beff7305ebd1ae69500e27c67f64b\"\n+\n [[package]]\n name = \"bytes\"\n version = \"1.10.1\"\n@@ -548,6 +609,16 @@ version = \"0.4.3\"\n source = \"registry+https://github.com/rust-lang/crates.io-index\"\n checksum = \"2f421161cb492475f1661ddc9815a745a1c894592070661180fdec3d4872e9c3\"\n \n+[[package]]\n+name = \"cordyceps\"\n+version = \"0.3.4\"\n+source = \"registry+https://github.com/rust-lang/crates.io-index\"\n+checksum = \"688d7fbb8092b8de775ef2536f36c8c31f2bc4006ece2e8d8ad2d17d00ce0a2a\"\n+dependencies = [\n+ \"loom\",\n+ \"tracing\",\n+]\n+\n [[package]]\n name = \"core-foundation\"\n version = \"0.9.4\"\n@@ -750,6 +821,33 @@ dependencies = [\n \"serde_core\",\n ]\n \n+[[package]]\n+name = \"derive_more\"\n+version = \"1.0.0\"\n+source = \"registry+https://github.com/rust-lang/crates.io-index\"\n+checksum = \"4a9b99b9cbbe49445b21764dc0625032a89b145a2642e67603e1c936f5458d05\"\n+dependencies = [\n+ \"derive_more-impl\",\n+]\n+\n+[[package]]\n+name = \"derive_more-impl\"\n+version = \"1.0.0\"\n+source = \"registry+https://github.com/rust-lang/crates.io-index\"\n+checksum = \"cb7330aeadfbe296029522e6c40f315320aba36fc43a5b3632f3795348f3bd22\"\n+dependencies = [\n+ \"proc-macro2\",\n+ \"quote\",\n+ \"syn 2.0.108\",\n+ \"unicode-xid\",\n+]\n+\n+[[package]]\n+name = \"diatomic-waker\"\n+version = \"0.2.3\"\n+source = \"registry+https://github.com/rust-lang/crates.io-index\"\n+checksum = \"ab03c107fafeb3ee9f5925686dbb7a73bc76e3932abb0d2b365cb64b169cf04c\"\n+\n [[package]]\n name = \"digest\"\n version = \"0.10.7\"\n@@ -955,6 +1053,19 @@ dependencies = [\n \"futures-util\",\n ]\n \n+[[package]]\n+name = \"futures-buffered\"\n+version = \"0.2.12\"\n+source = \"registry+https://github.com/rust-lang/crates.io-index\"\n+checksum = \"a8e0e1f38ec07ba4abbde21eed377082f17ccb988be9d988a5adbf4bafc118fd\"\n+dependencies = [\n+ \"cordyceps\",\n+ \"diatomic-waker\",\n+ \"futures-core\",\n+ \"pin-project-lite\",\n+ \"spin 0.10.0\",\n+]\n+\n [[package]]\n name = \"futures-channel\"\n version = \"0.3.31\"\n@@ -988,6 +1099,19 @@ version = \"0.3.31\"\n source = \"registry+https://github.com/rust-lang/crates.io-index\"\n checksum = \"9e5c1b78ca4aae1ac06c48a526a655760685149f0d465d21f37abfe57ce075c6\"\n \n+[[package]]\n+name = \"futures-lite\"\n+version = \"2.6.1\"\n+source = \"registry+https://github.com/rust-lang/crates.io-index\"\n+checksum = \"f78e10609fe0e0b3f4157ffab1876319b5b0db102a2c60dc4626306dc46b44ad\"\n+dependencies = [\n+ \"fastrand\",\n+ \"futures-core\",\n+ \"futures-io\",\n+ \"parking\",\n+ \"pin-project-lite\",\n+]\n+\n [[package]]\n name = \"futures-macro\"\n version = \"0.3.31\"\n@@ -1029,6 +1153,20 @@ dependencies = [\n \"slab\",\n ]\n \n+[[package]]\n+name = \"generator\"\n+version = \"0.8.7\"\n+source = \"registry+https://github.com/rust-lang/crates.io-index\"\n+checksum = \"605183a538e3e2a9c1038635cc5c2d194e2ee8fd0d1b66b8349fad7dbacce5a2\"\n+dependencies = [\n+ \"cc\",\n+ \"cfg-if\",\n+ \"libc\",\n+ \"log\",\n+ \"rustversion\",\n+ \"windows\",\n+]\n+\n [[package]]\n name = \"generic-array\"\n version = 
\"0.14.9\"\n@@ -1273,6 +1411,12 @@ dependencies = [\n \"pin-project-lite\",\n ]\n \n+[[package]]\n+name = \"http-range-header\"\n+version = \"0.4.2\"\n+source = \"registry+https://github.com/rust-lang/crates.io-index\"\n+checksum = \"9171a2ea8a68358193d15dd5d70c1c10a2afc3e7e4c5bc92bc9f025cebd7359c\"\n+\n [[package]]\n name = \"httparse\"\n version = \"1.10.1\"\n@@ -1299,6 +1443,7 @@ dependencies = [\n \"http\",\n \"http-body\",\n \"httparse\",\n+ \"httpdate\",\n \"itoa\",\n \"pin-project-lite\",\n \"pin-utils\",\n@@ -1362,7 +1507,7 @@ dependencies = [\n \"js-sys\",\n \"log\",\n \"wasm-bindgen\",\n- \"windows-core\",\n+ \"windows-core 0.62.2\",\n ]\n \n [[package]]\n@@ -1635,7 +1780,9 @@ dependencies = [\n \"bon\",\n \"bytes\",\n \"chrono\",\n+ \"ciborium\",\n \"cid\",\n+ \"futures\",\n \"getrandom 0.2.16\",\n \"getrandom 0.3.4\",\n \"http\",\n@@ -1645,6 +1792,7 @@ dependencies = [\n \"miette\",\n \"multibase\",\n \"multihash\",\n+ \"n0-future\",\n \"ouroboros\",\n \"p256\",\n \"rand 0.9.2\",\n@@ -1658,6 +1806,7 @@ dependencies = [\n \"smol_str\",\n \"thiserror 2.0.17\",\n \"tokio\",\n+ \"tokio-tungstenite-wasm\",\n \"tokio-util\",\n \"trait-variant\",\n \"url\",\n@@ -1856,7 +2005,7 @@ version = \"1.5.0\"\n source = \"registry+https://github.com/rust-lang/crates.io-index\"\n checksum = \"bbd2bcb4c963f2ddae06a2efc7e9f3591312473c50c6685e1f298068316e66fe\"\n dependencies = [\n- \"spin\",\n+ \"spin 0.9.8\",\n ]\n \n [[package]]\n@@ -1915,6 +2064,19 @@ version = \"0.4.28\"\n source = \"registry+https://github.com/rust-lang/crates.io-index\"\n checksum = \"34080505efa8e45a4b816c349525ebe327ceaa8559756f0356cba97ef3bf7432\"\n \n+[[package]]\n+name = \"loom\"\n+version = \"0.7.2\"\n+source = \"registry+https://github.com/rust-lang/crates.io-index\"\n+checksum = \"419e0dc8046cb947daa77eb95ae174acfbddb7673b4151f56d1eed8e93fbfaca\"\n+dependencies = [\n+ \"cfg-if\",\n+ \"generator\",\n+ \"scoped-tls\",\n+ \"tracing\",\n+ \"tracing-subscriber\",\n+]\n+\n [[package]]\n name = \"lru-cache\"\n version = \"0.1.2\"\n@@ -1973,6 +2135,21 @@ dependencies = [\n \"syn 1.0.109\",\n ]\n \n+[[package]]\n+name = \"matchers\"\n+version = \"0.2.0\"\n+source = \"registry+https://github.com/rust-lang/crates.io-index\"\n+checksum = \"d1525a2a28c7f4fa0fc98bb91ae755d1e2d1505079e05539e35bc876b5d65ae9\"\n+dependencies = [\n+ \"regex-automata\",\n+]\n+\n+[[package]]\n+name = \"matchit\"\n+version = \"0.7.3\"\n+source = \"registry+https://github.com/rust-lang/crates.io-index\"\n+checksum = \"0e7465ac9959cc2b1404e8e2367b43684a6d13790fe23056cc8c6c5a6b7bcb94\"\n+\n [[package]]\n name = \"memchr\"\n version = \"2.7.6\"\n@@ -2107,6 +2284,27 @@ dependencies = [\n \"twoway\",\n ]\n \n+[[package]]\n+name = \"n0-future\"\n+version = \"0.1.3\"\n+source = \"registry+https://github.com/rust-lang/crates.io-index\"\n+checksum = \"7bb0e5d99e681ab3c938842b96fcb41bf8a7bb4bfdb11ccbd653a7e83e06c794\"\n+dependencies = [\n+ \"cfg_aliases\",\n+ \"derive_more\",\n+ \"futures-buffered\",\n+ \"futures-lite\",\n+ \"futures-util\",\n+ \"js-sys\",\n+ \"pin-project\",\n+ \"send_wrapper\",\n+ \"tokio\",\n+ \"tokio-util\",\n+ \"wasm-bindgen\",\n+ \"wasm-bindgen-futures\",\n+ \"web-time\",\n+]\n+\n [[package]]\n name = \"ndk-context\"\n version = \"0.1.1\"\n@@ -2129,6 +2327,15 @@ dependencies = [\n \"minimal-lexical\",\n ]\n \n+[[package]]\n+name = \"nu-ansi-term\"\n+version = \"0.50.3\"\n+source = \"registry+https://github.com/rust-lang/crates.io-index\"\n+checksum = \"7957b9740744892f114936ab4a57b3f487491bbeafaf8083688b16841a4240e5\"\n+dependencies = [\n+ 
\"windows-sys 0.61.2\",\n+]\n+\n [[package]]\n name = \"num-bigint-dig\"\n version = \"0.8.5\"\n@@ -2246,6 +2453,12 @@ version = \"1.70.2\"\n source = \"registry+https://github.com/rust-lang/crates.io-index\"\n checksum = \"384b8ab6d37215f3c5301a95a4accb5d64aa607f1fcb26a11b5303878451b4fe\"\n \n+[[package]]\n+name = \"openssl-probe\"\n+version = \"0.1.6\"\n+source = \"registry+https://github.com/rust-lang/crates.io-index\"\n+checksum = \"d05e27ee213611ffe7d6348b942e8f942b37114c00cc03cec254295a4a17852e\"\n+\n [[package]]\n name = \"option-ext\"\n version = \"0.2.0\"\n@@ -2304,6 +2517,12 @@ dependencies = [\n \"primeorder\",\n ]\n \n+[[package]]\n+name = \"parking\"\n+version = \"2.2.1\"\n+source = \"registry+https://github.com/rust-lang/crates.io-index\"\n+checksum = \"f38d5652c16fde515bb1ecef450ab0f6a219d619a7274976324d5e377f7dceba\"\n+\n [[package]]\n name = \"parking_lot\"\n version = \"0.12.5\"\n@@ -2380,6 +2599,26 @@ dependencies = [\n \"siphasher\",\n ]\n \n+[[package]]\n+name = \"pin-project\"\n+version = \"1.1.10\"\n+source = \"registry+https://github.com/rust-lang/crates.io-index\"\n+checksum = \"677f1add503faace112b9f1373e43e9e054bfdd22ff1a63c1bc485eaec6a6a8a\"\n+dependencies = [\n+ \"pin-project-internal\",\n+]\n+\n+[[package]]\n+name = \"pin-project-internal\"\n+version = \"1.1.10\"\n+source = \"registry+https://github.com/rust-lang/crates.io-index\"\n+checksum = \"6e918e4ff8c4549eb882f14b3a4bc8c8bc93de829416eacf579f1207a8fbf861\"\n+dependencies = [\n+ \"proc-macro2\",\n+ \"quote\",\n+ \"syn 2.0.108\",\n+]\n+\n [[package]]\n name = \"pin-project-lite\"\n version = \"0.2.16\"\n@@ -2752,8 +2991,8 @@ dependencies = [\n \"tokio\",\n \"tokio-rustls\",\n \"tokio-util\",\n- \"tower\",\n- \"tower-http\",\n+ \"tower 0.5.2\",\n+ \"tower-http 0.6.6\",\n \"tower-service\",\n \"url\",\n \"wasm-bindgen\",\n@@ -2876,6 +3115,18 @@ dependencies = [\n \"zeroize\",\n ]\n \n+[[package]]\n+name = \"rustls-native-certs\"\n+version = \"0.8.2\"\n+source = \"registry+https://github.com/rust-lang/crates.io-index\"\n+checksum = \"9980d917ebb0c0536119ba501e90834767bffc3d60641457fd84a1f3fd337923\"\n+dependencies = [\n+ \"openssl-probe\",\n+ \"rustls-pki-types\",\n+ \"schannel\",\n+ \"security-framework\",\n+]\n+\n [[package]]\n name = \"rustls-pki-types\"\n version = \"1.13.0\"\n@@ -2924,6 +3175,15 @@ dependencies = [\n \"winapi-util\",\n ]\n \n+[[package]]\n+name = \"schannel\"\n+version = \"0.1.28\"\n+source = \"registry+https://github.com/rust-lang/crates.io-index\"\n+checksum = \"891d81b926048e76efe18581bf793546b4c0eaf8448d72be8de2bbee5fd166e1\"\n+dependencies = [\n+ \"windows-sys 0.61.2\",\n+]\n+\n [[package]]\n name = \"schemars\"\n version = \"0.9.0\"\n@@ -2948,6 +3208,12 @@ dependencies = [\n \"serde_json\",\n ]\n \n+[[package]]\n+name = \"scoped-tls\"\n+version = \"1.0.1\"\n+source = \"registry+https://github.com/rust-lang/crates.io-index\"\n+checksum = \"e1cf6437eb19a8f4a6cc0f7dca544973b0b78843adbfeb3683d1a94a0024a294\"\n+\n [[package]]\n name = \"scopeguard\"\n version = \"1.2.0\"\n@@ -2968,6 +3234,35 @@ dependencies = [\n \"zeroize\",\n ]\n \n+[[package]]\n+name = \"security-framework\"\n+version = \"3.5.1\"\n+source = \"registry+https://github.com/rust-lang/crates.io-index\"\n+checksum = \"b3297343eaf830f66ede390ea39da1d462b6b0c1b000f420d0a83f898bbbe6ef\"\n+dependencies = [\n+ \"bitflags\",\n+ \"core-foundation 0.10.1\",\n+ \"core-foundation-sys\",\n+ \"libc\",\n+ \"security-framework-sys\",\n+]\n+\n+[[package]]\n+name = \"security-framework-sys\"\n+version = \"2.15.0\"\n+source = 
\"registry+https://github.com/rust-lang/crates.io-index\"\n+checksum = \"cc1f0cbffaac4852523ce30d8bd3c5cdc873501d96ff467ca09b6767bb8cd5c0\"\n+dependencies = [\n+ \"core-foundation-sys\",\n+ \"libc\",\n+]\n+\n+[[package]]\n+name = \"send_wrapper\"\n+version = \"0.6.0\"\n+source = \"registry+https://github.com/rust-lang/crates.io-index\"\n+checksum = \"cd0b0ec5f1c1ca621c432a25813d8d60c88abe6d3e08a3eb9cf37d97a0fe3d73\"\n+\n [[package]]\n name = \"serde\"\n version = \"1.0.228\"\n@@ -3046,6 +3341,17 @@ dependencies = [\n \"serde_core\",\n ]\n \n+[[package]]\n+name = \"serde_path_to_error\"\n+version = \"0.1.20\"\n+source = \"registry+https://github.com/rust-lang/crates.io-index\"\n+checksum = \"10a9ff822e371bb5403e391ecd83e182e0e77ba7f6fe0160b795797109d1b457\"\n+dependencies = [\n+ \"itoa\",\n+ \"serde\",\n+ \"serde_core\",\n+]\n+\n [[package]]\n name = \"serde_repr\"\n version = \"0.1.20\"\n@@ -3100,6 +3406,17 @@ dependencies = [\n \"syn 2.0.108\",\n ]\n \n+[[package]]\n+name = \"sha1\"\n+version = \"0.10.6\"\n+source = \"registry+https://github.com/rust-lang/crates.io-index\"\n+checksum = \"e3bf829a2d51ab4a5ddf1352d8470c140cadc8301b2ae1789db023f01cedd6ba\"\n+dependencies = [\n+ \"cfg-if\",\n+ \"cpufeatures\",\n+ \"digest\",\n+]\n+\n [[package]]\n name = \"sha1_smol\"\n version = \"1.0.1\"\n@@ -3117,6 +3434,15 @@ dependencies = [\n \"digest\",\n ]\n \n+[[package]]\n+name = \"sharded-slab\"\n+version = \"0.1.7\"\n+source = \"registry+https://github.com/rust-lang/crates.io-index\"\n+checksum = \"f40ca3c46823713e0d4209592e8d6e826aa57e928f09752619fc696c499637f6\"\n+dependencies = [\n+ \"lazy_static\",\n+]\n+\n [[package]]\n name = \"shellexpand\"\n version = \"3.1.1\"\n@@ -3211,6 +3537,12 @@ version = \"0.9.8\"\n source = \"registry+https://github.com/rust-lang/crates.io-index\"\n checksum = \"6980e8d7511241f8acf4aebddbb1ff938df5eebe98691418c4468d0b72a96a67\"\n \n+[[package]]\n+name = \"spin\"\n+version = \"0.10.0\"\n+source = \"registry+https://github.com/rust-lang/crates.io-index\"\n+checksum = \"d5fe4ccb98d9c292d56fec89a5e07da7fc4cf0dc11e156b41793132775d3e591\"\n+\n [[package]]\n name = \"spki\"\n version = \"0.7.3\"\n@@ -3464,6 +3796,15 @@ dependencies = [\n \"syn 2.0.108\",\n ]\n \n+[[package]]\n+name = \"thread_local\"\n+version = \"1.1.9\"\n+source = \"registry+https://github.com/rust-lang/crates.io-index\"\n+checksum = \"f60246a4944f24f6e018aa17cdeffb7818b76356965d03b07d6a9886e8962185\"\n+dependencies = [\n+ \"cfg-if\",\n+]\n+\n [[package]]\n name = \"threadpool\"\n version = \"1.8.1\"\n@@ -3581,6 +3922,41 @@ dependencies = [\n \"tokio\",\n ]\n \n+[[package]]\n+name = \"tokio-tungstenite\"\n+version = \"0.24.0\"\n+source = \"registry+https://github.com/rust-lang/crates.io-index\"\n+checksum = \"edc5f74e248dc973e0dbb7b74c7e0d6fcc301c694ff50049504004ef4d0cdcd9\"\n+dependencies = [\n+ \"futures-util\",\n+ \"log\",\n+ \"rustls\",\n+ \"rustls-native-certs\",\n+ \"rustls-pki-types\",\n+ \"tokio\",\n+ \"tokio-rustls\",\n+ \"tungstenite\",\n+]\n+\n+[[package]]\n+name = \"tokio-tungstenite-wasm\"\n+version = \"0.4.0\"\n+source = \"registry+https://github.com/rust-lang/crates.io-index\"\n+checksum = \"e21a5c399399c3db9f08d8297ac12b500e86bca82e930253fdc62eaf9c0de6ae\"\n+dependencies = [\n+ \"futures-channel\",\n+ \"futures-util\",\n+ \"http\",\n+ \"httparse\",\n+ \"js-sys\",\n+ \"rustls\",\n+ \"thiserror 1.0.69\",\n+ \"tokio\",\n+ \"tokio-tungstenite\",\n+ \"wasm-bindgen\",\n+ \"web-sys\",\n+]\n+\n [[package]]\n name = \"tokio-util\"\n version = \"0.7.16\"\n@@ -3590,10 +3966,22 @@ dependencies = [\n 
\"bytes\",\n \"futures-core\",\n \"futures-sink\",\n+ \"futures-util\",\n \"pin-project-lite\",\n \"tokio\",\n ]\n \n+[[package]]\n+name = \"tower\"\n+version = \"0.4.13\"\n+source = \"registry+https://github.com/rust-lang/crates.io-index\"\n+checksum = \"b8fa9be0de6cf49e536ce1851f987bd21a43b771b09473c3549a6c853db37c1c\"\n+dependencies = [\n+ \"tower-layer\",\n+ \"tower-service\",\n+ \"tracing\",\n+]\n+\n [[package]]\n name = \"tower\"\n version = \"0.5.2\"\n@@ -3607,6 +3995,34 @@ dependencies = [\n \"tokio\",\n \"tower-layer\",\n \"tower-service\",\n+ \"tracing\",\n+]\n+\n+[[package]]\n+name = \"tower-http\"\n+version = \"0.5.2\"\n+source = \"registry+https://github.com/rust-lang/crates.io-index\"\n+checksum = \"1e9cd434a998747dd2c4276bc96ee2e0c7a2eadf3cae88e52be55a05fa9053f5\"\n+dependencies = [\n+ \"async-compression\",\n+ \"bitflags\",\n+ \"bytes\",\n+ \"futures-core\",\n+ \"futures-util\",\n+ \"http\",\n+ \"http-body\",\n+ \"http-body-util\",\n+ \"http-range-header\",\n+ \"httpdate\",\n+ \"mime\",\n+ \"mime_guess\",\n+ \"percent-encoding\",\n+ \"pin-project-lite\",\n+ \"tokio\",\n+ \"tokio-util\",\n+ \"tower-layer\",\n+ \"tower-service\",\n+ \"tracing\",\n ]\n \n [[package]]\n@@ -3622,7 +4038,7 @@ dependencies = [\n \"http-body\",\n \"iri-string\",\n \"pin-project-lite\",\n- \"tower\",\n+ \"tower 0.5.2\",\n \"tower-layer\",\n \"tower-service\",\n ]\n@@ -3645,6 +4061,7 @@ version = \"0.1.41\"\n source = \"registry+https://github.com/rust-lang/crates.io-index\"\n checksum = \"784e0ac535deb450455cbfa28a6f0df145ea1bb7ae51b821cf5e7927fdcfbdd0\"\n dependencies = [\n+ \"log\",\n \"pin-project-lite\",\n \"tracing-attributes\",\n \"tracing-core\",\n@@ -3668,6 +4085,36 @@ source = \"registry+https://github.com/rust-lang/crates.io-index\"\n checksum = \"b9d12581f227e93f094d3af2ae690a574abb8a2b9b7a96e7cfe9647b2b617678\"\n dependencies = [\n \"once_cell\",\n+ \"valuable\",\n+]\n+\n+[[package]]\n+name = \"tracing-log\"\n+version = \"0.2.0\"\n+source = \"registry+https://github.com/rust-lang/crates.io-index\"\n+checksum = \"ee855f1f400bd0e5c02d150ae5de3840039a3f54b025156404e34c23c03f47c3\"\n+dependencies = [\n+ \"log\",\n+ \"once_cell\",\n+ \"tracing-core\",\n+]\n+\n+[[package]]\n+name = \"tracing-subscriber\"\n+version = \"0.3.20\"\n+source = \"registry+https://github.com/rust-lang/crates.io-index\"\n+checksum = \"2054a14f5307d601f88daf0553e1cbf472acc4f2c51afab632431cdcd72124d5\"\n+dependencies = [\n+ \"matchers\",\n+ \"nu-ansi-term\",\n+ \"once_cell\",\n+ \"regex-automata\",\n+ \"sharded-slab\",\n+ \"smallvec\",\n+ \"thread_local\",\n+ \"tracing\",\n+ \"tracing-core\",\n+ \"tracing-log\",\n ]\n \n [[package]]\n@@ -3693,6 +4140,26 @@ version = \"0.2.5\"\n source = \"registry+https://github.com/rust-lang/crates.io-index\"\n checksum = \"e421abadd41a4225275504ea4d6566923418b7f05506fbc9c0fe86ba7396114b\"\n \n+[[package]]\n+name = \"tungstenite\"\n+version = \"0.24.0\"\n+source = \"registry+https://github.com/rust-lang/crates.io-index\"\n+checksum = \"18e5b8366ee7a95b16d32197d0b2604b43a0be89dc5fac9f8e96ccafbaedda8a\"\n+dependencies = [\n+ \"byteorder\",\n+ \"bytes\",\n+ \"data-encoding\",\n+ \"http\",\n+ \"httparse\",\n+ \"log\",\n+ \"rand 0.8.5\",\n+ \"rustls\",\n+ \"rustls-pki-types\",\n+ \"sha1\",\n+ \"thiserror 1.0.69\",\n+ \"utf-8\",\n+]\n+\n [[package]]\n name = \"twoway\"\n version = \"0.1.8\"\n@@ -3744,6 +4211,12 @@ version = \"0.2.2\"\n source = \"registry+https://github.com/rust-lang/crates.io-index\"\n checksum = \"b4ac048d71ede7ee76d585517add45da530660ef4390e49b098733c6e897f254\"\n 
\n+[[package]]\n+name = \"unicode-xid\"\n+version = \"0.2.6\"\n+source = \"registry+https://github.com/rust-lang/crates.io-index\"\n+checksum = \"ebc1c04c71510c7f702b52b7c350734c9ff1295c464a03335b00bb84fc54f853\"\n+\n [[package]]\n name = \"unsigned-varint\"\n version = \"0.8.0\"\n@@ -3792,6 +4265,12 @@ version = \"0.2.2\"\n source = \"registry+https://github.com/rust-lang/crates.io-index\"\n checksum = \"06abde3611657adf66d383f00b093d7faecc7fa57071cce2578660c9f1010821\"\n \n+[[package]]\n+name = \"valuable\"\n+version = \"0.1.1\"\n+source = \"registry+https://github.com/rust-lang/crates.io-index\"\n+checksum = \"ba73ea9cf16a25df0c8caa16c51acb937d5712a8429db78a3ee29d5dcacd3a65\"\n+\n [[package]]\n name = \"version_check\"\n version = \"0.9.5\"\n@@ -3975,6 +4454,41 @@ dependencies = [\n \"windows-sys 0.61.2\",\n ]\n \n+[[package]]\n+name = \"windows\"\n+version = \"0.61.3\"\n+source = \"registry+https://github.com/rust-lang/crates.io-index\"\n+checksum = \"9babd3a767a4c1aef6900409f85f5d53ce2544ccdfaa86dad48c91782c6d6893\"\n+dependencies = [\n+ \"windows-collections\",\n+ \"windows-core 0.61.2\",\n+ \"windows-future\",\n+ \"windows-link 0.1.3\",\n+ \"windows-numerics\",\n+]\n+\n+[[package]]\n+name = \"windows-collections\"\n+version = \"0.2.0\"\n+source = \"registry+https://github.com/rust-lang/crates.io-index\"\n+checksum = \"3beeceb5e5cfd9eb1d76b381630e82c4241ccd0d27f1a39ed41b2760b255c5e8\"\n+dependencies = [\n+ \"windows-core 0.61.2\",\n+]\n+\n+[[package]]\n+name = \"windows-core\"\n+version = \"0.61.2\"\n+source = \"registry+https://github.com/rust-lang/crates.io-index\"\n+checksum = \"c0fdd3ddb90610c7638aa2b3a3ab2904fb9e5cdbecc643ddb3647212781c4ae3\"\n+dependencies = [\n+ \"windows-implement\",\n+ \"windows-interface\",\n+ \"windows-link 0.1.3\",\n+ \"windows-result 0.3.4\",\n+ \"windows-strings 0.4.2\",\n+]\n+\n [[package]]\n name = \"windows-core\"\n version = \"0.62.2\"\n@@ -3988,6 +4502,17 @@ dependencies = [\n \"windows-strings 0.5.1\",\n ]\n \n+[[package]]\n+name = \"windows-future\"\n+version = \"0.2.1\"\n+source = \"registry+https://github.com/rust-lang/crates.io-index\"\n+checksum = \"fc6a41e98427b19fe4b73c550f060b59fa592d7d686537eebf9385621bfbad8e\"\n+dependencies = [\n+ \"windows-core 0.61.2\",\n+ \"windows-link 0.1.3\",\n+ \"windows-threading\",\n+]\n+\n [[package]]\n name = \"windows-implement\"\n version = \"0.60.2\"\n@@ -4022,6 +4547,16 @@ version = \"0.2.1\"\n source = \"registry+https://github.com/rust-lang/crates.io-index\"\n checksum = \"f0805222e57f7521d6a62e36fa9163bc891acd422f971defe97d64e70d0a4fe5\"\n \n+[[package]]\n+name = \"windows-numerics\"\n+version = \"0.2.0\"\n+source = \"registry+https://github.com/rust-lang/crates.io-index\"\n+checksum = \"9150af68066c4c5c07ddc0ce30421554771e528bde427614c61038bc2c92c2b1\"\n+dependencies = [\n+ \"windows-core 0.61.2\",\n+ \"windows-link 0.1.3\",\n+]\n+\n [[package]]\n name = \"windows-registry\"\n version = \"0.5.3\"\n@@ -4177,6 +4712,15 @@ dependencies = [\n \"windows_x86_64_msvc 0.53.1\",\n ]\n \n+[[package]]\n+name = \"windows-threading\"\n+version = \"0.1.0\"\n+source = \"registry+https://github.com/rust-lang/crates.io-index\"\n+checksum = \"b66463ad2e0ea3bbf808b7f1d371311c80e115c0b71d60efc142cafbcfb057a6\"\n+dependencies = [\n+ \"windows-link 0.1.3\",\n+]\n+\n [[package]]\n name = \"windows_aarch64_gnullvm\"\n version = \"0.42.2\"\n@@ -4371,8 +4915,10 @@ dependencies = [\n name = \"wisp-cli\"\n version = \"0.1.0\"\n dependencies = [\n+ \"axum\",\n \"base64 0.22.1\",\n \"bytes\",\n+ \"chrono\",\n \"clap\",\n 
\"flate2\",\n \"futures\",\n@@ -4387,6 +4933,7 @@ dependencies = [\n \"mime_guess\",\n \"multibase\",\n \"multihash\",\n+ \"n0-future\",\n \"reqwest\",\n \"rustversion\",\n \"serde\",\n@@ -4394,6 +4941,9 @@ dependencies = [\n \"sha2\",\n \"shellexpand\",\n \"tokio\",\n+ \"tower 0.4.13\",\n+ \"tower-http 0.5.2\",\n+ \"url\",\n \"walkdir\",\n ]\n \ndiff --git a/cli/Cargo.toml b/cli/Cargo.toml\nindex 6e0d1e2..c3eb22c 100644\n--- a/cli/Cargo.toml\n+++ b/cli/Cargo.toml\n@@ -11,7 +11,7 @@ place_wisp = []\n jacquard = { git = \"https://tangled.org/@nonbinary.computer/jacquard\", features = [\"loopback\"] }\n jacquard-oauth = { git = \"https://tangled.org/@nonbinary.computer/jacquard\" }\n jacquard-api = { git = \"https://tangled.org/@nonbinary.computer/jacquard\" }\n-jacquard-common = { git = \"https://tangled.org/@nonbinary.computer/jacquard\" }\n+jacquard-common = { git = \"https://tangled.org/@nonbinary.computer/jacquard\", features = [\"websocket\"] }\n jacquard-identity = { git = \"https://tangled.org/@nonbinary.computer/jacquard\", features = [\"dns\"] }\n jacquard-derive = { git = \"https://tangled.org/@nonbinary.computer/jacquard\" }\n jacquard-lexicon = { git = \"https://tangled.org/@nonbinary.computer/jacquard\" }\n@@ -33,3 +33,9 @@ futures = \"0.3.31\"\n multihash = \"0.19.3\"\n multibase = \"0.9\"\n sha2 = \"0.10\"\n+axum = \"0.7\"\n+tower-http = { version = \"0.5\", features = [\"fs\", \"compression-gzip\"] }\n+tower = \"0.4\"\n+n0-future = \"0.1\"\n+chrono = \"0.4\"\n+url = \"2.5\"\ndiff --git a/cli/src/download.rs b/cli/src/download.rs\nnew file mode 100644\nindex 0000000..a88a065\n--- /dev/null\n+++ b/cli/src/download.rs\n@@ -0,0 +1,71 @@\n+use base64::Engine;\n+use bytes::Bytes;\n+use flate2::read::GzDecoder;\n+use jacquard_common::types::blob::BlobRef;\n+use miette::IntoDiagnostic;\n+use std::io::Read;\n+use url::Url;\n+\n+/// Download a blob from the PDS\n+pub async fn download_blob(pds_url: \u0026Url, blob_ref: \u0026BlobRef\u003c'_\u003e, did: \u0026str) -\u003e miette::Result\u003cBytes\u003e {\n+ // Extract CID from blob ref\n+ let cid = blob_ref.blob().r#ref.to_string();\n+ \n+ // Construct blob download URL\n+ // The correct endpoint is: /xrpc/com.atproto.sync.getBlob?did={did}\u0026cid={cid}\n+ let blob_url = pds_url\n+ .join(\u0026format!(\"/xrpc/com.atproto.sync.getBlob?did={}\u0026cid={}\", did, cid))\n+ .into_diagnostic()?;\n+ \n+ let client = reqwest::Client::new();\n+ let response = client\n+ .get(blob_url)\n+ .send()\n+ .await\n+ .into_diagnostic()?;\n+ \n+ if !response.status().is_success() {\n+ return Err(miette::miette!(\n+ \"Failed to download blob: {}\",\n+ response.status()\n+ ));\n+ }\n+ \n+ let bytes = response.bytes().await.into_diagnostic()?;\n+ Ok(bytes)\n+}\n+\n+/// Decompress and decode a blob (base64 + gzip)\n+pub fn decompress_blob(data: \u0026[u8], is_base64: bool, is_gzipped: bool) -\u003e miette::Result\u003cVec\u003cu8\u003e\u003e {\n+ let mut current_data = data.to_vec();\n+ \n+ // First, decode base64 if needed\n+ if is_base64 {\n+ current_data = base64::prelude::BASE64_STANDARD\n+ .decode(\u0026current_data)\n+ .into_diagnostic()?;\n+ }\n+ \n+ // Then, decompress gzip if needed\n+ if is_gzipped {\n+ let mut decoder = GzDecoder::new(\u0026current_data[..]);\n+ let mut decompressed = Vec::new();\n+ decoder.read_to_end(\u0026mut decompressed).into_diagnostic()?;\n+ current_data = decompressed;\n+ }\n+ \n+ Ok(current_data)\n+}\n+\n+/// Download and decompress a blob\n+pub async fn download_and_decompress_blob(\n+ pds_url: \u0026Url,\n+ blob_ref: 
\u0026BlobRef\u003c'_\u003e,\n+ did: \u0026str,\n+ is_base64: bool,\n+ is_gzipped: bool,\n+) -\u003e miette::Result\u003cVec\u003cu8\u003e\u003e {\n+ let data = download_blob(pds_url, blob_ref, did).await?;\n+ decompress_blob(\u0026data, is_base64, is_gzipped)\n+}\n+\ndiff --git a/cli/src/main.rs b/cli/src/main.rs\nindex 8db65f6..46ce4bc 100644\n--- a/cli/src/main.rs\n+++ b/cli/src/main.rs\n@@ -2,8 +2,12 @@ mod builder_types;\n mod place_wisp;\n mod cid;\n mod blob_map;\n+mod metadata;\n+mod download;\n+mod pull;\n+mod serve;\n \n-use clap::Parser;\n+use clap::{Parser, Subcommand};\n use jacquard::CowStr;\n use jacquard::client::{Agent, FileAuthStore, AgentSessionExt, MemoryCredentialSession, AgentSession};\n use jacquard::oauth::client::OAuthClient;\n@@ -23,37 +27,126 @@ use futures::stream::{self, StreamExt};\n use place_wisp::fs::*;\n \n #[derive(Parser, Debug)]\n-#[command(author, version, about = \"Deploy a static site to wisp.place\")]\n+#[command(author, version, about = \"wisp.place CLI tool\")]\n struct Args {\n+ #[command(subcommand)]\n+ command: Option\u003cCommands\u003e,\n+ \n+ // Deploy arguments (when no subcommand is specified)\n /// Handle (e.g., alice.bsky.social), DID, or PDS URL\n- input: CowStr\u003c'static\u003e,\n+ #[arg(global = true, conflicts_with = \"command\")]\n+ input: Option\u003cCowStr\u003c'static\u003e\u003e,\n \n /// Path to the directory containing your static site\n- #[arg(short, long, default_value = \".\")]\n- path: PathBuf,\n+ #[arg(short, long, global = true, conflicts_with = \"command\")]\n+ path: Option\u003cPathBuf\u003e,\n \n /// Site name (defaults to directory name)\n- #[arg(short, long)]\n+ #[arg(short, long, global = true, conflicts_with = \"command\")]\n site: Option\u003cString\u003e,\n \n- /// Path to auth store file (will be created if missing, only used with OAuth)\n- #[arg(long, default_value = \"/tmp/wisp-oauth-session.json\")]\n- store: String,\n+ /// Path to auth store file\n+ #[arg(long, global = true, conflicts_with = \"command\")]\n+ store: Option\u003cString\u003e,\n \n- /// App Password for authentication (alternative to OAuth)\n- #[arg(long)]\n+ /// App Password for authentication\n+ #[arg(long, global = true, conflicts_with = \"command\")]\n password: Option\u003cCowStr\u003c'static\u003e\u003e,\n }\n \n+#[derive(Subcommand, Debug)]\n+enum Commands {\n+ /// Deploy a static site to wisp.place (default command)\n+ Deploy {\n+ /// Handle (e.g., alice.bsky.social), DID, or PDS URL\n+ input: CowStr\u003c'static\u003e,\n+\n+ /// Path to the directory containing your static site\n+ #[arg(short, long, default_value = \".\")]\n+ path: PathBuf,\n+\n+ /// Site name (defaults to directory name)\n+ #[arg(short, long)]\n+ site: Option\u003cString\u003e,\n+\n+ /// Path to auth store file (will be created if missing, only used with OAuth)\n+ #[arg(long, default_value = \"/tmp/wisp-oauth-session.json\")]\n+ store: String,\n+\n+ /// App Password for authentication (alternative to OAuth)\n+ #[arg(long)]\n+ password: Option\u003cCowStr\u003c'static\u003e\u003e,\n+ },\n+ /// Pull a site from the PDS to a local directory\n+ Pull {\n+ /// Handle (e.g., alice.bsky.social) or DID\n+ input: CowStr\u003c'static\u003e,\n+\n+ /// Site name (record key)\n+ #[arg(short, long)]\n+ site: String,\n+\n+ /// Output directory for the downloaded site\n+ #[arg(short, long, default_value = \".\")]\n+ output: PathBuf,\n+ },\n+ /// Serve a site locally with real-time firehose updates\n+ Serve {\n+ /// Handle (e.g., alice.bsky.social) or DID\n+ input: 
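Backwards compatibility comes from making the subcommand optional: when no subcommand parses, the old positional arguments drive a legacy deploy. A stripped-down sketch of the same clap pattern (clap 4 with derive; names illustrative, and without the `global`/`conflicts_with` wiring the patch adds):

```rust
use clap::{Parser, Subcommand};

#[derive(Parser, Debug)]
struct Cli {
    #[command(subcommand)]
    command: Option<Cmd>,
    /// Legacy positional input, used when no subcommand is given
    input: Option<String>,
}

#[derive(Subcommand, Debug)]
enum Cmd {
    /// Pull a site to a local directory
    Pull {
        input: String,
        #[arg(short, long)]
        site: String,
    },
}

fn main() {
    let cli = Cli::parse();
    match (cli.command, cli.input) {
        (Some(cmd), _) => println!("subcommand: {:?}", cmd),
        (None, Some(input)) => println!("legacy deploy of {}", input),
        (None, None) => println!("nothing to do; would print help"),
    }
}
```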
CowStr\u003c'static\u003e,\n+\n+ /// Site name (record key)\n+ #[arg(short, long)]\n+ site: String,\n+\n+ /// Output directory for the site files\n+ #[arg(short, long, default_value = \".\")]\n+ output: PathBuf,\n+\n+ /// Port to serve on\n+ #[arg(short, long, default_value = \"8080\")]\n+ port: u16,\n+ },\n+}\n+\n #[tokio::main]\n async fn main() -\u003e miette::Result\u003c()\u003e {\n let args = Args::parse();\n \n- // Dispatch to appropriate authentication method\n- if let Some(password) = args.password {\n- run_with_app_password(args.input, password, args.path, args.site).await\n- } else {\n- run_with_oauth(args.input, args.store, args.path, args.site).await\n+ match args.command {\n+ Some(Commands::Deploy { input, path, site, store, password }) =\u003e {\n+ // Dispatch to appropriate authentication method\n+ if let Some(password) = password {\n+ run_with_app_password(input, password, path, site).await\n+ } else {\n+ run_with_oauth(input, store, path, site).await\n+ }\n+ }\n+ Some(Commands::Pull { input, site, output }) =\u003e {\n+ pull::pull_site(input, CowStr::from(site), output).await\n+ }\n+ Some(Commands::Serve { input, site, output, port }) =\u003e {\n+ serve::serve_site(input, CowStr::from(site), output, port).await\n+ }\n+ None =\u003e {\n+ // Legacy mode: if input is provided, assume deploy command\n+ if let Some(input) = args.input {\n+ let path = args.path.unwrap_or_else(|| PathBuf::from(\".\"));\n+ let store = args.store.unwrap_or_else(|| \"/tmp/wisp-oauth-session.json\".to_string());\n+ \n+ // Dispatch to appropriate authentication method\n+ if let Some(password) = args.password {\n+ run_with_app_password(input, password, path, args.site).await\n+ } else {\n+ run_with_oauth(input, store, path, args.site).await\n+ }\n+ } else {\n+ // No command and no input, show help\n+ use clap::CommandFactory;\n+ Args::command().print_help().into_diagnostic()?;\n+ Ok(())\n+ }\n+ }\n }\n }\n \ndiff --git a/cli/src/metadata.rs b/cli/src/metadata.rs\nnew file mode 100644\nindex 0000000..843831b\n--- /dev/null\n+++ b/cli/src/metadata.rs\n@@ -0,0 +1,46 @@\n+use serde::{Deserialize, Serialize};\n+use std::collections::HashMap;\n+use std::path::Path;\n+use miette::IntoDiagnostic;\n+\n+/// Metadata tracking file CIDs for incremental updates\n+#[derive(Debug, Clone, Serialize, Deserialize)]\n+pub struct SiteMetadata {\n+ /// Record CID from the PDS\n+ pub record_cid: String,\n+ /// Map of file paths to their blob CIDs\n+ pub file_cids: HashMap\u003cString, String\u003e,\n+ /// Timestamp when the site was last synced\n+ pub last_sync: i64,\n+}\n+\n+impl SiteMetadata {\n+ pub fn new(record_cid: String, file_cids: HashMap\u003cString, String\u003e) -\u003e Self {\n+ Self {\n+ record_cid,\n+ file_cids,\n+ last_sync: chrono::Utc::now().timestamp(),\n+ }\n+ }\n+\n+ /// Load metadata from a directory\n+ pub fn load(dir: \u0026Path) -\u003e miette::Result\u003cOption\u003cSelf\u003e\u003e {\n+ let metadata_path = dir.join(\".wisp-metadata.json\");\n+ if !metadata_path.exists() {\n+ return Ok(None);\n+ }\n+\n+ let contents = std::fs::read_to_string(\u0026metadata_path).into_diagnostic()?;\n+ let metadata: SiteMetadata = serde_json::from_str(\u0026contents).into_diagnostic()?;\n+ Ok(Some(metadata))\n+ }\n+\n+ /// Save metadata to a directory\n+ pub fn save(\u0026self, dir: \u0026Path) -\u003e miette::Result\u003c()\u003e {\n+ let metadata_path = dir.join(\".wisp-metadata.json\");\n+ let contents = serde_json::to_string_pretty(self).into_diagnostic()?;\n+ std::fs::write(\u0026metadata_path, 
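`.wisp-metadata.json` is what makes `pull` incremental: it records the record CID plus per-file blob CIDs from the last sync. A minimal sketch of the early-exit it enables, reusing the `SiteMetadata` type defined above (`up_to_date` and `remote_cid` are illustrative names):

```rust
use std::path::Path;

// Sketch: pull_site can skip all downloads when the stored record CID
// matches the one just fetched; missing metadata means "stale".
fn up_to_date(dir: &Path, remote_cid: &str) -> bool {
    matches!(
        SiteMetadata::load(dir),
        Ok(Some(m)) if m.record_cid == remote_cid
    )
}
```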
contents).into_diagnostic()?;\n+ Ok(())\n+ }\n+}\n+\ndiff --git a/cli/src/pull.rs b/cli/src/pull.rs\nnew file mode 100644\nindex 0000000..01cfaf5\n--- /dev/null\n+++ b/cli/src/pull.rs\n@@ -0,0 +1,305 @@\n+use crate::blob_map;\n+use crate::download;\n+use crate::metadata::SiteMetadata;\n+use crate::place_wisp::fs::*;\n+use jacquard::CowStr;\n+use jacquard::prelude::IdentityResolver;\n+use jacquard_common::types::string::Did;\n+use jacquard_common::xrpc::XrpcExt;\n+use jacquard_identity::PublicResolver;\n+use miette::IntoDiagnostic;\n+use std::collections::HashMap;\n+use std::path::{Path, PathBuf};\n+use url::Url;\n+\n+/// Pull a site from the PDS to a local directory\n+pub async fn pull_site(\n+ input: CowStr\u003c'static\u003e,\n+ rkey: CowStr\u003c'static\u003e,\n+ output_dir: PathBuf,\n+) -\u003e miette::Result\u003c()\u003e {\n+ println!(\"Pulling site {} from {}...\", rkey, input);\n+\n+ // Resolve handle to DID if needed\n+ let resolver = PublicResolver::default();\n+ let did = if input.starts_with(\"did:\") {\n+ Did::new(\u0026input).into_diagnostic()?\n+ } else {\n+ // It's a handle, resolve it\n+ let handle = jacquard_common::types::string::Handle::new(\u0026input).into_diagnostic()?;\n+ resolver.resolve_handle(\u0026handle).await.into_diagnostic()?\n+ };\n+\n+ // Resolve PDS endpoint for the DID\n+ let pds_url = resolver.pds_for_did(\u0026did).await.into_diagnostic()?;\n+ println!(\"Resolved PDS: {}\", pds_url);\n+\n+ // Fetch the place.wisp.fs record\n+\n+ println!(\"Fetching record from PDS...\");\n+ let client = reqwest::Client::new();\n+ \n+ // Use com.atproto.repo.getRecord\n+ use jacquard::api::com_atproto::repo::get_record::GetRecord;\n+ use jacquard_common::types::string::Rkey as RkeyType;\n+ let rkey_parsed = RkeyType::new(\u0026rkey).into_diagnostic()?;\n+ \n+ use jacquard_common::types::ident::AtIdentifier;\n+ use jacquard_common::types::string::RecordKey;\n+ let request = GetRecord::new()\n+ .repo(AtIdentifier::Did(did.clone()))\n+ .collection(CowStr::from(\"place.wisp.fs\"))\n+ .rkey(RecordKey::from(rkey_parsed))\n+ .build();\n+\n+ let response = client\n+ .xrpc(pds_url.clone())\n+ .send(\u0026request)\n+ .await\n+ .into_diagnostic()?;\n+\n+ let record_output = response.into_output().into_diagnostic()?;\n+ let record_cid = record_output.cid.as_ref().map(|c| c.to_string()).unwrap_or_default();\n+\n+ // Parse the record value as Fs\n+ use jacquard_common::types::value::from_data;\n+ let fs_record: Fs = from_data(\u0026record_output.value).into_diagnostic()?;\n+\n+ let file_count = fs_record.file_count.map(|c| c.to_string()).unwrap_or_else(|| \"?\".to_string());\n+ println!(\"Found site '{}' with {} files\", fs_record.site, file_count);\n+\n+ // Load existing metadata for incremental updates\n+ let existing_metadata = SiteMetadata::load(\u0026output_dir)?;\n+ let existing_file_cids = existing_metadata\n+ .as_ref()\n+ .map(|m| m.file_cids.clone())\n+ .unwrap_or_default();\n+\n+ // Extract blob map from the new manifest\n+ let new_blob_map = blob_map::extract_blob_map(\u0026fs_record.root);\n+ let new_file_cids: HashMap\u003cString, String\u003e = new_blob_map\n+ .iter()\n+ .map(|(path, (_blob_ref, cid))| (path.clone(), cid.clone()))\n+ .collect();\n+\n+ // Clean up any leftover temp directories from previous failed attempts\n+ let parent = output_dir.parent().unwrap_or_else(|| std::path::Path::new(\".\"));\n+ let output_name = output_dir.file_name().unwrap_or_else(|| std::ffi::OsStr::new(\"site\")).to_string_lossy();\n+ let temp_prefix = format!(\".tmp-{}-\", 
output_name);\n+ \n+ if let Ok(entries) = parent.read_dir() {\n+ for entry in entries.flatten() {\n+ let name = entry.file_name();\n+ if name.to_string_lossy().starts_with(\u0026temp_prefix) {\n+ let _ = std::fs::remove_dir_all(entry.path());\n+ }\n+ }\n+ }\n+\n+ // Check if we need to update (but only if output directory actually exists with files)\n+ if let Some(metadata) = \u0026existing_metadata {\n+ if metadata.record_cid == record_cid {\n+ // Verify that the output directory actually exists and has content\n+ let has_content = output_dir.exists() \u0026\u0026 \n+ output_dir.read_dir()\n+ .map(|mut entries| entries.any(|e| {\n+ if let Ok(entry) = e {\n+ !entry.file_name().to_string_lossy().starts_with(\".wisp-metadata\")\n+ } else {\n+ false\n+ }\n+ }))\n+ .unwrap_or(false);\n+ \n+ if has_content {\n+ println!(\"Site is already up to date!\");\n+ return Ok(());\n+ }\n+ }\n+ }\n+\n+ // Create temporary directory for atomic update\n+ // Place temp dir in parent directory to avoid issues with non-existent output_dir\n+ let parent = output_dir.parent().unwrap_or_else(|| std::path::Path::new(\".\"));\n+ let temp_dir_name = format!(\n+ \".tmp-{}-{}\",\n+ output_dir.file_name().unwrap_or_else(|| std::ffi::OsStr::new(\"site\")).to_string_lossy(),\n+ chrono::Utc::now().timestamp()\n+ );\n+ let temp_dir = parent.join(temp_dir_name);\n+ std::fs::create_dir_all(\u0026temp_dir).into_diagnostic()?;\n+\n+ println!(\"Downloading files...\");\n+ let mut downloaded = 0;\n+ let mut reused = 0;\n+\n+ // Download files recursively\n+ let download_result = download_directory(\n+ \u0026fs_record.root,\n+ \u0026temp_dir,\n+ \u0026pds_url,\n+ did.as_str(),\n+ \u0026new_blob_map,\n+ \u0026existing_file_cids,\n+ \u0026output_dir,\n+ String::new(),\n+ \u0026mut downloaded,\n+ \u0026mut reused,\n+ )\n+ .await;\n+\n+ // If download failed, clean up temp directory\n+ if let Err(e) = download_result {\n+ let _ = std::fs::remove_dir_all(\u0026temp_dir);\n+ return Err(e);\n+ }\n+\n+ println!(\n+ \"Downloaded {} files, reused {} files\",\n+ downloaded, reused\n+ );\n+\n+ // Save metadata\n+ let metadata = SiteMetadata::new(record_cid, new_file_cids);\n+ metadata.save(\u0026temp_dir)?;\n+\n+ // Move files from temp to output directory\n+ let output_abs = std::fs::canonicalize(\u0026output_dir).unwrap_or_else(|_| output_dir.clone());\n+ let current_dir = std::env::current_dir().into_diagnostic()?;\n+ \n+ // Special handling for pulling to current directory\n+ if output_abs == current_dir {\n+ // Move files from temp to current directory\n+ for entry in std::fs::read_dir(\u0026temp_dir).into_diagnostic()? 
{\n+ let entry = entry.into_diagnostic()?;\n+ let dest = current_dir.join(entry.file_name());\n+ \n+ // Remove existing file/dir if it exists\n+ if dest.exists() {\n+ if dest.is_dir() {\n+ std::fs::remove_dir_all(\u0026dest).into_diagnostic()?;\n+ } else {\n+ std::fs::remove_file(\u0026dest).into_diagnostic()?;\n+ }\n+ }\n+ \n+ // Move from temp to current dir\n+ std::fs::rename(entry.path(), dest).into_diagnostic()?;\n+ }\n+ \n+ // Clean up temp directory\n+ std::fs::remove_dir_all(\u0026temp_dir).into_diagnostic()?;\n+ } else {\n+ // If output directory exists and has content, remove it first\n+ if output_dir.exists() {\n+ std::fs::remove_dir_all(\u0026output_dir).into_diagnostic()?;\n+ }\n+ \n+ // Ensure parent directory exists\n+ if let Some(parent) = output_dir.parent() {\n+ if !parent.as_os_str().is_empty() \u0026\u0026 !parent.exists() {\n+ std::fs::create_dir_all(parent).into_diagnostic()?;\n+ }\n+ }\n+ \n+ // Rename temp to final location\n+ match std::fs::rename(\u0026temp_dir, \u0026output_dir) {\n+ Ok(_) =\u003e {},\n+ Err(e) =\u003e {\n+ // Clean up temp directory on failure\n+ let _ = std::fs::remove_dir_all(\u0026temp_dir);\n+ return Err(miette::miette!(\"Failed to move temp directory: {}\", e));\n+ }\n+ }\n+ }\n+\n+ println!(\"✓ Site pulled successfully to {}\", output_dir.display());\n+\n+ Ok(())\n+}\n+\n+/// Recursively download a directory\n+fn download_directory\u003c'a\u003e(\n+ dir: \u0026'a Directory\u003c'_\u003e,\n+ output_dir: \u0026'a Path,\n+ pds_url: \u0026'a Url,\n+ did: \u0026'a str,\n+ new_blob_map: \u0026'a HashMap\u003cString, (jacquard_common::types::blob::BlobRef\u003c'static\u003e, String)\u003e,\n+ existing_file_cids: \u0026'a HashMap\u003cString, String\u003e,\n+ existing_output_dir: \u0026'a Path,\n+ path_prefix: String,\n+ downloaded: \u0026'a mut usize,\n+ reused: \u0026'a mut usize,\n+) -\u003e std::pin::Pin\u003cBox\u003cdyn std::future::Future\u003cOutput = miette::Result\u003c()\u003e\u003e + Send + 'a\u003e\u003e {\n+ Box::pin(async move {\n+ for entry in \u0026dir.entries {\n+ let entry_name = entry.name.as_str();\n+ let current_path = if path_prefix.is_empty() {\n+ entry_name.to_string()\n+ } else {\n+ format!(\"{}/{}\", path_prefix, entry_name)\n+ };\n+\n+ match \u0026entry.node {\n+ EntryNode::File(file) =\u003e {\n+ let output_path = output_dir.join(entry_name);\n+\n+ // Check if file CID matches existing\n+ if let Some((_blob_ref, new_cid)) = new_blob_map.get(\u0026current_path) {\n+ if let Some(existing_cid) = existing_file_cids.get(\u0026current_path) {\n+ if existing_cid == new_cid {\n+ // File unchanged, copy from existing directory\n+ let existing_path = existing_output_dir.join(\u0026current_path);\n+ if existing_path.exists() {\n+ std::fs::copy(\u0026existing_path, \u0026output_path).into_diagnostic()?;\n+ *reused += 1;\n+ println!(\" ✓ Reused {}\", current_path);\n+ continue;\n+ }\n+ }\n+ }\n+ }\n+\n+ // File is new or changed, download it\n+ println!(\" ↓ Downloading {}\", current_path);\n+ let data = download::download_and_decompress_blob(\n+ pds_url,\n+ \u0026file.blob,\n+ did,\n+ file.base64.unwrap_or(false),\n+ file.encoding.as_ref().map(|e| e.as_str() == \"gzip\").unwrap_or(false),\n+ )\n+ .await?;\n+\n+ std::fs::write(\u0026output_path, data).into_diagnostic()?;\n+ *downloaded += 1;\n+ }\n+ EntryNode::Directory(subdir) =\u003e {\n+ let subdir_path = output_dir.join(entry_name);\n+ std::fs::create_dir_all(\u0026subdir_path).into_diagnostic()?;\n+\n+ download_directory(\n+ subdir,\n+ \u0026subdir_path,\n+ pds_url,\n+ did,\n+ 
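The move logic above is a temp-dir swap: the whole site is materialized in a sibling `.tmp-*` directory and only renamed into place once every file has landed, so a failed or interrupted pull never leaves a half-written site. The core of the pattern (illustrative names; the remove-then-rename gap is the same trade-off the patch accepts):

```rust
use std::{fs, io, path::Path};

fn swap_into_place(temp: &Path, dest: &Path) -> io::Result<()> {
    if dest.exists() {
        fs::remove_dir_all(dest)?;
    }
    // rename() is atomic on the same filesystem, which is why the temp
    // dir is created next to the destination rather than in /tmp.
    fs::rename(temp, dest)
}
```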
new_blob_map,\n+ existing_file_cids,\n+ existing_output_dir,\n+ current_path,\n+ downloaded,\n+ reused,\n+ )\n+ .await?;\n+ }\n+ EntryNode::Unknown(_) =\u003e {\n+ // Skip unknown node types\n+ println!(\" ⚠ Skipping unknown node type for {}\", current_path);\n+ }\n+ }\n+ }\n+\n+ Ok(())\n+ })\n+}\n+\ndiff --git a/cli/src/serve.rs b/cli/src/serve.rs\nnew file mode 100644\nindex 0000000..240bf93\n--- /dev/null\n+++ b/cli/src/serve.rs\n@@ -0,0 +1,202 @@\n+use crate::pull::pull_site;\n+use axum::Router;\n+use jacquard::CowStr;\n+use jacquard_common::jetstream::{CommitOperation, JetstreamMessage, JetstreamParams};\n+use jacquard_common::types::string::Did;\n+use jacquard_common::xrpc::{SubscriptionClient, TungsteniteSubscriptionClient};\n+use miette::IntoDiagnostic;\n+use n0_future::StreamExt;\n+use std::path::PathBuf;\n+use std::sync::Arc;\n+use tokio::sync::RwLock;\n+use tower_http::compression::CompressionLayer;\n+use tower_http::services::ServeDir;\n+use url::Url;\n+\n+/// Shared state for the server\n+#[derive(Clone)]\n+struct ServerState {\n+ did: CowStr\u003c'static\u003e,\n+ rkey: CowStr\u003c'static\u003e,\n+ output_dir: PathBuf,\n+ last_cid: Arc\u003cRwLock\u003cOption\u003cString\u003e\u003e\u003e,\n+}\n+\n+/// Serve a site locally with real-time firehose updates\n+pub async fn serve_site(\n+ input: CowStr\u003c'static\u003e,\n+ rkey: CowStr\u003c'static\u003e,\n+ output_dir: PathBuf,\n+ port: u16,\n+) -\u003e miette::Result\u003c()\u003e {\n+ println!(\"Serving site {} from {} on port {}...\", rkey, input, port);\n+\n+ // Resolve handle to DID if needed\n+ use jacquard_identity::PublicResolver;\n+ use jacquard::prelude::IdentityResolver;\n+ \n+ let resolver = PublicResolver::default();\n+ let did = if input.starts_with(\"did:\") {\n+ Did::new(\u0026input).into_diagnostic()?\n+ } else {\n+ // It's a handle, resolve it\n+ let handle = jacquard_common::types::string::Handle::new(\u0026input).into_diagnostic()?;\n+ resolver.resolve_handle(\u0026handle).await.into_diagnostic()?\n+ };\n+ \n+ println!(\"Resolved to DID: {}\", did.as_str());\n+\n+ // Create output directory if it doesn't exist\n+ std::fs::create_dir_all(\u0026output_dir).into_diagnostic()?;\n+\n+ // Initial pull of the site\n+ println!(\"Performing initial pull...\");\n+ let did_str = CowStr::from(did.as_str().to_string());\n+ pull_site(did_str.clone(), rkey.clone(), output_dir.clone()).await?;\n+\n+ // Create shared state\n+ let state = ServerState {\n+ did: did_str.clone(),\n+ rkey: rkey.clone(),\n+ output_dir: output_dir.clone(),\n+ last_cid: Arc::new(RwLock::new(None)),\n+ };\n+\n+ // Start firehose listener in background\n+ let firehose_state = state.clone();\n+ tokio::spawn(async move {\n+ if let Err(e) = watch_firehose(firehose_state).await {\n+ eprintln!(\"Firehose error: {}\", e);\n+ }\n+ });\n+\n+ // Create HTTP server with gzip compression\n+ let app = Router::new()\n+ .fallback_service(\n+ ServeDir::new(\u0026output_dir)\n+ .precompressed_gzip()\n+ )\n+ .layer(CompressionLayer::new())\n+ .with_state(state);\n+\n+ let addr = format!(\"0.0.0.0:{}\", port);\n+ let listener = tokio::net::TcpListener::bind(\u0026addr)\n+ .await\n+ .into_diagnostic()?;\n+\n+ println!(\"\\n✓ Server running at http://localhost:{}\", port);\n+ println!(\" Watching for updates on the firehose...\\n\");\n+\n+ axum::serve(listener, app).await.into_diagnostic()?;\n+\n+ Ok(())\n+}\n+\n+/// Watch the firehose for updates to the specific site\n+fn watch_firehose(state: ServerState) -\u003e std::pin::Pin\u003cBox\u003cdyn 
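The local server is plain axum + tower-http: `ServeDir` with `precompressed_gzip()` serves a sibling `foo.gz` when the client accepts gzip, and `CompressionLayer` compresses everything else on the fly. A self-contained sketch of the same wiring (assumed crates: axum 0.7, tower-http 0.5 with the `fs` and `compression-gzip` features, tokio):

```rust
use axum::Router;
use tower_http::{compression::CompressionLayer, services::ServeDir};

#[tokio::main]
async fn main() {
    let app = Router::new()
        .fallback_service(ServeDir::new("./site").precompressed_gzip())
        .layer(CompressionLayer::new());

    let listener = tokio::net::TcpListener::bind("0.0.0.0:8080").await.unwrap();
    axum::serve(listener, app).await.unwrap();
}
```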
std::future::Future\u003cOutput = miette::Result\u003c()\u003e\u003e + Send\u003e\u003e {\n+ Box::pin(async move {\n+ let jetstream_url = Url::parse(\"wss://jetstream1.us-east.fire.hose.cam\")\n+ .into_diagnostic()?;\n+\n+ println!(\"[Firehose] Connecting to Jetstream...\");\n+\n+ // Create subscription client\n+ let client = TungsteniteSubscriptionClient::from_base_uri(jetstream_url);\n+\n+ // Subscribe with no filters (we'll filter manually)\n+ // Jetstream doesn't support filtering by collection in the params builder\n+ let params = JetstreamParams::new().build();\n+\n+ let stream = client.subscribe(\u0026params).await.into_diagnostic()?;\n+ println!(\"[Firehose] Connected! Watching for updates...\");\n+\n+ // Convert to typed message stream\n+ let (_sink, mut messages) = stream.into_stream();\n+\n+ loop {\n+ match messages.next().await {\n+ Some(Ok(msg)) =\u003e {\n+ if let Err(e) = handle_firehose_message(\u0026state, msg).await {\n+ eprintln!(\"[Firehose] Error handling message: {}\", e);\n+ }\n+ }\n+ Some(Err(e)) =\u003e {\n+ eprintln!(\"[Firehose] Stream error: {}\", e);\n+ // Try to reconnect after a delay\n+ tokio::time::sleep(tokio::time::Duration::from_secs(5)).await;\n+ return Box::pin(watch_firehose(state)).await;\n+ }\n+ None =\u003e {\n+ println!(\"[Firehose] Stream ended, reconnecting...\");\n+ tokio::time::sleep(tokio::time::Duration::from_secs(5)).await;\n+ return Box::pin(watch_firehose(state)).await;\n+ }\n+ }\n+ }\n+ })\n+}\n+\n+/// Handle a firehose message\n+async fn handle_firehose_message(\n+ state: \u0026ServerState,\n+ msg: JetstreamMessage\u003c'_\u003e,\n+) -\u003e miette::Result\u003c()\u003e {\n+ match msg {\n+ JetstreamMessage::Commit {\n+ did,\n+ commit,\n+ ..\n+ } =\u003e {\n+ // Check if this is our site\n+ if did.as_str() == state.did.as_str()\n+ \u0026\u0026 commit.collection.as_str() == \"place.wisp.fs\"\n+ \u0026\u0026 commit.rkey.as_str() == state.rkey.as_str()\n+ {\n+ match commit.operation {\n+ CommitOperation::Create | CommitOperation::Update =\u003e {\n+ let new_cid = commit.cid.as_ref().map(|c| c.to_string());\n+ \n+ // Check if CID changed\n+ let should_update = {\n+ let last_cid = state.last_cid.read().await;\n+ new_cid != *last_cid\n+ };\n+\n+ if should_update {\n+ println!(\"\\n[Update] Detected change to site {} (CID: {:?})\", state.rkey, new_cid);\n+ println!(\"[Update] Pulling latest version...\");\n+\n+ // Pull the updated site\n+ match pull_site(\n+ state.did.clone(),\n+ state.rkey.clone(),\n+ state.output_dir.clone(),\n+ )\n+ .await\n+ {\n+ Ok(_) =\u003e {\n+ // Update last CID\n+ let mut last_cid = state.last_cid.write().await;\n+ *last_cid = new_cid;\n+ println!(\"[Update] ✓ Site updated successfully!\\n\");\n+ }\n+ Err(e) =\u003e {\n+ eprintln!(\"[Update] Failed to pull site: {}\", e);\n+ }\n+ }\n+ }\n+ }\n+ CommitOperation::Delete =\u003e {\n+ println!(\"\\n[Update] Site {} was deleted\", state.rkey);\n+ }\n+ }\n+ }\n+ }\n+ _ =\u003e {\n+ // Ignore identity and account messages\n+ }\n+ }\n+\n+ Ok(())\n+}\n+\n-- \n2.50.1 (Apple Git-155)\n\n\nFrom 436d7a062732626f17d71b91adccc8492e4d1977 Mon Sep 17 00:00:00 2001\nFrom: \"@nekomimi.pet\" \u003cmeowskulls@nekomimi.pet\u003e\nDate: Thu, 13 Nov 2025 00:32:52 -0500\nSubject: [PATCH 5/6] remove jacquard submodule\n\n---\n .gitmodules | 3 --\n cli/Cargo.lock | 136 ++++++++++++++++++++++++-------------------------\n cli/jacquard | 1 -\n 3 files changed, 68 insertions(+), 72 deletions(-)\n delete mode 100644 .gitmodules\n delete mode 160000 cli/jacquard\n\ndiff --git a/.gitmodules 
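`handle_firehose_message` gates re-pulls on a CID change: the last record CID seen lives in an `Arc<RwLock<Option<String>>>` and is only updated after `pull_site` succeeds, so a failed pull is retried on the next matching commit. The gate in isolation (illustrative function names):

```rust
use std::sync::Arc;
use tokio::sync::RwLock;

type LastCid = Arc<RwLock<Option<String>>>;

async fn should_pull(last: &LastCid, incoming: &Option<String>) -> bool {
    *last.read().await != *incoming
}

async fn record_success(last: &LastCid, incoming: Option<String>) {
    *last.write().await = incoming;
}

#[tokio::main]
async fn main() {
    let last: LastCid = Arc::new(RwLock::new(None));
    let cid = Some("bafy...v2".to_string());
    if should_pull(&last, &cid).await {
        record_success(&last, cid).await;
    }
    assert!(!should_pull(&last, &Some("bafy...v2".to_string())).await);
}
```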
b/.gitmodules\ndeleted file mode 100644\nindex 784460f..0000000\n--- a/.gitmodules\n+++ /dev/null\n@@ -1,3 +0,0 @@\n-[submodule \"cli/jacquard\"]\n-\tpath = cli/jacquard\n-\turl = https://tangled.org/@nonbinary.computer/jacquard\ndiff --git a/cli/Cargo.lock b/cli/Cargo.lock\nindex 5fa5a99..8c1748e 100644\n--- a/cli/Cargo.lock\n+++ b/cli/Cargo.lock\n@@ -139,9 +139,9 @@ checksum = \"d92bec98840b8f03a5ff5413de5293bfcd8bf96467cf5452609f939ec6f5de16\"\n \n [[package]]\n name = \"async-compression\"\n-version = \"0.4.32\"\n+version = \"0.4.33\"\n source = \"registry+https://github.com/rust-lang/crates.io-index\"\n-checksum = \"5a89bce6054c720275ac2432fbba080a66a2106a44a1b804553930ca6909f4e0\"\n+checksum = \"93c1f86859c1af3d514fa19e8323147ff10ea98684e6c7b307912509f50e67b2\"\n dependencies = [\n \"compression-codecs\",\n \"compression-core\",\n@@ -158,7 +158,7 @@ checksum = \"9035ad2d096bed7955a320ee7e2230574d28fd3c3a0f186cbea1ff3c7eed5dbb\"\n dependencies = [\n \"proc-macro2\",\n \"quote\",\n- \"syn 2.0.108\",\n+ \"syn 2.0.110\",\n ]\n \n [[package]]\n@@ -329,7 +329,7 @@ dependencies = [\n \"proc-macro2\",\n \"quote\",\n \"rustversion\",\n- \"syn 2.0.108\",\n+ \"syn 2.0.110\",\n ]\n \n [[package]]\n@@ -428,9 +428,9 @@ dependencies = [\n \n [[package]]\n name = \"cc\"\n-version = \"1.2.44\"\n+version = \"1.2.45\"\n source = \"registry+https://github.com/rust-lang/crates.io-index\"\n-checksum = \"37521ac7aabe3d13122dc382493e20c9416f299d2ccd5b3a5340a2570cdeb0f3\"\n+checksum = \"35900b6c8d709fb1d854671ae27aeaa9eec2f8b01b364e1619a40da3e6fe2afe\"\n dependencies = [\n \"find-msvc-tools\",\n \"shlex\",\n@@ -555,7 +555,7 @@ dependencies = [\n \"heck 0.5.0\",\n \"proc-macro2\",\n \"quote\",\n- \"syn 2.0.108\",\n+ \"syn 2.0.110\",\n ]\n \n [[package]]\n@@ -582,9 +582,9 @@ dependencies = [\n \n [[package]]\n name = \"compression-codecs\"\n-version = \"0.4.31\"\n+version = \"0.4.32\"\n source = \"registry+https://github.com/rust-lang/crates.io-index\"\n-checksum = \"ef8a506ec4b81c460798f572caead636d57d3d7e940f998160f52bd254bf2d23\"\n+checksum = \"680dc087785c5230f8e8843e2e57ac7c1c90488b6a91b88caa265410568f441b\"\n dependencies = [\n \"compression-core\",\n \"flate2\",\n@@ -593,9 +593,9 @@ dependencies = [\n \n [[package]]\n name = \"compression-core\"\n-version = \"0.4.29\"\n+version = \"0.4.30\"\n source = \"registry+https://github.com/rust-lang/crates.io-index\"\n-checksum = \"e47641d3deaf41fb1538ac1f54735925e275eaf3bf4d55c81b137fba797e5cbb\"\n+checksum = \"3a9b614a5787ef0c8802a55766480563cb3a93b435898c422ed2a359cf811582\"\n \n [[package]]\n name = \"const-oid\"\n@@ -736,7 +736,7 @@ dependencies = [\n \"proc-macro2\",\n \"quote\",\n \"strsim\",\n- \"syn 2.0.108\",\n+ \"syn 2.0.110\",\n ]\n \n [[package]]\n@@ -747,7 +747,7 @@ checksum = \"d38308df82d1080de0afee5d069fa14b0326a88c14f15c5ccda35b4a6c414c81\"\n dependencies = [\n \"darling_core\",\n \"quote\",\n- \"syn 2.0.108\",\n+ \"syn 2.0.110\",\n ]\n \n [[package]]\n@@ -787,7 +787,7 @@ source = \"registry+https://github.com/rust-lang/crates.io-index\"\n checksum = \"8d162beedaa69905488a8da94f5ac3edb4dd4788b732fadb7bd120b2625c1976\"\n dependencies = [\n \"data-encoding\",\n- \"syn 2.0.108\",\n+ \"syn 2.0.110\",\n ]\n \n [[package]]\n@@ -838,7 +838,7 @@ checksum = \"cb7330aeadfbe296029522e6c40f315320aba36fc43a5b3632f3795348f3bd22\"\n dependencies = [\n \"proc-macro2\",\n \"quote\",\n- \"syn 2.0.108\",\n+ \"syn 2.0.110\",\n \"unicode-xid\",\n ]\n \n@@ -889,7 +889,7 @@ checksum = \"97369cbbc041bc366949bc74d34658d6cda5621039731c6310521892a3a20ae0\"\n dependencies 
= [\n \"proc-macro2\",\n \"quote\",\n- \"syn 2.0.108\",\n+ \"syn 2.0.110\",\n ]\n \n [[package]]\n@@ -950,7 +950,7 @@ dependencies = [\n \"heck 0.5.0\",\n \"proc-macro2\",\n \"quote\",\n- \"syn 2.0.108\",\n+ \"syn 2.0.110\",\n ]\n \n [[package]]\n@@ -1120,7 +1120,7 @@ checksum = \"162ee34ebcb7c64a8abebc059ce0fee27c2262618d7b60ed8faf72fef13c3650\"\n dependencies = [\n \"proc-macro2\",\n \"quote\",\n- \"syn 2.0.108\",\n+ \"syn 2.0.110\",\n ]\n \n [[package]]\n@@ -1374,7 +1374,7 @@ dependencies = [\n \"markup5ever\",\n \"proc-macro2\",\n \"quote\",\n- \"syn 2.0.108\",\n+ \"syn 2.0.110\",\n ]\n \n [[package]]\n@@ -1431,9 +1431,9 @@ checksum = \"df3b46402a9d5adb4c86a0cf463f42e19994e3ee891101b1841f30a545cb49a9\"\n \n [[package]]\n name = \"hyper\"\n-version = \"1.7.0\"\n+version = \"1.8.0\"\n source = \"registry+https://github.com/rust-lang/crates.io-index\"\n-checksum = \"eb3aa54a13a0dfe7fbe3a59e0c76093041720fdc77b110cc0fc260fafb4dc51e\"\n+checksum = \"1744436df46f0bde35af3eda22aeaba453aada65d8f1c171cd8a5f59030bd69f\"\n dependencies = [\n \"atomic-waker\",\n \"bytes\",\n@@ -1699,9 +1699,9 @@ checksum = \"469fb0b9cefa57e3ef31275ee7cacb78f2fdca44e4765491884a2b119d4eb130\"\n \n [[package]]\n name = \"iri-string\"\n-version = \"0.7.8\"\n+version = \"0.7.9\"\n source = \"registry+https://github.com/rust-lang/crates.io-index\"\n-checksum = \"dbc5ebe9c3a1a7a5127f920a418f7585e9e758e911d0466ed004f393b0e380b2\"\n+checksum = \"4f867b9d1d896b67beb18518eda36fdb77a32ea590de864f1325b294a6d14397\"\n dependencies = [\n \"memchr\",\n \"serde\",\n@@ -1728,7 +1728,7 @@ checksum = \"4a5f13b858c8d314ee3e8f639011f7ccefe71f97f96e50151fb991f267928e2c\"\n [[package]]\n name = \"jacquard\"\n version = \"0.9.0\"\n-source = \"git+https://tangled.org/@nonbinary.computer/jacquard#b5cc9b35e38e24e1890ae55e700dcfad0d6d433a\"\n+source = \"git+https://tangled.org/@nonbinary.computer/jacquard#5c79bb76de544cbd4fa8d5d8b01ba6e828f8ba65\"\n dependencies = [\n \"bytes\",\n \"getrandom 0.2.16\",\n@@ -1756,7 +1756,7 @@ dependencies = [\n [[package]]\n name = \"jacquard-api\"\n version = \"0.9.0\"\n-source = \"git+https://tangled.org/@nonbinary.computer/jacquard#b5cc9b35e38e24e1890ae55e700dcfad0d6d433a\"\n+source = \"git+https://tangled.org/@nonbinary.computer/jacquard#5c79bb76de544cbd4fa8d5d8b01ba6e828f8ba65\"\n dependencies = [\n \"bon\",\n \"bytes\",\n@@ -1774,7 +1774,7 @@ dependencies = [\n [[package]]\n name = \"jacquard-common\"\n version = \"0.9.0\"\n-source = \"git+https://tangled.org/@nonbinary.computer/jacquard#b5cc9b35e38e24e1890ae55e700dcfad0d6d433a\"\n+source = \"git+https://tangled.org/@nonbinary.computer/jacquard#5c79bb76de544cbd4fa8d5d8b01ba6e828f8ba65\"\n dependencies = [\n \"base64 0.22.1\",\n \"bon\",\n@@ -1815,19 +1815,19 @@ dependencies = [\n [[package]]\n name = \"jacquard-derive\"\n version = \"0.9.0\"\n-source = \"git+https://tangled.org/@nonbinary.computer/jacquard#b5cc9b35e38e24e1890ae55e700dcfad0d6d433a\"\n+source = \"git+https://tangled.org/@nonbinary.computer/jacquard#5c79bb76de544cbd4fa8d5d8b01ba6e828f8ba65\"\n dependencies = [\n \"heck 0.5.0\",\n \"jacquard-lexicon\",\n \"proc-macro2\",\n \"quote\",\n- \"syn 2.0.108\",\n+ \"syn 2.0.110\",\n ]\n \n [[package]]\n name = \"jacquard-identity\"\n version = \"0.9.1\"\n-source = \"git+https://tangled.org/@nonbinary.computer/jacquard#b5cc9b35e38e24e1890ae55e700dcfad0d6d433a\"\n+source = \"git+https://tangled.org/@nonbinary.computer/jacquard#5c79bb76de544cbd4fa8d5d8b01ba6e828f8ba65\"\n dependencies = [\n \"bon\",\n \"bytes\",\n@@ -1853,7 +1853,7 @@ dependencies = [\n 
[[package]]\n name = \"jacquard-lexicon\"\n version = \"0.9.1\"\n-source = \"git+https://tangled.org/@nonbinary.computer/jacquard#b5cc9b35e38e24e1890ae55e700dcfad0d6d433a\"\n+source = \"git+https://tangled.org/@nonbinary.computer/jacquard#5c79bb76de544cbd4fa8d5d8b01ba6e828f8ba65\"\n dependencies = [\n \"cid\",\n \"dashmap\",\n@@ -1871,7 +1871,7 @@ dependencies = [\n \"serde_repr\",\n \"serde_with\",\n \"sha2\",\n- \"syn 2.0.108\",\n+ \"syn 2.0.110\",\n \"thiserror 2.0.17\",\n \"unicode-segmentation\",\n ]\n@@ -1879,7 +1879,7 @@ dependencies = [\n [[package]]\n name = \"jacquard-oauth\"\n version = \"0.9.0\"\n-source = \"git+https://tangled.org/@nonbinary.computer/jacquard#b5cc9b35e38e24e1890ae55e700dcfad0d6d433a\"\n+source = \"git+https://tangled.org/@nonbinary.computer/jacquard#5c79bb76de544cbd4fa8d5d8b01ba6e828f8ba65\"\n dependencies = [\n \"base64 0.22.1\",\n \"bytes\",\n@@ -2183,7 +2183,7 @@ checksum = \"db5b29714e950dbb20d5e6f74f9dcec4edbcc1067bb7f8ed198c097b8c1a818b\"\n dependencies = [\n \"proc-macro2\",\n \"quote\",\n- \"syn 2.0.108\",\n+ \"syn 2.0.110\",\n ]\n \n [[package]]\n@@ -2338,9 +2338,9 @@ dependencies = [\n \n [[package]]\n name = \"num-bigint-dig\"\n-version = \"0.8.5\"\n+version = \"0.8.6\"\n source = \"registry+https://github.com/rust-lang/crates.io-index\"\n-checksum = \"82c79c15c05d4bf82b6f5ef163104cc81a760d8e874d38ac50ab67c8877b647b\"\n+checksum = \"e661dda6640fad38e827a6d4a310ff4763082116fe217f279885c97f511bb0b7\"\n dependencies = [\n \"lazy_static\",\n \"libm\",\n@@ -2486,7 +2486,7 @@ dependencies = [\n \"proc-macro2\",\n \"proc-macro2-diagnostics\",\n \"quote\",\n- \"syn 2.0.108\",\n+ \"syn 2.0.110\",\n ]\n \n [[package]]\n@@ -2616,7 +2616,7 @@ checksum = \"6e918e4ff8c4549eb882f14b3a4bc8c8bc93de829416eacf579f1207a8fbf861\"\n dependencies = [\n \"proc-macro2\",\n \"quote\",\n- \"syn 2.0.108\",\n+ \"syn 2.0.110\",\n ]\n \n [[package]]\n@@ -2689,7 +2689,7 @@ source = \"registry+https://github.com/rust-lang/crates.io-index\"\n checksum = \"479ca8adacdd7ce8f1fb39ce9ecccbfe93a3f1344b3d0d97f20bc0196208f62b\"\n dependencies = [\n \"proc-macro2\",\n- \"syn 2.0.108\",\n+ \"syn 2.0.110\",\n ]\n \n [[package]]\n@@ -2742,7 +2742,7 @@ checksum = \"af066a9c399a26e020ada66a034357a868728e72cd426f3adcd35f80d88d88c8\"\n dependencies = [\n \"proc-macro2\",\n \"quote\",\n- \"syn 2.0.108\",\n+ \"syn 2.0.110\",\n \"version_check\",\n \"yansi\",\n ]\n@@ -2810,9 +2810,9 @@ dependencies = [\n \n [[package]]\n name = \"quote\"\n-version = \"1.0.41\"\n+version = \"1.0.42\"\n source = \"registry+https://github.com/rust-lang/crates.io-index\"\n-checksum = \"ce25767e7b499d1b604768e7cde645d14cc8584231ea6b295e9c9eb22c02e1d1\"\n+checksum = \"a338cc41d27e6cc6dce6cefc13a0729dfbb81c262b1f519331575dd80ef3067f\"\n dependencies = [\n \"proc-macro2\",\n ]\n@@ -2925,7 +2925,7 @@ checksum = \"b7186006dcb21920990093f30e3dea63b7d6e977bf1256be20c3563a5db070da\"\n dependencies = [\n \"proc-macro2\",\n \"quote\",\n- \"syn 2.0.108\",\n+ \"syn 2.0.110\",\n ]\n \n [[package]]\n@@ -3103,9 +3103,9 @@ dependencies = [\n \n [[package]]\n name = \"rustls\"\n-version = \"0.23.34\"\n+version = \"0.23.35\"\n source = \"registry+https://github.com/rust-lang/crates.io-index\"\n-checksum = \"6a9586e9ee2b4f8fab52a0048ca7334d7024eef48e2cb9407e3497bb7cab7fa7\"\n+checksum = \"533f54bc6a7d4f647e46ad909549eda97bf5afc1585190ef692b4286b198bd8f\"\n dependencies = [\n \"once_cell\",\n \"ring\",\n@@ -3198,9 +3198,9 @@ dependencies = [\n \n [[package]]\n name = \"schemars\"\n-version = \"1.0.4\"\n+version = \"1.1.0\"\n source = 
\"registry+https://github.com/rust-lang/crates.io-index\"\n-checksum = \"82d20c4491bc164fa2f6c5d44565947a52ad80b9505d8e36f8d54c27c739fcd0\"\n+checksum = \"9558e172d4e8533736ba97870c4b2cd63f84b382a3d6eb063da41b91cce17289\"\n dependencies = [\n \"dyn-clone\",\n \"ref-cast\",\n@@ -3300,7 +3300,7 @@ checksum = \"d540f220d3187173da220f885ab66608367b6574e925011a9353e4badda91d79\"\n dependencies = [\n \"proc-macro2\",\n \"quote\",\n- \"syn 2.0.108\",\n+ \"syn 2.0.110\",\n ]\n \n [[package]]\n@@ -3360,7 +3360,7 @@ checksum = \"175ee3e80ae9982737ca543e96133087cbd9a485eecc3bc4de9c1a37b47ea59c\"\n dependencies = [\n \"proc-macro2\",\n \"quote\",\n- \"syn 2.0.108\",\n+ \"syn 2.0.110\",\n ]\n \n [[package]]\n@@ -3387,7 +3387,7 @@ dependencies = [\n \"indexmap 1.9.3\",\n \"indexmap 2.12.0\",\n \"schemars 0.9.0\",\n- \"schemars 1.0.4\",\n+ \"schemars 1.1.0\",\n \"serde_core\",\n \"serde_json\",\n \"serde_with_macros\",\n@@ -3403,7 +3403,7 @@ dependencies = [\n \"darling\",\n \"proc-macro2\",\n \"quote\",\n- \"syn 2.0.108\",\n+ \"syn 2.0.110\",\n ]\n \n [[package]]\n@@ -3575,7 +3575,7 @@ dependencies = [\n \"quote\",\n \"serde\",\n \"sha2\",\n- \"syn 2.0.108\",\n+ \"syn 2.0.110\",\n \"thiserror 1.0.69\",\n ]\n \n@@ -3656,9 +3656,9 @@ dependencies = [\n \n [[package]]\n name = \"syn\"\n-version = \"2.0.108\"\n+version = \"2.0.110\"\n source = \"registry+https://github.com/rust-lang/crates.io-index\"\n-checksum = \"da58917d35242480a05c2897064da0a80589a2a0476c9a3f2fdc83b53502e917\"\n+checksum = \"a99801b5bd34ede4cf3fc688c5919368fea4e4814a4664359503e6015b280aea\"\n dependencies = [\n \"proc-macro2\",\n \"quote\",\n@@ -3682,7 +3682,7 @@ checksum = \"728a70f3dbaf5bab7f0c4b1ac8d7ae5ea60a4b5549c8a5914361c99147a709d2\"\n dependencies = [\n \"proc-macro2\",\n \"quote\",\n- \"syn 2.0.108\",\n+ \"syn 2.0.110\",\n ]\n \n [[package]]\n@@ -3782,7 +3782,7 @@ checksum = \"4fee6c4efc90059e10f81e6d42c60a18f76588c3d74cb83a0b242a2b6c7504c1\"\n dependencies = [\n \"proc-macro2\",\n \"quote\",\n- \"syn 2.0.108\",\n+ \"syn 2.0.110\",\n ]\n \n [[package]]\n@@ -3793,7 +3793,7 @@ checksum = \"3ff15c8ecd7de3849db632e14d18d2571fa09dfc5ed93479bc4485c7a517c913\"\n dependencies = [\n \"proc-macro2\",\n \"quote\",\n- \"syn 2.0.108\",\n+ \"syn 2.0.110\",\n ]\n \n [[package]]\n@@ -3909,7 +3909,7 @@ checksum = \"af407857209536a95c8e56f8231ef2c2e2aff839b22e07a1ffcbc617e9db9fa5\"\n dependencies = [\n \"proc-macro2\",\n \"quote\",\n- \"syn 2.0.108\",\n+ \"syn 2.0.110\",\n ]\n \n [[package]]\n@@ -3959,9 +3959,9 @@ dependencies = [\n \n [[package]]\n name = \"tokio-util\"\n-version = \"0.7.16\"\n+version = \"0.7.17\"\n source = \"registry+https://github.com/rust-lang/crates.io-index\"\n-checksum = \"14307c986784f72ef81c89db7d9e28d6ac26d16213b109ea501696195e6e3ce5\"\n+checksum = \"2efa149fe76073d6e8fd97ef4f4eca7b67f599660115591483572e406e165594\"\n dependencies = [\n \"bytes\",\n \"futures-core\",\n@@ -4075,7 +4075,7 @@ checksum = \"81383ab64e72a7a8b8e13130c49e3dab29def6d0c7d76a03087b3cf71c5c6903\"\n dependencies = [\n \"proc-macro2\",\n \"quote\",\n- \"syn 2.0.108\",\n+ \"syn 2.0.110\",\n ]\n \n [[package]]\n@@ -4125,7 +4125,7 @@ checksum = \"70977707304198400eb4835a78f6a9f928bf41bba420deb8fdb175cd965d77a7\"\n dependencies = [\n \"proc-macro2\",\n \"quote\",\n- \"syn 2.0.108\",\n+ \"syn 2.0.110\",\n ]\n \n [[package]]\n@@ -4356,7 +4356,7 @@ dependencies = [\n \"bumpalo\",\n \"proc-macro2\",\n \"quote\",\n- \"syn 2.0.108\",\n+ \"syn 2.0.110\",\n \"wasm-bindgen-shared\",\n ]\n \n@@ -4521,7 +4521,7 @@ checksum = 
\"053e2e040ab57b9dc951b72c264860db7eb3b0200ba345b4e4c3b14f67855ddf\"\n dependencies = [\n \"proc-macro2\",\n \"quote\",\n- \"syn 2.0.108\",\n+ \"syn 2.0.110\",\n ]\n \n [[package]]\n@@ -4532,7 +4532,7 @@ checksum = \"3f316c4a2570ba26bbec722032c4099d8c8bc095efccdc15688708623367e358\"\n dependencies = [\n \"proc-macro2\",\n \"quote\",\n- \"syn 2.0.108\",\n+ \"syn 2.0.110\",\n ]\n \n [[package]]\n@@ -4995,7 +4995,7 @@ checksum = \"b659052874eb698efe5b9e8cf382204678a0086ebf46982b79d6ca3182927e5d\"\n dependencies = [\n \"proc-macro2\",\n \"quote\",\n- \"syn 2.0.108\",\n+ \"syn 2.0.110\",\n \"synstructure\",\n ]\n \n@@ -5016,7 +5016,7 @@ checksum = \"88d2b8d9c68ad2b9e4340d7832716a4d21a22a1154777ad56ea55c51a9cf3831\"\n dependencies = [\n \"proc-macro2\",\n \"quote\",\n- \"syn 2.0.108\",\n+ \"syn 2.0.110\",\n ]\n \n [[package]]\n@@ -5036,7 +5036,7 @@ checksum = \"d71e5d6e06ab090c67b5e44993ec16b72dcbaabc526db883a360057678b48502\"\n dependencies = [\n \"proc-macro2\",\n \"quote\",\n- \"syn 2.0.108\",\n+ \"syn 2.0.110\",\n \"synstructure\",\n ]\n \n@@ -5079,5 +5079,5 @@ checksum = \"eadce39539ca5cb3985590102671f2567e659fca9666581ad3411d59207951f3\"\n dependencies = [\n \"proc-macro2\",\n \"quote\",\n- \"syn 2.0.108\",\n+ \"syn 2.0.110\",\n ]\ndiff --git a/cli/jacquard b/cli/jacquard\ndeleted file mode 160000\nindex d533482..0000000\n--- a/cli/jacquard\n+++ /dev/null\n@@ -1 +0,0 @@\n-Subproject commit d533482a61f540586b1eea620b8e9a01a59d5650\n-- \n2.50.1 (Apple Git-155)\n\n\nFrom 122e18dd70661462c0e9e325636a820367f0d893 Mon Sep 17 00:00:00 2001\nFrom: \"@nekomimi.pet\" \u003cmeowskulls@nekomimi.pet\u003e\nDate: Thu, 13 Nov 2025 02:31:33 -0500\nSubject: [PATCH 6/6] update flake\n\n---\n cli/Cargo.lock | 2 +-\n cli/Cargo.toml | 2 +-\n crates.nix | 29 ++++++++++++++++++++++++++++-\n flake.nix | 19 +++++++++++++++++--\n 4 files changed, 47 insertions(+), 5 deletions(-)\n\ndiff --git a/cli/Cargo.lock b/cli/Cargo.lock\nindex 8c1748e..c553def 100644\n--- a/cli/Cargo.lock\n+++ b/cli/Cargo.lock\n@@ -4913,7 +4913,7 @@ dependencies = [\n \n [[package]]\n name = \"wisp-cli\"\n-version = \"0.1.0\"\n+version = \"0.2.0\"\n dependencies = [\n \"axum\",\n \"base64 0.22.1\",\ndiff --git a/cli/Cargo.toml b/cli/Cargo.toml\nindex c3eb22c..af3bc4b 100644\n--- a/cli/Cargo.toml\n+++ b/cli/Cargo.toml\n@@ -1,6 +1,6 @@\n [package]\n name = \"wisp-cli\"\n-version = \"0.1.0\"\n+version = \"0.2.0\"\n edition = \"2024\"\n \n [features]\ndiff --git a/crates.nix b/crates.nix\nindex 9dbb8e0..21fea62 100644\n--- a/crates.nix\n+++ b/crates.nix\n@@ -19,6 +19,7 @@\n targets.x86_64-pc-windows-gnu.latest.rust-std\n targets.x86_64-unknown-linux-gnu.latest.rust-std\n targets.aarch64-apple-darwin.latest.rust-std\n+ targets.aarch64-unknown-linux-gnu.latest.rust-std\n ];\n # configure crates\n nci.crates.\"wisp-cli\" = {\n@@ -26,8 +27,20 @@\n dev.runTests = false;\n release.runTests = false;\n };\n- targets.\"x86_64-unknown-linux-gnu\" = {\n+ targets.\"x86_64-unknown-linux-gnu\" = let\n+ targetPkgs = pkgs.pkgsCross.gnu64;\n+ targetCC = targetPkgs.stdenv.cc;\n+ targetCargoEnvVarTarget = targetPkgs.stdenv.hostPlatform.rust.cargoEnvVarTarget;\n+ in rec {\n default = true;\n+ depsDrvConfig.mkDerivation = {\n+ nativeBuildInputs = [targetCC];\n+ };\n+ depsDrvConfig.env = rec {\n+ TARGET_CC = \"${targetCC.targetPrefix}cc\";\n+ \"CARGO_TARGET_${targetCargoEnvVarTarget}_LINKER\" = TARGET_CC;\n+ };\n+ drvConfig = depsDrvConfig;\n };\n targets.\"x86_64-pc-windows-gnu\" = let\n targetPkgs = pkgs.pkgsCross.mingwW64;\n@@ -58,6 +71,20 @@\n };\n drvConfig = 
depsDrvConfig;\n };\n+ targets.\"aarch64-unknown-linux-gnu\" = let\n+ targetPkgs = pkgs.pkgsCross.aarch64-multiplatform;\n+ targetCC = targetPkgs.stdenv.cc;\n+ targetCargoEnvVarTarget = targetPkgs.stdenv.hostPlatform.rust.cargoEnvVarTarget;\n+ in rec {\n+ depsDrvConfig.mkDerivation = {\n+ nativeBuildInputs = [targetCC];\n+ };\n+ depsDrvConfig.env = rec {\n+ TARGET_CC = \"${targetCC.targetPrefix}cc\";\n+ \"CARGO_TARGET_${targetCargoEnvVarTarget}_LINKER\" = TARGET_CC;\n+ };\n+ drvConfig = depsDrvConfig;\n+ };\n };\n };\n }\ndiff --git a/flake.nix b/flake.nix\nindex 1870e01..a8f33e2 100644\n--- a/flake.nix\n+++ b/flake.nix\n@@ -26,11 +26,26 @@\n ...\n }: let\n crateOutputs = config.nci.outputs.\"wisp-cli\";\n+ mkRenamedPackage = name: pkg: pkgs.runCommand name {} ''\n+ mkdir -p $out/bin\n+ cp ${pkg}/bin/wisp-cli $out/bin/${name}\n+ '';\n in {\n devShells.default = crateOutputs.devShell;\n packages.default = crateOutputs.packages.release;\n- packages.wisp-cli-windows = crateOutputs.allTargets.\"x86_64-pc-windows-gnu\".packages.release;\n- packages.wisp-cli-darwin = crateOutputs.allTargets.\"aarch64-apple-darwin\".packages.release;\n+ packages.wisp-cli-x86_64-linux = mkRenamedPackage \"wisp-cli-x86_64-linux\" crateOutputs.packages.release;\n+ packages.wisp-cli-aarch64-linux = mkRenamedPackage \"wisp-cli-aarch64-linux\" crateOutputs.allTargets.\"aarch64-unknown-linux-gnu\".packages.release;\n+ packages.wisp-cli-x86_64-windows = mkRenamedPackage \"wisp-cli-x86_64-windows.exe\" crateOutputs.allTargets.\"x86_64-pc-windows-gnu\".packages.release;\n+ packages.wisp-cli-aarch64-darwin = mkRenamedPackage \"wisp-cli-aarch64-darwin\" crateOutputs.allTargets.\"aarch64-apple-darwin\".packages.release;\n+ packages.all = pkgs.symlinkJoin {\n+ name = \"wisp-cli-all\";\n+ paths = [\n+ config.packages.wisp-cli-x86_64-linux\n+ config.packages.wisp-cli-aarch64-linux\n+ config.packages.wisp-cli-x86_64-windows\n+ config.packages.wisp-cli-aarch64-darwin\n+ ];\n+ };\n };\n };\n }\n-- \n2.50.1 (Apple Git-155)\n\n",
"target": {
"branch": "main",
"repo": "at://did:plc:ttdrpj45ibqunmfhdsb4zdwq/sh.tangled.repo/3m4wgtddvwv22"
},
"title": "rudimentry _redirects support, incremental uploading for cli"
}
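
For readers skimming the patch: PATCH 6/6 reworks the flake so each build target's binary is published under a distinct name and then merged into a single `packages.all` output. Below is a minimal sketch of that pattern, extracted from the `flake.nix` hunk above; `release` and `releaseWindows` are hypothetical stand-ins for the nci-built `crateOutputs…packages.release` derivations, each of which installs the binary as `bin/wisp-cli`.

```nix
# Sketch of the per-target packaging pattern added in PATCH 6/6.
# Assumptions: `pkgs` is a nixpkgs instance; `release` / `releaseWindows`
# stand in for the nci-built derivations (binary installed as bin/wisp-cli).
{ pkgs, release, releaseWindows }:
let
  # Copy the wisp-cli binary out under a target-specific name so the
  # outputs of several targets can be merged without path collisions.
  mkRenamedPackage = name: pkg:
    pkgs.runCommand name { } ''
      mkdir -p $out/bin
      cp ${pkg}/bin/wisp-cli $out/bin/${name}
    '';
in
pkgs.symlinkJoin {
  # One tree holding every platform's renamed binary, mirroring the
  # patch's `packages.all` (handy for publishing all release artifacts).
  name = "wisp-cli-all";
  paths = [
    (mkRenamedPackage "wisp-cli-x86_64-linux" release)
    (mkRenamedPackage "wisp-cli-x86_64-windows.exe" releaseWindows)
  ];
}
```

The companion `crates.nix` hunks apply the same cross-compilation recipe to each added target: put the `pkgsCross` toolchain in `nativeBuildInputs` and point `TARGET_CC` and the matching `CARGO_TARGET_<TARGET>_LINKER` variable at it, which is what lets `aarch64-unknown-linux-gnu` build from an x86_64 host.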