Deploying SvelteKit to Cloudflare: Setup & CSP

Getting SvelteKit running on Cloudflare used to mean juggling multiple configuration files, installing adapters manually, and debugging environment variable issues across dev and production. In January 2025, the Svelte team shipped an update that changes this. The Svelte CLI can now fully set up a Cloudflare project from scratch, bake in Content Security Policy (CSP) support for hydration, and handle streaming file uploads in forms. If you've been hesitant about Cloudflare or burned out on setup overhead, this is worth a closer look.
What Actually Changed
The big shifts happened across three areas: CLI automation, security headers, and file handling.
First, the sv CLI (Svelte's new command-line tool) now includes built-in support for Cloudflare Workers and Pages. When you run npx sv create my-app, you can choose Cloudflare as your target right during project setup. The CLI then installs the right adapter, generates a wrangler.json config, and connects your bindings (KV, D1, Durable Objects) to your SvelteKit hooks. No more copy-pasting config from five different docs.
Second, the hydratable option in Svelte 5.46.0 added a csp parameter. This matters because hydration injects an inline script block into your HTML head. If you enforce a strict Content Security Policy that blocks inline scripts, that injection fails silently and your app won't become interactive. The new csp option tells Svelte to use nonce-based CSP instead of relying on inline script tags. You supply the nonce on each request, and hydration works cleanly.
Third, file upload streaming in SvelteKit 2.49.0 lets form submissions send file data to the server as it arrives, not after the entire upload finishes. This cuts perceived latency and lets you start processing or validating files before the user's browser finishes uploading.
These three features are independent, but together they solve a real pain point: how do you deploy a performant, secure, real-world SvelteKit app to Cloudflare without wrestling configuration?
The Automatic Setup Flow
Open a terminal and run:
npx sv create my-app

When prompted, select the template (I'll use "demo" for this walkthrough). Say yes to TypeScript. Then, instead of leaving the CLI and manually running npm install @sveltejs/adapter-cloudflare, the CLI will ask you what you want to add. Select the "sveltekit-adapter" add-on.
? What would you like to add to your project?
[ ] eslint
[ ] prettier
> [x] sveltekit-adapter
[ ] tailwindcss
[ ] drizzle
[ ] ...

The sveltekit-adapter add-on detects your environment and installs the Cloudflare adapter for you. It also generates a wrangler.json file in your project root:
{
  "name": "my-app",
  "type": "javascript",
  "main": "build/index.js",
  "compatibility_date": "2025-01-12",
  "env": {
    "production": {
      "name": "my-app-production",
      "kv_namespaces": [
        {
          "binding": "CACHE",
          "id": "abc123...",
          "preview_id": "xyz789..."
        }
      ],
      "d1_databases": [
        {
          "binding": "DB",
          "database_name": "my-app-db",
          "database_id": "def456..."
        }
      ]
    }
  }
}

This file binds Cloudflare resources to your SvelteKit app. In your hooks.server.ts, you access them via the platform property:
export const handle = async ({ event, resolve }) => {
  const { platform } = event;
  const cache = platform?.env?.CACHE;
  const db = platform?.env?.DB;
  // Use cache and db in your routes
  return resolve(event);
};

Deploy with:
npx wrangler deploy

That's it. No adapter installation step, no searching for config keys, no "why is my env variable undefined in production?"
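One optional extra: declaring the bindings on App.Platform gives you typed access to them in hooks and endpoints. A minimal sketch for src/app.d.ts, assuming the CACHE and DB bindings from the wrangler.json above and that @cloudflare/workers-types is installed as a dev dependency:

// src/app.d.ts
// A sketch: declare the bindings so platform?.env?.CACHE and platform?.env?.DB are typed.
// Assumes @cloudflare/workers-types is installed; binding names match the wrangler.json above.
import type { KVNamespace, D1Database } from '@cloudflare/workers-types';

declare global {
  namespace App {
    interface Platform {
      env?: {
        CACHE: KVNamespace;
        DB: D1Database;
      };
    }
  }
}

export {};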

Securing Hydration with CSP
Content Security Policy headers tell the browser what resources it's allowed to load. A strict CSP might look like:
Content-Security-Policy:
script-src 'nonce-abc123def456';
style-src 'nonce-xyz789';
connect-src https://api.example.com;
default-src 'self';
This says: only load scripts with nonce abc123def456, only load styles with nonce xyz789, and connect to your API. Everything else is blocked.
By default, Svelte 5 injects hydration as an inline script. If your CSP doesn't allow inline scripts (which is the most secure option), hydration never runs and your app stays a static snapshot. You'd see no errors, just unresponsive buttons and forms.
Svelte 5.46.0 added a csp option to the hydratable render function. In your SvelteKit +page.server.ts:
import { render } from 'svelte/server';
import App from './App.svelte';

export async function load({ url, setHeaders }) {
  // Generate a random nonce for this request
  const nonce = crypto.getRandomValues(new Uint8Array(16))
    .reduce((a, b) => a + b.toString(16).padStart(2, '0'), '');

  // Render once, passing the nonce so the CSP-aware hydration script carries it
  const { html } = render(App, {
    props: { url: url.pathname },
    hydrate: { nonce }
  });

  // Send the matching CSP header with this response
  setHeaders({
    'Content-Security-Policy': `script-src 'nonce-${nonce}'; style-src 'nonce-${nonce}'; default-src 'self'`
  });

  return { html };
}

In your app.html:
<!DOCTYPE html>
<html>
  <head>
    <meta charset="utf-8" />
    <meta name="viewport" content="width=device-width" />
    %sveltekit.head%
  </head>
  <body>
    <div id="app">%sveltekit.body%</div>
  </body>
</html>

Svelte emits the hydration script with your nonce attached. The browser sees a nonce that matches your CSP header and allows it. Hydration runs, interactivity is restored, and everything else stays blocked. This is the gold standard for security.
The catch: you must generate a new nonce per request. Reusing the same nonce for all users defeats the purpose. If you're using Cloudflare Workers or Pages Functions, generating a nonce on every request is cheap and fast (usually <1ms).
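If you want something more conventional than a hex string, a base64-encoded nonce works the same way. A minimal sketch of a per-request helper (generateNonce is just an illustrative name), using the Web Crypto and btoa globals that the Workers runtime exposes:

// A sketch of a per-request nonce helper; generateNonce is a hypothetical name.
// crypto.getRandomValues and btoa are both available in the Workers runtime.
function generateNonce(): string {
  const bytes = crypto.getRandomValues(new Uint8Array(16));
  return btoa(String.fromCharCode(...bytes)); // 128 bits of entropy, base64-encoded
}

// Usage: call once per request, never cache or reuse across requests.
const nonce = generateNonce();
const csp = `script-src 'nonce-${nonce}'; default-src 'self'`;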
Handling File Uploads with Streaming
File uploads have always been awkward in web forms. You submit a form with a file input, the browser buffers the entire file in memory, and then sends it all at once. If the file is large or the network is slow, the user stares at a spinner for a long time.
SvelteKit 2.49.0 introduced streaming file uploads. Your form can send file data to the server as it arrives, and you can start processing or validating it before the upload completes. Here's a practical example.
Create a +page.server.ts with a form action:
import { fail } from '@sveltejs/kit';

export const actions = {
  upload: async ({ request }) => {
    const data = await request.formData();
    const file = data.get('file');

    if (!file || !(file instanceof File)) {
      return fail(400, { error: 'No file provided' });
    }
    if (file.size > 50 * 1024 * 1024) {
      return fail(413, { error: 'File too large (max 50MB)' });
    }

    // Process the file stream
    const chunks = [];
    const reader = file.stream().getReader();
    try {
      while (true) {
        const { done, value } = await reader.read();
        if (done) break;
        chunks.push(value);
        // You could validate, scan, or store chunks here
      }
    } catch (err) {
      return fail(500, { error: 'Upload failed' });
    }

    return { success: true, fileName: file.name };
  }
};

And a form in +page.svelte:
<script>
  import { enhance } from '$app/forms';
  let uploading = false;
</script>

<form
  method="POST"
  action="?/upload"
  enctype="multipart/form-data"
  use:enhance={() => {
    uploading = true;
    return async ({ result }) => {
      uploading = false;
      if (result.type === 'success') {
        alert(`Uploaded ${result.data.fileName}`);
      }
    };
  }}
>
  <input type="file" name="file" required />
  <button disabled={uploading}>
    {uploading ? 'Uploading...' : 'Upload'}
  </button>
</form>

The key difference: the file is read as a stream on the server. Each chunk arrives and can be processed independently. If you're storing to an S3-like bucket, you can pipe chunks directly to the storage service instead of buffering in memory. If you're running a virus scanner, you can scan chunks as they arrive. This is much more efficient than the old approach of waiting for the entire file before doing anything.
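As one illustration of per-chunk validation, you could reject a file as soon as its first chunk doesn't look like the type it claims to be. A hedged sketch (the isPdfChunk helper and the magic-byte check are illustrative, not part of SvelteKit):

// A sketch: reject non-PDF uploads from the very first chunk instead of after the full read.
// isPdfChunk is a hypothetical helper; 0x25 0x50 0x44 0x46 0x2d is the "%PDF-" magic prefix.
function isPdfChunk(chunk: Uint8Array): boolean {
  const magic = [0x25, 0x50, 0x44, 0x46, 0x2d]; // "%PDF-"
  return magic.every((byte, i) => chunk[i] === byte);
}

async function validateWhileReading(file: File): Promise<boolean> {
  const reader = file.stream().getReader();
  let first = true;
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    if (first && !isPdfChunk(value)) {
      await reader.cancel(); // stop pulling data as soon as validation fails
      return false;
    }
    first = false;
    // ...hash, scan, or forward the chunk here
  }
  return true;
}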
On Cloudflare, you can pipe uploaded chunks to R2 (Cloudflare's S3 equivalent):
import { fail } from '@sveltejs/kit';

export const actions = {
  upload: async ({ request, platform }) => {
    const data = await request.formData();
    const file = data.get('file');

    if (!file || !(file instanceof File)) {
      return fail(400, { error: 'No file provided' });
    }

    const r2 = platform.env.BUCKET; // Your R2 bucket binding
    const key = `uploads/${Date.now()}-${file.name}`;

    try {
      await r2.put(key, file.stream(), {
        httpMetadata: { contentType: file.type }
      });
      return { success: true, key };
    } catch (err) {
      return fail(500, { error: 'Storage failed' });
    }
  }
};

This streams the file directly from the form to R2 without buffering on the server. Memory usage is constant, and latency is low.
Putting It Together: A Real Example
Let me walk through a concrete scenario: a SvelteKit app that collects user documents and stores them securely on Cloudflare R2, with a strict CSP and automatic deployment.
First, create the project:
npx sv create secure-docs
cd secure-docs
npx sv add sveltekit-adapter
npm run build

Set up your svelte.config.js:
import adapter from '@sveltejs/adapter-cloudflare';

export default {
  kit: {
    adapter: adapter(),
    csp: {
      mode: 'hash'
    }
  }
};

Create a +page.server.ts with a document upload handler:
import { fail } from '@sveltejs/kit';

export async function load({ platform }) {
  return {
    isProduction: platform?.env?.ENVIRONMENT === 'production'
  };
}

export const actions = {
  upload: async ({ request, platform }) => {
    const data = await request.formData();
    const file = data.get('document');

    if (!file || !(file instanceof File)) {
      return fail(400, { missing: true });
    }
    if (!['application/pdf', 'image/png', 'image/jpeg'].includes(file.type)) {
      return fail(415, { unsupported: true });
    }
    if (file.size > 10 * 1024 * 1024) {
      return fail(413, { toolarge: true });
    }

    const bucket = platform?.env?.DOCUMENTS;
    if (!bucket) {
      return fail(500, { message: 'Storage not configured' });
    }

    const timestamp = Date.now();
    const key = `docs/${timestamp}/${file.name}`;

    try {
      await bucket.put(key, file.stream(), {
        httpMetadata: { contentType: file.type }
      });
      return { success: true, url: `/documents/${key}` };
    } catch (err) {
      console.error('Upload error:', err);
      return fail(500, { message: 'Upload failed' });
    }
  }
};

And a form component in +page.svelte:
<script>
  import { enhance } from '$app/forms';
  let loading = false;
  let message = '';
</script>

<form
  method="POST"
  action="?/upload"
  enctype="multipart/form-data"
  use:enhance={() => {
    loading = true;
    return async ({ result }) => {
      loading = false;
      if (result.type === 'success') {
        message = 'Document uploaded successfully';
      } else if (result.type === 'failure') {
        message = result.data?.message || 'Upload failed';
      }
    };
  }}
>
  <input type="file" name="document" accept=".pdf, .png, .jpg" required />
  <button disabled={loading} type="submit">
    {loading ? 'Uploading...' : 'Upload Document'}
  </button>
  {#if message}
    <p>{message}</p>
  {/if}
</form>

Configure CSP in your hooks:
// hooks.server.ts
export async function handle({ event, resolve }) {
  const nonce = crypto.getRandomValues(new Uint8Array(16))
    .reduce((a, b) => a + b.toString(16).padStart(2, '0'), '');
  event.locals.nonce = nonce;

  const response = await resolve(event);
  response.headers.append(
    'Content-Security-Policy',
    `script-src 'nonce-${nonce}'; style-src 'nonce-${nonce}'; default-src 'self'; connect-src 'self'`
  );
  return response;
}

Set up your wrangler.json:
{
  "name": "secure-docs",
  "main": "build/index.js",
  "type": "javascript",
  "compatibility_date": "2025-01-12",
  "r2_buckets": [
    {
      "binding": "DOCUMENTS",
      "bucket_name": "secure-docs-bucket",
      "preview_bucket_name": "secure-docs-preview"
    }
  ],
  "env": {
    "production": {
      "name": "secure-docs-prod",
      "r2_buckets": [
        {
          "binding": "DOCUMENTS",
          "bucket_name": "secure-docs-production",
          "preview_bucket_name": "secure-docs-preview"
        }
      ]
    }
  }
}

Deploy:
npx wrangler deploy --env production

That's a full, production-ready document upload system. The file never gets buffered in a server process. It streams from the form directly to R2. CSP is enforced per-request using a random nonce. And the entire setup happened with three CLI commands and a handful of configuration files.
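The action above returns a /documents/... URL, but the read path isn't shown. A hedged sketch of what a matching GET endpoint could look like (the route path is an assumption, not part of the example above), in src/routes/documents/[...key]/+server.ts:

// src/routes/documents/[...key]/+server.ts
// A sketch of serving uploaded objects back from R2; the route shape is assumed.
import { error } from '@sveltejs/kit';

export async function GET({ params, platform }) {
  const bucket = platform?.env?.DOCUMENTS;
  if (!bucket) throw error(500, 'Storage not configured');

  // params.key is the full object key, e.g. "docs/1736700000000/contract.pdf"
  const object = await bucket.get(params.key);
  if (!object) throw error(404, 'Document not found');

  // Stream the object body straight through; it never gets buffered in the Worker.
  return new Response(object.body, {
    headers: {
      'Content-Type': object.httpMetadata?.contentType ?? 'application/octet-stream'
    }
  });
}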
Trade-offs and When to Use This
Cloudflare Workers and Pages are excellent for stateless, compute-light workloads. They scale automatically, cost predictably (you pay per request, not per running process), and cold starts are negligible (<10ms typical).
But they have limits. Each request has a 30-second timeout and 128MB of memory. If you need long-running background jobs or gigabytes of RAM for data processing, a traditional Node server on Vercel or a self-hosted Docker instance is better. Streaming file uploads help here, but anything that needs the whole file in memory at once still won't squeeze a 1GB file into a 128MB cap.
Also, Cloudflare KV is eventually consistent, and D1 reads can lag behind writes when read replication is enabled. If you write a value and read it back immediately, you might get the old version. This is fine for caches and session storage, but not for transactional data where consistency is critical. For those cases, external databases like Postgres or DynamoDB are safer.
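To make the KV caveat concrete, here is a hedged sketch of a read-after-write in a load function, reusing the CACHE binding from the earlier wrangler.json:

// A sketch of the KV read-after-write caveat; CACHE is the binding from the earlier wrangler.json.
export async function load({ platform }) {
  const cache = platform?.env?.CACHE;

  await cache.put('feature-flags', JSON.stringify({ newCheckout: true }));

  // The same location usually sees the new value right away, but other edge
  // locations can keep serving the old one for up to about a minute while the
  // write propagates. Don't build transactional logic on top of this.
  const flags = await cache.get('feature-flags', 'json');

  return { flags };
}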
CSP with nonces is more secure than allowing arbitrary inline scripts ('unsafe-inline'), but it adds a little per-request work to generate the nonce. On Cloudflare, that cost is negligible. On slower platforms, you might prefer hash-based CSP or a different hydration strategy; reusing a cached nonce across requests would defeat the point.
File upload streaming is a win across the board: it reduces server memory, cuts latency, and lets you process data as it arrives. The only gotcha is that older browsers don't support the Streams API. But for modern apps, it's worth adopting.
Next Steps
If you're running SvelteKit on a Node server or Vercel, migrating to Cloudflare is straightforward:
- Run npx sv add sveltekit-adapter="adapter:cloudflare" to install and configure the adapter.
- Create a wrangler.json and bind your resources (KV, D1, R2).
- Update your hooks to generate nonces for CSP.
- Test locally with wrangler dev.
- Deploy with wrangler deploy.
If you're already on Cloudflare, audit your forms to see if streaming uploads could save you memory or latency. And if you're not using CSP yet, start with a report-only policy (Content-Security-Policy-Report-Only) to see what breaks, then tighten it.
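A hedged sketch of what that report-only phase could look like in hooks.server.ts (the /csp-report path is a hypothetical endpoint; you'd need a route or an external service to receive the reports):

// hooks.server.ts -- a sketch of a report-only rollout; /csp-report is a hypothetical endpoint.
export async function handle({ event, resolve }) {
  const response = await resolve(event);

  // Nothing is blocked yet; violations are only reported, so you can see what
  // the eventual strict policy would break before you start enforcing it.
  response.headers.set(
    'Content-Security-Policy-Report-Only',
    "default-src 'self'; script-src 'self'; style-src 'self'; report-uri /csp-report"
  );
  return response;
}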
The tooling has genuinely improved. The setup that used to take an hour and a mental model of six moving parts now takes 10 minutes and a linear CLI flow. That's worth paying attention to.

