
Auth

LiveStore doesn’t include built-in authentication or authorization support, but you can implement it in your app’s logic.

Use the syncPayload store option to send a custom payload to your sync backend.

The following example sends the authenticated user’s JWT to the server.

```tsx
export const AuthenticatedProvider = () => (
  <LiveStoreProvider
    schema={schema}
    // storeId isolates stores from each other (useful for multi-tenancy) and
    // is also used for persistence. Make sure to configure the same storeId
    // in the LiveStore Devtools (e.g. in the Vite plugin). Defaults to 'default'.
    storeId={storeId}
    adapter={adapter}
    // Pass batchUpdates from react-dom (or react-native) so LiveStore can
    // apply multiple events in a single render:
    //   import { unstable_batchedUpdates as batchUpdates } from 'react-dom'
    batchUpdates={batchUpdates}
    syncPayload={{ authToken: user.jwt }} // Using a JWT
  >
    {/* ... */}
    {children}
  </LiveStoreProvider>
)
```

On the sync server, validate the token and allow or reject the sync based on the result. See the following example:

```ts
import { makeDurableObject, makeWorker } from '@livestore/sync-cf/cf-worker'
import * as jose from 'jose'

const JWT_SECRET = 'a-string-secret-at-least-256-bits-long'

export class SyncBackendDO extends makeDurableObject({
  onPush: async (message) => {
    console.log('onPush', message.batch)
  },
  onPull: async (message) => {
    console.log('onPull', message)
  },
}) {}

export default makeWorker({
  // Binding name of the sync Durable Object declared in the wrangler config
  syncBackendBinding: 'SYNC_BACKEND_DO',
  // Validates the payload during WebSocket connection establishment
  validatePayload: async (payload: any, context) => {
    const { storeId } = context
    const { authToken } = payload
    if (!authToken) {
      throw new Error('No auth token provided')
    }
    const user = await getUserFromToken(authToken)
    if (!user) {
      throw new Error('Invalid auth token')
    }
    // User is authenticated!
    console.log('Sync backend payload', JSON.stringify(user, null, 2))
    // Check if the token is expired (jose.jwtVerify already rejects expired
    // tokens; this is a defensive double-check on the verified claims)
    if (user.exp && user.exp < Date.now() / 1000) {
      throw new Error('Token expired')
    }
    await checkUserAccess(user, storeId)
  },
  enableCORS: true,
})

async function getUserFromToken(token: string): Promise<jose.JWTPayload | undefined> {
  try {
    const { payload } = await jose.jwtVerify(token, new TextEncoder().encode(JWT_SECRET))
    return payload
  } catch (error) {
    console.log('⚠️ Error verifying token', error)
  }
}

async function checkUserAccess(payload: jose.JWTPayload, storeId: string): Promise<void> {
  // Check if the user is authorized to access the store
  console.log('Checking access for store', storeId, 'with payload', payload)
}
```

The `SYNC_BACKEND_DO` binding is declared in `wrangler.toml`:

```toml
[[durable_objects.bindings]]
name = "SYNC_BACKEND_DO"
class_name = "SyncBackendDO"

[[migrations]]
tag = "v1"
new_sqlite_classes = ["SyncBackendDO"]
```

The above example uses jose, a popular JavaScript module that supports JWTs. It works across various runtimes, including Node.js, Cloudflare Workers, Deno, Bun, and others.

The validatePayload function receives the sync payload, checks that an authToken is present, and verifies that the token is valid and hasn’t expired. If all checks pass, sync continues as normal. If any check fails, the server rejects the sync.

The client app still works as expected, but saves data locally. If the user re-authenticates or refreshes the token later, LiveStore syncs any local changes made while the user was unauthenticated.
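On the client, you may want to refresh the token proactively so sync is not interrupted by an expired `syncPayload`. A minimal sketch (the helper name and the 60-second skew are assumptions, and the decode deliberately skips signature verification since the sync server verifies the token anyway):

```typescript
// Decode a JWT's `exp` claim locally (no signature check; the sync server
// still verifies the token) to decide whether to refresh before syncing.
const tokenExpiresSoon = (jwt: string, skewSeconds = 60): boolean => {
  const [, payloadB64] = jwt.split('.')
  if (payloadB64 === undefined) return true // malformed token: treat as stale
  const claims = JSON.parse(Buffer.from(payloadB64, 'base64url').toString('utf8'))
  if (typeof claims.exp !== 'number') return false // no expiry claim
  return claims.exp - skewSeconds <= Date.now() / 1000
}
```

If `tokenExpiresSoon(user.jwt)` returns true, fetch a fresh token before rendering the provider so the updated `syncPayload` is sent on the next sync connection.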

Re-validate payload inside the Durable Object


When you rely on syncPayload, treat it as untrusted input. Decode the token inside validatePayload to gate the connection, and then repeat the same verification inside the Durable Object before trusting per-push metadata.

type
type SyncPayload = {
authToken?: string;
userId?: string;
}
SyncPayload
= {
authToken?: string
authToken
?: string;
userId?: string
userId
?: string }
type
type AuthorizedSession = {
authToken: string;
userId: string;
}
AuthorizedSession
= {
authToken: string
authToken
: string
userId: string
userId
: string
}
const
const ensureAuthorized: (payload: unknown) => AuthorizedSession
ensureAuthorized
= (
payload: unknown
payload
: unknown):
type AuthorizedSession = {
authToken: string;
userId: string;
}
AuthorizedSession
=> {
if (
payload: unknown
payload
===
var undefined
undefined
||
payload: {} | null
payload
=== null || typeof
payload: {}
payload
!== 'object') {
throw new
var Error: ErrorConstructor
new (message?: string, options?: ErrorOptions) => Error (+2 overloads)
Error
('Missing auth payload')
}
const {
const authToken: string | undefined
authToken
,
const userId: string | undefined
userId
} =
payload: object
payload
as
type SyncPayload = {
authToken?: string;
userId?: string;
}
SyncPayload
if (!
const authToken: string | undefined
authToken
) {
throw new
var Error: ErrorConstructor
new (message?: string, options?: ErrorOptions) => Error (+2 overloads)
Error
('Missing auth token')
}
const
const claims: Claims
claims
=
function verifyJwt(token: string): Claims
verifyJwt
(
const authToken: string
authToken
)
if (!
const claims: Claims
claims
.
sub?: string | undefined
sub
) {
throw new
var Error: ErrorConstructor
new (message?: string, options?: ErrorOptions) => Error (+2 overloads)
Error
('Token missing subject claim')
}
if (
const userId: string | undefined
userId
!==
var undefined
undefined
&&
const userId: string
userId
!==
const claims: Claims
claims
.
sub?: string
sub
) {
throw new
var Error: ErrorConstructor
new (message?: string, options?: ErrorOptions) => Error (+2 overloads)
Error
('Payload userId mismatch')
}
return {
authToken: string
authToken
,
userId: string
userId
:
const claims: Claims
claims
.
sub?: string
sub
}
}
export default
makeWorker<Env, undefined, JsonValue>(options: MakeWorkerOptions<Env, JsonValue>): CFWorker<Env, undefined>

Produces a Cloudflare Worker fetch handler that delegates sync traffic to the Durable Object identified by syncBackendBinding.

For more complex setups prefer implementing a custom fetch and call

handleSyncRequest

from the branch that handles LiveStore sync requests.

makeWorker
({
syncBackendBinding: string

Binding name of the sync Durable Object declared in wrangler config.

syncBackendBinding
: 'SYNC_BACKEND_DO',
validatePayload?: (payload: JsonValue, context: {
storeId: string;
}) => void | Promise<void>

Validates the (optionally decoded) payload during WebSocket connection establishment. If

syncPayloadSchema

is provided, payload will be of the schema’s inferred type.

validatePayload
: (
payload: JsonValue
payload
) => {
const ensureAuthorized: (payload: unknown) => AuthorizedSession
ensureAuthorized
(
payload: JsonValue
payload
)
},
})
export class
class SyncBackendDO
SyncBackendDO
extends
function makeDurableObject(options?: MakeDurableObjectClassOptions): {
new (ctx: DoState, env: Env): DoObject<SyncBackendRpcInterface>;
}

Creates a Durable Object class for handling WebSocket-based sync. A sync durable object is uniquely scoped to a specific storeId.

The sync DO supports 3 transport modes:

  • HTTP JSON-RPC
  • WebSocket
  • Durable Object RPC calls (only works in combination with @livestore/adapter-cf)

Example:

// In your Cloudflare Worker file
import { makeDurableObject } from '@livestore/sync-cf/cf-worker'
export class SyncBackendDO extends makeDurableObject({
onPush: async (message) => {
console.log('onPush', message.batch)
},
onPull: async (message) => {
console.log('onPull', message)
},
}) {}

wrangler.toml

[[durable_objects.bindings]]
name = "SYNC_BACKEND_DO"
class_name = "SyncBackendDO"
[[migrations]]
tag = "v1"
new_sqlite_classes = ["SyncBackendDO"]

makeDurableObject
({
onPush?: (message: SyncMessage.PushRequest, context: {
storeId: StoreId;
payload?: JsonValue;
}) => SyncOrPromiseOrEffect<void>
onPush
: async (
message: {
readonly batch: readonly {
readonly name: string;
readonly args: any;
readonly seqNum: number & Brand<"GlobalEventSequenceNumber">;
readonly parentSeqNum: number & Brand<"GlobalEventSequenceNumber">;
readonly clientId: string;
readonly sessionId: string;
}[];
readonly backendId: Option<string>;
}
message
:
import SyncMessage
SyncMessage
.
type PushRequest = {
readonly batch: readonly {
readonly name: string;
readonly args: any;
readonly seqNum: number & Brand<"GlobalEventSequenceNumber">;
readonly parentSeqNum: number & Brand<"GlobalEventSequenceNumber">;
readonly clientId: string;
readonly sessionId: string;
}[];
readonly backendId: Option<string>;
}
PushRequest
, {
payload: JsonValue | undefined
payload
}) => {
const {
const userId: string
userId
} =
const ensureAuthorized: (payload: unknown) => AuthorizedSession
ensureAuthorized
(
payload: JsonValue | undefined
payload
)
await
const ensureTenantAccess: (_userId: string, _batch: SyncMessage.PushRequest["batch"]) => Promise<void>
ensureTenantAccess
(
const userId: string
userId
,
message: {
readonly batch: readonly {
readonly name: string;
readonly args: any;
readonly seqNum: number & Brand<"GlobalEventSequenceNumber">;
readonly parentSeqNum: number & Brand<"GlobalEventSequenceNumber">;
readonly clientId: string;
readonly sessionId: string;
}[];
readonly backendId: Option<string>;
}
message
.
batch: readonly {
readonly name: string;
readonly args: any;
readonly seqNum: number & Brand<"GlobalEventSequenceNumber">;
readonly parentSeqNum: number & Brand<"GlobalEventSequenceNumber">;
readonly clientId: string;
readonly sessionId: string;
}[]
batch
)
},
}) {}
```ts
const ensureTenantAccess = async (_userId: string, _batch: SyncMessage.PushRequest['batch']) => {
  // Replace with your application-specific access checks.
}
```
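As one possible shape for those access checks, the sketch below assumes a hypothetical convention where each event carries a `tenantId` argument and `lookupTenantsForUser` is an application-provided lookup; neither is part of LiveStore's API.

```typescript
// Hypothetical tenant check: assumes events carry a `tenantId` argument.
type PushedEvent = { readonly name: string; readonly args: any }

// Stub: in a real app this would query your user/tenant store.
const lookupTenantsForUser = async (userId: string): Promise<ReadonlySet<string>> =>
  new Set([`tenant-of-${userId}`])

const checkTenantAccess = async (userId: string, batch: readonly PushedEvent[]): Promise<void> => {
  const allowed = await lookupTenantsForUser(userId)
  for (const event of batch) {
    const tenantId = event.args?.tenantId
    if (typeof tenantId !== 'string' || !allowed.has(tenantId)) {
      // Rejecting here causes the whole push to fail, so clients cannot
      // smuggle cross-tenant events into an otherwise valid batch.
      throw new Error(`User ${userId} may not push event ${event.name} for tenant ${String(tenantId)}`)
    }
  }
}
```

Rejecting the whole batch on the first bad event is deliberate: a push is atomic, so partial acceptance would leave the event log inconsistent.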
```ts
export type Claims = {
  sub?: string
}
```
  • validatePayload runs once per connection and rejects mismatched tokens before LiveStore upgrades to WebSocket.
  • onPush (and onPull, if you need it) must repeat the verification because the payload forwarded to the Durable Object is the original client input.
  • The HTTP transport does not forward payloads today; embed the necessary authorization context directly in the events or move those clients to WebSocket/DO-RPC if you must rely on shared payload metadata.

You can extend ensureAuthorized to project additional claims, memoise verification per authToken, or enforce application-specific policies without changing LiveStore internals.
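One way to memoise verification per authToken is sketched below. `verifyJwt` is a placeholder for your real token verification (e.g. a JOSE library), and the unbounded cache is an assumption for brevity; a real implementation should evict entries when the token's `exp` claim passes.

```typescript
// Sketch of a memoised authorization check. `verifyJwt` is a stand-in
// for real signature verification with your JWT library of choice.
type AuthorizedSession = { userId: string }

const verifyJwt = (token: string): { sub?: string } => {
  // Placeholder: decode and verify the signature for real.
  if (token.length === 0) throw new Error('Empty token')
  return { sub: `user-for-${token}` }
}

const sessionCache = new Map<string, AuthorizedSession>()

const ensureAuthorizedMemoised = (payload: unknown): AuthorizedSession => {
  const authToken = (payload as { authToken?: string } | undefined)?.authToken
  if (typeof authToken !== 'string') throw new Error('Missing authToken in sync payload')

  // Reuse the verified session for repeated pushes on the same token.
  const cached = sessionCache.get(authToken)
  if (cached !== undefined) return cached

  const claims = verifyJwt(authToken)
  if (claims.sub === undefined) throw new Error('Token has no subject claim')
  const session: AuthorizedSession = { userId: claims.sub }
  sessionCache.set(authToken, session)
  return session
}
```

Because `onPush` runs for every batch, memoising by token keeps the cryptographic verification off the hot path while still re-checking whenever the client presents a new token.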

LiveStore’s clientId identifies a client instance; user identity is an application-level concern that you model through your own events and logic.

  • clientId: Automatically managed by LiveStore, identifies a client instance
  • User identity: Managed by your application through events and syncPayload

The syncPayload is primarily intended for authentication purposes:

```tsx
export const AuthenticatedProvider = () => (
  <LiveStoreProvider
    schema={schema}
    storeId={storeId}
    adapter={adapter}
    batchUpdates={batchUpdates}
    syncPayload={{
      authToken: user.jwt, // Using a JWT
    }}
  >
    {/* ... */}
    {children}
  </LiveStoreProvider>
)
```

User identification and semantic data (like user IDs) should typically be handled through your event payloads and application state rather than relying solely on the sync payload.
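For example, an event can carry the acting user's id in its payload so identity survives in the event log rather than living only on the connection. The event name and `createdBy` field below are hypothetical, and `currentUserId` would come from your auth state:

```typescript
// Sketch: carry user identity in event payloads instead of inferring it
// from the sync connection. Field names here are illustrative only.
type TodoCreated = {
  name: 'v1.TodoCreated'
  args: { id: string; text: string; createdBy: string }
}

const makeTodoCreated = (currentUserId: string, id: string, text: string): TodoCreated => ({
  name: 'v1.TodoCreated',
  args: { id, text, createdBy: currentUserId }, // identity travels with the event
})
```

Because the id is part of the committed event, other clients and the sync backend can attribute the change without trusting per-connection metadata.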