Cloudflare Workers

The @livestore/sync-cf package provides a comprehensive LiveStore sync provider for Cloudflare Workers. It uses Durable Objects for connectivity and, by default, persists events in the Durable Object’s own SQLite. You can optionally use Cloudflare D1 instead. Multiple transports are supported to fit different deployment scenarios.

```sh
pnpm add @livestore/sync-cf
```
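A sync backend worker also needs a Durable Object binding and, for the default persistence, SQLite-backed Durable Object storage. A minimal wrangler.toml sketch (the worker name, entry path, and class name are hypothetical):

```toml
name = "livestore-sync"
main = "src/worker.ts"
compatibility_date = "2025-01-01"

[[durable_objects.bindings]]
name = "SYNC_BACKEND_DO"
class_name = "SyncBackendDO"

# Opt the sync backend DO into SQLite-backed storage (the default event persistence)
[[migrations]]
tag = "v1"
new_sqlite_classes = ["SyncBackendDO"]
```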

The sync provider supports three transport protocols, each optimized for different use cases:

WebSocket: Real-time bidirectional communication with automatic reconnection and live pull support.

```ts
import { makeWsSync } from '@livestore/sync-cf/client'

export const syncBackend = makeWsSync({
  // URL of the sync backend; the protocol can be either http/https or ws/wss
  url: 'wss://sync.example.com',
})
```

HTTP: HTTP-based sync with polling for live updates. Requires the `enable_request_signal` compatibility flag to properly support pull streaming responses.
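Assuming a wrangler.toml configuration, the flag is enabled like this:

```toml
compatibility_flags = ["enable_request_signal"]
```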

```ts
import { makeHttpSync } from '@livestore/sync-cf/client'

export const syncBackend = makeHttpSync({
  url: 'https://sync.example.com',
  livePull: {
    pollInterval: 3000, // Poll every 3 seconds (default: 5 seconds)
  },
})
```

Durable Object RPC: Direct RPC communication between Durable Objects (used internally by @livestore/adapter-cloudflare).

```ts
import type { CfTypes, SyncBackendRpcInterface } from '@livestore/sync-cf/cf-worker'
import { makeDoRpcSync } from '@livestore/sync-cf/client'

declare const state: CfTypes.DurableObjectState
declare const syncBackendDurableObject: CfTypes.DurableObjectStub<SyncBackendRpcInterface>

export const syncBackend = makeDoRpcSync({
  // Durable Object stub that implements the sync backend RPC interface
  syncBackendStub: syncBackendDurableObject,
  // Information about this DO instance so the sync backend DO can call back to it
  durableObjectContext: {
    bindingName: 'CLIENT_DO', // see wrangler.toml for the binding name
    durableObjectId: state.id.toString(),
  },
})
```

Creates a WebSocket-based sync backend client.

Options:

  • url - WebSocket URL (supports ws/wss or http/https protocols)
  • webSocketFactory? - Custom WebSocket implementation
  • ping? - Ping configuration:
    • enabled?: boolean - Enable/disable ping (default: true)
    • requestTimeout?: Duration - Ping timeout (default: 10 seconds)
    • requestInterval?: Duration - Ping interval (default: 10 seconds)

Features:

  • Real-time live pull
  • Automatic reconnection
  • Connection status tracking
  • Ping/pong keep-alive
```ts
import { makeWsSync } from '@livestore/sync-cf/client'

export const syncBackend = makeWsSync({
  url: 'wss://sync.example.com',
  ping: {
    enabled: true, // default: true
    requestTimeout: 5000, // how long to wait for a ping response (default: 10 seconds)
    requestInterval: 15000, // how often to send ping requests (default: 10 seconds)
  },
})
```

Creates an HTTP-based sync backend client with polling for live updates.

Options:

  • url - HTTP endpoint URL
  • headers? - Additional HTTP headers
  • livePull? - Live pull configuration:
    • pollInterval?: Duration - Polling interval (default: 5 seconds)
  • ping? - Ping configuration (same as WebSocket)

Features:

  • HTTP request/response based
  • Polling-based live pull
  • Custom headers support
  • Connection status via ping
```ts
import { makeHttpSync } from '@livestore/sync-cf/client'

export const syncBackend = makeHttpSync({
  url: 'https://sync.example.com',
  headers: {
    Authorization: 'Bearer token',
    'X-Custom-Header': 'value',
  },
  livePull: {
    pollInterval: 2000, // Poll every 2 seconds (default: 5 seconds)
  },
})
```

Creates a Durable Object RPC-based sync backend (for internal use).

Options:

  • syncBackendStub - Durable Object stub implementing SyncBackendRpcInterface
  • durableObjectContext - Context for RPC callbacks:
    • bindingName - Wrangler binding name for the client DO
    • durableObjectId - Client Durable Object ID

Features:

  • Direct RPC communication
  • Real-time live pull via callbacks
  • Hibernation support
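The RPC transport assumes both Durable Objects are declared as bindings in wrangler.toml. A sketch of the client-side binding, matching the CLIENT_DO name used in the examples (the class name is illustrative):

```toml
[[durable_objects.bindings]]
name = "CLIENT_DO"
class_name = "LiveStoreClientDO"
```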

Handles RPC callback for live pull updates in Durable Objects.
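The example below imports a storeIdFromRequest helper from './shared.ts', whose implementation is not shown on this page. A minimal sketch, assuming the store ID is carried in the URL path (e.g. /stores/:id), could look like:

```typescript
// Hypothetical sketch of storeIdFromRequest; the real helper lives in './shared.ts'.
// Assumes routes shaped like /stores/:id and falls back to 'default'.
const storeIdFromRequest = (request: { url: string }): string => {
  const segments = new URL(request.url).pathname.split('/').filter(Boolean)
  return segments[1] ?? 'default'
}

console.log(storeIdFromRequest({ url: 'https://example.com/stores/my-store' })) // → 'my-store'
```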

```ts
import { DurableObject } from 'cloudflare:workers'
import { type ClientDoWithRpcCallback, createStoreDoPromise } from '@livestore/adapter-cloudflare'
import { nanoid, type Store, type Unsubscribe } from '@livestore/livestore'
import { handleSyncUpdateRpc } from '@livestore/sync-cf/client'
import type { Env } from './env.ts'
import { schema, tables } from './schema.ts'
import { storeIdFromRequest } from './shared.ts'

type AlarmInfo = {
  isRetry: boolean
  retryCount: number
}

export class LiveStoreClientDO extends DurableObject<Env> implements ClientDoWithRpcCallback {
  __DURABLE_OBJECT_BRAND: never = undefined as never
  private storeId: string | undefined
  private cachedStore: Store<typeof schema> | undefined
  private storeSubscription: Unsubscribe | undefined
  private readonly todosQuery = tables.todos.select()

  async fetch(request: Request): Promise<Response> {
    // @ts-expect-error TODO remove casts once CF types are fixed in https://github.com/cloudflare/workerd/issues/4811
    this.storeId = storeIdFromRequest(request)
    const store = await this.getStore()
    await this.subscribeToStore()
    const todos = store.query(this.todosQuery)
    return new Response(JSON.stringify(todos, null, 2), {
      headers: { 'Content-Type': 'application/json' },
    })
  }

  private async getStore() {
    if (this.cachedStore !== undefined) {
      return this.cachedStore
    }
    const storeId = this.storeId ?? nanoid()
    const store = await createStoreDoPromise({
      schema,
      // Logical identifier for the store instance persisted inside the Durable Object
      storeId,
      // Unique identifier for the client that owns the Durable Object instance
      clientId: 'client-do',
      sessionId: nanoid(),
      // Runtime details about this Durable Object, needed so the sync backend can call back
      durableObject: {
        // @ts-expect-error TODO remove once CF types are fixed in https://github.com/cloudflare/workerd/issues/4811
        ctx: this.ctx,
        env: this.env,
        bindingName: 'CLIENT_DO',
      },
      // RPC stub pointing at the sync backend Durable Object used for replication
      syncBackendStub: this.env.SYNC_BACKEND_DO.get(this.env.SYNC_BACKEND_DO.idFromName(storeId)),
      // Receive sync updates via Durable Object RPC callbacks (default: false)
      livePull: true,
    })
    this.cachedStore = store
    return store
  }

  private async subscribeToStore() {
    const store
```
todoCreated: EventDef<"v1.TodoCreated", {
readonly id: string;
readonly text: string;
}, {
readonly id: string;
readonly text: string;
}, false>;
todoCompleted: EventDef<"v1.TodoCompleted", {
readonly id: string;
}, {
readonly id: string;
}, false>;
todoUncompleted: EventDef<"v1.TodoUncompleted", {
readonly id: string;
}, {
readonly id: string;
}, false>;
todoDeleted: EventDef<"v1.TodoDeleted", {
readonly id: string;
readonly deletedAt: Date;
}, {
readonly id: string;
readonly deletedAt: string;
}, false>;
todoClearedCompleted: EventDef<...>;
};
state: InternalState;
}>, {}>
store
= await this.
LiveStoreClientDO.getStore(): Promise<Store<FromInputSchema.DeriveSchema<{
events: {
todoCreated: EventDef<"v1.TodoCreated", {
readonly id: string;
readonly text: string;
}, {
readonly id: string;
readonly text: string;
}, false>;
todoCompleted: EventDef<"v1.TodoCompleted", {
readonly id: string;
}, {
readonly id: string;
}, false>;
todoUncompleted: EventDef<"v1.TodoUncompleted", {
readonly id: string;
}, {
readonly id: string;
}, false>;
todoDeleted: EventDef<"v1.TodoDeleted", {
readonly id: string;
readonly deletedAt: Date;
}, {
readonly id: string;
readonly deletedAt: string;
}, false>;
todoClearedCompleted: EventDef<...>;
};
state: InternalState;
}>, {}>>
getStore
()
if (this.
LiveStoreClientDO.storeSubscription: Unsubscribe | undefined
storeSubscription
===
var undefined
undefined
) {
this.
LiveStoreClientDO.storeSubscription: Unsubscribe | undefined
storeSubscription
=
const store: Store<FromInputSchema.DeriveSchema<{
events: {
todoCreated: EventDef<"v1.TodoCreated", {
readonly id: string;
readonly text: string;
}, {
readonly id: string;
readonly text: string;
}, false>;
todoCompleted: EventDef<"v1.TodoCompleted", {
readonly id: string;
}, {
readonly id: string;
}, false>;
todoUncompleted: EventDef<"v1.TodoUncompleted", {
readonly id: string;
}, {
readonly id: string;
}, false>;
todoDeleted: EventDef<"v1.TodoDeleted", {
readonly id: string;
readonly deletedAt: Date;
}, {
readonly id: string;
readonly deletedAt: string;
}, false>;
todoClearedCompleted: EventDef<...>;
};
state: InternalState;
}>, {}>
store
.
Store<TSchema extends LiveStoreSchema = LiveStoreSchema.Any, TContext = {}>.subscribe: <readonly {
readonly id: string;
readonly text: string;
readonly deletedAt: Date | null;
readonly completed: boolean;
}[]>(query: Queryable<readonly {
readonly id: string;
readonly text: string;
readonly deletedAt: Date | null;
readonly completed: boolean;
}[]>, onUpdate: (value: readonly {
readonly id: string;
readonly text: string;
readonly deletedAt: Date | null;
readonly completed: boolean;
}[]) => void, options?: SubscribeOptions<readonly {
readonly id: string;
readonly text: string;
readonly deletedAt: Date | null;
readonly completed: boolean;
}[]> | undefined) => Unsubscribe (+1 overload)
subscribe
(this.
LiveStoreClientDO.todosQuery: QueryBuilder<readonly {
readonly id: string;
readonly text: string;
readonly deletedAt: Date | null;
readonly completed: boolean;
}[], TableDefBase<SqliteTableDefForInput<"todos", {
readonly id: {
columnType: "text";
schema: Schema<string, string, never>;
default: None<never>;
nullable: false;
primaryKey: true;
autoIncrement: false;
};
readonly text: {
columnType: "text";
schema: Schema<string, string, never>;
default: Some<"">;
nullable: false;
primaryKey: false;
autoIncrement: false;
};
readonly completed: {
columnType: "integer";
schema: Schema<boolean, number, never>;
default: Some<false>;
nullable: false;
primaryKey: false;
autoIncrement: false;
};
readonly deletedAt: {
...;
};
}>, WithDefaults<...>>, "select" | ... 3 more ... | "row">
todosQuery
, (
todos: readonly {
readonly id: string;
readonly text: string;
readonly completed: boolean;
readonly deletedAt: Date | null;
}[]
todos
:
interface ReadonlyArray<T>
ReadonlyArray
<typeof
const tables: {
todos: TableDef<SqliteTableDefForInput<"todos", {
readonly id: {
columnType: "text";
schema: Schema<string, string, never>;
default: None<never>;
nullable: false;
primaryKey: true;
autoIncrement: false;
};
readonly text: {
columnType: "text";
schema: Schema<string, string, never>;
default: Some<"">;
nullable: false;
primaryKey: false;
autoIncrement: false;
};
readonly completed: {
columnType: "integer";
schema: Schema<boolean, number, never>;
default: Some<false>;
nullable: false;
primaryKey: false;
autoIncrement: false;
};
readonly deletedAt: {
...;
};
}>, WithDefaults<...>, Schema<...>>;
}
tables
.
todos: TableDef<SqliteTableDefForInput<"todos", {
readonly id: {
columnType: "text";
schema: Schema<string, string, never>;
default: None<never>;
nullable: false;
primaryKey: true;
autoIncrement: false;
};
readonly text: {
columnType: "text";
schema: Schema<string, string, never>;
default: Some<"">;
nullable: false;
primaryKey: false;
autoIncrement: false;
};
readonly completed: {
columnType: "integer";
schema: Schema<boolean, number, never>;
default: Some<false>;
nullable: false;
primaryKey: false;
autoIncrement: false;
};
readonly deletedAt: {
...;
};
}>, WithDefaults<...>, Schema<...>>
todos
.
type Type: {
readonly id: string;
readonly text: string;
readonly completed: boolean;
readonly deletedAt: Date | null;
}
Type
>) => {
var console: Console

The console module provides a simple debugging console that is similar to the JavaScript console mechanism provided by web browsers.

The module exports two specific components:

  • A Console class with methods such as console.log(), console.error() and console.warn() that can be used to write to any Node.js stream.
  • A global console instance configured to write to process.stdout and process.stderr. The global console can be used without importing the node:console module.

Warning: The global console object's methods are neither consistently synchronous like the browser APIs they resemble, nor are they consistently asynchronous like all other Node.js streams. See the note on process I/O for more information.

Example using the global console:

console.log('hello world');
// Prints: hello world, to stdout
console.log('hello %s', 'world');
// Prints: hello world, to stdout
console.error(new Error('Whoops, something bad happened'));
// Prints error message and stack trace to stderr:
// Error: Whoops, something bad happened
// at [eval]:5:15
// at Script.runInThisContext (node:vm:132:18)
// at Object.runInThisContext (node:vm:309:38)
// at node:internal/process/execution:77:19
// at [eval]-wrapper:6:22
// at evalScript (node:internal/process/execution:76:60)
// at node:internal/main/eval_string:23:3
const name = 'Will Robinson';
console.warn(`Danger ${name}! Danger!`);
// Prints: Danger Will Robinson! Danger!, to stderr

Example using the Console class:

const out = getStreamSomehow();
const err = getStreamSomehow();
const myConsole = new console.Console(out, err);
myConsole.log('hello world');
// Prints: hello world, to out
myConsole.log('hello %s', 'world');
// Prints: hello world, to out
myConsole.error(new Error('Whoops, something bad happened'));
// Prints: [Error: Whoops, something bad happened], to err
const name = 'Will Robinson';
myConsole.warn(`Danger ${name}! Danger!`);
// Prints: Danger Will Robinson! Danger!, to err

@seesource

console
.
Console.log(message?: any, ...optionalParams: any[]): void (+3 overloads)

Prints to stdout with newline. Multiple arguments can be passed, with the first used as the primary message and all additional used as substitution values similar to printf(3) (the arguments are all passed to util.format()).

const count = 5;
console.log('count: %d', count);
// Prints: count: 5, to stdout
console.log('count:', count);
// Prints: count: 5, to stdout

See util.format() for more information.

@sincev0.1.100

log
(`todos for store (${this.
LiveStoreClientDO.storeId: string | undefined
storeId
})`,
todos: readonly {
readonly id: string;
readonly text: string;
readonly completed: boolean;
readonly deletedAt: Date | null;
}[]
todos
)
})
}
await this.
CloudflareWorkersModule.DurableObject<Env, {}>.ctx: DurableObjectState<{}>
ctx
.
DurableObjectState<{}>.storage: DurableObjectStorage
storage
.
DurableObjectStorage.setAlarm(scheduledTime: number | Date, options?: DurableObjectSetAlarmOptions): Promise<void>
setAlarm
(
var Date: DateConstructor

Enables basic storage and retrieval of dates and times.

Date
.
DateConstructor.now(): number

Returns the number of milliseconds elapsed since midnight, January 1, 1970 Universal Coordinated Time (UTC).

now
() + 1000)
}
LiveStoreClientDO.alarm(_alarmInfo?: AlarmInfo): void | Promise<void>
alarm
(
_alarmInfo: AlarmInfo | undefined
_alarmInfo
?:
type AlarmInfo = {
isRetry: boolean;
retryCount: number;
}
AlarmInfo
): void |
interface Promise<T>

Represents the completion of an asynchronous operation

Promise
<void> {
return this.
LiveStoreClientDO.subscribeToStore(): Promise<void>
subscribeToStore
()
}
async
LiveStoreClientDO.syncUpdateRpc(payload: unknown): Promise<void>
syncUpdateRpc
(
payload: unknown
payload
: unknown) {
await
function handleSyncUpdateRpc(payload: unknown): Promise<void>

import { DurableObject } from 'cloudflare:workers'
import { ClientDoWithRpcCallback } from '@livestore/common-cf'
export class MyDurableObject extends DurableObject implements ClientDoWithRpcCallback {
// ...
async syncUpdateRpc(payload: RpcMessage.ResponseChunkEncoded) {
return handleSyncUpdateRpc(payload)
}
}

handleSyncUpdateRpc
(
payload: unknown
payload
)
}
}
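The `getStore`/`cachedStore` pair above is the memoised async-singleton pattern: initialise once, reuse afterwards. A standalone sketch of the idea (names hypothetical; unlike the example above, this variant caches the promise, so overlapping concurrent callers also share a single initialisation):

```typescript
// Memoised async factory: setup runs once; all later (and concurrent) callers share it.
let cachedStore: Promise<{ id: number }> | undefined

const getStore = (): Promise<{ id: number }> => {
  if (cachedStore === undefined) {
    // Stand-in for expensive store creation.
    cachedStore = Promise.resolve({ id: Math.floor(Math.random() * 1e9) })
  }
  return cachedStore
}

const main = async () => {
  const a = await getStore()
  const b = await getStore()
  console.log(a === b) // true: both callers got the same instance
}
main()
```

Caching the resolved instance (as the Durable Object example does) works equally well here because Durable Object request handling gives you natural serialisation; caching the promise is the more general-purpose form.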

Creates a sync backend Durable Object class.

Options:

  • onPush? - Callback for push events: (message, context) => void | Promise<void>
  • onPushRes? - Callback for push responses: (message) => void | Promise<void>
  • onPull? - Callback for pull requests: (message, context) => void | Promise<void>
  • onPullRes? - Callback for pull responses: (message) => void | Promise<void>
  • storage? - Storage engine: { _tag: 'do-sqlite' } | { _tag: 'd1', binding: string } (default: do-sqlite)
  • enabledTransports? - Set of enabled transports: Set<'http' | 'ws' | 'do-rpc'>
  • otel? - OpenTelemetry configuration:
    • baseUrl? - OTEL endpoint URL
    • serviceName? - Service name for traces
import { makeDurableObject } from '@livestore/sync-cf/cf-worker'

const hasUserId = (p: unknown): p is { userId: string } =>
  typeof p === 'object' && p !== undefined && p !== null && 'userId' in p

export class SyncBackendDO extends makeDurableObject({
  onPush: async (message, { storeId, payload }) => {
    console.log(`Push to store ${storeId}:`, message.batch)
    // Custom business logic
    if (hasUserId(payload)) {
      await Promise.resolve()
    }
  },
  onPull: async (_message, { storeId }) => {
    console.log(`Pull from store ${storeId}`)
  },
  enabledTransports: new Set(['ws', 'http']), // Disable DO RPC
  otel: {
    baseUrl: 'https://otel.example.com',
    serviceName: 'livestore-sync',
  },
}) {}
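The resulting class must be exposed through a Durable Object binding in your wrangler configuration; a minimal sketch matching the `SYNC_BACKEND_DO` binding name used throughout this page (a sync Durable Object requires SQLite-backed storage, hence `new_sqlite_classes`):

```toml
[[durable_objects.bindings]]
name = "SYNC_BACKEND_DO"
class_name = "SyncBackendDO"

[[migrations]]
tag = "v1"
new_sqlite_classes = ["SyncBackendDO"]
```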

Creates a complete Cloudflare Worker for the sync backend.

Options:

  • syncBackendBinding - Durable Object binding name defined in wrangler.toml
  • validatePayload? - Payload validation function: (payload, context) => void | Promise<void>
  • enableCORS? - Enable CORS headers (default: false)

makeWorker is a quick way to get started in simple demos. Most production workers need to share routing logic with other endpoints, so prefer wiring up your own fetch handler and calling handleSyncRequest when you detect a sync request. A minimal example:

import type { CFWorker, CfTypes } from '@livestore/sync-cf/cf-worker'
import { handleSyncRequest, matchSyncRequest } from '@livestore/sync-cf/cf-worker'
import type { Env } from './env.ts'

export default {
  fetch: async (request: CfTypes.Request, env: Env, ctx: CfTypes.ExecutionContext) => {
    const searchParams = matchSyncRequest(request)
    // Is LiveStore sync request
    if (searchParams !== undefined) {
      return handleSyncRequest({
        request,
        searchParams,
        env,
        ctx,
        syncBackendBinding: 'SYNC_BACKEND_DO',
      })
    }
    // Custom routes, assets, etc.
    return new Response('Not found', { status: 404 }) as unknown as CfTypes.Response
  },
} satisfies CFWorker<Env>
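matchSyncRequest keys off the request's search parameters (`?storeId=...&transport=...`) and returns undefined for non-sync traffic so you can fall through to your own routing. A simplified, hypothetical re-implementation of that matching over a plain URL string (not the library's actual code):

```typescript
type SyncParams = { storeId: string; transport: 'http' | 'ws' }

// Returns the sync parameters, or undefined so callers can fall back to custom routing.
const matchSyncParams = (url: string): SyncParams | undefined => {
  const params = new URL(url).searchParams
  const storeId = params.get('storeId')
  const transport = params.get('transport')
  if (storeId === null || (transport !== 'http' && transport !== 'ws')) return undefined
  return { storeId, transport }
}

console.log(matchSyncParams('https://sync.example.com/?storeId=todos&transport=ws'))
// { storeId: 'todos', transport: 'ws' }
console.log(matchSyncParams('https://sync.example.com/assets/app.js'))
// undefined
```

Returning undefined rather than throwing keeps the non-sync branch of the worker's fetch handler trivial.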
import { makeWorker } from '@livestore/sync-cf/cf-worker'

export default makeWorker({
  syncBackendBinding: 'SYNC_BACKEND_DO',
  validatePayload: (payload, { storeId }) => {
    // Simple token-based guard at connection time
    const hasAuthToken = typeof payload === 'object' && payload !== null && 'authToken' in payload
    if (!hasAuthToken) {
      throw new Error('Missing auth token')
    }
    if ((payload as any).authToken !== 'insecure-token-change-me') {
      throw new Error('Invalid auth token')
    }
    console.log(`Validated connection for store: ${storeId}`)
  },
  enableCORS: true,
})

Handles sync backend HTTP requests in custom workers.

Options:

  • request - The incoming request
  • searchParams - Parsed sync request parameters
  • env - Worker environment
  • ctx - Worker execution context
  • syncBackendBinding - Durable Object binding name defined in wrangler.toml
  • headers? - Response headers
  • validatePayload? - Payload validation function
import type { CFWorker, CfTypes } from '@livestore/sync-cf/cf-worker'
import { handleSyncRequest, matchSyncRequest } from '@livestore/sync-cf/cf-worker'
import type { Env } from './env.ts'

export default {
  fetch: async (request: CfTypes.Request, env: Env, ctx: CfTypes.ExecutionContext) => {
    const searchParams = matchSyncRequest(request)
    // Is LiveStore sync request
    if (searchParams !== undefined) {
      return handleSyncRequest({
        request,
        searchParams,
        env,
        ctx,
        syncBackendBinding: 'SYNC_BACKEND_DO',
        headers: { 'X-Custom': 'header' },
        validatePayload: (payload, { storeId }) => {
          // Custom validation logic
          if (!(typeof payload === 'object' && payload !== null && 'authToken' in payload)) {
            throw new Error('Missing auth token')
          }
          console.log('Validating store', storeId)
        },
      })
    }
    return new Response('Not found', { status: 404 }) as unknown as CfTypes.Response
  },
} satisfies CFWorker<Env>

Parses and validates sync request search parameters.

Returns the decoded search params or undefined if the request is not a LiveStore sync request.

import type { CfTypes } from '@livestore/sync-cf/cf-worker'
import { matchSyncRequest } from '@livestore/sync-cf/cf-worker'

declare const request: CfTypes.Request

const searchParams = matchSyncRequest(request)
if (searchParams !== undefined) {
  const { storeId, payload, transport } = searchParams
  console.log(`Sync request for store ${storeId} via ${transport}`)
  console.log(payload)
}

Configure your wrangler.toml for sync backend deployment (default: DO SQLite storage):

name = "livestore-sync"
main = "./src/worker.ts"
compatibility_date = "2025-05-07"
compatibility_flags = [
"enable_request_signal", # Required for HTTP streaming
]
[[durable_objects.bindings]]
name = "SYNC_BACKEND_DO"
class_name = "SyncBackendDO"
[[migrations]]
tag = "v1"
new_sqlite_classes = ["SyncBackendDO"]
[vars]
ADMIN_SECRET = "your-admin-secret"

To use D1 instead of DO SQLite, add a D1 binding and reference it from makeDurableObject({ storage: { _tag: 'd1', binding: '...' } }):

[[d1_databases]]
binding = "DB"
database_name = "livestore-sync"
database_id = "your-database-id"
[vars]
ADMIN_SECRET = "your-admin-secret"
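Assuming the D1 binding above is named `DB`, wiring it into the sync Durable Object could look like this sketch (the `storage` option shape follows the paragraph above; this is illustrative, not a complete worker):

```typescript
// Sketch: opt the sync Durable Object into D1 storage via the "DB" binding
import { makeDurableObject } from '@livestore/sync-cf/cf-worker'

export class SyncBackendDO extends makeDurableObject({
  storage: { _tag: 'd1', binding: 'DB' },
}) {}
```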

Required environment bindings:

import type { CfTypes, SyncBackendRpcInterface } from '@livestore/sync-cf/cf-worker'

export interface Env {
  ADMIN_SECRET: string // Admin authentication
  SYNC_BACKEND_DO: CfTypes.DurableObjectNamespace<SyncBackendRpcInterface>
}

LiveStore identifies sync requests purely by search parameters; the request path does not matter. Use matchSyncRequest(request) to detect sync traffic.

Required search parameters:

Param       Type                 Required  Description
storeId     string               Yes       Target LiveStore identifier.
transport   'ws' | 'http'        Yes       Transport protocol selector.
payload     JSON (URI-encoded)   No        Arbitrary JSON used for auth/tenant routing; validated in validatePayload.

Examples (any path):

  • WebSocket: https://sync.example.com?storeId=abc&transport=ws (must include Upgrade: websocket)
  • HTTP: https://sync.example.com?storeId=abc&transport=http

Notes:

  • For transport=ws, if the request is not a WebSocket upgrade, the backend returns 426 Upgrade Required.
  • transport='do-rpc' is internal for Durable Object RPC and not exposed via URL parameters.
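As a sketch of how a client could assemble such a URL (the host `sync.example.com`, the token value, and the `buildSyncUrl` helper are illustrative, not part of the package):

```typescript
// Sketch: build a sync request URL with a URI-encoded JSON payload.
// URLSearchParams handles the URI encoding of the payload string.
const buildSyncUrl = (storeId: string, transport: 'ws' | 'http', payload?: unknown): string => {
  const params = new URLSearchParams({ storeId, transport })
  if (payload !== undefined) params.set('payload', JSON.stringify(payload))
  return `https://sync.example.com/?${params.toString()}`
}

console.log(buildSyncUrl('abc', 'ws', { authToken: 'insecure-token-change-me' }))
```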

By default, events are stored in the Durable Object’s SQLite with tables following the pattern:

eventlog_{PERSISTENCE_FORMAT_VERSION}_{storeId}

You can opt into D1 with the same table shape. The persistence format version is automatically managed and incremented when the storage schema changes.
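A minimal sketch of the naming pattern above (the version value `5` is a placeholder; the real `PERSISTENCE_FORMAT_VERSION` is managed internally by the package):

```typescript
// Sketch: derive the event-log table name for a given store.
// The version constant here is a stand-in for the package-managed value.
const PERSISTENCE_FORMAT_VERSION = 5

const eventlogTable = (storeId: string): string =>
  `eventlog_${PERSISTENCE_FORMAT_VERSION}_${storeId}`

console.log(eventlogTable('abc'))
```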

  • DO SQLite (default)
    • Pros: easiest deploy (no D1), data co-located with the DO, lowest latency
    • Cons: not directly inspectable outside the DO; operational tooling must go through the DO
  • D1 (optional)
    • Pros: inspectable using D1 tools/clients; enables cross-store analytics outside DOs
    • Cons: extra hop, JSON response size considerations; requires D1 provisioning

Deploy to Cloudflare Workers:

Terminal window
# Deploy the worker
npx wrangler deploy
# Create D1 database (only needed when using D1 storage)
npx wrangler d1 create livestore-sync
# Run migrations if needed
npx wrangler d1 migrations apply livestore-sync

Run locally with Wrangler:

Terminal window
# Start local development server
npx wrangler dev
# Access local D1 database
# Located at: .wrangler/state/d1/miniflare-D1DatabaseObject/XXX.sqlite
import { makeWorker } from '@livestore/adapter-web/worker'
import { makeWsSync } from '@livestore/sync-cf/client'
import { schema } from './schema.ts'

makeWorker({
  schema,
  sync: {
    backend: makeWsSync({
      url: 'wss://sync.example.com',
    }),
  },
})
import { makeDurableObject, makeWorker } from '@livestore/sync-cf/cf-worker'

export class SyncBackendDO extends makeDurableObject({
  onPush: async (message, { storeId }) => {
    // Log all sync events
    console.log(`Store ${storeId} received ${message.batch.length} events`)
  },
}) {}

const hasStoreAccess = (_userId: string, _storeId: string): boolean => true

export default makeWorker({
  syncBackendBinding: 'SYNC_BACKEND_DO',
  validatePayload: (payload, { storeId }) => {
    if (!(typeof payload === 'object' && payload !== null && 'userId' in payload)) {
      throw new Error('User ID required')
    }
    // Validate user has access to store
    if (!hasStoreAccess((payload as any).userId as string, storeId)) {
      throw new Error('Unauthorized access to store')
    }
  },
  enableCORS: true,
})
import { makeDurableObject } from '@livestore/sync-cf/cf-worker'

// makeDurableObject creates a Durable Object class for handling sync. A sync
// Durable Object is uniquely scoped to a specific storeId and supports three
// transport modes:
//   - http:   HTTP JSON-RPC
//   - ws:     WebSocket
//   - do-rpc: Durable Object RPC calls (only works in combination with
//             @livestore/adapter-cf)
//
// The class must be declared in your wrangler config:
//
//   [[durable_objects.bindings]]
//   name = "SYNC_BACKEND_DO"
//   class_name = "SyncBackendDO"
//
//   [[migrations]]
//   tag = "v1"
//   new_sqlite_classes = ["SyncBackendDO"]

type Transport = 'http' | 'ws' | 'do-rpc'

const getTransportFromContext = (ctx: unknown): Transport => {
  if (typeof ctx === 'object' && ctx !== null && 'transport' in (ctx as any)) {
    const t = (ctx as any).transport
    if (t === 'http' || t === 'ws' || t === 'do-rpc') return t
  }
  return 'http'
}
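To make the fallback behaviour of this normalization concrete, here it is exercised standalone (the helper body is repeated so the snippet is self-contained; whether the hook context actually carries a `transport` field is an assumption of this example, not a documented guarantee):

```typescript
// Self-contained copy of the helper above, plus the cases it handles.
type Transport = 'http' | 'ws' | 'do-rpc'

const getTransportFromContext = (ctx: unknown): Transport => {
  if (typeof ctx === 'object' && ctx !== null && 'transport' in (ctx as any)) {
    const t = (ctx as any).transport
    if (t === 'http' || t === 'ws' || t === 'do-rpc') return t
  }
  return 'http'
}

// A context tagged with a known transport passes through unchanged.
getTransportFromContext({ transport: 'do-rpc' }) // → 'do-rpc'
// Missing or unrecognized transport information falls back to 'http'.
getTransportFromContext(null) // → 'http'
getTransportFromContext({ transport: 'smoke-signal' }) // → 'http'
```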
export class SyncBackendDO extends makeDurableObject({
  // Enable all transport modes (this is also the default).
  enabledTransports: new Set<Transport>(['http', 'ws', 'do-rpc']),
  onPush: async (message, context) => {
    const transport = getTransportFromContext(context)
    console.log(`Push via ${transport}:`, message.batch.length)
  },
}) {}
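Beyond logging the batch length, an `onPush` hook often wants a structured summary. The sketch below derives one from the event shape shown in the `onPush` signature (`name`, `args`, `seqNum`, `parentSeqNum`, `clientId`, `sessionId`); `summarizeBatch` is a hypothetical helper, not a library export:

```typescript
// Shape of one event in message.batch, per the onPush signature.
// (seqNum/parentSeqNum are branded numbers in the real types; plain
// numbers suffice for this sketch.)
interface BatchEvent {
  name: string
  args: unknown
  seqNum: number
  parentSeqNum: number
  clientId: string
  sessionId: string
}

// Hypothetical helper: counts events per client for structured logging.
const summarizeBatch = (batch: ReadonlyArray<BatchEvent>) => {
  const eventsPerClient = new Map<string, number>()
  for (const event of batch) {
    eventsPerClient.set(event.clientId, (eventsPerClient.get(event.clientId) ?? 0) + 1)
  }
  return {
    total: batch.length,
    clients: [...eventsPerClient.entries()].map(([clientId, count]) => ({ clientId, count })),
  }
}
```

Inside `onPush` this could replace the plain length log with `console.log('Push summary:', summarizeBatch(message.batch))`.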