Cloudflare Workers
The @livestore/sync-cf package provides a comprehensive LiveStore sync provider for Cloudflare Workers. It uses Durable Objects for connectivity and, by default, persists events in the Durable Object’s own SQLite. You can optionally use Cloudflare D1 instead. Multiple transports are supported to fit different deployment scenarios.
Installation
```sh
pnpm add @livestore/sync-cf
```

Transport Modes
The sync provider supports three transport protocols, each optimized for different use cases:
WebSocket Transport (Recommended)
Real-time bidirectional communication with automatic reconnection and live pull support.
```ts
import { makeWsSync } from '@livestore/sync-cf/client'

export const syncBackend = makeWsSync({
  // The protocol can be either http/https or ws/wss
  url: 'wss://sync.example.com',
})
```
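Whichever transport you pick, the returned sync backend constructor is handed to your LiveStore adapter when the store is created. The following is only a rough sketch, assuming the web adapter's `makeWorker` entry point from `@livestore/adapter-web/worker` and a local `./schema.ts` module; adjust it to the adapter you actually use:

```ts
// livestore.worker.ts — illustrative sketch, not part of @livestore/sync-cf
import { makeWorker } from '@livestore/adapter-web/worker'
import { makeWsSync } from '@livestore/sync-cf/client'

import { schema } from './schema.ts'

// Wire the sync backend constructor into the adapter's worker entry point
makeWorker({
  schema,
  sync: { backend: makeWsSync({ url: 'wss://sync.example.com' }) },
})
```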
HTTP Transport
HTTP-based sync with polling for live updates. Requires the `enable_request_signal` compatibility flag.
```ts
import { makeHttpSync } from '@livestore/sync-cf/client'

export const syncBackend = makeHttpSync({
  url: 'https://sync.example.com',
  livePull: {
    pollInterval: 3000, // Poll every 3 seconds
  },
})
```
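Because the HTTP transport depends on the `enable_request_signal` compatibility flag, the Worker hosting the sync backend has to opt into it in its Wrangler configuration. A minimal sketch; everything except the flag name is an illustrative placeholder:

```toml
# wrangler.toml — sketch; name, entry point, and date are placeholders
name = "sync-backend"
main = "src/index.ts"
compatibility_date = "2024-01-01"
compatibility_flags = ["enable_request_signal"]
```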
Durable Object RPC Transport
Direct RPC communication between Durable Objects (used internally by `@livestore/adapter-cloudflare`).
```ts
import type { CfTypes, SyncBackendRpcInterface } from '@livestore/sync-cf/cf-worker'
import { makeDoRpcSync } from '@livestore/sync-cf/client'

declare const state: CfTypes.DurableObjectState
declare const syncBackendDurableObject: CfTypes.DurableObjectStub<SyncBackendRpcInterface>

export const syncBackend = makeDoRpcSync({
  // Durable Object stub that implements the SyncDoRpc interface
  syncBackendStub: syncBackendDurableObject,
  // Information about this Durable Object instance so the sync DO can call back to it
  durableObjectContext: {
    // See wrangler.toml for the binding name
    bindingName: 'CLIENT_DO',
    // state.id.toString() in the DO
    durableObjectId: state.id.toString(),
  },
})
```
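For the RPC transport, both Durable Objects must be declared in the Worker's Wrangler configuration, and the binding names have to match the `bindingName` values used in code (`CLIENT_DO` above, `SYNC_BACKEND_DO` in the full example below). A sketch where the class names are assumptions: `LiveStoreClientDO` matches the example further down, while `SyncBackendDO` stands in for your sync backend class:

```toml
# wrangler.toml — sketch; class names are assumptions
[[durable_objects.bindings]]
name = "CLIENT_DO"
class_name = "LiveStoreClientDO"

[[durable_objects.bindings]]
name = "SYNC_BACKEND_DO"
class_name = "SyncBackendDO"

[[migrations]]
tag = "v1"
new_sqlite_classes = ["LiveStoreClientDO", "SyncBackendDO"]
```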
Client API Reference
makeWsSync(options)
Creates a WebSocket-based sync backend client.
Options:
- `url` - WebSocket URL (supports `ws`/`wss` or `http`/`https` protocols)
- `webSocketFactory?` - Custom WebSocket implementation
- `ping?` - Ping configuration:
  - `enabled?: boolean` - Enable/disable ping (default: `true`)
  - `requestTimeout?: Duration` - Ping timeout (default: 10 seconds)
  - `requestInterval?: Duration` - Ping interval (default: 10 seconds)
Features:
- Real-time live pull
- Automatic reconnection
- Connection status tracking
- Ping/pong keep-alive
```ts
import { makeWsSync } from '@livestore/sync-cf/client'

export const syncBackend = makeWsSync({
  url: 'wss://sync.example.com',
  ping: {
    enabled: true,
    // How long to wait for a ping response before timing out
    requestTimeout: 5000,
    // How often to send ping requests
    requestInterval: 15000,
  },
})
```
makeHttpSync(options)
Creates an HTTP-based sync backend client with polling for live updates.
Options:
- `url` - HTTP endpoint URL
- `headers?` - Additional HTTP headers
- `livePull?` - Live pull configuration:
  - `pollInterval?: Duration` - Polling interval (default: 5 seconds)
- `ping?` - Ping configuration (same as WebSocket)
Features:
- HTTP request/response based
- Polling-based live pull
- Custom headers support
- Connection status via ping
```ts
import { makeHttpSync } from '@livestore/sync-cf/client'

export const syncBackend = makeHttpSync({
  url: 'https://sync.example.com',
  headers: {
    Authorization: 'Bearer token',
    'X-Custom-Header': 'value',
  },
  livePull: {
    pollInterval: 2000, // Poll every 2 seconds
  },
})
```
makeDoRpcSync(options)
Creates a Durable Object RPC-based sync backend (for internal use).
Options:
- `syncBackendStub` - Durable Object stub implementing `SyncBackendRpcInterface`
- `durableObjectContext` - Context for RPC callbacks:
  - `bindingName` - Wrangler binding name for the client DO
  - `durableObjectId` - Client Durable Object ID
Features:
- Direct RPC communication
- Real-time live pull via callbacks
- Hibernation support
handleSyncUpdateRpc(payload)
Handles the RPC callback for live pull updates in Durable Objects.
import { class DurableObject<Env = Cloudflare.Env, Props = {}>
DurableObject } from 'cloudflare:workers'import { type (alias) interface ClientDoWithRpcCallbackimport ClientDoWithRpcCallback
ClientDoWithRpcCallback, const createStoreDoPromise: <TSchema extends LiveStoreSchema, TEnv, TState extends DurableObjectState = DurableObjectState<unknown>>(options: CreateStoreDoOptions<TSchema, TEnv, TState>) => Promise<Store<TSchema, {}>>
createStoreDoPromise } from '@livestore/adapter-cloudflare'import { function nanoid<Type extends string>(size?: number): Type
Generate secure URL-friendly unique ID.
By default, the ID will have 21 symbols to have a collision probability
similar to UUID v4.
import { nanoid } from 'nanoid'model.id = nanoid() //=> "Uakgb_J5m9g-0JDMbcJqL"
nanoid, type class Store<TSchema extends LiveStoreSchema = LiveStoreSchema.Any, TContext = {}>
Store, type type Unsubscribe = () => void
Unsubscribe } from '@livestore/livestore'import { const handleSyncUpdateRpc: (payload: unknown) => Promise<void>
import { DurableObject } from 'cloudflare:workers'import { ClientDoWithRpcCallback } from '@livestore/common-cf'
export class MyDurableObject extends DurableObject implements ClientDoWithRpcCallback { // ...
async syncUpdateRpc(payload: RpcMessage.ResponseChunkEncoded) { return handleSyncUpdateRpc(payload) }}
handleSyncUpdateRpc } from '@livestore/sync-cf/client'import type { type Env = { CLIENT_DO: DurableObjectNamespace<ClientDoWithRpcCallback>; SYNC_BACKEND_DO: DurableObjectNamespace<SyncBackendRpcInterface>; DB: D1Database; ADMIN_SECRET: string;}
Env } from './env.ts'import { const schema: FromInputSchema.DeriveSchema<{ events: { todoCreated: EventDef<"v1.TodoCreated", { readonly id: string; readonly text: string; }, { readonly id: string; readonly text: string; }, false>; todoCompleted: EventDef<"v1.TodoCompleted", { readonly id: string; }, { readonly id: string; }, false>; todoUncompleted: EventDef<"v1.TodoUncompleted", { readonly id: string; }, { readonly id: string; }, false>; todoDeleted: EventDef<"v1.TodoDeleted", { readonly id: string; readonly deletedAt: Date; }, { readonly id: string; readonly deletedAt: string; }, false>; todoClearedCompleted: EventDef<...>; }; state: InternalState;}>
schema, const tables: { todos: TableDef<SqliteTableDefForInput<"todos", { readonly id: { columnType: "text"; schema: Schema<string, string, never>; default: None<never>; nullable: false; primaryKey: true; autoIncrement: false; }; readonly text: { columnType: "text"; schema: Schema<string, string, never>; default: Some<"">; nullable: false; primaryKey: false; autoIncrement: false; }; readonly completed: { columnType: "integer"; schema: Schema<boolean, number, never>; default: Some<false>; nullable: false; primaryKey: false; autoIncrement: false; }; readonly deletedAt: { columnType: "integer"; ... 4 more ...; autoIncrement: false; }; }>, WithDefaults<...>, Schema<...>>;}
tables } from './schema.ts'import { const storeIdFromRequest: (request: Request) => string
storeIdFromRequest } from './shared.ts'
type type AlarmInfo = { isRetry: boolean; retryCount: number;}
AlarmInfo = { isRetry: boolean
isRetry: boolean retryCount: number
retryCount: number}
export class class LiveStoreClientDO
LiveStoreClientDO extends class DurableObject<Env = Cloudflare.Env, Props = {}>
DurableObject<type Env = { CLIENT_DO: DurableObjectNamespace<ClientDoWithRpcCallback>; SYNC_BACKEND_DO: DurableObjectNamespace<SyncBackendRpcInterface>; DB: D1Database; ADMIN_SECRET: string;}
Env> implements (alias) interface ClientDoWithRpcCallbackimport ClientDoWithRpcCallback
ClientDoWithRpcCallback { LiveStoreClientDO.__DURABLE_OBJECT_BRAND: never
__DURABLE_OBJECT_BRAND: never = var undefined
undefined as never
private LiveStoreClientDO.storeId: string | undefined
storeId: string | undefined private LiveStoreClientDO.cachedStore: Store<FromInputSchema.DeriveSchema<{ events: { todoCreated: EventDef<"v1.TodoCreated", { readonly id: string; readonly text: string; }, { readonly id: string; readonly text: string; }, false>; todoCompleted: EventDef<"v1.TodoCompleted", { readonly id: string; }, { readonly id: string; }, false>; todoUncompleted: EventDef<"v1.TodoUncompleted", { readonly id: string; }, { readonly id: string; }, false>; todoDeleted: EventDef<"v1.TodoDeleted", { readonly id: string; readonly deletedAt: Date; }, { readonly id: string; readonly deletedAt: string; }, false>; todoClearedCompleted: EventDef<...>; }; state: InternalState;}>, {}> | undefined
cachedStore: class Store<TSchema extends LiveStoreSchema = LiveStoreSchema.Any, TContext = {}>
Store<typeof const schema: FromInputSchema.DeriveSchema<{ events: { todoCreated: EventDef<"v1.TodoCreated", { readonly id: string; readonly text: string; }, { readonly id: string; readonly text: string; }, false>; todoCompleted: EventDef<"v1.TodoCompleted", { readonly id: string; }, { readonly id: string; }, false>; todoUncompleted: EventDef<"v1.TodoUncompleted", { readonly id: string; }, { readonly id: string; }, false>; todoDeleted: EventDef<"v1.TodoDeleted", { readonly id: string; readonly deletedAt: Date; }, { readonly id: string; readonly deletedAt: string; }, false>; todoClearedCompleted: EventDef<...>; }; state: InternalState;}>
schema> | undefined private LiveStoreClientDO.storeSubscription: Unsubscribe | undefined
storeSubscription: type Unsubscribe = () => void
Unsubscribe | undefined private readonly LiveStoreClientDO.todosQuery: QueryBuilder<readonly { readonly id: string; readonly text: string; readonly deletedAt: Date | null; readonly completed: boolean;}[], TableDefBase<SqliteTableDefForInput<"todos", { readonly id: { columnType: "text"; schema: Schema<string, string, never>; default: None<never>; nullable: false; primaryKey: true; autoIncrement: false; }; readonly text: { columnType: "text"; schema: Schema<string, string, never>; default: Some<"">; nullable: false; primaryKey: false; autoIncrement: false; }; readonly completed: { columnType: "integer"; schema: Schema<boolean, number, never>; default: Some<false>; nullable: false; primaryKey: false; autoIncrement: false; }; readonly deletedAt: { ...; };}>, WithDefaults<...>>, "select" | ... 3 more ... | "row">
todosQuery = const tables: { todos: TableDef<SqliteTableDefForInput<"todos", { readonly id: { columnType: "text"; schema: Schema<string, string, never>; default: None<never>; nullable: false; primaryKey: true; autoIncrement: false; }; readonly text: { columnType: "text"; schema: Schema<string, string, never>; default: Some<"">; nullable: false; primaryKey: false; autoIncrement: false; }; readonly completed: { columnType: "integer"; schema: Schema<boolean, number, never>; default: Some<false>; nullable: false; primaryKey: false; autoIncrement: false; }; readonly deletedAt: { ...; }; }>, WithDefaults<...>, Schema<...>>;}
tables.todos: TableDef<SqliteTableDefForInput<"todos", { readonly id: { columnType: "text"; schema: Schema<string, string, never>; default: None<never>; nullable: false; primaryKey: true; autoIncrement: false; }; readonly text: { columnType: "text"; schema: Schema<string, string, never>; default: Some<"">; nullable: false; primaryKey: false; autoIncrement: false; }; readonly completed: { columnType: "integer"; schema: Schema<boolean, number, never>; default: Some<false>; nullable: false; primaryKey: false; autoIncrement: false; }; readonly deletedAt: { ...; };}>, WithDefaults<...>, Schema<...>>
todos.select: <"id" | "text" | "deletedAt" | "completed">(...columns: ("id" | "text" | "deletedAt" | "completed")[]) => QueryBuilder<readonly { readonly id: string; readonly text: string; readonly deletedAt: Date | null; readonly completed: boolean;}[], TableDefBase<SqliteTableDefForInput<"todos", { readonly id: { columnType: "text"; schema: Schema<string, string, never>; default: None<never>; nullable: false; primaryKey: true; autoIncrement: false; }; readonly text: { columnType: "text"; schema: Schema<string, string, never>; default: Some<"">; nullable: false; primaryKey: false; autoIncrement: false; }; readonly completed: { ...; }; readonly deletedAt: { ...; };}>, WithDefaults<...>>, "select" | ... 3 more ... | "row"> (+1 overload)
Select multiple columns
select()
async LiveStoreClientDO.fetch(request: Request): Promise<Response>
fetch(request: Request<unknown, CfProperties<unknown>>
request: interface Request<CfHostMetadata = unknown, Cf = CfProperties<CfHostMetadata>>
The Request interface of the Fetch API represents a resource request.
This Fetch API interface represents a resource request.
Request): interface Promise<T>
Represents the completion of an asynchronous operation
Promise<interface Response
The Response interface of the Fetch API represents the response to a request.
This Fetch API interface represents the response to a request.
Response> { // @ts-expect-error TODO remove casts once CF types are fixed in https://github.com/cloudflare/workerd/issues/4811 this.LiveStoreClientDO.storeId: string | undefined
storeId = function storeIdFromRequest(request: Request): string
storeIdFromRequest(request: Request<unknown, CfProperties<unknown>>
request)
const const store: Store<FromInputSchema.DeriveSchema<{ events: { todoCreated: EventDef<"v1.TodoCreated", { readonly id: string; readonly text: string; }, { readonly id: string; readonly text: string; }, false>; todoCompleted: EventDef<"v1.TodoCompleted", { readonly id: string; }, { readonly id: string; }, false>; todoUncompleted: EventDef<"v1.TodoUncompleted", { readonly id: string; }, { readonly id: string; }, false>; todoDeleted: EventDef<"v1.TodoDeleted", { readonly id: string; readonly deletedAt: Date; }, { readonly id: string; readonly deletedAt: string; }, false>; todoClearedCompleted: EventDef<...>; }; state: InternalState;}>, {}>
store = await this.LiveStoreClientDO.getStore(): Promise<Store<FromInputSchema.DeriveSchema<{ events: { todoCreated: EventDef<"v1.TodoCreated", { readonly id: string; readonly text: string; }, { readonly id: string; readonly text: string; }, false>; todoCompleted: EventDef<"v1.TodoCompleted", { readonly id: string; }, { readonly id: string; }, false>; todoUncompleted: EventDef<"v1.TodoUncompleted", { readonly id: string; }, { readonly id: string; }, false>; todoDeleted: EventDef<"v1.TodoDeleted", { readonly id: string; readonly deletedAt: Date; }, { readonly id: string; readonly deletedAt: string; }, false>; todoClearedCompleted: EventDef<...>; }; state: InternalState;}>, {}>>
getStore() await this.LiveStoreClientDO.subscribeToStore(): Promise<void>
subscribeToStore()
const const todos: readonly { readonly id: string; readonly text: string; readonly deletedAt: Date | null; readonly completed: boolean;}[]
todos = const store: Store<FromInputSchema.DeriveSchema<{ events: { todoCreated: EventDef<"v1.TodoCreated", { readonly id: string; readonly text: string; }, { readonly id: string; readonly text: string; }, false>; todoCompleted: EventDef<"v1.TodoCompleted", { readonly id: string; }, { readonly id: string; }, false>; todoUncompleted: EventDef<"v1.TodoUncompleted", { readonly id: string; }, { readonly id: string; }, false>; todoDeleted: EventDef<"v1.TodoDeleted", { readonly id: string; readonly deletedAt: Date; }, { readonly id: string; readonly deletedAt: string; }, false>; todoClearedCompleted: EventDef<...>; }; state: InternalState;}>, {}>
store.Store<FromInputSchema.DeriveSchema<{ events: { todoCreated: EventDef<"v1.TodoCreated", { readonly id: string; readonly text: string; }, { readonly id: string; readonly text: string; }, false>; todoCompleted: EventDef<...>; todoUncompleted: EventDef<...>; todoDeleted: EventDef<...>; todoClearedCompleted: EventDef<...>; }; state: InternalState; }>, {}>.query: <readonly { readonly id: string; readonly text: string; readonly deletedAt: Date | null; readonly completed: boolean;}[]>(query: Queryable<readonly { readonly id: string; readonly text: string; readonly deletedAt: Date | null; readonly completed: boolean;}[]> | { query: string; bindValues: Bindable; schema?: Schema<readonly { readonly id: string; readonly text: string; readonly deletedAt: Date | null; readonly completed: boolean; }[], readonly { readonly id: string; readonly text: string; readonly deletedAt: Date | null; readonly completed: boolean; }[], never>;}, options?: { otelContext?: Context; debugRefreshReason?: RefreshReason;}) => readonly { readonly id: string; readonly text: string; readonly deletedAt: Date | null; readonly completed: boolean;}[]
Synchronously queries the database without creating a LiveQuery.
This is useful for queries that don't need to be reactive.
Example: Query builder
const completedTodos = store.query(tables.todo.where({ complete: true }))
Example: Raw SQL query
const completedTodos = store.query({ query: 'SELECT * FROM todo WHERE complete = 1', bindValues: {} })
query(this.LiveStoreClientDO.todosQuery: QueryBuilder<readonly { readonly id: string; readonly text: string; readonly deletedAt: Date | null; readonly completed: boolean;}[], TableDefBase<SqliteTableDefForInput<"todos", { readonly id: { columnType: "text"; schema: Schema<string, string, never>; default: None<never>; nullable: false; primaryKey: true; autoIncrement: false; }; readonly text: { columnType: "text"; schema: Schema<string, string, never>; default: Some<"">; nullable: false; primaryKey: false; autoIncrement: false; }; readonly completed: { columnType: "integer"; schema: Schema<boolean, number, never>; default: Some<false>; nullable: false; primaryKey: false; autoIncrement: false; }; readonly deletedAt: { ...; };}>, WithDefaults<...>>, "select" | ... 3 more ... | "row">
todosQuery) return new var Response: new (body?: BodyInit | null, init?: ResponseInit) => Response
The Response interface of the Fetch API represents the response to a request.
This Fetch API interface represents the response to a request.
Response(var JSON: JSON
An intrinsic object that provides functions to convert JavaScript values to and from the JavaScript Object Notation (JSON) format.
JSON.JSON.stringify(value: any, replacer?: (number | string)[] | null, space?: string | number): string (+1 overload)
Converts a JavaScript value to a JavaScript Object Notation (JSON) string.
stringify(const todos: readonly { readonly id: string; readonly text: string; readonly deletedAt: Date | null; readonly completed: boolean;}[]
todos, null, 2), { ResponseInit.headers?: HeadersInit
headers: { 'Content-Type': 'application/json' }, }) }
private async LiveStoreClientDO.getStore(): Promise<Store<FromInputSchema.DeriveSchema<{ events: { todoCreated: EventDef<"v1.TodoCreated", { readonly id: string; readonly text: string; }, { readonly id: string; readonly text: string; }, false>; todoCompleted: EventDef<"v1.TodoCompleted", { readonly id: string; }, { readonly id: string; }, false>; todoUncompleted: EventDef<"v1.TodoUncompleted", { readonly id: string; }, { readonly id: string; }, false>; todoDeleted: EventDef<"v1.TodoDeleted", { readonly id: string; readonly deletedAt: Date; }, { readonly id: string; readonly deletedAt: string; }, false>; todoClearedCompleted: EventDef<...>; }; state: InternalState;}>, {}>>
getStore() { if (this.LiveStoreClientDO.cachedStore: Store<FromInputSchema.DeriveSchema<{ events: { todoCreated: EventDef<"v1.TodoCreated", { readonly id: string; readonly text: string; }, { readonly id: string; readonly text: string; }, false>; todoCompleted: EventDef<"v1.TodoCompleted", { readonly id: string; }, { readonly id: string; }, false>; todoUncompleted: EventDef<"v1.TodoUncompleted", { readonly id: string; }, { readonly id: string; }, false>; todoDeleted: EventDef<"v1.TodoDeleted", { readonly id: string; readonly deletedAt: Date; }, { readonly id: string; readonly deletedAt: string; }, false>; todoClearedCompleted: EventDef<...>; }; state: InternalState;}>, {}> | undefined
cachedStore !== var undefined
undefined) { return this.LiveStoreClientDO.cachedStore: Store<FromInputSchema.DeriveSchema<{ events: { todoCreated: EventDef<"v1.TodoCreated", { readonly id: string; readonly text: string; }, { readonly id: string; readonly text: string; }, false>; todoCompleted: EventDef<"v1.TodoCompleted", { readonly id: string; }, { readonly id: string; }, false>; todoUncompleted: EventDef<"v1.TodoUncompleted", { readonly id: string; }, { readonly id: string; }, false>; todoDeleted: EventDef<"v1.TodoDeleted", { readonly id: string; readonly deletedAt: Date; }, { readonly id: string; readonly deletedAt: string; }, false>; todoClearedCompleted: EventDef<...>; }; state: InternalState;}>, {}>
cachedStore }
const const storeId: string
storeId = this.LiveStoreClientDO.storeId: string | undefined
storeId ?? nanoid<string>(size?: number): string
Generate secure URL-friendly unique ID.
By default, the ID will have 21 symbols to have a collision probability
similar to UUID v4.
import { nanoid } from 'nanoid'model.id = nanoid() //=> "Uakgb_J5m9g-0JDMbcJqL"
nanoid()
const const store: Store<FromInputSchema.DeriveSchema<{ events: { todoCreated: EventDef<"v1.TodoCreated", { readonly id: string; readonly text: string; }, { readonly id: string; readonly text: string; }, false>; todoCompleted: EventDef<"v1.TodoCompleted", { readonly id: string; }, { readonly id: string; }, false>; todoUncompleted: EventDef<"v1.TodoUncompleted", { readonly id: string; }, { readonly id: string; }, false>; todoDeleted: EventDef<"v1.TodoDeleted", { readonly id: string; readonly deletedAt: Date; }, { readonly id: string; readonly deletedAt: string; }, false>; todoClearedCompleted: EventDef<...>; }; state: InternalState;}>, {}>
store = await createStoreDoPromise<FromInputSchema.DeriveSchema<{ events: { todoCreated: EventDef<"v1.TodoCreated", { readonly id: string; readonly text: string; }, { readonly id: string; readonly text: string; }, false>; todoCompleted: EventDef<"v1.TodoCompleted", { readonly id: string; }, { readonly id: string; }, false>; todoUncompleted: EventDef<"v1.TodoUncompleted", { readonly id: string; }, { readonly id: string; }, false>; todoDeleted: EventDef<"v1.TodoDeleted", { readonly id: string; readonly deletedAt: Date; }, { readonly id: string; readonly deletedAt: string; }, false>; todoClearedCompleted: EventDef<...>; }; state: InternalState;}>, Env, DurableObjectState<...>>(options: CreateStoreDoOptions<...>): Promise<...>
createStoreDoPromise({ schema: FromInputSchema.DeriveSchema<{ events: { todoCreated: EventDef<"v1.TodoCreated", { readonly id: string; readonly text: string; }, { readonly id: string; readonly text: string; }, false>; todoCompleted: EventDef<"v1.TodoCompleted", { readonly id: string; }, { readonly id: string; }, false>; todoUncompleted: EventDef<"v1.TodoUncompleted", { readonly id: string; }, { readonly id: string; }, false>; todoDeleted: EventDef<"v1.TodoDeleted", { readonly id: string; readonly deletedAt: Date; }, { readonly id: string; readonly deletedAt: string; }, false>; todoClearedCompleted: EventDef<...>; }; state: InternalState;}>
LiveStore schema that defines state, migrations, and validators.
schema, storeId: string
Logical identifier for the store instance persisted inside the Durable Object.
storeId, clientId: string
Unique identifier for the client that owns the Durable Object instance.
clientId: 'client-do', sessionId: string
Identifier for the LiveStore session running inside the Durable Object.
sessionId: nanoid<string>(size?: number): string
Generate secure URL-friendly unique ID.
By default, the ID will have 21 symbols to have a collision probability
similar to UUID v4.
import { nanoid } from 'nanoid'model.id = nanoid() //=> "Uakgb_J5m9g-0JDMbcJqL"
nanoid(), durableObject: { ctx: DurableObjectState<unknown>; env: Env; bindingName: "CLIENT_DO" | "SYNC_BACKEND_DO";}
Runtime details about the Durable Object this store runs inside. Needed for sync backend to call back to this instance.
durableObject: { // @ts-expect-error TODO remove once CF types are fixed in https://github.com/cloudflare/workerd/issues/4811 ctx: DurableObjectState<unknown>
Durable Object state handle (e.g. this.ctx).
ctx: this.CloudflareWorkersModule.DurableObject<Env, {}>.ctx: DurableObjectState<{}>
ctx, env: Env
Environment bindings associated with the Durable Object.
env: this.CloudflareWorkersModule.DurableObject<Env, {}>.env: Env
env, bindingName: "CLIENT_DO" | "SYNC_BACKEND_DO"
Binding name Cloudflare uses to reach this Durable Object from other workers.
bindingName: 'CLIENT_DO', }, syncBackendStub: DurableObjectStub<SyncBackendRpcInterface>
RPC stub pointing at the sync backend Durable Object used for replication.
syncBackendStub: this.CloudflareWorkersModule.DurableObject<Env, {}>.env: Env
env.type SYNC_BACKEND_DO: DurableObjectNamespace<SyncBackendRpcInterface>
SYNC_BACKEND_DO.DurableObjectNamespace<SyncBackendRpcInterface>.get(id: DurableObjectId, options?: DurableObjectNamespaceGetDurableObjectOptions): DurableObjectStub<SyncBackendRpcInterface>
get(this.CloudflareWorkersModule.DurableObject<Env, {}>.env: Env
env.type SYNC_BACKEND_DO: DurableObjectNamespace<SyncBackendRpcInterface>
SYNC_BACKEND_DO.DurableObjectNamespace<SyncBackendRpcInterface>.idFromName(name: string): DurableObjectId
idFromName(const storeId: string
storeId)), livePull?: boolean
Enables live pull mode to receive sync updates via Durable Object RPC callbacks.
livePull: true, })
this.LiveStoreClientDO.cachedStore: Store<FromInputSchema.DeriveSchema<{ events: { todoCreated: EventDef<"v1.TodoCreated", { readonly id: string; readonly text: string; }, { readonly id: string; readonly text: string; }, false>; todoCompleted: EventDef<"v1.TodoCompleted", { readonly id: string; }, { readonly id: string; }, false>; todoUncompleted: EventDef<"v1.TodoUncompleted", { readonly id: string; }, { readonly id: string; }, false>; todoDeleted: EventDef<"v1.TodoDeleted", { readonly id: string; readonly deletedAt: Date; }, { readonly id: string; readonly deletedAt: string; }, false>; todoClearedCompleted: EventDef<...>; }; state: InternalState;}>, {}> | undefined
cachedStore = const store: Store<FromInputSchema.DeriveSchema<{ events: { todoCreated: EventDef<"v1.TodoCreated", { readonly id: string; readonly text: string; }, { readonly id: string; readonly text: string; }, false>; todoCompleted: EventDef<"v1.TodoCompleted", { readonly id: string; }, { readonly id: string; }, false>; todoUncompleted: EventDef<"v1.TodoUncompleted", { readonly id: string; }, { readonly id: string; }, false>; todoDeleted: EventDef<"v1.TodoDeleted", { readonly id: string; readonly deletedAt: Date; }, { readonly id: string; readonly deletedAt: string; }, false>; todoClearedCompleted: EventDef<...>; }; state: InternalState;}>, {}>
store return const store: Store<FromInputSchema.DeriveSchema<{ events: { todoCreated: EventDef<"v1.TodoCreated", { readonly id: string; readonly text: string; }, { readonly id: string; readonly text: string; }, false>; todoCompleted: EventDef<"v1.TodoCompleted", { readonly id: string; }, { readonly id: string; }, false>; todoUncompleted: EventDef<"v1.TodoUncompleted", { readonly id: string; }, { readonly id: string; }, false>; todoDeleted: EventDef<"v1.TodoDeleted", { readonly id: string; readonly deletedAt: Date; }, { readonly id: string; readonly deletedAt: string; }, false>; todoClearedCompleted: EventDef<...>; }; state: InternalState;}>, {}>
store }
private async LiveStoreClientDO.subscribeToStore(): Promise<void>
subscribeToStore() { const const store: Store<FromInputSchema.DeriveSchema<{ events: { todoCreated: EventDef<"v1.TodoCreated", { readonly id: string; readonly text: string; }, { readonly id: string; readonly text: string; }, false>; todoCompleted: EventDef<"v1.TodoCompleted", { readonly id: string; }, { readonly id: string; }, false>; todoUncompleted: EventDef<"v1.TodoUncompleted", { readonly id: string; }, { readonly id: string; }, false>; todoDeleted: EventDef<"v1.TodoDeleted", { readonly id: string; readonly deletedAt: Date; }, { readonly id: string; readonly deletedAt: string; }, false>; todoClearedCompleted: EventDef<...>; }; state: InternalState;}>, {}>
store = await this.LiveStoreClientDO.getStore(): Promise<Store<FromInputSchema.DeriveSchema<{ events: { todoCreated: EventDef<"v1.TodoCreated", { readonly id: string; readonly text: string; }, { readonly id: string; readonly text: string; }, false>; todoCompleted: EventDef<"v1.TodoCompleted", { readonly id: string; }, { readonly id: string; }, false>; todoUncompleted: EventDef<"v1.TodoUncompleted", { readonly id: string; }, { readonly id: string; }, false>; todoDeleted: EventDef<"v1.TodoDeleted", { readonly id: string; readonly deletedAt: Date; }, { readonly id: string; readonly deletedAt: string; }, false>; todoClearedCompleted: EventDef<...>; }; state: InternalState;}>, {}>>
getStore()
if (this.LiveStoreClientDO.storeSubscription: Unsubscribe | undefined
storeSubscription === var undefined
undefined) { this.LiveStoreClientDO.storeSubscription: Unsubscribe | undefined
storeSubscription = const store: Store<FromInputSchema.DeriveSchema<{ events: { todoCreated: EventDef<"v1.TodoCreated", { readonly id: string; readonly text: string; }, { readonly id: string; readonly text: string; }, false>; todoCompleted: EventDef<"v1.TodoCompleted", { readonly id: string; }, { readonly id: string; }, false>; todoUncompleted: EventDef<"v1.TodoUncompleted", { readonly id: string; }, { readonly id: string; }, false>; todoDeleted: EventDef<"v1.TodoDeleted", { readonly id: string; readonly deletedAt: Date; }, { readonly id: string; readonly deletedAt: string; }, false>; todoClearedCompleted: EventDef<...>; }; state: InternalState;}>, {}>
store.Store<TSchema extends LiveStoreSchema = LiveStoreSchema.Any, TContext = {}>.subscribe: <readonly { readonly id: string; readonly text: string; readonly deletedAt: Date | null; readonly completed: boolean;}[]>(query: Queryable<readonly { readonly id: string; readonly text: string; readonly deletedAt: Date | null; readonly completed: boolean;}[]>, onUpdate: (value: readonly { readonly id: string; readonly text: string; readonly deletedAt: Date | null; readonly completed: boolean;}[]) => void, options?: SubscribeOptions<readonly { readonly id: string; readonly text: string; readonly deletedAt: Date | null; readonly completed: boolean;}[]> | undefined) => Unsubscribe (+1 overload)
subscribe(this.LiveStoreClientDO.todosQuery: QueryBuilder<readonly { readonly id: string; readonly text: string; readonly deletedAt: Date | null; readonly completed: boolean;}[], TableDefBase<SqliteTableDefForInput<"todos", { readonly id: { columnType: "text"; schema: Schema<string, string, never>; default: None<never>; nullable: false; primaryKey: true; autoIncrement: false; }; readonly text: { columnType: "text"; schema: Schema<string, string, never>; default: Some<"">; nullable: false; primaryKey: false; autoIncrement: false; }; readonly completed: { columnType: "integer"; schema: Schema<boolean, number, never>; default: Some<false>; nullable: false; primaryKey: false; autoIncrement: false; }; readonly deletedAt: { ...; };}>, WithDefaults<...>>, "select" | ... 3 more ... | "row">
todosQuery, (todos: readonly { readonly id: string; readonly text: string; readonly completed: boolean; readonly deletedAt: Date | null;}[]
todos: interface ReadonlyArray<T>
ReadonlyArray<typeof const tables: { todos: TableDef<SqliteTableDefForInput<"todos", { readonly id: { columnType: "text"; schema: Schema<string, string, never>; default: None<never>; nullable: false; primaryKey: true; autoIncrement: false; }; readonly text: { columnType: "text"; schema: Schema<string, string, never>; default: Some<"">; nullable: false; primaryKey: false; autoIncrement: false; }; readonly completed: { columnType: "integer"; schema: Schema<boolean, number, never>; default: Some<false>; nullable: false; primaryKey: false; autoIncrement: false; }; readonly deletedAt: { ...; }; }>, WithDefaults<...>, Schema<...>>;}
tables.todos: TableDef<SqliteTableDefForInput<"todos", { readonly id: { columnType: "text"; schema: Schema<string, string, never>; default: None<never>; nullable: false; primaryKey: true; autoIncrement: false; }; readonly text: { columnType: "text"; schema: Schema<string, string, never>; default: Some<"">; nullable: false; primaryKey: false; autoIncrement: false; }; readonly completed: { columnType: "integer"; schema: Schema<boolean, number, never>; default: Some<false>; nullable: false; primaryKey: false; autoIncrement: false; }; readonly deletedAt: { ...; };}>, WithDefaults<...>, Schema<...>>
todos.type Type: { readonly id: string; readonly text: string; readonly completed: boolean; readonly deletedAt: Date | null;}
Type>) => { var console: Console
The console module provides a simple debugging console that is similar to the
JavaScript console mechanism provided by web browsers.
The module exports two specific components:
- A
Console class with methods such as console.log(), console.error() and console.warn() that can be used to write to any Node.js stream.
- A global
console instance configured to write to process.stdout and
process.stderr. The global console can be used without importing the node:console module.
Warning: The global console object's methods are neither consistently
synchronous like the browser APIs they resemble, nor are they consistently
asynchronous like all other Node.js streams. See the note on process I/O for
more information.
Example using the global console:
console.log('hello world');// Prints: hello world, to stdoutconsole.log('hello %s', 'world');// Prints: hello world, to stdoutconsole.error(new Error('Whoops, something bad happened'));// Prints error message and stack trace to stderr:// Error: Whoops, something bad happened// at [eval]:5:15// at Script.runInThisContext (node:vm:132:18)// at Object.runInThisContext (node:vm:309:38)// at node:internal/process/execution:77:19// at [eval]-wrapper:6:22// at evalScript (node:internal/process/execution:76:60)// at node:internal/main/eval_string:23:3
const name = 'Will Robinson';console.warn(`Danger ${name}! Danger!`);// Prints: Danger Will Robinson! Danger!, to stderr
Example using the Console class:
const out = getStreamSomehow();const err = getStreamSomehow();const myConsole = new console.Console(out, err);
myConsole.log('hello world');// Prints: hello world, to outmyConsole.log('hello %s', 'world');// Prints: hello world, to outmyConsole.error(new Error('Whoops, something bad happened'));// Prints: [Error: Whoops, something bad happened], to err
const name = 'Will Robinson';myConsole.warn(`Danger ${name}! Danger!`);// Prints: Danger Will Robinson! Danger!, to err
console.Console.log(message?: any, ...optionalParams: any[]): void (+3 overloads)
Prints to stdout with newline. Multiple arguments can be passed, with the
first used as the primary message and all additional used as substitution
values similar to printf(3)
(the arguments are all passed to util.format()).
const count = 5;console.log('count: %d', count);// Prints: count: 5, to stdoutconsole.log('count:', count);// Prints: count: 5, to stdout
See util.format() for more information.
log(`todos for store (${this.LiveStoreClientDO.storeId: string | undefined
storeId})`, todos: readonly { readonly id: string; readonly text: string; readonly completed: boolean; readonly deletedAt: Date | null;}[]
todos) }) }
await this.CloudflareWorkersModule.DurableObject<Env, {}>.ctx: DurableObjectState<{}>
ctx.DurableObjectState<{}>.storage: DurableObjectStorage
storage.DurableObjectStorage.setAlarm(scheduledTime: number | Date, options?: DurableObjectSetAlarmOptions): Promise<void>
setAlarm(var Date: DateConstructor
Enables basic storage and retrieval of dates and times.
Date.DateConstructor.now(): number
Returns the number of milliseconds elapsed since midnight, January 1, 1970 Universal Coordinated Time (UTC).
now() + 1000) }
LiveStoreClientDO.alarm(_alarmInfo?: AlarmInfo): void | Promise<void>
alarm(_alarmInfo: AlarmInfo | undefined
_alarmInfo?: type AlarmInfo = { isRetry: boolean; retryCount: number;}
AlarmInfo): void | interface Promise<T>
Represents the completion of an asynchronous operation
Promise<void> { return this.LiveStoreClientDO.subscribeToStore(): Promise<void>
subscribeToStore() }
async LiveStoreClientDO.syncUpdateRpc(payload: unknown): Promise<void>
syncUpdateRpc(payload: unknown
payload: unknown) { await function handleSyncUpdateRpc(payload: unknown): Promise<void>
import { DurableObject } from 'cloudflare:workers'import { ClientDoWithRpcCallback } from '@livestore/common-cf'
export class MyDurableObject extends DurableObject implements ClientDoWithRpcCallback { // ...
async syncUpdateRpc(payload: RpcMessage.ResponseChunkEncoded) { return handleSyncUpdateRpc(payload) }}
handleSyncUpdateRpc(payload: unknown
payload) }}import type { (alias) interface ClientDoWithRpcCallbackimport ClientDoWithRpcCallback
ClientDoWithRpcCallback } from '@livestore/adapter-cloudflare'import type { import CfTypes
CfTypes, (alias) interface SyncBackendRpcInterfaceimport SyncBackendRpcInterface
Durable Object interface supporting the DO RPC protocol for DO <> DO syncing.
SyncBackendRpcInterface } from '@livestore/sync-cf/cf-worker'
export type type Env = { CLIENT_DO: CfTypes.DurableObjectNamespace<ClientDoWithRpcCallback>; SYNC_BACKEND_DO: CfTypes.DurableObjectNamespace<SyncBackendRpcInterface>; DB: CfTypes.D1Database; ADMIN_SECRET: string;}
Env = { type CLIENT_DO: CfTypes.DurableObjectNamespace<ClientDoWithRpcCallback>
CLIENT_DO: import CfTypes
CfTypes.class DurableObjectNamespace<T extends CfTypes.Rpc.DurableObjectBranded | undefined = undefined>
DurableObjectNamespace<(alias) interface ClientDoWithRpcCallbackimport ClientDoWithRpcCallback
ClientDoWithRpcCallback> type SYNC_BACKEND_DO: CfTypes.DurableObjectNamespace<SyncBackendRpcInterface>
SYNC_BACKEND_DO: import CfTypes
CfTypes.class DurableObjectNamespace<T extends CfTypes.Rpc.DurableObjectBranded | undefined = undefined>
DurableObjectNamespace<(alias) interface SyncBackendRpcInterfaceimport SyncBackendRpcInterface
Durable Object interface supporting the DO RPC protocol for DO <> DO syncing.
SyncBackendRpcInterface> type DB: CfTypes.D1Database
DB: import CfTypes
CfTypes.class D1Database
D1Database type ADMIN_SECRET: string
ADMIN_SECRET: string}import { import Events
Events, const makeSchema: <TInputSchema extends InputSchema>(inputSchema: TInputSchema) => FromInputSchema.DeriveSchema<TInputSchema>
makeSchema, import Schema
Schema, import State
State } from '@livestore/livestore'
export const const tables: { todos: State.SQLite.TableDef<State.SQLite.SqliteTableDefForInput<"todos", { readonly id: { columnType: "text"; schema: Schema.Schema<string, string, never>; default: None<never>; nullable: false; primaryKey: true; autoIncrement: false; }; readonly text: { columnType: "text"; schema: Schema.Schema<string, string, never>; default: Some<"">; nullable: false; primaryKey: false; autoIncrement: false; }; readonly completed: { columnType: "integer"; schema: Schema.Schema<boolean, number, never>; default: Some<false>; nullable: false; primaryKey: false; autoIncrement: false; }; readonly deletedAt: { ...; }; }>, State.SQLite.WithDefaults<...>, Schema.Schema<...>>;}
tables = { todos: State.SQLite.TableDef<State.SQLite.SqliteTableDefForInput<"todos", { readonly id: { columnType: "text"; schema: Schema.Schema<string, string, never>; default: None<never>; nullable: false; primaryKey: true; autoIncrement: false; }; readonly text: { columnType: "text"; schema: Schema.Schema<string, string, never>; default: Some<"">; nullable: false; primaryKey: false; autoIncrement: false; }; readonly completed: { columnType: "integer"; schema: Schema.Schema<boolean, number, never>; default: Some<false>; nullable: false; primaryKey: false; autoIncrement: false; }; readonly deletedAt: { ...; };}>, State.SQLite.WithDefaults<...>, Schema.Schema<...>>
todos: import State
State.import SQLite
SQLite.function table<"todos", { readonly id: { columnType: "text"; schema: Schema.Schema<string, string, never>; default: None<never>; nullable: false; primaryKey: true; autoIncrement: false; }; readonly text: { columnType: "text"; schema: Schema.Schema<string, string, never>; default: Some<"">; nullable: false; primaryKey: false; autoIncrement: false; }; readonly completed: { columnType: "integer"; schema: Schema.Schema<boolean, number, never>; default: Some<false>; nullable: false; primaryKey: false; autoIncrement: false; }; readonly deletedAt: { columnType: "integer"; ... 4 more ...; autoIncrement: false; };}, Partial<...>>(args: { ...;} & Partial<...>): State.SQLite.TableDef<...> (+2 overloads)
Creates a SQLite table definition from columns or an Effect Schema.
This function supports two main ways to define a table:
- Using explicit column definitions
- Using an Effect Schema (either the
name property needs to be provided or the schema needs to have a title/identifier)
// Using explicit columnsconst usersTable = State.SQLite.table({ name: 'users', columns: { id: State.SQLite.text({ primaryKey: true }), name: State.SQLite.text({ nullable: false }), email: State.SQLite.text({ nullable: false }), age: State.SQLite.integer({ nullable: true }), },})
// Using Effect Schema with annotationsimport { Schema } from '@livestore/utils/effect'
const UserSchema = Schema.Struct({ id: Schema.Int.pipe(State.SQLite.withPrimaryKey).pipe(State.SQLite.withAutoIncrement), email: Schema.String.pipe(State.SQLite.withUnique), name: Schema.String, active: Schema.Boolean.pipe(State.SQLite.withDefault(true)), createdAt: Schema.optional(Schema.Date),})
// Option 1: With explicit nameconst usersTable = State.SQLite.table({ name: 'users', schema: UserSchema,})
// Option 2: With name from schema annotation (title or identifier)const AnnotatedUserSchema = UserSchema.annotations({ title: 'users' })const usersTable2 = State.SQLite.table({ schema: AnnotatedUserSchema,})
// Adding indexesconst PostSchema = Schema.Struct({ id: Schema.String.pipe(State.SQLite.withPrimaryKey), title: Schema.String, authorId: Schema.String, createdAt: Schema.Date,}).annotations({ identifier: 'posts' })
const postsTable = State.SQLite.table({ schema: PostSchema, indexes: [ { name: 'idx_posts_author', columns: ['authorId'] }, { name: 'idx_posts_created', columns: ['createdAt'], isUnique: false }, ],})
table({ name: "todos"
name: 'todos', columns: { readonly id: { columnType: "text"; schema: Schema.Schema<string, string, never>; default: None<never>; nullable: false; primaryKey: true; autoIncrement: false; }; readonly text: { columnType: "text"; schema: Schema.Schema<string, string, never>; default: Some<"">; nullable: false; primaryKey: false; autoIncrement: false; }; readonly completed: { columnType: "integer"; schema: Schema.Schema<boolean, number, never>; default: Some<false>; nullable: false; primaryKey: false; autoIncrement: false; }; readonly deletedAt: { ...; };}
columns: { id: { columnType: "text"; schema: Schema.Schema<string, string, never>; default: None<never>; nullable: false; primaryKey: true; autoIncrement: false;}
id: import State
State.import SQLite
SQLite.const text: <string, string, false, typeof NoDefault, true, false>(args: { schema?: Schema.Schema<string, string, never>; default?: typeof NoDefault; nullable?: false; primaryKey?: true; autoIncrement?: false;}) => { columnType: "text"; schema: Schema.Schema<string, string, never>; default: None<never>; nullable: false; primaryKey: true; autoIncrement: false;} (+1 overload)
text({ primaryKey?: true
primaryKey: true }), text: { columnType: "text"; schema: Schema.Schema<string, string, never>; default: Some<"">; nullable: false; primaryKey: false; autoIncrement: false;}
text: import State
State.import SQLite
SQLite.const text: <string, string, false, "", false, false>(args: { schema?: Schema.Schema<string, string, never>; default?: ""; nullable?: false; primaryKey?: false; autoIncrement?: false;}) => { columnType: "text"; schema: Schema.Schema<string, string, never>; default: Some<"">; nullable: false; primaryKey: false; autoIncrement: false;} (+1 overload)
text({ default?: ""
default: '' }), completed: { columnType: "integer"; schema: Schema.Schema<boolean, number, never>; default: Some<false>; nullable: false; primaryKey: false; autoIncrement: false;}
completed: import State
State.import SQLite
SQLite.const boolean: <boolean, false, false, false, false>(args: { default?: false; nullable?: false; primaryKey?: false; autoIncrement?: false;}) => { columnType: "integer"; schema: Schema.Schema<boolean, number, never>; default: Some<false>; nullable: false; primaryKey: false; autoIncrement: false;} (+1 overload)
boolean({ default?: false
default: false }), deletedAt: { columnType: "integer"; schema: Schema.Schema<Date | null, number | null, never>; default: None<never>; nullable: true; primaryKey: false; autoIncrement: false;}
deletedAt: import State
State.import SQLite
SQLite.const integer: <number, Date, true, typeof NoDefault, false, false>(args: { schema?: Schema.Schema<Date, number, never>; default?: typeof NoDefault; nullable?: true; primaryKey?: false; autoIncrement?: false;}) => { columnType: "integer"; schema: Schema.Schema<Date | null, number | null, never>; default: None<never>; nullable: true; primaryKey: false; autoIncrement: false;} (+1 overload)
integer({ nullable?: true
nullable: true, schema?: Schema.Schema<Date, number, never>
schema: import Schema
Schema.class DateFromNumber
Defines a schema that converts a number into a Date object using the new Date constructor. This schema does not validate the numerical input,
allowing potentially invalid values such as NaN, Infinity, and
-Infinity to be converted into Date objects. During the encoding process,
any invalid Date object will be encoded to NaN.
DateFromNumber }), }, }),}
export const const events: { todoCreated: State.SQLite.EventDef<"v1.TodoCreated", { readonly text: string; readonly id: string; }, { readonly text: string; readonly id: string; }, false>; todoCompleted: State.SQLite.EventDef<"v1.TodoCompleted", { readonly id: string; }, { readonly id: string; }, false>; todoUncompleted: State.SQLite.EventDef<"v1.TodoUncompleted", { readonly id: string; }, { readonly id: string; }, false>; todoDeleted: State.SQLite.EventDef<"v1.TodoDeleted", { readonly id: string; readonly deletedAt: Date; }, { readonly id: string; readonly deletedAt: string; }, false>; todoClearedCompleted: State.SQLite.EventDef<...>;}
events = { todoCreated: State.SQLite.EventDef<"v1.TodoCreated", { readonly text: string; readonly id: string;}, { readonly text: string; readonly id: string;}, false>
todoCreated: import Events
Events.synced<"v1.TodoCreated", { readonly text: string; readonly id: string;}, { readonly text: string; readonly id: string;}>(args: { name: "v1.TodoCreated"; schema: Schema.Schema<{ readonly text: string; readonly id: string; }, { readonly text: string; readonly id: string; }, never>;} & Omit<State.SQLite.DefineEventOptions<{ readonly text: string; readonly id: string;}, false>, "derived" | "clientOnly">): State.SQLite.EventDef<"v1.TodoCreated", { readonly text: string; readonly id: string;}, { readonly text: string; readonly id: string;}, false>export synced
synced({ name: "v1.TodoCreated"
name: 'v1.TodoCreated', schema: Schema.Schema<{ readonly text: string; readonly id: string;}, { readonly text: string; readonly id: string;}, never>
schema: import Schema
Schema.function Struct<{ id: typeof Schema.String; text: typeof Schema.String;}>(fields: { id: typeof Schema.String; text: typeof Schema.String;}): Schema.Struct<{ id: typeof Schema.String; text: typeof Schema.String;}> (+1 overload)
Struct({ id: typeof Schema.String
id: import Schema
Schema.class Stringexport String
String, text: typeof Schema.String
text: import Schema
Schema.class Stringexport String
String }), }), todoCompleted: State.SQLite.EventDef<"v1.TodoCompleted", { readonly id: string;}, { readonly id: string;}, false>
todoCompleted: import Events
Events.synced<"v1.TodoCompleted", { readonly id: string;}, { readonly id: string;}>(args: { name: "v1.TodoCompleted"; schema: Schema.Schema<{ readonly id: string; }, { readonly id: string; }, never>;} & Omit<State.SQLite.DefineEventOptions<{ readonly id: string;}, false>, "derived" | "clientOnly">): State.SQLite.EventDef<"v1.TodoCompleted", { readonly id: string;}, { readonly id: string;}, false>export synced
synced({ name: "v1.TodoCompleted"
name: 'v1.TodoCompleted', schema: Schema.Schema<{ readonly id: string;}, { readonly id: string;}, never>
    schema: Schema.Struct({ id: Schema.String }),
  }),
  todoUncompleted: Events.synced({
    name: 'v1.TodoUncompleted',
    schema: Schema.Struct({ id: Schema.String }),
  }),
  todoDeleted: Events.synced({
    name: 'v1.TodoDeleted',
    schema: Schema.Struct({ id: Schema.String, deletedAt: Schema.Date }),
  }),
  todoClearedCompleted: Events.synced({
    name: 'v1.TodoClearedCompleted',
    schema: Schema.Struct({ deletedAt: Schema.Date }),
  }),
}
const materializers = State.SQLite.materializers(events, {
  'v1.TodoCreated': ({ id, text }) => tables.todos.insert({ id, text, completed: false }),
  'v1.TodoCompleted': ({ id }) => tables.todos.update({ completed: true }).where({ id }),
  'v1.TodoUncompleted': ({ id }) => tables.todos.update({ completed: false }).where({ id }),
  'v1.TodoDeleted': ({ id, deletedAt }) => tables.todos.update({ deletedAt }).where({ id }),
  'v1.TodoClearedCompleted': ({ deletedAt }) => tables.todos.update({ deletedAt }).where({ completed: true }),
})
const state = State.SQLite.makeState({ tables, materializers })

export const schema = makeSchema({ events, state })

// Helper to derive the store ID from the request URL's search params:
import type { CfTypes } from '@livestore/sync-cf/cf-worker'

export const storeIdFromRequest = (request: CfTypes.Request) => {
  const url = new URL(request.url)
  const storeId = url.searchParams.get('storeId')

  if (storeId === null) {
    throw new Error('storeId is required in URL search params')
  }

  return storeId
}
Server API Reference
Section titled “Server API Reference”
makeDurableObject(options)
Section titled “makeDurableObject(options)”
Creates a sync backend Durable Object class.
Options:
- onPush? - Callback for push events: (message, context) => void | Promise<void>
- onPushRes? - Callback for push responses: (message) => void | Promise<void>
- onPull? - Callback for pull requests: (message, context) => void | Promise<void>
- onPullRes? - Callback for pull responses: (message) => void | Promise<void>
- storage? - Storage engine: { _tag: 'do-sqlite' } | { _tag: 'd1', binding: string } (default: do-sqlite); see the D1 sketch after the example below
- enabledTransports? - Set of enabled transports: Set<'http' | 'ws' | 'do-rpc'>
- otel? - OpenTelemetry configuration:
  - baseUrl? - OTEL endpoint URL
  - serviceName? - Service name for traces
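The example below defines a SyncBackendDO class. As noted in the makeDurableObject docs, the class also has to be declared as a SQLite-backed Durable Object in your wrangler config:

[[durable_objects.bindings]]
name = "SYNC_BACKEND_DO"
class_name = "SyncBackendDO"

[[migrations]]
tag = "v1"
new_sqlite_classes = ["SyncBackendDO"]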
import { makeDurableObject } from '@livestore/sync-cf/cf-worker'

const hasUserId = (p: unknown): p is { userId: string } =>
  typeof p === 'object' && p !== undefined && p !== null && 'userId' in p

export class SyncBackendDO extends makeDurableObject({
  onPush: async (message, { storeId, payload }) => {
    console.log(`Push to store ${storeId}:`, message.batch)

    // Custom business logic
    if (hasUserId(payload)) {
      await Promise.resolve()
    }
  },
  onPull: async (_message, { storeId }) => {
    console.log(`Pull from store ${storeId}`)
  },
  enabledTransports: new Set(['ws', 'http']), // Disable DO RPC
  otel: {
    baseUrl: 'https://otel.example.com',
    serviceName: 'livestore-sync',
  },
}) {}
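To persist events in Cloudflare D1 instead of the Durable Object's own SQLite, pass the storage option from the list above. A minimal sketch, assuming a D1 database bound under the hypothetical name SYNC_DB in your wrangler config:

import { makeDurableObject } from '@livestore/sync-cf/cf-worker'

// 'SYNC_DB' is a placeholder D1 binding name; declare it under [[d1_databases]]
// in your wrangler config and add it to your Env interface.
export class SyncBackendDO extends makeDurableObject({
  storage: { _tag: 'd1', binding: 'SYNC_DB' },
}) {}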
makeWorker(options)
Section titled “makeWorker(options)”
Creates a complete Cloudflare Worker for the sync backend.
Options:
- syncBackendBinding - Durable Object binding name defined in wrangler.toml
- validatePayload? - Payload validation function: (payload, context) => void | Promise<void>
- enableCORS? - Enable CORS headers (default: false)
makeWorker is a quick way to get started in simple demos. In most production Workers you typically want to share routing logic with other endpoints, so prefer wiring up your own fetch handler and calling handleSyncRequest when you detect a sync request. A minimal example:
import type { CFWorker, CfTypes } from '@livestore/sync-cf/cf-worker'
import { handleSyncRequest, matchSyncRequest } from '@livestore/sync-cf/cf-worker'
import type { Env } from './env.ts'

export default {
  fetch: async (request: CfTypes.Request, env: Env, ctx: CfTypes.ExecutionContext) => {
    const searchParams = matchSyncRequest(request)

    if (searchParams !== undefined) {
      return handleSyncRequest({
        request,
        searchParams,
        env,
        ctx, // only there for type-level reasons
        syncBackendBinding: 'SYNC_BACKEND_DO',
      })
    }

    // Custom routes, assets, etc.
    return new Response('Not found', { status: 404 }) as unknown as CfTypes.Response
  },
} satisfies CFWorker<Env>

// env.ts
import type { CfTypes, SyncBackendRpcInterface } from '@livestore/sync-cf/cf-worker'

export interface Env {
  ADMIN_SECRET: string // Admin authentication
  SYNC_BACKEND_DO: CfTypes.DurableObjectNamespace<SyncBackendRpcInterface>
}
import { makeWorker } from '@livestore/sync-cf/cf-worker'

export default makeWorker({
  syncBackendBinding: 'SYNC_BACKEND_DO',
  validatePayload: (payload, { storeId }) => {
    // Simple token-based guard at connection time
    const hasAuthToken = typeof payload === 'object' && payload !== null && 'authToken' in payload
    if (!hasAuthToken) {
      throw new Error('Missing auth token')
    }
    if ((payload as any).authToken !== 'insecure-token-change-me') {
      throw new Error('Invalid auth token')
    }
    console.log(`Validated connection for store: ${storeId}`)
  },
  enableCORS: true,
})
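The validatePayload docs above note that, when a syncPayloadSchema is provided, the payload arrives already decoded to the schema's inferred type. A minimal sketch, assuming Schema is imported from @livestore/livestore (as in the schema example earlier), that clients send { authToken: string }, and that the inferred type flows through to validatePayload in your version:

import { Schema } from '@livestore/livestore' // import path assumed; adjust to your setup
import { makeWorker } from '@livestore/sync-cf/cf-worker'

// Hypothetical payload shape that clients are expected to send with sync requests
const SyncPayload = Schema.Struct({ authToken: Schema.String })

export default makeWorker({
  syncBackendBinding: 'SYNC_BACKEND_DO',
  syncPayloadSchema: SyncPayload,
  validatePayload: (payload, { storeId }) => {
    // With the schema above, `payload` should be the decoded { authToken: string } value
    if (payload.authToken !== 'insecure-token-change-me') {
      throw new Error(`Invalid auth token for store ${storeId}`)
    }
  },
  enableCORS: true,
})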
handleSyncRequest(args)
Section titled “handleSyncRequest(args)”
Handles sync backend HTTP requests in custom workers.
Options:
- request - The incoming request
- searchParams - Parsed sync request parameters
- env - Worker environment
- ctx - Worker execution context
- syncBackendBinding - Durable Object binding name defined in wrangler.toml
- headers? - Response headers
- validatePayload? - Payload validation function
import type { CFWorker, CfTypes } from '@livestore/sync-cf/cf-worker'
import { handleSyncRequest, matchSyncRequest } from '@livestore/sync-cf/cf-worker'
import type { Env } from './env.ts'

export default {
  fetch: async (request: CfTypes.Request, env: Env, ctx: CfTypes.ExecutionContext) => {
    const searchParams = matchSyncRequest(request)

    if (searchParams !== undefined) {
      return handleSyncRequest({
        request,
        searchParams,
        env,
        ctx, // only there for type-level reasons
        syncBackendBinding: 'SYNC_BACKEND_DO',
        headers: { 'X-Custom': 'header' },
        validatePayload: (payload, { storeId }) => {
          // Custom validation logic
          if (!(typeof payload === 'object' && payload !== null && 'authToken' in payload)) {
            throw new Error('Missing auth token')
          }
          console.log('Validating store', storeId)
        },
      })
    }

    return new Response('Not found', { status: 404 }) as unknown as CfTypes.Response
  },
} satisfies CFWorker<Env>

// env.ts
import type { CfTypes, SyncBackendRpcInterface } from '@livestore/sync-cf/cf-worker'

export interface Env {
  ADMIN_SECRET: string // Admin authentication
  SYNC_BACKEND_DO: CfTypes.DurableObjectNamespace<SyncBackendRpcInterface>
}
matchSyncRequest(request)
Section titled “matchSyncRequest(request)”
Parses and validates sync request search parameters. Returns the decoded search params, or undefined if the request is not a LiveStore sync request.
import type { CfTypes } from '@livestore/sync-cf/cf-worker'
import { matchSyncRequest } from '@livestore/sync-cf/cf-worker'

declare const request: CfTypes.Request

const searchParams = matchSyncRequest(request)

if (searchParams !== undefined) {
  const { storeId, payload, transport } = searchParams
  console.log(`Sync request for store ${storeId} via ${transport}`)
  console.log(payload)
}

Configuration
Section titled “Configuration”
Wrangler Configuration
Section titled “Wrangler Configuration”
Configure your wrangler.toml for sync backend deployment (default: DO SQLite storage):
name = "livestore-sync"
main = "./src/worker.ts"
compatibility_date = "2025-05-07"
compatibility_flags = [
  "enable_request_signal", # Required for HTTP streaming
]

[[durable_objects.bindings]]
name = "SYNC_BACKEND_DO"
class_name = "SyncBackendDO"

[[migrations]]
tag = "v1"
new_sqlite_classes = ["SyncBackendDO"]
[vars]
ADMIN_SECRET = "your-admin-secret"

To use D1 instead of DO SQLite, add a D1 binding and reference it from makeDurableObject({ storage: { _tag: 'd1', binding: '...' } }):
[[d1_databases]]
binding = "DB"
database_name = "livestore-sync"
database_id = "your-database-id"
[vars]
ADMIN_SECRET = "your-admin-secret"
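For example, a minimal sketch of opting into D1 storage with that binding (the storage option shape is taken from the sentence above; the class name and the 'DB' binding follow the config shown here):

import { makeDurableObject } from '@livestore/sync-cf/cf-worker'

// Sketch: persist events in the D1 database bound as `DB` instead of the DO's own SQLite.
export class SyncBackendDO extends makeDurableObject({
  storage: { _tag: 'd1', binding: 'DB' },
}) {}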
Environment Variables
Section titled “Environment Variables”
Required environment bindings:
import type { CfTypes, SyncBackendRpcInterface } from '@livestore/sync-cf/cf-worker'

export interface Env {
  ADMIN_SECRET: string // Admin authentication
  SYNC_BACKEND_DO: CfTypes.DurableObjectNamespace<SyncBackendRpcInterface>
}
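As an illustration, a worker can resolve the sync Durable Object for a store from this binding with the standard Cloudflare namespace API; deriving the DO name from the storeId is an assumption for illustration (each sync DO serves a single store):

// Sketch: look up the sync backend Durable Object stub for a given store.
const getSyncBackendStub = (env: Env, storeId: string) => {
  const id = env.SYNC_BACKEND_DO.idFromName(storeId) // one DO per storeId (illustrative naming)
  return env.SYNC_BACKEND_DO.get(id)
}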
Transport Protocol Details
Section titled “Transport Protocol Details”
LiveStore identifies sync requests purely by search parameters; the request path does not matter. Use matchSyncRequest(request) to detect sync traffic.
Supported search parameters:
| Param | Type | Required | Description |
|---|---|---|---|
| storeId | string | Yes | Target LiveStore identifier. |
| transport | 'ws' \| 'http' | Yes | Transport protocol selector. |
| payload | JSON (URI-encoded) | No | Arbitrary JSON used for auth/tenant routing; validated in validatePayload. |
Examples (any path):
- WebSocket: https://sync.example.com?storeId=abc&transport=ws (must include Upgrade: websocket)
- HTTP: https://sync.example.com?storeId=abc&transport=http
Notes:
- For transport=ws, if the request is not a WebSocket upgrade, the backend returns 426 Upgrade Required.
- transport='do-rpc' is internal for Durable Object RPC and not exposed via URL parameters.
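For illustration, a sync URL carrying the parameters from the table above could be assembled like this (a rough sketch; the payload contents are hypothetical):

// Sketch: build a sync request URL from the search parameters described above.
// The payload value ({ userId: 'u1' }) is purely illustrative.
const buildSyncUrl = (base: string, storeId: string, transport: 'ws' | 'http', payload?: unknown) => {
  const url = new URL(base)
  url.searchParams.set('storeId', storeId)
  url.searchParams.set('transport', transport)
  if (payload !== undefined) url.searchParams.set('payload', JSON.stringify(payload))
  return url.toString()
}

buildSyncUrl('https://sync.example.com', 'abc', 'ws', { userId: 'u1' })
// => https://sync.example.com/?storeId=abc&transport=ws&payload=%7B%22userId%22%3A%22u1%22%7D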
Data Storage
Section titled “Data Storage”
By default, events are stored in the Durable Object’s SQLite with tables following the pattern:

eventlog_{PERSISTENCE_FORMAT_VERSION}_{storeId}

You can opt into D1 with the same table shape. The persistence format version is automatically managed and incremented when the storage schema changes.
Storage Engines
Section titled “Storage Engines”
- DO SQLite (default)
  - Pros: easiest deployment (no D1), data co-located with the DO, lowest latency
  - Cons: not directly inspectable outside the DO; operational tooling must go through the DO
- D1 (optional)
  - Pros: inspectable using D1 tools/clients; enables cross-store analytics outside DOs
  - Cons: extra hop, JSON response size considerations; requires D1 provisioning
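If you opt into D1, a rough sketch of inspecting the event log tables from a Worker, assuming a D1 binding named DB as in the wrangler config above:

// Sketch: list LiveStore event log tables in D1.
// The D1Database type comes from @cloudflare/workers-types.
const listEventlogTables = async (db: D1Database) => {
  const { results } = await db
    .prepare("SELECT name FROM sqlite_master WHERE type = 'table' AND name LIKE 'eventlog_%'")
    .all()
  return results // rows of the form { name: 'eventlog_<version>_<storeId>' }
}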
Deployment
Section titled “Deployment”
Deploy to Cloudflare Workers:

# Deploy the worker
npx wrangler deploy

# Create D1 database
npx wrangler d1 create livestore-sync

# Run migrations if needed
npx wrangler d1 migrations apply livestore-sync

Local Development
Section titled “Local Development”
Run locally with Wrangler:

# Start local development server
npx wrangler dev

# Access local D1 database
# Located at: .wrangler/state/d1/miniflare-D1DatabaseObject/XXX.sqlite

Examples
Section titled “Examples”
Basic WebSocket Client
Section titled “Basic WebSocket Client”

import { makeWorker } from '@livestore/adapter-web/worker'
import { makeWsSync } from '@livestore/sync-cf/client'
import { schema } from './schema.ts'

makeWorker({
  schema,
  sync: {
    backend: makeWsSync({ url: 'wss://sync.example.com' }),
  },
})

The referenced ./schema.ts:
import { Events, makeSchema, Schema, State } from '@livestore/livestore'
export const tables = {
  todos: State.SQLite.table({
    name: 'todos',
    columns: {
      id: State.SQLite.text({ primaryKey: true }),
      text: State.SQLite.text({ default: '' }),
      completed: State.SQLite.boolean({ default: false }),
      deletedAt: State.SQLite.integer({ nullable: true, schema: Schema.DateFromNumber }),
    },
  }),
}
export const events = {
  todoCreated: Events.synced({
    name: 'v1.TodoCreated',
    schema: Schema.Struct({ id: Schema.String, text: Schema.String }),
  }),
  todoCompleted: Events.synced({
    name: 'v1.TodoCompleted',
    schema: Schema.Struct({ id: Schema.String }),
  }),
  todoUncompleted: Events.synced({
    name: 'v1.TodoUncompleted',
    schema: Schema.Struct({ id: Schema.String }),
  }),
  todoDeleted: Events.synced({
    name: 'v1.TodoDeleted',
    schema: Schema.Struct({ id: Schema.String, deletedAt: Schema.Date }),
  }),
  todoClearedCompleted: Events.synced({
    name: 'v1.TodoClearedCompleted',
    schema: Schema.Struct({ deletedAt: Schema.Date }),
  }),
}
const materializers = State.SQLite.materializers(events, {
  'v1.TodoCreated': ({ id, text }) => tables.todos.insert({ id, text, completed: false }),
  'v1.TodoCompleted': ({ id }) => tables.todos.update({ completed: true }).where({ id }),
  'v1.TodoUncompleted': ({ id }) => tables.todos.update({ completed: false }).where({ id }),
  'v1.TodoDeleted': ({ id, deletedAt }) => tables.todos.update({ deletedAt }).where({ id }),
  'v1.TodoClearedCompleted': ({ deletedAt }) => tables.todos.update({ deletedAt }).where({ completed: true }),
})
const state = State.SQLite.makeState({ tables, materializers })

export const schema = makeSchema({ events, state })

Custom Worker with Authentication
Section titled “Custom Worker with Authentication”import { const makeDurableObject: MakeDurableObjectClass
Creates a Durable Object class for handling WebSocket-based sync.
A sync durable object is uniquely scoped to a specific storeId.
The sync DO supports 3 transport modes:
- HTTP JSON-RPC
- WebSocket
- Durable Object RPC calls (only works in combination with
@livestore/adapter-cf)
Example:
// In your Cloudflare Worker fileimport { makeDurableObject } from '@livestore/sync-cf/cf-worker'
export class SyncBackendDO extends makeDurableObject({ onPush: async (message) => { console.log('onPush', message.batch) }, onPull: async (message) => { console.log('onPull', message) },}) {}
wrangler.toml
[[durable_objects.bindings]]name = "SYNC_BACKEND_DO"class_name = "SyncBackendDO"
[[migrations]]tag = "v1"new_sqlite_classes = ["SyncBackendDO"]
makeDurableObject, const makeWorker: <TEnv extends Env = Env, TDurableObjectRpc extends Rpc.DurableObjectBranded | undefined = undefined, TSyncPayload = JsonValue>(options: MakeWorkerOptions<TEnv, TSyncPayload>) => CFWorker<TEnv, TDurableObjectRpc>
Produces a Cloudflare Worker fetch handler that delegates sync traffic to the
Durable Object identified by syncBackendBinding.
For more complex setups prefer implementing a custom fetch and call
handleSyncRequest
from the branch that handles LiveStore sync requests.
makeWorker } from '@livestore/sync-cf/cf-worker'
export class class SyncBackendDO
SyncBackendDO extends function makeDurableObject(options?: MakeDurableObjectClassOptions): { new (ctx: DoState, env: Env): DoObject<SyncBackendRpcInterface>;}
Creates a Durable Object class for handling WebSocket-based sync.
A sync durable object is uniquely scoped to a specific storeId.
The sync DO supports 3 transport modes:
- HTTP JSON-RPC
- WebSocket
- Durable Object RPC calls (only works in combination with
@livestore/adapter-cf)
Example:
// In your Cloudflare Worker fileimport { makeDurableObject } from '@livestore/sync-cf/cf-worker'
export class SyncBackendDO extends makeDurableObject({ onPush: async (message) => { console.log('onPush', message.batch) }, onPull: async (message) => { console.log('onPull', message) },}) {}
wrangler.toml
[[durable_objects.bindings]]name = "SYNC_BACKEND_DO"class_name = "SyncBackendDO"
[[migrations]]tag = "v1"new_sqlite_classes = ["SyncBackendDO"]
makeDurableObject({ onPush?: (message: PushRequest, context: { storeId: StoreId; payload?: JsonValue;}) => SyncOrPromiseOrEffect<void>
onPush: async (message: { readonly batch: readonly { readonly name: string; readonly args: any; readonly seqNum: number & Brand<"GlobalEventSequenceNumber">; readonly parentSeqNum: number & Brand<"GlobalEventSequenceNumber">; readonly clientId: string; readonly sessionId: string; }[]; readonly backendId: Option<string>;}
message, { storeId: string
storeId }) => { // Log all sync events var console: Console
The console module provides a simple debugging console that is similar to the
JavaScript console mechanism provided by web browsers.
The module exports two specific components:
- A
Console class with methods such as console.log(), console.error() and console.warn() that can be used to write to any Node.js stream.
- A global
console instance configured to write to process.stdout and
process.stderr. The global console can be used without importing the node:console module.
Warning: The global console object's methods are neither consistently
synchronous like the browser APIs they resemble, nor are they consistently
asynchronous like all other Node.js streams. See the note on process I/O for
more information.
Example using the global console:
console.log('hello world');// Prints: hello world, to stdoutconsole.log('hello %s', 'world');// Prints: hello world, to stdoutconsole.error(new Error('Whoops, something bad happened'));// Prints error message and stack trace to stderr:// Error: Whoops, something bad happened// at [eval]:5:15// at Script.runInThisContext (node:vm:132:18)// at Object.runInThisContext (node:vm:309:38)// at node:internal/process/execution:77:19// at [eval]-wrapper:6:22// at evalScript (node:internal/process/execution:76:60)// at node:internal/main/eval_string:23:3
const name = 'Will Robinson';console.warn(`Danger ${name}! Danger!`);// Prints: Danger Will Robinson! Danger!, to stderr
Example using the Console class:
const out = getStreamSomehow();const err = getStreamSomehow();const myConsole = new console.Console(out, err);
myConsole.log('hello world');// Prints: hello world, to outmyConsole.log('hello %s', 'world');// Prints: hello world, to outmyConsole.error(new Error('Whoops, something bad happened'));// Prints: [Error: Whoops, something bad happened], to err
const name = 'Will Robinson';myConsole.warn(`Danger ${name}! Danger!`);// Prints: Danger Will Robinson! Danger!, to err
console.Console.log(message?: any, ...optionalParams: any[]): void (+3 overloads)
Prints to stdout with newline. Multiple arguments can be passed, with the
first used as the primary message and all additional used as substitution
values similar to printf(3)
(the arguments are all passed to util.format()).
const count = 5;console.log('count: %d', count);// Prints: count: 5, to stdoutconsole.log('count:', count);// Prints: count: 5, to stdout
See util.format() for more information.
log(`Store ${storeId: string
storeId} received ${message: { readonly batch: readonly { readonly name: string; readonly args: any; readonly seqNum: number & Brand<"GlobalEventSequenceNumber">; readonly parentSeqNum: number & Brand<"GlobalEventSequenceNumber">; readonly clientId: string; readonly sessionId: string; }[]; readonly backendId: Option<string>;}
message.batch: readonly { readonly name: string; readonly args: any; readonly seqNum: number & Brand<"GlobalEventSequenceNumber">; readonly parentSeqNum: number & Brand<"GlobalEventSequenceNumber">; readonly clientId: string; readonly sessionId: string;}[]
batch.ReadonlyArray<{ readonly name: string; readonly args: any; readonly seqNum: number & Brand<"GlobalEventSequenceNumber">; readonly parentSeqNum: number & Brand<"GlobalEventSequenceNumber">; readonly clientId: string; readonly sessionId: string; }>.length: number
Gets the length of the array. This is a number one higher than the highest element defined in an array.
length} events`) },}) {}
const hasStoreAccess = (_userId: string, _storeId: string): boolean => true

// `makeWorker` produces a Cloudflare Worker fetch handler that delegates sync traffic to the
// Durable Object identified by `syncBackendBinding`. For more complex setups, prefer implementing
// a custom `fetch` handler and call `handleSyncRequest` from the branch that handles LiveStore
// sync requests.
export default makeWorker({
  // Binding name of the sync Durable Object declared in the wrangler config
  syncBackendBinding: 'SYNC_BACKEND_DO',
  // Runs during connection establishment. If `syncPayloadSchema` is provided,
  // `payload` is typed as the schema's inferred type.
  validatePayload: (payload, { storeId }) => {
    if (!(typeof payload === 'object' && payload !== null && 'userId' in payload)) {
      throw new Error('User ID required')
    }

    // Validate that the user has access to this store
    if (!hasStoreAccess((payload as any).userId as string, storeId)) {
      throw new Error('Unauthorized access to store')
    }
  },
  enableCORS: true,
})
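The hasStoreAccess stub above always returns true. A real implementation would consult your own source of truth for store membership; the sketch below uses a hypothetical hard-coded access map (the entries are made up), whereas a production version might query D1/KV or verify a signed token instead.

// Hypothetical access map; replace with a D1/KV lookup or token verification in practice.
const storeAccessByUser: Record<string, ReadonlyArray<string>> = {
  'user-123': ['store-a', 'store-b'],
}

const hasStoreAccess = (userId: string, storeId: string): boolean =>
  storeAccessByUser[userId]?.includes(storeId) ?? false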
Multi-Transport Setup
Section titled “Multi-Transport Setup”
A sync Durable Object is scoped to a single storeId and supports three transport modes: HTTP JSON-RPC, WebSocket, and Durable Object RPC (the latter only works in combination with @livestore/adapter-cf). The example below enables all three and logs which transport a push arrived over.

import { makeDurableObject } from '@livestore/sync-cf/cf-worker'
type Transport = 'http' | 'ws' | 'do-rpc'

// Best-effort detection: fall back to 'http' when the context does not expose a transport field.
const getTransportFromContext = (ctx: unknown): Transport => {
  if (typeof ctx === 'object' && ctx !== null && 'transport' in (ctx as any)) {
    const t = (ctx as any).transport
    if (t === 'http' || t === 'ws' || t === 'do-rpc') return t
  }
  return 'http'
}
export class SyncBackendDO extends makeDurableObject({
  // Enable all transport modes
  enabledTransports: new Set<Transport>(['http', 'ws', 'do-rpc']),
  onPush: async (message, context) => {
    const transport = getTransportFromContext(context)
    console.log(`Push via ${transport}:`, message.batch.length)
  },
}) {}
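For completeness, the Durable Object also needs a binding and a SQLite-backed migration in your wrangler config, as shown in the makeDurableObject docs. The binding name, class name, and migration tag below are examples and should match your own setup.

[[durable_objects.bindings]]
name = "SYNC_BACKEND_DO"
class_name = "SyncBackendDO"

[[migrations]]
tag = "v1"
new_sqlite_classes = ["SyncBackendDO"]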