Using TanStack DB (ver 0.4.1) with ElectricSQL #217
sudoskys started this conversation in Show and tell
Replies: 2 comments
- The article was completed with the assistance of an LLM. There may be some errors; if anything needs to be corrected, please let me know.
- In version 0.2.0, the type system has been significantly improved. When using Drizzle ORM, you must ensure all types are explicitly defined. For jsonb fields, use `jsonb("item_data").$type<DataTypeMap[keyof DataTypeMap]>()`; otherwise, you will encounter type compatibility issues.
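For context, a column defined that way in a Drizzle schema might look like the following sketch; the `DataTypeMap` union and the surrounding `messages` table columns are illustrative, not the project's actual schema.

```typescript
import { pgTable, uuid, text, timestamp, jsonb } from "drizzle-orm/pg-core";

// Illustrative payload map; the real DataTypeMap lives in the application code.
type DataTypeMap = {
  question: { model: string };
  answer: { tokenCount: number };
};

export const messages = pgTable("messages", {
  id: uuid("id").primaryKey().defaultRandom(),
  content: text("content").notNull(),
  // Explicit $type<> keeps the column from being inferred as `unknown`,
  // which is what causes the type compatibility issues mentioned above.
  itemData: jsonb("item_data").$type<DataTypeMap[keyof DataTypeMap]>(),
  createdAt: timestamp("created_at").defaultNow().notNull(),
});
```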
An Integration Guide to TanStack DB & ElectricSQL: Building Real-Time Apps with Optimistic Updates
Abstract
This document is an in-depth technical guide providing a clear, reusable example of how to combine TanStack DB and ElectricSQL to build modern web applications with real-time data synchronization, secure optimistic updates, and transactional consistency. We will explore the core architecture and best practices through a Q&A application centered around a `messages` table.
Architecture Overview
Our Q&A application architecture is divided into a client and a server, communicating via API calls and a real-time data stream. The interaction model of the core components is as follows:
```mermaid
graph TB
  subgraph "Client: App"
    Component[React Component] --> Action[TanStack DB Action]
    Action --> |Optimistic Update| Collection[DB Collection: Messages]
    Collection --> |Shape Request| Auth[Shape Auth Proxy]
  end
  subgraph "Server: API & Gatekeeper"
    Auth --> |Validated Request| ShapeProxy[Shape Proxy Endpoint]
    ShapeProxy --> Electric[ElectricSQL Service]
    Action --> |API Call: /questions/send| APIEndpoint[API Endpoint]
    APIEndpoint --> |DB Transaction| Postgres[PostgreSQL Database]
  end
  Postgres --> |Replication| Electric
  Electric --> |Live Data Stream| Collection
  style Auth fill:#f9f,stroke:#333,stroke-width:2px
  style APIEndpoint fill:#ccf,stroke:#333,stroke-width:2px
  style Action fill:#cfc,stroke:#333,stroke-width:2px
```
1. Foundation: The Shape Gatekeeper Security Model
To securely sync the database to the client in real-time, we cannot expose it directly. ElectricSQL uses the concept of "Shapes" to define subsets of data that a client can subscribe to. Our architecture adds a "Shape Gatekeeper" layer to validate and authorize these data subscription requests.
1.1. Dual JWT Authentication
Our authentication system employs a dual JWT model for flexible and secure data access control:
- A standard user token (JWT) that authenticates the client's regular API requests and is exchanged for Shape Tokens.
- A short-lived Shape Token (JWT) that is scoped to the exact shape predicate (e.g. `WHERE conversation_id = 'conv-abc'`) it is allowed to access, thus enabling row-level security.
1.2. Step 1: Shape Authorization Endpoint (`/shapes/auth-token`)
The client exchanges tokens by calling the backend's `/shapes/auth-token` endpoint.
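A minimal sketch of such an endpoint, assuming Express and the `jsonwebtoken` package; the claim names, secrets, and five-minute lifetime are illustrative choices, not the article's actual backend.

```typescript
import express from "express";
import jwt from "jsonwebtoken";

const app = express();
app.use(express.json());

// Exchange a valid user token for a short-lived Shape Token scoped to a
// single shape (table + WHERE clause).
app.post("/shapes/auth-token", (req, res) => {
  const userToken = req.headers.authorization?.replace("Bearer ", "") ?? "";
  let user: { sub: string };
  try {
    user = jwt.verify(userToken, process.env.USER_JWT_SECRET!) as { sub: string };
  } catch {
    return res.status(401).json({ error: "Invalid user token" });
  }

  const { conversationId } = req.body;
  // TODO: verify that user.sub is actually a member of this conversation.

  const shapeToken = jwt.sign(
    {
      sub: user.sub,
      table: "messages",
      where: `conversation_id = '${conversationId}'`,
    },
    process.env.SHAPE_JWT_SECRET!,
    { expiresIn: "5m" } // short-lived by design; the client refreshes it on error
  );

  res.json({ shapeToken });
});

app.listen(3010);
```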
1.3. Step 2: Secure Shape Proxy Endpoint (`/shape`)
After obtaining a short-lived Shape Token, the client does not request the ElectricSQL service directly. Instead, it requests our custom `/shape` proxy endpoint. This endpoint's responsibility is to validate the Shape Token and securely forward the request to the actual ElectricSQL service.
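A simplified sketch of the proxy, under the same assumptions as above: it pins the table and WHERE clause to whatever the verified Shape Token authorizes and forwards Electric's own sync parameters (offset, handle, live, and so on) untouched. The `/v1/shape` upstream path follows Electric's HTTP API.

```typescript
import express from "express";
import jwt from "jsonwebtoken";

const ELECTRIC_URL = process.env.ELECTRIC_URL ?? "http://localhost:3000";
const app = express();

app.get("/shape", async (req, res) => {
  const token = req.headers.authorization?.replace("Bearer ", "") ?? "";
  let claims: { table: string; where: string };
  try {
    claims = jwt.verify(token, process.env.SHAPE_JWT_SECRET!) as {
      table: string;
      where: string;
    };
  } catch {
    // An expired Shape Token lands here; the client's onError handler
    // (section 1.5) reacts to the 401 by fetching a fresh token.
    return res.status(401).json({ error: "Invalid or expired shape token" });
  }

  // Forward the client's sync parameters, but never its shape definition.
  const upstream = new URL("/v1/shape", ELECTRIC_URL);
  for (const [key, value] of Object.entries(req.query)) {
    if (key !== "table" && key !== "where") {
      upstream.searchParams.set(key, String(value));
    }
  }
  upstream.searchParams.set("table", claims.table);
  upstream.searchParams.set("where", claims.where);

  const response = await fetch(upstream);
  res.status(response.status);
  response.headers.forEach((value, name) => {
    // fetch() already decompressed the body, so skip encoding-related headers.
    if (name !== "content-encoding" && name !== "content-length") {
      res.setHeader(name, value);
    }
  });
  res.send(Buffer.from(await response.arrayBuffer()));
});
```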
1.4. Client-side Collection Definition
On the client, we create a `createMessagesCollection` factory function that encapsulates all the logic for obtaining a Shape Token and configuring ElectricSQL's data synchronization.
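A sketch of what this factory can look like, assuming the `@tanstack/electric-db-collection` package's `electricCollectionOptions` helper; `fetchShapeToken`, `createShapeConfig`, and `messageSchema` are the helpers sketched in the appendix (1.6), and the exact option names may differ between versions.

```typescript
import { createCollection } from "@tanstack/react-db";
import { electricCollectionOptions } from "@tanstack/electric-db-collection";
import { messageSchema, type Message } from "./schema";       // assumed local schema module
import { fetchShapeToken, createShapeConfig } from "./shapes"; // sketched in section 1.6

export async function createMessagesCollection(conversationId: string) {
  // 1. Exchange the user's session for a short-lived Shape Token (section 1.2).
  const shapeToken = await fetchShapeToken(conversationId);

  // 2. Build the shape options: our /shape proxy URL, the auth header,
  //    and the onError recovery described in section 1.5.
  const shapeOptions = createShapeConfig({ conversationId, shapeToken });

  // 3. Wrap everything in a TanStack DB collection.
  return createCollection(
    electricCollectionOptions({
      id: `messages-${conversationId}`,
      schema: messageSchema,
      getKey: (message: Message) => message.id,
      shapeOptions,
    })
  );
}
```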
1.5. Deep Dive: The `createShapeConfig` Factory for Resilient Connections
The core value of `createShapeConfig` is its `onError` callback, which provides an elegant, automated connection recovery mechanism for handling expired Shape Tokens. The implementation details are provided in the appendix.
1.6. Appendix: Client-side Helper Function Implementations
To make this document self-contained, simplified versions of key helper functions are provided below.
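As an example, the two helpers used above might be implemented roughly as follows. The `onError` contract (returning replacement headers so the stream retries) reflects the automatic-recovery behaviour described in section 1.5; treat the exact shape of the error object as an assumption.

```typescript
// shapes.ts -- client-side helpers (sketch)

// Exchange the current user session for a short-lived Shape Token.
export async function fetchShapeToken(conversationId: string): Promise<string> {
  const res = await fetch("/shapes/auth-token", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    credentials: "include",
    body: JSON.stringify({ conversationId }),
  });
  if (!res.ok) throw new Error(`Failed to get shape token: ${res.status}`);
  const { shapeToken } = await res.json();
  return shapeToken;
}

// Build the shape options pointed at our /shape proxy. On an auth error
// (expired Shape Token), fetch a fresh token and retry with new headers.
export function createShapeConfig(opts: { conversationId: string; shapeToken: string }) {
  return {
    url: new URL("/shape", window.location.origin).toString(),
    headers: { Authorization: `Bearer ${opts.shapeToken}` },
    onError: async (error: { status?: number }) => {
      if (error.status === 401 || error.status === 403) {
        const fresh = await fetchShapeToken(opts.conversationId);
        // Returning new headers asks the shape stream to retry with them.
        return { headers: { Authorization: `Bearer ${fresh}` } };
      }
      throw error; // anything else is fatal for this stream
    },
  };
}
```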
2. Core: Atomic Actions & The `txid` Bridge
When a user submits a new question, we need to update the client UI, call the backend API, and ensure eventual data consistency. This is where TanStack DB's `createTransaction` and ElectricSQL's `txid` mechanism come into play.
2.1. Transactional Action (`sendQuestionAction`)
We encapsulate the operation of sending a new question into an atomic, asynchronous Action function.
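A sketch of the Action, assuming `createTransaction` from `@tanstack/react-db` with manual commit, the placeholder factories from section 2.1.1, and a `messagesCollection` created by the factory in section 1.4; the request body fields are illustrative.

```typescript
import { createTransaction } from "@tanstack/react-db";
import { messagesCollection } from "./collections";           // assumed shared collection instance
import { createQuestionPlaceholder, createAnswerPlaceholder } from "./placeholders";

export async function sendQuestionAction({ content }: { content: string }) {
  const question = createQuestionPlaceholder(content);
  const answer = createAnswerPlaceholder(question.id);

  const tx = createTransaction({
    autoCommit: false,
    // Runs on commit: persist on the server, then wait until ElectricSQL has
    // synced that exact transaction back into the local collection.
    mutationFn: async () => {
      const res = await fetch("/questions/send", {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({ content, questionId: question.id, answerId: answer.id }),
      });
      if (!res.ok) throw new Error("Failed to send question"); // rolls back the optimistic state
      const { txid } = await res.json();
      await messagesCollection.utils.awaitTxId(txid);
    },
  });

  // Optimistic update: the question and a "Generating answer..." placeholder
  // appear in the UI immediately.
  tx.mutate(() => {
    messagesCollection.insert(question);
    messagesCollection.insert(answer);
  });

  // Resolves only after the server's transaction has replicated back (section 2.3).
  await tx.commit();
}
```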
2.1.1. Appendix: Placeholder Object Creation
The `createQuestionPlaceholder` and `createAnswerPlaceholder` functions used in the optimistic update are simple factories for creating objects that match the local database's `Message` schema.
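For illustration, the factories might look like this; every field beyond `id` and `content` is an assumption about the `Message` schema.

```typescript
// Illustrative factories; the field names mirror an assumed `messages` schema
// (id, role, content, status, reply_to, created_at).
export function createQuestionPlaceholder(content: string) {
  return {
    id: crypto.randomUUID(),
    role: "user" as const,
    content,
    status: "complete" as const,
    reply_to: null as string | null,
    created_at: new Date().toISOString(),
  };
}

export function createAnswerPlaceholder(questionId: string) {
  return {
    id: crypto.randomUUID(),
    role: "assistant" as const,
    content: "Generating answer...", // shown until the real reply syncs in
    status: "pending" as const,
    reply_to: questionId,            // assumed linkage back to the question row
    created_at: new Date().toISOString(),
  };
}
```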
2.2. Backend API & `txid` Generation
The backend API's core responsibilities are:
- Persist the new question and the answer placeholder within a single database transaction.
- Capture the transaction ID (`txid`) from the database operation.
- Return the `txid` to the client.
txidThe
txidis the bridge between the client's action and the backend's data synchronization. In our backend'smessage.service.ts, this ID is obtained via a transaction helper function that wraps all database write operations. Its core relies on a built-in PostgreSQL function,pg_current_xact_id().The beauty of this design is that the
withTransactionmethod encapsulates the entire flow: "start transaction, get ID, execute operations, commit transaction." Any method needing to write to the database atomically and requiring atxidfor ElectricSQL sync can simply use this helper.2.3. How
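A sketch of such a helper using node-postgres directly (the article's service layer may wrap this differently, for example through Drizzle); the `pg_current_xact_id()::xid::text` cast matches the txid format ElectricSQL reports on replicated changes.

```typescript
import { Pool, type PoolClient } from "pg";

const pool = new Pool({ connectionString: process.env.DATABASE_URL });

// Run `fn` inside one transaction and return its result together with the
// PostgreSQL transaction ID of that transaction.
export async function withTransaction<T>(
  fn: (client: PoolClient) => Promise<T>
): Promise<{ result: T; txid: string }> {
  const client = await pool.connect();
  try {
    await client.query("BEGIN");
    // pg_current_xact_id() assigns (if needed) and returns the current
    // transaction's ID; cast through ::xid::text for Electric compatibility.
    const { rows } = await client.query(
      "SELECT pg_current_xact_id()::xid::text AS txid"
    );
    const result = await fn(client);
    await client.query("COMMIT");
    return { result, txid: rows[0].txid };
  } catch (err) {
    await client.query("ROLLBACK");
    throw err;
  } finally {
    client.release();
  }
}
```

A `/questions/send` handler can then perform both inserts through this helper and hand the resulting `txid` back to the client (column names here are assumptions):

```typescript
// Assuming the same Express `app` as in the section 1 sketches.
app.post("/questions/send", async (req, res) => {
  const { content, questionId, answerId } = req.body;
  const { txid } = await withTransaction(async (client) => {
    await client.query(
      "INSERT INTO messages (id, role, content, status) VALUES ($1, 'user', $2, 'complete')",
      [questionId, content]
    );
    await client.query(
      "INSERT INTO messages (id, role, content, status) VALUES ($1, 'assistant', 'Generating answer...', 'pending')",
      [answerId]
    );
  });
  res.json({ success: true, txid });
});
```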
2.3. How `awaitTxId` Works
The `txid` is the key that bridges the gap between the frontend's optimistic update and the backend's true data. The `await collection.utils.awaitTxId(txid)` call works as follows:
- Every transaction that ElectricSQL replicates to the client carries its originating `txid`.
- The collection watches the incoming sync stream for the `txid` it is waiting for; when it sees it, it knows that this specific transaction has been successfully synced from the server to the client's local database. At this point, the `awaitTxId` promise is resolved.
This mechanism elegantly guarantees that when the `commit()` function returns, our local database not only contains the optimistic placeholders but that these placeholders have already been overwritten by the persisted, true data from the server.
3. The Payoff: Real-time UI with `useLiveQuery`
With a secure data channel and reliable data operations in place, we can now reap the benefits in the UI layer. TanStack DB's `useLiveQuery` hook allows us to effortlessly subscribe to changes in a Collection. The latest version of TanStack DB now supports loading states, making it easier to handle initial data synchronization and empty states.
When a user submits a new question, `sendQuestionAction`'s `transaction.mutate()` immediately inserts the placeholders into the local database. `useLiveQuery` detects this change, and the `QuestionAnswerList` component re-renders instantly, showing the new question and "Generating answer...", achieving a perfect optimistic update. When the backend's answer generation task completes and updates the database, ElectricSQL syncs the change back to the client. `useLiveQuery` triggers another re-render, updating the answer placeholder with the real response.
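For example, a minimal `QuestionAnswerList` might look like this sketch; the query-builder calls and the Message fields follow the patterns above and may need adjusting to the exact TanStack DB version.

```tsx
import { useLiveQuery } from "@tanstack/react-db";
import { messagesCollection } from "./collections"; // assumed, from createMessagesCollection

export function QuestionAnswerList() {
  // Re-renders on every collection change: optimistic inserts, the
  // server-confirmed rows, and the final generated answer all flow through here.
  const { data: messages, isLoading } = useLiveQuery((q) =>
    q
      .from({ message: messagesCollection })
      .orderBy(({ message }) => message.created_at, "asc")
  );

  if (isLoading) return <p>Loading conversation...</p>;
  if (!messages?.length) return <p>No messages yet.</p>;

  return (
    <ul>
      {messages.map((message) => (
        <li key={message.id}>
          <strong>{message.role}:</strong> {message.content}
        </li>
      ))}
    </ul>
  );
}
```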
4. End-to-End Flow
Let's summarize the entire process with a sequence diagram:
```mermaid
sequenceDiagram
  participant User
  participant Component as React Component
  participant Action as sendQuestionAction
  participant LocalDB as TanStack/Electric DB
  participant API as Backend API
  participant Postgres
  participant Electric as ElectricSQL Replication
  User->>Component: Clicks "Send" button
  Component->>Action: sendQuestionAction({ content: 'Hello!' })
  Action->>LocalDB: Optimistically insert Question & Answer placeholders
  Note right of LocalDB: UI instantly shows question and "Generating answer..."
  Action->>API: POST /questions/send
  API->>Postgres: BEGIN TRANSACTION
  API->>Postgres: INSERT question message
  API->>Postgres: INSERT answer placeholder
  API->>Postgres: COMMIT
  Postgres-->>API: Returns txid
  API-->>Action: Returns { success: true, txid: '...' }
  Action->>LocalDB: await collection.utils.awaitTxId(txid)
  Note right of LocalDB: Action pauses, listening for replication
  Postgres-->>Electric: Replicates committed transaction
  Electric-->>LocalDB: Pushes changes to client
  LocalDB-->>Action: awaitTxId promise resolves
  par Background Answer Generation
    API->>API: Generate Answer
    API->>Postgres: UPDATE answer message SET content = '...'
  end
  Postgres-->>Electric: Replicates answer
  Electric-->>LocalDB: Pushes final reply to client
  Note right of LocalDB: UI automatically updates to show the final reply
```
With this architecture, we have successfully built a powerful real-time application in a declarative, fault-tolerant, and efficient manner. This pattern elegantly encapsulates the complexities of UI updates, data persistence, and real-time synchronization, allowing developers to focus on implementing business logic.
5. Troubleshooting
5.1. JSONB Fields and TypeScript Types
When working with PostgreSQL JSONB fields in ElectricSQL, you'll encounter some important limitations and performance considerations:
Performance Impact
Best Practice: Use JSONB only for supplementary data that is:
TypeScript Type Issues
ElectricSQL's client SDK converts all JSONB fields to the generic `Json` type, which loses all type information. There is currently no automatic way to preserve types, requiring manual type assertions.
Problem Example:
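A representative sketch of the problem, with a hypothetical `item_data` JSONB column:

```typescript
import type { Message } from "./schema"; // row type as synced by Electric

// The payload we actually store in the `item_data` JSONB column.
interface QuestionData {
  model: string;
  temperature: number;
}

function readModel(message: Message): string {
  // `message.item_data` arrives typed as the generic `Json` type, so assigning
  // it to the real payload type fails to compile even though the runtime
  // value is fine: "Type 'Json' is not assignable to type 'QuestionData'."
  // @ts-expect-error -- illustrates the problem
  const data: QuestionData = message.item_data;
  return data.model;
}
```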
Solution - Transform Functions with Type Assertions:
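One way to handle this, sketched here, is to narrow the `Json` value once, in a transform function, rather than scattering casts through the UI:

```typescript
import type { Message } from "./schema";

interface QuestionData {
  model: string;
  temperature: number;
}

// A typed view over the raw row: the JSONB payload is asserted exactly once.
export type TypedMessage = Omit<Message, "item_data"> & { item_data: QuestionData };

export function toTypedMessage(message: Message): TypedMessage {
  return {
    ...message,
    item_data: message.item_data as unknown as QuestionData,
  };
}
```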
Recommended Approach:
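A sketch of a collection factory consistent with the key points below; the `_createCollection` and cache names come from those points, while the shape wiring and schema are placeholders.

```typescript
import { createCollection } from "@tanstack/react-db";
import { electricCollectionOptions } from "@tanstack/electric-db-collection";
import { messageSchema, type Message } from "./schema";
import { createShapeConfig, fetchShapeToken } from "./shapes"; // section 1.6 sketches

// Private factory: builds the Electric-backed collection for one conversation.
async function _createCollection(conversationId: string) {
  const shapeToken = await fetchShapeToken(conversationId);
  return createCollection(
    electricCollectionOptions({
      id: `messages-${conversationId}`,
      schema: messageSchema,
      getKey: (message: Message) => message.id,
      shapeOptions: createShapeConfig({ conversationId, shapeToken }),
    })
  );
}

// Module-level cache keyed by conversation. Deriving the value type from the
// factory (ReturnType<typeof _createCollection>) keeps it in sync automatically.
const cache = new Map<string, ReturnType<typeof _createCollection>>();

export function getMessagesCollection(conversationId: string) {
  let collection = cache.get(conversationId);
  if (!collection) {
    // No useMemo needed; the cache lives outside React.
    collection = _createCollection(conversationId);
    cache.set(conversationId, collection);
  }
  return collection;
}
```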
Key Points:
- Use `_createCollection` internally.
- Use `ReturnType<typeof _createCollection>` for cache types.
- Don't wrap in `useMemo`: Collections already handle their own caching.
5.2. Simplified Authentication with Subqueries
ElectricSQL now supports subqueries (electric-sql/electric#2931), which can significantly simplify your authentication and authorization logic.
Before (Complex Permission Checks):
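Roughly, the pre-subquery approach forces the gatekeeper to resolve permissions itself and inline the result into the shape's WHERE clause; a sketch with hypothetical helper names:

```typescript
import { getConversationIdsForUser, signShapeToken } from "./auth"; // hypothetical helpers

export async function authorizeMessagesShape(userId: string): Promise<string> {
  // Extra query plus application logic just to enumerate what the user may read,
  // and the resulting token goes stale whenever membership changes.
  const allowedIds = await getConversationIdsForUser(userId);
  const where = `conversation_id IN (${allowedIds.map((id) => `'${id}'`).join(", ")})`;
  return signShapeToken({ table: "messages", where });
}
```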
After (Using Subqueries):
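With subqueries in the shape's WHERE clause, the same authorization can be expressed directly against the database; the `conversation_members` table and its columns are assumptions:

```typescript
import { signShapeToken } from "./auth"; // hypothetical helper

export function authorizeMessagesShape(userId: string): string {
  // The database evaluates membership as part of the shape itself; userId comes
  // from a verified JWT, so interpolating it here is acceptable for the sketch.
  const where = `conversation_id IN (
    SELECT conversation_id FROM conversation_members WHERE user_id = '${userId}'
  )`;
  return signShapeToken({ table: "messages", where });
}
```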
This approach moves the permission check into the shape definition itself, removing the need for the gatekeeper to enumerate permitted rows in application code.