
# Model Loader

**Concept:** A transparent data-source abstraction that checks memory first, then local storage, then falls back to the network.

## The Pattern

When you access a model or relationship, Gluonic's Model Loader tries multiple sources in order:

```
Access model by ID
  1. Check ObjectPool (memory)           → Instant!
  2. Try local storage (Drizzle adapter) → Fast (~10ms)
  3. Fall back to network (API)          → Slower (~100-500ms)
  4. Cache in storage for next time
Return to component
```

You never know where data comes from - and you don’t need to!
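The waterfall above can be sketched as a small helper. This is an illustrative model only, not Gluonic's actual API: `Source`, `loadRow`, and the `cache` callback are hypothetical names standing in for the pool, storage, and network tiers.

```typescript
// Hypothetical three-tier fallback: try each source in order,
// and backfill faster tiers on a hit so the next access is cheaper.
type Row = { type: string; id: string; [key: string]: unknown }

interface Source {
  name: 'pool' | 'storage' | 'network'
  get(type: string, id: string): Promise<Row | undefined>
}

async function loadRow(
  sources: Source[],
  type: string,
  id: string,
  cache: (row: Row) => void,
): Promise<Row | undefined> {
  for (const source of sources) {
    const row = await source.get(type, id)
    if (row) {
      if (source.name !== 'pool') cache(row) // cache the hit for next time
      return row
    }
  }
  return undefined // not found anywhere
}
```

Calling code just awaits `loadRow` and never learns which tier answered, which is exactly the transparency property described above.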

## How It Works

### Step 1: Check ObjectPool

```ts
// Best case: Data already in memory
const row = store.pool.get(type, id)
if (row) {
  return bridge.getModel(type, id) // Instant! < 1ms
}
```

If data is in the ObjectPool, return immediately. No I/O needed.

### Step 2: Try Local Storage

```ts
// Not in pool: Try local storage
const row = await storage.getRow(type, id)
if (row) {
  store.pool.upsert(row)           // Add to pool
  return bridge.getModel(type, id) // Fast! ~10ms
}
```

If data is in local storage (from previous sync), load it. Much faster than network.

### Step 3: Fall Back to Network

```ts
// Not in storage: Load from network
const res = await fetch(`/sync/v1/fetch?type=${type}&ids=${id}`)
const rows = await res.json()

// Cache everywhere
store.pool.upsert(rows[0])    // Add to pool
await storage.putRow(rows[0]) // Add to storage

return bridge.getModel(type, id) // Slower ~100-500ms
```

If data isn’t local, fetch from network and cache it.

## Transparent to Calling Code

The beauty of Model Loader is that it's completely transparent:

```tsx
const IssueDetail = observer(({ issueId }) => {
  const issue = useModel<Issue>('issue', issueId)

  // Same code regardless of where data comes from:
  return <div>{issue.title}</div>
})

// Scenario 1: Issue in pool → Renders instantly
// Scenario 2: Issue in storage → Renders in ~10ms
// Scenario 3: Issue needs network → Renders in ~500ms
// Code doesn't change!
```

## Why This Enables the Synchronous API

Model Loader is what allows Gluonic's synchronous API to work:

```ts
// You write synchronous code
const author = issue.author.value

// Behind the scenes, Model Loader:
// 1. Checks pool (instant)
// 2. Tries storage (fast)
// 3. Falls back to network (slower)

// You don't care where it comes from
// You just get the data (or undefined initially)
```

Without Model Loader, you'd need a different API for each source:

```ts
// ❌ Without Model Loader - complex
const author =
  getFromPool(authorId) ||
  await getFromStorage(authorId) ||
  await getFromNetwork(authorId)

// ✅ With Model Loader - simple
const author = issue.author.value
```

## Caching Behavior

### First Access

```
Request:    issue.author.value
ObjectPool: Miss
Storage:    Miss
Network:    Fetch
Cache:      Pool ✓, Storage ✓
Return:     User object
```

### Second Access

```
Request:    issue.author.value
ObjectPool: Hit! ✓
Return:     User object (instant!)
```

### After App Restart

```
Request:    issue.author.value
ObjectPool: Miss (fresh start)
Storage:    Hit! ✓
Cache:      Pool ✓
Return:     User object (~10ms)
```

Network not needed - data persisted from last session!

## Deduplication

Model Loader ensures that the same request only happens once:

```tsx
// Two components access the same model simultaneously
<ComponentA issue={issue} /> // Kicks off the load
<ComponentB issue={issue} /> // Reuses the in-flight promise

// Behind the scenes:
const pendingLoads = new Map<string, Promise<Model>>()

async ensureLoaded(type: string, id: string) {
  const key = `${type}:${id}`

  // Check if already loading
  if (pendingLoads.has(key)) {
    return pendingLoads.get(key) // Reuse! ✓
  }

  // Start a new load; drop the entry once it settles
  const promise = this.load(type, id).finally(() => pendingLoads.delete(key))
  pendingLoads.set(key, promise)
  return promise
}
```

Only 1 network request, even if 10 components need the same data.
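The pattern above can be made runnable with a small sketch. The names here are illustrative, not Gluonic's internals: `fakeFetch` stands in for the real load, and a counter makes the deduplication observable.

```typescript
// Concurrent callers for the same key share one in-flight promise.
const pendingLoads = new Map<string, Promise<string>>()
let fetchCount = 0 // counts underlying loads, to observe deduplication

async function fakeFetch(key: string): Promise<string> {
  fetchCount++ // runs once per *actual* load, not per caller
  return `data-for-${key}`
}

function ensureLoaded(type: string, id: string): Promise<string> {
  const key = `${type}:${id}`
  const pending = pendingLoads.get(key)
  if (pending) return pending // reuse the in-flight load

  // Start a new load and clear the entry once it settles
  const promise = fakeFetch(key).finally(() => pendingLoads.delete(key))
  pendingLoads.set(key, promise)
  return promise
}
```

Because the pending map is checked synchronously before any I/O starts, every caller that arrives while the load is in flight gets the same promise back.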

## Bulk Loading

Model Loader can load multiple models efficiently:

```ts
// Load multiple models at once
await store.ensureRowsByIds('issue', ['1', '2', '3', '4', '5'])

// Behind the scenes:
// 1. Check pool for each ID
// 2. Check storage for missing IDs (1 query)
// 3. Fetch from network for still-missing IDs (1 request)
// 4. Cache all results

// More efficient than individual loads
```
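The partitioning step behind a bulk load can be sketched as a pure function. `partitionIds` is a hypothetical helper, not the real `ensureRowsByIds` internals:

```typescript
// Split requested IDs by the fastest tier that can serve them.
function partitionIds(
  requested: string[],
  inPool: Set<string>,
  inStorage: Set<string>,
): { pooled: string[]; fromStorage: string[]; fromNetwork: string[] } {
  const pooled = requested.filter(id => inPool.has(id))
  const missing = requested.filter(id => !inPool.has(id))
  return {
    pooled,                                               // already in memory
    fromStorage: missing.filter(id => inStorage.has(id)), // one storage query
    fromNetwork: missing.filter(id => !inStorage.has(id)) // one network request
  }
}
```

Everything found locally is served without touching the network; only the final bucket costs a round trip, and it is batched into a single request.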

## Integration with Lazy Loading

Model Loader powers lazy loading:

```ts
class LazyReference<T> {
  get value(): T | undefined {
    if (!this._hydrated) {
      // Use Model Loader to load the related object
      void bridge.ensureLoaded(type, this.foreignKeyValue)
    }
    return this._value
  }
}
```

When you access `.value`, `LazyReference` uses Model Loader to fetch the related object, trying pool, then storage, then network.
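The hydration flow can be sketched with a tiny stand-in class. `LazySketch` and its `load` callback are hypothetical and non-reactive; the real `LazyReference` also triggers a re-render when the value arrives.

```typescript
// First access returns undefined and kicks off the load;
// later accesses return the hydrated value.
class LazySketch<T> {
  private _value: T | undefined
  private _hydrated = false

  constructor(private load: () => Promise<T>) {}

  get value(): T | undefined {
    if (!this._hydrated) {
      this._hydrated = true // only kick off the load once
      void this.load().then(v => { this._value = v })
    }
    return this._value
  }
}
```

This is why the docs say you "get the data (or undefined initially)": the getter never blocks, it just starts the load and reports what it has right now.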

## Example: Multi-Level Loading

```tsx
const IssueDetail = observer(({ issueId }) => {
  const issue = useModel<Issue>('issue', issueId)

  return (
    <div>
      <h1>{issue.title}</h1>
      {/* Level 1: Issue from pool/storage */}

      <p>Created by: {issue.creator.value?.name}</p>
      {/* Level 2: Creator from pool/storage/network */}

      <div>
        Comments:
        {issue.comments.map(comment => (
          <div key={comment.id}>
            {comment.text}
            {/* Level 3: Each comment from pool/storage/network */}
            <span>by {comment.author.value?.name}</span>
            {/* Level 4: Each author from pool/storage/network */}
          </div>
        ))}
      </div>
    </div>
  )
})
```

Each level uses Model Loader independently - all transparent to your code.

## Benefits

### 1. Transparent Data Source

```ts
// Same API whether data is:
// - In memory (instant)
// - In storage (fast)
// - On network (slower)
const user = issue.assignee.value
// You don't know, you don't care
```

### 2. Optimal Performance

```ts
// Always chooses fastest available source
// Caches aggressively
// Subsequent access is instant
```

### 3. Offline Support

```ts
// Offline: Falls back to storage gracefully
// Data that's been synced before is available
// New data waits for connection
```

### 4. Developer Experience

```ts
// No branching logic needed
// No manual cache management
// Just access data synchronously
```

**Key Insight:** Model Loader is why you can write `issue.author.value` and not worry about whether the author is in memory, in storage, or needs to be fetched from the network. It abstracts the data source away completely.

## Implementation Detail

Model Loader is implemented in `IdentityMap.ensureLoaded()`:

```ts
class IdentityMap {
  async ensureLoaded<T>(type: string, id: string): Promise<T> {
    // Check pool
    if (store.pool.get(type, id)) {
      return this.getModel(type, id)
    }

    // Load via store (tries storage → network)
    await store.ensureRowsByIds(type, [id])
    return this.getModel(type, id)
  }
}
```

`LazyReference` and `LazyCollection` use this under the hood.
