Quickstart¶
Get CIUF running against your PostgreSQL database in under 5 minutes.
Install¶
# With PostgreSQL driver
pip install "ciuf[postgres]"
# SQLite only (no postgres driver)
pip install ciuf
Requires Python 3.11+.
Connect¶
When the engine is created, CIUF inspects your database schema and creates a TableNode object for each table. No data is loaded yet.
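A minimal connection, using the same `Engine` constructor as the examples below (the DSN is a placeholder — substitute your own credentials):

```python
from ciuf import Engine

# Connecting triggers schema introspection: one TableNode per table,
# but no table data is read yet.
engine = Engine("postgresql://user:pass@localhost/mydb")
```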
Run your first cached query¶
result = engine.query("""
    SELECT orders.id, orders.amount, customers.name, customers.plan
    FROM orders
    JOIN customers ON orders.customer_id = customers.id
    WHERE customers.plan = 'pro'
    ORDER BY orders.created_at DESC
    LIMIT 100
""")
# result is a pandas DataFrame
print(result.head())
The first call reads from PostgreSQL and populates the cache. Every subsequent call with the same query pattern returns from memory — no database roundtrip.
Keep the cache in sync¶
Tell CIUF about every write to the database:
# After INSERT
engine.on_insert("orders", {
    "id": 12345,
    "amount": 99.0,
    "customer_id": 42,
    "status": "completed",
    "created_at": "2026-05-09T12:00:00",
})
# After UPDATE
engine.on_update("customers",
    old={"id": 42, "plan": "free", "name": "Acme Corp"},
    new={"id": 42, "plan": "pro", "name": "Acme Corp"},
)
# After DELETE
engine.on_delete("orders", {"id": 12345})
CIUF propagates only the delta — no full table recompute.
SQLAlchemy integration¶
If you use the SQLAlchemy ORM, hook into the session's `after_flush` event to forward inserts and deletes:
from sqlalchemy import event
from sqlalchemy.orm import Session
from ciuf import Engine
ciuf_engine = Engine("postgresql://user:pass@localhost/mydb")
@event.listens_for(Session, "after_flush")
def sync_ciuf(session, flush_context):
    for obj in session.new:
        ciuf_engine.on_insert(obj.__tablename__, {
            c.key: getattr(obj, c.key)
            for c in obj.__mapper__.column_attrs
        })
    for obj in session.deleted:
        ciuf_engine.on_delete(obj.__tablename__, {
            c.key: getattr(obj, c.key)
            for c in obj.__mapper__.column_attrs
        })
FastAPI example¶
from fastapi import FastAPI
from ciuf import Engine
app = FastAPI()
ciuf_engine = Engine("postgresql://user:pass@localhost/mydb")
@app.get("/dashboard/{user_id}")
async def dashboard(user_id: int):
    # user_id is validated as an int by FastAPI, so interpolating it
    # into the query string here is safe
    df = ciuf_engine.query(f"""
        SELECT orders.id, orders.amount, products.category
        FROM orders
        JOIN products ON orders.product_id = products.id
        WHERE orders.customer_id = {user_id}
        AND orders.status = 'completed'
        ORDER BY orders.created_at DESC
        LIMIT 100
    """)
    return df.to_dict(orient="records")
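Write endpoints in the same app should notify CIUF after persisting, as in the cache-sync section above. A sketch — the `POST /orders` route and payload shape are illustrative, and the actual database write uses whatever driver you already have:

```python
@app.post("/orders")
async def create_order(order: dict):
    # ... write the row to PostgreSQL with your own driver here ...
    # then tell CIUF so cached results containing this row are updated
    ciuf_engine.on_insert("orders", order)
    return {"ok": True}
```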
The first request per query pattern hits PostgreSQL. Every subsequent request returns from the in-memory cache.
Configuration¶
engine = Engine(
    "postgresql://user:pass@localhost/mydb",
    max_memory_mb=512,  # LRU eviction threshold (default: 256 MB)
    ttl_seconds=3600,   # cache TTL per result (default: no expiry)
)
Next steps¶
- Architecture overview — how the DAG engine works
- GitHub Discussions — questions and feedback
- Open an issue — bug reports and feature requests