StreamingResponse wraps an async generator and writes each yielded chunk to the socket as it arrives. Chunks may be bytes or str; Starlette encodes str chunks to UTF-8 bytes before sending them. For server-sent events, each chunk must use the `data: ...\n\n` framing, and the response should declare the `text/event-stream` media type:
from fastapi.responses import StreamingResponse

async def generate():
    async for chunk in llm.stream(prompt):
        # SSE framing: each event is "data: <payload>\n\n"
        yield f"data: {chunk}\n\n"

return StreamingResponse(generate(), media_type="text/event-stream")
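To see what the response machinery does with the generator, here is a minimal, framework-free sketch. It drains an async generator and encodes str chunks to bytes, roughly what StreamingResponse does internally; the token list and the `drain` helper are hypothetical stand-ins, not FastAPI APIs.

```python
import asyncio

async def generate():
    # Hypothetical stand-in for an LLM token stream.
    for chunk in ["Hello", ", ", "world"]:
        yield f"data: {chunk}\n\n"

async def drain(gen):
    # Roughly what StreamingResponse does with each chunk:
    # encode str to UTF-8 bytes, pass bytes through unchanged.
    body = b""
    async for chunk in gen:
        body += chunk.encode("utf-8") if isinstance(chunk, str) else chunk
    return body

body = asyncio.run(drain(generate()))
# body holds three SSE-framed events, e.g. b"data: Hello\n\n..."
```

Because the generator is only consumed as the client reads, backpressure comes for free: a slow client pauses the `async for` loop rather than buffering the whole response in memory.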