Docs for persistence, like what the AI SDK has? #201
Replies: 3 comments
-
@AlemTuzlak Do you have any thoughts on this? The persistence features around LangGraph are incredibly well designed and really impressive. That said, I don't entirely resonate with their approach. It makes me think how great it would be if TanStack AI supported persistence and checkpointing features like this!
-
Honestly, some sort of recommendation around persistence would be extremely helpful. I'm currently passing along a conversation ID as context and using tee to split the streamed response, aggregating one branch and storing it. I don't mind the stateless approach this library takes, but almost everyone will need persistence, and currently the only obvious hook is a finish handler on the front end.
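For reference, the tee-based setup described above might look roughly like this (a sketch, not any library's API: it assumes the model reply arrives as a Web `ReadableStream` of text chunks, and `save` is a hypothetical persistence callback keyed by the conversation ID):

```typescript
// Sketch of the tee-based approach: split the response stream so one
// branch goes to the client while the other is aggregated and stored.
// `save` is a hypothetical persistence function, not a library API.
export function streamAndPersist(
  reply: ReadableStream<string>,
  conversationId: string,
  save: (id: string, text: string) => Promise<void>,
): ReadableStream<string> {
  // tee() yields two independent branches of the same stream.
  const [toClient, toStore] = reply.tee()

  // Aggregate the second branch and persist the full reply. Note this
  // promise is detached from the response lifetime.
  void (async () => {
    let full = ''
    for await (const chunk of toStore) full += chunk
    await save(conversationId, full)
  })()

  return toClient
}
```

One caveat of this shape is that the `save` call runs as a detached promise, so the HTTP response can finish before persistence does.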
-
This is what I've been using:

```typescript
/**
 * Yields every item from source transparently while collecting them.
 * After the source exhausts, calls `onComplete` with all collected items
 * within the iteration lifecycle, before signaling done.
 *
 * This ensures the consumer (e.g., an HTTP response stream) stays open
 * until `onComplete` finishes, binding post-stream work to the
 * response lifetime rather than leaving it as a detached promise.
 */
export async function* withOnComplete<T>(
  source: AsyncIterable<T>,
  onComplete: (items: T[]) => Promise<void>,
): AsyncGenerator<T> {
  const items: T[] = []
  for await (const item of source) {
    items.push(item)
    yield item
  }
  await onComplete(items)
}
```

…and just wrap the stream with it.
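As an illustration, wrapping a token stream with that helper might look like this (the helper is repeated so the snippet stands alone; `tokenStream` and `saveMessage` are hypothetical stand-ins, not part of any library):

```typescript
// `withOnComplete` repeated from above so this snippet runs standalone.
async function* withOnComplete<T>(
  source: AsyncIterable<T>,
  onComplete: (items: T[]) => Promise<void>,
): AsyncGenerator<T> {
  const items: T[] = []
  for await (const item of source) {
    items.push(item)
    yield item
  }
  await onComplete(items)
}

// A toy token stream standing in for a model response.
async function* tokenStream(): AsyncGenerator<string> {
  yield 'Hello, '
  yield 'world!'
}

// Hypothetical persistence function; swap in a real store.
const persisted: string[] = []
const saveMessage = async (text: string) => {
  persisted.push(text)
}

// Tokens flow through untouched; `saveMessage` runs with the
// aggregated text before the iterator signals done.
for await (const token of withOnComplete(tokenStream(), (tokens) =>
  saveMessage(tokens.join('')),
)) {
  process.stdout.write(token)
}
// persisted is now ['Hello, world!']
```

Because `onComplete` is awaited inside the generator, the consuming response stream stays open until persistence finishes.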
-
https://github.com/vercel-labs/ai-sdk-persistence-db/tree/main