improvements(ai): Improve AI streaming UI/UX interactions + better separation of AI provider responsibilities #2039
The initial implementation of the personal finance AI in #2022 scaffolded out a working domain for interacting with a generic, external "LLM" and providing chat responses to a user. This PR offers some improvements to the domain and stronger boundaries between the "LLM Provider" and the Maybe app.
Orchestrating a chat between an LLM and the Maybe app involves several core responsibilities: parsing and normalizing the raw LLM stream, fulfilling tool calls, persisting chat state, and broadcasting UI updates.
Previously, much of this "orchestration" logic lived inside the provider. This PR cleans up those domain boundaries, moving the orchestration of the chat to `Assistant` and leaving the provider solely responsible for parsing, normalizing, and emitting stream events to the `Assistant`:

- `Provider::LlmConcept` - defines a generic interface that all LLM providers must implement, including the (streamable) `chat_response` method
- `Provider::Openai` - our first concrete implementation, which streams the LLM responses
- `Assistant` - listens to lifecycle events emitted from the responder, persists chat state, and broadcasts chat updates
- `Assistant::Configurable` - controls the configuration of the assistant, including developer instructions and callable functions
- `Assistant::Responder` - orchestrates a single chat response from the generic LLM, including follow-up responses for tool calls
- `Assistant::FunctionToolCaller` - fulfills LLM function requests and normalizes them into `ToolCall::Function` domain objects
- `Assistant::Function` - the base class for a callable assistant function
- `Assistant::Broadcastable` - handles chat UI updates via Turbo Streams
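To illustrate the boundary, here is a minimal sketch of the provider/assistant split. The `StreamEvent` struct, the `FakeProvider` class, and the specific event kinds (`:output_text`, `:response`) are assumptions for illustration only, not the actual Maybe implementation:

```ruby
# Generic interface every LLM provider must implement (sketch of
# Provider::LlmConcept). Concrete providers override chat_response
# and yield normalized stream events to the caller.
module Provider
  module LlmConcept
    def chat_response(prompt, &block)
      raise NotImplementedError, "#{self.class} must implement chat_response"
    end
  end
end

# Normalized event emitted by a provider; the Assistant never sees
# provider-specific payloads. (Hypothetical shape.)
StreamEvent = Struct.new(:type, :data)

# Toy concrete provider standing in for Provider::Openai. A real
# provider would parse the raw API stream and normalize each chunk
# before yielding it upward.
class FakeProvider
  include Provider::LlmConcept

  def chat_response(prompt)
    ["Hello", ", ", "world"].each do |chunk|
      yield StreamEvent.new(:output_text, chunk)
    end
    yield StreamEvent.new(:response, "Hello, world")
  end
end

# Minimal responder (sketch of Assistant::Responder): orchestrates a
# single response, delegating streaming to the provider and reacting
# only to the normalized lifecycle events.
class Responder
  def initialize(provider)
    @provider = provider
  end

  def respond_to(prompt)
    buffer = +""
    @provider.chat_response(prompt) do |event|
      case event.type
      when :output_text then buffer << event.data # incremental UI update would go here
      when :response    then return event.data    # final message to persist/broadcast
      end
    end
    buffer
  end
end

Responder.new(FakeProvider.new).respond_to("Hi") # => "Hello, world"
```

Because the provider only emits normalized events, swapping `Provider::Openai` for another backend should leave the `Assistant` side untouched.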