Currently, there are two types of messages: the regular `message` and `lazyMessage`. A `lazyMessage` waits until its content has finished loading before displaying it to the user. However, this approach has a drawback: most large language model (LLM) APIs support a streaming mode that returns text progressively as the model generates it.
It is therefore recommended to add a new message type that supports real-time updates. When an LLM API is used in streaming mode, text arrives incrementally as the model generates it, giving a user experience similar to official LLM web interfaces, where content appears progressively. This is crucial for user experience, especially when generating longer texts, since users can see initial results without having to wait for the full response.
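The streaming flow described above could be sketched roughly as follows (Python pseudocode for illustration only, since the control itself is C#/JavaScript; `stream_tokens`, `stream_message`, and `on_update` are hypothetical names, and `stream_tokens` merely simulates what a streaming LLM API would return over the network):

```python
def stream_tokens(text, chunk_size=8):
    """Simulate a streaming LLM API by yielding the response in small chunks."""
    for i in range(0, len(text), chunk_size):
        yield text[i:i + chunk_size]

def stream_message(token_source, on_update):
    """Accumulate streamed chunks, pushing each partial state to the UI.

    on_update would set the text of a hypothetical streaming message type,
    so the user sees the reply grow as the model generates it."""
    buffer = ""
    for chunk in token_source:
        buffer += chunk
        on_update(buffer)  # partial render after every chunk
    return buffer

# Each entry in `updates` is one progressively longer partial message.
updates = []
final = stream_message(
    stream_tokens("Streaming lets users read while the model is still generating."),
    updates.append,
)
```

The key design point is that the UI is updated once per chunk rather than once at the end, which is exactly what `lazyMessage` cannot do today.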
Another issue is that most LLMs return content in Markdown format, but the current `message` component does not support rendering Markdown.
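To illustrate what Markdown rendering would involve, here is a toy converter for a tiny Markdown subset (bold and inline code only); this is purely illustrative, and a real control would delegate to a full Markdown library rather than regexes:

```python
import re

def render_markdown_subset(text):
    """Convert a tiny Markdown subset (**bold** and `code`) to HTML.

    Illustrative only: real Markdown has many more constructs
    (headings, lists, fenced code blocks, links, escaping, ...)."""
    text = re.sub(r"\*\*(.+?)\*\*", r"<strong>\1</strong>", text)
    text = re.sub(r"`(.+?)`", r"<code>\1</code>", text)
    return "<p>" + text + "</p>"

html = render_markdown_subset("Use **stream** mode via the `message` API.")
```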
a loyal fan of Wisej.NET/Winform/WPF
I believe the extension is on GitHub as an open-source control. My guess is that a PR with the changes would be appreciated. Alternatively, you could use any available JavaScript chat control.
