Suggestions for Chat Control Extension

Currently, there are two message types: a regular `message` and a `lazyMessage`. A `lazyMessage` waits until its content has finished loading before displaying it to the user. This approach has a drawback, however: most large language model (LLM) APIs support a streaming mode that returns text incrementally as the model generates it.

It is therefore recommended to add a new message type that supports real-time updates. When an LLM API is used in streaming mode, text arrives piece by piece while the model is still generating, giving a user experience similar to the official LLM web interfaces, where content appears progressively. This is crucial for longer responses in particular, because users can see the first part of the answer immediately instead of waiting for the whole reply to finish.
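
As an illustration only, here is a minimal TypeScript sketch of how such a streaming message could be fed. The `/api/llm/stream` endpoint and the `appendToLastMessage` callback are hypothetical placeholders; the streaming pattern itself relies only on the standard `fetch`/`ReadableStream` browser APIs.

```typescript
// Minimal sketch: read a streamed LLM response and push each chunk into the
// current chat bubble as it arrives, instead of waiting for the full reply.
async function streamCompletion(
  prompt: string,
  appendToLastMessage: (chunk: string) => void // hypothetical helper on the chat control
): Promise<void> {
  const response = await fetch("/api/llm/stream", { // hypothetical streaming endpoint
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ prompt }),
  });

  if (!response.body) {
    throw new Error("Streaming is not supported by this response");
  }

  const reader = response.body.getReader();
  const decoder = new TextDecoder();

  // Each decoded chunk is appended to the last message bubble immediately,
  // so the displayed text grows while the model is still generating.
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    appendToLastMessage(decoder.decode(value, { stream: true }));
  }
}
```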

Another issue is that most LLMs return content in Markdown format, but the current `message` component does not support rendering Markdown.
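
One possible way to address this, assuming a JavaScript-based rendering path, is to run the raw text through a Markdown parser before it is placed in the bubble. The sketch below uses the `marked` library purely as an example; the `renderMessageHtml` helper is hypothetical, and any Markdown renderer could be substituted.

```typescript
import { marked } from "marked"; // any Markdown renderer could be used instead

// Hypothetical helper: convert the raw LLM output (Markdown) to HTML
// before it is assigned to the chat bubble's content.
function renderMessageHtml(rawMarkdown: string): string {
  // marked.parse() is synchronous by default and returns an HTML string.
  const html = marked.parse(rawMarkdown) as string;
  // Note: if this HTML is injected into the page, it should first be
  // sanitized (e.g. with DOMPurify) to guard against malicious content.
  return html;
}
```

Combined with the streaming sketch above, the accumulated Markdown could simply be re-rendered each time a new chunk arrives.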

a loyal fan of Wisej.NET/Winform/WPF


I believe the extension is on GitHub as an open-source control. My guess is that a PR with the changes would be appreciated. Alternatively, you can use any available chat control written in JavaScript.

  • Panda Pan
    That’s a great idea! The Lunar New Year holiday is coming up soon in my region, so I’ll have some free time. I’ll look into expanding the functionality—specifically, I’d like to make the bubble width configurable and add the ability to customize the font color as well.