07/19/2025: Add a bright error message in chat.html.erb for when a tool call fails and we fail to parse it.
07/19/2025: Add with_indifferent_access to the context hash that passes in an api_token, so it can be accessed as either :api_token or "api_token" in agent_state_builder.rb (this caused a bug during the hackathon).
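The bug above is the classic symbol-vs-string key mismatch. A minimal sketch of it, assuming the context hash is built with string keys; the real fix uses ActiveSupport's `with_indifferent_access`, and `indifferent_fetch` below is a hypothetical plain-Ruby stand-in so the example runs outside Rails:

```ruby
# The hackathon bug: the context hash was built with a string key,
# but agent_state_builder.rb read it with a symbol key.
context = { "api_token" => "abc123" }
context[:api_token]  # nil -- the lookup silently fails

# ActiveSupport's fix (what the changelog entry adds):
#   context = { "api_token" => "abc123" }.with_indifferent_access
#   context[:api_token]   # => "abc123"
#   context["api_token"]  # => "abc123"

# Plain-Ruby stand-in for illustration only (hypothetical helper,
# not the real HashWithIndifferentAccess):
def indifferent_fetch(hash, key)
  hash[key.to_s] || hash[key.to_sym]
end

indifferent_fetch(context, :api_token)   # => "abc123"
indifferent_fetch(context, "api_token")  # => "abc123"
```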
07/20/2025: Add support for AIMessageChunk streaming in chat_channel.rb. (We should also support this for agent_message.)
- Why?
- Better UX: tokens appear in the browser as they arrive instead of after the full response.
- Run local models offline with better UX (local models are slow, so streaming matters even more).
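The chunk-forwarding idea above can be sketched as follows. This is a hedged illustration, not the real chat_channel.rb: the `ChatChannel` stub stands in for an ActionCable channel so the sketch runs outside Rails, and the `"type"`/`"content"` payload shape from the LlamaBot backend is an assumption:

```ruby
require "json"

# Minimal stand-in for an ActionCable channel so this runs outside
# Rails; in the app this would be ChatChannel < ApplicationCable::Channel.
class ChatChannel
  attr_reader :transmitted

  def initialize
    @transmitted = []
  end

  # Stand-in for ActionCable's transmit (pushes to the subscriber).
  def transmit(payload)
    @transmitted << payload
  end

  # Forward one streamed message from the LlamaBot backend.
  # Assumed wire format: JSON with a "type" field, where
  # "ai_message_chunk" carries an incremental token and
  # "ai_message" carries a complete message.
  def handle_backend_message(raw)
    msg = JSON.parse(raw)
    case msg["type"]
    when "ai_message_chunk"
      # Stream partial tokens to the browser as they arrive,
      # instead of waiting for the full response.
      transmit({ "type" => "chunk", "content" => msg["content"] })
    when "ai_message"
      transmit({ "type" => "message", "content" => msg["content"] })
    else
      transmit({ "type" => "error", "content" => "Unknown type: #{msg["type"]}" })
    end
  end
end

channel = ChatChannel.new
channel.handle_backend_message('{"type":"ai_message_chunk","content":"Hel"}')
channel.handle_backend_message('{"type":"ai_message_chunk","content":"lo"}')
channel.transmitted.map { |p| p["content"] }.join  # => "Hello"
```

The same `when` branch structure would extend naturally to an `agent_message` type, per the TODO above.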
07/28/2025:
- Working on async websocket disconnection errors when streaming long messages back from the LlamaBot backend.
- Added more robust websocket handling in chat_channel.rb, to deal with tiny disconnects and other transient issues.
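One common shape for this kind of defensive handling is retry-with-backoff around the send. The sketch below is hypothetical: `TransientSocketError`, the attempt count, and the delay are illustrative choices, not the app's real API:

```ruby
# Illustrative error class standing in for whatever transient
# websocket failure the app actually sees (hypothetical name).
class TransientSocketError < StandardError; end

# Retry a transient failure a few times with a short linear backoff
# before giving up; parameters are illustrative defaults.
def send_with_retry(max_attempts: 3, base_delay: 0.01)
  attempts = 0
  begin
    attempts += 1
    yield
  rescue TransientSocketError
    raise if attempts >= max_attempts  # out of retries: surface the error
    sleep(base_delay * attempts)       # back off a little longer each time
    retry
  end
end

# Usage: succeed on the third attempt after two tiny disconnects.
calls = 0
result = send_with_retry do
  calls += 1
  raise TransientSocketError, "tiny disconnect" if calls < 3
  "delivered"
end
result  # => "delivered"
```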