Summary
Currently, the RAG4EIC chatbot does not retain previous messages between the user and the AI, so each turn is stateless. This issue proposes adding conversational memory so that user and AI messages are stored and provided as context to the LLM. Additionally, if OpenAI or the LLM returns an error, the app should display a failure message and prompt the user to reload the conversation.
Requirements
- Extend the Streamlit chat interface to store the conversation history as alternating messages between the user and the assistant (AI).
- On each user input, pass the full message history (in the correct format) to the LLM backend for context-aware responses.
- If the LLM/OpenAI call fails (network error, quota exceeded, null result, etc.), catch the exception and:
  - Display a message to the user: "There was an error generating a response. Please reload the conversation and try again."
  - Optionally, log the error for debugging.
  - After a failure, do NOT append a new AI message to the conversation history.
Implementation Plan
- Message Storage
  - In Streamlit, use `st.session_state["messages"]` to store a list of message dicts, e.g. `{ "role": "user"|"assistant", "content": ... }`.
  - On each new user message, append `{ "role": "user", "content": user_input }` to this list.
  - When the LLM returns a result, append its output as `{ "role": "assistant", "content": response }`.
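A minimal sketch of this storage pattern, assuming a recent Streamlit that provides `st.chat_input` and `st.chat_message` (widget labels are illustrative):

```python
import streamlit as st

# Initialize the history once per session.
if "messages" not in st.session_state:
    st.session_state["messages"] = []

# Replay the stored conversation so far.
for msg in st.session_state["messages"]:
    with st.chat_message(msg["role"]):
        st.markdown(msg["content"])

user_input = st.chat_input("Ask a question")
if user_input:
    # Append the user's turn; the assistant's reply is appended after the LLM call.
    st.session_state["messages"].append({"role": "user", "content": user_input})
```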
- Passing History to LLM
  - When calling the LLM, pass the entire `messages` list as input (as required by the OpenAI Chat API or compatible LLMs).
  - The LLM should use the history to generate context-aware answers.
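A sketch of forwarding the history, assuming the `openai` v1 Python client is called directly; the actual RAG4EIC backend may wrap this differently (e.g. via LangChain), and the model name is only a placeholder:

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment


def generate_reply(messages):
    """messages: full history as [{"role": "user"|"assistant", "content": ...}, ...]."""
    completion = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=messages,
    )
    return completion.choices[0].message.content
```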
- Error Handling
  - Wrap LLM calls in a try/except block.
  - On exception (any error), show the error message to the user and do NOT append a new AI message.
  - Optionally log the error (e.g. with `st.error` or to a file/logging system).
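A hedged sketch of the error path, reusing the hypothetical `generate_reply` helper from the sketch above; on failure the error message from the requirements is shown and nothing is appended to the history:

```python
import logging

import streamlit as st

logger = logging.getLogger(__name__)

if "messages" not in st.session_state:
    st.session_state["messages"] = []

user_input = st.chat_input("Ask a question")
if user_input:
    st.session_state["messages"].append({"role": "user", "content": user_input})
    try:
        # generate_reply is the hypothetical helper from the previous sketch.
        response = generate_reply(st.session_state["messages"])
        # Only record the assistant turn on success.
        st.session_state["messages"].append({"role": "assistant", "content": response})
    except Exception:
        logger.exception("LLM call failed")  # optional debug logging
        st.error(
            "There was an error generating a response. "
            "Please reload the conversation and try again."
        )
        # Nothing is appended to the history on failure.
```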
- Reload Conversation
  - Suggest the user reload or restart the app to clear the error state.
- UI Improvements
  - Optionally provide a "Clear Conversation" or "Restart" button to reset the chat history.
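One possible way to wire such a button, assuming a Streamlit version that provides `st.rerun` (label and placement are up to the implementer):

```python
import streamlit as st

# Illustrative reset control in the sidebar.
if st.sidebar.button("Clear Conversation"):
    st.session_state["messages"] = []
    st.rerun()  # redraw the app with an empty history
```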
References/Examples
Future Improvements
- Add more advanced memory (summarization, windowed memory, etc.); a windowed-memory sketch follows this list
- Allow exporting or saving conversation history
- Add retry logic for transient errors
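For illustration only, a windowed-memory variant could be as simple as trimming the list before each LLM call (the cutoff below is an arbitrary example value):

```python
MAX_TURNS = 10  # arbitrary example cutoff


def windowed_history(messages, max_turns=MAX_TURNS):
    """Return only the most recent messages; older turns could be summarized instead."""
    return messages[-max_turns:]
```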