Hi Serge,
If I remember correctly, you asked me to look into this issue a few weeks ago. Could you add more details about it? Basically, do you want me to implement an LLM agent as a tutor during a session? Please correct me if I'm wrong.
Hello, that's basically the idea. The same way you can interact with a tutor, you should be able to interact with an LLM (I would personally go for Mistral first, as we already discussed).
Hi Dave, I see you're still assigned to this issue, but I don't see any changes. Do you already have something? Are you still working on this? Or can I change the assignment?
I think that ideally, it should work back-end only: when a message is received, the backend detects that it's an AI session and forwards it to an LLM, which answers as a "normal message". That way, on the frontend side, we wouldn't have to manage the two different possibilities.
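Something like this, roughly (just a pseudo-sketch to illustrate the flow, every name here is a placeholder, nothing of this exists in the codebase):

```python
# Pseudo-sketch only: placeholder names, not real code.

def forward_to_llm(session) -> str:
    """Placeholder for the actual call to the LLM API (e.g. Mistral)."""
    return "LLM reply"

def on_message_received(session, message) -> None:
    session.save(message)            # store the incoming message as usual
    if session.is_ai_session:        # hypothetical way to detect an AI session
        reply = forward_to_llm(session)
        session.save(reply)          # stored like a "normal message", so the
                                     # frontend doesn't see any difference
```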
Hi @bridubois,
Basically, the current logic is that the user selects the session they would like to attend. They can either select a normal session with a human tutor or a session with a bot as the tutor. For now, the bot is the Mistral Small 3 model by default.
Concretely, I added a button that lets the user select a specific session. The session logic remains the same, including all its components (messages, etc.).
On the backend, a session with two users is interpreted as a normal session, while a session with just one user is interpreted as a session managed by the LLM. For that case, I created an endpoint that communicates with the LLM API.
So, that’s the logic I'm implementing.
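To give a rough idea, here is a simplified sketch of the routing and of the call to the Mistral chat completions API. The helper names (`is_llm_session`, `handle_incoming_message`, the `session` object) are placeholders, and the model identifier for Mistral Small 3 is my assumption, not necessarily the exact value used in the code:

```python
import os
import requests

MISTRAL_URL = "https://api.mistral.ai/v1/chat/completions"
MISTRAL_MODEL = "mistral-small-latest"  # assumed identifier for Mistral Small 3

def is_llm_session(session) -> bool:
    # Two users -> normal session with a human tutor.
    # One user  -> session managed by the LLM.
    return len(session.users) == 1

def ask_llm(history: list[dict]) -> str:
    """Send the session's messages (chat format) to the Mistral API."""
    resp = requests.post(
        MISTRAL_URL,
        headers={"Authorization": f"Bearer {os.environ['MISTRAL_API_KEY']}"},
        json={"model": MISTRAL_MODEL, "messages": history},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["choices"][0]["message"]["content"]

def handle_incoming_message(session, text: str) -> None:
    session.add_message(role="user", content=text)
    if is_llm_session(session):
        answer = ask_llm(session.as_chat_history())
        # The reply is stored as a regular message, so the frontend
        # handles both session types the same way.
        session.add_message(role="assistant", content=answer)
```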