Add LLM-based discussion to async peer instruction (course-wide key) #1140
bnmnetp merged 23 commits into RunestoneInteractive:main
Conversation
@sethbern All the code under … I would recommend that you look at https://docs.runestone.academy and at least have a look at the welcome and overview of the monorepo that describes how things are structured. There is lots of other documentation on building, developing, and debugging there as well. Hence it is going to live under … All the code needed to retrieve and update from the database is under …

Please be careful not to confuse these files with the old files that are still present under bases/rsptx/web2py_server. That is the old version that is currently in production, but we are phasing it out. The new code has been tested, but definitely not to the extent that the old code has been.

Feel free to reach out for help in our Discord, or come to our Tuesday Zoom drop-in (https://mathtech.org/dropin). I'm usually there from 8-10 Pacific time, but if I'm not, others will be there who can answer questions too.

It's fine if we want to have these features in parallel, but the plan is that the web2py service will be deactivated next summer.
content, docker-compose.override.yml and pi_attempt_id all appear to be empty files. Please remove them from the PR.
@bnmnetp Thank you for your comments. I have removed the empty files. I understand that this work should ultimately live in the FastAPI assignment server and that the web2py code is being phased out. However, I want to check: would you like me to move this over to FastAPI now, or is it okay to merge the web2py version and then follow up with the FastAPI version right away?
It's OK to leave what you have in place for web2py, but I would encourage you to get it over to FastAPI and use that version for your research. The PI that I have ported will be available (though not the default) in production this weekend. Whether you do that as one PR or two is not a big deal to me.
@bnmnetp Thanks, I appreciate the clarification. I'll keep this PR focused on the existing web2py implementation so it can be merged and used as-is for an upcoming conference workshop. I'll then follow up with a separate PR that migrates the full functionality over to the FastAPI assignment server and uses that version for the research going forward.
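Looking ahead to that follow-up PR, here is a minimal sketch of what the discussion endpoint could look like in the FastAPI assignment server. The route path, request and response fields, and the in-memory dictionaries standing in for the database layer are all assumptions for illustration, not the actual ported code.

```python
# A hypothetical sketch only: route path, model fields, and the in-memory
# stand-ins for the real database layer are assumptions, not the ported code.
from typing import Optional

from fastapi import APIRouter
from pydantic import BaseModel

router = APIRouter(prefix="/assessment")

# Stand-ins for real database lookups in the monorepo's CRUD layer.
COURSE_KEYS: dict[str, str] = {}              # course_name -> OpenAI API key
PREVIOUS_MESSAGES: dict[str, list[str]] = {}  # div_id -> prior chat messages


class DiscussionRequest(BaseModel):
    course_name: str
    div_id: str            # id of the peer-instruction question
    student_answer: str    # the student's first-vote answer


class DiscussionResponse(BaseModel):
    message: str
    source: str            # "llm" or "peer", so the client knows which mode ran


@router.post("/llm_discussion", response_model=DiscussionResponse)
async def llm_discussion(req: DiscussionRequest) -> DiscussionResponse:
    api_key: Optional[str] = COURSE_KEYS.get(req.course_name)
    if api_key is None:
        # No course-wide key: legacy behavior, surface a previous chat message.
        prior = PREVIOUS_MESSAGES.get(req.div_id, [])
        return DiscussionResponse(message=prior[-1] if prior else "", source="peer")
    # With a course-wide key, hand the answer to the LLM (the actual OpenAI
    # call is omitted here; a sketch of it appears further below).
    reply = f"(LLM reply to: {req.student_answer})"  # placeholder
    return DiscussionResponse(message=reply, source="llm")
```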
This adds an LLM-based discussion mode to asynchronous peer instruction, using a course-wide OpenAI API key when one is available and falling back to the legacy behavior of showing previous text-chat messages when no key is configured.
Core functionality
Implementation details
Backend:
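As a stand-in for the backend details, a minimal sketch of the decision described in the summary above: use the course-wide OpenAI key when one exists, otherwise fall back to a previous chat message. The function name, prompt text, and model choice are illustrative assumptions, not the PR's actual implementation; it assumes the openai Python client (version 1.0 or later).

```python
# Sketch of the course-wide-key decision described in the summary above.
# Function name, prompt text, and model choice are illustrative assumptions.
from typing import Optional

from openai import OpenAI  # assumes the openai Python client, >= 1.0


def get_discussion_reply(
    course_api_key: Optional[str],
    question_text: str,
    student_answer: str,
    previous_messages: list[str],
) -> str:
    """Return an LLM reply when the course has an OpenAI key; otherwise fall
    back to the legacy behavior of showing a previous text-chat message."""
    if not course_api_key:
        # Legacy async PI behavior: show an earlier peer message (if any).
        return previous_messages[-1] if previous_messages else ""

    client = OpenAI(api_key=course_api_key)
    completion = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model name, not necessarily the PR's
        messages=[
            {
                "role": "system",
                "content": (
                    "You are a discussion partner for a peer instruction "
                    "question. Ask the student to justify their answer; "
                    "do not reveal which answer is correct."
                ),
            },
            {
                "role": "user",
                "content": f"Question: {question_text}\nMy answer: {student_answer}",
            },
        ],
    )
    return completion.choices[0].message.content or ""
```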
Frontend:
How to use:
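As a rough illustration only, not the PR's actual setup steps: the mode is driven by whether a course-wide OpenAI API key is configured, so setting a key turns on LLM discussion and leaving it unset keeps the legacy text-chat fallback. Using the hypothetical helper sketched above:

```python
# Hypothetical usage of the backend sketch above.
previous = ["I picked B because the loop runs n times."]

# No course-wide key configured -> legacy behavior: show a prior peer message.
assert get_discussion_reply(None, "Why is the loop O(n)?", "B", previous) == previous[-1]

# With a course-wide key configured, the same call would go to the LLM instead:
# get_discussion_reply(course_key, "Why is the loop O(n)?", "B", previous)
```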