Inspiration
As NUS students, we all feel that NUS sometimes does not provide enough practice for exams. Oftentimes, we only have one or two past-year papers, which barely help us assess whether we are ready for the exams. What's worse, some of us find ourselves unable to grasp the concepts that our modules teach.
Some of us do try to use ChatGPT to consolidate our understanding. However, it only explains the question at hand; unless we know how to craft our prompts, it does not pose further questions to check our understanding.
That is why we developed NeuralCats! Using the power of OpenAI, we hope to systematically generate quizzes that check our understanding of the content taught in our modules. With a simple click, we can generate a quiz at our desired difficulty level and assess our understanding by checking our answers against the provided solutions.
Features
Generate module-specific quiz
Simply select the module of your choice and the difficulty level, and click to get a quiz! Our website generates a quiz to test your understanding across all topics of the module!
What's more, you can save the PDF files of the quiz questions and answers, so that you may attempt the quiz at a later time.
Save quizzes you created
Create an account on our website, and you can easily save quizzes you created and attempt them at a later time. In our dashboard, you can view all quizzes you created, each hyperlinked directly to its quiz page.
Request new modules
Of course, we ourselves cannot cover all available modules. We allow users to request new modules, and admins can review the requests and add the content where appropriate.
Tech stacks
Azure services
Below are the Azure services we have integrated into our app.
We use the OpenAI API: we send our module content, together with our crafted prompt, to generate a quiz on that content.
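As a rough sketch of that flow (the function names, model choice, and prompt wording here are illustrative assumptions, not our exact production code):

```python
def build_quiz_prompt(module_content: str, difficulty: str, num_questions: int = 10) -> str:
    # Illustrative prompt wording, not our exact production prompt.
    return (
        f"Generate {num_questions} multiple-choice questions at {difficulty} "
        "difficulty, based on the module content below. For each question, "
        "give four choices and indicate the correct answer.\n\n"
        + module_content
    )

def generate_quiz(module_content: str, difficulty: str) -> str:
    from openai import OpenAI  # lazy import; requires the openai package
    client = OpenAI()  # reads OPENAI_API_KEY from the environment
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",  # model name is an assumption
        messages=[{"role": "user",
                   "content": build_quiz_prompt(module_content, difficulty)}],
    )
    return response.choices[0].message.content
```

The prompt builder is kept separate from the API call so the prompt can be tweaked and tested on its own.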
This is used as our database, to store user-uploaded data. Data stored includes:
- Modules and their content.
- User information, including username, hashed password and hashed user session.
- Module requests.
- Quizzes generated.
The storage is used to store any user-uploaded files and app-generated files, to complement the database.
Our app is deployed using Docker. The Docker container image is stored in Azure Container Registry.
This service is used to serve our app. The server pulls the Docker image from Azure Container Registry.
Other tech stacks
We use FastAPI to develop our backend, for its built-in validation of user input and its auto-generated Swagger UI documentation and playground.
We use React framework to develop our frontend.
We use Emotion CSS to style our components, and create reusable style sheets.
We use Redux to store states that are shared across many sub-pages of our website.
We use Docker to deploy our app, because it isolates the app from the host operating system; as a result, we could test locally whether our app would work once deployed.
wkhtmltopdf is used by pdfkit, a Python library that converts HTML to PDF.
Challenges and what we learned
Azure services
For all of us, this was our first time using most of the Azure services. Due to the lack of online usage examples, we had to continually consult the documentation of each service.
AI services
This was our first time integrating OpenAI into a project. We learned to use the API properly, and to craft our prompts in exactly the way that gives us the response we want, with the right content and in the right format.
We considered alternative services offered by Azure:
- Azure Text Analytics: We considered using this to extract key phrases to form questions. However, we realised that the key phrases were not useful in crafting questions, and the extracted phrases were not necessarily accurate.
- Azure Question Answering: We considered this for getting answers from a knowledge base. However, the service could not craft multiple-choice questions, as it only returns an answer to a given question. Furthermore, the limit on the amount of knowledge we could store was too small for us to put up a prototype.
- Azure Search Service: We considered using this to generate incorrect answers to a question. However, it was poorly suited to the task.
In the end, we decided to use OpenAI, because it was capable of responding to almost any prompt. Hence, it was used to generate questions, generate incorrect choices to questions, and give the correct answers to the questions.
Generate PDF
One of our app's features requires us to convert the generated questions and answers into PDF files. We decided to format the response in Markdown and then convert it to PDF.
We struggled to find an appropriate library for converting Markdown to PDF. In the end, we settled on two separate libraries: one to convert Markdown to HTML, and another to convert HTML to PDF.
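The two-step pipeline can be sketched as follows (we use the `markdown` and `pdfkit` packages here; the function names are illustrative, and pdfkit needs the wkhtmltopdf binary installed):

```python
import markdown  # first library: Markdown -> HTML

def quiz_to_html(md_text: str) -> str:
    # Step 1: render the Markdown-formatted quiz as HTML.
    return markdown.markdown(md_text)

def quiz_to_pdf(md_text: str, out_path: str) -> None:
    # Step 2: hand the HTML to pdfkit, which drives wkhtmltopdf.
    import pdfkit  # lazy import; requires pdfkit and the wkhtmltopdf binary
    pdfkit.from_string(quiz_to_html(md_text), out_path)
```

Keeping the HTML step separate also makes it easy to preview the quiz in the browser before generating the PDF.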
Docker and deployment
All of us had very little experience with Docker-based deployment. However, thanks to the comprehensive deployment instructions in the official FastAPI documentation and Azure's official guide for deploying FastAPI applications, we were able to craft a proper Dockerfile to build the Docker image.
Furthermore, our application requires wkhtmltopdf, which caused us huge issues when deploying with Docker. We changed our code so that it does not need to know where the binary is located, and added the installation instructions for the software to our Dockerfile.
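A minimal Dockerfile along these lines (the base image, file names, and start command are assumptions, not our exact setup) could look like:

```dockerfile
FROM python:3.11-slim

# Install wkhtmltopdf system-wide so that pdfkit finds it on PATH,
# without the app needing to know where the binary lives.
RUN apt-get update \
    && apt-get install -y --no-install-recommends wkhtmltopdf \
    && rm -rf /var/lib/apt/lists/*

WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .

CMD ["uvicorn", "main:app", "--host", "0.0.0.0", "--port", "80"]
```

Installing the binary via the package manager puts it on PATH, which is what lets the application code stay ignorant of its location.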
What's next
Better user dashboard
Our user dashboard currently consolidates the quizzes generated by a user. However, it stores only basic information about each quiz.
We plan to add per-quiz progress for quizzes a user has generated and attempted, including which questions were attempted and whether the user has checked the answers. This would give users a better dashboard to track their progress on their quizzes, and thereby part of their learning progress.
Furthermore, we also plan to allow users to untrack quizzes, so that users can discard quizzes of the modules they have already finished.
Better question generation
As of now, some of the questions generated might not necessarily be useful in assessing one's understanding of the module. This is because OpenAI tends to stick to the examples given in the lecture notes if no further prompting is made.
We plan to enhance our prompt to OpenAI, so that it can craft better questions.
Automatic request processing
Currently, our app allows users to request new modules to be included. Admins can review the modules and the files uploaded by users, but there is no easy way to approve those requests: admins have to download the files, manually extract the text from them, and then consolidate and upload the text to our database.
We plan to automate the approval process: once an admin approves a request, the files are automatically converted to text and the text content is uploaded to our database.
Furthermore, we plan to use better PDF readers, such as those that can perform OCR, so that a wider variety of materials can be accepted.
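A sketch of the extraction step (using the pypdf package; the function names are illustrative):

```python
def consolidate_pages(pages: list[str]) -> str:
    # Join non-empty page texts into a single module-content string.
    return "\n\n".join(p.strip() for p in pages if p.strip())

def extract_pdf_text(path: str) -> str:
    # Pull the text out of each page of an uploaded PDF.
    from pypdf import PdfReader  # lazy import; requires the pypdf package
    reader = PdfReader(path)
    return consolidate_pages([page.extract_text() or "" for page in reader.pages])
```

Scanned PDFs yield no text this way, which is where an OCR-capable reader would come in.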
User credentials
As in any other app with user accounts, we plan to allow users to change their username or password.
Testing instructions
Testing instruction can be found here: https://github.com/nknguyenhc/NeuralCats#testing-instructions