Key Features:
Facial Expression Recognition: Utilizing state-of-the-art computer vision techniques, the system can detect students' facial expressions while they are watching course videos. This enables real-time assessment of their emotional states, including confusion or boredom.
Attention Detection: Building on the facial expression recognition, the system identifies when students become distracted or lose focus during a video. This allows it to pinpoint the specific moments where attention is most likely to wane.
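As a minimal sketch of how this could work, per-frame emotion scores from an expression model (such as Hume's) can be scanned for sustained distraction. The emotion names, threshold, and window size below are illustrative assumptions, not the tuned values a production system would use:

```python
from collections import deque

# Hypothetical thresholds -- the real system would tune these empirically.
DISTRACTION_EMOTIONS = {"confusion", "boredom", "tiredness"}
THRESHOLD = 0.6          # score above which an emotion counts as dominant
WINDOW = 5               # consecutive frames needed to flag lost attention

def is_distracted(frame_scores):
    """Return True if any distraction-linked emotion dominates this frame.

    `frame_scores` maps emotion name -> confidence in [0, 1], the kind of
    output an expression-recognition model typically returns per frame.
    """
    return any(frame_scores.get(e, 0.0) > THRESHOLD for e in DISTRACTION_EMOTIONS)

def detect_attention_drops(frames):
    """Yield timestamps where distraction persisted for WINDOW frames in a row.

    `frames` is an iterable of (timestamp, frame_scores) pairs.
    """
    recent = deque(maxlen=WINDOW)
    for timestamp, scores in frames:
        recent.append(is_distracted(scores))
        if len(recent) == WINDOW and all(recent):
            yield timestamp
            recent.clear()  # avoid re-flagging the same stretch
```

Requiring several consecutive distracted frames keeps a single ambiguous frame from triggering a pop-up.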
Intelligent Pop-up Notifications: When a student's attention wavers, the system displays interactive pop-up notifications directly on the video. These notifications include contextually relevant questions generated by an advanced language model, like ChatGPT, based on the content of the video transcript and the student's profile.
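The prompt sent to the language model can be assembled from the transcript excerpt around the attention drop plus the student's profile. A sketch, assuming the OpenAI-style chat message format; the `build_question_prompt` helper and its wording are illustrative:

```python
def build_question_prompt(transcript_segment, student_profile):
    """Build a chat prompt asking the language model for one check-in question.

    `transcript_segment` is the transcript text around the moment attention
    dropped; `student_profile` is a short description of the learner.
    """
    return [
        {"role": "system",
         "content": "You write one short, friendly comprehension question "
                    "about a lecture excerpt, tailored to the student."},
        {"role": "user",
         "content": f"Student profile: {student_profile}\n"
                    f"Lecture excerpt: {transcript_segment}\n"
                    "Ask one question that checks understanding of the excerpt."},
    ]

# The messages can then be sent to a chat-completion endpoint, e.g.:
#   response = client.chat.completions.create(model="gpt-4o-mini",
#                                             messages=messages)
```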
Data Analytics Dashboard: The product provides a comprehensive data analytics dashboard for course instructors. This dashboard visualizes aggregated data on student engagement and attention patterns, highlighting the specific parts of the video where the majority of students lose concentration. This data empowers instructors to make data-driven decisions and optimize their courses for better student retention and comprehension.
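The per-segment engagement counts the dashboard charts can be produced by bucketing individual students' attention-drop timestamps. A minimal sketch; the 10-second bucket size is an assumption:

```python
from collections import Counter

def aggregate_drops(drop_timestamps, bucket_seconds=10):
    """Count attention drops per video segment across all students.

    `drop_timestamps` is a flat list of timestamps (in seconds) at which
    individual students lost focus; fixed-width buckets make the hot
    spots easy to chart on the instructor dashboard.
    """
    buckets = Counter((int(t) // bucket_seconds) * bucket_seconds
                      for t in drop_timestamps)
    return sorted(buckets.items())  # [(segment_start, drop_count), ...]
```

Segments with the highest counts are the parts of the video where most students lose concentration.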
User-Friendly Interface: The user interface is designed to be intuitive and seamless, allowing students to access course videos, interact with the pop-up notifications, and navigate through the content effortlessly. The system is responsive and compatible with various devices, providing a smooth user experience.
Data Privacy and Security: We prioritize the privacy and security of user data. The system adheres to industry best practices and compliance with relevant data protection regulations, ensuring that student information is handled securely and confidentially.
Getting Started:
To use the Ed Tech Video Engagement Analyzer, follow these steps:
1. Install the required dependencies and set up the development environment. Detailed instructions can be found in the accompanying documentation.
2. Upload pre-recorded course videos to the system and generate corresponding transcripts.
3. Integrate the facial expression recognition module and chatbot functionality into your course video player.
4. Deploy the system and configure the data analytics dashboard for instructors to access the engagement data.
5. Continuously gather user feedback and iterate on the product to enhance its effectiveness and user experience.
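For the transcript step, the generated transcript can be indexed by time so the question generator can look up the excerpt a student was watching when attention dropped. A sketch, assuming word-level timestamps of the kind speech-to-text services typically return; `segment_transcript` is a hypothetical helper:

```python
def segment_transcript(words, window_seconds=30):
    """Group word-level timestamps into fixed time windows.

    `words` is a list of (start_seconds, word) pairs. Returns a dict of
    {window_start_seconds: "text of that window"}, so the text shown at
    any attention-drop timestamp can be retrieved with a single lookup.
    """
    segments = {}
    for start, word in words:
        key = (int(start) // window_seconds) * window_seconds
        segments.setdefault(key, []).append(word)
    return {k: " ".join(v) for k, v in sorted(segments.items())}
```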
Contributions:
We welcome contributions from the open-source community to improve this educational technology product. If you are interested in contributing, please refer to the contribution guidelines provided in the repository.
License:
The Ed Tech Video Engagement Analyzer is released under the MIT License. You are free to use, modify, and distribute this product under the terms of that license.
Disclaimer:
Please note that this product is a research prototype and should be used with caution. While it aims to enhance student engagement and learning, its accuracy and effectiveness may vary depending on various factors. Use of this product should be in accordance with applicable regulations and ethical considerations.
Built With
- assemblyai
- chatgpt
- hume
- openai
- python
- tensorflow