"The Smart Nation is an initiative by the Government of Singapore to harness Infocomm technologies, networks and big data to create tech-enabled solutions."
In this Hackathon, we aim to develop AI models that help solve industrial and social problems at this new stage of social development.
In line with the seven National AI Projects that address Key Challenges in Singapore, we decided to focus on the Education sector. Our objective is to provide personalised education for students by enhancing the learning experience for both teachers and students.
A study showed that 64% of students struggle to maintain focus and discipline in online learning.
Due to the COVID-19 pandemic, classes have shifted to online settings, in line with the advancement of technology. While students are trying to adapt to the new learning style, teachers are struggling to track the level of engagement of all their students.
If teachers knew how engaged and focused their students were in class, they could improve their teaching methods by:
- Adjusting their teaching pace
- Putting more emphasis on harder topics
How might we harness AI to help teachers determine the level of engagement of students in online learning?
A computer-vision-based system that tracks students' level of engagement in real time during an online class session.
Compared to traditional image-processing methods, which are complex and require training a particular combination of models, our solution uses the Eye Aspect Ratio (EAR), which is more elegant, efficient, and easy to implement: it requires only a simple calculation based on the ratio of distances between the facial landmarks of the eyes.
- OpenCV with dlib as our main machine learning libraries.
- Detect faces using the "HOG-based" face detector from dlib
- Predict facial landmarks using dlib and obtain the landmark indices for both eyes using the `face_utils` module
- Stream live video through a USB/webcam/Raspberry Pi camera using the `VideoStream` module in the `imutils` library
Note: As our project requires datasets of people's images (which are mostly inaccessible to protect privacy), we resort to using the pre-trained models in dlib and imutils to detect faces and predict facial landmarks. A sketch of this pipeline is shown below.
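Below is a minimal sketch of that detection pipeline, assuming the standard 68-point `shape_predictor_68_face_landmarks.dat` model file is available locally; the repository's actual wiring may differ:

```python
import cv2
import dlib
from imutils import face_utils
from imutils.video import VideoStream

detector = dlib.get_frontal_face_detector()  # HOG-based face detector
predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")

# Landmark index ranges for both eyes in the 68-point model
(lStart, lEnd) = face_utils.FACIAL_LANDMARKS_IDXS["left_eye"]
(rStart, rEnd) = face_utils.FACIAL_LANDMARKS_IDXS["right_eye"]

vs = VideoStream(src=0).start()  # USB/webcam; use usePiCamera=True for a Pi camera
while True:
    frame = vs.read()
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for rect in detector(gray, 0):  # detect faces in the frame
        shape = face_utils.shape_to_np(predictor(gray, rect))
        left_eye, right_eye = shape[lStart:lEnd], shape[rStart:rEnd]
        # ... compute EAR from left_eye/right_eye (see the EAR sketch below)
    cv2.imshow("aSES", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
vs.stop()
cv2.destroyAllWindows()
```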
- The 2D facial landmark locations of each eye are represented by six $(x,y)$-coordinates, starting from the left corner of the eye (from a third party's perspective) and plotted clockwise, evenly spaced, around the remaining region.
The Eye Aspect Ratio (EAR) equation:
$$\text{EAR} = \frac{\lVert p_2 - p_6 \rVert + \lVert p_3 - p_5 \rVert}{2\,\lVert p_1 - p_4 \rVert}$$
- Numerator: distances between the vertical eye landmarks
- Denominator: distance between the horizontal eye landmarks (doubled, because there is only one set of horizontal points but two sets of vertical points)
- Eyes open: $\text{EAR} \approx \text{constant}$
- Eyes closed: $\text{EAR} \to 0$
- For the first 5 seconds, we calibrate by averaging the student's EAR.
- We set a threshold of 90% of the average EAR to signify closed eyes (see the sketch below).
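The EAR computation below follows the referenced pyimagesearch approach; the calibration helper is a rough sketch of the 5-second/90% scheme described above, with `ear_stream` an assumed helper (yielding one EAR value per frame) rather than code taken from the repository:

```python
import time
from scipy.spatial import distance as dist

def eye_aspect_ratio(eye):
    """eye: the six (x, y) landmark points p1..p6 of one eye."""
    A = dist.euclidean(eye[1], eye[5])  # ||p2 - p6|| (vertical)
    B = dist.euclidean(eye[2], eye[4])  # ||p3 - p5|| (vertical)
    C = dist.euclidean(eye[0], eye[3])  # ||p1 - p4|| (horizontal)
    return (A + B) / (2.0 * C)

CALIBRATION_SECS = 5     # calibration window from the scheme above
THRESHOLD_RATIO = 0.90   # 90% of the calibrated average EAR

def calibrate(ear_stream):
    """Average EAR over the first few seconds of class.
    ear_stream is an assumed helper, not from the repository."""
    samples, start = [], time.time()
    for ear in ear_stream:
        samples.append(ear)
        if time.time() - start >= CALIBRATION_SECS:
            break
    return (sum(samples) / len(samples)) * THRESHOLD_RATIO  # closed-eye cutoff
```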
- Storage and querying of students' engagement levels are handled by an API built with the Django REST framework (a hypothetical sketch follows this list).
- Data is stored in an SQLite database.
- The student-side interface is built using `streamlit`.
- The professor-side interface is built using Django's template engine.
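As a purely hypothetical sketch of how such an API could be wired up with the Django REST framework (all model, field, and class names below are illustrative assumptions, not taken from the repository):

```python
# Hypothetical sketch only: names and fields are illustrative assumptions.
from django.db import models
from rest_framework import serializers, viewsets

class EngagementRecord(models.Model):
    matric_number = models.CharField(max_length=16)
    course = models.CharField(max_length=64)
    timestamp = models.DateTimeField(auto_now_add=True)
    engaged = models.BooleanField()  # binary engagement flag per sample

class EngagementSerializer(serializers.ModelSerializer):
    class Meta:
        model = EngagementRecord
        fields = "__all__"

class EngagementViewSet(viewsets.ModelViewSet):
    queryset = EngagementRecord.objects.all()
    serializer_class = EngagementSerializer
```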
After keying in the relevant info via the Welcome to aSES page, the system can be run on the students' devices via a USB/webcam/Raspberry Pi camera and track their engagement during online classes.
- The videos are not recorded; only data such as the student's engagement, in binary form, is saved into the database.
- After class, the results are uploaded to a cloud database.
- Engagement is measured based on Eye Aspect Ratio and face detection.

Professors receive information regarding student engagement via the Automated Student Engagement System (aSES), where they get feedback on the amount of time students spent disengaged and on the engagement level over time by querying student names, course, module, and group.
- Category (an illustrative query example follows the list):
  - Individual student (by querying the student's matriculation number)
  - The entire class (by leaving the matriculation number field blank)
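For illustration, a professor-side query against such an API might look like the following; the endpoint path and parameter names are assumptions, not the repository's actual routes:

```python
# Hypothetical query examples; the URL and parameters are assumptions.
import requests

BASE = "http://localhost:8000/api/engagement/"

# Individual student: include the matriculation number
single = requests.get(BASE, params={"course": "SC1005", "matric": "U2123456A"})

# Entire class: leave the matriculation number blank
whole_class = requests.get(BASE, params={"course": "SC1005", "matric": ""})

print(single.json())
print(whole_class.json())
```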
- Cloud-based databases were not free to use for this Hackathon.
- Public datasets specific to our project's aim are not accessible due to privacy concerns.
- Successfully developed a computer-vision-based AI model in under 72 hours using OpenCV and dlib.
- Successfully developed a simple full-stack application for our system using Django and JavaScript.
- The eye-detection algorithm can be improved for greater accuracy
- Live statistics can be presented to teachers while class is ongoing, with AI used to recommend how teachers should react
- Dependencies can be bundled into a client program for easier installation
- The solution can be expanded to include facial recognition with multiple users in the field of vision
- Applicable to in-person learning in lecture theatres/classrooms
- The solution can be applied in other fields such as transportation (tracking a driver's attention), media (advertising), and the medical industry (research on attention-related disorders)
Our project is built entirely on open-source libraries, frameworks, and languages. The most obvious benefit of open-source software is cost savings: we are not obligated to pay for the software we use, keeping our project's overall budget extremely low. The quality of open-source projects is also high, because the community is constantly improving the software. To carry on this excellent tradition, we have decided to release our project as open source, allowing others to build on top of it and keep improving it until, one day, we receive enough funding from angel investors. After receiving a grant, we would take the project private, hire experienced software engineers to improve on the existing base, and then release the product to the general public on a subscription basis.
As students, we solve problems in our daily lives, so we came up with the idea of tracking students' engagement in class. With aSES, professors receive student engagement reports, giving them feedback on the amount of time students spent disengaged. On further thought, we believe our project can also benefit the student community and the general public; for example, detecting the engagement of a bus driver on a long ride to ensure passengers' safety.
- Lee Juin @Neo-Zenith
- Daniel Tan Teck Wee @DanielTanTWOfficial
- Ng Woon Yee @woonyee28
- Lee Ci Hui @perfectsquare123
- Weng Pei He @wph12
- Bernice Koh Jun Yan @bernicekjy
- This project uses Django, which is licensed under the BSD-3-Clause license, for all backend programming logic and frontend template rendering.
- This project uses Chart.js, which is licensed under the MIT license, for the visual display of charts on the frontend.
- This project uses OpenCV, which is licensed under the Apache 2.0 license, for the machine learning logic.
Below are some links that we have used as references throughout the project:
- https://iblnews.org/a-survey-shows-that-many-college-students-struggle-to-maintain-focus-and-discipline-in-distance-learning/
- https://pyimagesearch.com/2017/04/24/eye-blink-detection-opencv-python-dlib/
- http://vision.fe.uni-lj.si/cvww2016/proceedings/papers/05.pdf
- Begin by cloning this repository to your working directory:

```
git clone https://github.com/Neo-Zenith/Project-ASES.git
```

- Run `setup.bat`. This will install a Python virtual environment and all the dependencies required.
  - Note that the installation of `dlib` will take some time. It is normal for your CPU and/or RAM usage to spike to 100% during this period.
- Run `server.bat` to initialize the server.
- Run `client.bat` to initialize the client application.