Replies: 1 comment 1 reply
-
human does have eye gaze detection, see examples and docs. however, i do not like the idea of using ai for such use-cases so this is as much as i'm willing to say on the subject.
-
Hello everyone, I'm currently working on an AI-based exam-proctoring project where I need to determine whether a student is looking away from the screen during the exam. I'm using the Human library by Vladmandic for eye tracking, and I have a few questions about which variables or methods are best for detecting when a student looks away from the screen:
1. Head Pose Estimation: I noticed that the Human library provides head pose estimation. How reliable is it for detecting whether someone is looking away, and are there recommended threshold values for yaw, pitch, and roll? (See the first sketch below for what I'm considering.)
2. Eye Gaze Detection: Does the Human library offer eye gaze detection? If so, how can I use it to check whether the student's eyes are focused on the screen? (Also covered in the first sketch below.)
3. Combining Head Pose and Eye Gaze: Would combining head pose estimation and eye gaze detection give more accurate results? If so, how should I approach the combination? (See the second sketch below.)
4. Performance and Accuracy: Are there any performance or accuracy concerns I should be aware of when using these methods in a real-time exam-proctoring scenario? (See the third sketch below.)

Any advice, code snippets, or references to similar implementations would be greatly appreciated. Thank you!
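To make questions 1 and 2 concrete, here is a rough sketch of what I'm currently considering. It assumes the result shape I understood from the Human docs (`result.face[n].rotation.angle` for roll/yaw/pitch and `result.face[n].rotation.gaze` for bearing/strength); please correct me if the field names, units (I'm assuming radians), or config keys differ in the current version. The threshold values are arbitrary guesses I intend to tune, not recommendations.

```ts
import { Human } from '@vladmandic/human';

// Enable only the face modules needed for head pose and gaze:
// the detector, plus mesh (rotation/angles) and iris (gaze).
const human = new Human({
  modelBasePath: 'https://vladmandic.github.io/human/models', // or a locally hosted copy
  face: {
    enabled: true,
    mesh: { enabled: true },
    iris: { enabled: true },
    emotion: { enabled: false },
    description: { enabled: false },
  },
  body: { enabled: false },
  hand: { enabled: false },
  object: { enabled: false },
});

// Minimal structural type for the fields I use; the real face result has more.
type FaceLike = {
  rotation?: {
    angle?: { roll: number; yaw: number; pitch: number };
    gaze?: { bearing: number; strength: number };
  } | null;
};

// Placeholder thresholds (assuming radians) -- guesses to tune, not recommendations.
const YAW_LIMIT = 0.5;           // roughly 30 degrees left/right
const PITCH_LIMIT = 0.4;         // roughly 23 degrees up/down
const GAZE_STRENGTH_LIMIT = 0.6; // how far the eyes deviate from center (0..1)

// Decide "looking away" from a single face result using head pose and gaze.
function isLookingAway(face: FaceLike): boolean {
  const angle = face.rotation?.angle;
  const gaze = face.rotation?.gaze;
  const headTurned = !!angle && (Math.abs(angle.yaw) > YAW_LIMIT || Math.abs(angle.pitch) > PITCH_LIMIT);
  const eyesTurned = !!gaze && gaze.strength > GAZE_STRENGTH_LIMIT;
  return headTurned || eyesTurned;
}

// Run one detection on the webcam element and evaluate it.
async function checkFrame(video: HTMLVideoElement): Promise<boolean> {
  const result = await human.detect(video);
  if (result.face.length !== 1) return true; // no face, or more than one, is also worth flagging
  return isLookingAway(result.face[0]);
}
```

I also noticed Human emits gesture strings (e.g. "facing left" / "looking left"), which might be a simpler alternative to thresholding raw angles, but I haven't tried that path yet.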
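For question 3, my current plan is to treat head pose and gaze as two independent signals (the `isLookingAway` helper above ORs them) and then smooth the combined signal over time, so that a brief glance at the keyboard or a single noisy frame doesn't raise a flag. This is only a sketch; the frame counts are arbitrary.

```ts
// Builds on the previous sketch (isLookingAway / checkFrame).
// Debounce the per-frame decision over consecutive detections.
class LookAwayMonitor {
  private awayCount = 0;

  // framesToFlag is arbitrary: e.g. 15 detections at 5 per second is about 3 seconds.
  constructor(private readonly framesToFlag = 15) {}

  // Feed one boolean per detection; returns true once the "away" streak is long enough.
  update(lookingAway: boolean): boolean {
    this.awayCount = lookingAway ? this.awayCount + 1 : 0;
    return this.awayCount >= this.framesToFlag;
  }
}
```

An alternative I'm considering is requiring both signals (head turned AND eyes off-center) before counting a frame as "away", which should cut false positives at the cost of missing cases where only the eyes move. I'd love to hear which combination has worked for others.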
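For question 4, the main thing I'm planning is to run detection on a timer (a few times per second) rather than on every rendered frame. The interval below is an arbitrary starting point, and `proctorLoop` builds on the earlier sketches.

```ts
// Builds on the sketches above (human, isLookingAway, LookAwayMonitor).
// Detecting a few times per second keeps CPU/GPU load manageable on student laptops.
const DETECT_INTERVAL_MS = 200; // 5 detections per second -- arbitrary starting point
const monitor = new LookAwayMonitor(15);

async function proctorLoop(video: HTMLVideoElement): Promise<void> {
  const result = await human.detect(video);
  const face = result.face[0];
  const away = !face || isLookingAway(face); // a missing face counts as "away"
  if (monitor.update(away)) {
    console.warn('student appears to have been looking away for several seconds');
    // report to the proctoring backend here (application-specific)
  }
  setTimeout(() => { void proctorLoop(video); }, DETECT_INTERVAL_MS);
}
```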