AI Eye Gaze Tracker

1. Introduction

The AI Eye Gaze Tracker is a project designed to determine where a user is looking in real time. The system combines computer vision and machine learning techniques to track eye movements and map them to gaze directions. Applications include assistive technologies, user behavior analysis, and gaming.

2. Prerequisites

• Python: Install Python 3.x from the official Python website.
• Required Libraries:
  - opencv-python: Install using pip install opencv-python
  - dlib: Install using pip install dlib (pip builds dlib from source on most platforms, which requires CMake and a C++ compiler)
  - numpy: Install using pip install numpy
  - imutils: Install using pip install imutils
• A webcam or camera-enabled device for real-time tracking.

3. Project Setup

1. Create a Project Directory:

- Name your project folder, e.g., `AI_Eye_Gaze_Tracker`.
- Inside this folder, create the Python script file (`eye_gaze_tracker.py`).

2. Install Required Libraries:

Ensure OpenCV, dlib, and other dependencies are installed using `pip`:

   pip install opencv-python dlib numpy imutils
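
A quick import check confirms the environment is ready. This is just a sanity script; the printed versions are informational only:

import cv2
import dlib
import numpy as np
import imutils

# If any of these imports fail, revisit the corresponding pip install step
print("OpenCV:", cv2.__version__)
print("dlib:", dlib.__version__)
print("NumPy:", np.__version__)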

4. Writing the Code

Below is an example code snippet for the AI Eye Gaze Tracker:


import cv2
import dlib
import numpy as np

# Initialize dlib's HOG-based face detector and the 68-point landmark predictor
# (the .dat model file must be downloaded separately; see section 6)
detector = dlib.get_frontal_face_detector()
predictor = dlib.shape_predictor('shape_predictor_68_face_landmarks.dat')

# Compute the bounding box (min_x, max_x, min_y, max_y) of one eye
# from its landmark indices
def get_eye_gaze(eye_points, facial_landmarks):
    # Extract eye region from facial landmarks
    eye_region = np.array([(facial_landmarks.part(point).x, facial_landmarks.part(point).y) for point in eye_points])
    min_x = np.min(eye_region[:, 0])
    max_x = np.max(eye_region[:, 0])
    min_y = np.min(eye_region[:, 1])
    max_y = np.max(eye_region[:, 1])
    return min_x, max_x, min_y, max_y

# Main loop: capture webcam frames, find faces, and highlight both eyes
def main():
    cap = cv2.VideoCapture(0)  # open the default camera

    while True:
        ret, frame = cap.read()
        if not ret:
            break

        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = detector(gray)

        for face in faces:
            landmarks = predictor(gray, face)
            # Landmark indices 36-41 and 42-47 outline the two eyes in the 68-point model
            left_eye = get_eye_gaze([36, 37, 38, 39, 40, 41], landmarks)
            right_eye = get_eye_gaze([42, 43, 44, 45, 46, 47], landmarks)

            # Highlight eye regions
            cv2.rectangle(frame, (left_eye[0], left_eye[2]), (left_eye[1], left_eye[3]), (255, 0, 0), 2)
            cv2.rectangle(frame, (right_eye[0], right_eye[2]), (right_eye[1], right_eye[3]), (255, 0, 0), 2)

        cv2.imshow('AI Eye Gaze Tracker', frame)
        if cv2.waitKey(1) & 0xFF == ord('q'):
            break

    cap.release()
    cv2.destroyAllWindows()

if __name__ == "__main__":
    main()
   

5. Key Components

• Face Detection: Detects faces using dlib's face detector.
• Landmark Detection: Identifies key points for the eyes using a pre-trained shape predictor.
• Eye Gaze Mapping: Locates and highlights the eye regions in each frame; the sketch after this list shows one way to extend this into an actual gaze-direction estimate.
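
The script above locates the eyes but does not yet classify where the user is looking. A common next step is to threshold each eye's grayscale patch, find the dark pupil via image moments, and compare its horizontal position to the box width. The helper below is a minimal sketch of that idea: the threshold of 70 and the 0.4/0.6 split points are assumptions that typically need per-camera tuning, and the left/right labels are relative to the image, which is the mirror of the user's own left and right.

import cv2

def estimate_gaze_direction(gray, eye_box, threshold=70):
    # eye_box is (min_x, max_x, min_y, max_y) as returned by get_eye_gaze
    min_x, max_x, min_y, max_y = eye_box
    eye = gray[min_y:max_y, min_x:max_x]
    if eye.size == 0:
        return "unknown"

    # Dark pixels (pupil/iris) become white in the inverted binary mask
    _, mask = cv2.threshold(eye, threshold, 255, cv2.THRESH_BINARY_INV)

    # Image moments give the centroid of the dark region
    moments = cv2.moments(mask)
    if moments["m00"] == 0:
        return "unknown"
    pupil_x = moments["m10"] / moments["m00"]

    # Classify by where the pupil sits across the eye's width
    ratio = pupil_x / eye.shape[1]
    if ratio < 0.4:
        return "left"
    if ratio > 0.6:
        return "right"
    return "center"

Inside the main loop, this could be called as estimate_gaze_direction(gray, left_eye) once the eye boxes have been computed.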

6. Testing

1. Ensure the shape predictor file (`shape_predictor_68_face_landmarks.dat`) is downloaded and available in the project directory. dlib distributes it as a compressed archive at http://dlib.net/files/shape_predictor_68_face_landmarks.dat.bz2, which must be decompressed before use.
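
   If the file is not present yet, a one-off helper like the following can fetch and decompress it (this assumes internet access and that the URL above remains current):

   import bz2
   import urllib.request

   URL = "http://dlib.net/files/shape_predictor_68_face_landmarks.dat.bz2"
   # Download the compressed model, then decompress it next to the script
   archive, _ = urllib.request.urlretrieve(URL, "shape_predictor_68_face_landmarks.dat.bz2")
   with bz2.open(archive, "rb") as src, open("shape_predictor_68_face_landmarks.dat", "wb") as dst:
       dst.write(src.read())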

2. Run the script:

   python eye_gaze_tracker.py

3. Verify that both eye regions are detected and highlighted in the live video feed. Press `q` to quit.

7. Enhancements

• Gaze Direction Classification: Map gaze directions to specific screen regions (a minimal sketch follows this list).
• Multi-User Tracking: The dlib detector already returns every face in a frame, so the main loop handles multiple faces; true multi-user tracking also requires associating faces across frames.
• Integration with Applications: Use gaze data in gaming or assistive technologies.
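
For the first enhancement, the coarse direction labels from the section 5 sketch can be mapped to horizontal screen regions. This is a minimal illustration; SCREEN_WIDTH and the equal three-way split are assumed values:

SCREEN_WIDTH = 1920  # assumed display width in pixels

REGIONS = {
    "left": (0, SCREEN_WIDTH // 3),
    "center": (SCREEN_WIDTH // 3, 2 * SCREEN_WIDTH // 3),
    "right": (2 * SCREEN_WIDTH // 3, SCREEN_WIDTH),
}

def gaze_to_screen_region(direction):
    # Returns the (x_start, x_end) pixel span for a gaze label,
    # or None when the direction could not be estimated
    return REGIONS.get(direction)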

8. Troubleshooting

• Detection Issues: Ensure proper lighting and clear video input.
• Shape Predictor Errors: Verify the path to the `shape_predictor_68_face_landmarks.dat` file.
• Performance Lag: Optimize the code or reduce the input resolution (see the downscaling snippet after this list).
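
For the performance point, a common first step is to shrink each frame before detection; imutils.resize (already listed in the prerequisites) preserves the aspect ratio when given only a width. The width of 450 is an example value, and these lines would replace the start of the capture loop in the script above:

import imutils

# Downscale before running the detector; smaller frames make
# dlib's HOG-based face detection noticeably faster
frame = imutils.resize(frame, width=450)
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)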

9. Conclusion

The AI Eye Gaze Tracker project demonstrates real-time eye tracking and provides a foundation for gaze-direction estimation. The technology has a wide range of applications and can be extended for specific use cases.