Çapşek, M.F.; Karaci, A.
Title: Computer Control with Face and Eye Movements Using Deep Learning and Image Processing Methods
Type: Article
Date of issue: 2022-12-31
Record dates: 2023-04-11; 2023-04-12
URI: https://hdl.handle.net/20.500.12597/3791
Keywords: cnn | dlib | Haar cascade | image processing | mouse control
DOI: 10.31202/ecjse.1131377
Scopus EID: 2-s2.0-85146769694

Abstract: In this study, an artificial intelligence-supported system was developed that enables these individuals to control the mouse with head and eye movements. In this system, the face and eyes are detected in real time from camera images using the Haar Cascade, Dlib, and OpenCV libraries. Haar Cascade is used to detect the face region, while the Dlib library is used to extract right- and left-eye region images from the detected face. These eye region images are given as input to a CNN model trained on 2874 eye images, which determines whether each eye is open or closed. The CNN model was trained on a public eye image dataset containing 1500 open-eye and 1374 closed-eye images. Closing and reopening the left eye triggers a left mouse click, and closing and reopening the right eye triggers a right mouse click. In addition, the position of the face detected by Haar Cascade is used to model mouse movement. According to the test results, the system correctly detects the eyes and their open/closed states and correctly classifies blinking in both eyes with the CNN. However, the modeling of mouse movement sometimes exhibits slowness or incomplete adaptation to facial movement.
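The abstract describes a concrete pipeline: Haar Cascade face detection, Dlib eye-region extraction, CNN open/closed classification, blink-to-click mapping, and face position driving the cursor. Below is a minimal Python sketch of that kind of pipeline, not the authors' implementation. It assumes a trained Keras CNN saved as eye_state_cnn.h5 with a 24x24 grayscale input and a single sigmoid "probability open" output (file name, input size, and output layout are hypothetical), uses pyautogui for mouse control (the paper does not name a mouse-control library), and assumes Dlib's 68-point landmark file shape_predictor_68_face_landmarks.dat is available locally.

# Sketch of a head/eye mouse-control loop under the assumptions stated above.
import cv2
import dlib
import pyautogui
from tensorflow.keras.models import load_model

# Haar Cascade finds the face; Dlib landmarks locate the two eye regions.
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")
eye_model = load_model("eye_state_cnn.h5")  # hypothetical trained CNN

pyautogui.FAILSAFE = False          # keep the corner fail-safe from aborting the demo
SCREEN_W, SCREEN_H = pyautogui.size()
EYE_INPUT = 24                      # assumed CNN input size
prev_closed = {"left": False, "right": False}

def eye_crop(gray, shape, idxs):
    """Crop a padded bounding box around the given landmark indices."""
    xs = [shape.part(i).x for i in idxs]
    ys = [shape.part(i).y for i in idxs]
    pad = 5
    x1, y1 = max(min(xs) - pad, 0), max(min(ys) - pad, 0)
    return gray[y1:max(ys) + pad, x1:max(xs) + pad]

def is_closed(crop):
    """Run the CNN on one eye crop; True when the eye is judged closed."""
    if crop.size == 0:
        return False
    img = cv2.resize(crop, (EYE_INPUT, EYE_INPUT)).astype("float32") / 255.0
    prob_open = float(eye_model.predict(img[None, :, :, None], verbose=0)[0][0])
    return prob_open < 0.5          # assumes sigmoid output = P(open)

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) > 0:
        x, y, w, h = (int(v) for v in faces[0])
        # Face position drives the cursor via a simple linear mapping.
        cx, cy = x + w // 2, y + h // 2
        pyautogui.moveTo(cx * SCREEN_W // frame.shape[1],
                         cy * SCREEN_H // frame.shape[0])

        # 68-point landmarks inside the Haar-detected face rectangle.
        shape = predictor(gray, dlib.rectangle(x, y, x + w, y + h))
        crops = {"left": eye_crop(gray, shape, range(42, 48)),
                 "right": eye_crop(gray, shape, range(36, 42))}
        for side, crop in crops.items():
            closed = is_closed(crop)
            # A close-then-reopen transition of one eye triggers that click.
            if prev_closed[side] and not closed:
                pyautogui.click(button=side)
            prev_closed[side] = closed

    cv2.imshow("head-eye mouse", frame)
    if cv2.waitKey(1) & 0xFF == 27:  # Esc to quit
        break
cap.release()
cv2.destroyAllWindows()

The landmark ranges (36-41 and 42-47) and the linear face-to-screen mapping are illustrative choices; webcam frames are often mirrored, so which landmark group should map to the left click depends on how the frame is flipped, and a real system would smooth the cursor motion and debounce blinks.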