We're working on a senior project using Vision Builder, LabVIEW, and an NI 1712 camera. Unfortunately, none of us in the group has much experience with any of these tools. So far, we've gotten the camera to identify the eyes, and we've developed a program that lets us type in x and y coordinates and moves the mouse cursor to that point on the screen.

Now we want to sync the two up, so that the cursor moves based on eye position instead of typed-in coordinates. Could someone help us develop, or point us in the right direction on, the algorithms that would let us drive the mouse cursor from the real-time eye tracking?
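To make the question more concrete, here is the rough algorithm we're imagining, sketched in Python since LabVIEW is graphical (the screen resolution, calibration points, smoothing factor, and function names below are all placeholder assumptions on our part). Is a linear calibration map plus smoothing the right general shape?

# Sketch of the eye-to-cursor mapping we have in mind (Python pseudocode;
# the real implementation would be LabVIEW subVIs). Assumes a two-point
# calibration: the user fixates the top-left and bottom-right screen
# corners while we record the pupil center that Vision Builder reports.

SCREEN_W, SCREEN_H = 1920, 1080  # assumed screen resolution


def calibrate(pupil_tl, pupil_br):
    """Compute per-axis gain and offset from the two calibration fixations."""
    (x0, y0), (x1, y1) = pupil_tl, pupil_br
    return (SCREEN_W / (x1 - x0), SCREEN_H / (y1 - y0), x0, y0)


def pupil_to_screen(px, py, gx, gy, x0, y0):
    """Linearly map a pupil center (camera pixels) to screen coordinates."""
    sx = (px - x0) * gx
    sy = (py - y0) * gy
    # Clamp so a bad detection can't throw the cursor off-screen.
    return (min(max(sx, 0), SCREEN_W - 1), min(max(sy, 0), SCREEN_H - 1))


def smooth(prev, new, alpha=0.3):
    """Exponential moving average to damp pupil-detection jitter."""
    if prev is None:
        return new
    return tuple(alpha * n + (1 - alpha) * p for n, p in zip(new, prev))


if __name__ == "__main__":
    # Fake calibration data and pupil samples, just to show the loop's flow.
    gx, gy, x0, y0 = calibrate(pupil_tl=(200, 150), pupil_br=(440, 330))
    cursor = None
    for sample in [(210, 160), (320, 240), (430, 320)]:
        cursor = smooth(cursor, pupil_to_screen(*sample, gx, gy, x0, y0))
        print("move cursor to", cursor)  # in LabVIEW: our existing mouse-move code

If that's roughly the right approach, our remaining questions would be how to run the calibration step in Vision Builder and how fast the loop needs to run for the cursor to feel responsive.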
Thanks a ton in advance!!! =)