Yes, that's a good point, but then there would be another effect: when you look down, the eyes look nearly closed when observed from a static camera.
I'll try several camera positions to check which is the most efficient and usable.
Grrr, I have to rebuild a kernel because of this unsupported camera :laser: :chase: I'm losing time, and the delays are exasperating :ranger:
ok fixed now :)
The next step will be for tomorrow; on my side it's 00:29, time to sleep a bit. I briefly tried the camera, and the IR gives pretty good results in complete darkness.
Today I modified the code to announce the directions using voice synthesis, but this slowed the overall result down too much, so I've set the voice aside for the moment to focus on the essentials.
I simplified the angle readings down to simple directions like N, E, S, W, NE, SE, SW, NW, and counters inside the code determine how long I look in one direction. By converting the elapsed time into the number of pixels I wish to move, I send values to a dummy device. This dummy device, using Xlib, has the task of receiving the values and moving the cursor on the screen.
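The quantization and dwell-counting steps above can be sketched roughly like this (a minimal sketch under my own assumptions: angles in degrees with 0 = north going clockwise, and a made-up `pixels_per_frame` speed; the real code reads the camera and talks to the Xlib dummy device instead):

```python
# The eight compass directions, clockwise starting from north.
DIRECTIONS = ["N", "NE", "E", "SE", "S", "SW", "W", "NW"]

def quantize(angle_deg):
    """Snap a gaze angle (degrees, 0 = north, clockwise) to one of 8 directions."""
    sector = int(((angle_deg % 360) + 22.5) // 45) % 8
    return DIRECTIONS[sector]

# Unit step for each direction, in screen coordinates (y grows downward).
STEP = {"N": (0, -1), "NE": (1, -1), "E": (1, 0), "SE": (1, 1),
        "S": (0, 1), "SW": (-1, 1), "W": (-1, 0), "NW": (-1, -1)}

def cursor_delta(direction, frames, pixels_per_frame=5):
    """Turn a dwell time (in frames) into the pixel offset sent to the dummy device."""
    dx, dy = STEP[direction]
    return dx * frames * pixels_per_frame, dy * frames * pixels_per_frame
```

The dwell counter is what makes the scheme usable: a brief glance produces only a small delta, while holding a direction keeps accumulating frames and moves the cursor further.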
I don't know if you followed the idea, but with simple elements it's possible to get something roughly designed that can serve as a starting point for making something useful :)
Here is the video showing the result of the simplified output; after that, I believe things will be clearer.
OK, in the previous video the interesting part was in the console just under what the camera was displaying. Now one more with IR. I definitely have to build some glasses with a camera, at least for the development phase; after that there will always be time to choose a good camera to embed inside the mirror or clip somewhere on the mirror's structure.
Still perfecting the thing so it isn't polluted by other shadows or light events, and so the camera always stays in relation with the pupil under all conditions. I tried the camera cursor redirected to the dummy mouse, and it werks :cool:
There should hopefully be a new video in the next few days for the eye-tracking thing, with direct access to the mouse cursor just by looking at it. This means that detecting the angle and sending pulses to the dummy device is no longer needed: just look at the screen and the mouse cursor jumps to that position. But before that I still have to fix the accuracy, which sometimes loses the iris :)
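The "look and the cursor jumps there" behaviour amounts to a direct mapping from pupil position to screen coordinates. Here is a minimal sketch assuming a hypothetical linear calibration scheme (`cal` holds the pupil positions observed while looking at the screen edges; the real program would obtain these during a calibration pass):

```python
def gaze_to_screen(pupil_x, pupil_y, cal, screen_w=1920, screen_h=1080):
    """Map a pupil position to a screen coordinate via a linear calibration.

    `cal` = (left, top, right, bottom): pupil coordinates recorded while
    looking at the left, top, right and bottom edges of the screen
    (hypothetical calibration format).
    """
    left, top, right, bottom = cal
    # Normalise the pupil position into [0, 1] within the calibrated range.
    nx = (pupil_x - left) / (right - left)
    ny = (pupil_y - top) / (bottom - top)
    # Clamp so the cursor always stays on screen, then scale to pixels.
    x = min(max(nx, 0.0), 1.0) * (screen_w - 1)
    y = min(max(ny, 0.0), 1.0) * (screen_h - 1)
    return int(x), int(y)
```

The resulting (x, y) pair is what would be handed to the Xlib side to warp the pointer, replacing the earlier direction-plus-pulses scheme.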
Latest news: I finally found a way to lock the eye position in spite of head movements. This will allow me to move to the next step, guessing the gaze direction without the hassle of having to remain in one fixed position, which would not be usable in a car environment.
I also found a skin-detection algorithm. Its purpose, once integrated into the eye-locking program, will be to detect eye blinks and check whether a blink was intentional or not, and thus whether to click on the point the eye is looking at.
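Telling an intentional blink from an involuntary one can come down to how long the eye stays closed: involuntary blinks are short (roughly 100-300 ms), so a longer closure can be treated as a deliberate "click". A tiny sketch of that idea, where the 400 ms threshold is my own assumption, not a value from the project:

```python
# Threshold separating involuntary blinks from deliberate ones (assumed value).
INTENTIONAL_MS = 400

def classify_blink(closed_ms):
    """Return 'click' for a deliberate long blink, 'blink' otherwise."""
    return "click" if closed_ms >= INTENTIONAL_MS else "blink"
```

In practice the threshold would need tuning per user, and very long closures (drowsiness) might deserve a third category in a car environment.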
You're starting to get famous, kraft :)
You're quoted in several forums for your voice control: http://www.alionet.org/index.php?showtopic=16882
lol i'm not famous and i don't care.
I thought that this version of perlbox could be useful for people with disabilities to use at home, but it seems nobody is interested in French voice recognition :)
While I'm here: I have isolated the iris, but I still have trouble with the filtering. Sometimes, for unknown reasons, even while the iris is well isolated from the background and displayed, the library has difficulties extracting the shape. I'll try some other filtering strategies, but I have the feeling I'm very near the final result; at least only the pupils are displayed in the live image. I also have to find a way to get the movement of the pupils relative to the targeted area (the eyes) instead of the whole image.
This is necessary to avoid having a camera hanging on the glasses.
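The relative-movement idea can be sketched simply: express the pupil centre as a fraction of the detected eye region rather than in full-frame coordinates. Here `eye_box` is a hypothetical (x, y, width, height) rectangle of the kind a typical eye detector returns:

```python
def pupil_relative(pupil_xy, eye_box):
    """Return the pupil centre as fractions (0..1) of the eye bounding box.

    Because the result is relative to the eye box, it stays stable when the
    head (and therefore the box) moves inside the camera frame, which is what
    makes a fixed camera workable instead of one mounted on glasses.
    """
    px, py = pupil_xy
    ex, ey, ew, eh = eye_box
    return (px - ex) / ew, (py - ey) / eh
```

With this, a head translation shifts both the pupil and the eye box by the same amount, so the relative coordinates, and hence the estimated gaze, are unchanged.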