[U.S. Patent]
System and method for determining a level of operator fatigue
Country/Type: United States (US) Patent — Granted
International Patent Classification (IPC, 7th ed.): H04N-009/47; H04N-007/18; A61B-003/113; G06K-009/00; A61B-003/14; H04N-005/232
Application No.: US-0397410 (filed 2012-02-15)
Registration No.: US-9198575 (granted 2015-12-01)
Inventors: Blacutt, Sergio; Flores, Daniel; Flores, Ruben; Soto, Miguel
Applicant: GUARDVANT, INC.
Agent: Hayes Soloway P.C.
Citation information: cited by 13 patents; cites 36 patents
Abstract
A system for determining a vehicle operator's level of fatigue comprises a camera for detecting at least the vehicle operator's eyes; an onboard vehicle detector for detecting at least one of vehicle speed, location, activity, acceleration, deceleration, or steering wheel rotation; and a processor which receives information detected by the camera and the onboard vehicle detector and calculates a real-time operator fatigue score based on the received data. The operator fatigue score may further be based on an operator's time of shift, type of vehicle, and current and previous tasks.
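PERCLOS (percentage of eyelid closure over time) is the eye-tracking measure that the claimed fatigue score builds on. As an illustrative sketch only — the patent does not disclose an implementation, and the boolean per-frame representation and window length here are assumptions — a PERCLOS value over a sliding window of camera frames might be computed like this:

```python
def perclos(eye_closed_samples, window_size):
    """Fraction of recent camera frames in which the operator's eyes
    are judged closed (conventionally, at least ~80% lid closure).

    `eye_closed_samples` is a list of booleans, one per frame;
    `window_size` is the number of most recent frames to consider.
    """
    window = eye_closed_samples[-window_size:]
    return sum(window) / len(window)

# Example: 12 closed frames out of the last 60 -> PERCLOS 0.2
samples = [True] * 12 + [False] * 48
print(perclos(samples, 60))  # 0.2
```

In practice the per-frame closed/open decision would come from the infrared-illuminated camera's eye and eyelid detection described in the claims; the sketch assumes that decision has already been made upstream.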
Representative Claims
1. In a vehicle having an operator, a system for determining the operator's level of fatigue comprising: a first camera that collects camera data by detecting the operator's eyes and eyelids; a first detector onboard the vehicle for detecting vehicle data including at least one of vehicle speed, location, activity, acceleration, deceleration, or steering wheel rotation; and a processor which receives the camera data and the vehicle data and calculates (a) a PERCLOS value for the operator based on the camera data and calculates (b) an operator fatigue score based on the PERCLOS value and the vehicle data, wherein the processor performs trend analysis on the past and current operator fatigue scores to predict a future level of operator fatigue, calculated as follows: Operator score = average PERCLOS * PERCLOS factor + average fatigue driving percentage * average fatigue driving percentage factor.
2. The system of claim 1, further comprising a second camera for detecting at least the operator's eyes and eyelids, wherein the processor further receives information detected by the second camera.
3. The system of claim 1, further comprising a second detector for detecting transmissions to and from a cellular phone, wherein the processor further receives information detected by the second detector and determines whether the vehicle operator is using a cellular phone.
4. The system of claim 1, wherein the first camera is an infrared illuminated camera.
5. The system of claim 1, wherein the processor calculates the operator fatigue score on a substantially real-time basis.
6. The system of claim 1, wherein the processor is located in a central computer remote from the vehicle and the processor receives said data through a wireless communications network.
7. The system of claim 1, wherein the processor is located onboard the vehicle and the calculated operator fatigue score is transmitted to a central computer through a wireless communications network.
8. The system of claim 2, wherein the second camera is operable to detect an operator's head and shoulders and the processor is operable to determine the occurrence of a head nod, slouching or postural adjustments based on said received data from the first camera.
9. The system of claim 1, wherein the processor is operable to determine operator distraction based on the received data.
10. The system of claim 9, wherein the processor is operable to determine the use of a cellular phone or the composition of a text message by the operator based on the received data.
11. The system of claim 1, wherein the processor further receives data indicating at least one of an operator's time into shift, type of vehicle, current task, or previous tasks.
12. The system of claim 1, wherein the processor produces an operator fatigue scorecard that displays the operator fatigue score.
13. The system of claim 12, wherein the operator fatigue scorecard further displays data received by said processor.
14. The system of claim 12, wherein the operator fatigue scorecard further displays past and current operator fatigue scores over a period of time in a plot or trend-line.
15. The system of claim 14, wherein the processor performs trend analysis on the past and current operator fatigue scores to predict a future level of operator fatigue.
16. In a vehicle having an operator, a method for determining the operator's level of fatigue comprising: detecting with a camera the operator's eyes and eyelids; detecting with a detector located onboard the vehicle, vehicle data selected from at least one of vehicle speed, vehicle location, vehicle activity, vehicle acceleration, vehicle deceleration, and vehicle steering wheel rotation; transmitting data detected by the camera and the detector to a processor; and calculating, by the processor, an operator fatigue score based on (a) a calculated PERCLOS value for the operator based on detection by the camera, and (b) the received vehicle data, wherein the processor performs trend analysis on the past and current operator fatigue scores to predict a future level of operator fatigue, calculated as follows: Operator score = average PERCLOS * PERCLOS factor + average fatigue driving percentage * average fatigue driving percentage factor.
17. The method of claim 16, wherein the step of detecting the operator's eyes and eyelids is performed with two or more cameras.
18. The method of claim 16, further comprising the steps of: detecting radio frequency signals; and determining whether a cellular phone is being used based on the detected radio frequency signals.
19. The method of claim 16, wherein the camera is an infrared illuminated camera.
20. The method of claim 16, wherein the processor calculates the operator fatigue score on a substantially real-time basis.
21. The method of claim 16, wherein a second camera is provided to also detect the operator's head and shoulders and the processor is operable to determine the occurrence of a head nod, slouching or postural adjustments based on said received data from the camera.
22. The method of claim 16, wherein the processor is operable to determine operator distraction based on the received data.
23. The method of claim 22, wherein the processor is operable to also determine the use of a cellular phone or the composition of a text message by the operator based on the received data.
24. The method of claim 16, wherein the processor further receives data indicating at least one of an operator's time into shift, type of vehicle, current task, or previous tasks.
25. The method of claim 16, wherein the processor produces an operator fatigue scorecard that displays the operator fatigue score.
26. The method of claim 25, wherein the operator fatigue scorecard further displays data received by said processor.
27. The method of claim 25, wherein the operator fatigue scorecard further displays past and current operator fatigue scores over a period of time in a plot or trend-line.
28. The method of claim 27, wherein the processor performs trend analysis on the past and current operator fatigue scores to predict a future level of operator fatigue.
29. An article of manufacture for determining a vehicle operator's level of fatigue, wherein said article of manufacture is in communication with a camera directed towards the vehicle operator's eyes and eyelids, and an onboard vehicle detector for detecting vehicle data selected from at least one of vehicle speed, vehicle location, vehicle activity, vehicle acceleration, vehicle deceleration, and vehicle steering wheel rotation, said article comprising a processor and a computer readable medium having computer readable program steps to effect: retrieving data detected by the camera and the onboard vehicle detector, and calculating, by the processor, an operator fatigue score based on (a) a PERCLOS value calculated for the operator from the camera, and (b) the received vehicle data, wherein the processor performs trend analysis on the past and current operator fatigue scores to predict a future level of operator fatigue, calculated as follows: Operator score = average PERCLOS * PERCLOS factor + average fatigue driving percentage * average fatigue driving percentage factor.
30. The article of manufacture of claim 29, wherein the processor calculates the operator fatigue score on a substantially real-time basis.
31. The article of manufacture of claim 29, wherein the camera is also operable to detect an operator's head and shoulders, and the processor determines the occurrence of a head nod, slouching or postural adjustments based on said received data from the camera.
32. The article of manufacture of claim 29, wherein the processor determines operator distraction based on the received data.
33. The article of manufacture of claim 29, wherein the processor determines the use of a cellular phone or the composition of a text message by the operator based on the received data.
34. The article of manufacture of claim 29, wherein the processor further receives data indicating at least one of an operator's time into shift, type of vehicle, current task, or previous tasks.
35. The article of manufacture of claim 29, wherein the processor produces an operator fatigue scorecard that displays the operator fatigue score.
36. The article of manufacture of claim 29, wherein the operator fatigue scorecard further displays data received by said processor.
37. The article of manufacture of claim 29, wherein the operator fatigue scorecard further displays past and current operator fatigue scores over a period of time in a plot or trend-line.
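The scoring formula recited in claims 1, 16, and 29 (Operator score = average PERCLOS * PERCLOS factor + average fatigue driving percentage * average fatigue driving percentage factor) is a two-term weighted sum. A minimal sketch, where the factor weights are chosen purely for illustration (the patent does not specify their values):

```python
def operator_fatigue_score(avg_perclos, perclos_factor,
                           avg_fatigue_driving_pct, fatigue_driving_factor):
    """Weighted sum from the claimed formula.

    The two factor arguments are illustrative assumptions,
    not values disclosed in the patent.
    """
    return (avg_perclos * perclos_factor
            + avg_fatigue_driving_pct * fatigue_driving_factor)

# Hypothetical weights: PERCLOS factor 100, fatigue-driving factor 50
score = operator_fatigue_score(0.15, 100, 0.30, 50)
print(score)
```

Per claims 14-15, 27-28, and 36-37, a sequence of such scores over time would then feed a trend analysis to predict a future fatigue level; how that trend analysis is performed is not specified in the representative claims.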
Cited Patents
Tsujino, Hiroshi (Iruma, JP); Hasegawa, Hiroshi (Niiza, JP); Sakagami, Yoshiaki (Kodaira, JP); Yamashita, Hirokazu (Asaka, JP), "Control device of an autonomously moving body and evaluation method for data thereof."
Burke, Steven A. (Champlin, MN); Liang, Cao Z. (Tianjin, CN); Hall, Ernest L. (Cincinnati, OH), "Guiding an unmanned vehicle by reference to overhead features."
Gudat, Adam J. (Edelstein, IL); Bradbury, Walter J. (Peoria, IL); Christensen, Dana A. (Peoria, IL); Clow, Richard G. (Phoenix, AZ); Devier, Lonnie J. (Pittsburgh, PA); Kemner, Carl A. (Peoria Heights, IL); Kleimenhag, "Integrated vehicle positioning and navigation system, apparatus and method."
Evans, Jr., John M. (Brookfield, CT); King, Steven J. (Woodbury, CT); Weiman, Carl F. R. (Westport, CT), "Mobile robot navigation employing retroreflective ceiling features."
Ollis, Mark; Fromme, Christopher C.; Hegadorn, Timothy Ennis; Kelly, Alonzo James; Bares, John; Herman, Herman; Stentz, Anthony J.; McCargo Moore, Jr., Richard; Herdle, David K.; Higgins, Frank; Cam, "Vision-based motion sensor for mining machine control."
Vijaya Kumar, Vivek; Grimm, Donald K.; Kiefer, Raymond J., "Apparatus for assessing, predicting, and responding to driver fatigue and drowsiness levels."
Vandommele, Tjark; Wulf, Felix, "Method for furnishing a warning signal, and method for generating a pre-microsleep pattern for detection of an impending microsleep event for a vehicle."
Boss, Gregory J.; Fox, Jeremy R.; Jones, Andrew R.; McConnell, Kevin C.; Moore, Jr., John E., "Preventing driver distraction from incoming notifications."