Steer maneuvers for materials handling vehicles
IPC Classification
Country / Type
United States (US) Patent (Granted)
International Patent Classification (IPC, 7th edition)
B62D-006/00
B62D-015/02
B66F-009/075
G05D-001/02
G08C-017/02
Application Number
US-0205449 (2014-03-12)
Registration Number
US-9493184 (2016-11-15)
Inventors / Address
Castaneda, Anthony T.
McCroskey, William W.
Schloemer, James F.
Schumacher, Mark E.
Siefring, Vernon W.
Wellman, Timothy A.
Applicant / Address
Crown Equipment Corporation
Agent / Address
Stevens & Showalter LLP
Citation Information
Cited by: 0
Patents cited: 68
Abstract
A materials handling vehicle implements a steer maneuver to hug an object. A controller on the vehicle receives sensor data from at least one sensing device and detects that a selected object is in an environment proximate the vehicle using the received sensor data. The controller performs a steer maneuver by causing the vehicle to steer such that the vehicle is substantially maintained at a desired distance from the selected object as the vehicle travels on a desired heading.
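The abstract describes a distance-holding behavior: the controller steers so that a fixed lateral offset to the selected object is maintained while the vehicle travels on the desired heading. One simple way to picture this is a proportional correction on the lateral-distance error. The sketch below is illustrative only; the function name, gain, and steering limit are assumptions, not values from the patent:

```python
def steer_correction(measured_dist_m: float,
                     desired_dist_m: float,
                     gain_deg_per_m: float = 10.0,
                     max_angle_deg: float = 15.0) -> float:
    """Illustrative proportional steer command, in degrees.

    Positive output steers toward the tracked object (the vehicle has
    drifted too far away); negative output steers away from it.
    The command is clamped to the vehicle's steering limit.
    """
    error = measured_dist_m - desired_dist_m
    angle = gain_deg_per_m * error
    return max(-max_angle_deg, min(max_angle_deg, angle))
```

In the patented system this kind of correction would be computed by the on-vehicle controller from the sensing-device data each control cycle; a real implementation would also filter the sensor readings and blend the correction with the desired heading.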
Representative Claims
1. A method for a materials handling vehicle to implement a steer maneuver comprising: receiving sensor data from at least one sensing device by a controller on a materials handling vehicle; detecting that a selected object is in an environment proximate the vehicle, the environment comprising: a hug zone displaced laterally from the left or right side of the vehicle; a stop zone laterally inwardly from the hug zone, wherein if an object is detected in at least a portion of the stop zone the vehicle is caused to initiate a braking operation; a no steer zone laterally outwardly from the stop zone, wherein if an object is detected in at least a portion of the no steer zone the vehicle is not permitted to turn toward the no steer zone; and a steer zone located laterally between the no steer zone and the hug zone, wherein if an object is detected in at least a portion of the steer zone the vehicle is permitted to turn toward the steer zone as long as no portion of the object is located in the no steer zone; and performing a steer maneuver by the controller causing the vehicle to steer such that the vehicle is substantially maintained at a desired distance from the selected object, wherein the desired distance is maintained between the vehicle and the selected object as the vehicle travels on a desired heading.
2. The method of claim 1, wherein performing a steer maneuver comprises the controller automatically steering the vehicle such that the selected object is at least partially maintained in the hug zone.
3. The method of claim 2, wherein performing a steer maneuver further comprises steering the vehicle such that at least a portion of the selected object is substantially maintained on a hug line associated with the hug zone.
4. The method of claim 3, wherein: if a laterally innermost portion of the selected object is located laterally between the hug line and the vehicle, the vehicle is automatically steered away from the selected object until the laterally innermost portion of the selected object is located on the hug line, at which point the vehicle is automatically steered to the desired heading; and if the laterally innermost portion of the selected object is located laterally on the other side of the hug line than the vehicle, the vehicle is automatically steered toward the selected object until the laterally innermost portion of the selected object is located on the hug line, at which point the vehicle is automatically steered to the desired heading.
5. The method of claim 2, wherein: the hug zone extends in an axial direction that is parallel to a central axis of the vehicle and the hug zone is laterally displaced from a side of the vehicle; and the desired heading is substantially in the axial direction.
6. The method of claim 1, wherein the environment comprises first and second hug zones, the first hug zone displaced laterally from the left side of the vehicle and the second hug zone displaced from the right side of the vehicle.
7. The method of claim 6, wherein the environment further comprises: first and second stop zones laterally inwardly from the respective first and second hug zones, wherein if an object is detected in a stop zone, the vehicle is caused to initiate a braking operation; first and second no steer zones laterally outwardly from the respective stop zones, wherein if an object is detected in at least a portion of a no steer zone the vehicle is not permitted to turn toward the no steer zone in which the object was detected; and first and second steer zones laterally between the respective no steer zones and the respective hug zones, wherein if an object is detected in at least a portion of a steer zone the vehicle is permitted to turn toward the steer zone in which the object was detected as long as no portion of the object is located in the corresponding no steer zone.
8. The method of claim 7, wherein the selected object is the first object that is detected in at least one of the steer zones and the no steer zones.
9. The method of claim 6, wherein the controller is programmable to only perform a steer maneuver if an object is detected in a select one of the first and second hug zones.
10. The method of claim 1, wherein detecting that a selected object is in an environment proximate the vehicle comprises detecting that the selected object is in a scanned zone of the environment, wherein the scanned zone is scanned by the at least one sensing device.
11. The method of claim 1, wherein the selected object is an object that is determined to be the closest object to the vehicle within the environment, as measured in a lateral direction that is perpendicular to a central axis of the vehicle.
12. The method of claim 1, wherein the selected object is the first object that is detected in a scanned zone defined in the environment, wherein the scanned zone is scanned by the at least one sensing device.
13. The method of claim 1, wherein the selected object comprises one of a stationary wall, a stationary rack, and a stationary stacked product face having a generally axially extending edge portion such that the vehicle is substantially maintained at a desired distance from the edge portion of the wall, rack, or stacked product face.
14. The method of claim 1, wherein performing a steer maneuver comprises the controller causing the vehicle to steer such that the vehicle is substantially maintained at a desired distance from the selected object as the vehicle travels on a desired heading until a predetermined event occurs, the predetermined event comprising at least one of: the selected object leaving the hug zone; the selected object leaving the environment; and a second object entering a predefined portion of the steer zone.
15. The method of claim 1, wherein performing a steer maneuver is implemented by the controller upon authorization to do so by an operator.
16. The method of claim 15, wherein the operator designates whether to substantially maintain the vehicle at a desired distance from an object on the left or right side of the vehicle.
17. The method of claim 16, wherein the operator authorizes a steer maneuver by activating structure on the vehicle or on a remote control device associated with the vehicle.
18. The method of claim 17, wherein the operator activating structure comprises the operator depressing a button located either on the vehicle or on the remote control device.
19. The method of claim 1, wherein performing a steer maneuver comprises: if the selected object is located outside of a zone or line defined within the environment, the vehicle is automatically steered such that the selected object is at least partially located in the zone or on the line, at which point the vehicle is automatically steered to the desired heading while maintaining the desired distance from the selected object.
20. A method for a materials handling vehicle to implement a steer maneuver comprising: receiving sensor data from at least one sensing device by a controller on a materials handling vehicle; detecting that a selected object is in an environment proximate the vehicle; and performing a steer maneuver by the controller causing the vehicle to steer such that the vehicle is substantially maintained at a desired distance from the selected object, wherein the selected object is an object that is determined to be the closest object to the vehicle within the environment, as measured in a lateral direction that is perpendicular to a central axis of the vehicle.
21. The method of claim 20, wherein performing a steer maneuver comprises the controller causing the vehicle to steer such that the vehicle is substantially maintained at a desired distance from the selected object until a predetermined event occurs, wherein the predetermined event comprises at least one of: the selected object leaving a hug zone defined within the environment; the selected object leaving the environment; and a second object entering a predefined portion of a steer zone defined within the environment.
22. The method of claim 20, wherein performing a steer maneuver comprises the controller automatically steering the vehicle such that the selected object is at least partially maintained in a hug zone defined within the environment.
23. The method of claim 22, wherein the hug zone extends in an axial direction that is parallel to a central axis of the vehicle and the hug zone is laterally displaced from a side of the vehicle.
24. The method of claim 20, wherein performing a steer maneuver further comprises steering the vehicle such that at least a portion of the selected object is substantially maintained on a hug line associated with the vehicle.
25. The method of claim 24, wherein: if a laterally innermost portion of the selected object is located laterally between the hug line and the vehicle, the vehicle is automatically steered away from the selected object until the laterally innermost portion of the selected object is located on the hug line, at which point the vehicle is automatically steered to a desired heading; and if the laterally innermost portion of the selected object is located laterally on the other side of the hug line than the vehicle, the vehicle is automatically steered toward the selected object until the laterally innermost portion of the selected object is located on the hug line, at which point the vehicle is automatically steered to the desired heading.
26. The method of claim 25, wherein the desired heading is substantially in an axial direction that is parallel to a central axis of the vehicle.
27. The method of claim 20, wherein the environment comprises first and second hug zones, the first hug zone displaced laterally from the left side of the vehicle and the second hug zone displaced from the right side of the vehicle.
28. The method of claim 27, wherein the environment further comprises: first and second stop zones laterally inwardly from the respective first and second hug zones, wherein if an object is detected in a stop zone, the vehicle is caused to initiate a braking operation; first and second no steer zones laterally outwardly from the respective stop zones, wherein if an object is detected in at least a portion of a no steer zone the vehicle is not permitted to turn toward the no steer zone in which the object was detected; and first and second steer zones laterally between the respective no steer zones and the respective hug zones, wherein if an object is detected in at least a portion of a steer zone the vehicle is permitted to turn toward the steer zone in which the object was detected as long as no portion of the object is located in the corresponding no steer zone.
29. The method of claim 27, wherein the controller is programmable to only perform a steer maneuver if an object is detected in a select one of the first and second hug zones.
30. The method of claim 20, wherein detecting that a selected object is in an environment proximate the vehicle comprises detecting that the selected object is in a scanned zone of the environment, wherein the scanned zone is scanned by the at least one sensing device.
31. The method of claim 20, wherein the selected object comprises one of a stationary wall, a stationary rack, and a stationary stacked product face having a generally axially extending edge portion such that the vehicle is substantially maintained at a desired distance from the edge portion of the wall, rack, or stacked product face.
32. The method of claim 20, wherein performing a steer maneuver is implemented by the controller upon authorization to do so by an operator.
33. The method of claim 32, wherein the operator authorizes a steer maneuver by activating structure on the vehicle or on a remote control device associated with the vehicle.
34. The method of claim 33, wherein the operator activating structure comprises the operator depressing a button located either on the vehicle or on the remote control device.
35. The method of claim 33, wherein the operator designates whether to substantially maintain the vehicle at a desired distance from an object on the left or right side of the vehicle.
36. The method of claim 20, wherein performing a steer maneuver comprises: if the selected object is located outside of a zone or line defined within the environment, the vehicle is automatically steered such that the selected object is at least partially located in the zone or on the line, at which point the vehicle is automatically steered to the desired heading while maintaining the desired distance from the selected object.
37. A method for a materials handling vehicle to implement a steer maneuver comprising: receiving sensor data from at least one sensing device by a controller on a materials handling vehicle; detecting that a selected object is in an environment proximate the vehicle; in response to an operator initiated request, performing a steer maneuver by the controller causing the vehicle to steer such that the vehicle is substantially maintained at a desired distance from the selected object, wherein the operator designates whether to substantially maintain the vehicle at a desired distance from an object on the left or right side of the vehicle.
38. The method of claim 37, wherein performing a steer maneuver comprises the controller steering the vehicle such that the selected object is at least partially maintained in a hug zone defined within the environment, the hug zone extending in an axial direction that is parallel to a central axis of the vehicle and the hug zone is laterally displaced from a side of the vehicle.
39. The method of claim 38, wherein performing a steer maneuver further comprises steering the vehicle such that at least a portion of the selected object is substantially maintained on a hug line associated with the hug zone.
40. The method of claim 39, wherein: if a laterally innermost portion of the selected object is located laterally between the hug line and the vehicle, the vehicle is automatically steered away from the selected object until the laterally innermost portion of the selected object is located on the hug line, at which point the vehicle is automatically steered to a desired heading; and if the laterally innermost portion of the selected object is located laterally on the other side of the hug line than the vehicle, the vehicle is automatically steered toward the selected object until the laterally innermost portion of the selected object is located on the hug line, at which point the vehicle is automatically steered to the desired heading.
41. The method of claim 40, wherein the desired heading is substantially in an axial direction that is parallel to a central axis of the vehicle.
42. The method of claim 37, wherein the environment comprises first and second hug zones, the first hug zone displaced laterally from the left side of the vehicle and the second hug zone displaced from the right side of the vehicle.
43. The method of claim 42, wherein the environment further comprises: first and second stop zones laterally inwardly from the respective first and second hug zones, wherein if an object is detected in a stop zone, the vehicle is caused to initiate a braking operation; first and second no steer zones laterally outwardly from the respective stop zones, wherein if an object is detected in at least a portion of a no steer zone the vehicle is not permitted to turn toward the no steer zone in which the object was detected; and first and second steer zones laterally between the respective no steer zones and the respective hug zones, wherein if an object is detected in at least a portion of a steer zone the vehicle is permitted to turn toward the steer zone in which the object was detected as long as no portion of the object is located in the corresponding no steer zone.
44. The method of claim 43, wherein the selected object is the first object that is detected in at least one of the steer zones and the no steer zones.
45. The method of claim 37, wherein detecting that a selected object is in an environment proximate the vehicle comprises detecting that the selected object is in a scanned zone of the environment, wherein the scanned zone is scanned by the at least one sensing device.
46. The method of claim 37, wherein the selected object is an object that is determined to be the closest object to the vehicle within the environment, as measured in a lateral direction that is perpendicular to a central axis of the vehicle.
47. The method of claim 37, wherein the selected object is the first object that is detected in a scanned zone defined in the environment, wherein the scanned zone is scanned by the at least one sensing device.
48. The method of claim 37, wherein the selected object comprises one of a stationary wall, a stationary rack, and a stationary stacked product face having a generally axially extending edge portion such that the vehicle is substantially maintained at a desired distance from the edge portion of the wall, rack, or stacked product face.
49. The method of claim 37, wherein the operator authorizes a steer maneuver by activating structure on a remote control device associated with the vehicle.
50. The method of claim 49, wherein the operator activating structure comprises the operator depressing a button located on the remote control device.
51. The method of claim 37, wherein performing a steer maneuver comprises the controller causing the vehicle to steer such that the vehicle is substantially maintained at a desired distance from the selected object until a predetermined event occurs, the predetermined event comprising at least one of: the selected object leaving a hug zone defined within the environment; the selected object leaving the environment; and a second object entering a predefined portion of a steer zone defined within the environment.
52. The method of claim 37, wherein performing a steer maneuver comprises: if the selected object is located outside of a zone or line defined within the environment, the vehicle is automatically steered such that the selected object is at least partially located in the zone or on the line, at which point the vehicle is automatically steered to the desired heading while maintaining the desired distance from the selected object.
53. A method for a materials handling vehicle to implement a steer maneuver comprising: receiving sensor data from at least one sensing device by a controller on a materials handling vehicle; detecting that a selected object is in an environment proximate the vehicle; and performing a steer maneuver by the controller causing the vehicle to steer such that the vehicle is substantially maintained at a desired distance from the selected object, wherein if the selected object is located outside of a hug zone or hug line defined within the environment, the vehicle is automatically steered such that the selected object is at least partially located in the hug zone or on the hug line, at which point the vehicle is automatically steered to the desired heading while maintaining the desired distance from the selected object.
54. A method for a materials handling vehicle to implement a steer maneuver comprising: receiving sensor data from at least one sensing device by a controller on a materials handling vehicle; detecting that a selected object is in an environment proximate the vehicle, the environment including first and second hug zones, the first hug zone displaced laterally from the left side of the vehicle and the second hug zone displaced laterally from the right side of the vehicle; and performing a steer maneuver by the controller causing the vehicle to steer such that the vehicle is substantially maintained at a desired distance from the selected object as the vehicle travels on a desired heading, wherein the controller is programmable to only perform a steer maneuver if an object is detected in a select one of the first and second hug zones.
55. The method of claim 54, wherein performing a steer maneuver comprises the controller steering the vehicle such that the selected object is at least partially maintained in the select one of the first and second hug zones.
56. The method of claim 55, wherein performing a steer maneuver further comprises steering the vehicle such that at least a portion of the selected object is substantially maintained on a hug line associated with the select one of the first and second hug zones.
57. The method of claim 56, wherein: if a laterally innermost portion of the selected object is located laterally between the hug line and the vehicle, the vehicle is automatically steered away from the selected object until the laterally innermost portion of the selected object is located on the hug line, at which point the vehicle is automatically steered to a desired heading; and if the laterally innermost portion of the selected object is located laterally on the other side of the hug line than the vehicle, the vehicle is automatically steered toward the selected object until the laterally innermost portion of the selected object is located on the hug line, at which point the vehicle is automatically steered to the desired heading.
58. The method of claim 54, wherein the environment further comprises: first and second stop zones laterally inwardly from the respective first and second hug zones, wherein if an object is detected in a stop zone, the vehicle is caused to initiate a braking operation; first and second no steer zones laterally outwardly from the respective stop zones, wherein if an object is detected in at least a portion of a no steer zone the vehicle is not permitted to turn toward the no steer zone in which the object was detected; and first and second steer zones laterally between the respective no steer zones and the respective hug zones, wherein if an object is detected in at least a portion of a steer zone the vehicle is permitted to turn toward the steer zone in which the object was detected as long as no portion of the object is located in the corresponding no steer zone.
59. The method of claim 54, wherein the operator authorizes a steer maneuver by activating structure on a remote control device associated with the vehicle.
60. The method of claim 59, wherein the operator activating structure comprises the operator depressing a button located on a remote control device associated with the vehicle.
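Claim 1 defines four laterally nested zones extending outward from the vehicle side (stop, no steer, steer, hug), each with its own rule: brake, forbid turning toward, permit turning toward, or hold the desired distance. That amounts to a priority lookup from an object's lateral extent to an action. The sketch below is a hedged illustration of that decision logic; the boundary distances and all names are assumptions, not values from the patent:

```python
from dataclasses import dataclass

@dataclass
class ZoneLimits:
    """Outer boundary of each zone, in metres from the vehicle side.
    These values are illustrative assumptions, not from the patent."""
    stop: float = 0.3      # 0 .. stop          -> stop zone
    no_steer: float = 0.8  # stop .. no_steer   -> no steer zone
    steer: float = 1.5     # no_steer .. steer  -> steer zone
    hug: float = 2.0       # steer .. hug       -> hug zone

def classify(lateral_dist_m: float, limits: ZoneLimits = ZoneLimits()) -> str:
    """Name the zone containing a single lateral distance."""
    if lateral_dist_m < limits.stop:
        return "stop"
    if lateral_dist_m < limits.no_steer:
        return "no_steer"
    if lateral_dist_m < limits.steer:
        return "steer"
    if lateral_dist_m <= limits.hug:
        return "hug"
    return "outside"

def action(innermost_m: float, outermost_m: float,
           limits: ZoneLimits = ZoneLimits()) -> str:
    """Map an object's lateral extent to the claimed behavior.

    Classifying the two endpoints is enough to pick the action: any
    zone the object spans entirely lies outward of the zone holding
    its innermost point, and inner zones take priority anyway.
    """
    zones = {classify(innermost_m, limits), classify(outermost_m, limits)}
    if "stop" in zones:
        return "brake"            # stop zone -> initiate braking
    if "no_steer" in zones:
        return "no_turn_toward"   # may not turn toward that side
    if "steer" in zones:
        return "may_turn_toward"  # no part is in the no steer zone
    if "hug" in zones:
        return "hug"              # hold desired distance / hug line
    return "ignore"
```

The two-sided variant of claims 6 and 7 would simply duplicate these bands on the left and right of the vehicle and track which side the object was detected on.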
Patents cited by this patent (68)
Kakinami Toshiaki (JP); Saiki Mitsuyoshi (JP); Soshi Kunihiko (JP); Satonaka Hisashi (JP), "Apparatus for detecting an object located ahead of a vehicle using plural cameras with different fields of view."
Hasselmann Heinz (Hagen, DE); Münzebrock Anton (Dortmund, DE); Stehr Peter (Hagen, DE); Voll Walter (Hassfurt, DE), "Method and apparatus for the wireless control of lift devices by infrared transmission."
Papanikolopoulos, Nikolaos P.; Krantz, Donald G.; Voyles, Richard M.; Bushey, John A.; Johnson, Alan N.; Nelson, Bradley J.; Rybski, Paul E.; Griggs, Kathleen A.; Urban, II, Ellison C., "Miniature robotic vehicles and methods of controlling same."
Hatton John H. (5275 Craner Ave., North Hollywood, CA 91601); Batt Gregory L. (#2 8741 Montcalm Street, Vancouver, British Columbia, CA V6 4R1), "Multi-axis articulated all terrain vehicle."
Mueller Steven J. (Manitowoc, WI); Haupt Richard O. (Manitowoc, WI); Haupt Donald G. (Manitowoc, WI); Kempfert Lowell A. (Neenah, WI), "Travel speed limiting system for forklift trucks."
Miller Phillip; Koenck Steven E.; Walter Jerry L.; Kubler Joseph J.; Cargin, Jr., Keith K.; Hanson George E.; Davis Patrick H.; Kunert Steven R.; Schultz Darald R., "Vehicular data system for communicating with remote host."