Report Information
Lead Research Institution | 한국과학기술원 Korea Advanced Institute of Science and Technology |
Principal Investigator | 변증남 |
Participating Researchers | 정명진, 장평훈, 권동수, 김대진, 한정수, 도준형, 장효영, 이영진, 김재헌, 유동현, 김도형, 박형순, 강상훈, 김성태, 김종현, 안진웅, 유지환, 황정훈, 이규빈, 임수철, 곽은화, 김형록, 최군호, 홍선기, 이성현, 정의정, 이남수, 전상원 |
Report Type | Phase 3 Report |
Country of Publication | Republic of Korea |
Language | Korean |
Publication Date | 2003-08 |
Project Start Year | 2002 |
Supervising Ministry | Ministry of Science and Technology |
Program Management Agency | 한국과학재단 Korea Science and Engineering Foundation |
Registration Number | TRKO200900073056 |
Project ID | 1350006009 |
Program Name | Priority National R&D Program |
DB Registration Date | 2015-01-08 |
Keywords | Human-robot interaction; Rehabilitation engineering system; Service robot |
Abstract (Korean)
. Development of a wheelchair-mounted robotic system for patients and the disabled
- Proposed a robot design method, based on target-oriented design, for a robot arm capable of 12 predefined tasks; implemented a tension-adjustable, back-drivable, backlash-free power transmission mechanism; proposed a robust compliance control scheme that needs no separate force sensor.
- A new image-processing technique based on the human visual system for intelligent visual servoing; user-intention recognition through facial expression recognition; development of a small, lightweight stereo camera head and controller; implementation of human-robot cooperative task technology; a biosignal classification algorithm for robot arm control.
- Implemented a wearable eye-mouse system using a magnetic sensor and image-processing algorithms, and a non-wearable eye-mouse system using infrared LEDs and geometric properties; in addition, developed an interface program that uses these systems, a speech-processing system integrated with it, and a 3-D position extraction system that improves the performance of the whole system.
- Developed a head interface, a garment-type controller, and a semi-active force-reflecting manual controller so that body motion can be measured to drive assistive systems for the disabled, together with an intelligent interface algorithm for smooth cooperation between robot and user.
- Developed a low-cost, lightweight, easy-to-use hand-held master and its control algorithm: the master's 6-DOF pose is acquired in real time from various sensors, the motion of virtual or real robots and objects is commanded with it, and force feedback is rendered as vibration on the master, yielding a convenient and intuitive control algorithm.
- For the wheelchair robot system, mechanical/exterior improvement and fabrication of the robot-arm-mounting system, recognized as a key subsystem alongside the wheelchair system carrying the user interface, and implementation of interworking with the rehabilitation robot system.
Abstract
To develop the wheelchair-based rehabilitation robotic system KARES II (KAIST Rehabilitation Engineering Service system II), we have been developing various kinds of intelligent human-robot interaction systems during the whole project period (1998.12 to 2003.8), as follows:
(1) Soft Robotic Arm
In this research, we focused on three aspects: first, a target-oriented design method that makes the robot perform given tasks well; second, realization of a tension-adjustable cable mechanism with good backdrivability; and third, sensorless compliance control based on the realized cable mechanism. Target-oriented design (TOD) is an important approach to robot design: a robot designed by this method can perform its predefined tasks well. The tension-adjustable cable mechanism makes cable transmission a more reliable technology, and the sensorless compliance control technique allows the robot to carry out contact tasks more safely.
The results of this research are as follows. (1) A tension-adjustable cable mechanism was realized. (2) Using the developed cable mechanism, the wrist part of the robot was designed and manufactured. (3) A sensorless active compliance control algorithm for user safety was developed and implemented, and through clinical evaluation the compliance preferences of the disabled were acquired. (4) The 12 predefined tasks were executed with the designed robot arm. (5) A task analysis based on sub-tasks was performed, establishing the relations among the tasks; a system integration experiment was carried out on the 'serving a beverage' task, which includes all of the sub-tasks.
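The sensorless compliance scheme can be sketched in a few lines. Everything here is illustrative: the torque constant, friction term, and stiffness are invented values, not parameters from the report. The point is only that contact torque inferred from motor current, rather than from a wrist force sensor, drives an admittance law that lets the arm yield under load.

```python
# Sensorless admittance-style compliance (illustrative sketch).
# External torque is inferred from motor current instead of a force
# sensor; the commanded joint position then yields under load.
# kt, b, and stiffness below are invented for illustration.

def estimate_external_torque(current, velocity, kt=0.05, b=0.002):
    """Contact torque = motor torque (kt * current) minus viscous friction."""
    return kt * current - b * velocity

def compliant_setpoint(q_ref, tau_ext, stiffness=2.0):
    """Admittance law: the setpoint deflects by tau_ext / stiffness,
    so the arm gives way when the user pushes against it."""
    return q_ref + tau_ext / stiffness

# Example: 20 A at rest implies 1.0 N*m of contact torque, so a joint
# commanded to 0 rad yields to 0.5 rad.
tau = estimate_external_torque(current=20.0, velocity=0.0)
q_d = compliant_setpoint(q_ref=0.0, tau_ext=tau)
```

A stiffer arm (larger `stiffness`) deflects less for the same contact torque; that is the kind of knob a clinical evaluation of compliance preference would tune.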
(2) Robotic Arm Control using Visual Servoing and Biosignal-based Control Scheme
In this research, we focused on two aspects: first, visual servoing, a vision-based control technology for the robotic arm; second, robotic arm control using the electromyography (EMG) signal caused by muscular movement. Visual servoing enhances the autonomy of the robotic arm, while EMG-based control realizes human-friendly, easy-to-use human-robot interaction.
(1) Visual servoing technology
A. A modified log-polar mapping and an efficient visual servoing structure using fuzzy logic to overcome the limitations of conventional log-polar image-based visual servoing.
B. Development of a small, lightweight stereo camera head for visual servoing.
C. Enhanced task performance based on sensor fusion and human-robot cooperation technology.
D. A systematic approach to recognizing the user's facial expression and its application to human-robot interaction.
E. Clinical evaluation of the visual servoing technology and reflection of the evaluation results.
F. Based on the feedback from the clinical results, we designed a novel, human-friendly, eye-in-hand stereo camera head with a 1-DOF gripper system (a flat finger with a passively coupled joint based on a tendon mechanism and gears).
The results of this research fall into two parts, hardware and software. On the hardware side, the tendon-driven mechanism has many advantages and can be applied to various kinds of mechanical driving parts. The software can likewise be applied to various human-robot interaction technologies, real-time image processing, and public service robots including rehabilitation robots. Furthermore, the results of the clinical evaluation affect the value of final products in terms of marketability and utility.
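A minimal illustration of the log-polar idea behind item A (the grid sizes and radii here are arbitrary choices, not the modified mapping the report proposes): a pixel is re-indexed by log-radius and angle about a fixation point, so resolution is fine near the "fovea" and coarse in the periphery, as in the human visual system.

```python
import math

# Log-polar mapping (illustrative). A Cartesian pixel (x, y) is
# re-indexed by (log radius, angle) about a fixation point (cx, cy).
# The grid sizes and radius bounds below are arbitrary choices.

def to_log_polar(x, y, cx, cy, n_rings=32, n_wedges=64, r_max=128.0, r_min=1.0):
    """Map a pixel to a (ring, wedge) cell of the log-polar grid."""
    dx, dy = x - cx, y - cy
    r = max(math.hypot(dx, dy), r_min)
    theta = math.atan2(dy, dx) % (2 * math.pi)
    ring = int(n_rings * math.log(r / r_min) / math.log(r_max / r_min))
    wedge = int(n_wedges * theta / (2 * math.pi))
    return min(ring, n_rings - 1), wedge % n_wedges
```

In image-based visual servoing this keeps the target's appearance near the image center detailed while cheaply summarizing the periphery: a pixel near the fixation point lands in a low-index (fine) ring, a distant one in a high-index (coarse) ring.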
(2) EMG signal-based control technology
A. Development of a 6-DOF pattern classification algorithm based on EMG signals.
B. Development of a force extraction algorithm from EMG signals based on independent component analysis (ICA).
C. Development of a wireless EMG amplifier.
D. Development of an EMG-based powered wheelchair interface for users with high-level spinal cord injury.
E. Clinical evaluation of the biosignal pattern classification technology and further work.
We summarize our biosignal classification results as hardware and software components. On the hardware side, a small, low-cost, high-performance wired/wireless EMG amplifier was developed; it gives a high S/N ratio and is robust to environmental noise. The EMG-based powered wheelchair interface for high-level spinal cord injury was successfully evaluated. On the software side, we designed an EMG pattern classifier based on soft computing techniques. We expect the developed techniques to be used not only in rehabilitation robots but also for mouse-pointer control on wearable computers, input devices for home appliances, and so on.
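As a sketch of the pattern-classification idea (not the report's soft-computing classifier; the features, windows, and template values below are invented), a window of EMG samples can be reduced to simple time-domain features and matched to per-motion templates:

```python
# Toy EMG pattern classification: mean absolute value (MAV) and
# zero-crossing count per window, matched to the nearest labelled
# template. A real classifier would normalize features and use many
# channels; the template values here are invented.

def features(window):
    """Time-domain features of one EMG window: (MAV, zero crossings)."""
    mav = sum(abs(s) for s in window) / len(window)
    zc = sum(1 for a, b in zip(window, window[1:]) if a * b < 0)
    return (mav, zc)

def classify(window, templates):
    """Return the label whose template is nearest in feature space."""
    f = features(window)
    return min(templates,
               key=lambda lbl: sum((a - b) ** 2
                                   for a, b in zip(f, templates[lbl])))

# Invented templates: strong, slow bursts for "flex"; faint,
# fast-oscillating baseline for "rest".
templates = {"flex": (0.8, 3), "rest": (0.05, 19)}
```

Each motion class then maps to one robot arm or wheelchair command, which is how a 6-DOF command set can be driven from a handful of surface electrodes.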
(3) Development of Interface Technique using the Eye-mouse System
In this research, we developed a system that allows the disabled who cannot use their limbs to control a robot and an electric wheelchair. Since a person with a C4 spinal lesion can move only the head and eyes to interact with an assistive robot, it is very difficult to transfer commands that meet varying user demands to the robot. Consequently, for people with severe motor disability, a useful input device is one that moves a mouse pointer, the most common computer input device, by means of the eye-gaze direction. The systems developed in this research are an intrusive eye-mouse system and a non-intrusive eye-mouse system. The intrusive eye-mouse, a wearable system, uses an IR LED, the eye image, and a magnetic position sensor. The non-intrusive eye-mouse, a non-contact system, uses five IR LEDs and eye images. Using the proposed system, the user moves the mouse pointer on the computer monitor to select a menu or a button in an interface program; as a result, the user can transfer commands to the robot and the electric wheelchair. We also developed a user-friendly interface program suited to this eye-mouse system and a stereo camera system that extracts the depth information of the target object to improve the robot's performance. The eye-mouse system developed in this research is a convenient interface technique not only for the disabled, the old, and the weak, but also for people without disabilities.
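The pointer-control step admits a tiny sketch. A per-user calibration is assumed, here a simple linear fit to two corner fixations with invented offsets and a 1024x768 screen; the actual systems estimate gaze from IR-LED glints and eye images, which this skips entirely.

```python
# Gaze-to-pointer mapping (illustrative). A tracker reports a pupil
# offset; a per-user calibration converts it to monitor pixels.
# Offsets and screen size are invented values.

def calibrate(p_min, p_max, screen=(1024, 768)):
    """Return a function mapping a pupil offset to pixel coordinates,
    given offsets recorded while the user fixates two screen corners."""
    sx = screen[0] / (p_max[0] - p_min[0])
    sy = screen[1] / (p_max[1] - p_min[1])
    return lambda p: (round((p[0] - p_min[0]) * sx),
                      round((p[1] - p_min[1]) * sy))

gaze_to_pixel = calibrate(p_min=(-0.2, -0.15), p_max=(0.2, 0.15))
```

Selecting a menu button then reduces to dwelling the mapped pixel inside the button's rectangle for a moment, which is the usual click substitute for gaze interfaces.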
(4) Body Motion-based Interface (or Haptic Suit)
To aid handicapped persons with the basic tasks of daily life, a user-friendly human-robot interface is the most important element. The diversity of injury levels among spinal-cord-injured persons requires several different human-robot interfaces. The object of this research is to provide actual aid that increases both the mobility of severely spinal-injured people, by developing a head interface (for C4-injured persons), a shoulder interface (for C5-injured persons), and a hand master (for persons with hand function), and their manipulability, by developing a robot arm interface.
A wearable master for wheelchair control using shoulder motion was developed. The FSR sensor used to measure shoulder motion is robust and cheap, and the suit of the wearable master is good-looking and convenient to put on and take off. The head interface consists of a tilt sensor and looks like an ordinary cap; with it, the disabled can control the wheelchair just by nodding the head. Both the wearable master and the head interface can be connected to the wheelchair easily through the developed stand-alone control unit. A supervisory control scheme was proposed for natural mutual cooperation between human and wheelchair. The supervisory control was embodied with a fuzzy algorithm that determines the human's supervisory level, and it reduces the fatigue of the disabled person controlling the wheelchair. A vibro-tactile actuator also plays an important role in the supervisory control by notifying the user of the robot's (wheelchair's) intention for its next motion.
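The supervisory idea can be sketched as a blend of human and robot commands. The piecewise-linear authority rule below merely stands in for the report's fuzzy inference, and the distances and gains are invented.

```python
# Supervisory blending (sketch of the idea, not the report's fuzzy
# rule base). An authority level alpha in [0, 1] decides how much of
# the final wheelchair command (linear, angular) comes from the human
# versus the robot's obstacle-avoidance suggestion. The distance
# thresholds are invented.

def authority(obstacle_dist):
    """Closer obstacles shift authority toward the robot (alpha -> 0)."""
    if obstacle_dist > 2.0:
        return 1.0                       # human fully in charge
    if obstacle_dist < 0.5:
        return 0.0                       # robot takes over
    return (obstacle_dist - 0.5) / 1.5   # linear blend in between

def blended_command(human_cmd, robot_cmd, obstacle_dist):
    """Mix the two (linear, angular) commands by the authority level."""
    a = authority(obstacle_dist)
    return tuple(a * h + (1 - a) * r for h, r in zip(human_cmd, robot_cmd))
```

Because authority shifts smoothly with the situation, the user never feels an abrupt hand-over, which is one way such a scheme reduces driving fatigue.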
A small, compact 2-DOF force-feedback joystick was developed. It has two brakes as passive actuators and an independent control unit that communicates with the computer over a serial protocol. A new control algorithm for the combined active-passive haptic device was developed.
For the manipulability of the disabled, a robot arm interface was also developed. It consists of a laser pointer, a sip-and-puff measuring device, and a voice recognition module; with this interface the disabled can easily control a 6-DOF robot arm.
The feasibility of the developed interfaces was verified through clinical tests, and the shoulder/head interfaces are close to a commercially available level. All the developed interfaces can be applied to the game industry, medical applications, general human-robot interfaces, and so on. These research results can therefore be a cornerstone for research on human-robot interfaces.
(5) Development of a Micro Input Device for Tele-operated Diagnosis/Surgery and its Application to a Robot System
These days, minimally invasive surgery (MIS) is a focus of many work areas, such as diagnosis and surgery of the human body. This research aims to develop an easily usable and extensible micro input device for tele-operated diagnosis/surgery, a control method for tele-operation, and sensor fusion technology.
First, a pen-type, human-friendly micro input device with a MEMS-based micro accelerometer (3 mG) and micro gyrosensor (0.1 deg/sec) was developed. Next, reference-adaptive control techniques were devised for effective tele-operation. Finally, for posture recognition, a new sensor fusion technology was built with a perception-net approach.
The pen-type micro input device is characterized by its low-cost, lightweight body and high maneuverability.
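For the posture-recognition step, a standard complementary filter gives the flavor of accelerometer/gyro fusion; it stands in for, and is much simpler than, the perception-net approach named above, and the gains are invented.

```python
import math

# Complementary filter for one tilt axis (illustrative stand-in for
# the perception-net fusion): the gyro integral tracks fast pen
# motion but drifts; the gravity direction from the accelerometer is
# noisy but drift-free. Blending the two (k = 0.98, invented) keeps
# the best of both.

def complementary_step(angle, gyro_rate, ax, az, dt=0.01, k=0.98):
    """One update of the estimated pitch angle (radians)."""
    gyro_angle = angle + gyro_rate * dt   # short-term estimate, drifts
    accel_angle = math.atan2(ax, az)      # long-term estimate, noisy
    return k * gyro_angle + (1 - k) * accel_angle

# A wrong initial estimate decays toward the accelerometer's answer
# when the pen is held still (gravity along +z, zero rotation).
angle = 0.1
for _ in range(500):
    angle = complementary_step(angle, gyro_rate=0.0, ax=0.0, az=1.0)
```

The same structure extends to the other axes; the accelerometer corrects the slow gyro drift while the gyro supplies the bandwidth a surgeon's hand motion needs.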
(6) Hand-held Master Device
We propose a cordless hand-held master and a tele-operation-based shared control method for the hand-held master.
The hand-held master is a means of converting human intention into a command for the controller of a slave robot or a VR object. Conventional master arms, whose joints are geared or tendon-driven, can give an intuitive input command and can reflect force and/or tactile feedback; many such master arms are on the market, particularly for surgical robots and VR. We developed hand-held masters to increase intuitiveness and to make input commands easy to issue. The sensors equipped are inertial sensors (accelerometers, gyros), ultrasonic sensors, magnetic sensors, and a camera.
The key factors for a master are intuitiveness, cost, and applicability to various fields. The portability of a master is also very important, as mobile internet technology is rapidly spreading into wider areas. Considering these factors, we propose a hand-held portable master that can be grasped like a pen in the operator's hand to increase intuitiveness. The proposed master has a very simple structure, can be produced at very low cost, and can be applied in various fields such as VR, hazardous environments, microsurgery, wearable computing, 3-D calibration, and so on. We also propose a shared control method, the Force Accommodation Based Adaptive Reference (FABAR) controller, to control the tele-operation system with the hand-held master. We employ force accommodation and combine it with adaptive reference control, which enables the hand-held master to be used in a human-friendly way in master-slave tele-operation and virtual reality environments. The controller combines force accommodation with reference adaptation to achieve mode switching and contact transition for the hand-held master. With the FABAR controller implemented in the tele-operation controller, the operator can naturally switch modes based on adaptive reference control, while the required contact transition and impedance control are implemented by force-accommodation-based control.
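The mode-switching half of the scheme can be caricatured as follows. The threshold and accommodation gain are invented, and this is not the FABAR controller itself, only the force-accommodation idea it builds on.

```python
# Force-accommodation mode switching (sketch). In free space the
# slave tracks the master's reference directly; once the measured
# contact force crosses a threshold, the reference is adapted so the
# commanded motion accommodates the force instead of fighting it.
# contact_threshold and accommodation are invented values.

def adapt_reference(x_ref, f_measured, contact_threshold=0.5,
                    accommodation=0.1):
    """Return (adapted reference, active mode) for one control step."""
    if abs(f_measured) < contact_threshold:
        return x_ref, "free"                      # pure position tracking
    return x_ref - accommodation * f_measured, "contact"
```

Because the reference itself is what changes, the operator experiences the transition as the slave softening on contact rather than as a discrete controller swap.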
(7) Development of an Application System for the Rehabilitation Robotic System
In this research, we developed a mobile robot system that serves as the basic mobile platform for the KARES II system. Specifically, this research aimed to refine the previous KARES II mobile robot system with the following characteristics: 1) development of the advanced motion controller DMC2, 2) a mechanically enhanced differential-wheel structure for the mobile robot system, and 3) a human-friendly, easy-to-use design of the mobile robot system. Although it was newly developed for the KARES II system, we believe this basic product can be applied effectively to many other mobile robot systems.
Table of Contents
- Chapter 1 Overview of the Research and Development Project...32
- 0. Overview of Intelligent Human-Robot Interaction Technology...32
- 1. Implementation of the Human-Friendly Soft Robot Arm...37
- 2. Robot Arm Control using Visual Servoing and Biosignals...39
- 3. Object Pointing by Eye Tracking (Eye-mouse System) and Interface Technology Development...41
- 4. Development of a Robot Interface using Body Motion...42
- 5. Development of a Micro Input Device for Tele-operated Diagnosis/Surgery and a Robot System...44
- 6. Development of the Hand-held Master...44
- 7. Development of Application Systems Based on Technologies Derived from the Rehabilitation Robot System...45
- Chapter 2 Domestic and International Technology Development Status...46
- 0. Existing Rehabilitation Robot Systems and the Target Functions of the KARES II System...46
- 1. Implementation of the Human-Friendly Soft Robot Arm...47
- 2. Robot Arm Control using Visual Servoing and Biosignals...50
- 3. Object Pointing by Eye Tracking (Eye-mouse System) and Interface Technology Development...53
- 4. Development of a Robot Interface using Body Motion...54
- 5. Development of a Micro Input Device for Tele-operated Diagnosis/Surgery and a Robot System...58
- 6. Development of the Hand-held Master...59
- 7. Development of Application Systems Based on Technologies Derived from the Rehabilitation Robot System...62
- Chapter 3 Research and Development Content and Results...66
- 0. Overview of the KARES II System...66
- 1. Implementation of the Human-Friendly Soft Robot Arm...94
- 2. Robot Arm Control using Visual Servoing and Biosignals...201
- 3. Object Pointing by Eye Tracking (Eye-mouse System) and Interface Technology Development...339
- 4. Development of a Robot Interface using Body Motion...419
- 5. Development of a Micro Input Device for Tele-operated Diagnosis/Surgery and a Robot System...545
- 6. Development of the Hand-held Master...549
- 7. Development of Application Systems Based on Technologies Derived from the Rehabilitation Robot System...585
- Chapter 4 Achievement of Objectives and Contribution to Related Fields...598
- 1. Implementation of the Human-Friendly Soft Robot Arm...598
- 2. Robot Arm Control using Visual Servoing and Biosignals...600
- 3. Object Pointing by Eye Tracking (Eye-mouse System) and Interface Technology Development...605
- 4. Development of a Robot Interface using Body Motion...609
- 5. Development of a Micro Input Device for Tele-operated Diagnosis/Surgery and a Robot System...611
- 6. Development of the Hand-held Master...611
- 7. Development of Application Systems Based on Technologies Derived from the Rehabilitation Robot System...611
- Chapter 5 Plans for Utilization of the Research Results...614
- 1. Implementation of the Human-Friendly Soft Robot Arm...615
- 2. Robot Arm Control using Visual Servoing and Biosignals...617
- 3. Object Pointing by Eye Tracking (Eye-mouse System) and Interface Technology Development...621
- 4. Development of a Robot Interface using Body Motion...623
- 5. Development of a Micro Input Device for Tele-operated Diagnosis/Surgery and a Robot System...625
- 6. Development of the Hand-held Master...625
- 7. Development of Application Systems Based on Technologies Derived from the Rehabilitation Robot System...625
- Chapter 6 Overseas Science and Technology Information Collected during the Research...626
- 1. Implementation of the Human-Friendly Soft Robot Arm...626
- 2. Robot Arm Control using Visual Servoing and Biosignals...626
- 3. Object Pointing by Eye Tracking (Eye-mouse System) and Interface Technology Development...631
- 4. Development of a Robot Interface using Body Motion...631
- 5. Development of a Micro Input Device for Tele-operated Diagnosis/Surgery and a Robot System...632
- 6. Development of the Hand-held Master...632
- Chapter 7 References...634