Human-Robot Interaction Strategies for Walker-Assisted Locomotion

Name: CARLOS ANDRES CIFUENTES GARCIA
Type: Doctoral thesis
Publication date: 25/06/2015
Advisor:

Name | Role
ANSELMO FRIZERA NETO | Advisor
TEODIANO FREIRE BASTOS FILHO | Co-advisor

Examination committee:

Name | Role
ADRIANO ALMEIDA GONÇALVES SIQUEIRA | External examiner
ANDRE FERREIRA | External examiner
ANSELMO FRIZERA NETO | Advisor
EVANDRO OTTONI TEATINI SALLES | Internal examiner
RODRIGO VAREJÃO ANDREÃO | External examiner


Abstract: Neurological and age-related diseases affect human mobility at different levels, causing partial or total loss of this faculty. There is a significant need to improve the safe and efficient ambulation of patients with gait impairments. In this context, walkers offer important benefits for human mobility, improving balance and reducing the load on the lower limbs. Most importantly, walkers encourage patients to use their residual mobility capacities in different environments. In the field of robotic technologies for gait assistance,
a new category of walkers has emerged, integrating robotics, electronics and mechanics. Such devices are known as robotic walkers, intelligent walkers or smart walkers. One important aspect common to assistive technologies and rehabilitation robotics is the intrinsic interaction between the human and the robot.
In this thesis, the concept of Human-Robot Interaction (HRI) for human locomotion assistance is explored. This interaction is composed of two interdependent components.
On the one hand, the key role of a robot in Physical HRI (pHRI) is the generation of supplementary forces to empower human locomotion. This involves a net flux of power between both actors. On the other hand, one of the crucial roles of Cognitive HRI (cHRI) is to make the human aware of the robot's capabilities while allowing him or her to maintain control of the robot at all times.
This doctoral thesis presents a new multimodal human-robot interface for testing and validating control strategies applied to a robotic walker for assisting human mobility and gait rehabilitation. This interface extracts navigation intentions from a novel sensor fusion method that combines: (i) a Laser Range Finder (LRF) sensor to estimate the kinematics of the user's legs, (ii) wearable Inertial Measurement Unit (IMU) sensors to capture the human and robot orientations, and (iii) force sensors to measure the physical interaction
between the human's upper limbs and the robotic walker.
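
As an illustration only, the following Python sketch shows one way such a fusion step could be organised: candidate legs are extracted from an LRF scan, and their positions are blended with the IMU yaw readings and the handlebar force/torque into a navigation-intention estimate. The function names, gains and thresholds are assumptions made for the sketch, not the method developed in the thesis.

# Hedged sketch (not the thesis implementation): one possible fusion step that
# combines hypothetical readings from the three sensor modalities listed above.
# All names, gains and thresholds are illustrative assumptions.
import numpy as np

def detect_legs(lrf_ranges, lrf_angles, max_leg_range=1.5):
    """Keep LRF returns closer than max_leg_range (m), split them into a crude
    left/right grouping, and return the Cartesian centroid of each group."""
    pts = np.column_stack((lrf_ranges * np.cos(lrf_angles),
                           lrf_ranges * np.sin(lrf_angles)))
    pts = pts[lrf_ranges < max_leg_range]          # keep near returns only
    if len(pts) < 2:
        return None
    left = pts[pts[:, 1] >= 0].mean(axis=0)        # centroid of left-side points
    right = pts[pts[:, 1] < 0].mean(axis=0)        # centroid of right-side points
    return left, right

def fuse_intention(legs, human_yaw, walker_yaw, push_force, torque,
                   k_v=0.5, k_w=1.0, k_f=0.01, k_t=0.01):
    """Blend the human-walker distance (from the legs), the IMU yaw mismatch
    and the handlebar force/torque into a (linear, angular) intention estimate."""
    left, right = legs
    gap = 0.5 * (left[0] + right[0])               # mean forward distance to the legs (m)
    yaw_err = human_yaw - walker_yaw               # orientation mismatch (rad)
    v_cmd = k_v * gap + k_f * push_force           # forward intention
    w_cmd = k_w * yaw_err + k_t * torque           # turning intention
    return v_cmd, w_cmd

# Synthetic example: a flat scan with two nearby clusters standing in for the legs.
angles = np.linspace(-np.pi / 3, np.pi / 3, 60)
ranges = np.full_like(angles, 3.0)
ranges[25:30] = 0.60                               # "right leg" returns (negative y)
ranges[32:37] = 0.65                               # "left leg" returns (positive y)
legs = detect_legs(ranges, angles)
print(fuse_intention(legs, human_yaw=0.1, walker_yaw=0.0, push_force=10.0, torque=0.5))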
Two closed control loops were developed to naturally adapt the walker position and to perform body weight support strategies. First, a force interaction controller generates velocity commands for the walker based on the physical interaction at the upper limbs. Second, an inverse kinematics controller keeps the walker at a desired position relative to the human, improving such interaction.
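
A minimal sketch of the two control ideas described above, under assumed dynamics and gains: an admittance-style force interaction loop that turns handlebar force and torque into walker velocities, and a proportional distance-keeping law used here as a simplified stand-in for the inverse kinematics controller. All names and numerical values are illustrative, not taken from the thesis.

# Hedged sketch of the two loops named above, not the thesis code.
class ForceInteractionController:
    """First-order admittance: force drives linear velocity, torque drives angular velocity."""
    def __init__(self, mass=10.0, damping=15.0, inertia=2.0, ang_damping=4.0):
        self.v = 0.0                               # current linear command (m/s)
        self.w = 0.0                               # current angular command (rad/s)
        self.m, self.b = mass, damping             # virtual mass and damping (assumed)
        self.j, self.bw = inertia, ang_damping     # virtual inertia and damping (assumed)

    def step(self, force, torque, dt):
        """Integrate the admittance dynamics for one control period dt (s)."""
        self.v += dt * (force - self.b * self.v) / self.m
        self.w += dt * (torque - self.bw * self.w) / self.j
        return self.v, self.w

def distance_keeping(human_distance, desired_gap=0.6, k=0.8):
    """Velocity correction that moves the walker so the human stays about
    desired_gap metres behind it (illustrative proportional law)."""
    return k * (desired_gap - human_distance)

# Usage example with made-up readings: 12 N push, 0.8 N.m turning torque,
# human currently 0.45 m from the walker, 50 Hz control loop.
ctrl = ForceInteractionController()
v, w = ctrl.step(force=12.0, torque=0.8, dt=0.02)
v += distance_keeping(human_distance=0.45)
print(v, w)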
The proposed control strategies are suitable for natural human-robot interaction, as shown during the experimental validation. Moreover, sensor fusion methods to estimate the control inputs were presented and validated. In the experimental studies, the parameter estimation was precise and unbiased, and it showed repeatability when speed changes and continuous turns were performed.
