Search results

1 – 10 of 37
Article
Publication date: 11 July 2023

K. Madhana, L.S. Jayashree and Kalaivani Perumal

Abstract

Purpose

Human gait involves significant parts of the musculoskeletal, nervous and respiratory systems. Gait analysis is widely adopted to help patients increase community involvement and independent living.

Design/methodology/approach

This paper presents a system for the classification of abnormal human gaits using a Markerless 3D Motion Capture device. The study aims to examine and estimate the spatiotemporal and kinematic parameters obtained by 3D gait analysis in diverse groups of gait-impaired subjects and compares these parameters with those of healthy participants to interpret the gait patterns.

Findings

The classification is based on mathematical models that distinguish between normal and abnormal gait patterns depending on the deviations in the gait parameters. The difference between the gait measures of the control group and each disease group was examined using 95% limits of agreement by the Bland and Altman method. The scatter plots demonstrated gait variability in Parkinsonian and ataxic gait, and knee joint angle variation in hemiplegic gait, when compared with healthy controls. To validate the Kinect camera, significant correlations were detected between Kinect- and inertial-based gait tests.
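A minimal sketch of the Bland and Altman agreement analysis referred to above (not from the paper; the array names and the use of NumPy are assumptions):

```python
import numpy as np

def bland_altman_limits(kinect, reference):
    """Bias and 95% limits of agreement for one paired gait parameter."""
    kinect, reference = np.asarray(kinect, float), np.asarray(reference, float)
    diff = kinect - reference             # per-trial differences between systems
    mean = (kinect + reference) / 2.0     # per-trial means (x-axis of a Bland-Altman plot)
    bias = diff.mean()                    # systematic offset between the two systems
    sd = diff.std(ddof=1)
    lower, upper = bias - 1.96 * sd, bias + 1.96 * sd  # 95% limits of agreement
    return mean, diff, bias, (lower, upper)
```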

Originality/value

The various techniques used for gait assessment are often expensive and have limitations such as obtrusive components. The results suggest that the Kinect-based gait assessment technique can be used as a low-cost, less intrusive alternative to expensive gait laboratory tests in the clinical environment.

Details

Journal of Enabling Technologies, vol. 17 no. 2
Type: Research Article
ISSN: 2398-6263

Article
Publication date: 8 March 2011

Matthew Field, Zengxi Pan, David Stirling and Fazel Naghdy

Abstract

Purpose

The purpose of this paper is to provide a review of various motion capture technologies and discuss the methods for handling the captured data in applications related to robotics.

Design/methodology/approach

The approach taken in the paper is to compare the features and limitations of motion trackers in common use. After introducing the technology, a summary is given of robotics-related work undertaken with the sensors, and the strengths of different approaches to handling the data are discussed. Each comparison is presented in a table. Results from the authors' experimentation with an inertial motion capture system are discussed based on clustering and segmentation techniques.

Findings

The trend in methodology is towards stochastic machine learning techniques such as hidden Markov models and Gaussian mixture models, their hierarchical extensions and non-linear dimensionality reduction. The resulting empirical models tend to handle uncertainty well and are suitable for incremental updating. Today's challenges in human-robot interaction include generalising motions further to understand motion planning and decision-making, and ultimately building context-aware systems.
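A minimal sketch of the Gaussian mixture modelling trend noted in these findings (illustrative only; the synthetic feature matrix stands in for real motion capture frames, and this is not the authors' code):

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Synthetic stand-in for motion capture data: one row per time step,
# columns are joint angles or inertial readings.
frames = np.random.default_rng(0).normal(size=(5000, 12))

# Each mixture component acts as a motion "primitive".
gmm = GaussianMixture(n_components=6, covariance_type="full", random_state=0)
gmm.fit(frames)

labels = gmm.predict(frames)   # segment the stream frame by frame
loglik = gmm.score(frames)     # average log-likelihood, useful for model selection
```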

Originality/value

Reviews including descriptions of motion trackers and recent methodologies used in analyzing the data they capture are not very common. Some exist, as has been pointed out in the paper, but this review concentrates more on applications in the robotics field. There is value in regularly surveying the research areas considered in this paper due to the rapid progress in sensors and especially data modeling.

Details

Industrial Robot: An International Journal, vol. 38 no. 2
Type: Research Article
ISSN: 0143-991X

Article
Publication date: 19 October 2015

Ping Zhang, Xin Liu, Guanglong Du, Bin Liang and Xueqian Wang

Abstract

Purpose

The purpose of this paper is to present a markerless human–manipulators interface which maps the position and orientation of the human end-effector (EE, the center of the palm) to those of the robot EE so that the robot can copy the movement of the operator's hand.

Design/methodology/approach

The tracking system of this human–manipulators interface comprises five Leap Motions (LMs), which not only compensates for the narrow workspace of a single LM but also provides redundancy to improve data precision. However, because of the inherent noise and tracking errors of the LMs, the measurement errors increase over time. To address this problem, two filters are integrated to obtain a relatively accurate estimate of the human EE: a Particle Filter for position estimation and a Kalman Filter for orientation estimation. Because the operator has inherent perceptual limitations, the motions of the manipulator may be out of sync with the hand motions, making high-performance manipulation difficult. Therefore, an over-damping method is adopted to improve reliability and accuracy.
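A minimal sketch of the particle-filter idea mentioned above, assuming a random-walk motion model, Gaussian measurement noise and illustrative noise levels rather than the authors' formulation:

```python
import numpy as np

rng = np.random.default_rng(0)

def particle_filter_step(particles, weights, z, process_std=2.0, meas_std=5.0):
    """One predict/update/resample cycle for palm-position tracking (units: mm)."""
    # Predict: diffuse hypotheses with process noise (random-walk motion model).
    particles = particles + rng.normal(0.0, process_std, particles.shape)

    # Update: reweight by the Gaussian likelihood of the fused LM measurement z.
    d2 = np.sum((particles - z) ** 2, axis=1)
    weights = weights * np.exp(-0.5 * d2 / meas_std ** 2)
    weights /= weights.sum()

    # Resample: draw particles in proportion to their weights.
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    particles = particles[idx]
    weights = np.full(len(particles), 1.0 / len(particles))

    return particles, weights, particles.mean(axis=0)  # estimate of the human EE
```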

Findings

A series of human–manipulators interaction experiments were carried out to verify the proposed system. Compared with the markerless and contactless methods (Kofman et al., 2007; Du and Zhang, 2015), the method described in this study is more accurate and efficient.

Originality/value

The proposed method does not hinder most natural human limb motion and allows the operator to concentrate on his/her own task, enabling high-precision manipulations to be performed efficiently.

Details

Industrial Robot: An International Journal, vol. 42 no. 6
Type: Research Article
ISSN: 0143-991X

Article
Publication date: 11 June 2019

Muhammad Yahya, Jawad Ali Shah, Kushsairy Abdul Kadir, Zulkhairi M. Yusof, Sheroz Khan and Arif Warsi

Abstract

Purpose

Motion capture (MoCap) systems have been used to measure human body segments in several applications, including film special effects, health care, outer-space and under-water navigation systems, sea-water exploration, human–machine interaction and learning software that helps teachers of sign language. The purpose of this paper is to help researchers select a specific MoCap system for various applications and to support the development of new algorithms related to upper limb motion.

Design/methodology/approach

This paper provides an overview of different sensors used in MoCap and techniques used for estimating human upper limb motion.

Findings

Existing MoCap systems suffer from several issues depending on the type of MoCap used. These issues include drift and placement constraints of inertial sensors, occlusion and jitter in the Kinect, noise in electromyography signals and, for multiple-camera systems, the requirement of a well-structured, calibrated environment and the time-consuming task of placing markers.
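One widely used mitigation for the inertial-sensor drift listed above is a complementary filter, which blends the drifting gyroscope integral with a noisy but drift-free accelerometer angle; a minimal sketch, not a technique claimed by this paper:

```python
def complementary_filter(angle_prev, gyro_rate, accel_angle, dt, alpha=0.98):
    """Fuse a gyroscope rate (deg/s) with an accelerometer tilt angle (deg).

    The gyro path is responsive but drifts; the accelerometer path is noisy
    but drift-free. `alpha` sets how strongly the gyro path is trusted.
    """
    gyro_angle = angle_prev + gyro_rate * dt          # fast estimate, drifts over time
    return alpha * gyro_angle + (1.0 - alpha) * accel_angle  # accelerometer corrects drift
```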

Originality/value

This paper outlines the issues and challenges in MoCap systems for measuring human upper limb motion and provides an overview of techniques to overcome them.

Details

Sensor Review, vol. 39 no. 4
Type: Research Article
ISSN: 0260-2288

Article
Publication date: 20 February 2009

Guan Tao, Li Lijun, Liu Wei and Wang Cheng

Abstract

Purpose

The purpose of this paper is to provide a flexible registration method for markerless augmented reality (AR) systems.

Design/methodology/approach

The proposed method distinguishes itself as follows. Firstly, it is simple and efficient, as no man-made markers are needed for either indoor or outdoor AR applications. Secondly, an adaptation method is presented to tune the particle filter dynamically; the result is a system that can tolerate fast motion and drift during the tracking process. Thirdly, the authors use reduced scale-invariant feature transform (SIFT) and scale-prediction techniques to match natural features, which deals easily with the camera pose estimation problem under large changes in illumination and viewing angle.
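A minimal sketch of plain SIFT feature matching with OpenCV and a RANSAC homography, assuming placeholder image paths and Lowe's ratio test; it does not reproduce the paper's reduced SIFT, scale prediction or particle-filter pose tracking:

```python
import cv2
import numpy as np

reference = cv2.imread("keyframe.png", cv2.IMREAD_GRAYSCALE)   # placeholder paths
frame = cv2.imread("frame.png", cv2.IMREAD_GRAYSCALE)

sift = cv2.SIFT_create()
kp_ref, des_ref = sift.detectAndCompute(reference, None)
kp_cur, des_cur = sift.detectAndCompute(frame, None)

# Match descriptors and keep pairs passing Lowe's ratio test.
matches = cv2.BFMatcher().knnMatch(des_ref, des_cur, k=2)
good = [p[0] for p in matches if len(p) == 2 and p[0].distance < 0.75 * p[1].distance]

# Robustly estimate the homography between views from the surviving matches.
src = np.float32([kp_ref[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
dst = np.float32([kp_cur[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
H, inlier_mask = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
```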

Findings

Some experiments are provided to validate the performance of the proposed method.

Originality/value

The paper proposes a novel camera pose estimation method based on adaptive particle filter and natural features matching techniques.

Details

Assembly Automation, vol. 29 no. 1
Type: Research Article
ISSN: 0144-5154

Details

Marketing in Customer Technology Environments
Type: Book
ISBN: 978-1-83909-601-3

Article
Publication date: 2 July 2020

Zoltan Dobra and Krishna S. Dhir

Abstract

Purpose

Recent years have seen a technological change, Industry 4.0, in the manufacturing industry. Human–robot cooperation, a new application, is increasing, facilitating collaboration without fences, cages or any other kind of separation. The purpose of the paper is to review mainstream academic publications to evaluate the current status of human–robot cooperation and identify potential areas of further research.

Design/methodology/approach

A systematic literature review is offered that searches, appraises, synthesises and analyses relevant works.

Findings

The authors report the prevailing status of human–robot collaboration, human factors, complexity/programming, safety, collision avoidance, instructing the robot system and other aspects of human–robot collaboration.

Practical implications

This paper identifies new directions and potential research in the practice of human–robot collaboration, such as measuring the degree of collaboration, integrating human–robot cooperation into teamwork theories, effective functional relocation of the robot and product design for human–robot collaboration.

Originality/value

This paper will be useful for three cohorts of readers, namely, manufacturers who require a baseline for the development and deployment of robots; users of robots seeking manufacturing advantage; and researchers looking for new directions for further exploration of human–machine collaboration.

Details

Industrial Robot: the international journal of robotics research and application, vol. 47 no. 5
Type: Research Article
ISSN: 0143-991X

Article
Publication date: 21 August 2017

Yassine Bouteraa and Ismail Ben Abdallah

Abstract

Purpose

The idea is to exploit the natural stability and performance of the human arm during movement, execution and manipulation. The purpose of this paper is to remotely control a handling robot with a low-cost but effective solution.

Design/methodology/approach

The developed approach is based on three different techniques to ensure movement and pattern recognition of the operator's arm as well as effective control of the object manipulation task. First, the methodology relies on Kinect-based gesture recognition of the operator's arm. However, a vision-based approach alone is not suitable for hand posture recognition, particularly when the hand is occluded. The proposed approach therefore supplements the vision-based system with an electromyography (EMG)-based biofeedback system for posture recognition. Moreover, the approach adds force feedback to the vision-based gesture control and EMG-based posture recognition to inform the operator of the actual grasping state.

Findings

The main finding is a robust method for gesture-based control of a robot manipulator during movement, manipulation and grasping. The proposed approach uses a real-time gesture control technique based on a Kinect camera that can provide the exact position of each joint of the operator's arm. The developed solution also integrates EMG biofeedback and force feedback in its control loop. In addition, the authors propose a user-friendly human-machine interface (HMI) which allows the user to control a robotic arm in real time. The robust trajectory-tracking challenge has been solved by implementing a sliding mode controller. A fuzzy logic controller has been implemented to manage the grasping task based on the EMG signal. Experimental results have shown the high efficiency of the proposed approach.
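A minimal sketch of a first-order sliding mode law for a single joint; the gains, the boundary-layer width and the omission of the model-based equivalent-control term are illustrative simplifications, not the authors' design:

```python
import numpy as np

def sliding_mode_torque(q, dq, q_des, dq_des, lam=5.0, k=2.0, phi=0.05):
    """Simplified sliding mode law for one joint (illustrative gains)."""
    e, de = q_des - q, dq_des - dq
    s = de + lam * e                       # sliding surface s = de + lam*e
    sat = np.clip(s / phi, -1.0, 1.0)      # boundary layer instead of sign(s) to reduce chattering
    return k * sat + lam * de              # switching term plus surface-shaping term
```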

Research limitations/implications

There are some constraints when applying the proposed method, such as the sensitivity of the desired trajectory generated by the human arm to random and unwanted movements, which can damage the manipulated object during the teleoperation process. In such cases, operator skill is highly required.

Practical implications

The developed control approach can be used in all applications that require real-time human–robot cooperation.

Originality/value

The main advantage of the developed approach is that it benefits simultaneously from three techniques: EMG biofeedback, a vision-based system and haptic feedback. In such situations, vision-based approaches alone are not effective for hand posture recognition; recognition should instead be based on the biofeedback naturally generated by the muscles responsible for each posture. Moreover, using a force sensor in a closed-loop control scheme without operator intervention is ineffective in the special cases in which the manipulated objects vary over a wide range of metallic characteristics. Therefore, a human-in-the-loop technique can imitate natural human postures in the grasping task.

Details

Industrial Robot: An International Journal, vol. 44 no. 5
Type: Research Article
ISSN: 0143-991X

Article
Publication date: 3 February 2020

Grant Rudd, Liam Daly and Filip Cuckov

Abstract

Purpose

This paper aims to present a control system for robotic manipulators that pairs a Leap Motion, a low-cost optical tracking and gesture recognition device, with the ability to record and replay trajectories and operations, creating an intuitive method of controlling and programming a robotic manipulator. The system was designed to be extensible and includes modules and methods for obstacle detection and dynamic trajectory modification for obstacle avoidance.

Design/methodology/approach

The presented control architecture, while portable to any robotic platform, was designed to actuate a six degree-of-freedom robotic manipulator of our own design. From the data collected by the Leap Motion, the manipulator was controlled by mapping the position and orientation of the human hand to values in the joint space of the robot. Additional recording and playback functionality was implemented to allow the robot to repeat desired tasks once they had been demonstrated and recorded.
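The abstract does not give the hand-to-joint mapping itself, so the sketch below is purely illustrative: it linearly rescales the palm position reported by the Leap Motion into assumed joint limits of a hypothetical arm and records each command, mirroring the record-and-replay idea:

```python
import numpy as np

# Assumed hand workspace above the sensor (mm) and joint limits (rad); values are illustrative.
HAND_MIN, HAND_MAX = np.array([-150.0, 80.0, -150.0]), np.array([150.0, 400.0, 150.0])
JOINT_MIN, JOINT_MAX = np.array([-1.57, -1.0, -1.2]), np.array([1.57, 1.0, 1.2])

def hand_to_joints(palm_xyz):
    """Linearly map a palm position to the first three joint angles, clipped to range."""
    t = np.clip((np.asarray(palm_xyz) - HAND_MIN) / (HAND_MAX - HAND_MIN), 0.0, 1.0)
    return JOINT_MIN + t * (JOINT_MAX - JOINT_MIN)

recorded = []                                        # replaying this list repeats the demonstration
recorded.append(hand_to_joints([10.0, 220.0, -30.0]))
```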

Findings

Experiments were conducted on our custom-built robotic manipulator by first using a simulation model to characterize and quantify the robot's tracking of the Leap Motion-generated trajectory. Tests were conducted in the Gazebo simulation software in conjunction with the Robot Operating System, where results were collected by recording both the real-time input from the Leap Motion sensor and the corresponding pose data. The results of these experiments show that the goal of accurate, real-time control of the robot was achieved, and they validate our methods of transcribing, recording and repeating six degree-of-freedom trajectories from the Leap Motion camera.

Originality/value

As robots evolve in complexity, the methods of programming them need to evolve to become more intuitive. Humans instinctively teach by demonstrating the task to a given subject, who then observes the various poses and tries to replicate the motions. This work aims to integrate the natural human teaching methods into robotics programming through an intuitive, demonstration-based programming method.

Details

Industrial Robot: the international journal of robotics research and application, vol. 47 no. 2
Type: Research Article
ISSN: 0143-991X

Article
Publication date: 17 August 2015

Gilbert Tang, Seemal Asif and Phil Webb

Abstract

Purpose

The purpose of this paper is to describe the integration of a gesture control system for an industrial collaborative robot. Human and robot collaborative systems can be a viable manufacturing solution, but efficient control and communication are required for operations to be carried out effectively and safely.

Design/methodology/approach

The integrated system consists of facial recognition, static pose recognition and dynamic hand motion tracking. Each sub-system has been tested in isolation before integration and demonstration of a sample task.

Findings

It is demonstrated that combining multiple gesture control methods can broaden the potential applications of gesture control for industrial robots.

Originality/value

The novelty of the system is its dual gesture-control method, which allows operators to command an industrial robot by posing hand gestures as well as to control the robot's motion by moving one of their hands in front of the sensor. A facial verification system is integrated to improve the robustness, reliability and security of the control system, and it also allows permission levels to be assigned to different users.
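A minimal sketch of how facial verification and permission levels might gate gesture commands; the structure, role names and command set are assumptions, not the authors' implementation:

```python
# Gesture commands are forwarded to the robot only for a verified user whose
# permission level includes that command.
PERMISSIONS = {"operator": {"jog", "stop"}, "engineer": {"jog", "stop", "teach"}}

class GestureGate:
    def __init__(self):
        self.user = None                      # set once facial verification succeeds

    def login(self, user_id, face_verified):
        if face_verified and user_id in PERMISSIONS:
            self.user = user_id

    def dispatch(self, gesture, send_to_robot):
        """Forward a recognised gesture only if the current user may issue it."""
        if self.user and gesture in PERMISSIONS[self.user]:
            send_to_robot(gesture)

gate = GestureGate()
gate.login("engineer", face_verified=True)
gate.dispatch("teach", send_to_robot=print)   # prints "teach"
```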

Details

Industrial Robot: An International Journal, vol. 42 no. 5
Type: Research Article
ISSN: 0143-991X
