
Biosignals, Computer Vision and Smart Motion Capture Use in Myoelectric Prosthesis: A Novel Approach

Anushka A. Bakore, Priti Jain

Abstract


Worldwide, limb amputations have reached an estimated 1.8–2 million cases annually, roughly one every 30 seconds, affecting amputees' daily activities, restricting their work, and undermining their quality of life. Electromyography (EMG) readings, combined with machine learning algorithms, drive the movement of prosthetics and help restore the lost functionality. Motion capture and biosignals feed a computer-vision system that, together with the operator's motion-capture data, controls the prosthetic's movement in 3D space. A novel layout is presented in which algorithms provide user-controlled movements through a prosthetic equipped with biosensors and a camera, bridging the gap between user intention and robotic automation to achieve dexterity. Since prosthetic arms are generally operated by myoelectric signals, this article emphasizes the measurement of electrical signals from the residual muscles by surface EMG (sEMG), their processing with trained ML models, and the interpretation of the combined input as commands such as flexion-extension, wrist opening, and other movements with multiple degrees of freedom.
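The sEMG-to-command pipeline described above can be sketched in a minimal form: extract classic time-domain features (mean absolute value, root mean square, zero crossings) from one windowed sEMG sample, then map the feature vector to a gesture label. The nearest-centroid decision rule and the gesture labels here are illustrative stand-ins for whatever trained ML model the system actually uses; the function names and centroid values are assumptions, not the authors' implementation.

```python
import math

def extract_features(window):
    """Three classic sEMG time-domain features over one signal window."""
    n = len(window)
    mav = sum(abs(x) for x in window) / n                # mean absolute value
    rms = math.sqrt(sum(x * x for x in window) / n)      # root mean square
    zc = sum(1 for a, b in zip(window, window[1:]) if a * b < 0)  # zero crossings
    return (mav, rms, zc)

def classify(features, centroids):
    """Nearest-centroid gesture decision (stand-in for a trained classifier)."""
    def dist(label):
        return sum((f - v) ** 2 for f, v in zip(features, centroids[label]))
    return min(centroids, key=dist)
```

A usage example with hypothetical per-gesture centroids: `classify(extract_features(window), {"flexion": (1.0, 1.0, 3.0), "rest": (0.0, 0.0, 0.0)})`. In a real system the centroids (or a full model) would be learned from labeled sEMG recordings.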

 

The proposed system comprises three modules:

  • Sensory Feedback Module (SFM)
  • Mechanical Motion Control Module (MMCM)
  • Control Management Module (CMM)

 

The Sensory Feedback Module tracks all movements in 3D space and relays EMG, orientation, and mechanical data as feedback to the Control Management Module. The CMM comprises six managing subsystems, including the user interface, computer vision, and the prosthetic action manager, and sends its processed outputs to the MMCM, which controls the prosthesis. EMG readings combined with computer vision can achieve seamless and efficient operation of the prosthesis.
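The SFM → CMM → MMCM dataflow described above can be illustrated with a small sketch. The module names mirror the paper's architecture, but the interfaces, the grip-selection rule, and all field names are hypothetical assumptions made for illustration only.

```python
from dataclasses import dataclass

@dataclass
class Feedback:
    """One SFM feedback packet (hypothetical schema)."""
    emg_intent: str        # gesture decoded from sEMG, e.g. "flexion"
    orientation: float     # limb orientation in degrees, from motion capture
    vision_target: str     # object identified by the computer-vision stage

class ControlManagementModule:
    """Fuses SFM feedback into a motor command for the MMCM."""
    def decide(self, fb: Feedback) -> dict:
        # Illustrative fusion rule: vision picks the grip, EMG picks the gesture.
        grip = "power" if fb.vision_target == "bottle" else "precision"
        return {"gesture": fb.emg_intent, "grip": grip, "wrist_deg": fb.orientation}

class MechanicalMotionControlModule:
    """Turns a CMM command into an actuator-level action (stubbed as a string)."""
    def execute(self, command: dict) -> str:
        return (f"{command['gesture']} with {command['grip']} grip "
                f"at {command['wrist_deg']:.0f} deg")
```

For example, feedback of a flexion intent at 30 degrees while the camera sees a bottle yields a power-grip flexion command; the point of the sketch is the one-way SFM → CMM → MMCM flow, not the specific rule.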

Keywords


biosignals, ML data, sensory feedback module, mechanical motion control, control management module



