URHAND: Hand Prosthesis for Identifying Objects for Activities of Daily Living

Orion Ramos, Diego Casas, Carlos A. Cifuentes, Mario F. Jimenez

Research output: Contribution to journal › Research Article › peer-review

Abstract

This work introduces URHAND, an innovative prosthetic hand designed to identify objects used in daily life activities, addressing a critical gap at the intersection of hand prosthetics and artificial intelligence. By leveraging advanced 3D printing technologies, URHAND enhances functionality and adaptability with 10 degrees of freedom and a unique underactuated mechanism. Dynamixel MX-106 motors provide precise finger control, while force-sensitive sensors enable the implementation of machine learning algorithms. The primary objective of this study is to create a comprehensive dataset derived from standardized objects associated with Activities of Daily Living (ADL) and standardized protocols, a necessary step toward advancing the state of the art. The dataset, comprising motor positions, loads, currents, and force-sensing resistor (FSR) values, supports four classification problems: (1) identifying objects using all measured variables, (2) identifying objects using only motor positions, (3) identifying objects using FSR sensor data, and (4) identifying grip types with FSR data. Machine learning training, conducted with the PyCaret library, reveals that CatBoost, the Extra Trees Classifier, and Random Forest are the top-performing algorithms for object and grip-type identification. The results underscore the importance of FSR data in achieving high precision, demonstrating a novel contribution to optimizing object handling in daily activities. This work represents a significant advancement in the application of artificial intelligence to prosthetics, providing essential information for future developments in the field.

Original language: English (US)
Journal: IEEE Transactions on Instrumentation and Measurement
DOIs
State: Accepted/In press - 2024

All Science Journal Classification (ASJC) codes

  • Instrumentation
  • Electrical and Electronic Engineering
