TY - JOUR
T1 - URHAND: Hand Prosthesis for Identifying Objects for Activities of Daily Living
AU - Ramos, Orion
AU - Casas, Diego
AU - Cifuentes, Carlos A.
AU - Jimenez, Mario F.
N1 - Publisher Copyright:
© 1963-2012 IEEE.
PY - 2024
Y1 - 2024
N2 - This work introduces URHAND, an innovative prosthetic hand designed to identify objects used in activities of daily living, addressing a critical gap at the intersection of hand prosthetics and artificial intelligence. By leveraging advanced 3D printing technologies, URHAND enhances functionality and adaptability with 10 degrees of freedom and a unique underactuated mechanism. Dynamixel MX-106 motors provide precise finger control, while force-sensing resistors (FSRs) enable the implementation of machine learning algorithms. The primary objective of this study is to create a comprehensive dataset derived from standardized objects associated with Activities of Daily Living (ADL) and standardized protocols, a necessary step to advance the state of the art. The dataset, comprising motor positions, loads, currents, and FSR values, supports four classification problems: (1) identifying objects using all measured variables, (2) identifying objects using only motor positions, (3) identifying objects using FSR sensor data, and (4) identifying grip types using FSR data. Machine learning training, conducted with the PyCaret library, reveals that CatBoost, Extra Trees Classifier, and Random Forest are the top-performing algorithms for object and grip-type identification. The results underscore the importance of FSR data in achieving high precision, demonstrating a novel contribution to optimizing object handling in daily activities. This work represents a significant advancement in the application of artificial intelligence to prosthetics, providing essential information for future developments in the field.
AB - This work introduces URHAND, an innovative prosthetic hand designed to identify objects used in activities of daily living, addressing a critical gap at the intersection of hand prosthetics and artificial intelligence. By leveraging advanced 3D printing technologies, URHAND enhances functionality and adaptability with 10 degrees of freedom and a unique underactuated mechanism. Dynamixel MX-106 motors provide precise finger control, while force-sensing resistors (FSRs) enable the implementation of machine learning algorithms. The primary objective of this study is to create a comprehensive dataset derived from standardized objects associated with Activities of Daily Living (ADL) and standardized protocols, a necessary step to advance the state of the art. The dataset, comprising motor positions, loads, currents, and FSR values, supports four classification problems: (1) identifying objects using all measured variables, (2) identifying objects using only motor positions, (3) identifying objects using FSR sensor data, and (4) identifying grip types using FSR data. Machine learning training, conducted with the PyCaret library, reveals that CatBoost, Extra Trees Classifier, and Random Forest are the top-performing algorithms for object and grip-type identification. The results underscore the importance of FSR data in achieving high precision, demonstrating a novel contribution to optimizing object handling in daily activities. This work represents a significant advancement in the application of artificial intelligence to prosthetics, providing essential information for future developments in the field.
UR - http://www.scopus.com/inward/record.url?scp=85205671916&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85205671916&partnerID=8YFLogxK
U2 - 10.1109/TIM.2024.3470013
DO - 10.1109/TIM.2024.3470013
M3 - Research Article
AN - SCOPUS:85205671916
SN - 0018-9456
JO - IEEE Transactions on Instrumentation and Measurement
JF - IEEE Transactions on Instrumentation and Measurement
ER -