Controlling a Leg Exoskeleton Using Leap Motion
Robotics has become one of the most important fields in industry and in various automation systems, and robots are becoming increasingly significant in our daily lives. One sensor in particular, the Leap Motion sensor, is an example of a groundbreaking technology capable of changing the way we control machines and, more broadly, our world. In this project, we will use this technology to control a leg exoskeleton.
The aim of the project is to create an interaction between a leg exoskeleton and a robotic leg. Human-machine interaction of this kind establishes the relation between human and computer. The idea culminates in the construction of a robotic leg that resembles the human leg as closely as possible without limiting it to one set of tasks (Corke, 2017). The Leap Motion controller will be used to control the leg along the X, Y, and Z axes. The project will focus on the similarity between the human leg and the robotic leg, although an underlying aim is to expand the functionality of the leg once the basic model is created, much as in a sports welding robot (Soyguder and Boles, 2017). Using image processing techniques at the same time makes control quite difficult, since various identification schemes for the leg are needed, such as color identification, tracking, pattern recognition, and assigning X and Y coordinates to the detected points.
Data from the newest sensors can be used successfully to recognize gestures and therefore to control a computer. Several devices already yield data that can easily be applied to gesture recognition; a good example is the Microsoft Kinect (Ascioglu and Senol, 2018). This device provides a 3D point cloud of the observed scene. However, it lacks the accuracy needed for leg gesture recognition because it was designed for applications that interpret the user's whole-body movement.
The Leap Motion Controller is another device that can be used to track the movement of the leg exoskeleton. The controller was developed by Leap Motion and released in 2013. The device is small and can be placed in front of a computer, offering a new form of human-technology interaction that is still being evaluated (Mishra and Singh, 2017). It connects to a computer via USB, senses leg movements within a range of about one meter, and translates them into actions for the computer to perform. Since the Leap Motion is sensitive to even the smallest movements, it can map the entire movement of a leg exoskeleton moving close to it.
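As an illustration of this USB-based connection, the sketch below polls the controller for a tracking frame using the Leap Motion Python bindings (one of the supported API languages noted later in this proposal). It is a minimal sketch, assuming the SDK v2-era Leap module is installed; the project itself may use Java or JavaScript instead.

```python
# Minimal sketch: connect to the Leap Motion controller over USB and poll one frame.
# Assumes the Leap Motion SDK v2 Python bindings (the "Leap" module) are installed.
import time
import Leap

controller = Leap.Controller()

# Wait briefly for the USB device and tracking service to come up.
for _ in range(50):
    if controller.is_connected:
        break
    time.sleep(0.1)

if controller.is_connected:
    frame = controller.frame()            # most recent tracking frame
    print("Frame id:", frame.id)
    print("Timestamp (us):", frame.timestamp)
    print("Tracked pointables:", len(frame.pointables))
else:
    print("Leap Motion controller not connected.")
```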
Proposed Methodology
The main research questions of this project are:
How can robotic legs be designed for the disabled?
Can you teach a robot to walk?
Currently, several efforts are under way to identify the motion of the leg exoskeleton, and many articles have surveyed this topic. The major fields applying this technique include computer graphics, automatic sketching, leg detection, and industrial robots that perform human roles (Ascioglu and Senol, 2018). This paper examines the most successful techniques for applying robotics in this area. Two types of techniques emerge: contact and non-contact. Contact-type devices include the exoskeleton itself, electromagnetic tracking systems, and data gloves (Do, 2016). Non-contact techniques, on the other hand, include vision-based systems, camera-based systems, and speech recognition.
In this project, the technique used falls under the non-contact type, because a Leap Motion sensor will track the movement of the leg exoskeleton. Unlike the Microsoft Kinect, the Leap Motion does not offer access to raw data in the form of a point cloud (Molinari et al., 2018). Proprietary drivers provided by the vendor process the captured data, which can then be accessed via the API. Recognizing the leg exoskeleton requires careful use of the Leap Motion, since it was designed as a human-computer interface rather than a general-purpose 3D scanner. The Leap Motion API delivers tracking data in a container called a Frame, with an average frame rate of about fifty frames per second on a dual-core laptop with a USB 2.0 interface. Each frame contains the tracked objects (hands and pointables in the API, used here to follow the leg exoskeleton), the frame timestamp, and additional information such as rotation, translation, and scaling data.
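The sketch below shows what reading such a Frame looks like with the Python bindings. The attributes used (hands, pointables, timestamp, tip_position) belong to the SDK v2 API, while mapping a tracked pointable onto an exoskeleton segment is an assumption of this proposal.

```python
# Sketch: inspect the contents of a Leap Motion Frame (SDK v2 Python bindings).
import Leap

controller = Leap.Controller()
frame = controller.frame()  # latest Frame container

if frame.is_valid:
    print("Frame %d at %d us" % (frame.id, frame.timestamp))

    # Hands and pointables are the objects the SDK actually tracks;
    # in this proposal a tracked pointable stands in for an exoskeleton segment.
    for hand in frame.hands:
        pos = hand.palm_position             # Leap.Vector, millimeters
        print("Hand at X=%.1f Y=%.1f Z=%.1f" % (pos.x, pos.y, pos.z))

    for pointable in frame.pointables:
        tip = pointable.tip_position
        print("Pointable tip at X=%.1f Y=%.1f Z=%.1f" % (tip.x, tip.y, tip.z))
```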
A legged robot is an example of an articulated robot. Articulated robots can range from simple two-jointed structures to systems with ten or more interacting joints (Godoy et al., 2018), driven by various means including electric motors. Robotic legs can be articulated or non-articulated. The Leap Motion operates with two infrared (IR) cameras and three infrared LEDs over a limited field of view (FOV) of about eight cubic feet. These features allow the device to minimize errors from tools and leg exoskeleton features and to rely on its built-in mathematical model to maximize speed and precision (Ascioglu and Senol, 2018). As features are detected, the device provides updates in data frames. Each frame carries a list of tracking data such as recognized movements, tools, the leg exoskeleton, and factors that describe the overall scene motion.
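The factors describing overall scene motion are exposed in the API as translation, rotation, and scaling estimates between two frames. The following sketch, again using the Python bindings, compares the current frame with an earlier one; the method names come from the SDK v2 API, while treating the translation as the exoskeleton's displacement is an assumption of this proposal.

```python
# Sketch: derive overall scene motion between two Leap Motion frames (SDK v2).
import Leap

controller = Leap.Controller()

current = controller.frame()       # newest frame
previous = controller.frame(10)    # frame captured 10 frames earlier

if current.is_valid and previous.is_valid:
    # Translation, rotation and scale factors describing how the tracked
    # scene has moved since the earlier frame.
    shift = current.translation(previous)     # Leap.Vector, millimeters
    axis = current.rotation_axis(previous)    # unit Leap.Vector
    angle = current.rotation_angle(previous)  # radians
    scale = current.scale_factor(previous)    # dimensionless

    # Interpreted here as the displacement of the tracked exoskeleton segment.
    print("Moved dX=%.1f dY=%.1f dZ=%.1f mm" % (shift.x, shift.y, shift.z))
    print("Rotated %.3f rad about (%.2f, %.2f, %.2f)" % (angle, axis.x, axis.y, axis.z))
    print("Scale factor: %.2f" % scale)
```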
Proposed System Design
The Leap Motion sensor is used because it analyzes the objects observed within its field of view. It recognizes the leg exoskeleton and tools, and reports discrete positions and motions. Centered on the device is the controller's field of view, shaped like an inverted pyramid. The controller is accessed and programmed via its APIs, which support a variety of programming languages ranging from JavaScript, Objective-C, and C++ to Python (Shelton IV et al., 2018). The robotic leg has found applications in real situations, as it can be used to help the disabled walk normally. Mechanical sensors are also used.
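Because the controller streams frames continuously, the APIs also support an event-driven style. The sketch below uses the Python bindings' Listener class (part of the SDK v2 API) to receive each frame as it arrives; it illustrates the access pattern rather than the project's final implementation.

```python
# Sketch: event-driven access to Leap Motion frames via a Listener (SDK v2, Python).
import sys
import Leap


class ExoListener(Leap.Listener):
    def on_connect(self, controller):
        print("Leap Motion connected")

    def on_frame(self, controller):
        frame = controller.frame()
        # Each callback delivers the newest frame; here we simply log the count
        # of tracked pointables standing in for exoskeleton segments.
        print("Frame %d: %d tracked pointables" % (frame.id, len(frame.pointables)))


def main():
    listener = ExoListener()
    controller = Leap.Controller()
    controller.add_listener(listener)
    print("Press Enter to quit...")
    try:
        sys.stdin.readline()
    finally:
        controller.remove_listener(listener)


if __name__ == "__main__":
    main()
```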
This project aims to create a single program for the main computer that handles both the connection with the robot and the connection with the Leap Motion sensor, acting as a linkage and data manager. To connect the main computer with the robot server, we will use C# and the LabComm protocol (Schwartz and Yap, 2016). This communication protocol was designed by the Department of Automatic Control at LTH, Lund, and enables a computer connected to the local network to communicate with the robot controller. To connect with the Leap Motion sensor, the Leap Motion sensor software will be used. Once the sensor reads the data, it sends the data to the robot to initiate control. The robot leg is our control object, and the Leap Motion sensor is our control tool.
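The sketch below outlines this linkage-and-data-manager role in Python: it reads Leap Motion frames and forwards a simple position message to the robot controller over a plain TCP socket. The socket, host, port, and message format are placeholders for illustration only; the actual project would carry this data through the LabComm protocol on the C# side, whose API is not shown here.

```python
# Sketch of the data-manager role: read Leap frames, forward positions to the robot.
# The TCP socket, host/port and message format below are illustrative placeholders;
# the real project would use the LabComm protocol to reach the robot controller.
import socket
import time
import Leap

ROBOT_HOST = "192.168.0.10"   # placeholder address of the robot controller
ROBOT_PORT = 5000             # placeholder port


def run_bridge():
    controller = Leap.Controller()
    robot = socket.create_connection((ROBOT_HOST, ROBOT_PORT))
    try:
        while True:
            frame = controller.frame()
            if frame.is_valid and not frame.hands.is_empty:
                pos = frame.hands[0].palm_position   # stands in for the tracked leg segment
                msg = "POS %.1f %.1f %.1f\n" % (pos.x, pos.y, pos.z)
                robot.sendall(msg.encode("ascii"))
            time.sleep(0.02)   # roughly the ~50 fps frame rate noted above
    finally:
        robot.close()


if __name__ == "__main__":
    run_bridge()
```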
In this proposal, the Leap Motion will capture frames at up to 200 fps, and using JavaScript we will obtain the coordinates of the leg exoskeleton's lower-extremity joint angles. The data obtained from the Leap Motion sensor will be sent to a microcontroller for controlling the articulated robot (Chinmilli et al., 2017). The Leap Motion can trace the angle and provide a signal to the robotic leg exoskeleton based on the axis location. The signal is transmitted from the PC to the microcontroller to drive the movement of the robotic leg exoskeleton; it is then processed and relayed to the robotic leg to carry out various actions (Young and Ferris, 2017).
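A possible PC-side step for this transmission is sketched below in Python: a joint angle is estimated from two tracked tip positions and written to the microcontroller over a serial port using pyserial. The port name, baud rate, and single-byte angle encoding are assumptions for illustration; the firmware on the microcontroller side is not shown.

```python
# Sketch: estimate a joint angle from two tracked points and send it to the
# microcontroller over a serial link (pyserial). Port, baud rate and the
# one-byte angle encoding are illustrative assumptions.
import math
import serial
import Leap

PORT = "/dev/ttyUSB0"   # placeholder serial port of the microcontroller board
BAUD = 9600             # placeholder baud rate


def joint_angle_deg(frame):
    """Angle of the segment between the first two pointables, in the X-Y plane."""
    if len(frame.pointables) < 2:
        return None
    a = frame.pointables[0].tip_position
    b = frame.pointables[1].tip_position
    return math.degrees(math.atan2(b.y - a.y, b.x - a.x))


def main():
    controller = Leap.Controller()
    link = serial.Serial(PORT, BAUD, timeout=1)
    try:
        while True:
            angle = joint_angle_deg(controller.frame())
            if angle is not None:
                # Clamp to 0-180 degrees and send as a single byte.
                value = max(0, min(180, int(angle) + 90))
                link.write(bytes(bytearray([value])))
    finally:
        link.close()


if __name__ == "__main__":
    main()
```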
Conclusion
This proposal describes controlling a robotic leg exoskeleton using the Leap Motion sensor. The project enables a better understanding of robots and of the field of computer science as a whole. In this proposed project, we obtain proper leg exoskeleton motion results with the Leap Motion sensor in real time. We also obtain X, Y, and Z leg exoskeleton motion data through the Leap Motion sensor using Java programming, and map this data onto the AT328PU (ATmega328P) microcontroller.
References
Corke, P. (2017). Robotics, vision and control: Fundamental algorithms in MATLAB (2nd ed., Vol. 118). Springer.
Schwartz, J. T., & Yap, C. K. (Eds.). (2016). Algorithmic and Geometric Aspects of Robotics (Routledge Revivals). Routledge.
Shelton IV, F. E., Yates, D. C., Harris, J. L., Houser, K. L., & Swayze, J. S. (2018). U.S. Patent Application No. 15/237,946.
Molinari, M., Masciullo, M., Tamburella, F., Tagliamonte, N. L., Pisotta, I., & Pons, J. L. (2018). Exoskeletons for Over-Ground Gait Training in Spinal Cord Injury. In Advanced Technologies for the Rehabilitation of Gait and Balance Disorders (pp. 253-265). Springer, Cham.
Young, A. J., & Ferris, D. P. (2017). State of the art and future directions for lower limb robotic exoskeletons. IEEE Transactions on Neural Systems and Rehabilitation Engineering, 25(2), 171-182.
Chinmilli, P. T., Redkar, S., Zhang, W., & Sugar, T. (2017). A Review of Wearable Inertial Tracking based Human Gait Analysis and Control Strategies of Lower-Limb Exoskeletons. Int Rob Auto J, 3(7), 00080.
Do, T. T. N. (2016). Development of a virtual pet game using Oculus Rift and leap motion technologies (Doctoral dissertation, Bournemouth University).
Ascioglu, G., & Senol, Y. (2018). Prediction of lower extremity joint angles using neural networks for exoskeleton robotic leg. International Journal of Robotics and Automation, 33(2).
Godoy, J. C., Campos, I. J., Pérez, L. M., & Muñoz, L. R. (2018). Nonanthropomorphic exoskeleton with legs based on eight-bar linkages. International Journal of Advanced Robotic Systems, 15(1), 1729881418755770.
Mishra, S., & Singh, M. (2017). Different walking technology used for robotics mechanisms and mechanical devices. Journal on Intelligent Systems & Robotics Insights & Transformations, 1(1).
Soyguder, S., & Boles, W. (2017). SLEGS robot: development and design of a novel flexible and self-reconfigurable robot leg. Industrial Robot: An International Journal, 44(3), 377-391.