International Journal of Mechatronics and Automation, Volume 9
Volume 9, Number 1, 2022
- Khairul Annuar Abdullah, Zuriati Yusof, Raja Mohd Tariqi B. Raja Lope Ahmad, Muhammad Fairuz Abd. Rauf, Zuraidy Adnan, Wan Azlan Wan Hassan, Riza Sulaiman: Algebraic models based on trigonometric and Cramer's rules for computing inverse kinematics of robotic arm. 1-11
- Hanqing Zhao, Hidetaka Nambo: Recognising and predicting gait cycle states for weight-reducing exoskeleton robots using deep learning. 12-21
- Naoki Moriya, Hiroki Shigemune, Hideyuki Sawada: A robotic wheel locally transforming its diameters and the reinforcement learning for robust locomotion. 22-31
- Abhilasha Singh, V. Kalaichelvi, R. Karthikeyan: Prototype design and performance analysis of genetic algorithm-based SLAM for indoor navigation using TETRIX Prizm mobile robot. 32-46
- Ken'ichi Koyanagi, Daisuke Takata, Takumi Tamamoto, Kentaro Noda, Takuya Tsukagoshi, Toru Oshima: Design and development of a 3D-printed balloon type actuator for a hybrid force-display glove. 47-59
Volume 9, Number 2, 2022
- Hang Cui, Jiaming Zhang, William R. Norris: A real-time embedded drive-by-wire control module for self-driving cars with ROS2. 61-71
- Yamato Umetani, Masahiko Minamoto, Shigeki Hori, Tetsuro Miyazaki, Kenji Kawashima: Estimating future forceps movement using deep learning for robotic camera control in laparoscopic surgery. 72-80
- Chiharu Ishii, Ryo Sugiyama, Takahiro Yamada: Proposal of guidelines for application of endoskeleton assist suit 'sustainable' to transfer assistance in nursing care. 81-89
- Deheng Zhu, Hiroaki Seki, Tokuo Tsuji, Tatsuhiro Hiramitsu: Tableware tidying-up robot for self-service restaurant - robot system design. 90-98
- Mitsuhiro Yamano, Naoya Hanabata, Akira Okamoto, Toshihiko Yasuda, Yasutaka Nishioka, MD Nahin Islam Shiblee, Kazunari Yoshida, Hidemitsu Furukawa, Riichiro Tadakuma: Development and motion analysis of a light and many-joint robot finger using shape memory gel and tendon-driven mechanism with arc route. 99-111
Volume 9, Number 3, 2022
- Nina Tajima, Koichiro Kato, Eriko Okada, Nobuto Matsuhira, Kanako Amano, Yuka Kato: Development of a walking-trajectory measurement system. 113-122
- Bo Wen Yao, James K. Mills: Automated real-time 3D visual servoing control of single cell surgery with application to microinjection processes. 123-133
- Lichang Yao, Qi Dai, Yiyang Yu, Yuki Nishioka, Qiong Wu, Jiajia Yang, Satoshi Takahashi, Yoshimichi Ejima, Jinglong Wu: Investigating the difference of face distinction between adult and infant for the design of service robots. 134-141
- Fusaomi Nagata, Kei Furuta, Kohei Miki, Maki K. Habib, Keigo Watanabe: Implementation and evaluation of calibration-less visual feedback controller for a robot manipulator DOBOT on a sliding rail. 142-150
- Ali Al-Ghanimi, Abdal-Razak Shehab, Adnan Alamili: A tracking control design for linear motor using robust control integrated with online estimation technique. 151-159
Volume 9, Number 4, 2022
- Hongtao Yu, Qiong Wu, Mengni Zhou, Qi Li, Jiajia Yang, Satoshi Takahashi, Yoshimichi Ejima, Jinglong Wu: The effects of crossmodal semantic reliability for audiovisual immersion experience of virtual reality. 161-171
- Peng Shi, Shuxiang Guo, Xiaoliang Jin: Vascular centreline extraction for virtual reality interventional training systems. 172-179
- Rajmeet Singh, Manvir Singh Lamba, Tarun Kumar Bera: Trajectory tracking of 4-DOF robot manipulator: a bond graph approach. 180-191
- S. Joseph Winston, P. V. Manivannan: Visual servoing based self-calibration of robotic inspection system using rigid body transformation parameters. 192-209
- Ruochen An, Shuxiang Guo, Chunying Li, Tendeng Awa: Underwater motion control of a bio-inspired father-son robot based on hydrodynamics analysis. 210-218