New CNN and hybrid CNN-LSTM models for learning object manipulation of humanoid robots from demonstration

dc.contributor.author Simge Nur Aslan
dc.contributor.author Recep Ozalp
dc.contributor.author Aysegul Ucar
dc.contributor.author Cuneyt Guzelis
dc.contributor.author Güzeliş, Cüneyt
dc.contributor.author Uçar, Ayşegül
dc.contributor.author Özalp, Recep
dc.contributor.author Aslan, Simge Nur
dc.date JUN
dc.date.accessioned 2025-10-06T16:20:47Z
dc.date.issued 2022
dc.description.abstract As the environments in which humans live are complex and uncontrolled, object manipulation with humanoid robots is regarded as one of the most challenging tasks. Learning a manipulation skill from human demonstration (LfD) is one of the popular methods in the artificial intelligence and robotics communities. This paper introduces a deep-learning-based teleoperation system for humanoid robots that imitates the human operator's object manipulation behavior. One of the fundamental problems in LfD is approximating the robot trajectories obtained from human demonstrations with high accuracy. The work introduces novel models based on Convolutional Neural Networks (CNNs), hybrid CNN-Long Short-Term Memory (CNN-LSTM) models combining the CNN and LSTM models, and their scaled variants for object manipulation with humanoid robots using LfD. In the proposed LfD system, six models are employed to estimate the shoulder roll position of the humanoid robot. The data are first collected by teleoperating a real Robotis-Op3 humanoid robot, and the models are trained. The trajectory estimation is then carried out autonomously on the humanoid robot by the trained CNN and CNN-LSTM models. All trajectories relating to the joint positions are finally generated from the model outputs. The results of the six models are compared to each other and to the real trajectories in terms of training and validation loss, number of parameters, and training and testing time. Extensive experimental results show that the proposed CNN models learn the joint positions well, and that the hybrid CNN-LSTM models in the proposed teleoperation system in particular exhibit more accurate and stable results.
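The hybrid architecture described in the abstract (convolutional feature extraction feeding a recurrent layer that regresses a joint angle) can be sketched as follows. This is an illustrative NumPy forward pass, not the authors' code; all layer sizes, weights, and names are assumptions for illustration.

```python
# Illustrative sketch (not the paper's implementation): a minimal hybrid
# CNN-LSTM forward pass in NumPy for regressing one joint position
# (e.g. shoulder roll) from a window of demonstrated joint angles.
import numpy as np

rng = np.random.default_rng(0)

def conv1d(x, kernels, bias):
    """Valid 1-D convolution with ReLU: x is (T, C_in), kernels is (K, C_in, C_out)."""
    K, C_in, C_out = kernels.shape
    T = x.shape[0] - K + 1
    out = np.empty((T, C_out))
    for t in range(T):
        out[t] = np.tensordot(x[t:t + K], kernels, axes=([0, 1], [0, 1])) + bias
    return np.maximum(out, 0.0)

def lstm_step(x_t, h, c, W, U, b):
    """One LSTM cell step; the four gates are stacked as [i, f, g, o]."""
    H = h.shape[0]
    z = W @ x_t + U @ h + b
    i = 1.0 / (1.0 + np.exp(-z[:H]))          # input gate
    f = 1.0 / (1.0 + np.exp(-z[H:2 * H]))     # forget gate
    g = np.tanh(z[2 * H:3 * H])               # cell candidate
    o = 1.0 / (1.0 + np.exp(-z[3 * H:]))      # output gate
    c = f * c + i * g
    return o * np.tanh(c), c

# Toy dimensions: a 20-step window of 6 joint angles -> 1 output angle.
T, C_in, C_out, H = 20, 6, 8, 16
x = rng.standard_normal((T, C_in))            # demonstrated trajectory window

feats = conv1d(x, rng.standard_normal((3, C_in, C_out)) * 0.1, np.zeros(C_out))

W = rng.standard_normal((4 * H, C_out)) * 0.1
U = rng.standard_normal((4 * H, H)) * 0.1
b = np.zeros(4 * H)
h, c = np.zeros(H), np.zeros(H)
for t in range(feats.shape[0]):               # CNN features feed the LSTM
    h, c = lstm_step(feats[t], h, c, W, U, b)

w_out = rng.standard_normal(H) * 0.1
shoulder_roll_estimate = float(w_out @ h)     # linear regression head
print(shoulder_roll_estimate)
```

In the paper's setting, such a model would be trained on teleoperation data so the regression head tracks the demonstrated joint trajectory; the pure-CNN variants the paper compares against would omit the recurrent step.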
dc.description.sponsorship TUBITAK; Nvidia; Türkiye Bilimsel ve Teknolojik Araştırma Kurumu, TÜBİTAK, (117E589)
dc.description.sponsorship This work was funded by the Scientific and Technological Research Council of Turkey (TUBITAK) under grant number 117E589. In addition, the GTX Titan X Pascal GPU used in this research was donated by the NVIDIA Corporation.
dc.description.sponsorship Scientific and Technological Research Council of Turkey (TUBITAK) [117E589]
dc.identifier.doi 10.1007/s10586-021-03348-7
dc.identifier.issn 1386-7857
dc.identifier.issn 1573-7543
dc.identifier.scopus 2-s2.0-85132572804
dc.identifier.uri http://dx.doi.org/10.1007/s10586-021-03348-7
dc.identifier.uri https://gcris.yasar.edu.tr/handle/123456789/6542
dc.identifier.uri https://doi.org/10.1007/s10586-021-03348-7
dc.language.iso English
dc.publisher SPRINGER
dc.relation.ispartof Cluster Computing
dc.rights info:eu-repo/semantics/closedAccess
dc.source CLUSTER COMPUTING-THE JOURNAL OF NETWORKS SOFTWARE TOOLS AND APPLICATIONS
dc.subject Humanoid robots, Learning from demonstration, Convolution neural networks, Long short-term memory network, Object manipulation
dc.subject Object Manipulation
dc.subject Convolution Neural Networks
dc.subject Learning from Demonstration
dc.subject Long Short-Term Memory Network
dc.subject Humanoid Robots
dc.title New CNN and hybrid CNN-LSTM models for learning object manipulation of humanoid robots from demonstration
dc.type Article
dspace.entity.type Publication
gdc.author.id OZALP, RECEP/0000-0001-6343-0372
gdc.author.id ucar, aysegul/0000-0002-5253-3779
gdc.author.scopusid 57194274546
gdc.author.scopusid 57219265872
gdc.author.scopusid 55937768800
gdc.author.scopusid 7004549716
gdc.author.wosid ucar, aysegul/P-8443-2015
gdc.author.wosid Aslan, Simge Nur/GWM-4618-2022
gdc.author.wosid OZALP, RECEP/V-3923-2019
gdc.bip.impulseclass C4
gdc.bip.influenceclass C4
gdc.bip.popularityclass C4
gdc.coar.type text::journal::journal article
gdc.collaboration.industrial false
gdc.description.department
gdc.description.departmenttemp [Aslan, Simge Nur; Ozalp, Recep; Ucar, Aysegul] Firat Univ, Mechatron Engn Dept, TR-23119 Elazig, Turkey; [Guzelis, Cuneyt] Yasar Univ, Elect & Engn Dept, TR-35100 Izmir, Turkey
gdc.description.endpage 1590
gdc.description.issue 3
gdc.description.publicationcategory Article - International Refereed Journal - Institutional Faculty Member
gdc.description.startpage 1575
gdc.description.volume 25
gdc.description.woscitationindex Science Citation Index Expanded
gdc.identifier.openalex W3175197495
gdc.identifier.wos WOS:000666848400004
gdc.index.type WoS
gdc.index.type Scopus
gdc.oaire.diamondjournal false
gdc.oaire.impulse 11.0
gdc.oaire.influence 3.4723369E-9
gdc.oaire.isgreen true
gdc.oaire.popularity 1.4282889E-8
gdc.oaire.publicfunded false
gdc.oaire.sciencefields 0209 industrial biotechnology
gdc.oaire.sciencefields 02 engineering and technology
gdc.openalex.collaboration National
gdc.openalex.fwci 2.4891
gdc.openalex.normalizedpercentile 0.89
gdc.opencitations.count 15
gdc.plumx.crossrefcites 2
gdc.plumx.mendeley 20
gdc.plumx.scopuscites 17
gdc.scopus.citedcount 17
gdc.virtual.author Güzeliş, Cüneyt
gdc.wos.citedcount 14
oaire.citation.endPage 1590
oaire.citation.startPage 1575
person.identifier.orcid ucar, aysegul/0000-0002-5253-3779; OZALP, RECEP/0000-0001-6343-0372
project.funder.name Scientific and Technological Research Council of Turkey (TUBITAK) [117E589]
publicationissue.issueNumber 3
publicationvolume.volumeNumber 25
relation.isAuthorOfPublication 10f564e3-6c1c-4354-9ce3-b5ac01e39680
relation.isAuthorOfPublication.latestForDiscovery 10f564e3-6c1c-4354-9ce3-b5ac01e39680
relation.isOrgUnitOfPublication ac5ddece-c76d-476d-ab30-e4d3029dee37
relation.isOrgUnitOfPublication.latestForDiscovery ac5ddece-c76d-476d-ab30-e4d3029dee37