Reinforcement Learning in Condition-Based Maintenance: A Survey

dc.contributor.author Erdem, Gamze
dc.contributor.author Dinçer, Mehmet Cemali
dc.contributor.author Fadiloglu, Mehmet Murat
dc.contributor.editor C. Kahraman, S. Cebi, B. Oztaysi, S. Cevik Onar, C. Tolga, I. Ucal Sari, I. Otay
dc.date.accessioned 2025-10-06T17:48:45Z
dc.date.issued 2025
dc.description.abstract This literature review examines the convergence of Reinforcement Learning (RL) and Condition-Based Maintenance (CBM), emphasizing the transformative impact of RL methodologies on maintenance decision-making in complex industrial settings. By integrating insights from a diverse array of studies, the review critically assesses the use of various RL techniques, such as Q-learning, deep reinforcement learning, and policy gradient approaches, in forecasting equipment failures, optimizing maintenance schedules, and reducing operational downtime. It outlines the shift from conventional rule-based maintenance practices to adaptive, data-driven strategies that exploit real-time sensor data and probabilistic modeling. Key challenges highlighted include computational complexity, extensive training data requirements, and the integration of RL models into existing industrial frameworks. Furthermore, the review explores literature on CBM within multi-component systems, where prevalent approaches include numerical analyses, Markov Decision Processes (MDPs), and case studies, all of which demonstrate notable cost reductions and decreased downtime. Relevant studies were identified through searches on databases such as Google Scholar, Scopus, and Web of Science. Overall, this review provides a comprehensive analysis of the current state and prospects of employing reinforcement learning in condition-based maintenance, offering valuable insights for both academic researchers and industry practitioners. © 2025 Elsevier B.V. All rights reserved.
dc.identifier.doi 10.1007/978-3-031-98565-2_69
dc.identifier.isbn 9783031985645
dc.identifier.isbn 9783031985652
dc.identifier.issn 2367-3370
dc.identifier.issn 2367-3389
dc.identifier.scopus 2-s2.0-105013082603
dc.identifier.uri https://www.scopus.com/inward/record.uri?eid=2-s2.0-105013082603&doi=10.1007%2F978-3-031-98565-2_69&partnerID=40&md5=d6dd0e74b82f22aab14141670dbeb7b2
dc.identifier.uri https://gcris.yasar.edu.tr/handle/123456789/8078
dc.identifier.uri https://doi.org/10.1007/978-3-031-98565-2_69
dc.language.iso English
dc.publisher Springer Science and Business Media Deutschland GmbH
dc.relation.ispartof 7th International Conference on Intelligent and Fuzzy Systems, INFUS 2025
dc.relation.ispartofseries Lecture Notes in Networks and Systems
dc.rights info:eu-repo/semantics/closedAccess
dc.source Lecture Notes in Networks and Systems
dc.subject Condition-Based Maintenance, Machine Learning, Reinforcement Learning, Automation, Cost Benefit Analysis, Cost Reduction, Decision Making, Deep Learning, Deep Reinforcement Learning, Learning Systems, Markov Processes, Gradient Approach, Industrial Settings, Literature Reviews, Maintenance Decision Making, Policy Gradient, Q-learning, Reinforcement Learning Techniques
dc.subject Condition-Based Maintenance
dc.subject Machine Learning
dc.subject Reinforcement Learning
dc.title Reinforcement Learning in Condition-Based Maintenance: A Survey
dc.type Conference Object
dspace.entity.type Publication
gdc.author.scopusid 57822095100
gdc.author.scopusid 6602212401
gdc.author.scopusid 58967922500
gdc.bip.impulseclass C5
gdc.bip.influenceclass C5
gdc.bip.popularityclass C5
gdc.coar.type text::conference output
gdc.collaboration.industrial false
gdc.description.departmenttemp [Erdem, Gamze] Izmir Univ Econ, Sakarya Cad 156, TR-35330 Izmir, Turkiye; [Dincer, M. Cemali] Yasar Univ, Univ Caddesi 37-39, Izmir, Turkiye; [Fadiloglu, M. Murat] Sabanci Univ, TR-34956 Istanbul, Turkiye
gdc.description.endpage 647
gdc.description.publicationcategory Konferans Öğesi - Uluslararası - Kurum Öğretim Elemanı
gdc.description.startpage 639
gdc.description.volume 1530
gdc.description.woscitationindex Conference Proceedings Citation Index - Science
gdc.identifier.openalex W4412718461
gdc.identifier.wos WOS:001587122800069
gdc.index.type Scopus
gdc.index.type WoS
gdc.oaire.diamondjournal false
gdc.oaire.impulse 0.0
gdc.oaire.influence 2.3811355E-9
gdc.oaire.isgreen false
gdc.oaire.popularity 2.5970819E-9
gdc.oaire.publicfunded false
gdc.openalex.collaboration National
gdc.openalex.fwci 0.0
gdc.openalex.normalizedpercentile 0.5
gdc.opencitations.count 0
gdc.plumx.scopuscites 0
gdc.scopus.citedcount 0
gdc.virtual.author Dinçer, Mehmet Cemali
gdc.virtual.author Erdem, Gamze
gdc.wos.citedcount 0
oaire.citation.endPage 647
oaire.citation.startPage 639
person.identifier.scopus-author-id Erdem, Gamze (57822095100); Dinçer, Mehmet Cemali (58967922500); Fadiloglu, Mehmet Murat (6602212401)
publicationvolume.volumeNumber 1530 LNNS
relation.isAuthorOfPublication 58be2429-0ed3-4ab0-8279-cc0874276c20
relation.isAuthorOfPublication 6a426dae-cb2f-480e-b1e4-2e7a39b7c78e
relation.isAuthorOfPublication.latestForDiscovery 58be2429-0ed3-4ab0-8279-cc0874276c20
relation.isOrgUnitOfPublication ac5ddece-c76d-476d-ab30-e4d3029dee37
relation.isOrgUnitOfPublication.latestForDiscovery ac5ddece-c76d-476d-ab30-e4d3029dee37