Planning structural inspection and maintenance policies via dynamic programming and Markov processes. Part I: Theory

Research output: Contribution to journal › Article

31 Citations (Scopus)

Abstract

To effectively address the urgent societal need for safe structures and infrastructure systems under limited resources, science-based management of assets is needed. The overall objective of this two-part study is to highlight the advanced attributes, capabilities and use of stochastic control techniques, especially Partially Observable Markov Decision Processes (POMDPs), which can address the conundrum of planning optimum inspection/monitoring and maintenance policies based on stochastic models and uncertain structural data in real time. Markov Decision Processes are, in general, controlled stochastic processes that move away from conventional optimization approaches in order to achieve minimum life-cycle costs, advising decision-makers to take optimum sequential decisions based on the actual results of the inspections or non-destructive tests they perform. In this first part of the study we describe, out of the vast and multipurpose stochastic control field, exclusively those methods that are fitting for structural management, progressing from simpler techniques to sophisticated modern solvers. We present Markov Decision Process (MDP), semi-MDP and POMDP methods in an overview framework, relate each of these to the others, and describe POMDP solutions in many forms, including both the problematic grid-based approximations that are routinely used in structural maintenance problems and the advanced point-based solvers capable of solving large-scale, realistic problems. Our approach in this paper is helpful for understanding the shortcomings of currently used methods, related complications, possible solutions, and the significance that different solvers have not only for the solution but also for the modeling choices of the problem.
In the second part of the study we utilize almost all of the presented topics and notions in a very broad, infinite-horizon, minimum life-cycle cost structural management example, focusing on point-based solver implementation and comparison with simpler techniques, among others.
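To make the MDP machinery mentioned in the abstract concrete, the following is a minimal, hypothetical sketch (not the paper's own model or data): infinite-horizon value iteration for an invented three-state structural deterioration problem with "do nothing" and "repair" actions, minimizing discounted life-cycle cost.

```python
import numpy as np

# Hypothetical 3-state deterioration model: 0 = good, 1 = damaged, 2 = failed.
# Actions: 0 = do nothing, 1 = repair. P[a][s, s'] are transition probabilities.
P = np.array([
    [[0.8, 0.15, 0.05],   # do nothing: condition deteriorates stochastically
     [0.0, 0.7,  0.3],
     [0.0, 0.0,  1.0]],   # failure is absorbing without intervention
    [[1.0, 0.0,  0.0],    # repair: restores the structure to the good state
     [1.0, 0.0,  0.0],
     [1.0, 0.0,  0.0]],
])
# Invented costs C[a][s]: failure is expensive; repair adds a fixed cost.
C = np.array([
    [0.0,  10.0, 100.0],  # do nothing
    [20.0, 25.0, 120.0],  # repair
])

gamma = 0.95  # discount factor for the infinite-horizon life-cycle cost

# Value iteration: V(s) = min_a [ C[a][s] + gamma * sum_s' P[a][s,s'] V(s') ]
V = np.zeros(3)
for _ in range(1000):
    Q = C + gamma * P @ V          # Q[a, s], batched matrix-vector product
    V_new = Q.min(axis=0)
    if np.max(np.abs(V_new - V)) < 1e-9:
        V = V_new
        break
    V = V_new

policy = Q.argmin(axis=0)  # optimal action for each condition state
```

Under these invented numbers the optimal policy is to do nothing while the structure is good and repair once it is damaged or failed; a POMDP extends this by replacing the known state with a belief distribution updated from inspection outcomes.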

Original language: English (US)
Pages (from-to): 202-213
Number of pages: 12
Journal: Reliability Engineering and System Safety
Volume: 130
DOIs: 10.1016/j.ress.2014.04.005
State: Published - Jan 1 2014

Fingerprint

Dynamic programming
Markov processes
Inspection
Planning
Life cycle
Stochastic models
Nondestructive examination
Random processes
Costs
Monitoring

All Science Journal Classification (ASJC) codes

  • Safety, Risk, Reliability and Quality
  • Industrial and Manufacturing Engineering

Cite this

@article{fee0d11e551241a1aea4840b63eeedb0,
title = "Planning structural inspection and maintenance policies via dynamic programming and Markov processes. Part I: Theory",
abstract = "To effectively address the urgent societal need for safe structures and infrastructure systems under limited resources, science-based management of assets is needed. The overall objective of this two-part study is to highlight the advanced attributes, capabilities and use of stochastic control techniques, especially Partially Observable Markov Decision Processes (POMDPs), which can address the conundrum of planning optimum inspection/monitoring and maintenance policies based on stochastic models and uncertain structural data in real time. Markov Decision Processes are, in general, controlled stochastic processes that move away from conventional optimization approaches in order to achieve minimum life-cycle costs, advising decision-makers to take optimum sequential decisions based on the actual results of the inspections or non-destructive tests they perform. In this first part of the study we describe, out of the vast and multipurpose stochastic control field, exclusively those methods that are fitting for structural management, progressing from simpler techniques to sophisticated modern solvers. We present Markov Decision Process (MDP), semi-MDP and POMDP methods in an overview framework, relate each of these to the others, and describe POMDP solutions in many forms, including both the problematic grid-based approximations that are routinely used in structural maintenance problems and the advanced point-based solvers capable of solving large-scale, realistic problems. Our approach in this paper is helpful for understanding the shortcomings of currently used methods, related complications, possible solutions, and the significance that different solvers have not only for the solution but also for the modeling choices of the problem.
In the second part of the study we utilize almost all of the presented topics and notions in a very broad, infinite-horizon, minimum life-cycle cost structural management example, focusing on point-based solver implementation and comparison with simpler techniques, among others.",
author = "Konstantinos Papakonstantinou and M. Shinozuka",
year = "2014",
month = "1",
day = "1",
doi = "10.1016/j.ress.2014.04.005",
language = "English (US)",
volume = "130",
pages = "202--213",
journal = "Reliability Engineering and System Safety",
issn = "0951-8320",
publisher = "Elsevier Limited",

}

TY - JOUR

T1 - Planning structural inspection and maintenance policies via dynamic programming and Markov processes. Part I

T2 - Theory

AU - Papakonstantinou, Konstantinos

AU - Shinozuka, M.

PY - 2014/1/1

Y1 - 2014/1/1

N2 - To effectively address the urgent societal need for safe structures and infrastructure systems under limited resources, science-based management of assets is needed. The overall objective of this two-part study is to highlight the advanced attributes, capabilities and use of stochastic control techniques, especially Partially Observable Markov Decision Processes (POMDPs), which can address the conundrum of planning optimum inspection/monitoring and maintenance policies based on stochastic models and uncertain structural data in real time. Markov Decision Processes are, in general, controlled stochastic processes that move away from conventional optimization approaches in order to achieve minimum life-cycle costs, advising decision-makers to take optimum sequential decisions based on the actual results of the inspections or non-destructive tests they perform. In this first part of the study we describe, out of the vast and multipurpose stochastic control field, exclusively those methods that are fitting for structural management, progressing from simpler techniques to sophisticated modern solvers. We present Markov Decision Process (MDP), semi-MDP and POMDP methods in an overview framework, relate each of these to the others, and describe POMDP solutions in many forms, including both the problematic grid-based approximations that are routinely used in structural maintenance problems and the advanced point-based solvers capable of solving large-scale, realistic problems. Our approach in this paper is helpful for understanding the shortcomings of currently used methods, related complications, possible solutions, and the significance that different solvers have not only for the solution but also for the modeling choices of the problem.
In the second part of the study we utilize almost all of the presented topics and notions in a very broad, infinite-horizon, minimum life-cycle cost structural management example, focusing on point-based solver implementation and comparison with simpler techniques, among others.

AB - To effectively address the urgent societal need for safe structures and infrastructure systems under limited resources, science-based management of assets is needed. The overall objective of this two-part study is to highlight the advanced attributes, capabilities and use of stochastic control techniques, especially Partially Observable Markov Decision Processes (POMDPs), which can address the conundrum of planning optimum inspection/monitoring and maintenance policies based on stochastic models and uncertain structural data in real time. Markov Decision Processes are, in general, controlled stochastic processes that move away from conventional optimization approaches in order to achieve minimum life-cycle costs, advising decision-makers to take optimum sequential decisions based on the actual results of the inspections or non-destructive tests they perform. In this first part of the study we describe, out of the vast and multipurpose stochastic control field, exclusively those methods that are fitting for structural management, progressing from simpler techniques to sophisticated modern solvers. We present Markov Decision Process (MDP), semi-MDP and POMDP methods in an overview framework, relate each of these to the others, and describe POMDP solutions in many forms, including both the problematic grid-based approximations that are routinely used in structural maintenance problems and the advanced point-based solvers capable of solving large-scale, realistic problems. Our approach in this paper is helpful for understanding the shortcomings of currently used methods, related complications, possible solutions, and the significance that different solvers have not only for the solution but also for the modeling choices of the problem.
In the second part of the study we utilize almost all of the presented topics and notions in a very broad, infinite-horizon, minimum life-cycle cost structural management example, focusing on point-based solver implementation and comparison with simpler techniques, among others.

UR - http://www.scopus.com/inward/record.url?scp=84904044225&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84904044225&partnerID=8YFLogxK

U2 - 10.1016/j.ress.2014.04.005

DO - 10.1016/j.ress.2014.04.005

M3 - Article

AN - SCOPUS:84904044225

VL - 130

SP - 202

EP - 213

JO - Reliability Engineering and System Safety

JF - Reliability Engineering and System Safety

SN - 0951-8320

ER -