Abstract: In structural health monitoring (SHM), revealing the underlying correlations of monitoring data is of considerable significance, both theoretically and practically. In contrast to traditional correlation analysis for numerical data, this study analyses the correlation of probability distributions of inter-sensor monitoring data. Because they are induced by commonly shared random excitations, many structural responses measured at different locations are correlated in distribution. Clarifying and quantifying such distributional correlations not only enables a more comprehensive understanding of the essential dependence properties of SHM data but also has appealing application value; however, statistical methods pertinent to this topic are rare. To this end, this article proposes a novel approach based on functional data analysis techniques. The monitoring data collected by each sensor are divided into time segments and summarized by the corresponding probability density functions (PDFs). The geometric relations of the PDFs, in terms of the shape mappings between sensors, are first characterized by warping functions, which are subsequently decomposed into a finite number of functional principal components (FPCs); each FPC of the warping functions characterizes one deformation pattern in the transformation of the PDF shapes from one sensor to another. Using this principle, the inter-sensor geometric correlation patterns of the PDFs can be clarified by analysing the correlation between the FPC scores of the warping functions and the PDFs from one sensor. To overcome the challenge of quantifying the correlation between real-valued samples (FPC scores) and their functional counterparts (PDFs), a novel nonparametric functional regression (NFR)-based correlation coefficient is defined. Both simulation and real data studies are conducted to illustrate and validate the proposed method.
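To make the pipeline described above concrete, the following minimal Python sketch works through the same steps on synthetic data: segment-wise PDF estimation, warping functions between the two sensors' PDFs, FPCA of the warping functions, and a nonparametric functional regression of the FPC scores on the PDFs. The abstract does not specify the registration scheme or the exact form of the NFR-based coefficient, so the CDF-composition warping, the Nadaraya-Watson kernel regression with a leave-one-out R^2, and all function names (segment_pdfs, warping_functions, fpca_scores, nfr_association) are illustrative assumptions rather than the authors' implementation.

import numpy as np

def segment_pdfs(x, n_segments, grid):
    """Split a 1-D monitoring record into segments and estimate one PDF per
    segment with a Gaussian kernel density estimate on a common grid."""
    pdfs = []
    for seg in np.array_split(x, n_segments):
        h = 1.06 * seg.std() * len(seg) ** (-0.2)            # Silverman's rule of thumb
        k = np.exp(-0.5 * ((grid[:, None] - seg[None, :]) / h) ** 2)
        pdfs.append(k.sum(axis=1) / (len(seg) * h * np.sqrt(2 * np.pi)))
    return np.asarray(pdfs)                                   # shape (n_segments, len(grid))

def warping_functions(pdfs_a, pdfs_b, grid):
    """For each segment, map the shape of sensor A's PDF onto sensor B's via the
    increasing rearrangement gamma = F_b^{-1}(F_a), one simple choice of warping."""
    dt = grid[1] - grid[0]
    warps = []
    for fa, fb in zip(pdfs_a, pdfs_b):
        Fa, Fb = np.cumsum(fa) * dt, np.cumsum(fb) * dt       # CDFs on the grid
        warps.append(np.interp(Fa, Fb, grid))                 # gamma evaluated on the grid
    return np.asarray(warps)

def fpca_scores(warps, n_components=3):
    """Finite FPCA of the warping functions: each component is one deformation
    pattern, and each score measures how strongly a segment expresses it."""
    centred = warps - warps.mean(axis=0)
    _, _, vt = np.linalg.svd(centred, full_matrices=False)
    return centred @ vt[:n_components].T                      # shape (n_segments, n_components)

def nfr_association(pdfs, scores, grid, bandwidth=0.2):
    """Illustrative stand-in for an NFR-based coefficient: kernel-regress each FPC
    score on the PDFs (L2 distances) and report the leave-one-out R^2."""
    dt = grid[1] - grid[0]
    d2 = ((pdfs[:, None, :] - pdfs[None, :, :]) ** 2).sum(axis=2) * dt
    w = np.exp(-d2 / (2 * bandwidth ** 2))
    np.fill_diagonal(w, 0.0)                                   # leave-one-out weights
    fitted = (w @ scores) / w.sum(axis=1, keepdims=True)       # Nadaraya-Watson estimate
    resid = ((scores - fitted) ** 2).sum(axis=0)
    total = ((scores - scores.mean(axis=0)) ** 2).sum(axis=0)
    return 1.0 - resid / total                                 # one value per FPC

# Synthetic records whose segment-wise distributions share a common excitation level
rng = np.random.default_rng(0)
sigma = 1.0 + np.abs(rng.standard_normal(200))                 # shared random excitation level
x1 = np.concatenate([s * rng.standard_normal(50) for s in sigma])
x2 = np.concatenate([(0.5 + 0.8 * s) * rng.standard_normal(50) for s in sigma])
grid = np.linspace(-12.0, 12.0, 256)
p1, p2 = segment_pdfs(x1, 200, grid), segment_pdfs(x2, 200, grid)
scores = fpca_scores(warping_functions(p1, p2, grid))
print(nfr_association(p1, scores, grid))                       # higher values -> stronger distributional link

In this toy setting, the two records are correlated in distribution only through the shared excitation level sigma, so the association between the warping-function FPC scores and sensor A's PDFs reflects exactly the kind of inter-sensor distributional correlation the abstract targets.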