Publications Scientifiques

Permanent URI for this community: https://dspace.univ-boumerdes.dz/handle/123456789/10

Search Results

Now showing 1 - 10 of 12
  • Item
    Enhancing Fault Detection in Stochastic Environments Using Interval-Valued KPCA: A Cement Rotary Kiln Case Study
    (Institute of Electrical and Electronics Engineers, 2025) Louifi, Abdelhalim; Kouadri, Abdelmalek; Harkat, Mohamed-Faouzi; Bensmail, Abderazak; Mansouri, Majdi
    Fault detection in industrial processes is challenging due to significant data uncertainty, which complicates the accurate modeling of interval-valued data and the quantification of errors necessary for reliable detection. Existing approaches, such as kernel principal component analysis (KPCA), struggle with these challenges because they rely on single-valued data representations and are unable to effectively handle interval-based variability. To address these limitations, this paper introduces interval-valued KPCA (IV-KPCA), which extends KPCA by redefining similarity measures and kernel functions to accommodate interval-valued uncertainty. IV-KPCA preserves the interval structure throughout the modeling process, enhancing robustness to dynamic uncertainties and improving fault detection in complex nonlinear systems. Within this framework, fault detection statistics (T², Q, and Φ) are developed to enable precise error quantification. The proposed method is validated on a cement rotary kiln process, a highly stochastic industrial system characterized by significant uncertainties. Experimental results demonstrate that IV-KPCA reduces false alarms, missed detections, and detection delays by over 100%, 90%, and 95%, respectively, compared to traditional methods. These findings underscore the potential of IV-KPCA for enhancing fault detection performance in complex, uncertain environments.
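    As an illustration of the T² and Q statistics named in this abstract, the following is a minimal single-valued KPCA monitoring sketch, not the authors' interval-valued method; the RBF kernel, the kernel-centering scheme, and all function names are standard choices assumed here, and the Φ combined index is omitted:

```python
import numpy as np

def rbf_kernel(X, Y, gamma):
    # k(x, y) = exp(-gamma * ||x - y||^2)
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def fit_kpca(X, gamma, n_comp):
    """Build a KPCA monitoring model from normal-operation data X (n x m)."""
    n = X.shape[0]
    K = rbf_kernel(X, X, gamma)
    J = np.ones((n, n)) / n
    Kc = K - J @ K - K @ J + J @ K @ J          # center K in feature space
    vals, vecs = np.linalg.eigh(Kc)
    idx = np.argsort(vals)[::-1][:n_comp]       # largest eigenvalues first
    alphas = vecs[:, idx] / np.sqrt(vals[idx])  # unit-norm feature-space PCs
    return {"X": X, "K": K, "A": alphas, "lam": vals[idx] / n, "gamma": gamma}

def t2_q(model, x):
    """Hotelling T² and Q (SPE) statistics for a new sample x."""
    X, K, A, lam, g = (model[k] for k in ("X", "K", "A", "lam", "gamma"))
    k = rbf_kernel(x[None, :], X, g).ravel()
    kc = k - K.mean(0) - k.mean() + K.mean()    # centering consistent with training
    t = kc @ A                                  # scores on retained components
    T2 = float(np.sum(t ** 2 / lam))
    kxx_c = 1.0 - 2.0 * k.mean() + K.mean()     # centered k(x, x); RBF gives k(x,x)=1
    Q = float(kxx_c - np.sum(t ** 2))           # residual feature-space energy
    return T2, Q
```

    A faulty sample far from the training cloud drives Q well above its value for in-distribution samples, which is the detection mechanism both statistics share.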
  • Item
    Uncertainty Quantification Kernel PCA: Enhancing Fault Detection in Interval-Valued Data
    (Institute of Electrical and Electronics Engineers Inc., 2024) Louifi, Abdelhalim; Kouadri, Abdelmalek; Harkat, Mohamed Faouzi; Bensmail, Abderazak; Mansouri, Majdi; Nounou, Hazem
    This paper introduces Uncertainty Quantification KPCA (UQ-KPCA), a variation of kernel PCA (KPCA) designed for interval-valued data that handles data uncertainty by defining similarity measures and kernel functions specific to interval data. UQ-KPCA converts the traditional KPCA model from single-valued to interval-valued representations, allowing for accurate error and uncertainty quantification. Process modeling with KPCA is then performed on data based on the interval model, followed by the computation of fault detection statistics such as T², Q, and Φ. The method's effectiveness is evaluated on the cement rotary kiln process and compared with standard KPCA, demonstrating superior performance in accurately identifying faults within a stochastic setting with unknown uncertainties.
  • Item
    Dynamic Interval-Valued PCA for Enhanced Fault Detection
    (Institute of Electrical and Electronics Engineers Inc., 2024) Rouani, Lahcene; Harkat, Mohamed Faouzi; Kouadri, Abdelmalek; Bensmail, Abderazak; Mansouri, Majdi; Nounou, Mohamed
    This study introduces three novel dynamic interval-valued principal component analysis (DIPCA) methods: dynamic centers PCA (D-CPCA), dynamic vertices PCA (D-VPCA), and dynamic complete information PCA (D-CIPCA). These methods advance traditional interval-valued PCA (IPCA) by integrating dynamic aspects of industrial processes, thus addressing both data uncertainties and temporal correlations. The DIPCA methods were validated using real-world data from the Ain El Kebira cement plant. Results indicate significant improvements in fault detection accuracy, achieving lower false alarm rates and higher reliability compared to classical IPCA methods. Furthermore, an enhanced combined index for interval-valued data was developed, providing a single, comprehensive statistical measure for streamlined process monitoring.
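    The "dynamic" ingredient in such methods is typically a time-lag augmentation of the data matrix before applying PCA, so that the model captures temporal as well as cross-variable correlation. A minimal single-valued sketch of that device (the interval-valued variants D-CPCA, D-VPCA, and D-CIPCA are not reproduced here, and the function names are illustrative):

```python
import numpy as np

def lagged_matrix(X, lags):
    """Row t of the result is [x_t, x_{t-1}, ..., x_{t-lags}], so PCA on it
    captures auto-correlation as well as cross-correlation."""
    n = X.shape[0]
    return np.hstack([X[lags - l : n - l] for l in range(lags + 1)])

def dpca_loadings(X, lags, n_comp):
    """PCA loadings of the standardized lag-augmented matrix."""
    Xd = lagged_matrix(X, lags)
    Z = (Xd - Xd.mean(0)) / (Xd.std(0) + 1e-12)   # auto-scale each column
    _, _, Vt = np.linalg.svd(Z, full_matrices=False)
    return Vt[:n_comp].T                          # (m * (lags+1)) x n_comp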
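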
  • Item
    Kernel Principal Component Analysis Improvement based on Data-Reduction via Class Interval
    (Elsevier B.V., 2024) Habib Kaib, Mohammed Tahar; Kouadri, Abdelmalek; Harkat, Mohamed Faouzi; Bensmail, Abderazak; Mansouri, Majdi; Nounou, Mohamed
    Kernel Principal Component Analysis (KPCA) is an effective nonlinear extension of Principal Component Analysis for fault detection. For large-sized data, KPCA may lose detection performance, occupy more storage space for the monitoring model, and take more execution time in the online part. Reduced KPCA pre-processes the training data before applying the KPCA method: the proposed approach selects samples based on class intervals to reduce the number of observations in the training data set while maintaining decent detection performance. This approach is applied to the Tennessee Eastman Process and then compared to some of the existing approaches.
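    The class-interval reduction step described above can be sketched as binning the training samples and keeping a few representatives per bin. Note that the abstract does not specify the binning key; using each sample's Euclidean norm as a one-dimensional summary is an assumption made for illustration only:

```python
import numpy as np

def reduce_by_class_interval(X, n_bins=10, per_bin=5, seed=0):
    """Keep at most `per_bin` samples from each class interval (histogram bin).
    The binning key (sample norm) is an illustrative choice, not necessarily
    the papers' criterion."""
    rng = np.random.default_rng(seed)
    key = np.linalg.norm(X, axis=1)                  # 1-D summary per sample
    edges = np.linspace(key.min(), key.max(), n_bins + 1)
    bins = np.clip(np.digitize(key, edges[1:-1]), 0, n_bins - 1)
    keep = []
    for b in range(n_bins):
        idx = np.flatnonzero(bins == b)
        if idx.size:                                 # skip empty intervals
            keep.extend(rng.choice(idx, size=min(per_bin, idx.size),
                                   replace=False))
    return X[np.sort(keep)]                          # preserve original order
```

    The reduced set is then what the KPCA model is trained on, which is where the storage and execution-time savings come from.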
  • Item
    Improving kernel PCA-based algorithm for fault detection in nonlinear industrial process through fractal dimension
    (Institution of Chemical Engineers, 2023) Kaib, Mohammed Tahar Habib; Kouadri, Abdelmalek; Harkat, Mohamed Faouzi; Bensmail, Abderazak; Mansouri, Majdi
    Principal Component Analysis (PCA) is a widely used technique for fault detection and diagnosis. PCA works well when the data set has linear characteristics. However, most industrial processes have nonlinear characteristics in their data. Kernel PCA (KPCA) is an alternative solution for such types of data sets. This solution does not come without a cost, since one of KPCA's disadvantages is the large number of observations it must handle, which results in more occupied storage space and more execution time than the PCA technique. Furthermore, if the data set is too large, it may degrade the monitoring performance of the KPCA model. Reduced KPCA (RKPCA) is a solution to these limitations of conventional KPCA. RKPCA can deal with nonlinear characteristics without crucial problems because it is based on the KPCA algorithm with a data-reduction part that keeps most of the data's information. Thus, by reducing the number of observations, RKPCA reduces the occupied storage space and execution time while preserving tolerable monitoring performance. The proposed RKPCA algorithm consists of two parts. First, the large-sized training data set is reduced using the fractal dimension technique (correlation dimension). Afterward, the KPCA model is developed from the obtained reduced training data set. The proposed scheme is applied to the Tennessee Eastman Process and the cement plant rotary kiln data sets to evaluate its performance in comparison with other algorithms.
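    The correlation dimension mentioned above is conventionally estimated from the correlation sum C(r), the fraction of point pairs closer than a radius r; the dimension is the slope of log C(r) versus log r. A rough two-scale sketch (the paper's exact estimator and its use in the reduction step are not reproduced here):

```python
import numpy as np

def correlation_sum(X, r):
    """C(r): fraction of distinct point pairs closer than r."""
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    iu = np.triu_indices(len(X), 1)          # each pair counted once
    return (d[iu] < r).mean()

def correlation_dimension(X, r1, r2):
    """Slope of log C(r) vs log r between two scales -- a crude estimate
    of the data set's intrinsic (fractal) dimension."""
    c1, c2 = correlation_sum(X, r1), correlation_sum(X, r2)
    return (np.log(c2) - np.log(c1)) / (np.log(r2) - np.log(r1))
```

    For example, points lying on a line embedded in 3-D space yield an estimate near 1, not 3, which is the kind of intrinsic-dimension information the reduction step exploits.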
  • Item
    Improvement of kernel principal component analysis-based approach for nonlinear process monitoring by data set size reduction using class interval
    (Institute of Electrical and Electronics Engineers Inc., 2024) Kaib, Mohammed Tahar Habib; Kouadri, Abdelmalek; Harkat, Mohamed-Faouzi; Bensmail, Abderazak; Mansouri, Majdi
    Fault detection and diagnosis (FDD) systems play a crucial role in maintaining the adequate execution of the monitored process. One of the most widely used data-driven FDD methods is Principal Component Analysis (PCA). Unfortunately, PCA's reliability drops when data has nonlinear characteristics, as in industrial processes. Kernel Principal Component Analysis (KPCA) is an alternative PCA technique used to deal with such data sets. For a large-sized data set, KPCA's execution time and occupied storage space increase drastically, and monitoring performance can also be affected. So, Reduced KPCA (RKPCA) was introduced with the aim of reducing the size of a given training data set to lower the execution time and occupied storage space while maintaining KPCA's monitoring performance for nonlinear systems. Generally, RKPCA reduces the number of samples in the training data set and then builds the KPCA model on this data set. In this paper, the proposed algorithm selects relevant observations from the original data set by utilizing a class interval technique (i.e., a histogram) to retain a set of representative samples from each bin. The proposed algorithm has been tested on a three-tank system pilot plant and the Ain El Kebira cement rotary kiln process. It successfully maintained homogeneity with the original data set, reduced the execution time and occupied storage space, and led to decent monitoring performance.
  • Item
    RKPCA-based approach for fault detection in large scale systems using variogram method
    (Elsevier, 2022) Kaib, Mohammed Tahar Habib; Kouadri, Abdelmalek; Harkat, Mohamed Faouzi; Bensmail, Abderazak
    Principal Component Analysis (PCA)-based fault detection is a simple and accurate data-driven technique for feature extraction and selection. However, PCA performs poorly if the data used has nonlinear characteristics, and this type of data is widely present in most industrial processes. To overcome this drawback, Kernel PCA (KPCA) is an alternative technique used to work on this type of data, but it requires more computation time and memory storage space for large-sized data sets. Many size-reduction techniques have been developed to select the most relevant observations to be employed by KPCA. This approach, known as Reduced KPCA (RKPCA), consequently requires less computation time and memory storage space than KPCA, and it possesses the advantages of both KPCA and standard PCA. In this paper, a reduction in the size of a data set based on a multivariate variogram is proposed. According to its conventional formalism, the uncorrelated observations are selected and kept to form a reduced training data set. Afterward, the KPCA model is built from this data set for fault detection purposes. The proposed RKPCA scheme is tested using an actual involuntary process fault and various simulated sensor faults in a cement plant. Compared to other RKPCA techniques, the developed one yields better results.
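    The multivariate experimental variogram that drives this selection measures how dissimilar observations become as the lag between them grows. The sketch below computes it and then subsamples at the lag where the variogram approaches its sill; this uniform-spacing rule is a simplification assumed for illustration, not the paper's exact selection procedure:

```python
import numpy as np

def experimental_variogram(X, max_lag):
    """Multivariate experimental variogram:
    gamma(h) = 0.5 * mean ||x_{t+h} - x_t||^2 for h = 1..max_lag."""
    return np.array([0.5 * np.mean(np.sum((X[h:] - X[:-h]) ** 2, axis=1))
                     for h in range(1, max_lag + 1)])

def select_uncorrelated(X, max_lag):
    """Keep every h*-th observation, where h* is the smallest lag whose
    variogram value reaches ~95% of the sill (illustrative rule)."""
    gam = experimental_variogram(X, max_lag)
    sill = gam[-1]                              # plateau value at long lags
    h_star = 1 + int(np.argmax(gam >= 0.95 * sill))
    return X[::h_star]
```

    On strongly autocorrelated data the variogram rises slowly, so h* is large and the retained set is much smaller than the original, which is exactly the storage and runtime saving RKPCA targets.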
  • Item
    Multivariate nuisance alarm management in chemical processes
    (Elsevier, 2021) Kaced, Radhia; Kouadri, Abdelmalek; Baiche, Karim; Bensmail, Abderazak
    Alarm systems are of vital importance to the safe and effective functioning of industrial plants, yet they frequently suffer from too many nuisance alarms (alarm overloading). It is necessary to intelligently enhance existing alarm systems and supply accurate information to the operators. Nowadays, process variables are more correlated and complicated, and this correlation structure can be used as a basis to manage alarms efficiently; hence, multivariate approaches are more appropriate. Designing a system aimed at reducing nuisance alarms is an essential phase in guaranteeing the reliable operation of a plant. Due to the definition of alarm limits, the problem of false alarms is inevitable in multivariate methods. In this paper, conventional Principal Component Analysis (PCA) is applied to extract the sum of squared prediction error (SPE), known as the Q statistic, and the Hotelling T² statistic. These statistics are used separately as alarm indicators, with their control limits duly modified. Consequently, for each statistic, a nonlinear combination of alarm duration and alarm deviation is additionally exploited as a new requirement to activate an alarm or not. The resulting new index is fed to a delay timer with a defined parameter. The implementation of this technique results in a significant reduction in the severity of alarm overloading. Historical data collected from the cement rotary kiln operating under healthy conditions are employed to build the PCA model and extract the proposed alarming indexes. Then, various testing data sets, covering different types of faults occurring in the cement process, are used to assess the performance of the developed method. In comparison with the conventional PCA technique, alarms are better managed and almost all nuisance alarms are suppressed. The proposed method is more robust to false alarms and more sensitive to fault detection.
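    The delay-timer step can be sketched on its own: the alarm is activated only after the indicator has violated its limit for n consecutive samples, which suppresses short nuisance chattering. The combined duration/deviation index developed in the paper is not reproduced here; this shows only the standard on-delay mechanism it feeds:

```python
def on_delay_timer(violations, n):
    """Raise the alarm only after n consecutive limit violations;
    reset the counter as soon as the indicator returns below its limit."""
    alarms, run = [], 0
    for v in violations:
        run = run + 1 if v else 0    # count the current violation streak
        alarms.append(run >= n)
    return alarms
```

    For example, with n = 3 an isolated two-sample excursion never raises the alarm, while a sustained violation does after three samples.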
  • Item
    Kernelized relative entropy for direct fault detection in industrial rotary kilns
    (John Wiley and Sons Ltd, 2018) Hamadouche, Anis; Kouadri, Abdelmalek; Bensmail, Abderazak
    The objective of this work is to use a one-dimensional signal that reflects the dissimilarity between multidimensional probability densities for detection. With the modified Kullback-Leibler divergence, faults can be directly detected without any normality assumption or joint monitoring of related test statistics in different subspaces, such as the T² and SPE in principal component analysis–based methods. To relieve the difficulty associated with asymptotic high-dimensional density estimates, we estimate the density ratio rather than the densities themselves. This can be done by approximating the density ratio with kernel basis functions and learning the weights from the available data. The developed algorithm is generic and can be applied to any industrial system as long as process historical data are available. As a case study, we apply this algorithm to a real rotary kiln in operation, which is an integral part of the cement manufacturing plant of Ain El Kebira, Algeria.
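    The density-ratio idea above can be sketched with a least-squares (uLSIF-style) fit: model r(x) = p_test(x)/p_ref(x) as a kernel expansion, learn the weights from data, and plug the ratio into KL ≈ mean(log r) over the test window. This is one standard estimator of that family, not necessarily the paper's modified divergence; the kernel width, regularization, and function names are assumptions:

```python
import numpy as np

def rbf(X, C, gamma):
    d2 = ((X[:, None, :] - C[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kl_via_density_ratio(X_ref, X_test, gamma=1.0, lam=1e-3):
    """Estimate KL(p_test || p_ref) from samples, without estimating
    either density: fit r(x) by regularized least squares, then average
    log r over the test window (the 1-D detection signal)."""
    C = X_test                               # kernel centers on test samples
    Phi_ref = rbf(X_ref, C, gamma)           # basis on reference data
    Phi_test = rbf(X_test, C, gamma)         # basis on test data
    H = Phi_ref.T @ Phi_ref / len(X_ref)     # approx E_ref[phi phi^T]
    h = Phi_test.mean(0)                     # approx E_test[phi]
    theta = np.linalg.solve(H + lam * np.eye(len(C)), h)
    r_test = np.clip(Phi_test @ theta, 1e-12, None)
    return float(np.mean(np.log(r_test)))
```

    In a monitoring setting, X_ref is healthy historical data and X_test is a sliding window of recent samples; the scalar output rises when the process drifts away from healthy operation.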
  • Item
    A modified moving window dynamic PCA with fuzzy logic filter and application to fault detection
    (Elsevier, 2018) Ammiche, Mustapha; Kouadri, Abdelmalek; Bensmail, Abderazak