Scientific Publications

Permanent URI for this community: https://dspace.univ-boumerdes.dz/handle/123456789/10


Search Results

Now showing 1 - 10 of 35
  • Item
    Achievable Rates of Full Duplex Cooperative Relay Selection-Based Machine Learning
    (IEEE, 2025) Belaoura, Widad; Althunibat, Saud; Mazen, Hasna; Qaraqe, Khalid; Ammuri, Rula
    Machine learning (ML) is an advanced artificial intelligence technology that addresses the ever-growing complexity of communication signal processing. In this paper, an ML-based classification model for choosing the best relay is investigated in a full duplex (FD) cooperative system. Specifically, a K-nearest neighbors (KNN)-based relay selection scheme is applied to accurately predict and evaluate the achievable rate of the optimal FD relay. The core idea of the multi-class KNN is to identify the optimal relay that yields the highest achievable rate by utilizing a large set of offline training data derived from the channel state information (CSI), ensuring that no further training is required during system operation. The results indicate that KNN-based FD relay selection achieves a rate comparable to the optimal exhaustive search method at lower computational complexity.
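The multi-class KNN selection described above can be sketched on synthetic CSI, where each training label is the relay index that maximizes log2(1 + SNR), i.e. the exhaustive-search answer. The relay count, SNR distribution, and feature choice below are illustrative assumptions, not the paper's actual setup.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
K, n = 4, 2000                       # hypothetical: 4 candidate FD relays
snr = rng.exponential(scale=5.0, size=(n, K))   # per-relay SNR (linear scale)
rates = np.log2(1.0 + snr)           # achievable rate of each relay
best = rates.argmax(axis=1)          # label = relay an exhaustive search would pick

# offline training on CSI-derived features; no retraining at run time
model = KNeighborsClassifier(n_neighbors=7)
model.fit(snr[:1500], best[:1500])
pred = model.predict(snr[1500:])
acc = (pred == best[1500:]).mean()
```

At inference the classifier replaces an O(K) exhaustive rate comparison with a nearest-neighbor lookup, which is the complexity trade-off the abstract refers to.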
  • Item
    Design and implementation of a self-driving car using deep reinforcement learning: A comprehensive study
    (Elsevier, 2025) Djerbi, Rachid; Rouane, Anis; Taleb, Zineb; Saradouni, Safia
    This paper presents a comprehensive study on the design, implementation, and evaluation of a self-driving car utilizing deep reinforcement learning, showcasing significant advancements in autonomous vehicle technology. Our robust framework integrates three innovative AI models for essential functionalities: road detection, traffic sign recognition, and obstacle avoidance. The system architecture, structured around a three-layer “DDD” (Data, Detection, Decision) approach, involves meticulous data preprocessing for traffic signs and road data, followed by specialized Deep Learning models for each detection task, including a CNN for traffic signs, a CNN for road detection, and the pre-trained MobileNet-SSD for obstacle detection. A reinforcement learning agent in the Decision Layer processes these outputs for real-time control (steering, acceleration, braking) through a continuous learning process with environmental feedback. The research encompasses both extensive simulation in Unity, leveraging the ML-Agents toolkit for agent training across diverse environments, and crucial real-world deployment. Our reward/punishment system in the simulation environment, based on collisions with road markers and obstacles, refined the agent's decision-making. The trained AI models were successfully exported and deployed onto a physical prototype, controlled by a Raspberry Pi and equipped with a camera and ultrasonic sensors. Real-world testing affirmed the robust performance of the physical model in detecting roads, recognizing traffic signs, and effectively avoiding obstacles. Quantitative results demonstrate compelling performance, including over 90% accuracy in obstacle detection and a 15% improvement in navigation efficiency compared to traditional algorithms under controlled simulation conditions. Model evaluation metrics show a 98% accuracy, 12% loss, and a prediction rate exceeding 77%.
This study not only contributes a comprehensive framework for autonomous vehicle development but also highlights the transformative potential of deep reinforcement learning for creating intelligent and adaptable autonomous systems in both virtual and real-world scenarios, paving the way for safer and more efficient transportation technologies.
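The collision-based reward/punishment idea in the Decision Layer can be illustrated with tabular Q-learning on a toy two-lane road, a deliberately minimal stand-in for the paper's Unity/ML-Agents training: the grid size, obstacle position, and reward values here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
LANES, COLS = 2, 6
OBSTACLE = (0, 3)                 # hypothetical obstacle blocking lane 0

def step(lane, col, action):
    """Advance one cell; action 1 switches lanes (a crude steering choice)."""
    lane = 1 - lane if action == 1 else lane
    col += 1
    if (lane, col) == OBSTACLE:
        return (lane, col), -1.0, True    # collision -> punishment, episode ends
    if col == COLS - 1:
        return (lane, col), 1.0, True     # end of road reached -> reward
    return (lane, col), -0.05, False      # small step cost encourages progress

Q = np.zeros((LANES, COLS, 2))
alpha, gamma, eps = 0.5, 0.9, 0.2
for _ in range(500):                      # training episodes with eps-greedy exploration
    lane, col, done = 0, 0, False
    while not done:
        a = int(rng.integers(2)) if rng.random() < eps else int(Q[lane, col].argmax())
        (l2, c2), r, done = step(lane, col, a)
        best_next = 0.0 if done else Q[l2, c2].max()
        Q[lane, col, a] += alpha * (r + gamma * best_next - Q[lane, col, a])
        lane, col = l2, c2

# greedy rollout: the learned policy should dodge the obstacle and finish the road
lane, col, done, final_r = 0, 0, False, 0.0
while not done:
    (lane, col), final_r, done = step(lane, col, int(Q[lane, col].argmax()))
```

The same loop structure (observe, act, receive environmental feedback, update) is what a deep RL agent scales up with a neural network in place of the Q-table.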
  • Item
    A data driven fault diagnosis approach for robotic cutting tools in smart manufacturing
    (International Society of Automation, 2025) Afia, Adel; Gougam, Fawzi; Soualhi, Abdenour; Wadi, Mohammed; Tahi, Mohamed
    In smart manufacturing within Industry 4.0, tool condition monitoring (TCM) is used to improve productivity and machine availability by leveraging advanced sensors and computational intelligence to prevent tool damage. This paper develops a hybrid methodology using heterogeneous sensor measurements for monitoring robotic cutting tools with four tool states: healthy, surface damage, flake damage and broken tooth. The proposed approach integrates the maximal overlap discrete wavelet packet transform (MODWPT) with health indicators to construct feature matrices for each tool state. Feature selection is performed using the tree growth algorithm (TGA) to reduce computation time and improve feature space separation by selecting only relevant features. The selected features are input into a Gaussian mixture model (GMM) to detect, identify and classify each tool state with high accuracy. The proposed method provides a classification accuracy of 99.04 % for vibration, 95.51 % for torque, and 91.67 % for force signals. Using unseen vibration data, the model achieved a test accuracy of 98.44 %, demonstrating a high degree of generalizability. Comparative analysis demonstrates that the proposed approach provides superior feature discrimination and model stability, balancing computational efficiency with classification accuracy and validating the TGA-GMM framework as an effective solution for tool fault diagnosis in noisy, high-dimensional data.
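The select-then-classify pipeline can be sketched as follows on synthetic health indicators. Note two loud assumptions: the data is fabricated (four tool states, only three informative features), and an ANOVA-based `SelectKBest` stands in for the tree growth algorithm, which has no scikit-learn implementation; the per-class Gaussian mixture classification follows the GMM idea directly.

```python
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(2)
STATES = 4                  # healthy, surface damage, flake damage, broken tooth
n, d = 200, 10              # 10 hypothetical health indicators per sample
X = rng.normal(size=(STATES * n, d))
y = np.repeat(np.arange(STATES), n)
X[:, :3] += 2.0 * y[:, None]        # only 3 features actually carry fault information

# feature selection stand-in for TGA: keep the k most discriminative indicators
sel = SelectKBest(f_classif, k=3).fit(X, y)
Xs = sel.transform(X)

# fit one Gaussian mixture per tool state; classify by maximum log-likelihood
gmms = [GaussianMixture(n_components=1, random_state=0).fit(Xs[y == c])
        for c in range(STATES)]
scores = np.column_stack([g.score_samples(Xs) for g in gmms])
acc = (scores.argmax(axis=1) == y).mean()
```

Shrinking the feature space before the GMM step is what buys the computation-time reduction the abstract mentions, since mixture fitting cost grows quickly with dimensionality.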
  • Item
    Exploring Multi-Channel GPS Receivers for Detecting Spoofing Attacks on UAVs Using Machine Learning
    (Multidisciplinary Digital Publishing Institute, 2025) Mouzai, Mustapha; Riahla, Mohamed Amine; Keziou, Amor; Fouchal, Hacène
    All current transportation systems (vehicles, trucks, planes, etc.) rely on the Global Positioning System (GPS) as their main navigation technology. GPS receivers collect signals from multiple satellites and provide positioning of varying accuracy. For civilian applications, GPS signals are sent without any encryption; for this reason, they are vulnerable to various attacks, the most prevalent of which is known as GPS spoofing. The main consequence is the loss of position monitoring, which may increase damage risks in terms of crashes or hijacking. In this study, we focus on UAV (unmanned aerial vehicle) positioning attacks. We first review numerous techniques for detecting and mitigating GPS spoofing attacks, finding that various types of attacks may occur. In the literature, many studies have focused on only one type of attack; we believe that studying multiple attack types is crucial for developing efficient mitigation mechanisms. Thus, we have explored a well-known dataset containing authentic UAV signals along with spoofed signals (covering three types of attacked signals). As a main contribution, we propose a more interpretable approach to exploit the dataset by extracting individual mission sequences, handling non-stationary features, and converting the raw GPS data into a simplified structured format. Then, we design tree-based machine learning algorithms, namely decision tree (DT), random forest (RF), and extreme gradient boosting (XGBoost), to classify signal types and recognize spoofing attacks. Our main findings are as follows: (a) random forest shows significant capability in detecting and classifying GPS spoofing attacks, outperforming the other models; (b) we were able to detect most types of attacks and distinguish between them.
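The four-way classification task (authentic plus three spoofing types) can be sketched with a random forest on fabricated per-sequence features; the feature meanings, class separation, and dataset shape below are illustrative assumptions, not the actual UAV dataset.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
CLASSES = 4      # authentic + 3 spoofing attack types, mirroring the dataset
n, d = 300, 8    # hypothetical per-mission-sequence GPS features
X = rng.normal(size=(CLASSES * n, d))
y = np.repeat(np.arange(CLASSES), n)
X[:, :2] += 2.5 * y[:, None]        # class-dependent shift: a learnable signature

Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.25,
                                      random_state=0, stratify=y)
rf = RandomForestClassifier(n_estimators=100, random_state=0).fit(Xtr, ytr)
acc = rf.score(Xte, yte)
```

Converting raw time series into fixed-length per-sequence feature vectors, as the abstract describes, is what makes tree-based models like this applicable at all.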
  • Item
    GPS Spoofing Attack Against UAVs: A Timeseries Dataset Case Study
    (Springer Science and Business Media, 2025) Mouzai, Mustapha; Riahla, Mohamed Amine
    Over the past few years, the world has witnessed a notable surge in the adoption of Unmanned Aerial Vehicles (UAVs) in civil and military applications, including border surveillance, search and rescue, agriculture, and delivery. However, this growth has not been matched by the security mechanisms needed to respond to the threats and vulnerabilities posed by malicious actors. Therefore, in this study we investigate one of the stealthiest attacks afflicting the navigation system of UAVs, the GPS spoofing attack. We overview the different detection techniques existing in the literature and highlight machine learning-based approaches dealing with time series data.
  • Item
    Offline Arabic handwritten character recognition: from conventional machine learning system to deep learning approaches
    (2022) Faouci, Soumia; Gaceb, Djamel; Haddad, Mohammed
    Researchers have made great strides in the area of Arabic handwritten character recognition in recent decades, especially with the fast development of deep learning algorithms. The characteristics of Arabic manuscript text pose several problems for a recognition system. This paper first presents a conventional machine learning system based on the extraction of a set of preselected features and an SVM classifier. In the second part, a simplified convolutional neural network (CNN) model is proposed and compared to six other CNN models based on pre-trained architectures. The suggested methods were tested using three databases: two versions of the OIHACDB dataset and the AIA9K dataset. The experimental results show that the proposed CNN model obtained promising results, as it is able to recognise 94.7%, 98.3%, and 95.6% of the test sets of the three databases OIHACDB-28, OIHACDB-40, and AIA9K, respectively.
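The conventional features-plus-SVM pipeline can be sketched end to end; since OIHACDB and AIA9K are not publicly bundled with scikit-learn, the built-in 8x8 digits dataset serves as an assumed stand-in for character images, with raw pixels standing in for the paper's preselected features.

```python
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# stand-in data: scikit-learn's 8x8 handwritten digits in place of Arabic characters
X, y = load_digits(return_X_y=True)
Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.25, random_state=0)

# RBF-kernel SVM classifier, as in the paper's conventional system
clf = SVC(kernel="rbf", gamma="scale").fit(Xtr, ytr)
acc = clf.score(Xte, yte)
```

The CNN alternative in the paper replaces the hand-picked feature stage with learned convolutional features, which is where its accuracy gains on the larger datasets come from.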
  • Item
    Using Machine Learning Algorithms for the Analysis and Modeling of the Rheological Properties of Algerian Crude Oils
    (Taylor and Francis Ltd., 2024) Souas, Farid; Oulebsir, Rafik
    Our research described in this report investigated the rheological behavior of crude oils from the Tin Fouye Tabankort oil field in Southern Algeria, focusing on their viscosity under varying temperatures (10 °C–50 °C). The results show that the oils exhibited non-Newtonian shear-thinning behavior at low shear rates, with the viscosity decreasing as the temperature was increased. At higher shear rates, the Herschel–Bulkley model accurately described the oils’ transition to Newtonian behavior. Machine learning models, including CatBoost, LightGBM, and XGBoost, were trained on the experimental data to predict the viscosity, with CatBoost and XGBoost showing superior performance. We suggest these findings are valuable for improving the efficiency of oil transportation and processing.
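Viscosity prediction from temperature and shear rate can be sketched with gradient boosting; scikit-learn's `GradientBoostingRegressor` is used as an assumed stand-in for CatBoost/XGBoost (which may not be installed), and the shear-thinning data below is generated from a hypothetical power-law relation, not the Tin Fouye Tabankort measurements.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(4)
n = 500
T = rng.uniform(10, 50, n)           # temperature, deg C (the paper's range)
gdot = rng.uniform(1, 200, n)        # shear rate, 1/s
# hypothetical law: viscosity falls with temperature and thins with shear rate
mu = 50 * np.exp(-0.04 * T) * gdot ** -0.2 + rng.normal(0, 0.1, n)

X = np.column_stack([T, gdot])
gbr = GradientBoostingRegressor(random_state=0).fit(X[:400], mu[:400])
r2 = gbr.score(X[400:], mu[400:])    # coefficient of determination on held-out data
```

Boosted trees suit this kind of problem because the temperature and shear-rate effects interact nonlinearly, which a single global regression form captures poorly.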
  • Item
    Development of an expert-informed rig state classifier using the Naive Bayes algorithm for invisible lost time measurement
    (Springer Nature, 2024) Youcefi, Mohamed Riad; Boukredera, Farouk Said; Ghalem, Khaled; Hadjadj, Ahmed; Ezenkwu, Chinedu Pascal
    The rig state plays a crucial role in recognizing the operations carried out by the drilling crew and quantifying Invisible Lost Time (ILT). This lost time, often challenging to assess and report manually in daily reports, results in delays to the scheduled timeline. In this paper, the Naive Bayes algorithm was used to establish a novel rig state classifier. Training data, consisting of a large set of rules, was generated based on drilling experts’ recommendations. This dataset was then employed to build a Naive Bayes classifier capable of emulating the cognitive processes of skilled drilling engineers and accurately recognizing the actual drilling operation from surface data. The developed model was used to process high-frequency drilling data collected from three wells, aiming to derive the Key Performance Indicators (KPIs) related to each drilling crew’s efficiency and quantify the ILT during drilling connections. The obtained results revealed that the established classifier excelled in automatically recognizing drilling operations, achieving a high success rate of 99.747%. The findings of this study offer valuable insights for drillers and rig supervisors, enabling real-time visual assessment of efficiency and prompt intervention to reduce ILT.
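The rule-generated-training idea can be sketched by encoding a few expert-style rules as noisy prototypes of surface parameters and fitting a Gaussian Naive Bayes classifier; the three rig states, parameter choices, and noise levels below are illustrative assumptions, not the paper's actual rule set.

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(5)
# hypothetical expert rules mapping (RPM, flow rate, block velocity) to a rig state:
#   0 rotary drilling: high RPM, pumps on    1 circulation: no rotation, pumps on
#   2 connection: no rotation, pumps off, block moving
RULES = {0: (120.0, 800.0, 0.1), 1: (5.0, 800.0, 0.1), 2: (5.0, 0.0, 0.5)}

X, y = [], []
for label, (rpm, flow, vel) in RULES.items():
    for _ in range(300):     # jitter each rule into many surface-data samples
        X.append([rpm + rng.normal(0, 10),
                  flow + rng.normal(0, 50),
                  abs(vel + rng.normal(0, 0.05))])
        y.append(label)
X, y = np.array(X), np.array(y)

nb = GaussianNB().fit(X[::2], y[::2])   # train/test split by alternating samples
acc = nb.score(X[1::2], y[1::2])
```

Because Naive Bayes treats each surface channel independently, it stays cheap enough to label high-frequency data streams in real time, which is what the KPI and ILT computation downstream requires.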
  • Item
    Enhancing Porosity Prediction in Reservoir Characterization through Ensemble Learning: A Comparative Study between Stacking, Bayesian Model Optimization, Boosting, and Random Forest
    (Slovnaft VURUP a.s, 2024) Youcefi, Mohamed Riad; Alshokri, Ayman Inamat; Boussebci, Walid; Ghalem, Khaled; Hadjadj, Asma
    Accurate estimation of porosity is a critical factor in reservoir characterization. This study aims to enhance porosity prediction through the implementation and comparison of various stacking ensemble learning strategies. A dataset comprising 273 points, consisting of well logs and core measurements, was collected from two wells for model development. Four base learners, including Support Vector Regression (SVR), Multi-Layer Perceptron (MLP), Random Forest Regression (RFR), and XGBoost, were trained on this dataset. These models were then integrated using multiple stacking ensemble techniques, such as weighted averaging, Bayesian model averaging, and RFR as a meta-learner. Meta-learners were trained on predictions from the base learners, generated through cross-validation on held-out data. Performance evaluations of both base and meta learners were conducted on a separate testing dataset using statistical and graphical error analysis. Results indicate that all learners demonstrated robust performance, with weighted averaging outperforming other strategies on testing data. The stacking ensemble approach, particularly through weighted averaging, effectively improved base learner performance on testing data by leveraging individual model strengths and mitigating weaknesses. The findings of this study are valuable for geoscientists and reservoir engineers in achieving accurate reservoir characterization and facilitating exploration activities.
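The RFR-as-meta-learner variant can be sketched with scikit-learn's `StackingRegressor`, which handles the cross-validated base predictions internally. The well-log features and linear porosity relation below are fabricated (porosity in percent); XGBoost is omitted since it may not be installed, leaving SVR, MLP, and RFR as the base learners.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor, StackingRegressor
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.svm import SVR

rng = np.random.default_rng(6)
n = 273                              # same dataset size as the study
logs = rng.normal(size=(n, 4))       # hypothetical standardized well logs
phi = 20 + 5 * logs[:, 1] - 3 * logs[:, 2] + rng.normal(0, 0.5, n)  # porosity, %

Xtr, Xte, ytr, yte = train_test_split(logs, phi, test_size=0.2, random_state=0)

# base learners stacked under an RFR meta-learner; cv=5 generates the
# out-of-fold base predictions the meta-learner is trained on
stack = StackingRegressor(
    estimators=[("svr", SVR()),
                ("mlp", MLPRegressor(max_iter=2000, random_state=0)),
                ("rfr", RandomForestRegressor(random_state=0))],
    final_estimator=RandomForestRegressor(random_state=0),
    cv=5)
stack.fit(Xtr, ytr)
r2 = stack.score(Xte, yte)
```

Training the meta-learner on out-of-fold predictions rather than in-sample fits is the key detail: it prevents the meta-learner from simply trusting whichever base model overfits hardest.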
  • Item
    Modeling wax disappearance temperature using robust white-box machine learning
    (Elsevier Ltd, 2024) Nait Amar, Menad; Zeraibi, Noureddine; Benamara, Chahrazed; Djema, Hakim; Saifi, Redha; Gareche, Mourad
    Wax deposition is one of the major operational problems encountered in upstream petroleum production systems. The deposition of this undesirable scale can cause a variety of challenging problems, and to avoid them, numerous parameters associated with the mechanism of wax deposition should be determined precisely. In this study, a new smart correlation was proposed for the accurate prediction of wax disappearance temperature (WDT) using a robust explicit machine learning (ML) approach, namely gene expression programming (GEP). The correlation was developed using comprehensive experimental measurements. The obtained results revealed the promising accuracy of the suggested GEP-based correlation, which provided excellent statistical metrics (R2 = 0.9647 and AARD = 0.5963 %) and outperformed many existing approaches for predicting WDT. In addition, the trend analysis performed on the outcomes of the proposed correlation confirmed its physical validity and consistency. Lastly, the findings of this study offer a promising benefit, as the newly developed correlation can notably improve the estimation of WDT, thus facilitating the simulation of wax deposition-related phenomena. In this context, the proposed correlation supports effective management of production facilities and improvement of project economics, since it is a simple-to-use decision-making tool for production and chemical engineers engaged in the management of organic deposit-related issues.
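What "explicit" means here can be illustrated by fitting a closed-form candidate expression and scoring it with the same metrics the abstract reports (R2 and AARD). A least-squares fit of a fixed functional form stands in for GEP, which evolves the form itself; the wax-content and pressure dependence of WDT below is a hypothetical relation, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 200
w = rng.uniform(2, 20, n)            # hypothetical wax content, wt%
p = rng.uniform(1, 30, n)            # hypothetical pressure, MPa
# assumed ground truth: WDT (K) rises with wax content, falls with pressure
wdt = 290 + 18 * np.log(w) - 0.4 * p + rng.normal(0, 0.5, n)

# explicit GEP-style candidate: WDT = a + b*ln(w) + c*p, fitted by least squares
A = np.column_stack([np.ones_like(w), np.log(w), p])
coef, *_ = np.linalg.lstsq(A, wdt, rcond=None)
pred = A @ coef

aard = np.mean(np.abs((pred - wdt) / wdt)) * 100          # average absolute relative deviation, %
r2 = 1 - np.sum((wdt - pred) ** 2) / np.sum((wdt - wdt.mean()) ** 2)
```

Once the symbolic form is found, the correlation reduces to evaluating one formula, which is why such models are attractive as hand-calculable engineering tools compared with black-box ML.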