Publications Scientifiques

Permanent URI for this community: https://dspace.univ-boumerdes.dz/handle/123456789/10

Search Results

Now showing 1 - 10 of 15
  • Item
    Achievable Rates of Full Duplex Cooperative Relay Selection-Based Machine Learning
    (IEEE, 2025) Belaoura, Widad; Althunibat, Saud; Hasna, Mazen; Qaraqe, Khalid; Ammuri, Rula
    Machine learning (ML) is an advanced artificial intelligence technology that addresses the ever-growing complexity of communication signal processing. In this paper, an ML-based classification model for choosing the best relay is investigated in a full-duplex (FD) cooperative system. Specifically, K-nearest neighbors (KNN)-based relay selection is applied to accurately predict and evaluate the achievable rate of the optimal FD relay. The core idea of the multi-class KNN is to identify the relay that yields the highest achievable rate by using a large set of offline training data derived from channel state information (CSI), so that no further training is required during system operation. The results indicate that KNN-based FD relay selection achieves a rate comparable to the optimal exhaustive-search method at lower computational complexity.
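The multi-class KNN selection described above can be sketched in a few lines. This is only a minimal illustration, not the paper's implementation: the 2-D CSI features, relay labels, and k = 3 are invented for the example.

```python
import math
from collections import Counter

def knn_select_relay(csi_train, relay_labels, csi_query, k=3):
    """Pick the relay by majority vote among the k training CSI
    vectors closest (Euclidean distance) to the query CSI vector."""
    ranked = sorted(
        (math.dist(x, csi_query), y) for x, y in zip(csi_train, relay_labels)
    )
    votes = Counter(y for _, y in ranked[:k])
    return votes.most_common(1)[0][0]

# Toy offline training set: 2-D CSI features labelled with the index
# of the relay that maximised the achievable rate for that channel state.
train = [(0.1, 0.2), (0.2, 0.1), (0.9, 0.8), (0.8, 0.9), (0.85, 0.7)]
labels = [0, 0, 1, 1, 1]
best = knn_select_relay(train, labels, (0.9, 0.85), k=3)
```

Because all training happens offline, the online step is just this distance ranking and vote, which is where the complexity saving over exhaustive search comes from.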
  • Item
    Design and implementation of a self-driving car using deep reinforcement learning: A comprehensive study
    (Elsevier, 2025) Djerbi, Rachid; Rouane, Anis; Taleb, Zineb; Saradouni, Safia
    This paper presents a groundbreaking and comprehensive study on the design, implementation, and evaluation of a self-driving car utilizing deep reinforcement learning, showcasing significant advancements in autonomous vehicle technology. Our robust framework integrates three innovative AI models for essential functionalities: road detection, traffic sign recognition, and obstacle avoidance. The system architecture, structured around a three-layer “DDD” (Data, Detection, Decision) approach, involves meticulous preprocessing of traffic sign and road data, followed by a specialized deep learning model for each detection task: a CNN for traffic signs, a CNN for road detection, and the pre-trained MobileNet-SSD for obstacle detection. A reinforcement learning agent in the Decision layer processes these outputs for real-time control (steering, acceleration, braking) through a continuous learning process with environmental feedback. The research encompasses both extensive simulation in Unity, leveraging the ML-Agents toolkit for agent training across diverse environments, and real-world deployment. Our reward/punishment system in the simulation environment, based on collisions with road markers and obstacles, refined the agent's decision-making. The trained AI models were exported and deployed onto a physical prototype, controlled by a Raspberry Pi and equipped with a camera and ultrasonic sensors. Real-world testing affirmed the robust performance of the physical model in detecting roads, recognizing traffic signs, and effectively avoiding obstacles. Quantitative results demonstrate compelling performance, including over 90% accuracy in obstacle detection and a 15% improvement in navigation efficiency compared to traditional algorithms under controlled simulation conditions. Model evaluation metrics show 98% accuracy, 12% loss, and a prediction rate exceeding 77%.
This study not only contributes a comprehensive framework for autonomous vehicle development but also highlights the transformative potential of deep reinforcement learning for creating intelligent and adaptable autonomous systems in both virtual and real-world scenarios, paving the way for safer and more efficient transportation technologies.
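A reward/punishment scheme of the kind described (penalising collisions with road markers and obstacles, rewarding progress on the road) might look roughly like the sketch below; all numeric weights here are assumptions for illustration, not values from the paper.

```python
def step_reward(on_road, hit_marker, hit_obstacle, progress):
    """Hypothetical per-step reward shaping for the driving agent.
    `progress` is forward distance gained this step; the penalty
    magnitudes are illustrative, with obstacle hits penalised hardest."""
    reward = 0.0
    if on_road:
        reward += progress       # encourage forward motion on the road
    else:
        reward -= 1.0            # drifting off the road
    if hit_marker:
        reward -= 5.0            # clipped a road marker
    if hit_obstacle:
        reward -= 10.0           # collision with an obstacle
    return reward
```

In a Unity/ML-Agents setup this would be the scalar returned to the agent each simulation step, so the policy is refined directly by collision feedback.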
  • Item
    A data driven fault diagnosis approach for robotic cutting tools in smart manufacturing
    (International Society of Automation, 2025) Afia, Adel; Gougam, Fawzi; Soualhi, Abdenour; Wadi, Mohammed; Tahi, Mohamed
    In smart manufacturing within Industry 4.0, tool condition monitoring (TCM) is used to improve productivity and machine availability by leveraging advanced sensors and computational intelligence to prevent tool damage. This paper develops a hybrid methodology using heterogeneous sensor measurements for monitoring robotic cutting tools across four tool states: healthy, surface damage, flake damage, and broken tooth. The proposed approach integrates the maximal overlap discrete wavelet packet transform (MODWPT) with health indicators to construct feature matrices for each tool state. Feature selection is performed using the tree growth algorithm (TGA) to reduce computation time and improve feature-space separation by retaining only relevant features. The selected features are input into a Gaussian mixture model (GMM) to detect, identify, and classify each tool state with high accuracy. The proposed method provides a classification accuracy of 99.04% for vibration, 95.51% for torque, and 91.67% for force signals. On unseen vibration data, the model achieved a test accuracy of 98.44%, demonstrating a high degree of generalizability. Comparative analysis shows that the proposed approach provides superior feature discrimination and model stability, balancing computational efficiency with classification accuracy, and validates the TGA-GMM framework as an effective solution for tool fault diagnosis on noisy, high-dimensional data.
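As a rough illustration of the classification stage, the sketch below fits one diagonal Gaussian per tool state, a one-component simplification of the paper's GMM; the toy feature values and state names are invented for the example.

```python
import math
from statistics import mean, pvariance

def fit_class_gaussians(features, labels):
    """Fit one diagonal Gaussian per tool state (a one-component
    simplification of a per-class GMM)."""
    params = {}
    for c in set(labels):
        rows = [x for x, y in zip(features, labels) if y == c]
        cols = list(zip(*rows))
        # (mean, variance) per feature dimension; small floor avoids zero variance
        params[c] = [(mean(col), pvariance(col) + 1e-6) for col in cols]
    return params

def classify(params, x):
    """Assign the tool state with the highest log-likelihood."""
    def loglik(stats):
        return sum(
            -0.5 * math.log(2 * math.pi * v) - (xi - m) ** 2 / (2 * v)
            for xi, (m, v) in zip(x, stats)
        )
    return max(params, key=lambda c: loglik(params[c]))

# Toy 2-D features for two of the four states.
features = [(0.0, 0.1), (0.1, 0.0), (3.0, 3.1), (3.1, 3.0)]
states = ["healthy", "broken tooth"]
model = fit_class_gaussians(features, ["healthy", "healthy",
                                       "broken tooth", "broken tooth"])
```

In the paper the inputs to this stage would be the TGA-selected MODWPT health indicators rather than raw toy numbers.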
  • Item
    GPS Spoofing Attack Against UAVs: A Timeseries Dataset Case Study
    (Springer Science and Business Media, 2025) Mouzai, Mustapha; Riahla, Mohamed Amine
    Over the past few years, the world has witnessed a notable surge in the adoption of unmanned aerial vehicles (UAVs) in civil and military applications, including border surveillance, search and rescue, agriculture, and delivery. This growth, however, has not been matched by the security mechanisms needed to counter the threats and vulnerabilities posed by malicious actors. In this study we therefore investigate one of the stealthiest attacks afflicting the navigation system of UAVs, the GPS spoofing attack. We review the detection techniques existing in the literature and highlight machine learning-based approaches that deal with time-series data.
  • Item
    Offline Arabic handwritten character recognition: from conventional machine learning system to deep learning approaches
    (2022) Faouci, Soumia; Gaceb, Djamel; Haddad, Mohammed
    Researchers have made great strides in Arabic handwritten character recognition over the last few decades, especially with the rapid development of deep learning algorithms. The characteristics of Arabic manuscript text pose several problems for a recognition system. This paper first presents a conventional machine learning system based on the extraction of a set of preselected features and an SVM classifier. In the second part, a simplified convolutional neural network (CNN) model is proposed and compared to six other CNN models based on pre-trained architectures. The suggested methods were tested on three databases: two versions of the OIHACDB dataset and the AIA9K dataset. The experimental results show that the proposed CNN model obtained promising results, recognising 94.7%, 98.3%, and 95.6% of the test sets of OIHACDB-28, OIHACDB-40, and AIA9K, respectively.
  • Item
    Using Machine Learning Algorithms for the Analysis and Modeling of the Rheological Properties of Algerian Crude Oils
    (Taylor and Francis Ltd., 2024) Souas, Farid; Oulebsir, Rafik
    This study investigated the rheological behavior of crude oils from the Tin Fouye Tabankort oil field in southern Algeria, focusing on their viscosity at temperatures from 10 °C to 50 °C. The results show that the oils exhibited non-Newtonian shear-thinning behavior at low shear rates, with viscosity decreasing as temperature increased. At higher shear rates, the Herschel–Bulkley model accurately described the oils’ transition to Newtonian behavior. Machine learning models, including CatBoost, LightGBM, and XGBoost, were trained on the experimental data to predict viscosity, with CatBoost and XGBoost showing superior performance. We suggest these findings are valuable for improving the efficiency of oil transportation and processing.
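The Herschel–Bulkley model mentioned above gives the shear stress as tau = tau0 + k·(shear rate)^n, so the apparent viscosity is tau divided by the shear rate. The sketch below evaluates it with invented parameters simply to show the shear-thinning trend (n < 1); the values are not fitted to the paper's oils.

```python
def apparent_viscosity(shear_rate, tau0, k, n):
    """Herschel-Bulkley apparent viscosity:
    tau = tau0 + k * shear_rate**n,  eta = tau / shear_rate."""
    if shear_rate <= 0:
        raise ValueError("shear rate must be positive")
    tau = tau0 + k * shear_rate ** n
    return tau / shear_rate

# Illustrative parameters with n < 1: apparent viscosity
# falls as the shear rate rises (shear thinning).
low_rate = apparent_viscosity(1.0, tau0=0.5, k=0.2, n=0.8)
high_rate = apparent_viscosity(100.0, tau0=0.5, k=0.2, n=0.8)
```

With n approaching 1 and tau0 small relative to k·(shear rate), the ratio flattens toward a constant, matching the transition to Newtonian behavior at high shear rates.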
  • Item
    Development of an expert-informed rig state classifier using naive bayes algorithm for invisible loss time measurement
    (Springer Nature, 2024) Youcefi, Mohamed Riad; Boukredera, Farouk Said; Ghalem, Khaled; Hadjadj, Ahmed; Ezenkwu, Chinedu Pascal
    The rig state plays a crucial role in recognizing the operations carried out by the drilling crew and quantifying Invisible Lost Time (ILT). This lost time, often challenging to assess and report manually in daily reports, results in delays to the scheduled timeline. In this paper, the Naive Bayes algorithm was used to establish a novel rig state classifier. Training data, consisting of a large set of rules, was generated from drilling experts’ recommendations. This dataset was then used to build a Naive Bayes classifier capable of emulating the cognitive processes of skilled drilling engineers and accurately recognizing the actual drilling operation from surface data. The developed model was used to process high-frequency drilling data collected from three wells, aiming to derive the Key Performance Indicators (KPIs) of each drilling crew's efficiency and to quantify the ILT during drilling connections. The results revealed that the established rig state classifier excelled at automatically recognizing drilling operations, achieving a high success rate of 99.747%. The findings of this study offer valuable insights for drillers and rig supervisors, enabling real-time visual assessment of efficiency and prompt intervention to reduce ILT.
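A rule-generated training set like the one described can feed a categorical Naive Bayes classifier along these lines. The surface channels, their discretised values, and the rig states below are hypothetical stand-ins, not the paper's actual rule set.

```python
from collections import Counter, defaultdict

def train_nb(rows, labels):
    """Categorical Naive Bayes with add-one smoothing over
    discretised surface channels (e.g. hookload, rotation, pumps)."""
    prior = Counter(labels)
    cond = defaultdict(Counter)      # (feature index, state) -> value counts
    for row, state in zip(rows, labels):
        for i, v in enumerate(row):
            cond[(i, state)][v] += 1
    return prior, cond, len(labels)

def predict_nb(model, row):
    """Return the rig state with the highest posterior score."""
    prior, cond, n = model
    def score(state):
        p = prior[state] / n
        for i, v in enumerate(row):
            counts = cond[(i, state)]
            p *= (counts[v] + 1) / (sum(counts.values()) + 2)
        return p
    return max(prior, key=score)

# Hypothetical expert rules turned into labelled examples.
rows = [("high", "rotating", "on"), ("high", "rotating", "on"),
        ("low", "stopped", "off"), ("low", "stopped", "off")]
states = ["drilling", "drilling", "connection", "connection"]
model = train_nb(rows, states)
```

The same posterior-scoring step applied sample by sample to high-frequency surface data is what lets the classifier tag each timestamp with an operation and accumulate ILT.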
  • Item
    Classification of Left/Right Hand and Foot Movements from EEG using Machine Learning Algorithms
    (Institute of Electrical and Electronics Engineers Inc, 2023) Cherifi, Dalila; Berghouti, Baha Eddine; Boubchir, Larbi
    In recent years, there has been growing interest in utilizing electroencephalography (EEG) data and machine learning techniques to develop innovative solutions for individuals with disabilities. The ability to accurately classify hand and foot movements from EEG signals holds great potential for enabling individuals to regain control and functionality of their impaired limbs, improving their quality of life and independence, and offering a better solution than traditional ones that often require physical contact or can be challenging to operate. In our study, we focus on (right/left) hand and foot motor disabilities, using supervised machine learning algorithms to classify EEG data related to left/right hand and foot movements, aiming for accurate results that can contribute to a solution for people with such motor disabilities. Three supervised machine learning algorithms are considered for EEG classification, namely Linear Discriminant Analysis (LDA), K-Nearest Neighbors (KNN), and Support Vector Machine (SVM), with the Common Spatial Patterns (CSP) algorithm and the logarithm of the variance (logvar) used for feature extraction. In our experiments, we applied these algorithms to the motor imagery EEG dataset for hand and foot movements from BCI Competition IV. The data went through several steps before being fit to the models, including filtering, feature extraction, and discrimination. We achieved an impressive classification accuracy of up to 97.5% for hand movements with SVM and LDA in the initial experiment. Furthermore, in the multi-class task involving both hand (right/left) and foot movements, the KNN and SVM classifiers yielded commendable results of up to 87%. As future work, these models will be further developed toward a hardware implementation.
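The logvar feature extraction and classification steps can be sketched as below. The nearest-class-mean rule here is only a stripped-down stand-in for LDA, and the toy trials (channels with contrasting variances, as CSP filtering would produce) are invented, so this shows the shape of the pipeline rather than the paper's implementation.

```python
import math
from statistics import mean, pvariance

def logvar_features(trial):
    """Log-variance of each (CSP-filtered) channel in one trial;
    the 'logvar' feature used after spatial filtering."""
    return [math.log(pvariance(ch)) for ch in trial]

def nearest_mean_classifier(train_feats, labels):
    """Simplified stand-in for LDA: classify a feature vector
    by the nearest class-mean feature vector."""
    classes = {}
    for c in set(labels):
        rows = [f for f, y in zip(train_feats, labels) if y == c]
        classes[c] = [mean(col) for col in zip(*rows)]
    def predict(f):
        return min(classes, key=lambda c: math.dist(f, classes[c]))
    return predict

# Toy trials: channel 0 is active for 'left', channel 1 for 'foot'.
left_trial = [[2.0, -2.0, 2.0, -2.0], [0.1, -0.1, 0.1, -0.1]]
foot_trial = [[0.1, -0.1, 0.1, -0.1], [2.0, -2.0, 2.0, -2.0]]
predict = nearest_mean_classifier(
    [logvar_features(left_trial), logvar_features(foot_trial)],
    ["left", "foot"],
)
```

CSP is designed so that class differences concentrate in the variances of a few filtered channels, which is why the log-variance alone is such a discriminative feature.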
  • Item
    Geological mapping using extreme gradient boosting and deep neural networks: application to the Silet area, central Hoggar, Algeria
    (Springer, 2022) Elbegue, Abderrahmane Aref; Allek, Karim; Zeghouane, Hocine
    Nowadays, machine learning algorithms are considered a powerful tool for analyzing big, complex data thanks to their ability to deliver accurate and fast results. The main objective of the present study is to demonstrate the effectiveness of the extreme gradient boosting (XGBoost) method, and of the employed data types, for mapping Saharan regions. To reveal the potential of XGBoost, we conducted two experiments. The first used different combinations of airborne gamma-ray spectrometry data, airborne magnetic data, Landsat 8 data, and a digital elevation model to train 9 XGBoost models and determine each data type's sensitivity in capturing the lithological rock classes. The second compared XGBoost to deep neural networks (DNN) to gauge its potential against other machine learning algorithms. Compared to the existing geological map, the application of XGBoost shows great potential for geological mapping, achieving a correlation score of 78%, with igneous and metamorphic rocks more easily identified than sedimentary rocks. In addition, using different data combinations reveals the utility of airborne magnetic data for discriminating some lithological units, and the potential of the apparent density derived from it to improve the algorithm's accuracy by up to 20%. Furthermore, the second experiment indicates that XGBoost is a better choice than the DNN for the geological mapping task. The predicted map shows that the XGBoost method provides an efficient tool for updating existing geological maps and producing new ones in regions with well-outcropped rocks.
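One plausible reading of the correlation score against the existing map is simple per-pixel agreement between the predicted and reference lithology grids, as in this sketch; the lithology labels are hypothetical and the scoring rule is an assumption, not taken from the paper.

```python
def map_agreement(pred, ref):
    """Fraction of co-registered pixels where the predicted lithology
    class matches the reference geological map."""
    total = hits = 0
    for pred_row, ref_row in zip(pred, ref):
        for p, r in zip(pred_row, ref_row):
            total += 1
            hits += (p == r)
    return hits / total

# Toy 2x2 maps with hypothetical class labels.
predicted = [["igneous", "sedimentary"], ["metamorphic", "sedimentary"]]
reference = [["igneous", "sedimentary"], ["metamorphic", "igneous"]]
score = map_agreement(predicted, reference)
```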
  • Item
    Toward robust models for predicting carbon dioxide absorption by nanofluids
    (John Wiley and Sons Inc, 2022) Nait Amar, Menad; Djema, Hakim; Belhaouari, Samir Brahim; Zeraibi, Noureddine (https://doi.org/10.1002/ghg.2166)
    The application of nanofluids has received increased attention across a number of disciplines in recent years. Carbon dioxide (CO2) absorption using nanofluids as the solvent is among the attractive applications, having recently gained high popularity in various industrial settings. In this work, two robust explicit machine learning (ML) methods, namely the group method of data handling (GMDH) and genetic programming (GP), were implemented to establish accurate correlations for estimating the absorption of CO2 by nanofluids. The correlations were developed using a comprehensive database of 230 experimental measurements. The obtained results revealed that the proposed ML-based correlations can predict the absorption of CO2 by nanofluids with high accuracy. Moreover, the GP-based correlation yielded more precise predictions than the GMDH-based correlation, with an overall coefficient of determination of 0.9914 and an overall average absolute relative deviation of 3.732%. Lastly, the trend analysis carried out confirmed the compatibility of the proposed GP-based correlation with the real physical tendency of CO2 absorption by nanofluids.
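The two reported error metrics, the coefficient of determination (R²) and the average absolute relative deviation (AARD), are standard and can be computed as follows; the toy data below is illustrative, not from the paper's database.

```python
def r_squared(y_true, y_pred):
    """Coefficient of determination: R^2 = 1 - SS_res / SS_tot."""
    m = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - m) ** 2 for t in y_true)
    return 1 - ss_res / ss_tot

def aard_percent(y_true, y_pred):
    """Average absolute relative deviation, in percent:
    AARD = (100 / N) * sum(|(y_true - y_pred) / y_true|)."""
    return 100 * sum(
        abs((t - p) / t) for t, p in zip(y_true, y_pred)
    ) / len(y_true)
```

A correlation with R² near 1 and AARD of a few percent, as reported for the GP-based model (0.9914 and 3.732%), indicates both tight scatter and small average relative error.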