Spectrum of Engineering Sciences
https://thesesjournal.com/index.php/1
<p><strong>Spectrum of Engineering Sciences (SES)</strong> is a refereed international research platform committed to advancing high-quality scholarly work. It is an open-access, online journal that follows a rigorous double-blind peer-review process. SES is published monthly and operates on a continuous publication model.</p> <p>The journal primarily focuses on publishing original research and review articles in <strong>Computer Science</strong> and <strong>Engineering Sciences</strong>. It is launched and managed by the <strong>Sociology Educational Nexus Research Institute (SME-PV)</strong>. With a strong international orientation, SES aims to attract authors and readers from diverse academic and professional backgrounds.</p> <p>At SES, we believe in the value of interdisciplinary collaboration. Bringing together multiple academic disciplines allows for the integration of knowledge across fields, enabling researchers to address complex problems and develop innovative, well-grounded solutions.</p>
SOCIOLOGY EDUCATIONAL NEXUS RESEARCH INSTITUTE
en-US
Spectrum of Engineering Sciences
ISSN 3007-312X
MACHINE LEARNING PREDICTION OF SHRINKAGE CRACKING BEHAVIOUR IN ULTRA-HIGH-PERFORMANCE CONCRETE UNDER RESTRAINED CURING CONDITIONS IN BRIDGE DECK SLABS: A COMPREHENSIVE REVIEW
https://thesesjournal.com/index.php/1/article/view/2645
<p><em>Ultra-high-performance concrete (UHPC) is increasingly utilized in bridge deck slabs due to its superior mechanical properties and durability. However, its high autogenous shrinkage and the resulting risk of early-age cracking, especially under restrained curing conditions, present significant challenges for long-term structural integrity. Recent advances in machine learning (ML) have enabled more accurate prediction and understanding of shrinkage and cracking behaviors in UHPC, facilitating optimized mix designs and mitigation strategies. This review synthesizes over 100 recent studies on ML-based prediction of shrinkage cracking in UHPC bridge decks, focusing on quantitative model performance, influential material parameters, experimental validation, and practical engineering implications. Ensemble models such as XGBoost, Random Forest, and hybrid approaches consistently achieve high predictive accuracy (R² values up to 0.99), with feature importance analyses highlighting the roles of water-to-binder ratio, fiber content, curing regime, and supplementary cementitious materials. The integration of explainable AI methods (e.g., SHAP) has improved model transparency and practical adoption. Despite these advances, challenges remain regarding data scarcity for field-scale applications and the need for robust models that generalize across diverse environmental conditions. This review concludes with recommendations for future research directions to further enhance the reliability and applicability of ML-driven predictions for UHPC bridge infrastructure.</em></p>Dr. M. Adil Khan, Muhammad Mudassir Ramzan, Tawheed Ullah, Buland Iqbal, Muhammad Waqar Naseer, Muazzam Nawaz
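Editor's note: the feature-importance rankings cited above are usually computed with SHAP or permutation methods on a trained ensemble. The sketch below illustrates the simpler permutation-importance idea in pure Python; the surrogate model, its coefficients, and the feature ranges are invented for illustration and are not taken from any of the reviewed studies.

```python
import random

def permutation_importance(model, X, y, feature_idx, n_repeats=10, seed=0):
    """Mean increase in squared error when one feature column is shuffled:
    a larger increase means a more influential feature."""
    rng = random.Random(seed)

    def mse(rows):
        return sum((model(r) - t) ** 2 for r, t in zip(rows, y)) / len(y)

    base = mse(X)
    scores = []
    for _ in range(n_repeats):
        col = [row[feature_idx] for row in X]
        rng.shuffle(col)  # break the feature's link to the target
        shuffled = [
            row[:feature_idx] + [v] + row[feature_idx + 1:]
            for row, v in zip(X, col)
        ]
        scores.append(mse(shuffled) - base)
    return sum(scores) / n_repeats

def surrogate(r):
    # Invented linear response: shrinkage driven mostly by w/b ratio
    # (feature 0), weakly by fibre content (feature 1).
    return 1000 * r[0] - 15 * r[1]

rng = random.Random(1)
X = [[rng.uniform(0.14, 0.25), rng.uniform(0.0, 3.0)] for _ in range(50)]
y = [surrogate(r) for r in X]
```

Under this toy surrogate, shuffling the w/b column degrades predictions far more than shuffling the fibre column, reproducing the kind of ranking the review reports.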
Copyright (c) 2026
2026-05-04 | 2026-05-04 | Vol. 45 | pp. 1–11
DEVELOPMENT OF FPGA-BASED FRACTIONAL-ORDER PID CONTROLLER MODULE FOR AUTONOMOUS ROBOTS FOR INDUSTRY 4.0 APPLICATIONS
https://thesesjournal.com/index.php/1/article/view/2646
<p><em>The fractional-order PID (FOPID) controller plays an important role in various industrial applications. The FOPID is an extension of the conventional PID controller based on fractional calculus. PID controllers regulate temperature, flow, pressure, speed, and other process variables in industrial control systems. Therefore, this research designs and implements a real-time FPGA-based FOPID controller, which is challenging with respect to memory usage and energy consumption. Thus, this research uses a hybridized fixed- and floating-point approach using MATLAB and Xilinx Vivado. The proposed design is realized on a Zynq-7000 FPGA board, and performance is improved in terms of dynamic range, speed, resource utilization, efficiency, and energy consumption.</em></p>Imran Mir Chohan, Irfan Ahmed, Sadam Hussain Soomro, Nabeela Abdul Rasheed, Aijaz Ali Laghari
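Editor's note: a FOPID law u(t) = Kp·e(t) + Ki·I^λ e(t) + Kd·D^μ e(t) is commonly discretized with the Grünwald–Letnikov approximation. The sketch below shows that standard discretization in pure Python; it is a generic illustration, not the paper's fixed/floating-point FPGA implementation, and the gain names are assumptions.

```python
def gl_coefficients(alpha, n):
    """Grünwald–Letnikov coefficients c_k = (-1)^k * C(alpha, k),
    via the stable recurrence c_k = c_{k-1} * (1 - (alpha + 1) / k)."""
    c = [1.0]
    for k in range(1, n):
        c.append(c[-1] * (1.0 - (alpha + 1.0) / k))
    return c

def gl_fractional_derivative(samples, alpha, dt):
    """Approximate D^alpha of a sampled signal at its last sample:
    D^alpha x(t_n) ~ dt^(-alpha) * sum_k c_k * x(t_{n-k}).
    A negative alpha yields the fractional integral."""
    c = gl_coefficients(alpha, len(samples))
    acc = sum(ck * x for ck, x in zip(c, reversed(samples)))
    return acc / dt ** alpha

def fopid_output(errors, dt, kp, ki, kd, lam, mu):
    """u = Kp*e + Ki*I^lam(e) + Kd*D^mu(e); the fractional integral is
    computed as a fractional derivative of negative order."""
    e = errors[-1]
    integ = gl_fractional_derivative(errors, -lam, dt)
    deriv = gl_fractional_derivative(errors, mu, dt)
    return kp * e + ki * integ + kd * deriv
```

For α = 1 the coefficients collapse to [1, −1, 0, …], so the scheme reduces to the ordinary backward difference, which is a quick sanity check on the recurrence.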
Copyright (c) 2026
2026-05-04 | 2026-05-04 | Vol. 45 | pp. 12–26
AI-DRIVEN CYBER THREAT INTELLIGENCE FOR CRITICAL INFRASTRUCTURE PROTECTION IN PAKISTAN: A DEEP LEARNING APPROACH
https://thesesjournal.com/index.php/1/article/view/2647
<p><em>The increasing digitization of critical infrastructure systems in Pakistan has significantly expanded the attack surface for sophisticated cyber threats, including advanced persistent threats, ransomware, and zero-day exploits. Traditional rule-based cybersecurity mechanisms are increasingly insufficient to address these evolving and complex threats. This study proposes an AI-driven Cyber Threat Intelligence (CTI) framework based on deep learning techniques to enhance the protection of critical infrastructure. The proposed model integrates Convolutional Neural Networks (CNN), Long Short-Term Memory (LSTM) networks, and Autoencoders to enable real-time anomaly detection, threat classification, and predictive analysis. A quantitative experimental design was employed using benchmark cybersecurity datasets and simulated critical infrastructure environments. The results demonstrate that the hybrid deep learning model outperforms traditional machine learning and signature-based approaches, achieving higher detection accuracy and lower false positive rates. The findings confirm that AI-based CTI significantly improves cybersecurity resilience, enabling proactive threat mitigation in high-risk environments. The study contributes to advancing intelligent cybersecurity frameworks and provides practical implications for strengthening national cyber defense systems in Pakistan.</em></p>Muhammad Shahbaz, Syed Ahmed Ali, Sameen Amjad, Rida Zafar, Muhammad Irfan Aslam
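Editor's note: the autoencoder branch of such frameworks flags inputs that reconstruct poorly. The deliberately simplified sketch below illustrates only that scoring principle, with a per-feature mean profile standing in for the trained network; it is not the paper's CNN–LSTM–Autoencoder model.

```python
import math

def fit_profile(normal_traffic):
    """Per-feature mean of benign samples -- a crude stand-in for an
    autoencoder's learned 'reconstruction' of normal traffic."""
    dim = len(normal_traffic[0])
    n = len(normal_traffic)
    return [sum(row[i] for row in normal_traffic) / n for i in range(dim)]

def reconstruction_error(profile, sample):
    """Euclidean distance between a sample and its 'reconstruction'."""
    return math.sqrt(sum((p - s) ** 2 for p, s in zip(profile, sample)))

def detect(profile, samples, threshold):
    """Flag samples whose reconstruction error exceeds the threshold."""
    return [reconstruction_error(profile, s) > threshold for s in samples]
```

A real deployment would learn a non-linear reconstruction and calibrate the threshold on validation data; the flag-on-large-error logic, however, is the same.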
Copyright (c) 2026
2026-05-04 | 2026-05-04 | Vol. 45 | pp. 27–38
DEEN TRACKER: TECHNOLOGY-ENHANCED ISLAMIC PRACTICE MONITORING AND AI-BASED RELIGIOUS ACTIVITY TRACKER
https://thesesjournal.com/index.php/1/article/view/2648
<p><em>The rapid pace of digital development has transformed how individuals access information related to religion, education, and lifestyle. A portal that not only contributes to spiritual development but also uses technological advances to enhance study, interaction, and the regularity of prayer is increasingly significant in the context of Islamic worship. Deen Tracker is an innovative, AI-powered mobile application designed to meet these needs, offering users of all ages a rich set of features. The application contains Islamic education programmes such as Ghazwat Explorer, with summaries and videos; Interactive Quranic Learning; a Niyyah Tracker that offers AI-based niyyah guidance; Islamic quizzes; and stories for children. Deen Tracker also supports the Islamic upbringing and education of children through age-specific instruction in dua, Salah, and Wudu, reading material, and animated stories. Likewise, Hijab Cam uses face-shape recognition and lessons to suggest a suitable style of hijab, and a Qibla Finder provides the correct direction of prayer without requiring any separate digital compass. An Islamic digital calendar, a prayer-times module, and intelligent alerts are further features of Deen Tracker that ensure hassle-free worship and life management. Deen Tracker is an integrated application combining AI personalization and learning, prayer support, emotional counselling, and practices suited to contemporary Muslims. It makes worship easy and engaging for Muslims while demonstrating how modern technologies can be effectively employed to promote education and spirituality in the modern world.</em></p>Meerub Akhtar, Qoseen Zahra, Khadija Ishaq, Laiba Jabeen, Ateeb Ul Rahman
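Editor's note: a Qibla finder of the kind described typically computes the initial great-circle bearing from the user's location to the Kaaba (≈21.4225° N, 39.8262° E). The sketch below is the standard bearing formula in pure Python, shown as a generic illustration and not as Deen Tracker's actual code.

```python
import math

KAABA_LAT, KAABA_LON = 21.4225, 39.8262  # Kaaba coordinates (degrees)

def qibla_bearing(lat, lon):
    """Initial great-circle bearing (degrees clockwise from true north)
    from (lat, lon) to the Kaaba, on a spherical Earth."""
    phi1, phi2 = math.radians(lat), math.radians(KAABA_LAT)
    dlon = math.radians(KAABA_LON - lon)
    x = math.cos(phi2) * math.sin(dlon)
    y = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return math.degrees(math.atan2(x, y)) % 360.0
```

On a phone, this bearing is compared against the magnetometer heading; from a point due south of the Kaaba the function returns 0° (due north), which is a convenient sanity check.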
Copyright (c) 2026
2026-05-04 | 2026-05-04 | Vol. 45 | pp. 39–59
OPTION PRICING IN THE LIGHT OF ISLAMIC VISION FOR PAKISTAN STOCK EXCHANGE
https://thesesjournal.com/index.php/1/article/view/2649
<p><em>The Black–Scholes model is widely used to describe the behaviour of options trading in financial markets. In this paper we address the Islamic Al-Arboun (down-payment) contract, which is similar to conventional options trading. We propose a modified version of the two-dimensional time-fractional Black–Scholes partial differential equation with two assets, built on a combination of the finite-volume method for unsteady flow and a numerical scheme. The work compares the analytical solution of the European call option with the numerical solution of the modified (unsteady) finite-volume formulation. Through mathematical analysis it is established that the explicit finite-volume scheme is unconditionally stable. After analysing the conceptual and legal differences between conventional and Islamic (Al-Arboun) options trading, we conclude that Al-Arboun could be a shari'a-compliant alternative to the conventional European call option.</em></p>Imran Khan, Shahid Khan, Anum Zaib, Ibad Ur Rehman
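Editor's note: for orientation, the classical (integer-order, single-asset) Black–Scholes European call price that the paper's fractional model extends has a well-known closed form, evaluable with only the standard library's error function. This sketch is the textbook formula, not the paper's fractional finite-volume scheme.

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bs_call(S, K, r, sigma, T):
    """Classical Black-Scholes price of a European call:
    C = S*N(d1) - K*exp(-rT)*N(d2)."""
    d1 = (math.log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S * norm_cdf(d1) - K * math.exp(-r * T) * norm_cdf(d2)

# Standard test case: S=100, K=100, r=5%, sigma=20%, T=1 year -> C ~ 10.45
price = bs_call(100.0, 100.0, 0.05, 0.2, 1.0)
```

Numerical schemes such as the paper's finite-volume method are typically validated against this closed form in the non-fractional limit.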
Copyright (c) 2026 Spectrum of Engineering Sciences
2026-05-04 | 2026-05-04 | Vol. 45 | pp. 60–73
SMART INFRASTRUCTURE DEVELOPMENT: INTEGRATING DIGITAL TECHNOLOGIES INTO CIVIL ENGINEERING PRACTICES
https://thesesjournal.com/index.php/1/article/view/2655
<p><em>Smart infrastructure development represented a significant advancement in civil engineering through the integration of digital technologies such as artificial intelligence, Internet of Things, and building information modeling. This study examined the impact of digital technology integration on infrastructure performance, operational efficiency, and sustainability outcomes. A quantitative research design was employed, and data were collected from a sample of 220 civil engineering professionals using a structured questionnaire. Statistical analysis included descriptive statistics, correlation, and regression analysis. The results indicated high mean values for digital technology integration (M = 4.08, SD = 0.67), smart infrastructure development (M = 3.95, SD = 0.71), operational efficiency (M = 3.88, SD = 0.69), and sustainability outcomes (M = 3.82, SD = 0.73). Regression findings revealed that digital technology integration significantly influenced smart infrastructure development (β = 0.69, p < 0.001), operational efficiency (β = 0.64, p < 0.001), and sustainability outcomes (β = 0.60, p < 0.001). The study demonstrated that digital technologies enhanced project performance, improved resource utilization, and supported sustainable infrastructure development. Challenges such as implementation cost, technical skill gaps, and data management issues remained key concerns. The study provided practical implications for policymakers and industry professionals to promote digital transformation and achieve efficient and sustainable infrastructure systems.</em></p>Muhammad Bilal Israr*, Noor Thair Abdal Wahid, Pashtoon Ahmad Rayan
Copyright (c) 2026 Spectrum of Engineering Sciences
2026-05-04 | 2026-05-04 | Vol. 45 | pp. 74–86
GENERATIVE AI REVOLUTION IN CYBERSECURITY: A COMPREHENSIVE REVIEW OF THREAT INTELLIGENCE AND OPERATIONS
https://thesesjournal.com/index.php/1/article/view/2658
<p><em>The rapid advancement of digital technologies has exposed significant limitations in traditional cybersecurity frameworks, creating an urgent demand for intelligent and adaptive security solutions. This study examines the transformative role of Generative Artificial Intelligence (GAI) in modern cybersecurity frameworks. This review highlights how GAI technologies, including Large Language Models (LLMs) and Generative Adversarial Networks (GANs), enhance threat intelligence by enabling real-time data analysis, anomaly detection, and automated incident response. The study emphasizes the ability of generative models to identify novel threats, simulate cyberattacks, and support proactive defense strategies. Furthermore, GAI contributes to operational efficiency by reducing human workload and improving decision-making processes in security operations centers. However, the paper also critically discusses the emerging risks associated with generative AI, particularly its misuse in developing advanced malware, phishing attacks, and deepfake-based cybercrimes. Challenges such as high computational cost, model inaccuracies, and ethical concerns are also explored. The findings suggest that while GAI significantly strengthens cybersecurity capabilities, its dual-use nature requires balanced implementation, robust governance, and continuous monitoring. Overall, the study provides a comprehensive understanding of how generative AI is reshaping threat intelligence and cybersecurity operations in the digital era.</em></p> <p><strong>Keywords: </strong><em>Generative Artificial Intelligence, Cybersecurity Threat Intelligence, Anomaly Detection, Adversarial Attacks, Large Language Models, Incident Response.</em></p> <p><em> <a href="https://doi.org/10.5281/zenodo.20213458" target="_blank" rel="noopener">https://doi.org/10.5281/zenodo.20213458</a></em></p>Muhammad Irfan Aslam
Copyright (c) 2026 Spectrum of Engineering Sciences
2026-05-04 | 2026-05-04 | Vol. 45 | pp. 87–96
THE ROAD TO DECARBONIZING PAKISTAN'S CONSTRUCTION SECTOR
https://thesesjournal.com/index.php/1/article/view/2665
<p><em>Pakistan’s building sector consumes over 40% of national energy, yet low-carbon building (LCB) adoption remains critically low. This study identifies key barriers and develops a contextual framework to accelerate LCB implementation. A mixed-methods design was used: a survey of 153 construction professionals and semi-structured interviews with ten industry experts across major urban centers. Based on empirical evidence, the study proposes a Three-Pillar Low-Carbon Building Design (LCBD) Framework: (1) Policy and Regulatory Foundation (mandatory codes, financial incentives, institutional strengthening); (2) Technical Capacity Development (professional training, curriculum reform, knowledge sharing); and (3) Market Transformation (awareness campaigns, demonstration projects, supply chain development). Complementary outputs include passive design templates for 5 and 10 Marla houses, an Energy-to-Mortgage model to improve affordability, and a 10-year implementation roadmap. The study concludes that overcoming Pakistan’s regulatory vacuum and capacity deficits requires coordinated action. With mandatory codes, green financing, and systemic educational reform, the construction sector can transition from a major carbon emitter to a cornerstone of sustainable development.</em></p>Rohan Ahmed, Sumaira Ismail, Kazi Omer Sadik, Shanza Abdul Razzak, Muhammad Faisal Ahmed, Noor Fatima, Yawuz Sohail
Copyright (c) 2026
2026-05-05 | 2026-05-05 | Vol. 45 | pp. 97–112
SPATIO–TEMPORAL ANALYSIS OF LAND USE/LAND COVER DYNAMICS AND ITS IMPACT ON LAND SURFACE TEMPERATURE USING GEOSPATIAL TECHNIQUES: A CASE STUDY OF MARDAN, PAKISTAN
https://thesesjournal.com/index.php/1/article/view/2666
<p><em>Urbanization is presently a worldwide phenomenon. Pakistan, like many other South Asian nations, is experiencing rapid urbanization, with an annual growth rate of 3%. The consequences of urbanization for the climate and environment are critical for the country's natural resource management. One of the most significant aspects of land use change is the relationship between urbanization and the decline of agriculture that accompanies economic growth. Furthermore, analyzing dynamic changes in land use is necessary for modelling future land use change. This research examines projected land use and land cover in the city of Mardan for the year 2050. Landsat images for the years 1990, 2000, 2010, and 2022 were used in this study. The images were used to determine land surface temperature (LST), map changes in land use/land cover, and derive indices such as NDVI, NDBaI, NDBI, UI, and NDWI. Changes are evident in built-up and agricultural areas, and water bodies and barren land are also affected. Agriculture accounted for 51% of the land cover in 1990, dropping to 40% by 2022. From 1990 to 2022, built-up area increased from 0.97 percent to 8.01 percent. The overall classification accuracy of the images was between 89 and 90 percent. The LULC model has a significant impact on the projected temperature variation. Additionally, a probability transition image was produced using the Markov model, forecasting the LULC transition up to 2050, which shows a 35 percent decline in agricultural land and a 136 sq. km rise in built-up area. LST can be used to reflect the effect of a transition in the LULC model. According to seven separate yearly images acquired by the Landsat thermal band, the average maximum temperature was 40 degrees Celsius in 1990, rising to 46 degrees Celsius in 2022. LST was analyzed using linear regression with NDVI, NDBI, NDWI, UI, and NDBaI. The study found that NDVI had a negative relationship with LST: LST rises as plant cover decreases. LST had a high positive correlation with NDBI, NDBaI, UI, and NDWI. As a result, immediate steps must be taken to limit rapid urban expansion in order to minimize damage to the environment, natural resources, and biodiversity by managing the evolution of land surface temperature.</em></p>Umair Aftab Choudary, Attiq Ur Rahman Faridi, Maryam Khalid, Ayesha Javed, Manzer Javed Sindhu, Muhammad Ishfaq, Sobia Rani
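Editor's note: the Markov projection step described above amounts to repeatedly applying a row-stochastic transition-probability matrix to the current land-cover proportions. The sketch below illustrates that mechanism with invented class names and per-decade probabilities; it does not use the study's calibrated matrix.

```python
def project_lulc(proportions, transition, steps):
    """Propagate land-cover class proportions through a row-stochastic
    transition matrix for a number of time steps (Markov assumption)."""
    classes = list(proportions)
    state = dict(proportions)
    for _ in range(steps):
        nxt = {c: 0.0 for c in classes}
        for src in classes:
            for dst in classes:
                nxt[dst] += state[src] * transition[src][dst]
        state = nxt
    return state

# Hypothetical per-decade transition probabilities (each row sums to 1).
T = {
    "agriculture": {"agriculture": 0.85, "built_up": 0.12, "other": 0.03},
    "built_up":    {"agriculture": 0.01, "built_up": 0.97, "other": 0.02},
    "other":       {"agriculture": 0.05, "built_up": 0.05, "other": 0.90},
}
start = {"agriculture": 0.40, "built_up": 0.08, "other": 0.52}
proj = project_lulc(start, T, steps=3)  # e.g., three decades ahead
```

Because every row of the matrix sums to one, the projected proportions still sum to one, and with these illustrative rates agriculture shrinks while built-up area grows, mirroring the trend reported for Mardan.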
Copyright (c) 2026
2026-05-05 | 2026-05-05 | Vol. 45 | pp. 113–140
A ROBUST PREPROCESSING AND FEATURE SELECTION FRAMEWORK TO ENHANCE HEART DISEASE PREDICTION ACCURACY
https://thesesjournal.com/index.php/1/article/view/2668
<p>Heart disease is now the leading health issue in the world and requires proper measures to diagnose and prevent it at an early stage. The paper introduces statistical and machine learning methods for forecasting heart disease by analyzing vital health indicators and lifestyle factors. To create a predictive framework, the University of California, Irvine (UCI) Heart Disease Dataset, comprising patient-specific characteristics, is used. Three classification models, Logistic Regression, K-Nearest Neighbors (KNN), and Random Forest, are compared in terms of predictive performance. The research methodology can be divided into two stages: (1) identification of the most important clinical characteristics that indicate cardiovascular risk, and (2) evaluation of model accuracy on the data. The results indicate that machine learning and data mining tools can be used to diagnose and prevent cardiovascular diseases promptly.</p>Aqib Mehmood, Hajar Bendaoud, Muhammad Ghaos Baksh UVES, Attiq Ullah, Mohsin Mahmood, Mubashir Zainoor Salman Ali Khan
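Editor's note: of the three models compared, K-Nearest Neighbors is simple enough to sketch in a few lines. The example below shows the core distance-and-vote logic on invented toy features; it is illustrative only and does not use the UCI dataset.

```python
import math
from collections import Counter

def knn_predict(train_X, train_y, x, k=3):
    """Classify x by majority vote among the k nearest training points
    (Euclidean distance)."""
    dists = sorted(
        (math.dist(row, x), label) for row, label in zip(train_X, train_y)
    )
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]

# Toy features (invented): (age scaled, cholesterol scaled);
# label 1 = disease, 0 = healthy.
X = [(0.2, 0.1), (0.3, 0.2), (0.8, 0.9), (0.9, 0.8), (0.7, 0.7), (0.1, 0.3)]
y = [0, 0, 1, 1, 1, 0]
```

In practice the features would first pass through the paper's preprocessing and feature-selection stage, since KNN is sensitive to feature scaling.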
Copyright (c) 2026 Spectrum of Engineering Sciences
2026-05-05 | 2026-05-05 | Vol. 45 | pp. 141–152
EMPIRICAL INSIGHTS INTO VR IMPLEMENTATION CHALLENGES AND PRACTICES FOR CHEMISTRY EDUCATION
https://thesesjournal.com/index.php/1/article/view/2669
<p><strong>Context</strong></p> <p><em>Virtual Reality (VR) has evolved into a vital tool in education and can substantially improve science teaching, particularly Chemistry education. In a computer-simulated environment, students can safely perform risky experiments, watch chemical reactions in detail, and explore challenging concepts. This interactivity helps students grasp fundamental concepts and lets them perform experiments that may not be feasible in a traditional lab. Even with these advantages, there are still many challenges to using VR in labs and classrooms. These include the high cost of VR equipment, the difficulty of blending VR into existing educational environments, the lack of technical know-how among many teachers, software constraints, and the reluctance of schools to adopt new technology.</em></p> <p><strong><em>Objectives</em></strong></p> <p><em>The aim of this study is to identify implementation challenges and strategies associated with the use of Virtual Reality in Chemistry education via a Systematic Literature Review (SLR) and a questionnaire survey. The process will explain how research studies identify barriers to Virtual Reality adoption and which methods have worked well to overcome them.</em></p> <p><strong><em>Anticipated Results</em></strong><em> <br>The anticipated outcomes include:<br>(1) A structured list of the challenges in implementing Virtual Reality in practice. <br>(2) A synthesis of mitigation strategies. <br>(3) A conceptual evaluation of the challenges and their solutions. <br>(4) Data that will help organizations, educators, and students plan the future use of Virtual Reality.</em></p>Mahboob Ur Rahman, Muhammad Salam, Haseena Noureen, Shah Khalid, Muhammad Fawad
Copyright (c) 2026
2026-05-05 | 2026-05-05 | Vol. 45 | pp. 153–162
PREDICTIVE HEART HEALTH ANALYSIS: MACHINE LEARNING WITH THE CARDIOVASCULAR DISEASE DATASET
https://thesesjournal.com/index.php/1/article/view/2671
<p><em>Cardiovascular diseases (CVDs) are the leading cause of global mortality, requiring accurate prediction systems for early detection and prevention. This study investigates predictive modeling of CVD risk using two benchmark datasets: Dataset 1 (Kaggle Cardiovascular Disease Risk Prediction, 70,000 records with demographic, clinical, and lifestyle features) and Dataset 2 (Early Medical Risk Dataset, 65,535 samples with clinical symptoms and risk factors). Two deep learning approaches were implemented and compared: a Deep Neural Network (DNN) baseline and a Transformer-based model tailored for tabular healthcare data. The DNN achieved consistent results with accuracies of 85.3% (Dataset 2) and ~90% (Dataset 1), demonstrating balanced precision and recall but limited ability to capture complex feature dependencies. In contrast, the Transformer achieved superior performance, recording precision and recall above 99% with an ROC-AUC of 0.999 on Dataset 2, and consistently higher metrics on Dataset 1. These results confirm that attention-based architectures are more effective in modeling non-linear, interdependent risk factors, offering near-perfect classification outcomes. The findings demonstrate that integrating advanced deep learning models with structured clinical datasets can significantly improve cardiovascular risk prediction, supporting clinical decision-making by reducing misclassification rates and enabling timely, personalized healthcare interventions.</em></p>Sana Cheema, Akkasha Latif, Hafiz Farrukh Abbas, Shoaib Ahmed, Qandeel Nasir, Aisha Tariq Khan
Copyright (c) 2026
2026-05-05 | 2026-05-05 | Vol. 45 | pp. 163–176
MODELING AND ANALYSIS OF A CO-AUTHORSHIP SYSTEM USING COMPLEX NETWORK APPROACH: A CASE STUDY
https://thesesjournal.com/index.php/1/article/view/2673
<p><em>Complex systems across various domains including biological, social, environmental, technological, communication, and transportation can be effectively modeled as complex networks. These networks are typically large and intricate due to the vast number of nodes and interconnections among them. This study applies a network science approach to model and analyze co-authorship networks derived from the Journal of Lightwave Technology (JLT). The analysis is conducted using key network metrics such as degree centrality, clustering coefficient, and betweenness centrality. The results indicate that the network exhibits high clustering, short average path lengths, and an inhomogeneous distribution of weighted degree. Furthermore, the findings reveal the presence of influential authors acting as hubs who frequently appear across multiple issues.</em></p>Altaf Hussain Abro, Saria Abbasi, Inayatuallah Samoon, Shahmurad Chandio, Asif Jamali
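Editor's note: two of the metrics named above are straightforward to compute on an adjacency-list graph. The sketch below uses a small made-up co-authorship graph, not the JLT data, and omits betweenness centrality for brevity.

```python
from itertools import combinations

def degree_centrality(adj):
    """Degree of each node normalized by (n - 1)."""
    n = len(adj)
    return {v: len(nbrs) / (n - 1) for v, nbrs in adj.items()}

def clustering_coefficient(adj, v):
    """Fraction of pairs of v's neighbours that are themselves linked."""
    nbrs = adj[v]
    if len(nbrs) < 2:
        return 0.0
    links = sum(1 for a, b in combinations(nbrs, 2) if b in adj[a])
    return links / (len(nbrs) * (len(nbrs) - 1) / 2)

# Hypothetical co-authorship graph: an edge means two authors co-wrote
# at least one paper.
G = {
    "A": {"B", "C", "D"},
    "B": {"A", "C"},
    "C": {"A", "B"},
    "D": {"A"},
}
```

Here author "A" is a hub (degree centrality 1.0) with clustering 1/3, since only one of the three pairs of A's co-authors have themselves collaborated.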
Copyright (c) 2026 Spectrum of Engineering Sciences
2026-05-05 | 2026-05-05 | Vol. 45 | pp. 177–183
FEDERATED HYBRID DEEP LEARNING FOR NETWORK ANOMALY DETECTION WITH ADAPTIVE RESOURCE OPTIMIZATION
https://thesesjournal.com/index.php/1/article/view/2674
<p><em>The rapid growth of distributed systems and networked environments has increased the complexity of real-time traffic analysis and management. Traditional centralized approaches face limitations related to latency, scalability, and data privacy. This study proposes a federated hybrid deep learning framework for network anomaly detection combined with adaptive resource optimization. The model integrates Convolutional Neural Networks (CNN), Bidirectional Long Short-Term Memory (BiLSTM), and Autoencoders to capture spatial, temporal, and reconstruction-based anomaly patterns. A federated learning strategy using Federated Averaging (FedAvg) enables decentralized training across multiple edge devices with non-IID data distribution, preserving data privacy. Additionally, a Deep Q-Network (DQN) is employed to dynamically optimize network resource allocation based on detected anomalies and traffic conditions. The framework is evaluated using the UNSW-NB15 dataset and compared with traditional machine learning models and centralized deep learning approaches. Results demonstrate improved detection performance and efficient resource utilization, making the proposed system suitable for real-world distributed network environments.</em></p>Samad Khan, Anfal Younas, Siyal Ahmad, Muhammad Rehan Khan, Maaz Anwar, Nizar Ahmad
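Editor's note: the FedAvg aggregation step at the heart of such frameworks is simply a sample-count-weighted average of the clients' parameter vectors. The sketch below is a minimal pure-Python illustration of that step with invented numbers, not the paper's CNN–BiLSTM implementation.

```python
def fed_avg(client_weights, client_sizes):
    """Federated Averaging: combine per-client parameter vectors into a
    global vector, weighting each client by its local sample count."""
    total = sum(client_sizes)
    dim = len(client_weights[0])
    global_w = [0.0] * dim
    for w, n in zip(client_weights, client_sizes):
        for i in range(dim):
            global_w[i] += (n / total) * w[i]
    return global_w

# Two edge devices with non-IID data: 300 and 100 local samples.
clients = [[1.0, 2.0], [3.0, 6.0]]
sizes = [300, 100]
global_model = fed_avg(clients, sizes)  # -> [1.5, 3.0]
```

The weighting keeps clients with more data from being drowned out by small ones, while raw gradients and samples never leave the edge devices, which is what preserves privacy.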
Copyright (c) 2026 Spectrum of Engineering Sciences
2026-05-05 | 2026-05-05 | Vol. 45 | pp. 184–194
FRESH AND HARDENED CONCRETE COMBINED WITH LIMESTONE FINES AS A SUBSTITUTE FOR CEMENT
https://thesesjournal.com/index.php/1/article/view/2675
<p><em>Cement manufacture emits substantial quantities of carbon dioxide, significantly affecting the climate, while also requiring considerable energy. Moreover, the disposal and recycling of conventional concrete constituents may result in environmental deterioration. Utilizing waste materials in concrete reduces both cement production and energy consumption. This research assesses the characteristics of fresh and hardened concrete when cement is partly substituted with limestone fines (LSF) at proportions of 0%, 5%, 10%, 15%, and 20% by weight of cement. A total of 30 concrete samples were prepared using a mix ratio of 1:1.5:3. Cube specimens were evaluated for compressive strength and density at 28 days. The optimal result indicated that compressive strength was enhanced by 10.75% when 10% LSF was used as a cement replacement after 28 days. The slump value and density of the concrete decreased with increasing LSF content.</em></p> <p><strong>Keywords: </strong>Limestone Fines, Concrete, Slump Test, Density, Compressive Strength.</p>Zuhairuddin, Iqra Wahid Lakhiar, Dileep Kumar, Shafique Ahmed
Copyright (c) 2026 Spectrum of Engineering Sciences
2026-05-05 | 2026-05-05 | Vol. 45 | pp. 195–200
EXPERIMENTAL INVESTIGATION OF SUSTAINABLE CONCRETE SLENDER BEAM UTILIZING FINE COAL BOTTOM ASH AND GROUNDED COIR FIBRE
https://thesesjournal.com/index.php/1/article/view/2682
<p>The current study represents an attempt to develop eco-friendly structural concrete using different percentages of fine coal bottom ash (FCBA) as a partial cement replacement and grounded coir fiber (GCF) as filler. X-ray diffraction (XRD) and key testing were performed to investigate the chemical composition, workability, and splitting tensile strength of the slender concrete beam. The XRD results revealed that the optimal addition of FCBA and GCF to the mix can significantly promote pozzolanic activity and create a binding effect, leading to an improvement in splitting tensile strength of 13.61%. The addition of GCF to the concrete mix improved the workability of the developed mix by 20.90%. Furthermore, with the addition of 10% FCBA and 2% GCF, the flexural strength under the four-point bending test showed a substantial improvement of 24.50%.</p> <p>Keywords: Eco-friendly concrete; Fine coal bottom ash; Grounded coir fiber; Mechanical properties</p>Sadaquat Hussain, Muhmmad Farooq, Muneer Ahmed, Nizakat Ali
Copyright (c) 2026 Spectrum of Engineering Sciences
2026-05-06 | 2026-05-06 | Vol. 45 | pp. 201–210
AVERAGING PRINCIPLE FOR IMPULSIVE IMPLICIT STOCHASTIC FRACTIONAL DIFFERENTIAL EQUATIONS
https://thesesjournal.com/index.php/1/article/view/2683
<p><em>This paper formulates an averaging principle for a family of impulsive implicit stochastic fractional differential equations in the Caputo sense, in which the fractional order of the derivative is not fixed but varies in response to a Poisson process. Such models arise in viscoelastic systems with random shocks and hereditary properties. The first result of our work is the existence and uniqueness of mild solutions to the non-Lipschitz equations under local integrability assumptions on the coefficients. The main result is that the original impulsive fractional system can be approximated by an averaged system that has no impulses and no implicit structure, with explicit error bounds in the mean-square sense. The proof is based on fractional calculus, stopping-time techniques, and the Burkholder–Davis–Gundy inequality to control the stochastic integral terms. We find that the solution of the original equation converges to the solution of the averaged equation on finite time intervals as the small parameter ε goes to 0, with a rate proportional to ε^(α−1/2). The theoretical results are confirmed by a numerical example simulating a fractional RLC circuit with random switchings: the averaged equation requires 73% less computation time while keeping the mean-square error below 4.2. Our results generalize classical averaging theory to fractional-order systems with impulses and implicit dependence, not covered by the existing literature. The technique can be applied to stochastic control problems, jump-process finance, and the mechanics of materials with memory effects, and can be used to simplify models of more complex stochastic fractional systems arising in engineering and physics.</em></p>Imtiaz Hussain, Raheem Bux Shaikh, Abdul Qayoom, Israr Ahmed
Copyright (c) 2026
2026-05-06 | 2026-05-06 | Vol. 45 | pp. 211–221
NUMERICAL ANALYSIS OF A PREDATOR–PREY MODEL WITH STAGE STRUCTURE AND SENSITIVITY INVESTIGATION
https://thesesjournal.com/index.php/1/article/view/2686
<p><em>In this paper, we analyze a stage-structured predator–prey model with juvenile and adult classes in both prey and predator populations. The model is formulated as a system of nonlinear ordinary differential equations describing biological processes such as growth, maturation, predation, and natural death. The long-term dynamics of the populations are studied by obtaining the equilibrium points of the system and analyzing their stability. We obtain the basic reproduction number R₀ by the next-generation matrix method as a threshold parameter for population persistence. A sensitivity analysis of R₀ is performed to study the impact of the key parameters. The results show that the prey growth rate has a significant positive effect on R₀ and the predation rate has a significant negative effect. Moreover, numerical simulations are conducted using the fourth-order Runge–Kutta (RK4) method to verify the analytical results. The simulations show the time evolution of all population classes and confirm the theoretical findings about the stability and behavior of the system. The study identifies the most important parameters governing the dynamics of the predator–prey system.</em></p>Rabia, Sasui, Dr Ram Chand, Lubna Naz
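Editor's note: the RK4 scheme used for the simulations is a generic integrator for systems of ODEs. The sketch below implements the classical step and applies it to a plain two-species Lotka–Volterra system with made-up rates; the paper's actual model has four stage-structured classes, which would slot into the same integrator.

```python
def rk4_step(f, t, y, h):
    """One classical fourth-order Runge-Kutta step for y' = f(t, y),
    where y is a list of state variables."""
    k1 = f(t, y)
    k2 = f(t + h / 2, [yi + h / 2 * ki for yi, ki in zip(y, k1)])
    k3 = f(t + h / 2, [yi + h / 2 * ki for yi, ki in zip(y, k2)])
    k4 = f(t + h, [yi + h * ki for yi, ki in zip(y, k3)])
    return [
        yi + h / 6 * (a + 2 * b + 2 * c + d)
        for yi, a, b, c, d in zip(y, k1, k2, k3, k4)
    ]

def lotka_volterra(t, y, a=1.0, b=0.5, c=0.5, d=1.0):
    """Illustrative two-species rates (not the paper's stage model)."""
    prey, pred = y
    return [a * prey - b * prey * pred, c * prey * pred - d * pred]

def simulate(y0, h=0.01, steps=1000):
    t, y = 0.0, list(y0)
    for _ in range(steps):
        y = rk4_step(lotka_volterra, t, y, h)
        t += h
    return y
```

A standard check on the step is the scalar problem y' = y, y(0) = 1, whose RK4 solution at t = 1 matches e to within the scheme's fourth-order accuracy.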
Copyright (c) 2026
2026-05-062026-05-0645222236RIGID IMAGE REGISTRATION USING L2-NORM MINIMIZATION VIA COARSE SEARCH APPROACH
https://thesesjournal.com/index.php/1/article/view/2688
<p><em>Image registration is the process of aligning two or more images of the same scene captured under different conditions. It is widely used in applications such as medical imaging, computer vision, and remote sensing. In this study, we present a rigid image registration approach based on the minimization of an L2-norm objective function. The proposed method employs a coarse search algorithm to explore the parameter space of rigid transformations, including rotation, scaling, and translation. The objective function is defined as the squared difference between the transformed source image and the target image, which is minimized to achieve optimal alignment. The approach is evaluated on both synthetic data (where the target image is generated from the source) and non-synthetic data. The results demonstrate that the coarse search method is capable of achieving accurate registration in cases where the parameter space is adequately sampled. However, its performance depends on the resolution of the search grid. This work highlights the effectiveness of L2-norm-based objective functions in rigid image registration and demonstrates that coarse search can serve as a simple yet reliable optimization strategy.</em></p> <p><strong>Keywords : </strong><em>Image Registration, Rigid Transformation, L2 Norm, Coarse Search, Least Squares.</em></p>Aizaz Hussain*Muhammad Wali khan*Muhammad Anas JawaidAli RazaAdnan Shehzad
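The coarse-search idea can be sketched as follows for the translation component of the rigid transform (rotation and scaling would simply add further loops over a sampled grid); the image, grid resolution, and search range below are illustrative assumptions.

```python
# Coarse-search sketch of L2-norm rigid registration (translation component only).
# The source image is shifted over a coarse grid of offsets; the offset minimising
# the sum of squared differences (SSD) against the target is taken as the estimate.

def shift(img, dy, dx):
    """Cyclically shift a 2-D list-of-lists image by (dy, dx)."""
    h, w = len(img), len(img[0])
    return [[img[(r - dy) % h][(c - dx) % w] for c in range(w)] for r in range(h)]

def ssd(a, b):
    """Squared-L2 objective: sum of squared pixel differences."""
    return sum((pa - pb) ** 2
               for ra, rb in zip(a, b) for pa, pb in zip(ra, rb))

def coarse_register(source, target, max_shift=4):
    """Exhaustive coarse search over integer translations."""
    best, best_cost = (0, 0), float("inf")
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            cost = ssd(shift(source, dy, dx), target)
            if cost < best_cost:
                best, best_cost = (dy, dx), cost
    return best, best_cost

# Synthetic example: the target is the source shifted by (2, 3).
source = [[(r * 7 + c * 3) % 11 for c in range(8)] for r in range(8)]
target = shift(source, 2, 3)
```

On this synthetic pair the search recovers the true offset exactly, mirroring the paper's observation that accuracy depends on the search grid adequately sampling the parameter space.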
Copyright (c) 2026 Spectrum of Engineering Sciences
2026-05-062026-05-0645237244A PYTHON PROGRAM TO CALCULATE OSCILLATOR STRENGTHS OF HYPERFINE (HF) MULTIPLETS OF SC II
https://thesesjournal.com/index.php/1/article/view/2689
<p><em>In this work, we theoretically investigate the oscillator strengths of the hyperfine levels of 131 fine-structure lines in the scandium ion. The Hönl–Kronig intensity rule is used to estimate the oscillator strengths of hyperfine transitions in the scandium ion. A Python program was created to determine the weighted oscillator strengths, wavenumbers, and relative intensities within a given hyperfine multiplet of Sc II. The input parameters for the code are the quantum numbers and energies of the levels involved in the transitions, the hyperfine constants, and the weighted oscillator strengths of the fine transitions.</em></p>Naveed AliArif Akhtar AzamMuhammad Noman HameedAlay RazaTahir Ahsan
Copyright (c) 2026
2026-05-062026-05-0645268326ASSESSING THE IMPACT OF WATERSHED MANAGEMENT ON FLOOD DYNAMICS UNDER CLIMATE CHANGE IN THE SWAT VALLEY, KP, PAKISTAN
https://thesesjournal.com/index.php/1/article/view/2691
<p><em>Frequent flooding in the Swat River basin, which drains the hilly terrain of the Hindukush Mountains in northern Pakistan, has been intensifying due to rapid urbanization, inadequate flood planning, and climate change, particularly during the monsoon season. This study addressed the need to assess future flood hazards and evaluated mitigation strategies for the basin. The research aimed to quantify peak river discharge under scenarios derived from recorded precipitation data and from future modeled scenario data, and to explore flooding characteristics (stage and discharge) in the River Swat using a thematic-layer geometry (DEM plus satellite imagery) at the most vulnerable river reaches. A coupled modeling approach integrated HEC-HMS for rainfall–runoff simulation and HEC-RAS for river routing and floodplain inundation. Historical simulations used observed precipitation and derived intensity–duration–frequency (IDF) curves, while future flows were estimated using downscaled climate projections under RCP4.5 and RCP8.5 for the years 2032, 2042, 2052, 2062, and 2072. Results indicated a historical design flood discharge of approximately 1168.8 m³/s at the basin outlet. Under RCP4.5, projected peak flows increased moderately, whereas under RCP8.5, flood magnitudes increased to up to 1441.3 m³/s by 2072. Mitigation simulations revealed that an upstream reservoir with an area of 57,507,700 m² could attenuate peak flows by up to 40.92%, while levee embankments up to 7 m high confine the discharge within the river floodplain and alleviate flood risk. The combined application of levees and reservoirs in the catchment provided the greatest reduction in flood extent (39.6%).
This study linked projected changes in future storm events to peak flow rates in the River Swat, assessing the present river geometry and the application of levees under flood conditions. The findings can support policymakers in evaluating the significance of different flood-alleviation measures in the Swat Valley and similar regions of South Asia.</em></p>Muhammad Shehryar SulemanDr. Rasheed AhmedMuhammad Jahangir
Copyright (c) 2026
2026-05-072026-05-0745327346BERT-GNN APPROACH FOR IDENTIFICATION OF SEMANTIC LEGAL METADATA
https://thesesjournal.com/index.php/1/article/view/2692
<p><em>Legal documents are a crucial part of regulation, governance, and legal decision-making. Identifying and extracting semantic legal metadata is crucial to enhancing the understanding of legal documents, automating compliance verification, and enabling intelligent retrieval of legal information. Previous research on legal texts faced a number of challenges, including complex sentence syntax, domain-specific vocabulary, ambiguity, and the lack of explicit semantic relations. In this research, deep learning models, namely Graph Neural Networks (GNN) and Bidirectional Encoder Representations from Transformers (BERT), a Natural Language Processing (NLP) model, are applied to the automated extraction of semantic legal metadata from legal documents. Through a literature review, the work explores the techniques, data, and models used to analyze legal documents. Transformer-based language models and deep neural networks are especially useful for acquiring contextual representations over legal corpora, whereas graph-based representations enhance relational insight. To improve the precision and scalability of semantic legal metadata extraction, this study automatically detects legal entities, actions, conditions, and sanctions, removing the need for manual annotation. The proposed system supports legal research, regulatory compliance, and automated legal knowledge management. This research improves the efficiency, accuracy, and scalability of legal information extraction and assists in intelligent legal analysis and compliance automation.</em></p>Waseem SajjadNayyar IqbalMuhammad NadeemHaroon AhmedTauqir AhmadHilal Bello
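The graph-based side of such a pipeline can be sketched as a single message-passing step over sentence nodes; the 3-dimensional vectors below stand in for BERT sentence embeddings and the edges for a hypothetical cross-reference graph, neither taken from the paper.

```python
# Illustrative one-layer graph message passing over sentence embeddings.
# The tiny "embeddings" are stand-ins for BERT sentence vectors (assumption);
# edges link sentences that share a legal cross-reference (hypothetical graph).

def message_pass(features, edges):
    """Mean-aggregate each node's neighbours, then average with its own vector."""
    neighbours = {n: [] for n in features}
    for a, b in edges:                      # undirected edges
        neighbours[a].append(b)
        neighbours[b].append(a)
    updated = {}
    for node, vec in features.items():
        nbrs = neighbours[node]
        if not nbrs:
            updated[node] = vec[:]
            continue
        agg = [sum(features[n][i] for n in nbrs) / len(nbrs)
               for i in range(len(vec))]
        updated[node] = [(v + a) / 2.0 for v, a in zip(vec, agg)]
    return updated

feats = {"s1": [1.0, 0.0, 0.0], "s2": [0.0, 1.0, 0.0], "s3": [0.0, 0.0, 1.0]}
out = message_pass(feats, [("s1", "s2"), ("s2", "s3")])
```

After one pass, each sentence vector mixes in information from its linked sentences, which is the relational insight a GNN layer adds on top of per-sentence BERT features.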
Copyright (c) 2026
2026-05-052026-05-0545347356DISPUTES DURING WORK EXECUTION IN CONSTRUCTION PROJECTS AND THE ROLE OF EFFECTIVE DOCUMENTATION IN DISPUTES RESOLUTION: A CASE STUDY IN PAKISTAN
https://thesesjournal.com/index.php/1/article/view/2708
<p><em>With the decreasing size of integrated circuits, the conventional transistor structure can no longer satisfy present-day requirements for efficiency and power consumption. While conventional planar MOSFETs and, subsequently, FinFETs have served the semiconductor industry effectively to date, further reduction in transistor size faces severe limitations such as increased leakage currents and electrostatic breakdown. Technologies such as gate-all-around field-effect transistors, vertically stacked nanosheets, nanowire-based devices, and complementary FETs offer improved gate control and better scalability for advanced technology nodes. This review discusses recent research, published between 2020 and 2025, on how these emerging devices compare with conventional CMOS structures. Architectural changes, fabrication concerns, performance trade-offs, and long-term feasibility are discussed with the aim of understanding how CMOS technology can evolve beyond the FinFET era.</em></p>Ishtiaq AhmadSamia Tariq*
Copyright (c) 2026 Spectrum of Engineering Sciences
2026-05-082026-05-0845357369REAL-TIME THREAT DETECTION IN CONNECTED CARS: A MACHINE LEARNING APPROACH TO MITM ATTACKS
https://thesesjournal.com/index.php/1/article/view/2694
<p><em>This research investigates how several machine learning (ML) models are trained and evaluated to identify different forms of Man-in-the-Middle (MitM) attacks in connected cars. The integrated framework incorporates continuous surveillance and live threat identification to improve vehicle security. Four machine learning algorithms, Decision Tree Classifier, Logistic Regression, Support Vector Machine (SVM), and K-Nearest Neighbors (KNN), were deployed and evaluated in the CARLA simulation environment. Several attack scenarios, including spoofing attacks, Denial-of-Service (DoS) attacks, and replay attacks, were directed at the vehicle control unit in the simulation. The results show that the Decision Tree Classifier achieved the best threat detection accuracy of 97.50%, with consistent precision, recall, and F1-scores across all attack types. The K-Nearest Neighbors model achieved 90.00% accuracy, showing that it is also competitive for threat detection. The results demonstrate the efficiency of machine learning-based solutions for securing connected vehicles through real-time monitoring and efficient attack detection.</em></p>AdnanDaud KhanJunaid Ur RahmanMohd SumerHamad Bashir Ahmad
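As a sketch of the KNN component of such a detector, the following toy example classifies a feature vector by majority vote among its nearest training neighbours; the two-dimensional features and label scheme are illustrative stand-ins for the CARLA-derived telemetry features, not the paper's data.

```python
# Minimal k-nearest-neighbours sketch of the threat-classification step.
# Toy labels (assumption): 0 = benign, 1 = spoofing, 2 = DoS.

from collections import Counter

def euclidean(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def knn_predict(train, labels, query, k=3):
    """Classify `query` by majority vote among its k nearest training points."""
    ranked = sorted(range(len(train)), key=lambda i: euclidean(train[i], query))
    votes = Counter(labels[i] for i in ranked[:k])
    return votes.most_common(1)[0][0]

train = [[0.1, 0.2], [0.0, 0.1], [0.9, 0.8], [1.0, 0.9], [0.5, 1.5], [0.6, 1.4]]
labels = [0, 0, 1, 1, 2, 2]
pred = knn_predict(train, labels, [0.95, 0.85], k=3)
```

The same train/predict interface applies to the other three classifiers, which is what makes the side-by-side accuracy comparison in the paper straightforward.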
Copyright (c) 2026 Spectrum of Engineering Sciences
2026-05-072026-05-0745370382EXPERIMENTAL STUDY ON HIGH-CALCIUM ALKALI-ACTIVATED MATERIALS: SLAG, PALM KERNEL SHELL ASH, AND CLASS C FLY ASH-BASED MORTAR
https://thesesjournal.com/index.php/1/article/view/2704
<p><em>This study investigates the development of sustainable high-calcium alkali-activated mortar incorporating Ground Granulated Blast Furnace Slag (GGBS), Palm Kernel Shell Ash (PKSA), and Class C Fly Ash (CFA) as hybrid binders, using locally available Bholari brown sand as the sole fine aggregate. Mortar mixes were activated using sodium hydroxide (NaOH) solutions with molarities ranging from 8 M to 16 M, while the sand-to-binder (S/B) ratio was maintained at 2.0, to evaluate the influence of molarity on fresh and mechanical properties. Cylindrical specimens (100 mm diameter × 200 mm height) were initially cured at room temperature or in an oven at 60 °C for 5 hours and then stored under ambient laboratory conditions until testing. Experimental results indicate that compressive strength increased significantly with NaOH molarity up to an optimum range (12–14 M), after which slight reductions occurred at higher molarity levels due to rapid gel precipitation and microstructural defects. Oven curing enhanced early-age strength and densified the matrix. The 28-day compressive strength of 50% Slag + 50% CFA mixes reached up to 55 MPa, while PKSA-rich mixes achieved approximately 35–40 MPa. The study concludes that hybrid high-calcium binders, combined with locally available Bholari sand, can produce high-performance, eco-friendly mortar suitable for sustainable construction.</em></p>Mohsin AliFaizan NoorRashid AliAli HussainSartaj Ul Nabi
Copyright (c) 2026
2026-05-082026-05-0845383397HYBRID MACHINE LEARNING AND FEDERATED LEARNING FRAMEWORK WITH EXPLAINABLE AI FOR PERSONALIZED STROKE TELEREHABILITATION IN RESOURCE-CONSTRAINED SETTINGS
https://thesesjournal.com/index.php/1/article/view/2699
<p><em>Background: Stroke rehabilitation in resource-constrained environments is limited by restricted access to specialized care, fragmented healthcare systems, and concerns related to data privacy and infrastructure. Telerehabilitation offers a scalable solution; however, existing systems lack the predictive intelligence, personalization, and interpretability required for clinical adoption. Objective: This study aims to develop and evaluate a hybrid machine learning and federated learning framework for stroke telerehabilitation that ensures high predictive performance, data privacy, and model interpretability. Methods: A hybrid machine learning approach was implemented using Support Vector Machine, Decision Tree, Random Forest, and Artificial Neural Network models. A hybrid ensemble model was developed using a weighted soft-voting strategy to improve predictive accuracy. Federated learning was integrated using the Federated Averaging (FedAvg) algorithm across multiple simulated nodes to enable privacy-preserving distributed training. Data preprocessing included normalization and imputation, and model performance was evaluated using accuracy, precision, recall, F1 score, and ROC-AUC. Robustness was assessed using 10-fold cross-validation, with results reported as mean ± standard deviation. Results: The Random Forest model achieved the highest performance among the individual models (accuracy: 89.3% ± 1.2). The proposed hybrid ensemble model demonstrated superior performance, with an accuracy of 92.1% ± 1.0 and ROC-AUC of 0.93 ± 0.01. The federated hybrid ensemble model achieved comparable performance (accuracy: 90.8% ± 1.3; ROC-AUC: 0.92 ± 0.01), indicating minimal performance loss under distributed conditions. Cross-validation results confirmed model stability, while explainable AI techniques identified motor function score and therapy adherence as key predictors. Conclusion: The integration of hybrid machine learning, federated learning, and explainable AI provides an effective and scalable solution for stroke telerehabilitation. The proposed framework achieves high predictive performance while ensuring data privacy and interpretability, making it suitable for deployment in resource-constrained healthcare environments. Further validation using real-world clinical data is required to confirm its practical applicability.</em></p> <p><strong>Keywords : </strong><em>Stroke Telerehabilitation; Hybrid Ensemble Learning; Federated Learning; Explainable AI (XAI); SHAP; LIME; Privacy-Preserving Machine Learning; Stroke Rehabilitation Prediction; Resource-Constrained Healthcare.</em></p>Safa Ali KhanSahar Ali KhanAyesha RasheedWasim AhmadMariam FatimaNimrah HumayoonEtisam Wahid*Shahzad Ahmad
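The FedAvg aggregation named in the Methods can be sketched in a few lines: once per round, the server forms a dataset-size-weighted average of the client parameter vectors. The parameter lists and node sizes below are illustrative, not the study's models.

```python
# Sketch of one Federated Averaging (FedAvg) round. Parameter vectors are plain
# lists standing in for model weights (assumption); each client is weighted by
# the size of its local dataset.

def fedavg(client_params, client_sizes):
    """Dataset-size-weighted average of per-client parameter vectors."""
    total = sum(client_sizes)
    dim = len(client_params[0])
    return [sum(p[i] * n for p, n in zip(client_params, client_sizes)) / total
            for i in range(dim)]

# Three simulated nodes with different amounts of local rehabilitation data.
params = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]
sizes = [10, 30, 60]
global_params = fedavg(params, sizes)
```

Raw patient data never leaves a node; only these parameter vectors are exchanged, which is how the framework preserves privacy under distributed training.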
Copyright (c) 2026 Spectrum of Engineering Sciences
2026-05-072026-05-0745398410SPAM GUARD PRO: A LIGHTWEIGHT REAL-TIME SMS SPAM DETECTION SYSTEM USING TF-IDF AND LOGISTIC REGRESSION WITH INTERPRETABLE FEATURE ENGINEERING
https://thesesjournal.com/index.php/1/article/view/2711
<p><em>The rapidly growing and overwhelming volume of unsolicited SMS messages, with their exploitative and deceptive character, poses a serious threat to the security, privacy, and finances of mobile users, making effective spam detection essential. In this paper, SpamGuard Pro, a lightweight yet highly accurate SMS spam filter based on a Logistic Regression classifier with a TF-IDF vectorizer and three custom behavioral and linguistic features, is proposed for fast and reliable SMS spam detection; it is trained and assessed on the well-known UCI SMS Spam Collection dataset of about 5,700 samples. The experimental results show an accuracy of 96.7%, precision of 95.2%, recall of 94.8%, and F1-score of 95.0%. To extract more signal without added complexity, we added manually designed features, message length, number of exclamation marks, and number of capitalized words, based on the linguistic behavior typical of spam messages. Together with the TF-IDF, bag-of-words, and n-gram features, these contributed most to interpretability and performance. In addition, we built the whole system as a web application on the Streamlit platform, a simple and popular lightweight platform, allowing users to classify their own messages interactively and instantly. Our comparison and analysis show that an interpretable model, in particular Logistic Regression, remains competitive for online real-time spam detection in this kind of classification problem.</em></p>Umair Ayaz KamangarAbdul Sattar ChanSoyam KapoorRanjhan AliZainab Umair KamangarKhalid Hussain
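A minimal sketch of the feature side of this pipeline, TF-IDF scores plus the three hand-crafted features (message length, exclamation count, capitalized words), is shown below; the TF-IDF weighting variant used here (tf × log(N/df)) is an assumption, as the paper may use a different normalization.

```python
# Sketch of the feature pipeline: TF-IDF scores plus the three hand-crafted
# features mentioned in the abstract (message length, exclamation count,
# capitalised-word count). TF-IDF follows the common tf * log(N/df) variant
# (assumption: the paper's exact weighting may differ).

import math

def tfidf(docs):
    """Return per-document {term: tf-idf} dictionaries."""
    n = len(docs)
    df = {}
    tokenised = [d.lower().split() for d in docs]
    for toks in tokenised:
        for t in set(toks):
            df[t] = df.get(t, 0) + 1
    out = []
    for toks in tokenised:
        scores = {}
        for t in set(toks):
            tf = toks.count(t) / len(toks)
            scores[t] = tf * math.log(n / df[t])
        out.append(scores)
    return out

def custom_features(msg):
    """Message length, exclamation marks, fully-capitalised words."""
    return [len(msg),
            msg.count("!"),
            sum(1 for w in msg.split() if w.isupper() and len(w) > 1)]

feats = custom_features("WIN a FREE prize now!!!")
```

The combined vector (sparse TF-IDF entries plus the three dense features) is what a linear classifier such as Logistic Regression can weight transparently, which is where the system's interpretability comes from.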
Copyright (c) 2026
2026-05-032026-05-0345411427A LEAKAGE-FREE TWO-PHASE TRANSFER LEARNING ENSEMBLE FOR BINARY MELANOMA CLASSIFICATION USING EFFICIENTNETB3, DENSENET121, INCEPTIONV3, AND VIT-B16
https://thesesjournal.com/index.php/1/article/view/2712
<p><em>Melanoma, a type of skin cancer, is among the fastest-growing and most lethal cancers worldwide. Automated early detection of this disease is crucial for increasing patient survival rates. This paper conducts a thorough comparative study of four pre-trained deep learning architectures, EfficientNetB3, DenseNet121, InceptionV3, and Vision Transformer (ViT-B16), combined through a weighted ensemble method for binary melanoma classification on the Kaggle Melanoma Skin Cancer Dataset, containing 10,000 dermoscopic images comprising 5,000 benign and 4,538 malignant skin lesions. Our work features a well-defined two-stage transfer learning methodology that prevents data-augmentation leakage by performing stratified splitting before augmentation, and significantly improves feature adaptation through progressive fine-tuning and adaptive learning-rate scheduling. Individual model accuracies are: EfficientNetB3 (94.90%), DenseNet121 (95.26%), InceptionV3 (95.11%), and ViT-B16 (96.04%). The weighted ensemble combining the four models achieves 96.77% accuracy, 0.9886 precision, 0.9656 F1-score, and 0.9949 AUC, exceeding the 95.25% ensemble baseline of Sarıateş and Özbay by 1.52 percentage points on the same dataset. This implies that a carefully designed pipeline can consistently increase accuracy irrespective of model architecture, and that an ensemble of complementary CNN and Transformer features outperforms single models.</em></p>Umair Ayaz KamangarAbdul Sattar ChanZainab Umair Kamangar
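The weighted-ensemble step can be sketched as a weighted average of per-model malignancy probabilities; the probabilities below and the choice of each model's validation accuracy as its weight are illustrative assumptions, not the paper's exact weighting scheme.

```python
# Sketch of the weighted ensemble: per-model probabilities of "malignant" for
# one lesion are combined by a weighted average and thresholded at 0.5.
# Weights here are illustrative (assumption): each model's reported accuracy.

def weighted_ensemble(probs, weights):
    """Weighted average of per-model probability-of-malignant values."""
    total = sum(weights)
    return sum(p * w for p, w in zip(probs, weights)) / total

# Four models (EfficientNetB3, DenseNet121, InceptionV3, ViT-B16) on one image.
probs = [0.62, 0.55, 0.48, 0.71]
weights = [0.9490, 0.9526, 0.9511, 0.9604]
score = weighted_ensemble(probs, weights)
label = "malignant" if score >= 0.5 else "benign"
```

Because CNN and Transformer models make partly uncorrelated errors, averaging their probabilities can flip borderline cases that any single model would misclassify.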
Copyright (c) 2026
2026-05-082026-05-0845428447TOWARD INTELLIGENT AND AUTONOMOUS SOCS: ENABLING LLM-DRIVEN, MCP-INTEGRATED, MULTI-AGENT SECURITY OPERATIONS
https://thesesjournal.com/index.php/1/article/view/2719
<p><em>The daily influx of alerts, high false-positive rates, disjointed investigation processes, and continuously growing cyber threats are putting increasing pressure on Security Operations Centers (SOCs). The traditional SOC model relies on reactive, signature-based detection and manually intensive SOAR runbooks, which are slow to adjust to new Indicators of Compromise (IOCs). Empirical research has observed that a considerable number of security alert events go uninvestigated, and those that are investigated are often inaccurately classified as false positives, a leading cause of analyst fatigue, long dwell times, and uneven incident response. Recent developments in artificial intelligence (AI) and distributed cyber defense suggest that smarter, autonomous SOC paradigms are emerging. Combined with agentic reasoning and multi-agent coordination architectures, large language models (LLMs) exhibit great potential for Tier-1 alert triage, contextual evidence correlation, automated rule generation, and adaptive response planning. Interoperability standards such as the Model Context Protocol (MCP) also enable tool invocation and the exchange of contextual data between SIEM, SOAR, TIP, and case management systems. This paper reviews the existing research on SOC automation, the challenges that persist, and the opportunities for achieving explainable, closed-loop, autonomous security operations.</em></p>Hamad Moiz ChodhryNaveed Naeem AbassAmina Naseem
Copyright (c) 2026 Spectrum of Engineering Sciences
2026-05-092026-05-0945448463 THE ROLE OF PROBABILISTIC AI IN BUILDING TRUSTWORTHY EDGE–CLOUD INTELLIGENCE
https://thesesjournal.com/index.php/1/article/view/2723
<p><em>Background: The transformation of edge–cloud computing has enabled smart systems to operate in dynamic real-world environments. Nonetheless, uncertainties such as noisy data, network variability, and incomplete information are inherent, casting doubt on the reliability and credibility of traditional deterministic artificial intelligence models, most of which give overconfident predictions without quantifying their limitations. Objective: This paper examines the application of probabilistic artificial intelligence to building more trustworthy edge–cloud intelligence systems. Specifically, it examines expert opinions on its contribution to key properties such as reliability, robustness, resilience, and explainability, and on the implications for future adoption. Methodology: A survey-based, quantitative research design was used. Data were gathered from professionals and researchers in artificial intelligence, data science, and related fields using a structured questionnaire. The instrument used a five-point Likert scale and covered aspects of trustworthiness, system functioning, challenges, and prospects. Descriptive statistics were used to analyse the 250 responses.</em></p>Waqas AhmedMuhmmad Qasim ZafarJunaid Ur RahmanAsma Javaid
Copyright (c) 2026 Spectrum of Engineering Sciences
2026-05-092026-05-0945464474ELITE STOCHASTIC OPTIMIZED DISTRIBUTED LEARNING BASED ENSEMBLE MODEL FOR TOMATO LEAF DISEASE DETECTION AND CLASSIFICATION
https://thesesjournal.com/index.php/1/article/view/2724
<p><em>Tomato leaf disease is a primary factor impacting the quality and quantity of crop yield. The rapid spread of diseases and inefficient production have driven diverse models for disease classification. Nevertheless, existing traditional models are constrained by poor robustness, limited generalization, and inconsistent performance. This paper proposes an Elite Stochastic Optimized Distributed Learning-Based Ensemble Deep Neural Network Long Short-Term Memory (ESD2TM) model for effective disease detection and classification. The ESD2TM framework utilizes a distributed mirrored strategy featuring replica models trained in parallel to boost training speed and reduce computational requirements. In addition, the Hybrid Mutual Augmented Structural Features (HMAS) technique effectively captures context-aware characteristics and spatial relationships within the leaves to identify irregularities associated with disease symptoms. Furthermore, the Elite Stochastic Optimization Algorithm (ElSTO) refines the hyperparameters and maintains balanced diversity in finding the optimal solution. The incorporation of these mechanisms enables the ESD2TM approach to achieve accuracy, specificity, precision, and sensitivity of 98.42%, 98.46%, 97.69%, and 98.39%, respectively.</em></p>Muhammad ShakeelWaseem AkramSaeed RasheedTauqir AhmadBadarqa ShakoorHamda Khalid
Copyright (c) 2026 Spectrum of Engineering Sciences
2026-05-092026-05-0945475489ARTIFICIAL INTELLIGENCE IN SOFTWARE ENGINEERING: AUTOMATED CODE GENERATION, TESTING, AND SELF-HEALING SYSTEM DESIGN
https://thesesjournal.com/index.php/1/article/view/2728
<p><em>Artificial Intelligence (AI) significantly transformed software engineering by introducing intelligent automation into software development, testing, and system maintenance processes. This study examined the role of AI in automated code generation, AI-driven software testing, and self-healing system design within modern software engineering environments. The research adopted a quantitative research design and collected data from a sample of 320 software engineers, developers, quality assurance specialists, and IT professionals working in technology organizations. A structured questionnaire measured respondent perceptions regarding AI integration in software engineering practices. The findings revealed that AI technologies improved software development efficiency, coding accuracy, software reliability, and operational continuity. Automated code generation recorded a mean value of 4.18, indicating strong agreement regarding the effectiveness of AI-assisted programming systems in reducing repetitive coding activities and improving software maintainability. AI-driven software testing achieved a mean value of 4.11, demonstrating significant improvement in defect detection accuracy and software quality assurance processes. Self-healing system design produced a mean value of 4.06, reflecting positive perceptions regarding autonomous fault detection and intelligent recovery mechanisms. Software development efficiency recorded the highest mean value of 4.22, while software reliability achieved a mean score of 4.14. The study concluded that AI-powered software engineering practices enhanced productivity, scalability, and software resilience. Cybersecurity concerns, ethical challenges, and transparency issues remained critical factors requiring responsible AI governance and continuous human oversight.</em></p>Mujeeb Ur Rehman*Dr. Imran KhanAsad JavedVajeeha Mir
Copyright (c) 2026 Spectrum of Engineering Sciences
2026-05-092026-05-0945490508GENERATIVE ARTIFICIAL INTELLIGENCE FOR METAHEURISTIC OPTIMIZATION: TAXONOMY, METHODOLOGICAL FRAMEWORKS, AND OPEN RESEARCH CHALLENGES
https://thesesjournal.com/index.php/1/article/view/2729
<p><em>The arrival of Generative Artificial Intelligence (GenAI) has brought a paradigm shift to optimization research by allowing scientists to model search spaces using data-driven methods. This paper gives a thorough review of GenAI-assisted metaheuristic optimization, assessing the role of generative models, generative adversarial networks, diffusion models, variational autoencoders, and large language models, within both evolutionary and swarm-based systems. To organize the existing methodologies, we propose a three-dimensional taxonomy based on functional role, level of integration, and learning paradigm. The survey explores generator-enhanced evolution, diffusion-guided search, LLM-assisted metaheuristic design, and surrogate-assisted generative optimization in terms of their research backgrounds and convergence behavior. We analyze benchmarking processes, performance evaluation criteria, and reproducibility procedures used in empirical testing. The review identifies key unresolved problems, including scalability, uncertainty quantification, interpretability, and ethical concerns. The study outlines methodological frontiers toward autonomous, adaptive generative optimization systems with sound theoretical underpinnings for large-scale real-world optimization problems.</em></p>Shamikh ImranRizwan IqbalFaisal KhanNadia Mustaqim Ansari
Copyright (c) 2026 Spectrum of Engineering Sciences
2026-05-092026-05-0945509525PROBABILISTIC DESIGN OF FOUNDATIONS AND STRUCTURES: ADDRESSING UNCERTAINTY IN GEOTECHNICAL PARAMETERS
https://thesesjournal.com/index.php/1/article/view/2715
<p><em>This study examined the role of probabilistic design methods in addressing uncertainty in geotechnical parameters related to foundations and structures. The research focused on evaluating how probabilistic frameworks improved structural safety, reliability, and foundation performance under uncertain subsurface conditions. A quantitative research design was adopted, and data were collected from a sample of 300 geotechnical engineers, structural engineers, researchers, and construction professionals using a structured questionnaire based on a five-point Likert scale. Statistical analysis was performed using descriptive statistics, correlation analysis, and regression analysis to evaluate the relationships among probabilistic design methods, geotechnical parameter uncertainty, reliability-based design, and structural safety. The findings indicated high mean values for Structural Safety and Stability (M = 4.21), Probabilistic Design Methods (M = 4.18), Reliability-Based Design (M = 4.12), and Foundation Performance Optimization (M = 4.09). Correlation analysis revealed strong positive relationships among all variables, while regression analysis showed that Reliability-Based Design produced the strongest influence on structural safety with a beta coefficient of 0.42 and a significance value of 0.000. The coefficient of determination (R² = 0.73) demonstrated that probabilistic design variables explained 73% of the variation in structural safety and stability. The study concluded that probabilistic geotechnical engineering approaches improved uncertainty management, minimized structural risks, optimized foundation performance, and supported sustainable infrastructure development in modern engineering projects.</em></p> <p><strong>Keywords : </strong><em>Foundation Optimization, Geotechnical Engineering, Probabilistic Design, Reliability Analysis, Soil Variability, Structural Safety.</em></p>Ahmad Nawaz KhanChaimaa El JabliPashtoon Ahmad Rayan
Copyright (c) 2026 Spectrum of Engineering Sciences
2026-05-092026-05-0945526542NON-EXISTENCE OF A GLOBALLY CONVEX ENTROPY PAIR AND WAVE STRUCTURE FOR A FLUID DYNAMICS MODEL OF BIOFILMS
https://thesesjournal.com/index.php/1/article/view/2732
<p><em>We study the hyperbolic structure of the one-dimensional fluid dynamics system introduced by Clarelli, Di Russo, Natalini, and Ribot (2013) to model biofilm growth, and subsequently analyzed analytically by Bianchini and Natalini (2016), who proved global existence and exponential stability of smooth solutions near the unique equilibrium by exploiting a total dissipativity condition in lieu of a convex entropy. The absence of a globally convex entropy pair was left implicit in that work. We make this absence explicit and rigorous. Specifically, we derive the full compatibility system for the Hessian H = ∇²η of any candidate entropy η, and prove that positive definiteness of H is incompatible with the constraints imposed by the flux structure on any open domain containing states with both positive and negative velocity. The proof exploits an affine dependence of the (4,4)-entry of H on the velocity variable, which changes sign and therefore cannot be globally positive definite. As a direct consequence, the classical entropy-based convergence framework of Lax, Glimm, and DiPerna does not apply to this system in its full generality. In the second part of the paper, we carry out a complete wave-structure analysis: we compute all right eigenvectors of the flux Jacobian, classify the four characteristic fields as two linearly degenerate contact families and two nonlinear families, and identify an inflection locus {L = 3/5} inside the hyperbolicity domain on which genuine nonlinearity fails. The resulting wave pattern in the Riemann problem is a two-contact structure bracketed by two nonlinear waves, with composite waves appearing whenever the inflection locus is crossed. 
These results provide the analytical foundation for a companion numerical paper in which a Godunov-type scheme is constructed and the numerical viscosity is shown to serve as the admissibility mechanism replacing convex entropy.</em></p> <p><strong>Keywords : </strong><em>Hyperbolic conservation laws; biofilm model; convex entropy; entropy pair; Riemann problem; wave structure; genuine nonlinearity; linearly degenerate; mixture theory; vanishing viscosity.</em></p>Sidrah Ahmed
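For reference, the compatibility system for the Hessian mentioned in the abstract stems from the standard entropy-pair relations for a system u_t + f(u)_x = 0 (shown schematically; the paper derives the full version for the specific biofilm flux):

```latex
\nabla q(u)^{\top} = \nabla\eta(u)^{\top}\,Df(u)
\quad\Longrightarrow\quad
H(u)\,Df(u) = Df(u)^{\top} H(u), \qquad H = \nabla^{2}\eta,
```

so a globally convex entropy requires a positive definite H satisfying this symmetry constraint at every admissible state, which is precisely the condition shown above to fail on domains containing both positive and negative velocities.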
Copyright (c) 2026 Spectrum of Engineering Sciences
2026-05-102026-05-1045562569ENHANCING BLOCKCHAIN SECURITY USING MACHINE LEARNING-OPTIMIZED ASYMMETRIC ENCRYPTION: A COMPREHENSIVE FRAMEWORK FOR INTELLIGENT CRYPTOGRAPHIC MANAGEMENT IN DISTRIBUTED LEDGER SYSTEMS
https://thesesjournal.com/index.php/1/article/view/2731
<p><em>Blockchain technology has emerged as a transformative paradigm for decentralized trust and transparent transaction processing across diverse sectors including finance, supply chain, healthcare, and digital identity management. However, the escalating sophistication of cyber threats, coupled with the computational rigidity of conventional cryptographic implementations, presents critical vulnerabilities in contemporary blockchain ecosystems. This paper proposes a novel Machine Learning-Optimized Asymmetric Encryption Framework (ML-OAEF) that integrates advanced supervised learning algorithms with intelligent cryptographic management to enhance blockchain security, scalability, and adaptive resilience. We present a comprehensive methodology encompassing dataset synthesis, multi-dimensional feature engineering, comparative model evaluation, and blockchain-specific security assessment. Four distinct machine learning architectures—Random Forest (RF), Gradient Boosting (GB), Support Vector Machine (SVM), and Multi-Layer Perceptron (MLP)—were systematically evaluated against a diverse cryptographic dataset comprising 1,647 samples across symmetric encryption (AES, DES, 3DES, Blowfish, RC4, ChaCha20), asymmetric encryption (RSA), and hash functions (SHA-256). Experimental results demonstrate that the Support Vector Machine and Neural Network models achieved exceptional classification accuracy of 96.5%, significantly outperforming traditional baseline approaches. We introduce a Composite Blockchain Security Score (CBSS) metric that quantifies cryptographic suitability across five dimensions: cryptographic strength, performance efficiency, quantum resistance, blockchain compatibility, and machine learning confidence. Furthermore, we propose a Blockchain Integration Mechanism (BIM) that operationalizes ML-driven insights across the data, consensus, and application layers of blockchain architecture. 
The developed real-time ML pipeline achieved 83.75% verification accuracy in live blockchain monitoring scenarios, confirming practical applicability for automated cryptographic verification and anomaly detection. This research establishes a foundation for next-generation blockchain security systems that leverage artificial intelligence to dynamically optimize cryptographic configurations, detect emerging threats, and ensure post-quantum readiness. The proposed framework bridges the gap between data-driven intelligence and decentralized trust mechanisms, offering a scalable, interpretable, and future-ready solution for securing distributed ledger technologies.</em></p> <p><strong><em>Keywords : </em></strong><em>Blockchain Security, Machine Learning, Asymmetric Encryption, Cryptographic Algorithm Identification, Deep Learning, Post-Quantum Cryptography, Zero-Knowledge Proofs, Federated Learning, Smart Contracts, Distributed Ledger Technology.</em></p> <p><em><a href="https://doi.org/10.5281/zenodo.20133029" target="_blank" rel="noopener">https://doi.org/10.5281/zenodo.20133029</a></em></p>Sobia AkmalDr. Amnah FirdousMuniba SaleemSabeeka Fatima
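As an illustration of how a composite index like the CBSS described above might aggregate its five dimensions, here is a minimal sketch; the weight values and the `composite_score` helper are hypothetical, not taken from the paper.

```python
# Hedged sketch: a composite security score as a weighted average of five
# normalized dimensions in [0, 1]. The weights below are illustrative
# assumptions, not the paper's calibrated values.
WEIGHTS = {
    "strength": 0.30,           # cryptographic strength
    "performance": 0.20,        # performance efficiency
    "quantum_resistance": 0.20,
    "blockchain_compat": 0.15,  # blockchain compatibility
    "ml_confidence": 0.15,      # machine learning confidence
}

def composite_score(dims):
    """Weighted average of the five dimension scores."""
    assert set(dims) == set(WEIGHTS), "all five dimensions required"
    return sum(WEIGHTS[k] * dims[k] for k in WEIGHTS)

# Example: a hypothetical RSA-like profile with high ML confidence but
# low quantum resistance.
score = composite_score({
    "strength": 0.9,
    "performance": 0.7,
    "quantum_resistance": 0.4,
    "blockchain_compat": 0.8,
    "ml_confidence": 0.965,
})
```

A real implementation would calibrate the weights against expert judgment or measured outcomes rather than fixing them a priori.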
Copyright (c) 2026 Spectrum of Engineering Sciences
2026-05-102026-05-1045543561AI-DRIVEN EXPLAINABILITY: ENHANCING TRANSPARENCY IN DEEP LEARNING MODELS FOR REAL-WORLD APPLICATIONS
https://thesesjournal.com/index.php/1/article/view/2739
<p><em>This paper discusses how AI-driven explainability can enhance transparency and trust in deep learning models applied to real-world settings. The black-box nature of deep learning techniques makes them difficult to comprehend and interpret, which is a serious issue in critical fields such as healthcare, finance, and smart systems. The explainability framework proposed in this study integrates transparency throughout the AI lifecycle, from data processing to model deployment. A mixed-method approach is used, combining experimental evaluation with framework development. The results indicate that the explainable AI models achieve 89 percent accuracy, slightly below the 91 percent of traditional models, while being substantially easier to interpret. User trust and bias-detection efficiency improve from 55 to 88 and from 48 to 82 respectively. Although computational efficiency drops slightly (92 to 85), overall system effectiveness increases (70 to 86). The results affirm that explainability boosts not only transparency but also fairness, accountability, and usability of AI systems. The proposed framework is applicable across a variety of domains, such as healthcare, finance, and smart infrastructure. This research contributes to the field of explainable AI by providing a scalable and practical solution that balances model performance and explainability. It highlights the importance of adding explainability to AI systems so that they can be deployed in the real world in an ethical and reliable manner.</em></p> <p><strong><em>Keywords : </em></strong><em>Explainable AI, Deep Learning, Transparency, Interpretability, Artificial Intelligence, Bias Detection, Trustworthy AI.</em></p> <p> </p>Urooj TariqImran ullahMaleeha Riaz
Copyright (c) 2026 Spectrum of Engineering Sciences
2026-05-092026-05-0945570582CALIBRATION AND VALIDATION OF PMWIN MODFLOW WITH TANDO MUHAMMAD KHAN DISTRIBUTARY COMMAND AREA
https://thesesjournal.com/index.php/1/article/view/2740
<p>Since Sindh lies in the lower reaches of the Indus River, it is most vulnerable to a variety of upstream water development challenges. To address these challenges, HEC launched the project “Sustainable Freshwater Management (LIB) for Irrigated Agriculture in the Lower Indus River Basin”. Under the umbrella of this project, the Muhammad Khan distributary was selected as the study area. The aim of this study is to calibrate and validate PMWIN MODFLOW for the Tando Muhammad Khan distributary command area to highlight its groundwater potential. Documented pumping test data from the Muhammad Khan distributary command area were simulated in the PMWIN MODFLOW software to determine sustainable groundwater usage. The input data for PMWIN MODFLOW were obtained from pumping tests in the distributary and from values reported in the literature. Two tube-wells (one at the head and one at the tail) were assigned no-flow boundary conditions in the mesh. The head values obtained at different intervals were used for both calibration and validation. Discharge was kept at 70.5 m³/hour. Calibration was performed on a single run, whereas two simulated runs were generated for validation. To verify the results, the root mean square deviation was calculated and found to be less than 10%. Time versus hydraulic head graphs are presented to illustrate the calibration and validation results graphically.</p> <p><em>Keywords- </em>Muhammad Khan distributary, Lower Indus Basin (LIB), PMWIN MODFLOW, Calibration and Validation.</p>Abdul MananMuhmmad Hamza Zaid
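The root mean square deviation check mentioned in the abstract can be sketched as follows. This is a generic stdlib implementation; expressing the deviation as a percentage of the mean observed head (`percent_rmsd`) is an assumption about how the sub-10% figure was obtained, since the abstract does not state its normalization.

```python
import math

def rmsd(observed, simulated):
    """Root mean square deviation between observed and simulated heads."""
    n = len(observed)
    return math.sqrt(sum((o - s) ** 2 for o, s in zip(observed, simulated)) / n)

def percent_rmsd(observed, simulated):
    """RMSD as a percentage of the mean observed value (assumed
    normalization; the paper does not state its exact formula)."""
    mean_obs = sum(observed) / len(observed)
    return 100.0 * rmsd(observed, simulated) / mean_obs
```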
Copyright (c) 2026 Spectrum of Engineering Sciences
2026-05-102026-05-1045583592 ATMOSPHERIC AEROSOL DYNAMICS AND THEIR IMPACT ON CLIMATE VARIABILITY AND AIR QUALITY IN URBAN REGIONS OF PAKISTAN
https://thesesjournal.com/index.php/1/article/view/2749
<p><em>Atmospheric aerosols have become a major environmental concern due to their significant impacts on climate variability, air quality, and public health, particularly in rapidly urbanizing developing countries such as Pakistan. This study investigated atmospheric aerosol dynamics and their influence on climate variability and urban air quality in major urban regions of Pakistan, including Lahore, Karachi, Islamabad, Faisalabad, and Peshawar. A quantitative and observational research design was employed using satellite remote sensing data, meteorological observations, and air quality measurements collected from 2020 to 2025. The study analyzed Aerosol Optical Depth (AOD), particulate matter concentrations (PM2.5 and PM10), temperature, humidity, and wind speed to evaluate aerosol distribution patterns and environmental impacts. Descriptive statistics, correlation analysis, and multiple regression analysis were applied to assess relationships among study variables. The findings revealed high aerosol loading and elevated particulate matter concentrations across urban regions of Pakistan, indicating severe atmospheric pollution conditions. Correlation analysis demonstrated a strong positive relationship between aerosol concentrations and urban air pollution indicators, particularly PM2.5 and PM10. Meteorological factors such as temperature and humidity significantly influenced aerosol accumulation, while wind speed negatively affected particulate matter concentration through pollutant dispersion mechanisms. Seasonal analysis indicated that winter experienced the highest aerosol concentrations due to temperature inversion, biomass burning, industrial emissions, and stagnant atmospheric conditions. Regression results confirmed that atmospheric aerosols significantly contributed to climate variability and deteriorating urban air quality. 
The study concluded that atmospheric aerosol pollution poses serious environmental, climatic, and public health challenges in Pakistan. The findings emphasize the need for effective air pollution control policies, enhanced aerosol monitoring systems, sustainable urban planning, and climate adaptation strategies to reduce environmental degradation and improve public health conditions in urban Pakistan.</em></p>Zubair KhanZeeshan Amir
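The correlation analysis described above (for example, between AOD and PM2.5) is, at its core, a Pearson coefficient. A self-contained stdlib sketch follows; the AOD and PM2.5 values are purely illustrative, not the study's data.

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical AOD and PM2.5 (µg/m³) readings, illustrative only:
aod  = [0.4, 0.6, 0.9, 1.1, 1.3]
pm25 = [60.0, 85.0, 120.0, 150.0, 170.0]
r = pearson(aod, pm25)   # strong positive association, as the study reports
```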
Copyright (c) 2026
2026-05-112026-05-1145593607EXPLAINABLE AI-BASED INTRUSION DETECTION FRAMEWORK FOR CRITICAL INFRASTRUCTURE PROTECTION IN PAKISTAN’S DIGITAL ECOSYSTEM
https://thesesjournal.com/index.php/1/article/view/2751
<p><em>The rapid expansion of digital infrastructure in Pakistan has increased the vulnerability of critical systems such as banking, healthcare, energy, telecommunications, and government services to sophisticated cyber threats. Traditional intrusion detection systems (IDS) are increasingly insufficient due to their limited adaptability, high false-positive rates, and inability to detect zero-day attacks. Although Artificial Intelligence (AI) and Machine Learning (ML)-based IDS models have improved detection accuracy, their “black-box” nature limits transparency, trust, and operational acceptance in critical infrastructure environments. To address these challenges, this study proposed an Explainable AI-Based Intrusion Detection Framework designed to enhance both cybersecurity performance and interpretability. The framework integrated advanced machine learning algorithms with Explainable Artificial Intelligence (XAI) techniques, including SHAP and LIME, to provide transparent and interpretable threat detection. The model was evaluated using benchmark datasets and expert assessments from cybersecurity professionals. Results demonstrated that the proposed framework achieved superior performance in terms of accuracy, precision, recall, and false-positive reduction compared to traditional models. Additionally, expert evaluations confirmed high levels of interpretability, transparency, and trust in AI-driven decisions. The findings highlight that integrating explainability into IDS frameworks significantly strengthens cybersecurity resilience and supports informed decision-making in critical infrastructure environments. The study concludes that XAI-based intrusion detection systems offer a reliable and scalable solution for protecting Pakistan’s evolving digital ecosystem.</em></p> Usman EhsanMuhammad Atif Altaf
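SHAP/LIME-style attributions, as used above, rank input features by their contribution to a single prediction. As a toy stand-in (not the paper's implementation, and far simpler than SHAP's game-theoretic averaging), here is an occlusion-style attribution on a hypothetical linear risk scorer; for a linear model, these per-feature contributions are exact.

```python
def risk_model(x):
    """Hypothetical linear intrusion-risk scorer over three features
    (the weights 3, 2, -1 are illustrative, not from the paper)."""
    return 3.0 * x[0] + 2.0 * x[1] - 1.0 * x[2]

def occlusion_attribution(x, baseline=(0.0, 0.0, 0.0)):
    """Contribution of each feature: the score change when only that
    feature departs from the baseline (exact for linear models)."""
    base = risk_model(baseline)
    out = []
    for i in range(len(x)):
        probe = list(baseline)
        probe[i] = x[i]
        out.append(risk_model(probe) - base)
    return out
```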
Copyright (c) 2026
2026-05-112026-05-1145608620BAYESIAN SPATIOTEMPORAL MODELING OF CLIMATE-INDUCED AGRICULTURAL YIELD VARIABILITY IN PAKISTAN UNDER DATA SCARCITY CONDITIONS
https://thesesjournal.com/index.php/1/article/view/2752
<p><em>Climate variability has emerged as a major challenge to agricultural productivity in developing countries, particularly in Pakistan, where agricultural systems remain highly dependent on climatic conditions and are often characterized by limited and incomplete datasets. This study investigated climate-induced agricultural yield variability in Pakistan using a Bayesian spatiotemporal modeling framework under data scarcity conditions. The study integrated climatic indicators, including temperature, precipitation, drought severity, evapotranspiration, and vegetation health indices, to examine their spatial and temporal effects on agricultural productivity across different agro-ecological regions of Pakistan. Secondary data were obtained from meteorological databases, satellite-derived remote sensing sources, and agricultural statistics covering multiple districts and time periods. A Bayesian hierarchical spatiotemporal model was employed to address uncertainty, missing observations, spatial dependence, and temporal variability simultaneously. The findings revealed that rising temperatures and drought intensity significantly reduced agricultural yields, whereas rainfall and vegetation health indicators positively influenced crop productivity. Significant spatial and temporal dependencies were also identified, indicating substantial regional heterogeneity in climate-agriculture relationships. The Bayesian framework demonstrated strong predictive accuracy and robustness under incomplete data conditions, outperforming conventional statistical approaches in handling uncertainty and heterogeneous datasets. The study contributes methodologically by advancing probabilistic climate-agriculture modeling and practically by providing evidence-based insights for climate adaptation, agricultural forecasting, and food security planning in Pakistan. 
The findings support the adoption of Bayesian and geospatial analytical approaches for sustainable agricultural management in climate-vulnerable and data-constrained environments.</em></p>Ishaq khanSadam Hussain MughalEman
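To illustrate the kind of Bayesian updating underlying the hierarchical framework described above (a deliberately simple toy, not the paper's spatiotemporal model), here is the conjugate update for a normal mean with known observation variance. It stays well defined even with very few observations, which is the point of Bayesian methods under data scarcity.

```python
def posterior_normal_mean(data, mu0, tau0_sq, sigma_sq):
    """Posterior mean and variance of a normal mean, given observations
    with known variance sigma_sq and prior N(mu0, tau0_sq)."""
    n = len(data)
    prior_prec = 1.0 / tau0_sq          # prior precision
    lik_prec = n / sigma_sq             # data precision
    post_var = 1.0 / (prior_prec + lik_prec)
    post_mean = post_var * (prior_prec * mu0 + sum(data) / sigma_sq)
    return post_mean, post_var

# Two hypothetical yield anomalies and a vague-ish prior (numbers illustrative):
mean, var = posterior_normal_mean([2.0, 2.0], mu0=0.0, tau0_sq=1.0, sigma_sq=1.0)
```

The posterior shrinks toward the prior when data are scarce and toward the sample mean as observations accumulate, which is the behavior that makes hierarchical Bayesian models robust with incomplete records.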
Copyright (c) 2026
2026-05-112026-05-1145621636PEROVSKITE-BASED NANOMATERIALS FOR HIGH-EFFICIENCY SOLAR CELLS UNDER PAKISTAN’S CLIMATIC CONDITIONS
https://thesesjournal.com/index.php/1/article/view/2753
<p><em>The increasing demand for sustainable and high-efficiency renewable energy technologies has intensified global interest in perovskite-based solar cells due to their exceptional photovoltaic properties, low fabrication cost, and tunable optoelectronic characteristics. However, the operational stability of perovskite solar cells remains a major challenge under harsh environmental conditions characterized by high temperature, humidity, and dust exposure, particularly in developing countries such as Pakistan. This study investigated the effectiveness of perovskite-based nanomaterials in enhancing the efficiency, thermal stability, and environmental durability of solar cells under Pakistan’s climatic conditions. A quantitative experimental research design was employed using laboratory-fabricated photovoltaic samples consisting of conventional perovskite solar cells, mixed-cation moisture-resistant perovskite cells, and encapsulated nanostructured perovskite devices. Data were collected through photovoltaic characterization, thermal stress testing, humidity exposure analysis, and environmental simulation techniques. Statistical analyses, including descriptive statistics, correlation analysis, regression analysis, and one-way ANOVA, were conducted to evaluate the relationship between nanomaterial properties and photovoltaic performance.</em></p> <p><em>The findings revealed that advanced nanostructural engineering, surface passivation, and encapsulation significantly improved power conversion efficiency and reduced environmental degradation under high-temperature and high-humidity conditions. Mixed-cation and encapsulated perovskite solar cells demonstrated superior thermal stability, moisture resistance, and operational durability compared to conventional photovoltaic structures. 
The study concluded that perovskite-based nanomaterials possess substantial potential for developing cost-effective, high-efficiency, and climate-resilient solar technologies suitable for Pakistan’s environmental conditions. The research contributes to the advancement of renewable energy materials and provides practical insights for sustainable photovoltaic development in emerging economies.</em></p>Mohammad Arif GoyaShamsher Khan
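The one-way ANOVA mentioned above compares between-group and within-group variability via the F statistic. A stdlib sketch follows; the group data in the test are toy numbers, not the study's efficiency measurements.

```python
def one_way_anova_f(groups):
    """F statistic for one-way ANOVA over a list of sample groups,
    e.g. efficiencies of the three device types in the study."""
    n_total = sum(len(g) for g in groups)
    k = len(groups)
    grand = sum(sum(g) for g in groups) / n_total
    # between-group sum of squares (df = k - 1)
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    # within-group sum of squares (df = n_total - k)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n_total - k))
```

A large F relative to the F-distribution's critical value indicates that at least one group mean differs, which is how the study would establish that device type affects performance.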
Copyright (c) 2026
2026-05-112026-05-1145637653INTEGRATING ARTIFICIAL INTELLIGENCE WITH MATHEMATICAL MODELING AND GRAPH THEORY FOR SOLVING HIGH-DIMENSIONAL OPTIMIZATION AND PREDICTION PROBLEMS IN COMPLEX NETWORK SYSTEMS
https://thesesjournal.com/index.php/1/article/view/2756
<p><em>The increasing complexity of modern networked systems, including communication infrastructures, transportation networks, biological systems, and social networks, has created significant challenges in solving high-dimensional optimization and prediction problems. Traditional analytical and heuristic methods often struggle to scale efficiently due to the exponential growth of state spaces and complex interdependencies among network components. This study proposes an integrated framework that combines Artificial Intelligence (AI), mathematical modeling, and graph theory to address these challenges in complex network systems. The proposed framework utilizes graph-based representations to model structural and dynamic relationships within networks while incorporating machine learning and deep learning techniques, particularly Graph Neural Networks (GNNs), to capture nonlinear patterns and hidden dependencies in high-dimensional data. The framework further integrates metaheuristic optimization methods, convex and non-convex optimization techniques, and reinforcement learning–based decision-making to improve resource allocation, routing optimization, and predictive inference. Mathematical modeling is employed to define objective functions, system constraints, and optimization structures for efficient problem formulation. Experimental evaluations demonstrate that the proposed hybrid framework achieves improved prediction performance, optimization efficiency, scalability, and adaptability compared to conventional approaches. The integration of AI-driven learning with graph-theoretic modeling also enhances performance in dynamic and uncertain environments, making the framework suitable for real-time applications.</em></p> <p><em>The findings demonstrate that the combination of AI, mathematical modeling, and graph theory provides a scalable and flexible solution for intelligent network analytics in large-scale systems. 
The proposed framework has potential applications in smart cities, IoT networks, energy systems, transportation infrastructures, and cybersecurity environments. Future work will focus on integrating Explainable AI (XAI) techniques and distributed computing paradigms to further improve interpretability, scalability, and real-time deployment.</em></p>Tanveer AhmadMuhammad MajidRabia EssaMuhammad Javed AyubImad AliAshraf Zia
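At each layer, the graph-based learning described above amounts to aggregating features over a node's neighborhood. A minimal stdlib sketch of one mean-aggregation round follows; the tiny graph and feature values are illustrative stand-ins for a GNN's learned message passing.

```python
def mean_aggregate(adj, feats):
    """One message-passing round: each node's new feature is the mean
    of its own and its neighbors' current features."""
    new = {}
    for v, nbrs in adj.items():
        vals = [feats[v]] + [feats[u] for u in nbrs]
        new[v] = sum(vals) / len(vals)
    return new

adj = {0: [1, 2], 1: [0], 2: [0]}     # a 3-node star graph
feats = {0: 1.0, 1: 2.0, 2: 3.0}      # illustrative node features
h1 = mean_aggregate(adj, feats)
```

Real GNNs interleave such aggregation with learned linear maps and nonlinearities; stacking rounds propagates information across longer paths in the network.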
Copyright (c) 2026
2026-05-112026-05-1145654682DETERMINANTS OF MULTI-FACTOR AUTHENTICATION ADOPTION AMONG PAKISTANI INTERNET USERS: A SURVEY-BASED STUDY
https://thesesjournal.com/index.php/1/article/view/2762
<p>Multi-factor authentication (MFA) refers to an electronic authentication method that requires more than one verification factor. Despite its effectiveness against cyber threats, MFA is still not widely adopted, and this is especially the case in Pakistan. This research examines the principal factors influencing the intention to adopt MFA among university students and faculty in Pakistan. Data were collected from respondents at academic institutions through a structured questionnaire. The researchers analyzed five independent variables, including perceived security benefit, perceived ease of use, security awareness, and social influence. The data were analyzed using descriptive statistics, reliability testing, and correlation analysis. The results show that all five independent variables favorably influence MFA adoption intention. Perceived security advantage and social influence appear to be the most prominent drivers, while perceived ease of use reflects the usability challenges users associate with MFA. The research concludes with workable suggestions for educators, service providers, and legislators to promote wider adoption of MFA in Pakistan.</p> <p>Keywords : <em>Multi-Factor Authentication, Cybersecurity Adoption, Pakistani Internet Users, Security Awareness, Digital Security</em></p>Muhammad Faseeh AnsariAsjad AbbasHamza AliMohsin Nasir
Copyright (c) 2026 Spectrum of Engineering Sciences
2026-05-112026-05-1145683705AN OPTIMIZED MICROSTRIP PATCH ANTENNA DESIGN WITH ENHANCED BANDWIDTH FOR 5G COMMUNICATION AND MEDICAL APPLICATIONS
https://thesesjournal.com/index.php/1/article/view/2763
<p>To meet higher bandwidth requirements, the rapid development of 5G connectivity has necessitated a move toward millimeter-wave (mm-wave) frequencies. Because of their small profile, ease of manufacture, and operation at high-frequency bands, microstrip patch antennas (MPAs) are becoming increasingly popular for a variety of applications. The design and development of a high-performance microstrip patch antenna operating in the 5G mm-wave spectrum (>24 GHz) is the main topic of this work, with a particular focus on 5G communication and possible use in early-stage tumor detection. For this purpose, the designed model was tuned across the 27 GHz to 28 GHz resonance frequency range using CST Microwave Studio for thorough, optimized modeling and simulation. The simulation results show good performance characteristics, with a Return Loss (S11) that is consistently between -19 dB and -45 dB and a Voltage Standing Wave Ratio (VSWR) that ranges from 1.22 to 1.01. These results suggest that the optimized model offers the high sensitivity and impedance matching needed for sophisticated medical sensing as well as next-generation communication systems.</p> <p>Keywords</p> <p><em>Tumor Imaging, Return Loss “S11”, Voltage Standing Wave Ratio “VSWR”, CST Microwave Studio, Millimeter-wave (mm-wave), Microstrip Patch Antenna, 5G Connectivity</em></p>*Zaheer Hussain AbbasiShams ParveenDeedar Ali JamroFozia K Soomro
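The reported S11 and VSWR figures are linked by the standard relation VSWR = (1 + |Γ|) / (1 − |Γ|) with |Γ| = 10^(S11/20); a quick sketch:

```python
def vswr_from_s11(s11_db):
    """VSWR from return loss S11 in dB (negative for a matched antenna)."""
    gamma = 10 ** (s11_db / 20)          # reflection coefficient magnitude
    return (1 + gamma) / (1 - gamma)
```

At S11 = -45 dB this gives a VSWR of about 1.011, consistent with the best-case value of 1.01 reported above; better matching (more negative S11) always yields VSWR closer to 1.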
Copyright (c) 2026 Spectrum of Engineering Sciences
2026-05-112026-05-1145706715FROM AI TO AGI: THE NEXT EVOLUTION OF LIBRARIES AND INFORMATION SERVICES
https://thesesjournal.com/index.php/1/article/view/2764
<p>Artificial Intelligence (AI) has significantly transformed library and information services by enhancing information retrieval, automating cataloging, improving research support, and enabling intelligent user services. Recently, advancements in Generative AI, Agentic AI, and Artificial General Intelligence (AGI) have opened new possibilities for autonomous, adaptive knowledge systems in libraries. This paper explores the transition from AI to AGI and its implications for libraries and information services. The study discusses the evolution of intelligent library ecosystems, current AI applications, potential AGI-driven services, emerging librarian competencies, and associated ethical and organizational challenges. Furthermore, the paper highlights future research directions and emphasizes the importance of strategic planning, AI literacy, and ethical governance for the sustainable adoption of AGI in libraries. The findings suggest that AGI may redefine libraries from traditional information repositories into intelligent and autonomous ecosystems capable of delivering highly personalized, efficient, and innovative services.</p> <p>Keywords : <em>Artificial General Intelligence, AGI, Academic Libraries, Smart Libraries, AI Literacy, Intelligent Information Systems, Agentic AI</em></p>Muhammad Kabir Khan* Tahir Yasmin
Copyright (c) 2026 Spectrum of Engineering Sciences
2026-05-112026-05-1145716724A SCALABLE AI AND CLOUD-BASED FRAMEWORK FOR SMART AUTOMATION IN IOT NETWORKS
https://thesesjournal.com/index.php/1/article/view/2767
<p><strong>Background</strong></p> <p>The rapid evolution of the Internet of Things (IoT), together with Artificial Intelligence (AI) and cloud computing, has transformed modern digital ecosystems, making them intelligent, connected, and automated. However, classical IoT systems lack scalability, real-time processing, and efficient data management, which necessitates more advanced solutions.</p> <p><strong>Objective</strong></p> <p>This paper seeks to discuss and analyze a scalable AI and cloud-based framework for smart automation in IoT networks, covering its effectiveness, scalability, the challenges associated with its implementation, and its potential for future adoption.</p> <p><strong>Methodology</strong></p> <p>A quantitative research methodology was employed, based on a structured questionnaire sent to 300 professionals, including IoT engineers, data scientists, cloud experts, IT managers, and researchers. The data were analyzed using descriptive statistics, including mean, standard deviation, frequency, and percentage, and reliability was evaluated using Cronbach's Alpha to ensure internal consistency.</p> <p><strong>Results</strong></p> <p>The research shows a high level of support for the use of AI and cloud computing in IoT-based automation systems. The data are highly reliable (Cronbach's alpha = 0.931), and respondents report strong awareness and adoption of these technologies. AI-based automation improves efficiency and reliability in decision-making, while cloud computing strengthens data management and scalability. Nevertheless, challenges remain, such as data privacy, security threats, and implementation cost. Overall, the outlook is optimistic, and the integration of AI and IoT is likely to drive significant technological improvements.</p> <p><strong>Conclusion</strong></p> <p>The paper concludes that AI and cloud-based IoT solutions can deliver substantial advantages for smart automation and for improving system performance. However, to be sustainable, security, cost, and scalability issues must be addressed. The findings have significant implications for the intelligent and scalable design of IoT solutions for researchers, practitioners, and policymakers.</p>Waqas AhmedMuhammad ShoaibAli RazaToseef Naser Khan
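Cronbach's alpha, used above to assess internal consistency, is computed as k/(k−1) · (1 − Σ item variances / variance of totals). A stdlib sketch with toy responses (not the study's survey data):

```python
import statistics

def cronbach_alpha(items):
    """Cronbach's alpha. `items` holds one list of scores per
    questionnaire item, with respondents in the same order."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]   # per-respondent totals
    sum_item_var = sum(statistics.variance(it) for it in items)
    return (k / (k - 1)) * (1 - sum_item_var / statistics.variance(totals))
```

Perfectly parallel items give alpha = 1; weakly related items drive it down, so a value like 0.931 indicates the scale's items measure a common construct.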
Copyright (c) 2026 Spectrum of Engineering Sciences
2026-05-112026-05-1145725742SOYBEAN LEAF DISEASE CLASSIFICATION USING MOBILENETV2 WITH LIGHTWEIGHT DEEP LEARNING IN RESOURCE CONSTRAINED AGRICULTURAL ENVIRONMENT
https://thesesjournal.com/index.php/1/article/view/2768
<p><em>Soybean is one of the most significant oilseed crops in the world and is highly susceptible to a number of leaf diseases that cause losses in quality and yield. Early and accurate disease detection is vital for successful disease management and sustainable agricultural production. In this study, a lightweight deep learning framework based on MobileNetV2 is proposed for automatic soybean leaf disease diagnosis in resource-constrained agricultural settings. Images of 2000 soybean leaves were captured from agricultural fields in Pakistan, and a region-specific dataset named SDD-2025 was developed. The data are split into three disease classes: Charcoal Rot, Mosaic Virus, and Pest Infestation. Augmentation methods such as rotation, flipping, zooming, and brightness adjustment were used to help prevent overfitting and make the model more generalizable. Feature extraction and classification were performed using transfer learning with a pre-trained MobileNetV2 architecture. The dataset was split in the ratio 70:15:15 for training, validation, and testing respectively. The experimental results were evaluated by calculating accuracy, precision, recall, F1 score, and ROC analysis. The proposed model classified the soybean diseases with 96.14% accuracy, which shows the effectiveness of lightweight CNNs in recognizing soybean diseases. The trained framework yields an automated and efficient early disease diagnosis solution that requires minimal human involvement. Moreover, the model's light weight makes it well suited to real-time agricultural monitoring and smart farming applications. Although the study results are positive, the dataset size and geographical diversity are limited. To make the framework more robust and general, it will be crucial to expand the dataset, add more disease classes, and test it under various climatic conditions.</em></p>Qamer Un NisaMuhamamd Usman JaveedAsim Ali RaoMuhammad NaumanWaheed Yousuf RamayZaira MarriamRabia Rasool
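The accuracy, precision, recall, and F1 figures reported above all derive from confusion-matrix counts. A minimal binary-case sketch follows (the multi-class study would average these per class, and the counts in the test are illustrative):

```python
def classification_metrics(tp, fp, fn, tn):
    """Standard binary classification metrics from confusion-matrix counts:
    tp = true positives, fp = false positives,
    fn = false negatives, tn = true negatives."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    return {"accuracy": accuracy, "precision": precision,
            "recall": recall, "f1": f1}
```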
Copyright (c) 2026
2026-05-122026-05-1245743759HYBRID SEMI SUPERVISED MULTIMODAL YOLO11 FRAMEWORK FOR ROBUST SOLAR PHOTOVOLTAIC PANEL DEFECT DETECTION
https://thesesjournal.com/index.php/1/article/view/2769
<p><em>Solar photovoltaic systems have become one of the most important renewable energy technologies for sustainable power generation. However, photovoltaic panel defects such as cracks, hotspots, thick lines, and broken fingers significantly reduce energy conversion efficiency and increase operational maintenance costs. Existing deep learning based photovoltaic defect detection systems still suffer from several limitations, including dependence on fully labeled datasets, weak robustness under environmental disturbances, insufficient small-defect detection capability, and high computational complexity. To address these challenges, this paper proposes a Hybrid Semi-Supervised Multimodal YOLO11 Framework for robust solar photovoltaic panel defect detection under real-world environmental conditions. The proposed framework integrates semi-supervised pseudo-label learning, adaptive multimodal RGB and thermal feature fusion, lightweight YOLO11 optimization, environmental robustness enhancement, and explainable attention visualization within a unified architecture. The semi-supervised learning mechanism improves rare-defect representation using unlabeled photovoltaic data, while the adaptive multimodal fusion strategy combines structural and thermal information to improve hidden-defect localization. Experimental results demonstrate that the proposed framework achieves superior performance compared with existing photovoltaic defect detection methods. The proposed model achieved a precision of 92.8 percent, recall of 90.1 percent, and mean average precision of 93.6 percent while maintaining low parameter complexity and efficient inference speed. Environmental robustness experiments further confirmed stable performance under illumination variation, shadow interference, thermal noise, and dust distortion conditions. The explainable visualization module improved transparency by highlighting the defect regions responsible for model predictions. Overall, the proposed framework provides an accurate, lightweight, interpretable, and deployment-efficient solution for intelligent photovoltaic defect detection systems operating in real-world environments.</em></p>Asif Khalid QureshiSyed Faraz AfsarSarang AhmedAli MuhammadAriz Muhammad BrohiMuhammad Tahir
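Detection metrics such as the mean average precision reported above rest on intersection-over-union (IoU) between predicted and ground-truth boxes; a stdlib sketch:

```python
def iou(a, b):
    """IoU of two axis-aligned boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0
```

A detection typically counts as a true positive only when its IoU with a ground-truth defect box exceeds a threshold (commonly 0.5), and precision-recall curves over those matches yield the average precision per defect class.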
Copyright (c) 2026
2026-05-112026-05-1145760794ZTFORENSICS: ZERO TRUST POLICY ENFORCEMENT WITH TAMPER-EVIDENT FORENSIC EVIDENCE PACKAGING FOR HYBRID CLOUD ENVIRONMENTS
https://thesesjournal.com/index.php/1/article/view/2765
<p>Hybrid cloud environments reveal a serious security gap: perimeter-based access control models provide one-shot authentication, with no continuous verification and no tamper-proof evidence collection. If an adversary gains valid credentials, they can exfiltrate data and then delete or manipulate audit logs, undermining forensic accountability. Current Zero Trust Architecture (ZTA) solutions focus on enforcement and don’t necessarily tie every access decision to a legally admissible, cryptographically bound forensic record. This paper introduces an integrated framework, called ZTForensics, which enforces Zero Trust policy decisions in real time and generates hash-chained forensic evidence records for every access event using the SHA-256 hash function. Enforcement is performed by a FastAPI gateway, Open Policy Agent (OPA) with Rego policies, Keycloak identity management, PostgreSQL evidence storage, and MinIO object packaging. Seven contextual risk factors are applied to each request, and the outcome of the decision (allow, deny, or challenge) is recorded as an integrity-linked forensic record. Long-term evidence integrity is provided by RSA-signed anchors and a chain-verification endpoint. In a simulated banking environment, correct decisions are validated, tamper detection across several attack vectors is confirmed with no false positives, and the evidence bundle export is structured according to legal rules. ZTForensics bridges reactive forensic collection and proactive decision-making, creating trustworthy forensic evidence within cloud API environments.</p>Ali ShanMuhammad AhmadHassaan IqbalAli Sufyan
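The hash-chained evidence mechanism described above can be sketched in a few lines. This is a simplified model of the idea, omitting the RSA-signed anchors and the storage/packaging layers; the record field names are illustrative, not ZTForensics' schema.

```python
import hashlib
import json

GENESIS = "0" * 64   # sentinel "previous hash" for the first record

def append_record(chain, record):
    """Append an access-decision record, linking it to the previous hash."""
    prev = chain[-1]["hash"] if chain else GENESIS
    payload = json.dumps(record, sort_keys=True)   # canonical serialization
    digest = hashlib.sha256((prev + payload).encode()).hexdigest()
    chain.append({"record": record, "prev": prev, "hash": digest})

def verify_chain(chain):
    """Recompute every link; editing any record or hash breaks the chain."""
    prev = GENESIS
    for entry in chain:
        payload = json.dumps(entry["record"], sort_keys=True)
        expected = hashlib.sha256((prev + payload).encode()).hexdigest()
        if entry["prev"] != prev or entry["hash"] != expected:
            return False
        prev = entry["hash"]
    return True

chain = []
append_record(chain, {"user": "alice", "decision": "allow", "risk": 2})
append_record(chain, {"user": "bob", "decision": "deny", "risk": 7})
```

Because each hash covers the previous one, an attacker who alters or deletes any record must recompute every subsequent hash, which the periodic signed anchors would expose.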
Copyright (c) 2026 Spectrum of Engineering Sciences
2026-05-122026-05-1245795810ADVANCED DEEP LEARNING METHODS TO ACCURATELY DETECT AND CLASSIFY PLANT DISEASE
https://thesesjournal.com/index.php/1/article/view/2725
<p><em>Plant diseases pose a major challenge to agricultural productivity, food security, and farmers' livelihoods worldwide. Conventional detection techniques that rely heavily on human judgment are usually time-consuming, inaccurate, and unavailable in rural communities. In recent years, machine learning and deep learning have made it possible to develop automated plant disease detection systems that identify and classify diseases accurately from leaf images. This paper discusses how computational methods can diagnose plant diseases through analysis of visual symptoms, including leaf discoloration, spots, and textural features. We trained a convolutional neural network (CNN) on a set of plant leaf images labelled with different diseases to identify the diseases of the crop species in the dataset. The suggested system is highly accurate and scalable, and it offers a practical route to early disease detection and prompt intervention. Such technology can contribute substantially to precision agriculture, reduce pesticide waste, and enhance real-time monitoring of crop health.</em></p> <p><a href="https://doi.org/10.5281/zenodo.20132276" target="_blank" rel="noopener">https://doi.org/10.5281/zenodo.20132276</a></p>*Farhan Ali
Copyright (c) 2026 Spectrum of Engineering Sciences
2026-05-092026-05-0945811850DETECTION OF VULNERABILITIES IN AI-GENERATED SOFTWARE CODE USING TRANSFORMER-BASED MODELS
https://thesesjournal.com/index.php/1/article/view/2776
<p>The convenience of AI code-generation technologies has increased software development productivity, but it also introduces a higher risk of insecure code. Code generated by AI is often syntactically correct but semantically vulnerable, posing significant cybersecurity risks for enterprise, critical-infrastructure, and consumer software environments. This research is qualitative, doctrinal, and exploratory in nature, based on an interpretivist epistemological approach, and focuses on identifying vulnerabilities in AI-generated software code using transformer-based models. Secondary data were collected systematically from scholarly journal articles, conference proceedings, cybersecurity databases, and trusted sources such as the Common Weakness Enumeration (CWE) and Common Vulnerabilities and Exposures (CVE). Thematic content analysis and comparative qualitative analysis were applied to transformer architectures (BERT, CodeBERT, GraphCodeBERT, GPT-4, UniXcoder) for the detection, classification, and explanation of security vulnerabilities (SQL injection, buffer overflow, cross-site scripting (XSS), insecure authentication, and memory management errors). The results demonstrate that transformer-based models significantly outperform classical static analysis tools in detection accuracy, and that models pre-trained on code corpora and enriched with data-flow graph representations of programs are the most effective. GraphCodeBERT and CodeBERT are found to be especially suitable for vulnerability classification, and GPT-4 exhibits explainability through natural-language reasoning. The study highlights persistent challenges such as false-positive rates, a lack of diversity in training material, and poor explainability in certain architectures. The paper discusses practical implications for developers, security engineers, and organizational policymakers, and makes recommendations for integrating transformer-based detection into DevSecOps pipelines.</p>Bashir KhanDr. Amber Sarwar HashmiSumaira RasoolSalman Khan
Copyright (c) 2026 Spectrum of Engineering Sciences
2026-05-122026-05-1245851867THE HIDING OF IMAGE DATA USING S-BOXES BASED ON LINEAR FRACTIONAL TRANSFORMATION
https://thesesjournal.com/index.php/1/article/view/2778
<p>Digital image transmission is practiced as an indispensable approach to information channeling across the board. The credibility of images in the course of conveyance has now become a crucial assignment to work upon. Thereupon, significantly more attention has been devoted to contriving the non-linear component known as the substitution box (S-box), which possesses the caliber to withstand unauthorized intrusion. The suggested piece of work comprises two portions: the first part proposes a technique to generate a secure non-linear component of a block cipher (S-box) underpinned by both the Galois field and the Möbius transformation. Besides, the elements of GF(2⁸) created via particular primitive irreducible polynomials are used for the realization of the Möbius transformation. In the other portion, the S-box is applied to digital image encryption centered on the Advanced Encryption Standard (AES) in MATLAB. Ultimately, the competence of the designed S-box is scrutinized by bringing into effect multiple procedures from the literature, to wit: the strict avalanche criterion, nonlinearity, linear approximation probability, histogram analysis, bit independence criterion, and differential approximation probability.</p>Muhammad Asif
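The construction named in the abstract, an S-box from a linear fractional (Möbius) map over GF(2⁸), can be sketched as follows. This is an illustrative sketch, not the paper's construction: it assumes the AES reduction polynomial x⁸+x⁴+x³+x+1 (0x11B) and arbitrary example coefficients, whereas the paper uses its own primitive irreducible polynomials and parameters.

```python
# GF(2^8) arithmetic modulo the AES polynomial 0x11B (an assumption here;
# the paper's chosen primitive irreducible polynomial may differ).
def gf_mul(a: int, b: int, poly: int = 0x11B) -> int:
    r = 0
    while b:
        if b & 1:
            r ^= a
        a <<= 1
        if a & 0x100:
            a ^= poly
        b >>= 1
    return r

def gf_inv(a: int) -> int:
    # a^(2^8 - 2) = a^254 is the multiplicative inverse for a != 0
    r = 1
    for _ in range(254):
        r = gf_mul(r, a)
    return r

def mobius_sbox(a: int, b: int, c: int, d: int) -> list:
    """S-box from x -> (a*x + b)/(c*x + d) over GF(2^8).
    The pole of the map is sent to a/c, which keeps the map bijective
    (the Möbius map permutes the projective line GF(2^8) U {inf})."""
    assert gf_mul(a, d) ^ gf_mul(b, c)  # determinant ad - bc nonzero
    box = []
    for x in range(256):
        denom = gf_mul(c, x) ^ d
        if denom == 0:                      # x is the pole: use the image of inf
            box.append(gf_mul(a, gf_inv(c)))
        else:
            box.append(gf_mul(gf_mul(a, x) ^ b, gf_inv(denom)))
    return box

sbox = mobius_sbox(a=29, b=15, c=8, d=9)   # illustrative coefficients
assert sorted(sbox) == list(range(256))    # the map is a permutation of GF(2^8)
```

A bijective S-box is the minimum requirement; the avalanche, nonlinearity, and approximation-probability tests listed in the abstract would then be run on a table like this one.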
Copyright (c) 2026 Spectrum of Engineering Sciences
2026-05-122026-05-1245868886MODEL-BASED OPTIMIZATION OF CARRIER ANAEROBIC BAFFLED REACTOR FOR MUNICIPAL WASTEWATER UNDER VARIABLE TEMPERATURES
https://thesesjournal.com/index.php/1/article/view/2781
<p><em>Variations in operating temperature can significantly affect the treatment behavior of an anaerobic baffled reactor (ABR), while experimental evaluation at low temperatures is challenging and resource-intensive. This study employed process modeling to assess and optimize the performance of a carrier anaerobic baffled reactor (CABR) across a range of temperatures. The model was calibrated using organics removal data and validated for suspended solids (SS) removal at HRTs of 24 to 6 h, with average absolute relative errors of 11.1% and 13.3%, respectively, within the acceptable 7–15% range. The validated model was then applied to predict the organics and SS removal at temperatures of 35, 32, 28, and 20 °C and HRTs from 24 to 6 h, identifying the optimum HRTs required to meet effluent standards. The results demonstrated that carrier media effectively enhance biomass retention, enabling efficient treatment at reduced HRTs, and provide a cost-effective strategy for compact and reliable decentralized wastewater treatment under variable operational temperatures.</em></p>Nadeem UllahMuhammad Wasim
Copyright (c) 2026
2026-05-132026-05-1345887907QUANTITATIVE EVALUATION OF MOTION CORRECTION ALGORITHMS IN MYOCARDIAL PERFUSION SPECT IMAGING USING CLINICAL AND SIMULATED DATA
https://thesesjournal.com/index.php/1/article/view/2782
<p><em>Patient motion remains a persistent challenge in myocardial perfusion SPECT imaging, often leading to image degradation and misinterpretation of perfusion defects. When motion occurred in patient studies, alterations appeared in the perfusion defects of the anterior and inferior walls. Various algorithms are used to correct motion during imaging. In this study, three motion correction techniques, namely Stasis, Hopkins, and Motion Detection and Correction (MDC), were evaluated using both simulated motion patterns and clinical data. Image reconstruction was performed using the ordered subset expectation maximization (OSEM) algorithm. Motion introduced artificial defects, particularly in the anterior and inferior myocardial regions. Among the evaluated techniques, MDC demonstrated superior performance, achieving an artifact reduction of approximately 85–92%, compared to 65–70% for Stasis and 50–60% for Hopkins. Statistical analysis using one-way ANOVA confirmed that the improvement was significant (p < 0.05). These findings suggest that MDC provides a reliable framework for motion correction and may improve diagnostic confidence in clinical SPECT imaging.</em></p>Ayesha HumnaAyesha FarooqAsma SaleemHira Nayab
Copyright (c) 2026
2026-05-132026-05-1345908918AI-POWERED SELF-DECISIVE ALGORITHM FOR TWO-STEP QUASI-NEWTON METHODS
https://thesesjournal.com/index.php/1/article/view/2783
<p><em>The rapid evolution of machine learning has introduced a wide range of challenging and significant optimization problems. Various algorithms have been developed and trained to obtain optimal solutions for diverse problems in science, engineering, medicine, and related fields through machine learning techniques. In this context, fast gradient-driven optimization algorithms have become essential for computationally efficient model training. This study investigates an AI-powered self-decisive algorithm based on image-processing techniques for solving nonlinear unconstrained optimization problems. Different skipping strategies and search-direction modification techniques are incorporated within the framework of two-step quasi-Newton methods. Two test functions with different dimensions and initial points are examined using the fixed-point approach. The numerical simulations support the selection of the most suitable strategy for the proposed self-decisive algorithm in two-step quasi-Newton methods.</em></p>Farah JaffarNudrat AamirSyed IbrahimRoohi LailaSidra AmanMuhammad Touseef Irshad
Copyright (c) 2026
2026-05-132026-05-1345919934BLOCKCHAIN TECHNOLOGY AND SUSTAINABLE DEVELOPMENT GOALS: A SURVEY OF OPPORTUNITIES AND CHALLENGES IN PAKISTAN
https://thesesjournal.com/index.php/1/article/view/2785
<p><em>As a decentralized distributed ledger system, blockchain technology has evolved into a game-changing instrument with the promise to greatly advance Pakistan's Sustainable Development Goals (SDGs) by improving transparency, authenticity, accountability, efficiency, and inclusion across governmental and private sectors. Blockchain-enabled decentralized tools can raise funds for sustainable infrastructure development and inclusive financial growth in Pakistan, addressing decent work and economic growth (SDG 8) and industry, innovation and infrastructure (SDG 9), while fostering authentic and transparent governance. The government's attempts to incorporate blockchain technology into the country's economy are evident in recent regulatory developments, such as the creation of Pakistan's pilot programs for digital currencies. Blockchain technologies promise to increase transparency in supply chain management, especially in agriculture, addressing Zero Hunger (SDG 2) and No Poverty (SDG 1) by lowering the fraud and inefficiencies that impede sustainable production and consumption (SDG 12). Blockchain-integrated smart grids in the energy industry can advance renewable and accessible energy (SDG 7) by enabling decentralized energy trading and effective management of renewable generation. Blockchain-based cross-border payment and remittance systems might enhance digital transactions and financial inclusion (SDG 10). Despite these opportunities, realizing blockchain's full potential for Pakistan's sustainable development goals will require addressing issues including capacity shortfalls, legislative uncertainty, and limits in technological infrastructure. This abstract emphasizes that although blockchain presents promising avenues for Pakistan to achieve the SDGs, its implementation requires integrated policy frameworks, investments in digital infrastructure, and inclusive governance.</em></p>Talha AhsanAbdul Haseeb MalikMuhammad JunaidQazi Ejaz AliWaheed Ur RehmanMuhammad Haseeb
Copyright (c) 2026
2026-05-132026-05-1345935975EXPERIMENTAL INVESTIGATION OF COMPRESSIVE, TENSILE, AND FLEXURAL STRENGTHS OF NORMAL CONCRETE VS. GEOPOLYMER CONCRETE (6 MOLARITY) CONTAINING FLY ASH, NAOH, AND NA₂SIO₃
https://thesesjournal.com/index.php/1/article/view/2790
<p><em>This study experimentally investigates the mechanical performance of 6 Molarity geopolymer concrete (GPC) made with Class F fly ash, sodium hydroxide (NaOH), and sodium silicate (Na₂SiO₃) under ambient curing conditions. The results are compared with M25-grade ordinary Portland cement (OPC) concrete. Compressive strength (ASTM C39) was measured at 7, 14, and 28 days; split tensile strength (ASTM C496) and flexural strength (ASTM C78) were assessed at 28 days. Workability (slump), hardened density, and estimated CO₂ emissions were also evaluated. The 6M GPC achieved a 28-day compressive strength of 31.2 MPa, reaching 95% of the OPC control (32.8 MPa). However, its 7-day strength was only 12.5 MPa (58% of OPC), indicating delayed strength gain. Split tensile strength of GPC (3.45 MPa) exceeded that of OPC (3.21 MPa) by 7.5%, and the tensile/compressive ratio was higher (0.111 vs. 0.098). Flexural strength of GPC (4.5 MPa) was 9.75% higher than OPC (4.1 MPa), with a 38% greater deflection at peak load (0.58 mm vs. 0.42 mm), demonstrating superior ductility and crack bridging. Slump was lower (70 mm vs. 85 mm) but workable with superplasticizer, while hardened density was 3.3% lower. Estimated CO₂ emissions per cubic metre were slightly higher for GPC (372 kg vs. 355 kg) due to chemical production, but the full lifecycle benefits include eliminating cement and repurposing fly ash.</em></p> <p><em>It is concluded that 6M geopolymer concrete under ambient curing is a viable sustainable alternative for structural applications requiring moderate compressive strength (25–30 MPa) and high tensile/flexural performance, provided early-age loading is not critical. The material is particularly suitable for foundations, pavements, and green building projects.</em></p>Waqar AliRuhal Pervez MemonShoaib AhmedHassan NawazMohammad Usama
Copyright (c) 2026
2026-05-132026-05-1345976988CRUSTAL-SCALE STRUCTURAL EVOLUTION AND SYN-TECTONIC SEDIMENTATION IN RIFT BASINS: INSIGHTS FROM ANALOG MODELING AND BASIN ANALYSIS
https://thesesjournal.com/index.php/1/article/view/2791
<p><em>Identifying syn-tectonic deposition, demonstrating its link to structures, and properly defining stratal ranges within an appropriate stratigraphic framework are all important for illuminating the timing and mechanics of continental deformation. With a better understanding of the relationship between rift basin growth and sediment infiltration paths, the success of locating resource potential near inactive rift borders should be enhanced. This research compiles existing data and gives recommendations for further research. The analysis revolves around the different approaches that have been used, including fold-and-thrust-belt studies, syn-tectonic sedimentation on crustal-scale structures, and analog modeling. Drainage entry points can form along hyperbolic-geometry basin margins if seismic activity exceeds deposition and excavation levels, pushing geological flow channels closer toward the slope. Streams and valleys may incise into transfer ramps at base-level lowstands, providing circulation conduits from the basin border into the basins. Faulting and fracturing were the primary determinants of alignment and morphology, while base-level modifications influenced circulation. Inflows are restricted to the feet of the relay ramps, where silt builds owing to reservoir geology, by flow barriers such as channels perpendicular to the ramp axis. Outflow on the basal plane may resume if depositional elevation exceeds cutting levels, and supply to previous depocenters may halt. During rift boundary creation, channel-ramp-bordered faults may join, causing relay ramps to be ruptured and buried. However, the impact of ongoing base-level change on escarpment segments and associated syn-rift formations is uncertain, and more research is needed.</em></p>Ahmed KhalidBin DengFateh AliAli ImranAhmed MasroorAhmed Mansoor
Copyright (c) 2026
2026-05-132026-05-13459891005A ROBUST DEEP LEARNING FRAMEWORK FOR PREDICTING ACADEMIC SUCCESS USING ADVANCED NEURAL ARCHITECTURES
https://thesesjournal.com/index.php/1/article/view/2792
<p><em>In recent years, data mining techniques have gained significant attention in educational institutions for improving the quality of education and enhancing academic decision-making processes. Accurate prediction of student academic performance plays a vital role in identifying students at risk of poor achievement and supports the development of effective educational strategies. Numerous studies have focused on predicting student performance at the higher education level, as academic success in earlier semesters strongly influences students’ future learning progress and retention. In semester-based educational systems, many students experience academic difficulties or fail to achieve satisfactory grades during the initial stages of higher education. Therefore, early prediction of student performance is essential for improving student retention and academic outcomes. Educational Data Mining (EDM) provides techniques for extracting meaningful information, hidden patterns, and valuable knowledge from large volumes of educational data. These extracted insights can be utilized to predict students’ future academic success and support timely interventions. The primary objective of this research is to evaluate student performance using multiple classification techniques and identify the model that achieves the highest predictive accuracy. The educational dataset used in this study is obtained from a Kaggle repository. The proposed methodology consists of several stages. Initially, the dataset undergoes preprocessing, including the removal of duplicate records and the handling of missing values through appropriate data imputation techniques. Subsequently, three classification algorithms are implemented using the Weka data mining tool: a deep learning-based Neural Network (NN) and the traditional machine learning techniques Random Forest (RF) and Support Vector Machine (SVM). To enhance feature quality and reduce dimensionality, Principal Component Analysis (PCA) is applied for optimized feature extraction. Furthermore, the performance of all classification models is evaluated using a training–testing split validation available in the Weka environment. The models are assessed using standard performance evaluation metrics, including training accuracy, testing accuracy, precision, recall, and F1-score. Experimental results indicate that the Neural Network and Random Forest classifiers outperform the SVM model in terms of predictive accuracy and overall classification performance.</em></p>Sundas IsrarMuhammad Sajid MaqboolDr. Israr HanifMuqadas NadeemAbdul BasitAiman Ali Batool
Copyright (c) 2026
2026-05-132026-05-134510061022 ARTIFICIAL INTELLIGENCE BASED CONTROL SYSTEMS FOR ROBOTICS AND RENEWABLE ENERGY APPLICATIONS
https://thesesjournal.com/index.php/1/article/view/2797
<p>The rapid development and growing complexity of modern electrical engineering systems have prompted the incorporation of artificial-intelligence-oriented control approaches, which can manage non-linear dynamics, system uncertainties, and changing operating environments effectively. In this manuscript, we present a detailed narrative review of recent progress in AI-based control methods, with specific emphasis on their implementation in robotics and renewable energy systems. The review discusses major intelligent control paradigms, among them artificial neural networks, fuzzy logic control, reinforcement learning, evolutionary optimization techniques, and hybrid intelligent control structures. The literature survey is organized by the main application areas, including robotic motion and trajectory control, renewable energy conversion systems, microgrid operation, and power electronic applications. Moreover, the paper gives a critical comparison of AI-based controllers against traditional control methods, including Proportional-Integral-Derivative (PID), Linear Quadratic Regulator (LQR), and H∞ control, highlighting differences in adaptability, robustness, computational load, and model transparency. Significant constraints associated with data accessibility, operational safety, explainability, and real-life deployment are also considered in detail. Lastly, we provide practical implications and prospective research directions that can support the development of dependable, effective, and scalable AI-based control solutions. The work is intended to serve as a useful reference for researchers and practitioners involved in applying intelligent control in robotics and renewable energy.</p>Muhammad IlyasFarhan AliAwais MaqsoodAbdul Basit Butt
Copyright (c) 2026 Spectrum of Engineering Sciences
2026-05-132026-05-134510231034 EVALUATION OF AGRICULTURAL BY-PRODUCT NATURAL FIBERS FOR THE DEVELOPMENT OF SUSTAINABLE THERMAL INSULATION COMPOSITES
https://thesesjournal.com/index.php/1/article/view/2798
<p>To satisfy expanding economic and sustainability needs, there is an increasing demand for cutting-edge novel materials in the building sector. Natural fibers are a viable option for creating sustainable constructions, as they are less expensive to produce than synthetic fibers and have better mechanical and thermal insulation properties. In addition, natural fibers are accessible, inexpensive, and have little impact on the environment, which makes them an appropriate green material option. While certain natural fibers, like kenaf or wood fiber, are somewhat commercialized, others are still being researched and are only in the early stages of development. The main goal of this study is to perform comprehensive research on suitable natural fibers for the building sector from agricultural by-products using multivariable analysis, including the analytic hierarchy process and multi-criteria decision analysis. The existing literature was compared according to the following criteria: availability, cost, modulus of elasticity, moisture content, compressive strength, morphology, and thermal conductivity. The study includes various natural fibres, with agricultural by-products receiving special consideration because they reduce disposal and transportation impacts. Based on the analysis, reed, banana, and bagasse proved to be the most favourable natural fibres for incorporation in the development of green materials for the building industry.</p>Amna IqbalSana YounasAhmed Iqbal
Copyright (c) 2026 Spectrum of Engineering Sciences
2026-05-132026-05-134510351067 A MACHINE LEARNING AND RULE-BASED HYBRID APPROACH FOR ADVANCED PERSISTENT THREAT DETECTION
https://thesesjournal.com/index.php/1/article/view/2799
<p>Advanced Persistent Threats present major risks to organizational security because attackers maintain access to target systems for extended periods while using sophisticated evasion methods. This study develops a hybrid intrusion detection framework that integrates signature-based rules with Isolation Forest for anomaly identification, combined with MITRE ATT&CK technique mapping to enhance threat recognition and forensic investigation. The proposed system applies feature extraction, signature matching, and machine learning-driven anomaly detection to analyze network flow records from the CIC-IDS-2017 dataset containing 2.8 million flows. Evaluation results demonstrate 92.6% accuracy, 91% precision, 89% recall, and an ROC-AUC score of 0.96. Performance comparisons are conducted against traditional signature-based tools using benchmark data.</p>Momina RehmanDr. Ali SufyanSana YounisKishwar Ishfaq
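The two-stage hybrid idea above (signature rules first, anomaly scoring second) can be sketched without any ML library. This is an assumption-laden toy: the flow fields and the rule are hypothetical, and a simple z-score on bytes sent stands in for the Isolation Forest stage to keep the sketch dependency-free.

```python
from statistics import mean, stdev

# Hypothetical flow records: (bytes_sent, duration_s, dst_port)
flows = [(500, 1.2, 443), (650, 0.9, 443), (480, 1.1, 80),
         (700, 1.3, 443), (9_000_000, 600.0, 4444)]  # last flow is suspicious

def signature_match(flow) -> bool:
    """Stage 1: signature rule (a known-bad port here; purely illustrative)."""
    return flow[2] == 4444

def anomaly_score(values, x) -> float:
    """Stage 2: z-score on bytes sent, a stand-in for Isolation Forest."""
    mu, sd = mean(values), stdev(values)
    return abs(x - mu) / sd if sd else 0.0

byte_counts = [f[0] for f in flows]
alerts = [f for f in flows
          if signature_match(f) or anomaly_score(byte_counts, f[0]) > 1.5]
# alerts -> [(9000000, 600.0, 4444)]: caught by BOTH stages, which is the
# point of the hybrid design; either stage alone would also flag it here.
```

A real deployment would swap the z-score for `sklearn.ensemble.IsolationForest` and map each alert to a MITRE ATT&CK technique, as the study describes.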
Copyright (c) 2026 Spectrum of Engineering Sciences
2026-05-132026-05-134510681080 A NOVEL EARLY PREDICTION OF DIABETES MELLITUS SCREENING FRAMEWORK WITH ADVANCED DEEP LEARNING TECHNIQUES
https://thesesjournal.com/index.php/1/article/view/2800
<p>Diabetes Mellitus is a chronic and life-threatening disease that still poses a significant burden on health systems worldwide. In recent years, there has been a surge in the development of machine learning (ML) models for predicting the risk of developing diabetes, although many of these models are considered “black boxes”: they are primarily concerned with prediction accuracy and provide limited information about what drives their decisions. This lack of transparency reduces their usefulness in real-world clinical settings, where understanding the causes and risk factors is pivotal for optimal prevention and treatment. To overcome this limitation, the present study proposes an ensemble-based machine learning framework that enhances prediction accuracy and identifies the most significant factors in the onset of diabetes. A dataset with both clinical and demographic data was used to train and test multiple models, including XGBoost, Random Forest, and Support Vector Machines. The proposed ensemble model achieved an accuracy of 87% and good precision compared with several well-known models. The study focuses not only on prediction but also on interpretability, using SHapley Additive exPlanations (SHAP) values. Through this analysis, glucose, Body Mass Index (BMI), and age were identified as the most influential predictors. These findings give a better idea of the effects of various factors on diabetes risk, and the results can support healthcare professionals in evidence-based decision-making. Early detection allows for more effective, targeted prevention strategies, which can enhance the quality of healthcare for patients and optimize resource use. Furthermore, the framework illustrates how combining accuracy and interpretability can foster trust in AI systems. The methodology may be applied to other chronic diseases as well, making it a valuable contribution to the field of Intelligent Healthcare.</p>Henry Mukalazi SerugundaHafiz Muhammad IjazGhazanfar AliNadeem Akhtar Bukhari
Copyright (c) 2026 Spectrum of Engineering Sciences
2026-05-132026-05-134510811098APPLICATION OF NANOMATERIALS FOR THE REMOVAL OF TOXIC ELEMENTS FROM WATER
https://thesesjournal.com/index.php/1/article/view/2804
<p><em>The global water crisis, intensified by rapid industrialization and population growth, has rendered conventional water treatment technologies such as coagulation and chemical precipitation inefficient, owing to high operating costs and the inability to remove trace amounts of non-biodegradable heavy metals. This article reviews the tremendous potential of nanotechnology, which has been recognized as an efficient, eco-friendly, and cost-effective water purification technology compared with conventional methods. This technology utilizes the physicochemical properties of nanomaterials, such as their high surface-area-to-volume ratio, high durability, and surface properties, to enable efficient water purification mechanisms such as physical and chemical adsorption, surface complexation, ion exchange, and electrostatic interactions. The article discusses the potential of nanomaterials, ranging from metallic and metal oxide nanoparticles to carbon nanostructures (0D, 1D, 3D) and hybrid nanocomposites, which are capable of achieving high removal efficiencies for water contaminants such as chromium and lead. Additionally, critical operational factors such as pH, contact time, and nanomaterial dosage are reviewed, along with environmental concerns such as cytotoxicity, bioaccumulation, and oxidative stress. Despite the challenges of scalability and ecotoxicity, nanotechnology has emerged as a cornerstone for achieving future water security by moving from filtration to molecular engineering.</em></p>Shahid MahmoodRazia IqbalMemona RehmanAmna NawazMinahil AzharAreeba Arif
Copyright (c) 2026
2026-05-142026-05-144510991109COMPARATIVE PERFORMANCE ANALYSIS OF MACHINE LEARNING AND ARTIFICIAL NEURAL NETWORK MODELS FOR HEART DISEASE FORECASTING
https://thesesjournal.com/index.php/1/article/view/2805
<p><em>Cardiovascular disease (CVD) remains the most common cause of death worldwide. Early identification of at-risk patients is very important for better clinical decision-making. In this study, four supervised machine learning models are applied to the Cleveland Heart Disease dataset: Random Forest (RF), Support Vector Machine (SVM), eXtreme Gradient Boosting (XGBoost), and an Artificial Neural Network (ANN).</em></p> <p><em>All models were trained using the same dataset split, with 80% of the data used for training and 20% for testing. The dataset consists of 303 patient records with 13 clinical and demographic features. Before training, the data were normalized using z-score standardization.</em></p> <p><em>The results show that Random Forest (RF) and XGBoost performed best, both achieving an accuracy of 98.53%. The ANN achieved 94.14% and the SVM achieved 88.78% with its default settings. Additional evaluations such as receiver operating characteristic (ROC) analysis, precision–recall evaluation, and Random Forest feature importance were used to better understand model performance. The findings show that ensemble tree-based methods work very effectively on this organized tabular clinical dataset.</em></p>Anam ZahoorRashid Mehmood GondalMuhammad Atif SultanAqsa EjazMuhammad SaqibMuhammad Waqas HaiderMuhammad Usman
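The z-score standardization step mentioned in the abstract above is a one-liner worth making concrete. A minimal sketch, using toy values rather than the Cleveland data; real work would standardize each of the 13 features using statistics from the training split only.

```python
from statistics import mean, pstdev

def zscore_standardize(column):
    """Rescale one feature column to zero mean and unit variance."""
    mu, sigma = mean(column), pstdev(column)
    return [(x - mu) / sigma for x in column]

# Toy cholesterol-like values standing in for one clinical feature
raw = [180.0, 220.0, 260.0, 300.0]
z = zscore_standardize(raw)

assert abs(mean(z)) < 1e-9            # centered at zero
assert abs(pstdev(z) - 1.0) < 1e-9    # unit variance
```

Standardization matters most for the SVM and ANN in the comparison; tree ensembles such as RF and XGBoost are largely insensitive to feature scale.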
Copyright (c) 2026
2026-05-142026-05-144511101124AN INTEGRATED BIOINFORMATICS AND COMPUTATIONAL DRUG DESIGN FRAMEWORK FOR TARGET IDENTIFICATION, LEAD DISCOVERY, AND LEAD OPTIMIZATION IN MODERN DRUG DEVELOPMENT
https://thesesjournal.com/index.php/1/article/view/2811
<p><em>The modern drug discovery paradigm has undergone a transformative shift with the integration of bioinformatics and computational drug design approaches. This comprehensive review examines the synergistic application of computational methodologies across the entire drug development pipeline, from target identification through lead optimization. We analyze recent advances in bioinformatics-driven target identification, including network-based approaches, machine learning algorithms, and multi-omics integration strategies. Virtual screening methodologies for lead discovery are evaluated, encompassing both structure-based and ligand-based approaches, with emphasis on emerging deep learning techniques. Lead optimization strategies utilizing free energy calculations, molecular dynamics simulations, and AI-driven generative models are critically assessed. Furthermore, we explore integrated frameworks that unify these computational approaches into cohesive pipelines, highlighting successful case studies across therapeutic areas including oncology, infectious diseases, and neurodegenerative disorders. Current challenges including data quality, model interpretability, and experimental validation are discussed alongside future directions emphasizing explainable AI, quantum computing applications, and personalized medicine approaches. This review demonstrates that the strategic integration of bioinformatics and computational drug design represents a powerful paradigm for accelerating drug discovery while reducing costs and improving success rates.</em></p>Anam TalatEmman fatimaSyeda Hadia TirmiziSaad WahabHadia
Copyright (c) 2026
2026-05-142026-05-144511251140REAL-TIME FRUIT RIPENESS CLASSIFICATION USING VOC PROFILING AND DECISION TREE ALGORITHMS: A SOLUTION FOR REDUCING POST-HARVEST LOSSES
https://thesesjournal.com/index.php/1/article/view/2812
<p><em>In Pakistan, the agricultural sector is vital for economic stability, particularly through fruit cultivation. However, accurately determining fruit ripeness poses a significant challenge, resulting in considerable post-harvest losses and diminished market value. Traditional ripeness assessment methods are often subjective, inconsistent, and labor-intensive. To address this issue, we developed an innovative AI-based fruit ripeness detection system utilizing MQ gas sensors. This system detects volatile organic compounds (VOCs) emitted by fruits during the ripening process, providing real-time data on gas concentrations associated with ripeness stages. The sensor data is processed using advanced artificial intelligence algorithms, specifically a decision tree model, to classify fruits as "ripe" or "unripe." By implementing this technology, farmers and vendors can significantly reduce post-harvest losses and enhance the quality of produce while improving overall supply chain efficiency. The decision tree algorithm effectively analyzes patterns in sensor data to make accurate predictions about fruit ripeness. This project represents a substantial advancement in modernizing agricultural practices in Pakistan, contributing to sustainable development and economic growth. The integration of cutting-edge sensor technology with machine learning not only addresses the critical challenges of fruit ripeness detection but also paves the way for innovative solutions in the agricultural sector.</em></p>Junaid AhmedUmair Ayaz KamangarAbdul Sattar Chan Zainab Umair Kamangar
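As an editorial illustration of the classification idea in the abstract above: a decision tree's basic building block is a learned threshold split on a sensor reading. The sketch below (invented VOC readings and threshold; not the authors' model or data) learns a single best split separating "ripe" from "unripe".

```python
# Illustrative sketch only: a one-node decision tree (decision stump)
# learned from hypothetical MQ-sensor VOC readings. All values invented.

def learn_stump(readings, labels):
    """Find the VOC-concentration threshold that best separates
    'ripe' from 'unripe' samples -- the split a tree node would make."""
    best = (None, 0.0)
    for t in sorted(set(readings)):
        preds = ["ripe" if r >= t else "unripe" for r in readings]
        acc = sum(p == y for p, y in zip(preds, labels)) / len(labels)
        if acc > best[1]:
            best = (t, acc)
    return best  # (threshold, training accuracy)

# Hypothetical ethanol-related VOC readings (sensor units) and labels.
readings = [120, 135, 150, 300, 320, 360, 400]
labels   = ["unripe", "unripe", "unripe", "ripe", "ripe", "ripe", "ripe"]

threshold, acc = learn_stump(readings, labels)
print(threshold, acc)  # a perfect split exists at 300
```

A full decision tree repeats this split search recursively on each resulting subset, across all sensor channels.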
Copyright (c) 2026
2026-05-142026-05-144511411152AI-GENERATED CONTENT AND THE FUTURE OF CREATIVE EMPLOYMENT: IMPACT ON CREATIVE SKILLS AND PROFESSIONAL OPPORTUNITIES IN DIGITAL INDUSTRIES
https://thesesjournal.com/index.php/1/article/view/2816
<p><em>Generative Artificial Intelligence (AI) technologies are ushering in a revolution in the creative sector, changing how content is generated, how work is done and organized, and the nature of work in the digital space. The adoption of AI technologies such as ChatGPT, Midjourney, Canva AI, and Runway in the fields of writing, design, marketing, and media production has become so widespread that there is a growing debate over whether machines or humans will be more creative in the future. This study examines the implications of AI-generated content for creativity, employment, and career opportunities in the digital industry. A quantitative, cross-sectional research design was employed: an online survey collected responses from 300 participants of diverse educational levels and backgrounds. Data were analyzed using descriptive statistics, reliability testing, correlation, and regression analysis in SPSS Version 26. Overall, the respondents had a positive perception of AI-generated content, particularly in relation to its productivity, innovation, and professional development. There were strong positive correlations between AI-generated content and perceptions of future creative employment, and between human-AI collaboration and perceptions of future creative employment. However, concerns about job security, plagiarism, ethical issues, and dependency on automation negatively affected perceptions of employment sustainability in creative professions. The study supports Technological Determinism Theory and Human-AI Collaboration Theory, indicating that AI technologies are not replacing human creativity but transforming and redefining the creative work environment. The study contributes to the scholarly debate on the paradigm shift in the digital industries and highlights the importance of a balanced, comprehensive combination of AI and human creativity.</em></p>Rida ZafarMaryam MansoorDr. Mian AsimSameen AmjadUsman Ehsan
Copyright (c) 2026
2026-05-142026-05-144511531165MAPPING VALUE STREAM AND POLICY FOR FURNITURE MAKING INDUSTRIES
https://thesesjournal.com/index.php/1/article/view/2818
<p><em>Exports are an important element of a country's economic growth and development. Products traded in international markets range from agricultural goods, textiles, machinery, and chemicals to highly advanced military equipment. Furniture can be described as a personalized product that is fundamental in homes, offices, hotels, and other environments. In 2018, China held the largest share of the global furniture export market, with total sales of 28 billion dollars, approximately 31.5% of total furniture exports. China was followed by Germany (9%), Italy (8.8%), Poland (6%), Vietnam (5.2%), and the United States (3.4%) in their respective shares of the world furniture export market that year. In 2018, Pakistan's furniture exports accounted for only 0.01% of the world furniture export market. Major challenges faced by this sector in Pakistan include outdated manufacturing technology, long lead times, high production costs, and a lack of government policies aimed at market exploration and diversification. Although the country has skilled craftsmen and developments such as CPEC are underway, it lacks advanced machinery and relevant training programs. Moreover, supply chain analysis and value chain mapping of the furniture industry are notably lacking. This research primarily aims to develop value chain mapping for the furniture industry along with appropriate mapping tools for analysis. The findings will contribute to establishing a value chain framework for Pakistan's furniture industry, which, when implemented, should result in increased exports, reduced imports, and enhanced overall revenue generation for the country.</em></p>Shabeer AhmedUsman GhaniKiran RaheelAhmad Junaid
Copyright (c) 2026
2026-05-142026-05-144511661179NEXORASCAN: A MACHINE LEARNING–DRIVEN CHROME EXTENSION FOR REAL-TIME DETECTION OF MALICIOUS WEBSITES AND BROWSER PERMISSION ABUSE
https://thesesjournal.com/index.php/1/article/view/2819
<p>Malicious websites and abusive browser permissions continue to pose serious cybersecurity threats to internet users, while traditional blacklist-based protection mechanisms often fail to detect newly emerging and adaptive attacks in real time. This paper presents NexoraScan, a machine learning–driven security framework designed to identify malicious websites and browser permission abuse through a lightweight and privacy-preserving approach. The proposed system consists of three integrated components: (1) a Google Chrome extension developed using Manifest V3 for real-time website monitoring, (2) a web-based URL scanning platform, and (3) a companion Android application for accessible cross-platform protection. The framework extracts six behavioural and infrastructural security features directly from active browsing sessions and evaluates them using supervised machine learning models, including Random Forest (RF), Support Vector Machine (SVM), K-Nearest Neighbours (KNN), Decision Tree (DT), Logistic Regression (LR), and Naive Bayes (NB). Experimental evaluation was conducted on a balanced dataset containing 3,000 labelled website instances. Among all classifiers, Random Forest achieved the best performance with 95.0% accuracy, a macro F1-score of 0.95, and a ROC-AUC score of 0.98 under 5-fold cross-validation. Furthermore, real-time inference latency remained below 300 ms, making the solution suitable for practical browser-based deployment. Feature importance analysis demonstrates that SSL certificate validity, domain age, redirect behaviour, and JavaScript obfuscation indicators provide the strongest discriminative capability for malicious website detection. The proposed framework uniquely combines infrastructure-level indicators with behaviour-level JavaScript analysis to detect both phishing-oriented and permission-abusing web activity. All prediction and analysis operations are executed locally within the browser environment, ensuring that no user browsing history or metadata is transmitted externally, thereby preserving user privacy and supporting GDPR-oriented data minimisation principles. Experimental results indicate that NexoraScan provides an effective, lightweight, and deployable solution for real-time malicious website detection on resource-constrained systems.</p>Muhammad AliMuhammad IshaqAsmia Mukhtiar
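As an editorial illustration of how the feature types named in the abstract (SSL validity, domain age, redirect behaviour, JavaScript obfuscation) can be combined into a single decision: the sketch below uses an invented rule-based risk score, not the paper's trained classifiers; all weights and thresholds are hypothetical.

```python
# Illustrative sketch only -- not the NexoraScan implementation.
# Weights and thresholds are invented for demonstration.

def risk_score(ssl_valid, domain_age_days, redirect_count, js_obfuscated):
    score = 0.0
    if not ssl_valid:
        score += 0.35                         # missing/invalid certificate
    if domain_age_days < 180:
        score += 0.25                         # newly registered domain
    score += min(redirect_count, 5) * 0.05    # each redirect adds risk
    if js_obfuscated:
        score += 0.25                         # obfuscated scripts are suspect
    return min(score, 1.0)

def classify(score, threshold=0.5):
    return "malicious" if score >= threshold else "benign"

s = risk_score(ssl_valid=False, domain_age_days=30,
               redirect_count=4, js_obfuscated=True)
print(round(s, 2), classify(s))
```

A supervised model such as Random Forest learns these weightings from labelled data instead of hand-coding them.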
Copyright (c) 2026 Spectrum of Engineering Sciences
2026-05-142026-05-144511801191HYBRID LEXICAL-SEMANTIC RETRIEVAL FOR IMPROVED ACADEMIC LITERATURE SEARCH
https://thesesjournal.com/index.php/1/article/view/2820
<p>The rapid growth of scientific publications has made accurate and comprehensive literature search a critical challenge for researchers. Traditional keyword-based search engines often miss relevant papers that use different terminology, while semantic embedding-based retrieval can overlook exact matches for domain-specific terms. To address this limitation, this paper proposes a hybrid retrieval approach that combines lexical BM25 matching with dense semantic embeddings using a weighted fusion score. The hybrid method aims to improve both recall and ranking quality in academic document search. Experiments are conducted on a curated dataset of 100 computer science papers from the arXiv repository. Retrieval performance is evaluated using Recall@5, Recall@10, and nDCG@10. Baseline comparisons include BM25-only and dense-only retrieval. Experimental results show that the hybrid approach achieves a Recall@10 of 0.85, outperforming BM25-only (0.72) and dense-only (0.74) baselines. The hybrid method also achieves the highest nDCG@10 score of 0.83, indicating better ranking quality. These findings demonstrate that combining lexical and semantic signals significantly improves literature search effectiveness without requiring complex multi-agent systems or citation verification. The proposed hybrid retrieval is lightweight, easy to implement, and suitable for integration into academic search engines and digital libraries.</p>Fawad KhanSaddam Hussain KhanHamad KhanTahir Hussain
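As an editorial illustration of the weighted-fusion idea in the abstract above: normalize the lexical (BM25) and dense (cosine) scores to a common range, then mix them with a weight alpha. The scores and alpha below are invented, not from the paper.

```python
# Illustrative sketch of lexical+semantic score fusion. All numbers invented.

def minmax(scores):
    """Min-max normalize a list of scores to [0, 1] per query."""
    lo, hi = min(scores), max(scores)
    if hi == lo:
        return [0.0] * len(scores)
    return [(s - lo) / (hi - lo) for s in scores]

def hybrid_rank(bm25_scores, dense_scores, alpha=0.5):
    """Return document indices ranked by the fused score
    alpha * bm25_norm + (1 - alpha) * dense_norm."""
    b, d = minmax(bm25_scores), minmax(dense_scores)
    fused = [alpha * bi + (1 - alpha) * di for bi, di in zip(b, d)]
    return sorted(range(len(fused)), key=lambda i: fused[i], reverse=True)

# Doc 2 is weak lexically but strong semantically; fusion surfaces it first.
bm25  = [12.0, 3.5, 7.0]
dense = [0.40, 0.40, 0.95]
print(hybrid_rank(bm25, dense, alpha=0.5))
```

Tuning alpha trades off exact-term matching (alpha near 1) against semantic recall (alpha near 0).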
Copyright (c) 2026 Spectrum of Engineering Sciences
2026-05-142026-05-144511921203A PRIVACY-PRESERVING IOMT DIGITAL TWIN: INTEGRATING WEARABLE MULTIMODAL SENSING AND EDGE-DRL FOR PRECISION GERIATRIC CARDIOLOGY
https://thesesjournal.com/index.php/1/article/view/2821
<p>Physiological frailty and the ubiquitous clinical burden of polypharmacy significantly exacerbate cardiovascular disease in the geriatric population. Current prescribing guidelines and static predictive models do not account for the non-linear pharmacokinetics of the elderly, which leads to adverse drug events, renal toxicity, and acute hemodynamic decompensation. To address the limitations of reactive clinical care, this paper presents the Cardio-Geriatric Digital Twin, an end-to-end privacy-preserving computational framework for continuous cardiovascular trajectory simulation and autonomous polypharmacy optimization. The proposed architecture fuses high-frequency IoT wearable telemetry with unstructured Electronic Health Records (EHRs) via a novel Cross-Modal Transformer fusion core, yielding a highly dynamic, context-aware patient replica. The medication titration process in this simulation environment is formulated as a Partially Observable Markov Decision Process (POMDP) and solved by a clinically constrained Proximal Policy Optimization (PPO) agent. To ensure strict data privacy and regulatory compliance, the agent is trained in a decentralized Federated Learning (FedPPO) protocol over distributed edge nodes. Crucially, the reinforcement learning policy is guided by a strict multi-objective reward function that independently penalizes renal degradation and hyperpolypharmacy while ensuring the stability of vital hemodynamics. Extensive empirical evaluation on 65,420 simulated longitudinal profiles shows the decisive superiority of the framework over state-of-the-art predictive and heuristic baselines. The proposed model produced a Trajectory RMSE of 4.20% and increased the Medication Adherence F1-Score to 0.89, generating disproportionate compliance gains within the highly vulnerable “Frail” patient stratum. The framework also achieved a robust 65.0% Hospitalization Aversion Rate in a 90-day simulation. We develop a highly scalable, empirically validated computational framework for proactive, precision-driven geriatric cardiology, interpretable at the feature level via Shapley Additive Explanations (SHAP).</p> <p><strong>Keywords:</strong> Wearable MEMS Sensors, Digital Twin, Edge AI, Multimodal Sensing, Federated Learning, Geriatric Cardiovascular Care, Deep Reinforcement Learning.</p>Muzammil KhanZhou ZhihaoFaiza Ayyob*Altaf Hussain
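As an editorial illustration of the aggregation step a federated protocol such as FedPPO relies on: each edge node trains locally and ships only parameter vectors, which the server averages weighted by local sample counts (the FedAvg rule). The node data below is invented.

```python
# Illustrative sketch of federated averaging (FedAvg). All numbers invented.

def fed_avg(node_params, node_sizes):
    """Weighted average of per-node parameter vectors, weighted by the
    number of local samples each node trained on. Raw data never leaves
    the node -- only the parameter vector is shared."""
    total = sum(node_sizes)
    dim = len(node_params[0])
    avg = [0.0] * dim
    for params, n in zip(node_params, node_sizes):
        w = n / total
        for j, p in enumerate(params):
            avg[j] += w * p
    return avg

# Three hypothetical edge nodes with different amounts of local data.
params = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]
sizes  = [100, 100, 200]
print(fed_avg(params, sizes))  # node 3 counts twice as much as the others
```

In a FedPPO setting the averaged vector would be the PPO policy's weights, redistributed to the nodes for the next local training round.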
Copyright (c) 2026 Spectrum of Engineering Sciences
2026-05-142026-05-144512041241A MICROSCOPIC STUDY OF THE MORPHOLOGICAL PROPERTIES OF NATURAL AND RECYCLED FIBERS FOR THE DEVELOPMENT OF COMPOSITE INSULATION MATERIALS
https://thesesjournal.com/index.php/1/article/view/2822
<p>The utilization of agricultural and recycled waste materials for sustainable composite development has gained significant attention in recent years. In this study, a hybrid composite reinforced with banana fibers and recycled fibers was developed and analyzed microscopically to assess the fibers' suitability for incorporation into composites. Banana fibers were extracted from banana pseudostems, while recycled fibers were obtained from textile spinning waste. Both fiber types were cleaned, processed, and incorporated into a composite system for analysis. Microscopic analysis was performed to investigate fiber distribution, surface morphology, interfacial bonding, and internal structural arrangement within the composite. The observations revealed that banana fibers exhibited rough and irregular surfaces with longitudinal grooves and fibrillar structures, while recycled fibers contributed to a heterogeneous but compact arrangement within the matrix. Improved fiber–matrix interaction was observed due to the surface roughness of both fiber types, which enhanced mechanical interlocking. The presence of micro-voids and porous regions was also identified within the composite structure. These morphological features suggest that the hybrid composite possesses reduced density and improved thermal insulation potential. The study indicates that the combination of banana and recycled fibers can be effectively utilized in sustainable composite development. The findings further demonstrate that agricultural and recycled waste materials can serve as viable alternatives to synthetic reinforcements. Microscopic evaluation provides essential insights into fiber compatibility and structural behavior, which are important for optimizing composite performance in future engineering and construction applications.</p>Amna IqbalSana YounasQasim Masood
Copyright (c) 2026 Spectrum of Engineering Sciences
2026-05-142026-05-144512421251INTERPRETABLE PREDICTION OF STUDENT HAPPINESS USING SUPPORT VECTOR REGRESSION AND SHAP EXPLANATIONS
https://thesesjournal.com/index.php/1/article/view/2824
<p><em>Motivated by UN Sustainable Development Goal 3 (Good Health and Well-Being), this study constructs an interpretable, person-oriented machine-learning framework to forecast the happiness level of a cohort of 1,500 university students using a psychosocial and demographic dataset obtained from Kaggle. Unlike previous research, in which aggregate national well-being indexes are predicted with black-box models, we model individual predictions and incorporate explainable AI to identify practical drivers of student well-being. Evaluation was performed with 5-fold cross-validation. Support Vector Regression (SVR) demonstrated the best generalization performance among the evaluated regressors (MAE = 0.0740, MSE = 0.0088, RMSE = 0.0937, R<sup>2</sup> = 0.6664, adjusted R<sup>2</sup> = 0.6566). To make predictive accuracy policy-relevant, we utilized SHAP to measure feature contributions. Social Support, Work-Life Balance, work-related factors, and Academic Stress emerged as the most influential predictors; Generosity and Financial Status generated smaller positive effects, whereas Anxiety, Depression, and Isolation had negative effects. Demographic factors (i.e., age and gender) did not have significant influence, suggesting that modifiable psychosocial conditions carry most of the explanatory power in this cohort. These findings have clear, actionable implications for universities: prioritizing stress-reduction programs, reinforcing peer-support structures, and providing focused financial support are likely to yield measurable improvements in student well-being. Beyond the current application, the proposed architecture demonstrates a transferable prediction-to-intervention pipeline for evidence-based decision-making in education and population-health settings in accordance with SDG-3.</em></p>Asifa IttfaqMuazzam AliM. U. HashmiAmna AshrafFatima Irshad
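As an editorial illustration of what SHAP quantifies: a feature's Shapley value is its marginal contribution to the prediction, averaged over all feature orderings. For a small model this can be computed exactly, as the sketch below does for an invented 3-feature additive "well-being" model (real SHAP libraries approximate this efficiently for large models).

```python
# Illustrative sketch: exact Shapley values for a toy model. All numbers
# and the toy model itself are invented for demonstration.
from itertools import permutations
from math import factorial

def shapley_values(model, x, baseline):
    """Average each feature's marginal contribution over all orderings."""
    n = len(x)
    phi = [0.0] * n
    for order in permutations(range(n)):
        current = list(baseline)
        prev = model(current)
        for i in order:
            current[i] = x[i]        # switch feature i from baseline to x
            val = model(current)
            phi[i] += val - prev     # marginal contribution in this order
            prev = val
    return [p / factorial(n) for p in phi]

# Toy additive model over (social_support, academic_stress, finance).
model = lambda f: 2.0 * f[0] - 1.5 * f[1] + 0.5 * f[2]

phi = shapley_values(model, x=[1.0, 1.0, 1.0], baseline=[0.0, 0.0, 0.0])
print(phi)  # for an additive model, each Shapley value equals its term
```

For a non-additive model (like the SVR above), the orderings no longer agree, which is exactly why the averaging is needed.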
Copyright (c) 2026
2026-05-152026-05-154512521268MULTIMEDIA STEGANALYSIS USING HYBRID CNN AND TRANSFORMER
https://thesesjournal.com/index.php/1/article/view/2825
<p><em>Steganography enables covert communication by concealing secret information within digital multimedia content such as images, audio, and video files. The increasing misuse of steganographic techniques in cybercrime and covert communication underscores the urgent need for effective multimedia steganalysis systems. This study introduces a unified multimedia steganalysis framework utilizing a Hybrid CNN–Transformer architecture to detect hidden information across diverse multimedia modalities. The framework integrates the local feature extraction strengths of Convolutional Neural Networks (CNNs) with the global contextual learning capabilities of Transformer encoders to identify spatial, spectral, and temporal steganographic artifacts. Publicly available datasets, such as BOSSBase, BOWS2, TIMIT, ESC-50, LibriSpeech, HMDB51, UCF-101, and Kinetics-400, are employed for experimental evaluation. The model is assessed across various embedding techniques and uses multimodal late fusion for final classification. Results indicate that the proposed framework outperforms standalone CNN and Transformer models, achieving an overall accuracy of 96.4% and demonstrating enhanced robustness and generalization across multimedia modalities.</em></p>Aroob MukhtarFarhan HassanM. MadniUmar Daraz
Copyright (c) 2026
2026-05-132026-05-134512691281A HYBRID EFFICIENTNET-B4 AND SWIN TRANSFORMER V2 FRAMEWORK FOR LARGE-SCALE MALWARE FAMILY RECOGNITION
https://thesesjournal.com/index.php/1/article/view/2826
<p>Malware classification at scale is one of the most significant issues in modern cybersecurity research. Converting malware executables into grayscale PNG images for byte-level visualization allows the problem to be recast as a computer vision task solvable by deep learning networks. In this study, we introduce the EfficientNetB4-SwinV2 Ensemble method, which combines two parallel neural network backbones, EfficientNet-B4 (favouring local texture patterns) and Swin Transformer V2-Base (favouring global relation modeling), connected by a soft-voting mechanism over predicted class probabilities. We trained both backbones on a rigorously deduplicated and harmonized dataset of 32,601 malware images spanning 59 malware families, constructed from the Malimg, Microsoft BIG 2015, and MaleVis datasets and resized to 256 × 256 pixels. Severe class imbalance is addressed using inverse-frequency loss weights and a WeightedRandomSampler, whose effectiveness is validated separately and jointly. EfficientNet-B4 obtains an accuracy of 98.40 ± 0.12% and macro AUC 0.9986, Swin Transformer V2-Base reaches 98.55 ± 0.09% accuracy and macro AUC 0.9993, and the soft-voting ensemble achieves the best performance with 98.67 ± 0.07% accuracy, macro AUC 0.9996, and the lowest Expected Calibration Error (ECE = 0.0094) across three independent training trials. The ablation study shows that soft voting significantly surpasses hard voting and learned stacking in our experiments. Confusion-pattern analysis identifies three confusion clusters of visually similar malware families. Limitations, deployment considerations, and future work are elaborated upon in Section 6.</p> <p><strong>Keywords: </strong><em>Malware classification; Malware visualization; Deep learning; EfficientNet-B4; Swin Transformer V2; Ensemble learning; Soft voting; Calibration; Cybersecurity</em></p>Asif Khan*Saddam Hussain KhanMuhammad Saad SalmanAbdur RahmanMian Saeed Akbar
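As an editorial illustration of two techniques named in the abstract above, with invented numbers: (1) soft voting averages the per-class probability vectors of the two backbones and picks the argmax; (2) inverse-frequency loss weights up-weight rare classes.

```python
# Illustrative sketch only; probabilities and class counts are invented.

def soft_vote(prob_vectors):
    """Average class-probability vectors and return the winning class."""
    k = len(prob_vectors[0])
    mean = [sum(p[c] for p in prob_vectors) / len(prob_vectors)
            for c in range(k)]
    return max(range(k), key=lambda c: mean[c])

def inverse_frequency_weights(class_counts):
    """Per-class loss weight proportional to 1/frequency, rescaled so
    the weights average to 1."""
    inv = [1.0 / c for c in class_counts]
    scale = len(inv) / sum(inv)
    return [w * scale for w in inv]

# The two backbones disagree; the averaged vector decides (class 1 wins,
# mean ≈ [0.40, 0.50, 0.10]).
effnet = [0.60, 0.30, 0.10]
swin   = [0.20, 0.70, 0.10]
cls = soft_vote([effnet, swin])
print(cls)

# A 900/90/10 imbalance: the rarest class gets the largest weight.
weights = inverse_frequency_weights([900, 90, 10])
print(weights)
```

Hard voting would instead count each backbone's argmax as one vote, discarding the confidence information that makes soft voting better calibrated here.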
Copyright (c) 2026 Spectrum of Engineering Sciences
2026-05-152026-05-154512821304APPLICATION OF NANOMATERIALS FOR THE REMOVAL OF TOXIC ELEMENTS FROM WATER
https://thesesjournal.com/index.php/1/article/view/2827
<p><em>The global water crisis, intensified by rapid industrialization and population growth, has rendered conventional water treatment technologies such as coagulation and chemical precipitation inefficient, owing to high operating costs and the inability to remove trace amounts of non-biodegradable heavy metals. This article reviews the tremendous potential of nanotechnology, which has been recognized as an efficient, eco-friendly, and cost-effective water purification technology compared with conventional methods. Nanotechnology exploits the physicochemical properties of nanomaterials, such as their high surface-area-to-volume ratio, high durability, and surface properties, to enable efficient purification mechanisms including physical and chemical adsorption, surface complexation, ion exchange, and electrostatic interactions. The article discusses the potential of nanomaterials, ranging from metallic and metal oxide nanoparticles to carbon nanostructures (0D, 1D, and 3D) and hybrid nanocomposites, which can achieve high removal efficiencies for water contaminants such as chromium and lead. Additionally, critical operational factors such as pH, contact time, and nanomaterial dosage are reviewed, along with environmental concerns such as cytotoxicity, bioaccumulation, and oxidative stress. Despite challenges related to scalability and ecotoxicity, nanotechnology has emerged as a cornerstone for achieving future water security by moving from filtration to molecular engineering.</em></p>Shahid MahmoodRazia IqbalMemona RehmanAmna NawazMinahil AzharAreeba Arif
Copyright (c) 2026
2026-05-162026-05-164513051315QUANTUM COMPUTING: THE NEXT REVOLUTION IN COMPUTER SCIENCE THE IMPACT OF AUGMENTED REALITY ON HUMAN-COMPUTER INTERACTION
https://thesesjournal.com/index.php/1/article/view/2828
<p><em>Quantum computational technologies are precipitating a foundational reconfiguration of computer science, challenging long-held assumptions regarding computational complexity, algorithmic design, cryptographic security, and system architecture. This article examines the theoretical, architectural, and algorithmic dimensions of quantum computation, situating contemporary advances within a rigorous academic framework. While noisy intermediate-scale quantum (NISQ) devices have demonstrated task-specific advantages, the transition to fault-tolerant quantum computing remains contingent upon breakthroughs in error correction, hardware scalability, and classical-quantum co-design. The article critically evaluates quantum algorithms, their implications for subfields including optimisation, machine learning, and simulation, and the imminent cryptographic disruption necessitating post-quantum standardisation. It concludes by outlining research imperatives, epistemic challenges, and socio-technical considerations that will shape the integration of quantum technologies into the broader computational ecosystem. The revolutionary potential of quantum computation lies not in the wholesale replacement of classical paradigms, but in the emergence of a hybrid, problem-tailored computational landscape that demands interdisciplinary rigour, empirical validation, and sustained theoretical innovation.</em></p>Muhammad Suleman KhanZaid WaliAli RazaAsra ilyas
Copyright (c) 2026
2026-05-162026-05-164513161333A COMPREHENSIVE REVIEW OF DEEP LEARNING ADVANCEMENTS IN EDUCATION: CHALLENGES AND FUTURE DIRECTIONS
https://thesesjournal.com/index.php/1/article/view/2829
<p><em>Deep learning (DL) has emerged as a transformative paradigm in education, enabling intelligent, automated, and data-driven solutions across multiple dimensions of teaching and learning. This review provides a comprehensive analysis of recent advancements in DL applications within education, focusing on personalized learning, automated evaluation, student performance prediction, sentiment analysis, and learning engagement. The paper highlights how DL models such as Convolutional Neural Networks (CNNs), Recurrent Neural Networks (RNNs), Generative Adversarial Networks (GANs), and hybrid architectures are being employed to enhance teaching efficiency, optimize learning outcomes, and improve decision-making in educational institutions. Key contributions include the identification of novel frameworks for real-time monitoring, emotion detection, and cybersecure access to learning platforms. The review also examines major challenges such as dataset scarcity, lack of scalability, ethical concerns, and limitations in real-world integration. Finally, future research directions are outlined, emphasizing the development of unified, holistic, and ethically responsible DL-powered education systems that integrate academic, behavioral, and administrative functions.</em></p>Mashal TariqShehla AndleebRabbia Muhammad QasmiDurr-e-ShahwarHafsa AsifMuhammad Shakir Khan
Copyright (c) 2026
2026-05-162026-05-164513341343COMPARATIVE PERFORMANCE ANALYSIS OF MACHINE LEARNING AND ARTIFICIAL NEURAL NETWORK MODELS FOR HEART DISEASE FORECASTING
https://thesesjournal.com/index.php/1/article/view/2835
<p><em>Cardiovascular disease (CVD) remains the leading cause of death worldwide. Early identification of at-risk patients is essential for improving clinical decision-making. In this study, four supervised machine learning models are applied to the Cleveland Heart Disease dataset: Random Forest (RF), Support Vector Machine (SVM), eXtreme Gradient Boosting (XGBoost), and an Artificial Neural Network (ANN).</em></p> <p><em>All models were trained on the same dataset split, with 80% of the data used for training and 20% for testing. The dataset consists of 303 patient records with 13 clinical and demographic features. Before training, the data were normalized using z-score standardization.</em></p> <p><em>The results show that Random Forest (RF) and XGBoost performed best, both achieving an accuracy of 98.53%. The ANN achieved 94.14% and the SVM 88.78% with its default settings. Additional evaluations, including ROC analysis, precision-recall evaluation, and Random Forest feature importance, were used to better understand model performance. The findings show that ensemble tree-based methods work very effectively on this structured tabular clinical dataset.</em></p>Anam ZahoorRashid Mehmood GondalMuhammad Atif SultanAqsa EjazMuhammad SaqibMuhammad Waqas HaiderMuhammad Usman
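As an editorial illustration of the preprocessing described in the abstract above (z-score standardization followed by an 80/20 train/test split): the sketch below uses invented toy values, not the 303-record Cleveland data, and a hypothetical seed.

```python
# Illustrative sketch of z-score standardization and an 80/20 split.
from statistics import mean, pstdev
import random

def zscore(column):
    """Standardize one feature column to zero mean and unit variance."""
    mu, sigma = mean(column), pstdev(column)
    return [(v - mu) / sigma for v in column] if sigma else [0.0] * len(column)

def train_test_split(rows, test_ratio=0.2, seed=42):
    """Shuffle indices deterministically, then cut 80/20."""
    idx = list(range(len(rows)))
    random.Random(seed).shuffle(idx)
    cut = int(len(rows) * (1 - test_ratio))
    return [rows[i] for i in idx[:cut]], [rows[i] for i in idx[cut:]]

# Hypothetical values for one clinical feature (e.g. age).
ages = [29, 45, 61, 50, 38, 54, 47, 33, 66, 41]
standardized = zscore(ages)
train_rows, test_rows = train_test_split(standardized, test_ratio=0.2)
print(len(train_rows), len(test_rows))  # 8 2
```

In practice the standardization statistics should be computed on the training portion only, then applied to the test portion, to avoid leaking test information into the model.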
Copyright (c) 2026
2026-05-162026-05-164513441358MULTI-AGENT AI ORCHESTRATION OF GRID-FORMING VIRTUAL POWER PLANTS FOR BIDIRECTIONAL EV ENERGY NETWORKS
https://thesesjournal.com/index.php/1/article/view/2839
<p><em>This manuscript proposes a novel AI-based framework that combines Virtual Power Plants (VPPs) with a bidirectional Electric Vehicle (EV) energy network, along with decentralized Multi-Agent System (MAS) orchestration, for future renewable-energy-dominated smart grids. The study addresses important issues of intermittency, low-inertia grid operation, integration of large numbers of EVs, and distributed energy coordination. A holistic Cyber-Physical Smart Grid Architecture is proposed that integrates Renewable Energy Systems, Battery Energy Storage Systems (BESS), Vehicle-to-Grid (V2G) technology, decentralized grid-forming inverter control, and reinforcement learning-based Artificial Intelligence orchestration in a decentralized operational environment. The framework uses autonomous intelligent agents to coordinate renewable generation, EV charging/discharging schedules, storage dispatch, and grid stabilization in real time. The simulation-based evaluation was performed under various operational conditions, including renewable variability stress, high EV penetration, and bidirectional V2G operation. Results show significant improvements in renewable energy use, operational efficiency, voltage and frequency stability, peak demand reduction, and carbon emissions reduction over traditional centralized grid systems. The proposed framework achieved a renewable utilization rate of >90%, reduced operational costs by up to 29%, and increased grid stability through adaptive grid-forming control mechanisms. In addition, coordinated EV fleets were effectively used as distributed mobile energy storage resources for peak shaving, ancillary services, and balancing renewable energy. The results validate the potential of AI-driven grid-forming VPPs as a scalable, resilient, and economically viable option for low-carbon smart energy systems with significant renewable energy and EV uptake. 
The proposed framework offers valuable technical, operational, and policy lessons for utility operators, smart grid planners, and policymakers to support sustainable, intelligent energy transition strategies.</em></p>Salman AliMuhammad MoosaAftab AliRimsha ArainMuhammad Umar Memon
Copyright (c) 2026
2026-05-162026-05-164513591393