Papers by International Journal of Experimental Research and Review ISSN 2455-4855 (Online)
International Journal of Experimental Research and Review, 2024
Breast cancer appears as a terrible curse on society. It claims a huge number of human lives across the world, and its peril is increasing at a startling rate. Identification of this disease at the initial stages is indispensable. In many cases, traditional methods are error-prone and protracted. Models applying machine learning approaches have proven fruitful in this application area, and a large number of machine learning approaches demonstrate impressive results. This research strives to remove the shortcomings of existing models and, by resolving the underlying technical issues, deliver higher accuracy in the end results. The research endeavours to make patients' treatment processes more justified and cost-effective. The work uses the WDBC breast cancer dataset, which is publicly accessible from the UCI research database. This study applies multiple individual learners, namely Support Vector Machines (SVM), Logistic Regression (LR), Random Forest (RF), Naive Bayes (NB), K-Nearest Neighbours (K-NN) and Decision Tree (DT), along with an ensemble learner, Gradient Boosting (GB), combined with two feature selection techniques, Principal Component Analysis (PCA) and Recursive Feature Elimination (RFE). These experimental techniques discern subtle patterns within the dataset. The proposed model evaluates results and performance through specificity, sensitivity and accuracy in a comparative structure and achieves a higher accuracy of 98%. The study highlights its potential as a significant tool in medical diagnostics.
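As a concrete illustration of the kind of pipeline described above, the sketch below compares a few of the named learners on the WDBC data (shipped with scikit-learn as load_breast_cancer) with PCA or RFE feature selection. It is a minimal sketch under assumed splits and hyperparameters, not the paper's exact configuration.

```python
# Illustrative sketch (not the paper's exact pipeline): comparing classifiers on
# the WDBC dataset with PCA or RFE feature selection.
from sklearn.datasets import load_breast_cancer  # WDBC data shipped with scikit-learn
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.feature_selection import RFE
from sklearn.svm import SVC
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier
from sklearn.metrics import accuracy_score, recall_score

X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=42, stratify=y)

models = {
    "SVM + PCA": Pipeline([("scale", StandardScaler()), ("pca", PCA(n_components=10)), ("clf", SVC())]),
    "LR + RFE": Pipeline([("scale", StandardScaler()),
                          ("rfe", RFE(LogisticRegression(max_iter=5000), n_features_to_select=10)),
                          ("clf", LogisticRegression(max_iter=5000))]),
    "RF": RandomForestClassifier(n_estimators=200, random_state=42),
    "GB": GradientBoostingClassifier(random_state=42),
}

for name, model in models.items():
    model.fit(X_tr, y_tr)
    pred = model.predict(X_te)
    # Sensitivity is recall on the positive class; specificity is recall on the negative class.
    print(name,
          "accuracy:", round(accuracy_score(y_te, pred), 4),
          "sensitivity:", round(recall_score(y_te, pred, pos_label=1), 4),
          "specificity:", round(recall_score(y_te, pred, pos_label=0), 4))
```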
International Journal of Experimental Research and Review, 2024
The detection of communities is an important problem in social network analysis, with applications in various domains like sociology, biology, computer science, and marketing. In this context, genetic algorithms have proven effective in detecting communities by optimizing the modularity score of the network. The work proposed in this paper uses an elitism-based genetic algorithm with modified crossover and mutation techniques to detect communities in social networks. The proposed methodology incorporates elitism, N-point crossover, and inverse mutation to enhance the effectiveness of genetic algorithms in solving optimization problems. The idea introduced in this article significantly extends the current understanding of optimization and evolutionary algorithms. We present an advanced methodology that leverages various genetic operators to improve the performance of a genetic algorithm in solving community detection problems in complex networks. Numerous research papers have extensively showcased the practicality of evolutionary and swarm-based algorithms in addressing real-world problems across diverse domains such as viral marketing, link prediction, influence maximization, and political polarization. Hybridizing these algorithms with other optimization techniques has improved performance and convergence speed, leading to enhanced optimization outcomes.
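The sketch below illustrates the named genetic operators (elitism, N-point crossover, inverse mutation) with modularity as the fitness function, using networkx on a small example graph. The chromosome encoding, population size and other parameters are assumptions for illustration, not the paper's settings.

```python
# Minimal sketch of an elitism-based GA with N-point crossover and inverse mutation,
# scored by network modularity. Encoding: one community label per node.
import random
import networkx as nx
from networkx.algorithms.community import modularity

G = nx.karate_club_graph()          # small example network
nodes = list(G.nodes())
K = 4                               # assumed maximum number of communities

def fitness(chrom):
    groups = {}
    for node, label in zip(nodes, chrom):
        groups.setdefault(label, set()).add(node)
    return modularity(G, groups.values())

def n_point_crossover(a, b, n_points=2):
    cuts = sorted(random.sample(range(1, len(a)), n_points))
    child, take_a, prev = [], True, 0
    for cut in cuts + [len(a)]:
        child += (a if take_a else b)[prev:cut]
        take_a, prev = not take_a, cut
    return child

def inverse_mutation(chrom, rate=0.2):
    if random.random() < rate:
        i, j = sorted(random.sample(range(len(chrom)), 2))
        chrom[i:j] = reversed(chrom[i:j])    # reverse a random segment of the chromosome
    return chrom

pop = [[random.randrange(K) for _ in nodes] for _ in range(60)]
for _ in range(100):
    pop.sort(key=fitness, reverse=True)
    elite = pop[:6]                           # elitism: carry the best individuals forward
    children = []
    while len(children) < len(pop) - len(elite):
        p1, p2 = random.sample(pop[:30], 2)   # simple truncation selection
        children.append(inverse_mutation(n_point_crossover(p1, p2)))
    pop = elite + children

best = max(pop, key=fitness)
print("best modularity:", round(fitness(best), 3))
```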
International Journal of Experimental Research and Review, 2024
A multi-variable relationship exists in Cognitive Radio Networks (CRNs), where factors such as energy efficiency, throughput, delay and the Signal to Interference plus Noise Ratio (SINR) are related. The SINR characterizes signal quality and is defined as the power of the desired signal over the total power of the interfering signals plus noise. This work proposes an effective energy- and delay-efficient channel allocation strategy for CRNs using Q-Learning and actor-critic algorithms that maximize rewards. We also propose a Proximal Policy Optimization (PPO) algorithm that clips the surrogate objective to prevent large policy changes and ensure that the other parameters remain stable over time. We study the tradeoff between rewards, energy efficiency and other parameters and compare the algorithms on these metrics. Results show that the proposed PPO method, with an optimal increase in energy consumption, significantly reduces the delay, improves the throughput and reduces the packet loss ratio for efficient channel allocation. This is consistent with the findings in the results section, where comparing the proposed method with other algorithms shows improved throughput and channel utilization. As the simulation results indicate that the PPO algorithm has very high throughput and significantly minimizes delay and packet loss, it is suitable for all sorts of services such as video, imaging or M2M. The results are also compared with two existing channel allocation schemes, confirming that the proposed algorithm performs better in terms of the throughput reported in one scheme and the channel efficiency reported in the other.
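The clipped surrogate objective mentioned above is the core of PPO's stability mechanism. The following minimal numpy sketch shows how clipping the probability ratio bounds the size of a policy update; the ratio and advantage values are hypothetical.

```python
# Minimal numpy illustration of PPO's clipped surrogate objective: the policy
# update gains nothing from moving the probability ratio outside [1-eps, 1+eps],
# which keeps the channel-allocation policy from changing too abruptly.
import numpy as np

def clipped_surrogate(ratio, advantage, eps=0.2):
    """ratio = pi_new(a|s) / pi_old(a|s); advantage = estimated advantage A(s, a)."""
    unclipped = ratio * advantage
    clipped = np.clip(ratio, 1.0 - eps, 1.0 + eps) * advantage
    # PPO maximises the elementwise minimum of the two terms.
    return np.mean(np.minimum(unclipped, clipped))

ratios = np.array([0.9, 1.1, 1.6])        # hypothetical probability ratios
advantages = np.array([1.0, -0.5, 2.0])   # hypothetical advantage estimates
print("surrogate objective:", clipped_surrogate(ratios, advantages))
```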
International Journal of Experimental Research and Review, 2024
Obesity is one of the greatest public health challenges of the 21st century. In India, about 30-65% of urban adults are obese. The aim of this study was to investigate neuromuscular electrical stimulation (NMES) markers as a therapeutic tool for assessing the effectiveness of exercise intervention at different levels in Class I and Class II obese male subjects. This randomized controlled study was conducted in the physiotherapy outpatient department of Madha Hospital, Kovur, Chennai, over a duration of 12 weeks. The sample of 30 men was divided into a Class I obese group (n = 15) and a Class II obese group (n = 15). Subjects aged between 18 and 50 years were included; Class III obese men with co-morbidities were excluded. The data of this randomized controlled study include anthropometric measurements such as height, weight, BMI, hip circumference, waist circumference, thigh circumference, waist-to-height ratio, waist-to-hip ratio, waist-to-thigh ratio, sagittal abdominal diameter, abdominal skinfold thickness and thigh skinfold thickness, along with the NMES markers. Paired t-test analysis was performed, and SPSS version 20 was used to analyze the collected data. Results are presented as mean and standard deviation. There was a significant difference (p ≤ 0.005) in mean values across the study duration for both Class I and Class II obese men when NMES markers were used as a diagnostic tool. Thus, this study concludes that the NMES marker can be used as a therapeutic tool to analyze obesity.
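A minimal sketch of the paired t-test used for the pre/post comparisons is shown below; the waist-circumference values are placeholders rather than study data.

```python
# Minimal sketch of a paired t-test comparing pre- and post-intervention
# measurements within the same group; numbers below are placeholders, not study data.
from scipy import stats

waist_pre  = [104.2, 110.5, 98.7, 101.3, 107.9]   # hypothetical pre-intervention values (cm)
waist_post = [101.8, 107.1, 96.9, 100.0, 104.6]   # hypothetical post-intervention values (cm)

t_stat, p_value = stats.ttest_rel(waist_pre, waist_post)
print(f"t = {t_stat:.3f}, p = {p_value:.4f}")
```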
International Journal of Experimental Research and Review, 2024
A field experiment was conducted at the Agriculture Farm, Institute of Agriculture, Visva-Bharati, Sriniketan, West Bengal, during the kharif seasons of 2021-22 and 2022-23 to assess the performance of rice under various levels of nitrogen and silicon in the red and laterite soil of West Bengal. The experiment was laid out in a split-plot design with three levels of nitrogen in the main plots and five levels of silicon in the sub-plots. In the pooled data of both years, among the main plots, treatment N3 with nitrogen applied @ 20 kg/ha as basal + LCC-4 at 20 kg N/ha recorded the highest plant height (141.24 cm), dry matter accumulation at harvest (1411.97 g/sq. m), LAI at 60 DAT (4.88), number of tillers per sq. m at harvest (273.94), panicle length (29.04 cm), panicle number (263.82), number of filled grains (102.51), test weight (25.2 g), rice grain yield (6.17 t/ha), straw yield (8.25 t/ha), protein content (7.22%) and protein yield (446.78 kg/ha). Among the sub-plots, treatment S5 with SiO2 applied @ 400 kg/ha recorded the highest plant height (137.98 cm), dry matter accumulation at harvest (1337.48 g/sq. m), LAI at 60 DAT (4.74), number of tillers per sq. m at harvest (272.91), panicle length (29.53 cm), panicle number (259.73), number of filled grains (105.37), test weight (25.3 g), rice grain yield (5.83 t/ha), straw yield (7.86 t/ha), protein content (7.65%) and protein yield (445.81 kg/ha).
International Journal of Experimental Research and Review, 2024
This study deals with emerging machine learning technologies, deep learning, and Transformers with encoder-decoder mechanisms for image captioning. The study provides in-depth and detailed information about the methodologies, algorithms and procedures involved in the task of captioning images, and explores and implements efficient technologies for producing relevant captions. This research aims to achieve a detailed understanding of image captioning using Transformers and convolutional neural networks, which can be realized through various available algorithms. The methods and utilities used in this study include predefined CNN models, the COCO dataset, Transformers (a BERT encoder with a GPT decoder) and machine learning algorithms for visualization and analysis of model performance, which contribute to advancements in the accuracy and effectiveness of image captioning models and technologies. The evaluation and comparison of metrics applied to the generated captions indicate the model's performance.
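As a structural illustration of the encoder-decoder captioning idea, the PyTorch sketch below feeds CNN image features into a transformer decoder that emits caption tokens. The backbone, dimensions and vocabulary size are assumptions for illustration; the paper's specific BERT/GPT configuration is not reproduced here.

```python
# Structural sketch of CNN-plus-transformer captioning: CNN feature maps act as the
# memory of a transformer decoder that predicts caption tokens autoregressively.
import torch
import torch.nn as nn
from torchvision.models import resnet50

class CaptionModel(nn.Module):
    def __init__(self, vocab_size=10000, d_model=512, n_heads=8, n_layers=4):
        super().__init__()
        backbone = resnet50(weights=None)
        self.encoder = nn.Sequential(*list(backbone.children())[:-2])  # spatial feature map
        self.project = nn.Linear(2048, d_model)
        self.embed = nn.Embedding(vocab_size, d_model)
        layer = nn.TransformerDecoderLayer(d_model, n_heads, batch_first=True)
        self.decoder = nn.TransformerDecoder(layer, n_layers)
        self.out = nn.Linear(d_model, vocab_size)

    def forward(self, images, tokens):
        feats = self.encoder(images)                 # (B, 2048, H, W)
        feats = feats.flatten(2).transpose(1, 2)     # (B, H*W, 2048)
        memory = self.project(feats)                 # (B, H*W, d_model)
        tgt = self.embed(tokens)                     # (B, T, d_model)
        mask = nn.Transformer.generate_square_subsequent_mask(tokens.size(1))
        return self.out(self.decoder(tgt, memory, tgt_mask=mask))

model = CaptionModel()
logits = model(torch.randn(2, 3, 224, 224), torch.randint(0, 10000, (2, 12)))
print(logits.shape)  # (2, 12, 10000): next-token scores per caption position
```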
International Journal of Experimental Research and Review, 2024
The Internet of Vehicles (IoV) is a new way of enhancing vehicle performance and communication. This paper investigates the technologies and applications that drive its integration. By addressing data privacy, speed, and scaling issues, IoV in electric vehicles enhances real-time data transfer, predictive maintenance and route optimisation. The research led to the Blockchain Six-Layer Centralised Architecture Model (BS-LCAM), which mainly seeks to improve the efficiency and safety of electric vehicles (EVs) and IoV networks. The layers comprising this paradigm include the physical, data link, transport, application and security layers. The security layer, for instance, ensures that information is kept safe over the network and cannot be lost or modified without all parties' consent. All of these layers are enhanced by blockchain technology, which provides a secure connection for the BS-LCAM framework, allowing fast data storage and seamless interoperability among the various IoV constituents. The flexibility of an intelligent mobility system built on the comprehensive BS-LCAM model might benefit areas such as autonomous driving, energy optimization, fleet management and user-centric services. This paper simulates the performance of the BS-LCAM model in diverse environments, reporting 97.6% performance and scalability and 98.7% route optimization.
International Journal of Experimental Research and Review, 2024
This study examines hostile cyber-physical attacks on autonomous systems and proposes a novel approach. The recommended strategy integrates many domains, evaluates data quantitatively, and emphasizes real-world applications. A detailed comparison with six conventional approaches is carried out, and four graphics depict the comparative study and how the recommended strategy would handle cyber-physical attack challenges. The recommended solution utilizes multidisciplinary research, prioritizes quantitative reviews, and demonstrates its practical application and adaptability to various security challenges, thereby establishing a robust framework. Our analysis highlights the key advantages of the recommended technique by comparing it with six well-known methodologies, and clear illustrations support these findings, demonstrating the potential of the approach. The grid shows its strong presence across important criteria, the radar chart shows its success in data-driven analysis, freedom, and real-world application, and the scatter plot illustrates that the technique is flexible and amenable to quantitative review. Results reveal that the recommended technique is powerful and comprehensive, helping to analyze and mitigate cyber-physical attacks on autonomous systems. The visualizations make it simple and fast to compare the recommended method with conventional methods and to see its substantial advantages. This research will make autonomous systems safer and more resistant to new threats, ensuring their safety and consistency.
International Journal of Experimental Research and Review, 2024
In today's digitally connected world, Distributed Denial of Service (DDoS) attacks remain a formidable challenge, undermining the stability of network infrastructures and demanding robust detection strategies. This research explores advanced methodologies for DDoS detection by conducting a comparative analysis of machine learning and deep learning approaches using the CICDDoS2019 dataset. Initially, a hybrid machine learning framework is implemented, integrating K-Means clustering for pre-labeling the dataset and employing supervised models such as Random Forest (RF), Extreme Gradient Boosting (XGBoost), Adaptive Boosting (AdaBoost), Support Vector Machine (SVM), and Artificial Neural Network (ANN). This approach achieves an accuracy of 99.46%, showcasing its effectiveness while highlighting challenges like manual feature selection and limited scalability for complex datasets. A novel hybrid deep learning architecture is proposed to overcome these challenges, combining Convolutional Neural Networks (CNN) for spatial feature extraction and Long Short-Term Memory (LSTM) networks for temporal sequence learning. This automated feature extraction mechanism eliminates reliance on manual intervention, ensuring adaptability to evolving attack patterns. The proposed CNN-LSTM model demonstrates an impressive accuracy of 99.84%, significantly outperforming traditional machine learning models. Additionally, the model's adaptability and resilience against dynamic attack behaviours position it as a reliable solution for real-time DDoS mitigation. This study emphasizes the growing relevance of deep learning techniques in enhancing cyber security and underscores the potential of hybrid architectures in effectively detecting and mitigating modern cyber threats. The findings provide valuable insights into developing scalable, high-performance systems capable of addressing the ever-evolving nature of DDoS attacks.
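A compact Keras sketch of the CNN-LSTM combination described above is given below; the window length, feature count and layer sizes are illustrative assumptions rather than the paper's exact architecture.

```python
# Compact sketch of the CNN-LSTM idea: Conv1D layers extract local (spatial) patterns
# from flow features, an LSTM models their temporal ordering, and a sigmoid head flags
# attack traffic. Shapes and layer sizes are illustrative assumptions.
from tensorflow import keras
from tensorflow.keras import layers

TIMESTEPS, FEATURES = 20, 40   # assumed window length and per-flow feature count

model = keras.Sequential([
    keras.Input(shape=(TIMESTEPS, FEATURES)),
    layers.Conv1D(64, kernel_size=3, activation="relu", padding="same"),
    layers.MaxPooling1D(pool_size=2),
    layers.Conv1D(128, kernel_size=3, activation="relu", padding="same"),
    layers.LSTM(64),                         # temporal sequence learning
    layers.Dropout(0.3),
    layers.Dense(1, activation="sigmoid"),   # benign vs. DDoS
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.summary()
```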
International Journal of Experimental Research and Review, 2024
A novel and adaptable forming method, known as Single Point Incremental Forming (SPIF), has emerged to meet the growing needs of the manufacturing industry. This technique is particularly valuable for producing innovative products from sheet metal, offering enhanced flexibility and precision in fabrication. The input parameters during the forming process play a crucial role in determining the final product's formability, in terms of maximum formable depth (MFD), and surface quality, in terms of average surface roughness (Ra). The present research aims to optimise the forming parameters for formability and surface quality during SPIF of AZ31 magnesium alloy. AZ31 is selected as the target material due to its widely recognized strength-to-weight ratio, making it ideal for lightweight applications in the automotive, aerospace, and electronics industries. The experiments were conducted using Taguchi's design of experiments. The tool diameter (TD), tool rotational speed (TRS), tool feed rate (TFR), and incremental step depth (ISD) were chosen as variable parameters, keeping other parameters constant, with maximum formable depth and average surface roughness as response parameters. The best parameter settings were found, and the statistically significant parameters for the responses were examined using Analysis of Variance (ANOVA). The results revealed that the maximum formable depth (23.5 mm), representing the material's formability, increases with larger tool diameters (10 mm and 12 mm) and higher tool rotational speeds (5000 rpm and 6000 rpm) but decreases with higher tool feed rates (500 mm/min and 600 mm/min). On the other hand, surface roughness improves (decreases) with higher tool rotational speeds, while it worsens with larger tool diameters, higher tool feed rates and greater incremental step depths. Furthermore, a confirmation experiment using the optimal conditions showed good agreement with the experimental observations. Additionally, linear regression models for predicting the maximum formable depth and average surface roughness were developed by applying response surface methodology (RSM), and these also agreed well with experiments conducted at the optimal parameters.
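As an illustration of the regression-model step mentioned above, the sketch below fits a linear model relating the four process parameters to a response such as maximum formable depth; the data rows and values are placeholders, not the paper's measurements.

```python
# Minimal sketch of fitting a linear model relating the four process parameters to a
# response such as maximum formable depth; the rows below are placeholder values.
import numpy as np
from sklearn.linear_model import LinearRegression

# Columns: tool diameter (mm), rotational speed (rpm), feed rate (mm/min), step depth (mm)
X = np.array([[8, 4000, 400, 0.4],
              [10, 5000, 500, 0.5],
              [12, 6000, 600, 0.6],
              [10, 6000, 400, 0.5],
              [12, 5000, 600, 0.4]])
y = np.array([19.2, 21.5, 22.8, 23.1, 22.0])   # hypothetical maximum formable depth (mm)

reg = LinearRegression().fit(X, y)
print("coefficients:", reg.coef_, "intercept:", reg.intercept_)
print("predicted MFD:", reg.predict([[10, 5000, 500, 0.5]]))
```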
International Journal of Experimental Research and Review, 2024
A low-power strategy that can manage analogue, digital, and RF functionalities on the same chip is crucial for wireless systems. Various difficulties restrict the widespread adoption of CMOS power amplifiers even though they enable highly integrated, low-cost wireless communication. Some of the main issues with CMOS power amplifiers include non-linearity, low breakdown voltage, a lack of high-voltage capacitors, and inaccurate RF models. A linear power amplifier (LPA) amplifies the RF signal without distortion but is less efficient when driven by a constant supply voltage. To significantly enhance the efficiency of power amplifiers, three frequently utilised techniques, namely Doherty, envelope elimination and restoration (EER), and envelope tracking (ET), are reviewed in this work. Results point towards the ET approach as the one best suited for future mobile communication systems. The essential component of an ET system, the envelope tracking power supply, determines how effectively the system functions. This work also lists the benefits of FinFET technology over CMOS alongside the three popular techniques for increasing power amplifier efficiency. With the advent of new mobile communication systems, operating frequency bands and the peak-to-average power ratio (PAPR) are growing quickly, posing significant design issues. FinFET as an alternative may considerably reduce the chip area.
International Journal of Experimental Research and Review, 2024
Data plays a crucial role in the contemporary era of technology, as it is a vital element in the publication of news on the internet or a website. Nevertheless, reading long reports in order to fully comprehend events can be a challenging endeavor, frequently leading to subjective judgments. The application's architecture integrates the categorization of news stories by day, resulting in a well-organized and readily accessible archive. The application employs web scraping, which entails pulling pertinent news articles from numerous internet sources. It uses sophisticated summarization libraries, including BERT, BART, T5 and Google Pegasus models, to condense the information into a succinct and comprehensible form. The T5 model performs exceptionally well in text summarization and other natural language processing tasks because of its text-to-text structure, and it is also a highly customizable language model. Google Pegasus, specialized in abstractive summarization, uses self-attention mechanisms and rigorous pre-training to generate high-quality, concise news summaries. In summary, the system handles collecting, storing, and summarizing news articles, and offers a straightforward design that makes it simple to browse past news stories and their summaries.
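A minimal sketch of how such summarization models can be invoked is shown below using the Hugging Face transformers pipeline; the checkpoint names are public examples and not necessarily the ones used in the application, and the article text is a placeholder.

```python
# Illustrative use of the transformers summarization pipeline with T5 and Pegasus
# checkpoints; model names here are public examples, not necessarily the app's models.
from transformers import pipeline

article = (
    "Placeholder news report text. A longer scraped article would normally be passed "
    "here, covering the event in enough detail for the summarizer to condense it."
)

t5_summarizer = pipeline("summarization", model="t5-small")
pegasus_summarizer = pipeline("summarization", model="google/pegasus-xsum")

print(t5_summarizer(article, max_length=60, min_length=10)[0]["summary_text"])
print(pegasus_summarizer(article, max_length=60, min_length=10)[0]["summary_text"])
```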
International Journal of Experimental Research and Review, 2024
The present study conducted a bibliometric analysis to synthesise the available literature on cybersecurity. The analysis identifies prolific authors, relevant sources, affiliations, nations, trend topics, publication trends, themes, and collaboration patterns among countries. The Scopus database was searched using the keyword "cyber security", and the search returned 31,852 articles. After applying the inclusion and exclusion criteria, a total of 733 documents were extracted in CSV format for analysis, which was carried out with the Biblioshiny web tool of the R package. The findings show an exponential rise in scholarly work on cybersecurity. "Information and Computer Security" is identified as the most relevant source. Chen H has published the most articles, and the USA leads the field in terms of publications, global citations, and collaboration with other countries. The trend topics and themes are also listed. The study helps readers understand the landscape of cybersecurity research, identify key contributors, and address the challenges posed by cyber threats.
International Journal of Experimental Research and Review, 2024
In the urban context, wetlands play a proven role in retaining pollution, alleviating the impacts of climate change, and enhancing the well-being of citizens. Managing these ecosystems requires formulating an integrated development policy reconciling protection, enhancement, and the satisfaction of economic and social needs. Achieving this goal requires the establishment of a holistic and collaborative vision that engages all stakeholders. This paper focuses on rehabilitating the El Oulfa pond located in Casablanca, outlining the role of civil society and scientists in producing quantitative and qualitative data. This work aims to facilitate an informed decision-making process concerning the pond's development. The quantitative analysis enabled us to characterize the water quality and identify the biodiversity of the El Oulfa pond. The outcomes underscore that 60% of the bird species whose presence around the pond is confirmed require a wetland habitat, and some of the species identified are classified as rare or endangered. 23% of the recorded plants are indicative of a natural area of ecological interest. These characteristics are important to take into consideration when developing the site. The biodiversity of the El Oulfa pond could be jeopardized by significant water pollution. Indeed, the surface water parameters monitored between 2015 and 2021 show concentrations that exceed the Moroccan thresholds for fish water quality, suggesting that, for these parameters, the ecosystem does not provide conditions conducive to the maintenance of biodiversity. This pollution also poses challenges for residents, who expressed, during the open forum and interviews conducted in this study, the critical need to depollute and secure the pond before initiating any development. Building upon the study's findings, development strategies have been proposed to enhance the site and promote collective stewardship.
International Journal of Experimental Research and Review, 2024
Plant disease identification is crucial to food security and agricultural product availability. Traditional disease diagnosis can be tedious, laborious, and inaccurate. This investigation examines how modern machine learning algorithms might improve the efficacy and precision of plant disease diagnostics. Machine learning nevertheless faces many obstacles, including model training, processing costs, and a rising demand for large datasets. This study proposes a novel method called the Automated Machine Learning Classification Framework (AMLCF) to predict crop yield and detect pest patterns. The framework simplifies model selection, hyperparameter adjustment, and feature engineering for non-experts, and greatly reduces the time and computational resources needed. The suggested AMLCF is evaluated on several distinct agricultural datasets to validate its versatility in plant disease detection. Our extensive simulation analysis found that AMLCF exceeds existing machine learning methods in speed, accuracy, and usability. Besides predicting plant illnesses, the system can predict crop yield and detect pests. These findings suggest AMLCF could transform farming, enabling better plant health monitoring, early disease identification, and improved decision support for farmers. The experimental results show that the proposed AMLCF model achieves an accuracy ratio of 92.6%, computational efficiency of 97.4%, versatility of 98.3%, user accessibility of 99.1%, and crop health tracking of 94.8% compared to other existing models.
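As a generic stand-in for the kind of automated model and hyperparameter selection AMLCF describes, the sketch below searches over candidate classifiers and settings with scikit-learn; it is not the AMLCF implementation itself, and the data are synthetic placeholders.

```python
# Sketch of automated model and hyperparameter selection in the spirit of the framework
# described above; a generic scikit-learn search, not the AMLCF implementation.
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC
from sklearn.datasets import make_classification

X, y = make_classification(n_samples=500, n_features=20, random_state=0)  # placeholder data

pipe = Pipeline([("scale", StandardScaler()), ("clf", RandomForestClassifier())])
search_space = [
    {"clf": [RandomForestClassifier(random_state=0)], "clf__n_estimators": [100, 300]},
    {"clf": [SVC()], "clf__C": [0.1, 1, 10]},
]
search = GridSearchCV(pipe, search_space, cv=5, scoring="accuracy")
search.fit(X, y)
print("best model:", search.best_params_, "cv accuracy:", round(search.best_score_, 3))
```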
International Journal of Experimental Research and Review, 2024
Classifying tumors from MRI scans is a key medical imaging and diagnosis task. Conventional feature-based methods and traditional machine learning algorithms are commonly used for tumor classification, which limits their performance and generalization. A hybrid framework is implemented here for the classification of brain tumors from MRIs. The framework contains three basic components: feature extraction, feature fusion, and classification. The feature extraction module uses a convolutional neural network (CNN) to automatically extract high-level features from MRI images. These high-level features are combined with clinical and demographic features through a feature fusion module for better discriminative power. A Support Vector Machine (SVM) in the classification module assigns tumor class labels to the fused features. The proposed model obtained 90.67% accuracy, 94.67% precision, 83.82% recall and an 83.71% F1-score. Experimental results demonstrate the superiority of our framework over existing solutions, with exceptional accuracy rates compared to other frequently used models. This hybrid deep learning framework shows promising performance for efficient and reproducible tumor classification from brain MRI scans.
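The feature-fusion step can be pictured as follows: deep features from a CNN's penultimate layer are concatenated with clinical/demographic features and passed to an SVM. The backbone, feature dimensions and data in this sketch are illustrative assumptions, not the paper's configuration.

```python
# Structural sketch of the feature-fusion idea: CNN features from an MRI slice are
# concatenated with clinical/demographic features and classified by an SVM.
import numpy as np
import torch
from torchvision.models import resnet18
from sklearn.svm import SVC

backbone = resnet18(weights=None)
backbone.fc = torch.nn.Identity()      # expose the 512-d penultimate features
backbone.eval()

def extract_features(mri_batch):
    with torch.no_grad():
        return backbone(mri_batch).numpy()          # (N, 512) deep features

mri_images = torch.randn(8, 3, 224, 224)            # placeholder MRI slices
clinical = np.random.rand(8, 5)                     # placeholder clinical/demographic features
labels = np.array([0, 1, 2, 0, 1, 2, 0, 1])         # placeholder tumor classes

fused = np.hstack([extract_features(mri_images), clinical])   # feature fusion
clf = SVC(kernel="rbf").fit(fused, labels)
print("predicted classes:", clf.predict(fused))
```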
International Journal of Experimental Research and Review, 2024
Negotiation is a fundamental part of daily life, serving as a key tool for achieving success across various domains. This study shifts the focus from merely successful negotiation to the concept of "wise negotiation". Central to this exploration is the investigation of gender differences in both the practice and conceptualization of wise negotiation. The study's novelty lies in distinguishing wise negotiation from traditional notions of success, revealing nuanced gender-specific approaches within a specialized cohort of recognized negotiators. Using a mixed-method research design, the study gathered implicit and explicit data from 20 women and 31 men, all recognized by peers for their qualities as wise negotiators. Data collection included interviews, negotiation tasks, and questionnaires, analyzed through both qualitative and quantitative methods. Findings revealed distinct gendered perspectives on wise negotiation. Implicit data showed that women often approached wise negotiation as a relational and holistic endeavor, prioritizing the well-being of all parties, whereas men tended to adopt a more pragmatic view, focusing on efficiency and practical outcomes. Explicit data from questionnaires further highlighted gender disparities: a Mann-Whitney U test identified significant differences in wisdom-related variables such as openness, emotionality, adaptive performance through training and learning, and emotional management. Notably, no gender differences emerged in a negotiation simulation task. Examining gendered views of negotiation is particularly critical because gender shapes interpersonal dynamics, communication styles, and decision-making processes more profoundly than many other demographic variables. Understanding these differences is essential for developing inclusive strategies that transcend stereotypes, ensuring equitable participation and outcomes. Women emphasized relational dynamics, while men leaned toward practicalities, suggesting a divergence from traditional gender roles. These insights can inform the development of gender-inclusive policies and training programs that cultivate wisdom in negotiation, fostering a balanced and equitable approach across diverse contexts. This focus on gendered nuances enriches our understanding of wise negotiation and its potential applications.
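A minimal sketch of the Mann-Whitney U test used for the group comparison is shown below; the score values are placeholders rather than study data.

```python
# Minimal sketch of a Mann-Whitney U test comparing a wisdom-related score between
# two groups; the values below are placeholders, not study data.
from scipy.stats import mannwhitneyu

scores_women = [4.2, 3.8, 4.5, 4.0, 4.7, 3.9]   # hypothetical openness scores
scores_men   = [3.5, 3.9, 3.2, 4.1, 3.6, 3.4]

u_stat, p_value = mannwhitneyu(scores_women, scores_men, alternative="two-sided")
print(f"U = {u_stat}, p = {p_value:.4f}")
```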
International Journal of Experimental Research and Review, 2024
Hypertension is a common cardiovascular condition that poses significant health challenges to the public on a large scale globally. It is important to accurately predict the risk of hypertension to save lives and improve overall quality of life. Traditionally, the detection of hypertension relies on clinical criteria such as blood pressure measurement and examination of medical history. However, these methods have drawbacks, including potential human error, time consumption, and the possibility of missed diagnoses. This paper aims to identify the features or symptoms of hypertension and predict its risk factors using machine learning algorithms. It is also of utmost importance to identify the symptoms, as they play a pivotal role in recognizing the type of risk for hypertension. To conduct the work, a dataset of 13 attributes, including gender, age, smoking habits, etc., has been used and visualized graphically to understand the patterns among them. Multiple machine learning techniques were then applied and examined on the basis of standard metrics. Results indicate that random forest models outperform existing approaches, achieving an accuracy of 87.26% in predicting low- and high-risk hypertension. Furthermore, classification reports reveal superior precision, recall, and F1-scores for random forests compared to alternative models. Insights from learning curves and confusion matrices provide a valuable understanding of model performance and data sufficiency. Overall, this research highlights the impact of machine learning in accurately predicting the risk of hypertension and underscores the importance of ongoing research efforts to translate these findings into practical clinical applications.
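The random forest workflow described above can be sketched as follows; the CSV file name, the label column and the split are assumptions for illustration rather than the paper's exact setup.

```python
# Illustrative sketch of training and evaluating a random forest on a 13-attribute
# hypertension dataset; the file name and column names are hypothetical.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score, classification_report

df = pd.read_csv("hypertension.csv")          # hypothetical file: 13 attributes plus a "risk" label
X = df.drop(columns=["risk"])
y = df["risk"]

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=42, stratify=y)
rf = RandomForestClassifier(n_estimators=300, random_state=42).fit(X_tr, y_tr)
pred = rf.predict(X_te)

print("accuracy:", accuracy_score(y_te, pred))
print(classification_report(y_te, pred))      # precision, recall and F1 per class
```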
International Journal of Experimental Research and Review, 2024
Blockchain technology's secure, decentralized platforms have revolutionized multiple industries. This paper discusses the problems that can arise when employing blockchain technology to ensure security, including questions of scalability, privacy, and regulatory compliance. It proposes an approach called Blockchain-based Digital Signature Security Analysis (B-DSSA). This solution strengthens digital signature algorithms using the transparency and immutability of blockchain. The design ensures that electronic communications are authentic, intact and non-repudiable, making it applicable to secured settings such as healthcare, finance, and supply chain management. The results indicate substantial advances in preventing unauthorized access and tampering in digital transactions. By combining public and private blockchains, the method achieves scalability while protecting the privacy of sensitive data. This configuration enables real-time applications in healthcare and finance by optimizing resource utilization, maintaining high data integrity, and enabling speedier processing. Document signing systems, identity verification, and contract execution are examples showing the flexibility and endurance of blockchain-based security solutions through B-DSSA. The paper shows how blockchain technology may revolutionize the field of security, leading to further research on marketing orientation issues.
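A generic sketch of the signature-plus-hash-chain idea underlying such a design is given below: each record is ECDSA-signed and linked to the previous record's hash, so tampering breaks both the signature check and the chain. This illustrates the general technique, not the B-DSSA protocol itself.

```python
# Minimal sketch of signed, hash-linked records: signatures give authenticity and
# non-repudiation, while the hash chain makes later tampering detectable.
import hashlib
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec

key = ec.generate_private_key(ec.SECP256R1())
public_key = key.public_key()

chain = []
prev_hash = b"\x00" * 32                         # genesis placeholder
for payload in [b"record-1", b"record-2"]:
    message = prev_hash + payload
    signature = key.sign(message, ec.ECDSA(hashes.SHA256()))
    block_hash = hashlib.sha256(message + signature).digest()
    chain.append({"payload": payload, "prev": prev_hash, "sig": signature, "hash": block_hash})
    prev_hash = block_hash

# Verification: every signature must validate and every link must match.
prev = b"\x00" * 32
for block in chain:
    public_key.verify(block["sig"], block["prev"] + block["payload"], ec.ECDSA(hashes.SHA256()))
    assert block["prev"] == prev
    prev = block["hash"]
print("chain verified:", len(chain), "blocks")
```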
International Journal of Experimental Research and Review, 2024
The railway infrastructure is a perfect blend of all branches of engineering, and technology has advanced drastically, mainly in the signalling, civil, electrical and mechanical engineering streams. In signalling, the field has leaped from mechanical to electronic interlocking. Civil engineering has moved from manual track maintenance to high-end mechanized tools. Mechanical engineering has progressed from wooden coaches to modern-design (LHB) coaches. In electrical engineering, the technology has changed from steam locomotives to diesel and later electrically powered locomotive designs, and nowadays locomotives are also powered by non-conventional sources (such as solar or wind). The maintenance of rail infrastructure with traditional methods of physical supervision is therefore sluggish and prone to frequent failures. Condition monitoring through manual verification by railway staff helps in the upkeep of railway assets, but it is prone to human errors and subsequent system failures, which may result in disasters like derailments or head-on collisions. Signalling components (points, tracks, signals, OFC, DNS, data loggers, EI, block-working instruments, etc.), mechanical rolling stock (carriages and wagons), electrical fitments (OHE, AT, DG, TSS/SP/SSP, locomotives, etc.) and civil structures (bridges, culverts, rail tracks, rail beds) all require regular maintenance. This paper introduces a condition monitoring method enhanced with a Wireless Sensor Network (WSN) for railway infrastructure and clearly distinguishes the typical categories of sensors best suited for condition monitoring of various types of railway assets. The proposed WSN-based technology will improve railway subsystem reliability, effectively reducing failure time and improving the operating ratio.
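A simple illustration of the kind of rule a WSN monitoring node might apply is sketched below: a reading that deviates strongly from its recent rolling baseline is flagged. The threshold and vibration values are placeholder assumptions, not field data.

```python
# Illustrative condition-monitoring rule for a WSN node: flag a sensor reading that
# deviates strongly from the rolling baseline of recent samples.
import numpy as np

readings = np.array([0.8, 0.9, 0.85, 0.82, 0.88, 2.4, 0.9, 0.87])  # hypothetical vibration (g)
window = 5

for i in range(window, len(readings)):
    baseline = readings[i - window:i]
    if abs(readings[i] - baseline.mean()) > 3 * baseline.std():
        print(f"alert: sample {i} value {readings[i]} deviates from baseline {baseline.mean():.2f}")
```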
International Journal of Experimental Research and Review, 2024
Cancer in breasts appears as a terrible malediction in society. It snitches huge human lives acro... more Cancer in breasts appears as a terrible malediction in society. It snitches huge human lives across the world and its peril is going to increase at a startling rate. Identification of this disease at the initial stages is indispensable. In many cases, traditional methods are prone to errors and protracted. Models applying machine learning approaches have been shown fruitful in this application area. There are large numbers of approaches in machine learning which demonstrate impressive results. This research strives to take out the short comings from the existing models and, by resolving the underlying technical issues, deliver higher accuracy in end results. The research motivates and endeavours to make the patients' treatment processes more justified and cost-effective. The research works with WDBC dataset for breast cancer, which is publicly accessible from the UCI research database. This study uses multiple individual learners, namely Support Vector Machines (SVM), Logistic Regression(LR), Random Forest(RF), Naive Bayes(NB), K-Nearest Neighbours(K-NN), Decision Tree(DT) and an ensemble learner called Gradient Boosting(GB) with multiple techniques of feature selection namely Principal Component Analysis (PCA) and Recursive Feature Elimination (RFE). The experimental techniques discern subtle patterns within the dataset. The proposed model evaluates the results and performances through metrics specificity, sensitivity and accuracy in a comparative structure. It succeeds with higher accuracy of 98%. The study highlights its potential as a significant tool in medical diagnostics.
International Journal of Experimental Research and Review, 2024
The detection of communities is an important problem in social network analysis, which has applic... more The detection of communities is an important problem in social network analysis, which has applications in various domains like sociology, biology, computer science, and marketing. In this context, genetic algorithms have proven to be effective in detecting communities by optimizing the modularity score of the network. The proposed work in this research paper uses an elitism-based genetic algorithm with some modified crossover and mutation techniques to detect communities in social networks. The proposed methodology incorporates the concepts of elitism, N-point crossover, and inverse mutation to enhance the effectiveness of genetic algorithms in solving optimization problems. The idea introduced in this article significantly extends the current understanding of optimization and evolutionary algorithms. We present an advanced methodology that leverages various genetic operators to improve the performance of a genetic algorithm in solving community detection problems in complex networks. Numerous research papers have extensively showcased the practicality of evolutionary and swarm-based algorithms in addressing real-world problems across diverse domains like viral marketing, link prediction, influence maximization, political polarization, etc. Hybridizing these algorithms with other optimization techniques has improved the performance and convergence speed, leading to enhanced optimization outcomes.
International Journal of Experimental Research and Review, 2024
A multi-variable relationship exists in Cognitive Radio Networks (CRNs) where factors such as Ene... more A multi-variable relationship exists in Cognitive Radio Networks (CRNs) where factors such as Energy efficiency, Throughput, Delay and Signal Noise Ratio (SINR) are related. The SINR shows the quality of the signal and is defined as the total power of a specific signal over the total power of an inter signal plus noise. This work proposes an effective energy and delay-efficient channel allocation strategy for CRNs (Cognitive Radio Networks) using Q-Learning and actor-criticism algorithms that maximize rewards. We also propose a Proximal Policy Optimization (PPO) algorithm that uses clipping of surrogate objectives to prevent large policy changes and ensure that the other parameters remain stable over time. We study the tradeoff between rewards, energy efficiency and other parameters and compare the algorithms with respect to the same. Results show that the proposed PPO method, while using optimally increased energy consumption, significantly reduces the delay, improves the thought and reduces the packet loss ratio for efficient channel allocation. This is positive with our findings shown in the results section and by comparing the proposed method with other algorithms to identify improved throughput and channel utilization. As the simulation results indicate that the PPO algorithm has very high throughput and significantly minimizes the delay and packet loss, it is suitable for application in all sorts of services such as video, imaging or M2M. The results are also compared with two of the existing channel allocation schemes and they confirm that the proposed algorithm performs better in terms of throughput discussed in one scheme and channel efficiency in the other.
International Journal of Experimental Research and Review, 2024
Obesity is one of the greatest public health challenges of the 21st century. In India, about 30-6... more Obesity is one of the greatest public health challenges of the 21st century. In India, about 30-65% of urban adults are obese. The aim of this study was to investigate neuromuscular electrical stimulation (NMES) markers as a therapeutic tool in diagnosing the effectiveness of exercise intervention in obesity at different levels in obese Class I and Class II male subjects. This randomized controlled study was conducted in the physiotherapy outpatient department of Madha Hospital, Kovur, Chennai. The study duration was 12 weeks. The sample of 30 men was divided into the class I obese men group, 15 nos, and the class II obese men group, 15 numbers. The subjects of age between 18-50 years were included in the study. The Class III Obese men associated with co-morbidities were excluded in this study. The data of Randomized controlled study include anthropometric measurements like height, weight, BMI, Hip circumference, Waist circumference, Thigh circumference, Waist to Height ratio, Waist to Hip ratio, Waist to Thigh ratio, Sagittal abdominal diameter, Abdominal Skin fold thickness, Thigh skin fold thickness and NMES markers. Paired t-test analysis was done. SPSS 20 version was used to analyze the collected data. The result was presented as mean and standard deviation. There was a significant difference in mean values at P?0.005 between different levels of study duration Class I and Class II obese men with neuromuscular electrical stimulator markers (NMES) as a Diagnostic tool. Thus, this study concludes that the NMES marker can be used as a therapeutic tool to analyze obesity.
International Journal of Experimental Research and Review, 2024
A field experiment was conducted at Agriculture Farm, Institute of Agriculture, Visva-Bharati, Sr... more A field experiment was conducted at Agriculture Farm, Institute of Agriculture, Visva-Bharati, Sriniketan, West Bengal, during kharif season of 2021-22 and 2022-23 in order to assess the performance of rice under various levels of nitrogen and silicon in red and laterite soil of West Bengal. The experiment was laid out in split plot design with three levels of nitrogen in main plots and five levels of silicon in sub plots. In the pooled data of both the years, among the main plots, treatment N3 with nitrogen applied @ 20 kg/ha as basal + LCC-4 at 20 kg N/ha had the highest plant height (141.24 cm), dry matter accumulation (1411.97 g/sq. m) at harvest, LAI at 60 DAT (4.88), number of tillers per sq. m at harvest (273.94), panicle length (29.04 cm), panicle number (263.82), number of filled grains (102.51), test weight (25.2 g), rice grain yield (6.17 t/ha), straw yield (8.25 t/ha), protein content (7.22%) and protein yield (446.78 kg/ha). Among the sub-plots, treatment S5 with SiO2 applied @ 400 kg/ha recorded the highest plant height (137.98 cm), dry matter accumulation (1337.48 g/sq. m) at harvest, LAI at 60 DAT (4.74), number of tillers per sq. m at harvest (272.91), panicle length (29.53 cm), panicle number (259.73), number of filled grains (105.37), test weight (25.3 g), rice grain yield (5.83 t/ha), straw yield (7.86 t/ha), protein content (7.65 %) and protein yield (445.81 kg/ha).
International Journal of Experimental Research and Review, 2024
This study deals with emerging machine learning technologies, deep learning, and Transformers wit... more This study deals with emerging machine learning technologies, deep learning, and Transformers with autoencode-decode mechanisms for image captioning. This study is important to provide in-depth and detailed information about methodologies, algorithms and procedures involved in the task of captioning images. In this study, exploration and implementation of the most efficient technologies to produce relevant captions is done. This research aims to achieve a detailed understanding of image captioning using Transformers and convolutional neural networks, which can be achieved using various available algorithms. Methods and utilities used in this study are some of the predefined CNN models, COCO dataset, Transformers (enc-BERT,dec-GPT) and machine learning algorithms which are used for visualization and analysis in the area of model’s performance which would help to contribute to advancements in accuracy and effectiveness of image captioning models and technologies. The evaluation and comparison of metrics that are applied to the generated captions state the model's performance.
International Journal of Experimental Research and Review, 2024
Internet of vehicle (IoV) is a new way of enhancing vehicle performance and communication. This p... more Internet of vehicle (IoV) is a new way of enhancing vehicle performance and communication. This paper investigates the technology and applications that drive its integration. By dealing with data privacy, speed, and sizing issues, IoV in electric vehicles enhances real-time data transfer, predictive maintenance and route optimisation. The research led to the Blockchain Six-Layer Centralised Architecture Model (BS-LCAM). This research mainly seeks to improve the efficiency and safety of electric vehicles (EVs) and IoV networks. The six levels comprising this paradigm are data link, physical, transport, application and security. For instance, this layer assures that information can be kept safe over networks but cannot get lost or modified without all parties' permission. All these stages are enhanced by blockchain technology. It provides a secure connection for the BS-LCAM framework, allowing fast data storage and seamless interoperability among several IoV constituents. For example, an intelligent mobility system’s flexibility (comprehensive BS-LCAM model) might benefit different areas such as autonomous driving, energy optimization, fleet management and user-centric services. This paper simulates the performance of BS-LCAM model in diverse environments with 97.6 % performance and scalability, 98.7% route optimization and route optimization.
International Journal of Experimental Research and Review, 2024
This study examines hostile cyber-physical assaults on autonomous systems and proposes a novel ap... more This study examines hostile cyber-physical assaults on autonomous systems and proposes a novel approach. The recommended strategy integrates many domains, evaluates data quantitatively, and emphasizes real-world applications. A detailed comparison of six conventional approaches is underway. Four graphics depict the comparative study and how the recommended strategy would handle cyber-physical hostile assault challenges. The recommended solution utilizes multidisciplinary research, prioritizes quantitative reviews, and demonstrates its practical application and adaptability to various security challenges, thereby establishing a robust framework. Our analysis highlights the key advantages of the recommended technique by comparing six well-known methodologies. Clear illustrations support these findings, demonstrating the potential of the approach. The grid shows its strong presence across important criteria, and the radar image shows its success in data-driven analysis, freedom, and real-world application. The scatter plot illustrates that the technique is flexible and quantitative review-friendly, supporting its approach. Results reveal that the recommended technique is powerful and comprehensive. This helps analyze and mitigate cyber-physical hostile assaults on autonomous systems. Visualizations make it simple and fast to compare the recommended method to regular methods and discover its substantial advantages. This research will make autonomous systems safer and more resistant to new threats, ensuring their safety and consistency.
International Journal of Experimental Research and Review, 2024
In today's digitally connected world, Distributed Denial of Service (DDoS) attacks remain a formi... more In today's digitally connected world, Distributed Denial of Service (DDoS) attacks remain a formidable challenge, undermining the stability of network infrastructures and demanding robust detection strategies. This research explores advanced methodologies for DDoS detection by conducting a comparative analysis of machine learning and deep learning approaches using the CICDDoS2019 dataset. Initially, a hybrid machine learning framework is implemented, integrating K-Means clustering for pre-labeling the dataset and employing supervised models such as Random Forest (RF), Extreme Gradient Boosting (XGBoost), Adaptive Boosting (AdaBoost), Support Vector Machine (SVM), and Artificial Neural Network (ANN). This approach achieves an accuracy of 99.46%, showcasing its effectiveness while highlighting challenges like manual feature selection and limited scalability for complex datasets. A novel hybrid deep learning architecture is proposed to overcome these challenges, combining Convolutional Neural Networks (CNN) for spatial feature extraction and Long Short-Term Memory (LSTM) networks for temporal sequence learning. This automated feature extraction mechanism eliminates reliance on manual intervention, ensuring adaptability to evolving attack patterns. The proposed CNN-LSTM model demonstrates an impressive accuracy of 99.84%, significantly outperforming traditional machine learning models. Additionally, the model's adaptability and resilience against dynamic attack behaviours position it as a reliable solution for real-time DDoS mitigation. This study emphasizes the growing relevance of deep learning techniques in enhancing cyber security and underscores the potential of hybrid architectures in effectively detecting and mitigating modern cyber threats. The findings provide valuable insights into developing scalable, high-performance systems capable of addressing the ever-evolving nature of DDoS attacks.
International Journal of Experimental Research and Review, 2024
A novel and adaptable forming method, known as Single Point Incremental Forming (SPIF), has emerg... more A novel and adaptable forming method, known as Single Point Incremental Forming (SPIF), has emerged to meet the growing needs of the manufacturing industry. This technique is precious for producing innovative products from sheet metal, offering enhanced flexibility and precision in fabrication. The input parameters during the forming process play a crucial role in determining the final product's formability in terms of maximum formable depth (MFD) and surface quality in terms of average surface roughness (Ra). The present research aims to optimise the forming parameters for formability and surface quality during SPIF of material AZ31 magnesium alloy. In the present research, AZ31 is selected as the target material due to its widely recognized excellent strength-to-weight ratio, making it ideal for lightweight applications in the automotive, aerospace, and electronics industries. The experiments are accomplished based on the use of a Taguchi's design of experiments. The tool diameter (TD), tool rotational speed (TRS), tool feed rate (TFR), and incremental step depth (ISD) were chosen as variable parameters and, keeping other parameters constant for maximum formable depth and average surface roughness as response parameters. The best parameter settings were found, and the statistically significant parameters of the responses were examined using the Analysis of Variance (ANOVA). The results revealed that the maximum formable depth (23.5 mm), representing the material's formability, increases with larger tool diameters (10 mm and 12 mm), and higher tool rotational speeds (5000 rpm and 6000 rpm) but decreases with higher tool feed rates (500 mm/min and 600 mm/min). On the other hand, surface roughness improves (decreases) with higher tool rotational speeds, while it increases with larger tool diameters, higher tool feed rates and greater incremental step depths. Furthermore, the findings of a confirmation experiment using the optimal conditions showed a good agreement with the experimental observation. Additionally, linear regression models for predicting the maximum forming depth and average surface roughness were developed by applying the response surface methodology (RSM), which also had good agreement with experiments conducted on optimal parameters.
International Journal of Experimental Research and Review, 2024
A low-power strategy that can manage analogue, digital, and RF functionalities on a similar chip ... more A low-power strategy that can manage analogue, digital, and RF functionalities on a similar chip is crucial for wireless systems. Various difficulties restrict the widespread adoption of CMOS power amplifiers despite the fact that they provide highly integrated, low-cost wireless communication. Some of the main issues with CMOS power amplifiers include non-linearity, low breakdown voltage, a lack of high-voltage capacitors, and incorrect RF models. The RF signal is amplified without distortions using a linear power amplifier (LPA), which is less effective whenever driven by constant voltage. In order to significantly enhance the effectiveness of the power amplifiers, three frequently utilised techniques—Doherty, envelope elimination and restoration (EER), and envelope tracking (ET) techniques are reviewed in this work. Results point towards ET approach as the one that is ideally suited for future mobile communication systems. The essential component of ET systems, the envelope tracking power source, is what determines how effectively the system functions. It also lists the benefits of FinFET technology over CMOS and looks at three well-liked techniques for increasing power amplifier efficiency. Considering the advent of mobile communications systems, the frequency band and peak-to-average power ratio (PAPR) are quickly growing, posing significant design issues. FinFET as an alternative may considerably reduce the chip area.
International Journal of Experimental Research and Review, 2024
Data plays a crucial role in the contemporary era of technology, as it is a vital element in the ... more Data plays a crucial role in the contemporary era of technology, as it is a vital element in the publication of news on the internet or a website. Nevertheless, understanding long reports in order to fully comprehend events can be a challenging endeavor, frequently leading to subjective judgments. The application's architecture integrates the categorization of news stories by day, resulting in a well-organized and readily accessible archive. The application employs the web scraping method, which entails pulling pertinent news articles from numerous internet sources. The application employed sophisticated summarizing libraries, including the BERT, BART, T5 model and Google Pegasus, to condense the information into a succinct and comprehensible style. The T5 model performs exceptionally well in text summarization and other natural language processing tasks because of its text-to-text structure; it is also a very customizable language model. Google Pegasus, an expert in abstractive summarizing, uses self-attention mechanisms and rigorous pre-training to generate high-quality, concise news summaries. To summarize, these are the most important parts of our app's process. When it comes to collecting, storing, and summarizing news articles, the system has you covered. In addition, it will offer a straightforward design that makes it simple to browse past news stories and their summaries.
International Journal of Experimental Research and Review, 2024
The present study conducted a bibliometric analysis to synthesise available literature on cyberse... more The present study conducted a bibliometric analysis to synthesise available literature on cybersecurity. The analysis identifies prolific authors, relevant sources, affiliations, nations, trend topics, publication trends, themes, and collaboration patterns among countries. Scopus database was searched using the keyword “cyber security” and the search resulted in 31,852 articles. After the inclusion and exclusion criteria, a total of 733 documents were extracted in a CSV format for analysis. The Biblioshiny web tool of the R-package was used for analysis. The finding shows an exponential rise in scholarly work on cybersecurity. “Information and computer security” is identified as the most relevant source. Chen H has published most articles and the USA leads the field in terms of publication, global citations, and collaboration with other countries. The trend topics and themes have also been listed. The study would help readers understand the landscape of cybersecurity research, identify key contributors, and guide them in addressing the challenges posed by cyber threats.
International Journal of Experimental Research and Review, 2024
In urban contexts, wetlands play a proven role in retaining pollution, alleviating the impacts of climate change, and enhancing the well-being of citizens. Managing these ecosystems requires formulating an integrated development policy that reconciles protection, enhancement, and the satisfaction of economic and social needs. Achieving this goal requires establishing a holistic and collaborative vision that engages all stakeholders. This paper focuses on the rehabilitation of the El Oulfa pond in Casablanca, outlining the role of civil society and scientists in producing quantitative and qualitative data. The work aims to support an informed decision-making process concerning the pond's development. The quantitative analysis characterized the water quality and identified the biodiversity of the El Oulfa pond. The outcomes underscore that 60% of the bird species whose presence around the pond is confirmed require a wetland habitat, and some of the identified species are classified as rare or endangered. Furthermore, 23% of the recorded plants are indicative of a natural area of ecological interest. These characteristics are important to take into consideration when developing the site. The biodiversity of the El Oulfa pond could be jeopardized by significant water pollution: the surface-water parameters monitored between 2015 and 2021 show concentrations that exceed the Moroccan thresholds for fish water quality, suggesting that for these parameters the ecosystem does not provide conditions conducive to maintaining biodiversity. This pollution also poses challenges for residents, who expressed, during the open forum and interviews conducted in this study, the critical need to depollute and secure the pond before initiating any development. Building upon the study's findings, development strategies are proposed to enhance the site and promote collective stewardship.
International Journal of Experimental Research and Review, 2024
Plant disease identification is crucial to food security and the availability of agricultural products. Traditional disease diagnosis can be tedious, laborious, and inaccurate. This investigation examines how modern machine learning algorithms can make plant disease diagnosis more effective and precise. Machine learning nonetheless faces several obstacles, including model training effort, processing costs, and a rising demand for large datasets. This study proposes a novel method called the Automated Machine Learning Classification Framework (AMLCF) to predict crop yield and detect pest patterns. The framework simplifies model selection, hyperparameter tuning, and feature engineering for non-experts, and it greatly reduces the time and computational resources required. The proposed AMLCF is evaluated on several distinct agricultural datasets to validate its versatility in plant disease detection. An extensive simulation analysis found that AMLCF exceeds existing machine learning methods in speed, accuracy, and usability; besides predicting plant diseases, the system can also predict crop yield and detect pests. These findings suggest that AMLCF could transform farming through better plant health monitoring, earlier disease identification, and improved decision support for farmers. The experimental results show that the proposed AMLCF model achieves an accuracy ratio of 92.6%, computational efficiency of 97.4%, versatility of 98.3%, user accessibility of 99.1%, and crop health tracking of 94.8% compared with other existing models.
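AMLCF itself is not described in implementation detail here; as a rough sketch of the automated model-selection and hyperparameter-tuning idea the abstract refers to, a generic scikit-learn loop (dataset, candidate models and grids are all hypothetical stand-ins) might look like this:

from sklearn.datasets import load_iris  # stand-in for a crop/leaf feature dataset
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Candidate models and hyperparameter grids searched automatically.
candidates = {
    "random_forest": (RandomForestClassifier(random_state=0),
                      {"clf__n_estimators": [100, 300], "clf__max_depth": [None, 10]}),
    "svm": (SVC(), {"clf__C": [0.1, 1, 10], "clf__kernel": ["rbf", "linear"]}),
}

best_name, best_search = None, None
for name, (model, grid) in candidates.items():
    pipe = Pipeline([("scale", StandardScaler()), ("clf", model)])
    search = GridSearchCV(pipe, grid, cv=5).fit(X_train, y_train)
    if best_search is None or search.best_score_ > best_search.best_score_:
        best_name, best_search = name, search

print(best_name, best_search.best_params_, best_search.score(X_test, y_test))

The point of such automation is the one the abstract emphasizes: non-experts get a reasonable model without hand-tuning, at the cost of the search budget.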
International Journal of Experimental Research and Review, 2024
Classifying tumors from MRI scans is a key task in medical imaging and diagnosis. Conventional feature-based methods and traditional machine learning algorithms have been used for tumor classification, which limits their performance and generalization. A hybrid framework is implemented for the classification of brain tumors from MRIs. The framework contains three basic components: feature extraction, feature fusion, and classification. The feature extraction module uses a convolutional neural network (CNN) to automatically extract high-level features from MRI images. These high-level features are combined with clinical and demographic features through a feature fusion module for better discriminative power, and a support vector machine (SVM) in the classification module assigns tumor class labels to the fused features. The proposed model obtained 90.67% accuracy, 94.67% precision, 83.82% recall and 83.71% F1-score. Experimental results demonstrate the superiority of the framework over existing solutions, with exceptional accuracy compared to other commonly used models. This hybrid deep learning framework shows promising performance for efficient and reproducible tumor classification in brain MRI scans.
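The exact network architecture and data are not reproduced here; the following is a minimal sketch of the CNN-features-plus-clinical-features fusion idea, with a tiny untrained CNN, synthetic images, and made-up clinical variables standing in for the paper's actual pipeline:

import numpy as np
import torch
import torch.nn as nn
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

# Tiny CNN feature extractor; architecture and sizes are illustrative only.
class FeatureExtractor(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )

    def forward(self, x):
        return self.net(x).flatten(1)  # (batch, 32) high-level image features

# Synthetic stand-ins for MRI slices and clinical/demographic features.
rng = np.random.default_rng(0)
images = torch.randn(200, 1, 64, 64)
clinical = rng.normal(size=(200, 4))   # e.g. age, sex, two hypothetical lab values
labels = rng.integers(0, 2, size=200)

with torch.no_grad():
    cnn_features = FeatureExtractor()(images).numpy()

# Feature fusion: concatenate CNN features with clinical features, then classify with an SVM.
fused = np.concatenate([cnn_features, clinical], axis=1)
train, test = slice(0, 150), slice(150, 200)
svm = SVC(kernel="rbf").fit(fused[train], labels[train])
print("accuracy:", accuracy_score(labels[test], svm.predict(fused[test])))

In the actual framework the CNN would be trained on real MRI data before its features are fused; the sketch only shows how the fusion and SVM stages connect.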
International Journal of Experimental Research and Review, 2024
Negotiation is a fundamental part of daily life, serving as a key tool for achieving success across various domains. This study shifts the focus from merely successful negotiation to the concept of "wise negotiation." Central to this exploration is the investigation of gender differences in both the practice and conceptualization of wise negotiation. The study's novelty lies in distinguishing wise negotiation from traditional notions of success, revealing nuanced gender-specific approaches within a specialized cohort of recognized negotiators. Using a mixed-method research design, the study gathered implicit and explicit data from 20 women and 31 men, all recognized by peers for their qualities as wise negotiators. Data collection included interviews, negotiation tasks, and questionnaires, analyzed through both qualitative and quantitative methods. Findings revealed distinct gendered perspectives on wise negotiation. Implicit data showed that women often approached wise negotiation as a relational and holistic endeavor, prioritizing the well-being of all parties, whereas men tended to adopt a more pragmatic view, focusing on efficiency and practical outcomes. Explicit data from questionnaires further highlighted gender disparities: a Mann-Whitney U test identified significant differences in wisdom-related variables such as openness, emotionality, adaptive performance through training and learning, and emotional management. Notably, no gender differences emerged in a negotiation simulation task. Examining gendered views of negotiation is particularly critical because gender shapes interpersonal dynamics, communication styles, and decision-making processes more profoundly than many other demographic variables. Understanding these differences is essential for developing inclusive strategies that transcend stereotypes, ensuring equitable participation and outcomes. Women emphasized relational dynamics, while men leaned toward practicalities, suggesting a divergence from traditional gender roles. These insights can inform the development of gender-inclusive policies and training programs that cultivate wisdom in negotiation, fostering a balanced and equitable approach across diverse contexts. This focus on gendered nuances enriches our understanding of wise negotiation and its potential applications.
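For readers unfamiliar with the test used for the explicit data, a Mann-Whitney U comparison of two independent groups can be run in a few lines; the scores below are purely hypothetical and do not reproduce the study's data:

from scipy.stats import mannwhitneyu

# Hypothetical questionnaire scores for one wisdom-related variable (e.g. openness).
women_scores = [4.2, 3.8, 4.5, 4.0, 4.6, 3.9, 4.3]
men_scores = [3.5, 3.9, 3.2, 3.8, 3.4, 3.6, 4.0]

stat, p_value = mannwhitneyu(women_scores, men_scores, alternative="two-sided")
print(f"U = {stat:.1f}, p = {p_value:.3f}")

The test is non-parametric, which suits Likert-style questionnaire data where normality cannot be assumed.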
International Journal of Experimental Research and Review, 2024
Hypertension is a common cardiovascular condition that poses significant public health challenges worldwide. Accurately predicting the risk of hypertension is important for saving lives and improving overall quality of life. Traditionally, the detection of hypertension relies on clinical criteria such as blood pressure measurement and examination of medical history; however, these methods have drawbacks, including potential human error, time consumption, and the possibility of missed diagnoses. This paper aims to identify the features or symptoms of hypertension and predict its risk factors using machine learning algorithms. Identifying the symptoms is of utmost importance, as they play a pivotal role in recognizing the type of hypertension risk. To conduct the work, a dataset of 13 attributes, including gender, age, and smoking habits, is used and visualized graphically to understand the patterns among them. Multiple machine learning techniques are then applied and examined on the basis of standard metrics. Results indicate that random forest models outperform existing approaches, achieving an accuracy of 87.26% in predicting low- and high-risk hypertension. Furthermore, classification reports reveal superior precision, recall, and F1-score for random forests compared to alternative models. Insights from learning curves and confusion matrices provide a valuable understanding of model performance and data sufficiency. Overall, this research highlights the impact of machine learning in accurately predicting the risk of hypertension and underscores the importance of ongoing research efforts to translate these findings into practical clinical applications.
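A minimal sketch of the random forest workflow described above, assuming a hypothetical CSV with the 13 attributes and a "risk_level" label column (file and column names are not from the paper), could be:

import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, classification_report

# Hypothetical file and column names; the paper's dataset schema may differ.
df = pd.read_csv("hypertension_risk.csv")
X = df.drop(columns=["risk_level"])   # e.g. gender, age, smoking habits, ...
y = df["risk_level"]                  # low- vs high-risk label

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42
)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)

y_pred = model.predict(X_test)
print("accuracy:", accuracy_score(y_test, y_pred))
print(classification_report(y_test, y_pred))

The classification_report output corresponds to the precision, recall, and F1-score comparison the abstract reports for the random forest against alternative models.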
International Journal of Experimental Research and Review, 2024
Blockchain technology's secure, decentralized platforms have revolutionized multiple industries. This paper discusses potential problems with employing blockchain technology to ensure security, including questions of scalability, privacy, and regulatory compliance. It proposes a Blockchain-based Digital Signature Security Analysis (B-DSSA) approach, which strengthens digital signature schemes using the transparency and immutability of the blockchain. This design ensures that electronic communications are authentic, intact, and non-repudiable, making it applicable to security-sensitive settings such as healthcare, finance, and supply chain management. The results indicate substantial advances in preventing unauthorized access and tampering in digital transactions. By combining public and private blockchains, the method achieves scalability while protecting the privacy of sensitive data; this configuration enables real-time applications in healthcare and finance by optimizing resource utilization, maintaining high data integrity, and allowing faster processing. Document signing systems, identity verification, and contract execution are examples that show the flexibility and robustness of blockchain-based security solutions built on B-DSSA. The paper shows how blockchain technology may transform the security field and opens further research directions, including market-orientation issues.
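B-DSSA's protocol details are not given in the abstract; the following sketch only illustrates the generic idea of signing a document digest and anchoring it in an append-only, hash-chained record, using the Python cryptography library's Ed25519 primitives (all names and values are illustrative):

import hashlib, json, time
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Generic illustration only: sign a document digest and append it to a toy hash-chained ledger.
private_key = Ed25519PrivateKey.generate()
public_key = private_key.public_key()

document = b"Contract text agreed by both parties."
digest = hashlib.sha256(document).hexdigest()
signature = private_key.sign(digest.encode())

prev_hash = "0" * 64  # genesis placeholder for the toy chain
record = {"digest": digest, "signature": signature.hex(),
          "timestamp": time.time(), "prev_hash": prev_hash}
record["block_hash"] = hashlib.sha256(
    json.dumps(record, sort_keys=True).encode()).hexdigest()

# Verification: recompute the digest and check the signature against it
# (verify raises an exception if the signature or document was tampered with).
public_key.verify(bytes.fromhex(record["signature"]),
                  hashlib.sha256(document).hexdigest().encode())
print("signature valid; block hash:", record["block_hash"])

Tampering with either the document or the recorded digest breaks the verification or the hash chain, which is the property the abstract relies on for integrity and non-repudiation.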
International Journal of Experimental Research and Review, 2024
The railway infrastructure is a blend of all branches of engineering, and its technology has advanced drastically, mainly in the signalling, civil, electrical and mechanical engineering streams. Signalling has leaped from mechanical to electronic interlocking; civil engineering has moved from manual track maintenance to high-end mechanized tools; mechanical engineering has progressed from wooden coaches to modern-design (LHB) coaches; and in electrical engineering, traction has changed from steam to diesel and later to electrically powered locomotives, with locomotives nowadays also drawing on non-conventional sources (such as solar or wind). Maintaining rail infrastructure with traditional methods of physical supervision is therefore sluggish and prone to frequent failures. Condition monitoring through manual verification by railway staff helps in the upkeep of railway assets, but it is prone to human error and subsequent system failures, which may result in disasters such as derailments or head-on collisions. Signalling components (points, tracks, signals, OFC, DNS, dataloggers, EI, block-working instruments, etc.), mechanical rolling stock (carriages and wagons), electrical fitments (OHE, AT, DG, TSS/SP/SSP, locomotives, etc.) and civil structures (bridges, culverts, rail tracks, rail beds) all require regular maintenance. This paper introduces a condition monitoring method enhanced with a wireless sensor network (WSN) for railway infrastructure and clearly identifies the categories of sensors best suited to condition monitoring of the various types of railway assets. The proposed WSN-based technology will improve railway subsystem reliability, effectively reducing failure time and improving the operating ratio.
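The paper's monitoring scheme is described at the system level; as a small conceptual sketch of threshold-based condition monitoring over WSN readings, the following Python example uses invented asset names, sensor types, and thresholds purely for illustration:

from dataclasses import dataclass
from typing import Optional

# Hypothetical sensor readings from a railway WSN node; assets and thresholds are illustrative.
@dataclass
class Reading:
    asset: str
    sensor: str
    value: float

THRESHOLDS = {
    ("point_machine", "motor_current_A"): 6.0,
    ("track_circuit", "voltage_V"): 1.5,     # alert if voltage drops below this
    ("axle_bearing", "temperature_C"): 90.0,
}

def check(reading: Reading) -> Optional[str]:
    limit = THRESHOLDS.get((reading.asset, reading.sensor))
    if limit is None:
        return None
    if reading.sensor == "voltage_V":         # low value is the fault condition here
        return f"ALERT {reading.asset}: {reading.value} V below {limit} V" if reading.value < limit else None
    return f"ALERT {reading.asset}: {reading.sensor}={reading.value} exceeds {limit}" if reading.value > limit else None

readings = [Reading("point_machine", "motor_current_A", 7.2),
            Reading("track_circuit", "voltage_V", 2.1),
            Reading("axle_bearing", "temperature_C", 95.0)]
for r in readings:
    alert = check(r)
    if alert:
        print(alert)

In a deployed WSN the readings would arrive from distributed nodes and the alert logic would typically run at a gateway or maintenance control centre rather than on the sensors themselves.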