IJRE ORG | International Journal of Research and Engineering

Papers by IJRE ORG

Research paper thumbnail of Effect of shape, size and electrical properties on specific absorption rate (SAR)

This paper presents the dependency of the SAR distribution in the human brain and eye on shape, size and electrical properties at different frequencies. Calculations were carried out using the Finite Difference Time Domain (FDTD) method. The results indicate that the SAR distribution depends only weakly on shape and size, and mainly on the electrical properties of the tissues. The SAR magnitude in the brain drops when moving from a cubical model to a spherical model, while in the eye it drops when going from a spherical model to a cubical model. For both the brain and the eye, decreasing the size reduces the volume, and the electromagnetic energy absorption per unit mass therefore goes up. The human eye at 2500 MHz was considered to observe how the electrical properties of the tissues affect electromagnetic energy absorption in an organ. When the relative permittivity is perturbed by small percentages with the conductivity unchanged, the maximum SAR also changes by small amounts. In both cases, however, when the electrical properties are changed, the location of the maximum SAR remains unchanged: it is exactly at the center of the eye.
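The local SAR at a point follows directly from the tissue's electrical properties, which is why conductivity dominates the distribution. A minimal sketch of the point-SAR formula; the brain-like tissue values below are illustrative assumptions, not figures from the paper:

```python
def point_sar(sigma, e_rms, rho):
    """Local SAR (W/kg) from conductivity sigma (S/m),
    RMS electric field e_rms (V/m) and mass density rho (kg/m^3):
    SAR = sigma * |E|^2 / rho."""
    return sigma * e_rms ** 2 / rho

# Illustrative values only (assumed, not from the paper): brain-like tissue
sar = point_sar(sigma=1.8, e_rms=10.0, rho=1050.0)
```

Because SAR scales linearly with conductivity at a fixed field, perturbing the electrical properties shifts the magnitude while leaving the field's spatial peak in place.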

Research paper thumbnail of Dispersion Modelling of Odoriferous Compounds from Landfill

Landfills contain vast quantities of waste, as they are the deposition sites for a city's solid waste. This waste decomposes, producing by-products such as odorous gases and volatile organic compounds (VOCs), both of which can severely damage human physical and mental health. These compounds were analysed using AERMOD, the dispersion model of the United States Environmental Protection Agency (USEPA). AERMOD requires uniform, horizontally consistent surface and upper-air meteorological parameters. Alongside AERMOD, the Landfill Gas Emission Model (LandGEM, Version 3.02) was used to build an emission inventory of odorous gases and VOCs. The dispersion of odorous gases over areas adjacent to the landfill of Pune, India was estimated from the emission inventory and a field monitoring campaign. Four gases, namely total landfill gas, methane, carbon dioxide and non-methane organic compounds (NMOC), were measured at 17 discrete locations and were found to be within permissible limits.
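LandGEM estimates landfill gas generation with a first-order decay model; a simplified single-cohort sketch is below (the full model sums this over annual waste cohorts, and the k, L0 and mass values here are hypothetical, not site data from the study):

```python
import math

def landgem_single_cohort(k, l0, mass_mg, age_years):
    """Simplified single-cohort LandGEM-style estimate: methane generation
    rate (m^3/yr) from decay constant k (1/yr), methane generation
    potential l0 (m^3/Mg), waste mass (Mg) and waste age (years)."""
    return k * l0 * mass_mg * math.exp(-k * age_years)

# Hypothetical inputs: 1000 Mg of 10-year-old waste
q_ch4 = landgem_single_cohort(k=0.05, l0=170.0, mass_mg=1000.0, age_years=10.0)
```

The resulting generation rates feed AERMOD as area-source emission rates, linking the two models as described above.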

Research paper thumbnail of Resistive Random Access Memory (ReRAM)

Resistive Random-Access Memory (ReRAM) technology is viewed as one of the most reliable non-volatile memories emerging in the market. This paper analyzes the evolution of ReRAM and reviews recent progress in its technological development. The performance parameters of these non-volatile memories, such as operating voltage, operation speed, resistance ratio, endurance, retention time, device yield and multilevel storage, are analyzed. The integration and reliability of ReRAM at the practical level are compared with other types of memory. Challenges faced by ReRAM users are addressed, including technological fallbacks. Finally, directions for future research on ReRAM are discussed.

Research paper thumbnail of Research & Analysis on Hybrid Storage: Combining SSDs and HDDs Drive

Traditional Hard Disk Drives (HDDs) and the newer Solid State Drives (SSDs) are both popular for storing large masses of data, and both provide users with a large number of I/Os per second. However, the two have limited capacity and perform differently. SSDs offer exceptional performance, but compared to HDDs they have much less capacity per drive and cost more. Depending on the organizational application workload, capacity and performance requirements lead users to different preferences. In IT systems, the challenge in using SSDs and HDDs is navigating the price, performance and capacity trade-offs between them. Determining the most cost-effective drive type for each workload type is essential and is discussed in this paper. With new technologies, SSDs and HDDs are increasingly combined into a single storage pool. The reduction of the I/O bottleneck through hybrid SSD and HDD storage is also analyzed in terms of the maintenance cost of the cache manager, settling on new budgets for organizational optimization of storage solutions.

Research paper thumbnail of Temporary Climate Variability for the Period 1965-2016 in the Pedro Moncayo Canton, Ecuador

The objective of the study was to determine the climatic variability of the Pedro Moncayo canton in order to establish methodologies for planning and adapting the canton's agricultural production systems. Meteorological data for the period 1965-2016 were analyzed for various aspects of climate variability and the occurrence of extreme events such as the ENSO phenomenon, and for how these have influenced the canton's air pressure, precipitation and temperatures. The Standardized Precipitation Index and the Fournier equation were applied. The results show an average thermal amplitude of 2.46 °C and an average temperature increase of 0.5 °C over the whole time series analyzed. The dry period corresponds to the months of May to September and the rainy season to October to April; the annual average precipitation is 618 mm, with a minimum of 409 mm in 2001 and a maximum of 846 mm in 2008. Characterizing the canton's climate variability will support corrective territorial-planning measures and the formulation of a sustainable territorial management plan.
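The precipitation-aggressiveness equation attributed to Fournier is commonly applied in its modified form, MFI = Σ p_i² / P, with p_i the monthly and P the annual precipitation. A short sketch; the monthly series below is hypothetical, chosen only so its total matches the reported 618 mm annual average:

```python
def modified_fournier_index(monthly_precip_mm):
    """Modified Fournier Index: sum of squared monthly precipitation
    divided by the annual total (both in mm)."""
    annual = sum(monthly_precip_mm)
    return sum(p * p for p in monthly_precip_mm) / annual

# Hypothetical monthly series (mm): wetter October-April, drier May-September
months = [70, 65, 75, 60, 30, 15, 10, 12, 25, 80, 86, 90]
mfi = modified_fournier_index(months)
```

For a perfectly uniform year the MFI equals the monthly value itself; a seasonally concentrated series like the one above scores higher, which is what makes the index a useful aggressiveness measure.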

Research paper thumbnail of Performance Comparison between Two Interpretations of Missing Data using Matrix-Characterized Approximations

Nowadays, veracity issues in data quality, such as incomplete, inconsistent, vague or noisy data, pose a major challenge to data mining and data analysis. Rough set theory offers a dedicated tool for handling incomplete and imprecise data in information systems. In this paper, rough-set-based, matrix-represented approximations are presented to compute lower and upper approximations. The induced approximations serve as inputs to the data analysis method LERS (Learning from Examples based on Rough Sets) used with the LEM2 (Learning from Examples Module, Version 2) rule induction algorithm. Analyses are performed on missing datasets with "do not care" conditions and on missing datasets with lost values. In addition, experiments on datasets with different missing-value percentages and different thresholds are provided. The experimental results show that the system performs better when missing data are characterized as "do not care" conditions than when they are represented as lost values.
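The lower and upper approximations at the core of the method can be sketched set-theoretically; the paper's matrix representation encodes the same membership tests as Boolean matrices. The equivalence classes and target concept below are a made-up toy example:

```python
def approximations(equivalence_classes, target):
    """Rough-set approximations of a target concept.
    Lower: union of classes fully inside the target (certain members).
    Upper: union of classes intersecting the target (possible members)."""
    target = set(target)
    lower, upper = set(), set()
    for cls in equivalence_classes:
        cls = set(cls)
        if cls <= target:          # entirely contained -> certainly in concept
            lower |= cls
        if cls & target:           # any overlap -> possibly in concept
            upper |= cls
    return lower, upper

# Toy example: objects 1..5 partitioned by indiscernibility
lower, upper = approximations([{1, 2}, {3}, {4, 5}], target={1, 2, 3, 4})
```

The region between the two approximations (here, objects 4 and 5) is the boundary on which rule learners such as LEM2 must reason under uncertainty.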

Research paper thumbnail of Action Recognition Framework using Saliency Detection and Random Subspace Ensemble Classifier

Action recognition can be defined as the problem of determining what kind of action is happening in a video: matching an observation against previously labelled samples and assigning a label to that observation. In this paper, a framework for action recognition based on saliency detection and a random subspace ensemble classifier is introduced to increase recognition performance. The proposed framework consists of three main processing phases. The first phase detects salient foreground objects by considering the pattern and color distinctness of sets of pixels in each video frame. The second phase uses changing gradient orientation features as the feature representation. The third phase recognizes actions using a random subspace ensemble classifier with a discriminant learner. Experimental results are evaluated on the UIUC action dataset, where the proposed framework achieves satisfying recognition accuracy.
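A random subspace ensemble trains each base learner on a random subset of the feature dimensions and combines the per-learner labels by voting; a minimal sketch of those two pieces (the feature counts and labels below are illustrative, and the paper's base learner is a discriminant classifier rather than anything shown here):

```python
import random

def random_subspaces(n_features, subspace_size, n_learners, seed=0):
    """Index sets: each base learner sees a random subset of the features."""
    rng = random.Random(seed)
    return [rng.sample(range(n_features), subspace_size)
            for _ in range(n_learners)]

def majority_vote(predictions):
    """Combine the ensemble's per-learner labels by majority vote."""
    return max(set(predictions), key=predictions.count)

subspaces = random_subspaces(n_features=10, subspace_size=4, n_learners=5)
label = majority_vote(["walk", "run", "walk"])
```

Training each learner on a different feature subset decorrelates their errors, which is why the vote tends to outperform any single learner on high-dimensional descriptors.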

Research paper thumbnail of An Analysis on Security Threats of Black-Hole and Jellyfish Attacks in Mobile Ad-Hoc Network using HTTP Traffic

A mobile ad-hoc network (MANET) is an infrastructure-less network formed by a collection of randomly moving mobile nodes. Because a MANET has no centralized management, it can form anywhere with the participation of randomly moving nodes, and this openness exposes it to many security problems. Many MANET security threats have no known solution, and even detecting them is not easy. Some threats are severe enough to destroy the whole network, and researchers are working to find solutions. Among these threats, we have studied two: the Black-hole attack and the Jellyfish attack, which we examined using HTTP traffic. We use OPNET Modeler 14.5 as the simulator with the AODV routing protocol. The aim of this paper is to determine the impact of these security threats on a MANET under HTTP traffic. We assess the impact using the number of events and the average number of events, together with the throughput reported by OPNET Modeler.

Research paper thumbnail of Low Enriched Uranium based Nuclear Rocket Propulsion Technology: Mars Exploration Mission

Many space agencies, such as NASA and SpaceX, have promised to send humans to the red planet in the future. For such Mars colonization projects, nuclear rocket propulsion would be the better option: replacing chemical rockets with nuclear rockets may reduce the mission duration as well as the mass of propellant used. In chemical rockets the propellant releases energy through combustion, whereas in nuclear rockets the propellant, hydrogen, is heated by a controlled fission reaction in a nuclear reactor inside the rocket engine. The specific impulse of a nuclear rocket is greater than that of a chemical rocket, which provides large thrust capability and therefore decreases mission duration. Specific impulse is increased by raising the core exhaust temperature, and the fuel is selected so that this exhaust temperature can be reached. The (U, Zr)C-graphite fuel is selected because it has a high uranium density and a melting point matching the core exhaust temperature, sufficient to sustain the reactivity of the fissile material and thus increase rocket performance. A mathematical analysis shows that the fraction of propellant mass used in a Mars mission will be smaller than for chemical rockets because the specific impulse is higher in nuclear propulsion. The specific impulse obtained from CFD analysis of the rocket nozzle is 979 s, with an exit velocity of 9604 m/s.
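The reported figures are mutually consistent: an exhaust velocity of roughly Isp × g0 = 979 × 9.80665 ≈ 9601 m/s matches the 9604 m/s from the CFD analysis. A sketch of that check plus the Tsiolkovsky propellant-fraction comparison; the 6 km/s delta-v budget and the 450 s chemical Isp are illustrative assumptions, not values from the paper:

```python
import math

G0 = 9.80665  # standard gravity, m/s^2

def exhaust_velocity(isp_s):
    """Effective exhaust velocity (m/s) from specific impulse (s)."""
    return isp_s * G0

def propellant_mass_fraction(delta_v, isp_s):
    """Tsiolkovsky rocket equation: fraction of initial mass that must be
    propellant to achieve delta_v: 1 - exp(-delta_v / v_e)."""
    return 1.0 - math.exp(-delta_v / exhaust_velocity(isp_s))

ve = exhaust_velocity(979.0)                              # ~9601 m/s
frac_nuclear = propellant_mass_fraction(6000.0, 979.0)    # assumed 6 km/s budget
frac_chemical = propellant_mass_fraction(6000.0, 450.0)   # assumed chemical Isp
```

Under these assumptions the nuclear stage needs a markedly smaller propellant fraction for the same delta-v, which is the quantitative content of the abstract's claim.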

Research paper thumbnail of Performance of Combined Vertical and Horizontal Flow Sub-Surface Constructed Wetlands

The present study demonstrates wetland projects, implemented for different hydraulic and organic loadings, that treat industrial wastewater for reuse. The combination of vertical- and horizontal-flow wetland treatment with fill-and-draw controls provides effective contact of wastewater with the root system, achieving higher treatment efficiencies by creating the environments needed for nitrification-denitrification, removal of organic materials and phosphorus adsorption reactions. The systems have been implemented at large scale in automobile, sand reclamation, municipal leachate and other industries for process and domestic wastewater treatment and reuse. The results show marked removal efficiency using Typha species and several other indigenous plants. The percentage reductions in Total Suspended Solids (TSS), Biochemical Oxygen Demand (BOD), Nitrate (N), Phosphate (P) and Fecal Coliforms (FC) are 85%, 90%, 70%, 60% and 95%, respectively. The study further reveals the effect of variable hydraulic loading rates on treatment efficiency. The system has operated successfully for the past 3 years, reducing the electrical and mechanical operation and maintenance requirements of wastewater treatment by almost 70%, benefiting industries greatly and opening opportunities to apply such technologies in other industrial sectors where they were not previously in practice.
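The reported percentage reductions are the standard influent/effluent ratio; a one-line sketch (the BOD concentrations below are hypothetical values chosen only to reproduce the reported 90% figure):

```python
def removal_efficiency(c_in, c_out):
    """Percentage removal of a pollutant across the wetland:
    100 * (C_in - C_out) / C_in."""
    return 100.0 * (c_in - c_out) / c_in

# Hypothetical influent/effluent BOD (mg/L) consistent with the reported 90%
eff_bod = removal_efficiency(c_in=250.0, c_out=25.0)
```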

Research paper thumbnail of New Insights of Background Estimation and Region Localization

Background subtraction in a crowded scene is a crucial and challenging task in surveillance monitoring. Because of the similarity between foreground objects and the background, separating moving foreground objects from the background is difficult. Most previous work in this field cannot distinguish foreground from background under gradual or sudden illumination changes, high-frequency background motion, background geometry changes and noise. After the foreground objects are obtained, segmentation is needed to localize the object regions. Image segmentation is a useful tool in many areas, such as object recognition, image processing, medical image analysis and 3D reconstruction. To provide a reliable foreground image, a carefully estimated background model is needed. To tackle illumination and motion changes, this paper establishes an effective new approach to background subtraction and segmentation that accurately detects and segments foreground people. The scene background is estimated by a new method, namely Mean Subtraction Background Estimation (MS), which identifies and updates the pixels extracted from the difference between the background and the current frame. Unlike other works, the initial background is computed by MS instead of simply taking the first frame. The paper then performs foreground segmentation in noisy scenes by foreground detection, and localizes the detected areas by analyzing various segmentation methods. Experiments on a challenging public crowd-counting dataset achieve better accuracy than state-of-the-art results, indicating the effectiveness of the proposed work.
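A running-mean background model of the general kind MS builds can be sketched per pixel: the model is blended toward each new frame, and pixels far from the model become foreground. The blending factor, threshold and 3-pixel "frame" below are illustrative, not the paper's settings:

```python
def update_background(background, frame, alpha=0.05):
    """Running-mean background update: B <- (1 - alpha) * B + alpha * F."""
    return [(1 - alpha) * b + alpha * f for b, f in zip(background, frame)]

def foreground_mask(background, frame, threshold=25):
    """1 where |F - B| exceeds the threshold (foreground), else 0."""
    return [1 if abs(f - b) > threshold else 0
            for b, f in zip(background, frame)]

bg = [100.0, 100.0, 100.0]          # current background estimate
frame = [102.0, 180.0, 99.0]        # middle pixel: a moving object
mask = foreground_mask(bg, frame)   # detect before updating the model
bg = update_background(bg, frame)
```

The slow update rate lets the model absorb gradual illumination changes while fast-moving people still trip the threshold, which is the trade-off the abstract is addressing.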

Research paper thumbnail of Disparity Map Computation from Stereo Images Using Hill-Climbing Segmentation

Stereo matching has been one of the most active research areas in computer vision for decades. The task of stereo matching is to find the point correspondences between two images of the same scene taken from different viewpoints. This paper presents a segment-based stereo matching algorithm. First, the reference image is segmented using the hill-climbing algorithm, and local stereo matching is performed using Scale Invariant Feature Transform (SIFT) feature points with Sum of Absolute Differences (SAD) block matching. Second, a set of reliable pixels is constructed by comparing matching costs and mutual cross-checking consistency between the left and right initial disparity maps, which leads to the actual disparity planes. Third, the set of all possible disparity planes is extracted, and plane fitting and neighboring-segment merging are performed. Finally, the disparity planes are assigned to each region using graph cuts to obtain the final disparity map. Evaluation on the Middlebury dataset shows that the proposed algorithm is competitive with state-of-the-art stereo matching algorithms.
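The SAD block-matching step can be sketched on a single scanline: for each window in the left image, pick the disparity whose right-image window has the smallest sum of absolute differences. Window size, disparity range and intensities below are illustrative; the paper applies this within segments with SIFT support rather than densely:

```python
def sad(a, b):
    """Sum of absolute differences between two equal-length blocks."""
    return sum(abs(x - y) for x, y in zip(a, b))

def disparity_1d(left, right, window=3, max_disp=4):
    """Winner-take-all SAD matching along one rectified scanline.
    Returns a disparity for each valid window position in the left image."""
    half = window // 2
    disparities = []
    for x in range(half, len(left) - half):
        block = left[x - half:x + half + 1]
        best_d, best_cost = 0, float("inf")
        for d in range(min(max_disp, x - half) + 1):   # stay inside the image
            cand = right[x - d - half:x - d + half + 1]
            cost = sad(block, cand)
            if cost < best_cost:
                best_cost, best_d = cost, d
        disparities.append(best_d)
    return disparities

# Toy scanlines: the left image is the right image shifted by 2 pixels
right_line = [10, 20, 30, 40, 50, 60, 70, 80]
left_line = [0, 0, 10, 20, 30, 40, 50, 60]
disps = disparity_1d(left_line, right_line)
```

Away from the border (where the disparity search is clipped), the recovered disparity is the true shift of 2, illustrating why the paper cross-checks left and right maps to discard unreliable border and occlusion pixels.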

Research paper thumbnail of Big Data and Big Data Analytics: Concepts, Types and Technologies

Nowadays, companies are starting to realize the importance of having data available in large amounts in order to make the right decisions and support their strategies. With the development of new technologies, the Internet and social networks, the production of digital data is constantly growing. The term "Big Data" refers to the heterogeneous mass of digital data produced by companies and individuals whose characteristics (large volume, varied forms, speed of processing) require specific and increasingly sophisticated computer storage and analysis tools. This article defines the concept of Big Data and presents its types, challenges and applications, as well as the importance of Big Data Analytics.

Research paper thumbnail of Failure Analysis in Conceptual Phase toward a Robust Design: Case Study in Monopropellant Propulsion System

As a system becomes more complex, the uncertainty in its operating conditions increases. For such a system, implementing a precise failure analysis in the early design stage is vital. However, there is a lack of applicable methodology showing how to implement failure analysis in the early design phase to achieve a robust design. The main purpose of this paper is to present a framework for designing a complex engineered system resistant to the various factors that may cause failures, at the conceptual phase of the design process when detailed system and component information is unavailable. Within this framework, we generate a population of feasible designs from a seed functional model, and simulate and classify failure scenarios. We also develop a design selection function to compare robustness scores of candidate designs and produce a preference ranking. We apply the proposed method to the design of an aerospace monopropellant propulsion system.

Research paper thumbnail of Modeling of Phosphorous Acid Fuel Cell in PSCAD

Renewable energy sources, such as wind and fuel cells, are gaining attention due to increasing energy demand as well as their environmental friendliness. A dynamic model of a Phosphoric Acid Fuel Cell is built and simulated using PSCAD/EMTDC. The system consists of a fuel cell stack together with a 3-phase Pulse-Width Modulation (PWM) inverter, an LCL filter and a step-up transformer connected to the main grid. A real-reactive power controller is implemented in the 3-phase PWM inverter to control and stabilize the active and reactive power flow into the main grid. An LCL filter is connected on the inverter side to attenuate high-frequency harmonic distortion. The effect of line-to-ground, line-to-line and other faults on the main grid's output voltage is analyzed and studied. The fuel cell is connected to the main grid, and the simulation results include the analysis at different stages of the simulation.
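When sizing an LCL filter of the kind described, the resonant frequency determines which switching harmonics are attenuated; a sketch of the standard formula (the component values below are illustrative assumptions, not the paper's design):

```python
import math

def lcl_resonance_hz(l_inv, l_grid, c_f):
    """Resonant frequency of an LCL filter (Hz):
    f_res = (1 / 2*pi) * sqrt((L1 + L2) / (L1 * L2 * C))."""
    return math.sqrt((l_inv + l_grid) / (l_inv * l_grid * c_f)) / (2 * math.pi)

# Illustrative component values: 2 mH / 1 mH inductors, 10 uF capacitor
f_res = lcl_resonance_hz(l_inv=2e-3, l_grid=1e-3, c_f=10e-6)
```

The usual design rule is to place f_res well above the grid frequency but well below the PWM switching frequency, so the filter passes the fundamental while suppressing switching harmonics.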

Research paper thumbnail of Plant Leaf Disease Detection and Classification using Image Processing

Myanmar is an agricultural country, and crop production is one of its major sources of income; more than half of the population depends on agriculture for its livelihood. Due to factors such as diseases, pest attacks and sudden changes in weather conditions, crop productivity decreases. Automatic detection of plant diseases is essential to catch disease symptoms as early as they appear during the growing stage. This paper proposes a methodology for the analysis and detection of plant leaf diseases using digital image processing techniques. The experimental results demonstrate that the proposed system can successfully detect and classify four major plant leaf diseases: Bacterial Blight, Cercospora Leaf Spot, Powdery Mildew and Rust.
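A crude severity measure of the kind such image-processing pipelines compute is the fraction of leaf pixels falling outside a "healthy green" color range. A toy RGB sketch; the thresholds and pixel values are illustrative only, and a real system would first segment the leaf and use richer texture/color features:

```python
def diseased_fraction(pixels, green_min=100, red_max=120):
    """Toy severity measure: fraction of leaf pixels that are not healthy
    green. Each pixel is an (r, g, b) tuple; thresholds are illustrative."""
    leaf = [p for p in pixels if sum(p) > 0]   # ignore black background
    bad = [p for p in leaf
           if not (p[1] >= green_min and p[0] <= red_max)]
    return len(bad) / len(leaf) if leaf else 0.0

# Three healthy green pixels and one brownish (disease-like) pixel
frac = diseased_fraction([(40, 180, 40)] * 3 + [(160, 120, 60)])
```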

Research paper thumbnail of Optimal Energy Storage Sizing in Photovoltaic and Wind Hybrid Power System Meeting Demand-Side Management Program in Viet Nam

This paper proposes a new method to determine the optimal energy storage sizing in photovoltaic and wind hybrid power generation systems. The generators are arranged in a scheme of three blocks that forecast, measure, and dispatch/control and distribute power flows in the whole system to meet the requirements of the demand-side management program in Viet Nam. Data on electric load power, solar irradiance power, ambient temperature, wind speed and other weather conditions must be forecast with high accuracy. An algorithm to determine the optimal sizing is designed based on the forecast data, the constraints, the relations between quantities in the whole system and the charge/discharge capability of the energy storage. The optimal sizing in this research helps to rearrange load diagrams so that deficient energy is fully compensated during stages with high and medium price levels. It can be applied at each bus to reduce the cost of buying electricity from the electric power system. The new proposal is illustrated by simulation results in a case study carried out in MATLAB 2017a.

Research paper thumbnail of Solving Vehicle Routing Problem using Ant Colony Optimisation (ACO) Algorithm

The engineering field usually requires the best design for optimum performance, so optimization plays an important part in it. The vehicle routing problem (VRP) has been an important problem in the field of distribution and logistics since at least the early 1960s. This study concerns the application of the ant colony optimization (ACO) algorithm to the VRP. First, the study constructed the model of the problem to be solved. It then focused on ACO: the objective function of the algorithm was studied and applied to the VRP, and the effectiveness of the algorithm was increased by minimizing the stopping criteria. The control parameters were studied to find the best value for each, after which the performance of ACO on the VRP was evaluated. The good performance of the algorithm reflects the importance of its parameters: the number of ants (nAnt), alpha (α), beta (β) and rho (ρ). Alpha represents the relative importance of the trail, beta the importance of visibility, and rho the parameter governing pheromone decay. The route results of different iterations were compared and the performance of the algorithm analyzed. The best set of control parameters obtained is 20 ants, α = 1, β = 1 and ρ = 0.05. The average cost and standard deviation over 20 runs with the best set of control parameters were also evaluated, at 1057.839 km and 25.913 km, respectively. Finally, a conclusion summarizes the achievements of the study.
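The roles of α, β and ρ can be sketched directly: α and β weight pheromone and visibility in the transition rule, and ρ controls evaporation. The tau/eta values below are a made-up toy instance; the study's best setting (α = 1, β = 1, ρ = 0.05) is used as the defaults:

```python
def transition_probabilities(tau, eta, alpha=1.0, beta=1.0):
    """ACO choice rule: probability of moving to city j is proportional
    to tau_j^alpha * eta_j^beta (pheromone ^ alpha times visibility ^ beta)."""
    weights = [(t ** alpha) * (e ** beta) for t, e in zip(tau, eta)]
    total = sum(weights)
    return [w / total for w in weights]

def evaporate(tau, rho=0.05):
    """Pheromone decay between iterations: tau <- (1 - rho) * tau."""
    return [(1 - rho) * t for t in tau]

# Toy instance: three candidate cities with pheromone tau and visibility eta
probs = transition_probabilities(tau=[1.0, 1.0, 2.0], eta=[0.5, 1.0, 1.0])
```

A small ρ such as 0.05 makes trails persist across many iterations, so good routes keep attracting ants; larger ρ forgets faster and favors exploration.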

Research paper thumbnail of Time-aware Traffic Shaper using Time-based Packet Scheduling on Intel I210

By 2015, the Institute of Electrical and Electronics Engineers (IEEE) time-sensitive networking (TSN) task group had released several TSN standards. Among them is 802.1Qbv, also known as the time-aware shaper, which aims to provide assurances on latency and delivery variation for applications in a TSN network. While several products and evaluation kits employing 802.1Qbv are on the market, the standard is not yet widely adopted, owing to its limited maturity. Hardware-enabled 802.1Qbv uses hardware queues and timers to achieve accurate packet transmission in switches and bridges. This research investigates the feasibility of using an existing end-station Ethernet controller, the Intel I210, and its launch-time control feature (commonly known as time-based packet scheduling) to shape traffic compatible with 802.1Qbv-enabled network bridges. A software solution is developed by implementing a software-configurable gate-control list and employing open-source Linux RFC patches for per-packet transmit-time specification. By configuring the kernel and mapping kernel-layer traffic classes to the hardware queues, packets can be transmitted at precise times while carrying the 802.1Q VLAN tags that bridges require to identify packets. Analysis shows that this solution requires an additional 30 μs transmit offset to be used effectively. That is, 55% more time is needed to transmit a packet in a back-to-back connection, and 17.6% more on a 3-switch network, to improve peak period jitter to just 8.9 μs, compared with 1 ms for solutions that send packets out periodically using software sleep functions.
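With launch-time control, each packet is stamped with an absolute transmit time, and the 30 μs offset found in the analysis simply shifts every stamp forward relative to its gate slot. A toy sketch of generating per-cycle launch times (the 1 ms cycle and 30 μs offset come from the text above; the base time and function shape are illustrative, not the actual patch API):

```python
CYCLE_NS = 1_000_000      # 1 ms traffic cycle, as in the periodic comparison
TX_OFFSET_NS = 30_000     # additional transmit offset found necessary (~30 us)

def launch_times(base_ns, n_packets, cycle_ns=CYCLE_NS):
    """Absolute per-packet transmit times: one packet per cycle, each
    shifted forward by the required transmit offset."""
    return [base_ns + i * cycle_ns + TX_OFFSET_NS for i in range(n_packets)]

times = launch_times(base_ns=0, n_packets=3)
```

Stamping absolute times lets the NIC hardware release packets at the gate boundary instead of relying on software sleeps, which is where the jitter improvement from 1 ms down to 8.9 μs comes from.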

Research paper thumbnail of A Method to Harness Maximum Power from Photovoltaic Power Generation Basing on Completely Mathematical Model

This paper introduces a new method, not addressed in previous studies of photovoltaic power generation, to harness the maximum potential power from a photovoltaic generator (PVG). A complete mathematical model, extended with the relation between the generator's diode factor and the p-n junction temperature, is used in this method. The maximum power point tracker combines an iterative and bisectional technique, the complete mathematical model of the PVG, a system of equations that converts parameter values from standard test conditions to any working condition, and sensors measuring solar irradiance power and p-n junction temperature, to determine the parameters at the maximum power point under any working condition. A voltage controller is designed to drive the generator to the expected working state to harvest maximum power. An experimental setup corresponding to this method was designed and operated in real conditions in Viet Nam. The experimental results show the high accuracy of the theoretical analysis and the strong potential to bring this method into real applications that harvest all the available energy of this generator.
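The bisectional part of such a tracker can be sketched with an ideal single-diode model: bisect on the sign of dP/dV between short circuit and open circuit to locate the maximum power point. All cell parameters below are illustrative defaults, not the paper's fitted model, which additionally includes the diode-factor/temperature relation:

```python
import math

def pv_current(v, iph=8.0, i0=1e-9, n=1.3, vt=0.0257, ns=60):
    """Ideal single-diode module model (series/shunt resistance neglected):
    I = Iph - I0 * (exp(V / (Ns * n * Vt)) - 1)."""
    return iph - i0 * (math.exp(v / (ns * n * vt)) - 1.0)

def mpp_voltage(v_lo=0.0, v_hi=45.0, tol=1e-6):
    """Bisection on the sign of dP/dV to locate the maximum power point:
    dP/dV > 0 left of the peak, < 0 right of it."""
    def dpdv(v, h=1e-5):
        return ((v + h) * pv_current(v + h)
                - (v - h) * pv_current(v - h)) / (2 * h)
    lo, hi = v_lo, v_hi
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if dpdv(mid) > 0:
            lo = mid          # peak lies to the right
        else:
            hi = mid          # peak lies to the left
    return 0.5 * (lo + hi)

vmp = mpp_voltage()
pmp = vmp * pv_current(vmp)
```

Because P(V) is unimodal for this model, the sign of dP/dV brackets the peak and bisection converges reliably, which is the appeal of combining it with an iterative parameter update as the working conditions change.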

Research paper thumbnail of Effect of shape, size and electrical properties on specific absorption rate (SAR

This paper presents the dependency of SAR distribution in human brain and eye on shape, size and ... more This paper presents the dependency of SAR distribution in human brain and eye on shape, size and electrical properties for different frequencies. Calculations were carried out using the Finite Difference Time Domain (FDTD) Method. The results indicate that the SAR distribution does not much depend on the shape and size but it depends mainly on the electrical properties of the tissues. There is a drop in magnitude of the SAR in the brain, when moving from a cubical model to the spherical model. There is a magnitude drop in the eye, when going from a spherical model to the cubical model. For both brain and eye, when the size is decreased, the volume is reduced and therefore the electromagnetic energy absorption goes up. The human eye at 2500 MHz was considered to observe how the electrical properties of the tissues affect the electromagnetic energy absorption in an organ. When relative permittivity is perturbed in small percentages with conductivity remaining unchanged; the value of the maximum SAR also changes by small values. However both these cases, when the electrical properties are changed, the location of maximum SAR remains unchanged. It is exactly at the center of the eye.

Research paper thumbnail of Dispersion Modelling of Odoriferous Compounds from Landfill

Landfills contain vast quantities of waste as they are sites for deposition of a city's solid was... more Landfills contain vast quantities of waste as they are sites for deposition of a city's solid waste. This solid waste further undergoes the process of decomposition producing several by-products such as odorous gases and volatile organic compounds. Both of these compounds have potential to severely damage physical and mental health of human. These components were analysed using AERMOD by United States Environmental Protection Agency (USEPA). AERMOD needs uniform and horizontally consistent surface and upper air parameters of meteorology. Along with AERMOD, Landfill Gas Emission Model (LandGEM Version 3.02) was used to form an emission inventory. LandGEM analyses odorous gases and VOCs. The dispersion of odorous gases over areas adjacent to landfill of Pune, India has been estimated from information on advances in emission inventory and field monitoring operation. Four odorous gases namely Total Landfill gas, Methane, Carbon Dioxide and Non Methane Organic Compounds (NMOC) were detected at 17 discrete places and were found within permissible limits.

Research paper thumbnail of Resistive Random Access Memory (ReRAM)

Resistive Random-Access Memory (ReRAM) technology has been viewed as one of the most reliable non... more Resistive Random-Access Memory (ReRAM) technology has been viewed as one of the most reliable non-volatile memories that have are emerging in markets. In this research paper, the revolution of ReRAM will be analyzed. Also, the paper will also review the recent progress in the technological development of ReRAM. The performance parameters of these non-volatile memories such as their operating voltage, operation speed, resistance ratio, endurance, retention time, device yield, and multilevel storage will be analyzed. Integration and reliability of Re-RAM in the practical level is compared with other types of memories. Challenges faced by users of ReRAM are addressed in regards to technological fallbacks among other challenges. Finally, the future research on the ReRAM will be analyzed.

Research paper thumbnail of Research & Analysis on Hybrid Storage: Combining SSDs and HDDs Drive

The traditional Hard Disk Drives and the upgrading Solid State Drives (SSD) are popular for their... more The traditional Hard Disk Drives and the upgrading Solid State Drives (SSD) are popular for their storage traits that keep large mass of data. The two technologies provide users with a large number of I/O per second. However, the two have a limited space capacity and perform differently. The SSDs offer exceptional performance. However, as compared to HDDs, they have much less capacity per drive and are costlier. Depending on the organizational application workload, the needs of capacities and performance requirements make users to have different preferences. In the IT system, the challenges on the usage of SSDs and HHDs are navigation of price, performance, and capacity trade-offs between SSDs and HDDs. Determining the most cost-effective drive type for each workload type is essential and will be discussed in this research paper. With the new technologies, there has been a series of combination of SSDs and HDDs into a single storage pool. The reduction of the I/O bottleneck through the hybrid SSD and HDD storage is also analyzed based on the terms of maintaining cost of the cache manager through settling on newer budgets for organizational optimization on storage solutions.

Research paper thumbnail of Temporary Climate Variability for the Period 1965-2016 in the Pedro Moncayo Canton, Ecuador

The objective of the study was to determine the climatic variability of Pedro Moncayo canton in order to establish methodologies for planning and adaptation of the canton's agricultural production systems. Meteorological data for the period 1965-2016 were analyzed, covering various aspects of climate variability and the occurrence of extreme events such as the ENSO phenomenon, and how they have influenced the air pressure, precipitation and temperatures of the canton. The Standardized Precipitation Index and Fournier's equation were applied. The results show that the average thermal amplitude was 2.46 °C, with an average temperature increase of 0.5 °C over the whole time series analyzed. The dry period corresponds to the months of May to September and the rainy season to October to April; the annual average precipitation is 618 mm per year, with a minimum of 409 mm in 2001 and a maximum of 846 mm in 2008. The characterization of the canton's climate variability will allow corrective territorial-planning measures and the formulation of a sustainable management plan for the territory.

Research paper thumbnail of Performance Comparison between Two Interpretations of Missing Data using Matrix-Characterized Approximations

Nowadays, veracity issues related to data quality, such as incomplete, inconsistent, vague or noisy data, create a major challenge for data mining and data analysis. Rough set theory presents a special tool for handling incomplete and imprecise data in information systems. In this paper, rough-set-based matrix-represented approximations are presented to compute lower and upper approximations. The induced approximations are used as inputs for the data analysis method LERS (Learning from Examples based on Rough Sets) with the LEM2 (Learning from Examples Module, Version 2) rule induction algorithm. Analyses are performed on missing datasets with "do not care" conditions and missing datasets with lost values. In addition, experiments on datasets with different percentages of missing data, using different thresholds, are also provided. The experimental results show that the system performs better when missing data are characterized as "do not care" conditions than when they are represented as lost values.
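The lower and upper approximations that the matrix representation computes can be illustrated set-theoretically. This is a toy partition-based sketch of the underlying rough-set definitions, not the paper's matrix method:

```python
def approximations(equivalence_classes, target):
    """Rough-set approximations of a target set X over a partition:
    lower = union of classes entirely inside X (certainly in X),
    upper = union of classes that intersect X (possibly in X)."""
    target = set(target)
    lower, upper = set(), set()
    for block in equivalence_classes:
        block = set(block)
        if block <= target:       # block is fully contained in X
            lower |= block
        if block & target:        # block overlaps X
            upper |= block
    return lower, upper

# Toy universe {1..6} partitioned by an indiscernibility relation.
classes = [{1, 2}, {3}, {4, 5}, {6}]
lower, upper = approximations(classes, {2, 3, 4})
```

Here only the class {3} lies entirely inside the target set, while {1, 2} and {4, 5} merely intersect it, so the boundary region (upper minus lower) captures the uncertainty that "do not care" and lost-value interpretations handle differently.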

Research paper thumbnail of Action Recognition Framework using Saliency Detection and Random Subspace Ensemble Classifier

Action recognition can be defined as the problem of determining what kind of action is happening in a video. It is a process of matching an observation with previously labelled samples and assigning a label to that observation. In this paper, a framework for an action recognition system based on saliency detection and a random subspace ensemble classifier is introduced in order to increase recognition performance. The proposed action recognition framework can be partitioned into three main processing phases. The first phase detects salient foreground objects by considering the pattern and color distinctness of sets of pixels in each video frame. In the second phase, changing gradient orientation features are used as the feature representation. The third phase recognizes actions using a random subspace ensemble classifier with a discriminant learner. Experimental results are evaluated on the UIUC action dataset. The proposed action recognition framework achieves satisfactory recognition accuracy.

Research paper thumbnail of An Analysis on Security Threats of Black-Hole and Jellyfish Attacks in Mobile Ad-Hoc Network using HTTP Traffic

Mobile ad-hoc network (MANET) is an infrastructure-less network: a collection of randomly moving mobile nodes. As MANET does not have any centralized management, the network can form anywhere with the participation of randomly moving nodes. Because of this vulnerable behaviour, MANET faces many security problems. Many MANET security threats still have no solution, and even detecting them is not easy. Some of the threats are very severe and can destroy the whole network. Researchers are working to find solutions to these threats. Among them, we have worked with two: the Black-hole attack and the Jellyfish attack. Here, we identify the threats using HTTP traffic. We use OPNET Modeler 14.5 as the simulator with the AODV routing protocol. The aim of this paper is to find out the impact of security threats on MANET using HTTP traffic. We assess the impact using the number of events and the average number of events, utilizing the throughput statistics of OPNET Modeler.

Research paper thumbnail of Low Enriched Uranium based Nuclear Rocket Propulsion Technology: Mars Exploration Mission

Many space agencies, like NASA and SpaceX, have promised to send humans to the red planet in the future. Considering their Mars colonization projects, nuclear rocket propulsion would be the better option. Replacing chemical rockets with nuclear rockets may reduce the mission duration and can also reduce the mass of propellant used. In chemical rockets, the propellant releases energy through combustion, but in nuclear rockets the propellant, i.e. hydrogen, is heated by a controlled fission reaction in a nuclear reactor inside the rocket engine. The specific impulse of a nuclear rocket is greater than that of a chemical rocket. This provides enormous thrust, and as a result the mission duration is decreased. Specific impulse is maximized by increasing the exhaust core temperature, and the fuel is selected so that this exhaust temperature can be reached. The (U, Zr)C-graphite fuel is selected because it has a high uranium density and a melting point equivalent to the exhaust core temperature, which is sufficient to enhance the reactivity of the fissile material and thus increase rocket performance. A mathematical analysis shows that the percentage of propellant mass used in a Mars mission will be lower than for chemical rockets because the specific impulse is expected to be higher in nuclear propulsion. The specific impulse obtained from the CFD analysis of the rocket nozzle is 979 s, with an exit velocity of 9604 m/s.
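The reported figures are mutually consistent under the standard relation between effective exhaust velocity and specific impulse, Isp = v_e / g0; a quick check:

```python
G0 = 9.80665  # standard gravity [m/s^2]

def specific_impulse(exit_velocity_m_s):
    """Specific impulse in seconds from effective exhaust velocity: Isp = v_e / g0."""
    return exit_velocity_m_s / G0

# Exit velocity reported in the abstract
isp = specific_impulse(9604.0)  # ~979 s, matching the reported value
```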

Research paper thumbnail of Performance of Combined Vertical and Horizontal Flow Sub-Surface Constructed Wetlands

The present study demonstrates wetland projects, implemented for different hydraulic & organic loadings, to treat industrial wastewater for reuse. The combination of a vertical and horizontal flow wetland treatment system with fill-and-draw controls provides a design for effective contact of wastewater with the root system, achieving higher treatment efficiencies by creating the environments necessary for nitrification-denitrification, removal of organic materials, and phosphorus adsorption reactions. Systems have been implemented in large-scale applications in automobile, sand reclamation, municipal leachate and other industries for process and domestic wastewater treatment & reuse. The results show a marked removal efficiency using Typha species & several other indigenous plants. The percentage reductions in various physicochemical parameters, Total Suspended Solids (TSS), Biochemical Oxygen Demand (BOD), Nitrate (N), Phosphate (P), and Fecal Coliforms (FC), are 85%, 90%, 70%, 60% and 95% respectively. The study further reveals the effect of variable hydraulic loading rates on treatment efficiencies. The system has been successfully adopted for the past 3 years, reducing electrical and mechanical operation and maintenance requirements for wastewater treatment by almost 70%, benefiting industries greatly and opening opportunities for implementing such technologies in other industrial sectors where they were not previously in practice.

Research paper thumbnail of New Insights of Background Estimation and Region Localization

Background subtraction in a crowded scene is a crucial and challenging task in monitoring surveillance systems. Because of the similarity between foreground objects and the background, detecting the background and moving foreground objects is known to be difficult. Most previous works in this field cannot distinguish the foreground from the background due to gradual or sudden illumination changes, high-frequency background objects with motion changes, background geometry changes and noise. After obtaining the foreground objects, segmentation is needed to localize the object regions. Image segmentation is a useful tool in many areas, such as object recognition, image processing, medical image analysis, 3D reconstruction, etc. In order to provide a reliable foreground image, a carefully estimated background model is needed. To tackle the issues of illumination changes and motion changes, this paper establishes an effective new approach to background subtraction and segmentation that accurately detects and segments foreground people. The scene background is investigated using a new method, Mean Subtraction Background Estimation (MS), which identifies and modifies the pixels extracted from the difference between the background and the current frame. Unlike other works, the initial background is calculated by MS instead of taking the first frame as the initial background. The paper then performs foreground segmentation in the noisy scene by foreground detection, and localizes the detected areas by analyzing various segmentation methods. Experiments on a challenging public crowd-counting dataset achieve better accuracy than state-of-the-art results, indicating the effectiveness of the proposed work.
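The abstract does not give the MS estimator's equations; as a hedged sketch of the general idea it builds on (a maintained background model, with foreground pixels taken from the thresholded difference against the current frame), with hypothetical learning-rate and threshold values:

```python
import numpy as np

def update_background(background, frame, alpha=0.05):
    """Running-mean background update; alpha is a hypothetical learning rate."""
    return (1.0 - alpha) * background + alpha * frame

def foreground_mask(background, frame, threshold=25.0):
    """Pixels whose absolute difference from the background exceeds the threshold."""
    return np.abs(frame.astype(float) - background) > threshold

# Tiny synthetic example: a flat background with a bright "object" in one frame.
background = np.full((4, 4), 10.0)
frame = background.copy()
frame[1:3, 1:3] = 200.0          # 2x2 foreground object
mask = foreground_mask(background, frame)
background = update_background(background, frame)
```

Background pixels barely change after the update, while pixels covered by the object are pulled only slowly toward the new intensity, which is what keeps a transient object out of the background model.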

Research paper thumbnail of Disparity Map Computation from Stereo Images Using Hill-Climbing Segmentation

Stereo matching has been one of the most active research areas in computer vision for decades. The task of stereo matching is to find the point correspondences between two images of the same scene taken from different viewpoints. This paper presents a segment-based stereo matching algorithm. Firstly, the reference image is segmented using the hill-climbing algorithm, and local stereo matching is performed using Scale Invariant Feature Transform (SIFT) feature points with Sum of Absolute Differences (SAD) block matching. Secondly, a set of reliable pixels is constructed by comparing the matching costs and the mutual cross-checking consistency between the left and right initial disparity maps, which can lead to an actual disparity plane. Thirdly, the set of all possible disparity planes is extracted, and then plane fitting and neighboring-segment merging are performed. Finally, the disparity planes are assigned to each region using graph cuts to obtain the final disparity map. Evaluation of the proposed algorithm on the Middlebury dataset shows that it is competitive with state-of-the-art stereo matching algorithms.
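The SAD block-matching step can be illustrated with a minimal brute-force sketch; the window size and disparity range below are illustrative, not the paper's settings:

```python
import numpy as np

def sad_disparity(left, right, block=3, max_disp=4):
    """For each left-image pixel, find the horizontal shift d minimizing the
    Sum of Absolute Differences between a block around the pixel and the
    same block shifted by d in the right image."""
    h, w = left.shape
    half = block // 2
    disp = np.zeros((h, w), dtype=int)
    for y in range(half, h - half):
        for x in range(half, w - half):
            patch = left[y - half:y + half + 1, x - half:x + half + 1].astype(float)
            best_cost, best_d = np.inf, 0
            for d in range(min(max_disp, x - half) + 1):
                cand = right[y - half:y + half + 1,
                             x - d - half:x - d + half + 1].astype(float)
                cost = np.abs(patch - cand).sum()
                if cost < best_cost:
                    best_cost, best_d = cost, d
            disp[y, x] = best_d
    return disp

# Synthetic pair: the left image is the right image shifted 2 px to the right,
# so interior pixels should receive disparity 2.
rng = np.random.default_rng(0)
right = rng.integers(0, 255, size=(8, 12))
left = np.roll(right, 2, axis=1)
disp = sad_disparity(left, right)
```

In the full algorithm this raw per-pixel winner is only an initial estimate; cross-checking against the right-to-left map then filters out unreliable pixels before plane fitting.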

Research paper thumbnail of Big Data and Big Data Analytics: Concepts, Types and Technologies

Nowadays, companies are starting to realize the importance of having data available in large amounts in order to make the right decisions and support their strategies. With the development of new technologies, the Internet and social networks, the production of digital data is constantly growing. The term "Big Data" refers to the heterogeneous mass of digital data produced by companies and individuals whose characteristics (large volume, varied forms, speed of processing) require specific and increasingly sophisticated computer storage and analysis tools. This article intends to define Big Data, its concepts, types, challenges and applications, as well as the importance of Big Data Analytics.

Research paper thumbnail of Failure Analysis in Conceptual Phase toward a Robust Design: Case Study in Monopropellant Propulsion System

As a system becomes more complex, the uncertainty in its operating conditions increases. In such a system, implementing a precise failure analysis in the early design stage is vital. However, there is a lack of applicable methodology showing how to implement failure analysis in the early design phase to achieve a robust design. The main purpose of this paper is to present a framework for designing a complex engineered system resistant to the various factors that may cause failures, while the design process is in the conceptual phase and detailed system and component information is unavailable. Within this framework, we generate a population of feasible designs from a seed functional model, and simulate and classify failure scenarios. We also develop a design selection function to compare robustness scores for candidate designs and produce a preference ranking. We implement the proposed method on the design of an aerospace monopropellant propulsion system.

Research paper thumbnail of Modeling of Phosphorous Acid Fuel Cell in PSCAD

Renewable energy sources, such as wind, fuel cells, etc., are gaining more attention due to the increase in energy demand as well as being environmentally friendly. A dynamic model of a Phosphorous Acid Fuel Cell is developed and simulated using PSCAD/EMTDC. The system consists of a fuel cell stack along with a 3-phase Pulse-Width Modulation (PWM) inverter, an LCL filter and a step-up transformer connected to the main grid. A real-reactive power controller is implemented in the 3-phase PWM inverter to control and stabilize the active and reactive power flow onto the main grid. An LCL filter is connected on the inverter side, which eliminates ultra-harmonic frequency distortions. The effects of line-to-ground, line-to-line and other faults on the main grid's output voltage are analyzed and studied. The fuel cell is connected to the main grid, and the simulation results contain the analysis at different stages of the simulation.

Research paper thumbnail of Plant Leaf Disease Detection and Classification using Image Processing

Myanmar is an agricultural country, and crop production is one of its major sources of earnings, so more than half of the population depends on agriculture for its livelihood. Due to factors like diseases, pest attacks and sudden changes in weather conditions, crop productivity decreases. Automatic detection of plant diseases is essential to detect the symptoms of diseases as early as they appear during the growing stage. This paper proposes a methodology for the analysis and detection of plant leaf diseases using digital image processing techniques. The experimental results demonstrate that the proposed system can successfully detect and classify four major plant leaf diseases: Bacterial Blight, Cercospora Leaf Spot, Powdery Mildew and Rust.

Research paper thumbnail of Optimal Energy Storage Sizing in Photovoltaic and Wind Hybrid Power System Meeting Demand-Side Management Program in Viet Nam

This paper proposes a new method to determine the optimal energy storage sizing in photovoltaic and wind hybrid power generation systems. These generators are arranged in a scheme of three blocks to forecast, measure, and dispatch/control and distribute power flows in the whole system to meet the requirements of the demand-side management program in Viet Nam. Data on electric load power, solar irradiance power, ambient temperature, wind speed and other weather conditions must be forecast with high accuracy. An algorithm to determine the optimal sizing is designed based on the forecast data, the constraints, the relations among quantities in the whole system, and the charge/discharge capability of the energy storage. The optimal sizing in this research helps rearrange load diagrams so that deficient energy is completely compensated during stages with high and medium price levels. It can be applied at each bus to reduce the cost of buying electricity from the electric power system. The new proposal is illustrated by simulation results in a case study carried out in MATLAB 2017a.

Research paper thumbnail of Solving Vehicle Routing Problem using Ant Colony Optimisation (ACO) Algorithm

Engineering fields usually require the best design for optimum performance, so optimization plays an important part in this field. The vehicle routing problem (VRP) has been an important problem in the field of distribution and logistics since at least the early 1960s. Hence, this study was about the application of the ant colony optimization (ACO) algorithm to solve the vehicle routing problem (VRP). Firstly, the study constructed the model of the problem to be solved through this research. The study then focused on Ant Colony Optimization (ACO): the objective function of the algorithm was studied and applied to the VRP, and the effectiveness of the algorithm was increased by minimizing the stopping criteria. The control parameters were studied to find the best value for each. After the control parameters were identified, the performance of ACO on the VRP was evaluated. The good performance of the algorithm reflects the importance of its parameters: the number of ants (nAnt), alpha (α), beta (β) and rho (ρ). Alpha represents the relative importance of the trail, beta represents the importance of visibility, and rho is the parameter governing pheromone decay. The route results of different iterations were compared to analyze the performance of the algorithm. The best set of control parameters obtained is 20 ants, α = 1, β = 1 and ρ = 0.05. The average cost and standard deviation over 20 runs with the best set of control parameters were also evaluated: 1057.839 km and 25.913 km respectively. Finally, a conclusion summarizes the achievements of the study.
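The roles of α, β and ρ can be sketched with the classic ACO transition-probability and evaporation rules; the distances below are made up for illustration and are not the study's VRP instance:

```python
import numpy as np

def transition_probs(pheromone, distance, visited, alpha=1.0, beta=1.0):
    """Classic ACO rule: p_j is proportional to tau_j**alpha * (1/d_j)**beta
    for unvisited cities j; already-visited cities get probability 0."""
    desirability = pheromone**alpha * (1.0 / distance)**beta
    desirability[list(visited)] = 0.0
    return desirability / desirability.sum()

def evaporate(pheromone, rho=0.05):
    """Pheromone decay between iterations: tau <- (1 - rho) * tau."""
    return (1.0 - rho) * pheromone

# Hypothetical 4-city example seen from the current city (index 0, already
# visited and therefore masked out).
pheromone = np.ones(4)                       # uniform initial trail
distance = np.array([1.0, 2.0, 4.0, 4.0])    # distances to each candidate city
p = transition_probs(pheromone, distance, visited={0})
```

With α = β = 1 and a uniform trail, the probabilities are driven purely by visibility (inverse distance); as pheromone accumulates on good routes over iterations, the trail term takes over, and ρ = 0.05 keeps old trails from dominating forever.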

Research paper thumbnail of Time-aware Traffic Shaper using Time-based Packet Scheduling on Intel I210

By 2015, the Institute of Electrical and Electronics Engineers (IEEE) time-sensitive networking (TSN) task group had released several TSN standards. Amongst them is 802.1Qbv, also known as the time-aware shaper, which aims to provide performance assurances on latency and delivery variation for applications in a TSN network. While there are several products and evaluation kits employing 802.1Qbv on the market now, it is still not widely adopted, given the maturity of the standard. Hardware-enabled 802.1Qbv uses hardware queues and timers to achieve accurate transmission of packets in switches and bridges. This research investigates the feasibility of using an existing end-station Ethernet controller, the Intel I210, and its launch-time control feature (commonly known as time-based packet scheduling) to shape traffic compatibly with 802.1Qbv-enabled network bridges. A software solution is developed by implementing a software-configurable gate-control list and employing open-source Linux RFC patches for per-packet transmit-time specification. By configuring the kernel and mapping kernel-layer traffic classes to the hardware queues, packets can be transmitted at precise times while attaching the 802.1Q VLAN tags required by bridges to identify packets. Analysis shows that this solution requires an additional 30 μs transmit offset to be used effectively. That is, 55% more time is needed to transmit a packet in a back-to-back connection and 17.6% more on a 3-switch network, to improve period peak jitter to just 8.9 μs, compared with 1 ms for solutions that send packets out periodically using software sleep functions.

Research paper thumbnail of A Method to Harness Maximum Power from Photovoltaic Power Generation Basing on Completely Mathematical Model

This paper introduces a new method, unlike any previous study of photovoltaic power generation, to harness the maximum potential power from a photovoltaic generator (PVG). A complete mathematical model, extended with the relation between the generator's diode factor and the p-n junction temperature, is proposed for use in this method. The maximum power point tracker combines an iterative bisection technique, the complete mathematical model of the PVG, the system of equations that converts parameter values from standard test conditions to any working condition, and measuring sensors for solar irradiance power and p-n junction temperature, in order to determine the parameters at the maximum power point under any working condition. A voltage controller is designed to drive the generator to the expected working state to harness maximum power. An experimental model corresponding to this method was designed and operated in real conditions in Viet Nam. Experimental results show the high accuracy of the theoretical analysis and the strong potential to bring this method into real applications that harness all the available energy of the generator.
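The iterative/bisection idea can be sketched on a simplified single-diode PV model; all parameter values below are hypothetical, not the paper's. At the maximum power point dP/dV = 0, and since P(V) is concave for this model, bisection on the sign of dP/dV converges to the MPP voltage:

```python
import math

# Hypothetical single-diode module parameters (illustrative only).
I_PH = 5.0                    # photo-generated current [A]
I_0 = 1e-9                    # diode saturation current [A]
N_VT = 1.3 * 0.0257 * 36      # diode factor * thermal voltage * cells in series

def power(v):
    """P(V) = V * I(V) for the simplified single-diode model."""
    i = I_PH - I_0 * (math.exp(v / N_VT) - 1.0)
    return v * i

def mpp_voltage(v_lo=0.0, v_hi=30.0, tol=1e-6):
    """Bisection on the sign of dP/dV (numerical derivative); the sign
    change brackets the maximum power point."""
    def dpdv(v, h=1e-6):
        return (power(v + h) - power(v - h)) / (2.0 * h)
    while v_hi - v_lo > tol:
        v_mid = 0.5 * (v_lo + v_hi)
        if dpdv(v_mid) > 0.0:
            v_lo = v_mid       # still on the rising side of P(V)
        else:
            v_hi = v_mid       # past the peak
    return 0.5 * (v_lo + v_hi)
```

In the paper's setting, the model parameters themselves are re-derived from irradiance and junction-temperature measurements before each tracking step, so the bisection always runs on the model for the current working condition.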