MSE Research Papers - Academia.edu

In recent years, image processing methods have been widely adopted in the medical field to improve the early detection of abnormalities such as breast cancer, lung cancer, and brain cancer. This paper concentrates on the segmentation of lung cancer tumors from X-ray, Computed Tomography (CT), and MRI images. Image processing methods are adopted for segmenting the images: in the pre-processing stage, mean and median filters are used, and in the segmentation stage, Otsu's thresholding and k-means clustering are used to segment the lung images and locate the tumors. To evaluate the two segmentation methods, performance parameters such as Signal-to-Noise Ratio (SNR), Mean Square Error (MSE), and Peak Signal-to-Noise Ratio (PSNR) are computed on the segmented images. Better results are obtained with k-means segmentation irrespective of the image type.
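As a rough sketch of how the evaluation parameters above are computed, the following pure-Python snippet (assuming 8-bit grayscale images flattened to lists, a simplification of the paper's setup) implements MSE and PSNR:

```python
import math

def mse(original, segmented):
    # mean square error between two equal-sized 8-bit images (flat pixel lists)
    return sum((a - b) ** 2 for a, b in zip(original, segmented)) / len(original)

def psnr(original, segmented, max_val=255):
    # peak signal-to-noise ratio in dB; identical images give infinite PSNR
    err = mse(original, segmented)
    return float("inf") if err == 0 else 10 * math.log10(max_val ** 2 / err)
```

Higher PSNR (equivalently, lower MSE) indicates the segmented image is closer to the reference.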

This paper presents a comparison between unoptimized and optimized video steganography. In today’s world of internet communication, video is considered an effective and important tool for communication. Video steganography is a technique for hiding secret information in the video frames or the audio beats of a given cover video so that the presence of the secret information is concealed. The unoptimized base technique used in this paper is a 3-3-2 LSB technique. The unoptimized video frames were then optimized using a Modified Genetic Algorithm, which produced optimum imperceptibility of the hidden data. Peak Signal-to-Noise Ratio (PSNR), Mean Square Error (MSE), and Image Fidelity (IF) are important mathematical measures for analyzing any steganographic technique. In this paper, we compare all three parameters for both unoptimized and optimized video steganography. Experimental results show a considerable improvement in these parameters.
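The 3-3-2 LSB idea can be illustrated as follows: one secret byte is spread over a single RGB pixel, with 3 bits in the red channel, 3 in the green, and 2 in the blue, where the eye is least sensitive. This is a minimal sketch of the base technique, not the authors' exact implementation:

```python
def embed_332(pixel, secret_byte):
    # hide one byte in an (R, G, B) pixel: 3 bits in R, 3 in G, 2 in B
    r, g, b = pixel
    return (
        (r & ~0b111) | (secret_byte >> 5),            # top 3 bits of the byte
        (g & ~0b111) | ((secret_byte >> 2) & 0b111),  # middle 3 bits
        (b & ~0b11) | (secret_byte & 0b11),           # bottom 2 bits
    )

def extract_332(pixel):
    # recover the hidden byte from a stego pixel
    r, g, b = pixel
    return ((r & 0b111) << 5) | ((g & 0b111) << 2) | (b & 0b11)
```

Each channel changes by at most 7 (or 3 for blue) out of 255, which is what keeps the embedding imperceptible before any optimization is applied.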

Steganography is a branch of information security that deals with the transmission of a message without its being detected. The message to be sent is embedded in a cover file. Different types of digital media can be used as the cover object; we used (.WAV) audio as our cover file in this research work. The objective of steganography is to shield the fact that the message exists in the transmission medium. The many algorithms derived so far for this purpose can be categorized in terms of their embedding technique and their time and space complexity. LSB (Least Significant Bit) is one of the algorithms considered the easiest way of hiding information in a digital medium, and it also has good efficiency. It performs its task by embedding the secret message in the least significant bits of each data sample of the audio file. The ease of cracking this algorithm makes it prone to visual and statistical attacks. Keeping this in mind, a few improvements have been made to the LSB algorithm that reduce the e...
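The plain LSB embedding described above can be sketched in a few lines; here `samples` stands for a list of integer PCM samples read from the .WAV file and `bits` is the secret message as 0/1 values (a simplified illustration of the base algorithm, not the improved variant):

```python
def embed_bits(samples, bits):
    # write one message bit into the least significant bit of each sample
    stego = list(samples)
    for i, bit in enumerate(bits):
        stego[i] = (stego[i] & ~1) | bit
    return stego

def extract_bits(samples, n_bits):
    # read the message back from the first n_bits samples
    return [s & 1 for s in samples[:n_bits]]
```

Changing only the lowest bit alters each sample by at most 1, which is inaudible, but the statistical regularity of the change is exactly what steganalysis attacks exploit.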

There are situations in survey sampling where the study characters are sensitive. Due to this sensitivity, practitioners do not get the actual response. Randomized response technique (RRT) models were developed to reduce the bias raised by evasive responses on the sensitive variable. Measurement error (ME) is almost always present in surveys, so RRT models need to be studied together with ME. We propose an estimator of the population mean of a sensitive variable in the presence of ME. The properties of the proposed estimator are studied and comparisons are made with existing estimators. Finally, a simulation study is executed to illustrate the results numerically.
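The paper's own estimator is not reproduced here, but the basic additive-scrambling idea behind quantitative RRT can be sketched: each respondent reports their true value plus private random noise with mean zero, so the interviewer never sees any individual's true value, yet the sample mean of the scrambled responses still estimates the population mean (function and parameter names are illustrative):

```python
import random

def rrt_mean_estimate(true_values, noise_sd=2.0, seed=7):
    # each respondent adds zero-mean noise before answering;
    # E[y + s] = E[y], so the mean of the reports estimates the true mean
    rng = random.Random(seed)
    reports = [y + rng.gauss(0, noise_sd) for y in true_values]
    return sum(reports) / len(reports)
```

The price of the privacy protection is extra variance in the estimator, which is exactly the trade-off that proposed RRT estimators (with or without ME) try to improve.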

Noise is invariably present in digital images during the acquisition, coding, transmission, and processing steps. The main purpose of de-noising an image is to restore the detail of the original image as much as possible. Depending on the type of disturbance, noise can affect the image to different extents. Salt-and-pepper noise arises in an image because of sharp and sudden changes in the image signal; dust particles in the image acquisition source or overheated faulty components can cause this type of noise. In the field of image processing, several types of linear and non-linear filtering techniques have been proposed for reducing noise in images. Median, Gaussian, and Wiener filters are used in this comparative study on different grayscale images of the same type. The quality of the enhanced images is measured by the statistical quality measures Peak Signal-to-Noise Ratio (PSNR), Mean Square Error (MSE), and the Structural Similarity Index (SSIM). An application-specific measure for natural flower images, the Deviation Measure (DM), is also defined; it is obtained by comparing the filtered and original images and shows the deviation between them.
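The effect of a median filter on salt-and-pepper noise is easy to show with a 1-D sketch (real image filters slide a 2-D window, but the principle is identical): an isolated impulse never reaches the median of its window, so it is removed without blurring edges the way a mean filter would:

```python
def median_filter_1d(signal, k=1):
    # median filter with window 2k+1; edge windows are simply truncated
    out = []
    for i in range(len(signal)):
        window = sorted(signal[max(0, i - k):i + k + 1])
        out.append(window[len(window) // 2])
    return out
```
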

This compendium on maintaining the professional activity of employees aged 50+ was developed within the project STAY. Wsparcie aktywności zawodowej osób 50+ w przedsiębiorstwie. Zdrowy i zmotywowany pracownik – zadowolony pracodawca (STAY. Supporting the professional activity of people aged 50+ in the enterprise. A healthy and motivated employee – a satisfied employer), co-financed by the European Union. In the compendium you will find practical examples of age-management solutions. We also show how to design and implement solutions that help keep people aged 50+ in employment, and how to benefit from the age diversity of the workforce. This new approach to managing employees, drawing on the potential of all of them regardless of age, brings measurable benefits to both employees and the enterprise.

Image compression is a wide area. Interest in compressing images is increasing because raw images need large amounts of disk space, which is a big disadvantage during transmission and storage. Even though many compression techniques already exist, a better technique that is faster, more memory-efficient, and simpler would surely suit the requirements of users. This work explains the use of different compression techniques and identifies which is more suitable for particular data based on compression ratio, bits per pixel, Mean Square Error, and Peak Signal-to-Noise Ratio. Using all four parameters, the compression pipeline is evaluated and produces a compressed image as output.
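Two of the four evaluation parameters are simple ratios; a sketch of how compression ratio and bits per pixel are typically computed (MSE and PSNR, by contrast, are pixel-wise error measures computed between the original and the reconstructed image):

```python
def compression_ratio(original_size_bytes, compressed_size_bytes):
    # how many times smaller the compressed file is than the original
    return original_size_bytes / compressed_size_bytes

def bits_per_pixel(compressed_size_bytes, width, height):
    # average number of stored bits per image pixel
    return compressed_size_bytes * 8 / (width * height)
```
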

Continuous infrastructural development in the country consumes a massive amount of natural resource materials, which could be partially or fully replaced in the future by the wide variety of industrial waste materials available in the country. One such case is the construction of MSE walls for the smooth flow of traffic over state and national highways, which requires huge amounts of non-plastic granular fine aggregate for backfilling. In this study, most of the potential industrial waste materials that can be used in place of standard structural fill materials are analyzed and compared. The tabular comparison helps in understanding the scope for using such waste materials in MSE walls now or in the future, and the material-acceptability flow chart describes the process of choosing a waste material for use as backfill in MSE walls.

ABSTRACT Channel estimation is a vital part of the mobile wireless receiver and significantly improves the performance of the system, especially for 4G LTE and 5G systems. The main goal of this project is to test time-variant channel estimation using an F-OFDM model. It investigates the Mean Square Error (MSE) of the Least Squares (LS) method and compares the performance of OFDM and F-OFDM using BER vs. SNR.
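A minimal sketch of the LS estimator evaluated in the project: at each pilot subcarrier, the channel is estimated as the received value divided by the known transmitted pilot, and the MSE against the true channel response is then measured. The pilot values and the noiseless check are illustrative assumptions:

```python
def ls_channel_estimate(tx_pilots, rx_pilots):
    # least-squares estimate per subcarrier: H_hat[k] = Y[k] / X[k]
    return [y / x for x, y in zip(tx_pilots, rx_pilots)]

def channel_mse(h_true, h_est):
    # mean square error between true and estimated channel responses
    return sum(abs(a - b) ** 2 for a, b in zip(h_true, h_est)) / len(h_true)
```

With additive noise on the received pilots, the LS estimate inherits that noise directly, which is why its MSE grows as SNR falls.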

Watermarking schemes are used to secure data and protect digital content from unauthorized modification. Digital image watermarking is an important technology for multimedia authentication and copyright protection, enhancing their reliability and security. Image watermarking can effectively be used in medical image processing to authenticate or investigate the integrity of medical images. Combining cryptography and watermarking is an efficient method for security. Peak Signal-to-Noise Ratio and Normalized Correlation are computed to assess the quality of the watermarked images and of the extracted watermark information.
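Normalized Correlation (NC) between the embedded and extracted watermarks is a standard closeness measure; a minimal sketch assuming the watermarks are flat lists of bits:

```python
def normalized_correlation(w_orig, w_extr):
    # NC = sum(w * w') / sum(w * w); 1.0 means a perfectly recovered watermark
    num = sum(a * b for a, b in zip(w_orig, w_extr))
    den = sum(a * a for a in w_orig)
    return num / den
```
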

Typically, temporary soil nailing systems are not required to be designed for design-level earthquake occurrences consistent with the building or structure being constructed inside the excavation. However, the seismic response of permanent soil nail walls during earthquakes should be evaluated. Moreover, the 3D response of soil nail walls exhibits some peculiar behaviours that should be considered in numerical analysis. In this paper, numerical simulations of soil nail walls under vibrational input have been carried out, and the results are compared with the behaviour of soil nail walls under ordinary static loading. The influence of nail geometry is examined under both static and seismic analysis, and further investigations are carried out to find the response of soil nail walls in several 3D excavation forms. The analysis is performed with the finite difference software FLAC3D. The results are presented as lateral displacements of the walls and normalized maximum tensile forces in the nails; they demonstrate the behavior of the external and internal resistance of soil nail walls under seismic and static analysis. The deformation of the wall under static and dynamic loading varies over a wide range, and the tensile loads produced in the nails under static loading are roughly 50% less than under dynamic loading.

Image compression addresses the matter of reducing the amount of data needed to represent a digital image. There are several transformation techniques used for data compression; the Discrete Cosine Transform (DCT) and the Discrete Wavelet Transform (DWT) are the most widely used. The DCT transforms an image from the spatial domain to the frequency domain; it has a high energy-compaction property and requires few computational resources. The DWT, on the other hand, is a multi-resolution transformation. This paper reviews various approaches that have been used by different researchers for image compression. The analysis has been carried out in terms of the performance parameters Peak Signal-to-Noise Ratio, bit error rate, compression ratio, Mean Square Error, and the time taken for decomposition and reconstruction.
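The energy-compaction property of the DCT mentioned above is easy to see in one dimension (a 2-D image DCT applies the same transform to rows and then columns); this is an orthonormal DCT-II sketch:

```python
import math

def dct2_1d(x):
    # orthonormal DCT-II: smooth signals concentrate energy in low coefficients
    n = len(x)
    out = []
    for k in range(n):
        s = sum(x[i] * math.cos(math.pi * (2 * i + 1) * k / (2 * n))
                for i in range(n))
        scale = math.sqrt(1.0 / n) if k == 0 else math.sqrt(2.0 / n)
        out.append(scale * s)
    return out
```

For a constant block, all the energy lands in the single DC coefficient and the rest are zero, which is why smooth image regions compress so well after quantization.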

Volatility forecasting is an interesting and challenging topic in current financial instruments, as it is directly associated with profits. There are many risks and rewards directly associated with volatility; hence, forecasting volatility becomes an indispensable topic in finance. GARCH distributions play an important role in risk measurement and option pricing. The main motive of this paper is to measure the performance of GARCH techniques for forecasting volatility using different distribution models. We have used nine variations of distribution models to forecast the volatility of a stock entity: Std, Norm, SNorm, GED, SSTD, SGED, NIG, GHYP, and JSU. Volatility is forecasted 10 days in advance and the values are compared with the actual values to find the best distribution model for volatility forecasting. From the results obtained, the GARCH model with the GED distribution outperformed all the other models.
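All nine variants share the same GARCH(1,1) conditional-variance recursion and differ only in the assumed innovation distribution; a sketch of the recursion (the parameter values in the check below are illustrative, not estimates from the paper):

```python
def garch11_variances(returns, omega, alpha, beta):
    # sigma2_t = omega + alpha * r_{t-1}^2 + beta * sigma2_{t-1},
    # started from the unconditional variance omega / (1 - alpha - beta)
    sigma2 = [omega / (1.0 - alpha - beta)]
    for r in returns[:-1]:
        sigma2.append(omega + alpha * r ** 2 + beta * sigma2[-1])
    return sigma2
```

Multi-step forecasts iterate the same recursion forward with the squared return replaced by its conditional expectation, which is how the 10-day-ahead values are produced.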

A full-scale reinforced earth embankment was designed and constructed by the Department of Highways of Thailand on a hard foundation in Phitsanulok Province, Thailand. Two types of reinforcement were used in the embankment. One side was reinforced with polymeric reinforcement consisting of polyester (PET), polypropylene (PP) and high-density polyethylene (HDPE) and referred to as a reinforced steep slope (RSS), with an angle of 70° from horizontal. On the other side, the embankment was reinforced with metallic reinforcement consisting of metallic strips (MS) and steel wire grids (SWG) combined with vertical segmental concrete facing and referred to as a mechanically stabilised earth wall (MSEW). The behaviour of the reinforced soil slope and the mechanically stabilised earth wall on a hard foundation were observed and compared with predictions from the PLAXIS 3D software. The lateral displacements and settlements were very small in the case of the MSEW with inextensible reinforcement. The corresponding lateral and vertical deformations in the RSS were much larger due to its extensible reinforcing materials. The stiffnesses of the reinforcing materials decrease in the following order: MS, SWG, PP, HDPE and PET. The results obtained from three-dimensional (3D) finite element method simulations (using PLAXIS 3D) were in good agreement with the field measurements in terms of vertical and lateral deformations and strains in the reinforcement.

Posidonia oceanica (L.) is an endemic sea grass of the Mediterranean Sea. Its dead leaves accumulate on beaches, where this biomass is considered beach waste and is therefore burned to keep the beaches clean. In the present study, an alternative eco-friendly approach, the removal of methylene blue by dead leaves of P. oceanica, is studied. Dynamic removal of methylene blue from aqueous solution is performed in a fixed-bed column. The effects of different flow rates and bed heights on column performance are investigated, and the best flow rate and bed height are observed at 7.28 mL/min and 9 cm, respectively. Column performance has been modeled with the Thomas and Artificial Neural Network models. The results confirm that dead leaves of P. oceanica can be used as a fixed-bed material for the dynamic removal of dyes in wastewater from the textile industry.
► Posidonia oceanica is an important endemic sea grass of the Mediterranean Sea. ► Its dead leaves accumulated on touristic beaches are considered beach waste. ► This study presents an alternative way of using the dead leaves of Posidonia oceanica. ► MB biosorption is modeled with the Thomas and Artificial Neural Network models. ► These beach wastes can be used for the dynamic removal of dyes from aqueous solutions.
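The Thomas model used for the column data predicts the breakthrough curve C_t/C_0 as a logistic function of time; a sketch with symbols k_Th (rate constant), q_0 (sorption capacity), m (bed mass), C_0 (inlet concentration) and Q (flow rate). Units must be mutually consistent, and the parameter values in the check below are illustrative, not fitted values from the study:

```python
import math

def thomas_ct_over_c0(t, k_th, q0, m, c0, flow):
    # Thomas model: C_t / C_0 = 1 / (1 + exp(k_th * (q0 * m - c0 * flow * t) / flow))
    return 1.0 / (1.0 + math.exp(k_th * (q0 * m - c0 * flow * t) / flow))
```

By construction the curve passes through 0.5 at t = q0 * m / (c0 * Q), the time at which half the bed capacity is exhausted.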

The attributes of micro and small enterprises (MSEs) influencing access to credit, in particular the level and role of firm informality, are analysed in this article. The puzzle is the push for MSEs to join the formal sector and the tug to avoid the extra burden it places on the firm. It is important to know more clearly what forces are at work and what the sources of the causal effects are. This study uses data from the World Bank Enterprise Surveys for five low-income countries (LICs) in Sub-Saharan Africa. The method is empirical and, as we find informality to be endogenous to credit constraints, an instrumental-variable approach is estimated. To address the possibility of reverse causality, an instrument for the informality variable is required; not being registered with Inland Revenue (the tax office) is the chosen instrument. The findings reveal that as the probability of a firm operating in the formal sector increases, there is greater access to external credit. The causality relationships are tested, providing a strong platform for the formulation of policies to reduce the informality of the MSE sector. These are discussed in the context of the research findings.
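The instrumental-variable logic can be sketched in its simplest (just-identified, Wald) form: with one endogenous regressor x (informality), one instrument z (not registered with the tax office), and outcome y (credit access), the IV slope is Cov(z, y) / Cov(z, x). This illustrates the method only, not the paper's actual multivariate specification:

```python
def iv_slope(y, x, z):
    # Wald IV estimator: beta_hat = Cov(z, y) / Cov(z, x);
    # z must be correlated with x but affect y only through x
    n = len(y)
    my, mx, mz = sum(y) / n, sum(x) / n, sum(z) / n
    cov_zy = sum((zi - mz) * (yi - my) for zi, yi in zip(z, y))
    cov_zx = sum((zi - mz) * (xi - mx) for zi, xi in zip(z, x))
    return cov_zy / cov_zx
```
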

Motorcycles have a higher rate of fatal accidents than cars, trucks, or buses. With the increase in the popularity of motorcycling, the safety of riders has become an issue; hence, crash guards have become a necessary safety measure for motorcycles. But the crash guards available have a fixed and broad frame, which makes maneuvering through dense traffic difficult, and, although subjective, some riders remove crash guards because they affect the motorcycle's appearance. This system proposes a crash guard of varying width with two modes: retracted and extended. The crash guard ideally remains in retracted mode, but if the motorcycle crosses a threshold roll angle, the crash guard extends. An accelerometer sensor is used to sense the roll angle of the motorcycle, and motors are used to extend and retract the crash guard. This system will prevent the motorcycle from falling on the rider's leg and will protect its engine in case of an accident.
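The trigger logic can be sketched from the accelerometer's gravity components: the roll (lean) angle follows from the atan2 of the lateral and vertical accelerations, and the guard extends once a threshold is crossed. The 35° threshold and the axis convention here are illustrative assumptions, not values from the paper:

```python
import math

def roll_angle_deg(ay, az):
    # lean angle from the gravity vector measured by the accelerometer
    return math.degrees(math.atan2(ay, az))

def guard_state(ay, az, threshold_deg=35.0):
    # extend the crash guard once the lean exceeds the threshold, either side
    return "extended" if abs(roll_angle_deg(ay, az)) > threshold_deg else "retracted"
```

A real controller would also low-pass filter the readings, since a static accelerometer cannot distinguish leaning from lateral acceleration in a turn.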

In the field of artificial intelligence, the Adaptive Learning Technique refers to a combination of artificial neural networks. In this research paper, the Adaptive Learning Technique has been implemented to carry out the detection and localization of sound. Two methods are used to detect the pure sound: in the first, a Wiener filter is used to reduce the amount of noise in a signal and minimize the mean square error (MSE), and in the second, a Wiener filter with bacterial foraging optimization is used for greater effectiveness on sound. These proposed methods are compared, and the results reveal the superiority of the latter.
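The MSE-minimizing property of the Wiener filter is easiest to see per frequency bin: the gain is the ratio of signal power to total power, so clean bins pass through untouched and noisy bins are suppressed (a frequency-domain sketch assuming the power spectra are known):

```python
def wiener_gain(signal_power, noise_power):
    # per-bin Wiener gain H = S / (S + N); this choice minimizes the MSE
    return [s / (s + n) for s, n in zip(signal_power, noise_power)]
```
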

The research aimed at determining the factors influencing the acceptance of mobile applications in enterprise management, specifically testing whether a relationship exists between demographic features, Perceived Usefulness (PU), Perceived Ease of Use (PEOU), and Perceived Risk (PR) and the acceptance of mobile money payment (MMP) among micro and small enterprises in Kisumu City, Kenya. The research used a descriptive correlational design, enabling the researcher to determine the strength of the relationships between the study variables, and employed simple random and purposive sampling for the selection of the samples. The questionnaires were administered by the researcher, with the help of two research assistants, to MSE owners in the Kisumu City CBD. A total of 271 respondents returned their questionnaires, a response rate of 95.8%, which was considered satisfactory for subsequent analysis. Due to the quantitative nature of the study, the results were analyzed with statistical tools using the mean, mode, median, frequency tables, and multiple regression. The results indicated that demographic features, PU, PEOU, and PR are significantly related to acceptance of MMP. PEOU was found to be the most influential determinant of acceptance, with r = 0.872 and p-value = 0.013 at the 0.05 significance level. The analysis also revealed that PR was the major hindrance to users accepting the technology, with a strong negative relationship of r = -0.548 and p-value = 0.003. The study established the joint contribution of the four independent variables to acceptance of MMP but failed to establish the individual contribution of each independent variable, hence the need for further studies.

Histogram equalization is a simple and effective contrast enhancement technique. In spite of its popularity, histogram equalization still has some limitations: it produces artifacts and unnatural images, and local details are not considered. Due to these limitations, many other equalization techniques have been derived from it with various enhancements. In the proposed method, statistics play an important role: statistical operations are applied to the image to obtain the desired result, such as manipulation of brightness and contrast. Thus, a novel algorithm using statistical operations and neighborhood processing is proposed in this paper, and the algorithm has proven effective for contrast enhancement in both theory and experiment.
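For reference, the classic global histogram equalization that the proposed method improves upon maps each gray level through the image's cumulative distribution function; a minimal sketch for a flat list of 8-bit pixels:

```python
def histogram_equalize(pixels, levels=256):
    # classic global HE: remap each level through the cumulative histogram
    n = len(pixels)
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    cdf, running = [], 0
    for h in hist:
        running += h
        cdf.append(running)
    cdf_min = next(c for c in cdf if c > 0)
    if n == cdf_min:                 # flat image: nothing to equalize
        return list(pixels)
    return [round((cdf[p] - cdf_min) / (n - cdf_min) * (levels - 1))
            for p in pixels]
```

Because the mapping is computed from the whole image at once, local details are ignored, which is exactly the limitation that neighborhood-based variants address.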

There has been a remarkable rise in data exchange over the web and in the general use of digital media. The interest in digital watermarking throughout the last decade is certainly due to the increased need for copyright protection of digital data. Applications of video watermarking in copy control, broadcast monitoring, video authentication, fingerprinting, copyright protection, etc., are growing immensely. The major aspects of information hiding are security, capacity, and robustness. Capacity deals with the amount of information that can be concealed. Robustness refers to the resistance to modification of the cover content before the concealed information is destroyed, i.e., how hard it is for anyone to remove the hidden information. Video watermarking algorithms normally prefer robustness: in a robust algorithm, it is not possible to eliminate the watermark without appreciable degradation of the cover content. In this paper, we discuss the features required to design a robust watermarked video for a valuable application, review several algorithms, and introduce commonly used key techniques. Video watermarking is a young and rapidly growing field in the area of multimedia; the aim of this paper is to focus on its various domains, and most of the reviewed methods highlight the concept of algorithm robustness.

With the increase in demand, multimedia production is growing fast and thus contributes to insufficient network bandwidth and memory storage. Image compression is therefore significant for reducing data redundancy, saving memory and transmission bandwidth. An efficient compression technique is proposed which combines fuzzy logic with Huffman coding. While normalizing the image pixels, each pixel value belonging to the image foreground is characterized and interpreted; the image is subdivided into pixels, which are then characterized by a pair of approximation sets. Here, encoding uses Huffman codes, which are statistically independent and produce a more efficient code for compression, and decoding uses rough fuzzy logic to rebuild the image pixels. The method used here is the rough fuzzy logic with Huffman coding algorithm (RFHA). A comparison of different compression techniques with Huffman coding is made, and fuzzy logic is applied to the Huffman-reconstructed image. Results show that high compression rates are achieved with visually negligible differences between the compressed and original images.
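The Huffman half of the RFHA scheme assigns shorter codes to more frequent symbols; a sketch that builds the tree with a heap and returns the code length per symbol (code lengths alone are enough to see the compression gain):

```python
import heapq

def huffman_code_lengths(freqs):
    # merge the two least frequent subtrees repeatedly; every merge adds
    # one bit to the code length of each symbol inside the merged subtrees
    heap = [(f, i, {sym: 0}) for i, (sym, f) in enumerate(freqs.items())]
    heapq.heapify(heap)
    tie = len(heap)            # unique tiebreaker so dicts are never compared
    while len(heap) > 1:
        f1, _, c1 = heapq.heappop(heap)
        f2, _, c2 = heapq.heappop(heap)
        merged = {s: d + 1 for s, d in {**c1, **c2}.items()}
        heapq.heappush(heap, (f1 + f2, tie, merged))
        tie += 1
    return heap[0][2]
```

The resulting lengths always satisfy the Kraft equality, so a valid prefix code with those lengths exists.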

This paper deals with the performance of the fractional Fourier transform (FRFT) used in place of the conventional Fourier transform (FFT) in both symbol timing offset (STO) and carrier frequency offset (CFO) estimation. Orthogonal frequency division multiplexing (OFDM) is widely used in many systems due to the advantages of this technique compared with single-carrier systems. In spite of its advantages, OFDM presents drawbacks such as sensitivity to timing and frequency offsets. Many techniques in the literature estimate these two parameters in order to compensate for them (the synchronization task); these techniques use the conventional Fourier transform. In this paper, we are interested in estimating the STO and CFO using the fractional Fourier transform. Monte Carlo simulation demonstrates the performance of using the FRFT instead of the FFT.
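A common baseline for CFO estimation (against which an FRFT-based variant would be compared) uses a training symbol built from two identical halves: the phase rotation between the halves reveals the frequency offset. A sketch, with the offset normalized to cycles per sample; this illustrates the general principle, not the paper's specific algorithm:

```python
import cmath
import math

def estimate_cfo(rx, half_len):
    # correlate the two identical halves of the training symbol;
    # the phase of the correlation encodes the carrier frequency offset
    corr = sum(rx[i].conjugate() * rx[i + half_len] for i in range(half_len))
    return cmath.phase(corr) / (2 * math.pi * half_len)
```

The estimate is unambiguous only while the accumulated phase stays within ±π, i.e. for offsets below 1 / (2 * half_len) cycles per sample.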