Pit Pichappan - Academia.edu
Papers by Pit Pichappan
Journal of Emerging Technologies in Web Intelligence, Aug 1, 2012
ABSTRACT This paper suggests a new compression algorithm based on Boolean minimization of binary data. On the compressor side, the input bitstream is divided into blocks of N bits each, and a "sum of products" function is found for each block using the Quine-McCluskey algorithm. The minimized "sum of products" functions are stored in a file, after which Huffman coding is applied to this file. The obtained Huffman code is used to convert the original file into a compressed one. On the decompression side, the Huffman tree is used to retrieve the original file. Since we combine Quine-McCluskey with Huffman coding, we call the proposed algorithm QMHuff(N). The experimental investigation showed that good compression is achieved only for files with a low proportion of binary "1"s or "0"s; more work needs to be done to improve the performance of this technique over the whole input space.
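As a rough sketch of the block-splitting and Huffman-coding stages of such a pipeline (the Quine-McCluskey minimization step is omitted for brevity, and the block size, input, and function names below are illustrative assumptions, not the paper's implementation):

```python
import heapq
from collections import Counter

def huffman_codes(symbols):
    """Build a Huffman code table {symbol: bit string} from symbol frequencies."""
    freq = Counter(symbols)
    if len(freq) == 1:  # degenerate case: a single distinct symbol
        return {next(iter(freq)): "0"}
    # heap entries: (frequency, unique tie-breaker, {symbol: partial code})
    heap = [(f, i, {s: ""}) for i, (s, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        f1, _, c1 = heapq.heappop(heap)   # two least-frequent subtrees
        f2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + code for s, code in c1.items()}
        merged.update({s: "1" + code for s, code in c2.items()})
        heapq.heappush(heap, (f1 + f2, counter, merged))
        counter += 1
    return heap[0][2]

def compress(bitstream, n=4):
    """Split a bit string into n-bit blocks and Huffman-code the blocks.

    Assumes len(bitstream) is a multiple of n, for simplicity.
    """
    blocks = [bitstream[i:i + n] for i in range(0, len(bitstream), n)]
    table = huffman_codes(blocks)
    return "".join(table[b] for b in blocks), table
```

On a skewed input such as `"0000000011110000"`, the frequent all-zero block receives a one-bit code, which is consistent with the abstract's observation that files dominated by one bit value compress well.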
Nature-inspired computing algorithms (NICs for short) have a history tracing back to Genetic Algorithms and Evolutionary Computing in the 1950s. Since the birth of the Firefly Algorithm in February 2008, NICs have attracted considerable attention from researchers around the globe, and variants and even new species of NIC algorithms have sprung up like shoots after rain. While it may be disputable to question the necessity of creating further new species of such algorithms, it can be observed that these algorithms are fundamentally made up of several core components. By exposing these components, the underlying design of a collection of so-called modern NIC optimization algorithms is revealed. In this paper, the core components of some popular NIC algorithms are reviewed, thereby debunking the myth of novelty, and perhaps the excitement, of inventing something really "new" by simply branding a NIC search method with the name of another living creature. This work also serves as a general guideline and reference for any algorithm architect who wants to create a new NIC algorithm in the future.
Journal of Digital Information Management, 2008
ABSTRACT Web applications are increasingly found to affect numerous environments. Web log data offer considerable promise, and the application of genetic algorithms is particularly significant as it captures the relations between different data components. We applied simple genetic algorithms to log files and found the preliminary results promising, thereby opening more avenues for future research.
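The abstract does not specify the GA encoding or fitness used on the log files; purely as an illustration of the kind of simple genetic algorithm referred to, here is a minimal bit-string GA with a hypothetical fitness (matching a target access pattern):

```python
import random

def genetic_algorithm(fitness, length=16, pop_size=30, generations=50,
                      mutation_rate=0.05, seed=0):
    """Minimal GA: tournament selection, one-point crossover, bit-flip mutation."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
    for _ in range(generations):
        def select():
            a, b = rng.sample(pop, 2)            # tournament of size two
            return a if fitness(a) >= fitness(b) else b
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = select(), select()
            cut = rng.randrange(1, length)        # one-point crossover
            child = p1[:cut] + p2[cut:]
            # bit-flip mutation: XOR each bit with a rare random flip
            child = [bit ^ (rng.random() < mutation_rate) for bit in child]
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

# hypothetical fitness: how many positions match a target access pattern
target = [1, 0] * 8
best = genetic_algorithm(lambda ind: sum(a == b for a, b in zip(ind, target)))
```

In a log-mining setting, the individual would instead encode a candidate rule or session pattern and the fitness would score it against the log records.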
Int. J. Web Appl., 2009
Currently, search engines suffer from conceptual problems despite the application of refined techniques. The fundamental problem lies in the absence of classical information processing concepts in search engine processing. This paper highlights the limitations of search engines in phrase processing. The results of search engine retrievals are compared with proprietary database search results. The inferences, if applied in search engine processing, would lead to higher precision in search engine retrievals.
Journal of Digital Information Management, 2007
Journal of Digital Information Management, Jun 1, 2023
ChatGPT has, in recent months, made a significant impact and gained wide exposure in the information world. Many studies have been conducted within a short timeframe on its efficiency, reliability, ethics, accuracy and acceptance, and hundreds of opinion- and perception-based analyses have also emerged. In this work, we look at ChatGPT as a question-answering tool. We used randomly generated prompts to solicit answers and analysed the results from a text-analysis angle. The answers were compared with text analysers both manually and statistically. ChatGPT still lacks precision in its handling of linguistic effects and fails to meet users' requirements comprehensively.
In this paper, we present a model based on a Neural Network (NN) for classifying Arabic texts. We propose the use of Singular Value Decomposition (SVD) as a preprocessor for the NN to reduce the data in both size and dimensionality, so that the input data become more classifiable and the training process of the NN model converges faster. To test the effectiveness of the proposed model, experiments were conducted using an in-house Arabic corpus collected for text categorization. The results showed that the proposed model achieves high categorization effectiveness as measured by precision, recall and F-measure, and that the ANN model using SVD outperforms the basic ANN on Arabic text classification.
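A minimal sketch of the SVD preprocessing step described above (the term-document matrix below is a toy example, not the paper's corpus, and the function name is an assumption):

```python
import numpy as np

def svd_reduce(X, k):
    """Project a document-term matrix onto its top-k singular directions.

    X: (documents x terms) matrix. Returns a (documents x k) matrix of
    dense features, a smaller and lower-dimensional input for a neural
    classifier than the raw term counts.
    """
    U, S, Vt = np.linalg.svd(X, full_matrices=False)
    return X @ Vt[:k].T          # equivalently U[:, :k] * S[:k]

# toy term counts: 4 documents over 6 terms -- illustrative only
X = np.array([[2, 1, 0, 0, 0, 0],
              [1, 2, 1, 0, 0, 0],
              [0, 0, 0, 2, 1, 1],
              [0, 0, 1, 1, 2, 1]], dtype=float)
Z = svd_reduce(X, k=2)   # 6-dimensional documents compressed to 2 features
```

Each row of `Z` is a compact representation of one document; the NN is then trained on these k-dimensional vectors instead of the full sparse term vectors.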
Measuring the value and quality of a web page is a challenging issue. Research has produced many varying standards, and the absence of a global framework is evident. In the current work we generated a few quality measures for judging the quality of web page content. We limit our exercise to content quality and set aside other web page features.
This paper suggests splitting the existing email handlers into more than one group. The existing group can be considered as a group that has a one-to-one relationship. However, this paper suggests that email handlers should give the user the choice to select the relation that he would ...
Communications in computer and information science, 2010
Referencing earlier scholarly papers is a norm in scientific literature, and references have been used as a workable parameter to establish semantic links in the literature. With the explosive growth of networked information systems, it is becoming increasingly important to develop methods to automatically and effectively extract the semantic links of large collections. Online pages offer a hyperlink mechanism to other online and print resources; when references and hyperlinks are integrated, semantic linking becomes more meaningful. This paper outlines a prototype architecture together with the results of a pilot study.
Springer eBooks, 2018
The series "Advances in Intelligent Systems and Computing" contains publications on the theory, applications, and design methods of intelligent systems and intelligent computing. Virtually all disciplines are covered, such as engineering, natural sciences, computer and information science, ICT, economics, business, e-commerce, environment, healthcare, and life science. The list of topics spans all the areas of modern intelligent systems and computing, including: computational intelligence; soft computing, including neural networks, fuzzy systems, evolutionary computing and the fusion of these paradigms; social intelligence; ambient intelligence; computational neuroscience; artificial life; virtual worlds and society; cognitive science and systems; perception and vision; DNA- and immune-based systems; self-organizing and adaptive systems; e-learning and teaching; human-centered and human-centric computing; recommender systems; intelligent control; robotics and mechatronics, including human-machine teaming; knowledge-based paradigms; learning paradigms; machine ethics; intelligent data analysis; knowledge management; intelligent agents; intelligent decision making and support; intelligent network security; trust management; interactive entertainment; and Web intelligence and multimedia. The publications within "Advances in Intelligent Systems and Computing" are primarily proceedings of important conferences, symposia and congresses. They cover significant recent developments in the field, of both a foundational and an applied character. An important characteristic of the series is its short publication time and worldwide distribution, which permits rapid and broad dissemination of research results.
Scientometrics, Jun 1, 1993
He and Pao's method of identifying the journals of a specific discipline is improved by adding the citing impact factor and self-citing rate. The proposed indicator strikes a balance by discounting the size of a discipline, and it also paves the way to identifying the constituent journals of a discipline. The method was tested in Physics, Applied Physics, and Astronomy and Astrophysics; the findings prompt a rethinking of the inclusion of many journals in these fields.
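The abstract does not give the indicator's exact formula; the self-citing rate it builds on is conventionally the share of a journal's outgoing references that cite the journal itself, which can be computed as follows (the journal names and counts are hypothetical):

```python
def self_citing_rate(journal, citations_given):
    """Self-citing rate: fraction of a journal's references that point to itself.

    citations_given: {cited_journal: reference count} for one citing journal.
    """
    total = sum(citations_given.values())
    return citations_given.get(journal, 0) / total if total else 0.0

# hypothetical outgoing reference counts for "J. Phys. X"
refs = {"J. Phys. X": 30, "Phys. Rev.": 50, "Nature": 20}
rate = self_citing_rate("J. Phys. X", refs)   # 30 / 100 = 0.3
```

A discipline-identification indicator of the kind described would combine such per-journal rates with citing-side impact measures, discounted for discipline size.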
Springer eBooks, 2011
The use of general descriptive names, registered names, trademarks, etc. in this publication does not imply, even in the absence of a specific statement, that such names are exempt from the relevant protective laws and regulations and therefore free for general use.
The web has profound applications throughout the world and has influenced society more than any other technology. The present decade is marked by the increased deployment of web servers and the use of Web 2.0 for many utilities. As global access to Web 2.0 is ever increasing, hijackers eye Web 2.0 for attacks. The principal technology behind Web 2.0 is AJAX, which has now become the target of many security attacks. The current study proposes an enhanced architecture to reflect AJAX vulnerabilities; the architecture was applied in randomized trials, and the results call for its wider application.