Parsers Research Papers - Academia.edu
Any program written in a high-level programming language must be translated to object code before it can be executed. A compiler has several stages or passes, each of significant importance; parsing, or syntactic analysis, is the process of analyzing a string of input symbols, whether in natural language, a programming language, or a data structure, according to the rules of a formal grammar. This paper describes the kinds of parsers used to produce executable code, and focuses on a comparison between top-down and bottom-up parsing approaches. Each approach has notable advantages and drawbacks, and each suffers from certain bottlenecks.
- by Hamna Baqai and +1
- Computer Science, Lexical Semantics, Parsing, Parsers
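The abstract above contrasts top-down and bottom-up parsing. As a minimal illustration of the top-down approach, here is a recursive-descent parser for a toy arithmetic grammar; the grammar and all names are illustrative, not taken from the paper:

```python
# A minimal recursive-descent (top-down) parser for the toy grammar:
#   expr   -> term (('+'|'-') term)*
#   term   -> factor (('*'|'/') factor)*
#   factor -> NUMBER | '(' expr ')'
import re

def tokenize(src):
    return re.findall(r"\d+|[()+\-*/]", src)

class Parser:
    def __init__(self, tokens):
        self.tokens = tokens
        self.pos = 0

    def peek(self):
        return self.tokens[self.pos] if self.pos < len(self.tokens) else None

    def eat(self, tok=None):
        cur = self.peek()
        if cur is None or (tok is not None and cur != tok):
            raise SyntaxError(f"expected {tok!r}, got {cur!r}")
        self.pos += 1
        return cur

    def expr(self):
        node = self.term()
        while self.peek() in ('+', '-'):
            op = self.eat()
            node = (op, node, self.term())  # left-associative fold
        return node

    def term(self):
        node = self.factor()
        while self.peek() in ('*', '/'):
            op = self.eat()
            node = (op, node, self.factor())
        return node

    def factor(self):
        if self.peek() == '(':
            self.eat('(')
            node = self.expr()
            self.eat(')')
            return node
        return int(self.eat())

tree = Parser(tokenize("2+3*4")).expr()
print(tree)  # ('+', 2, ('*', 3, 4))
```

Each nonterminal becomes one function, and the parse tree (here, nested tuples) is built as the recursion unwinds; a bottom-up parser would instead shift tokens and reduce grammar rules from the leaves upward.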
- by Eduard Hovy
- English language, Theory, Taxonomy, Parsers
The present paper presents a structured literature review covering the following aspects: Infrastructure as Code (IaC) languages, modelling approaches supporting the generation of IaC, categories of languages and their characteristics, and security analysis techniques. More specifically, the objective is to identify, analyze, and interpret all available evidence from primary studies related to four specific research questions, following the review planning, conducting, and reporting stages suggested by the PRISMA methodology. More than 100 articles from recent years are surveyed, yielding a current, well-justified state of the art. Finally, the paper draws conclusions and outlines future research perspectives.
- by Paul W Placeway and +1
- Computer Architecture, Information Retrieval, Semantics, Syntax
Syntactic structure is an important component of natural language utterances, for both form and content. Therefore, a variety of applications can benefit from the integration of syntax into their statistical models of language. In this thesis, two new syntax-based models are presented, along with their training algorithms: a monolingual generative model of sentence structure, and a model of the relationship between the structure of a sentence in one language and the structure of its translation into another language. After these models are trained and tested on the respective tasks of monolingual parsing and word-level bilingual corpus alignment, they are demonstrated in two additional applications. First, a new statistical parser is automatically induced for a language in which none was available, using a bilingual corpus. Second, a statistical translation system is augmented with syntax-based models. Thus the contributions of this thesis include: a statistical parsing system; a bi...
- by Peter Venable
- Algorithms, Syntax, Parsing, Translation
This R&D Status Report covers progress in a study on natural language understanding which uses SDC Logic-Based Systems and integrated syntax and semantics. Progress areas include grammar work, semantics, improvement of computer facilities, change in ...
The term Natural Language Processing (NLP) refers to all research and development aimed at modeling and reproducing, with machines, the human capacity to produce and understand linguistic statements for the purpose of communication. NLP, which sits at the crossroads of linguistics, computer science, and artificial intelligence, puts its applications, programs, and many computing techniques at the service of human language in order to apprehend the meaning of natural-language data. A high-level understanding of human reasoning has long been sought and was considered the ultimate goal of early work. For many natural languages such as English and French, and for all levels of analysis, many applications and tools have been developed. For Modern Standard Arabic as well, several tools using new technologies such as artificial intelligence and machine learning have appeared over the decades. These tools generally concern machine translation, information retrieval, ontologies, part-of-speech tagging, etc. Unfortunately, for classical Arabic, the language of the Quran and Hadith, things are just beginning. The purpose of this paper is to show how to benefit from existing tools for Hadith Science processing, especially for the evaluation and judgment of Hadith, in order to distinguish between accepted and rejected Hadith.
The task of designing and implementing a compiler can be a difficult and error-prone process. In this paper, we present a new approach based on the use of higher-order abstract syntax and term rewriting in a logical framework. All program transformations, from parsing to code generation, are cleanly isolated and specified as term rewrites. This has several advantages. The correctness of the compiler depends solely on a small set of rewrite rules that are written in the language of formal mathematics. In addition, the logical framework guarantees the preservation of scoping, and it automates many frequently-occurring tasks including substitution and rewriting strategies. As we show, compiler development in a logical framework can be easier than in a general-purpose language like ML, in part because of automation, and also because the framework provides extensive support for examination, validation, and debugging of the compiler transformations. The paper is organized around a case st...
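The abstract above specifies compiler transformations as term rewrites. As a toy illustration of that idea (not the paper's logical framework), the sketch below applies a constant-folding rewrite rule bottom-up over a small term language encoded as nested tuples; the representation and rule are assumptions for this example:

```python
# Toy term rewriting: constant folding on terms like ('+', lhs, rhs),
# where leaves are either int constants or variable names (strings).
def fold(term):
    # Not a tuple -> a leaf (constant or variable); nothing to rewrite.
    if not isinstance(term, tuple):
        return term
    op, *args = term
    # Normalize children first so rules fire bottom-up.
    args = [fold(a) for a in args]
    # Rewrite rule: (op, c1, c2) -> c  when both operands are constants.
    if op in ('+', '*') and all(isinstance(a, int) for a in args):
        return args[0] + args[1] if op == '+' else args[0] * args[1]
    return (op, *args)

print(fold(('+', ('*', 2, 3), ('+', 'x', ('*', 4, 5)))))
# ('+', 6, ('+', 'x', 20))
```

The appeal of the rewriting view is that each transformation is a small, locally checkable rule; in a logical framework, as the paper argues, such rules can also be stated and verified formally, with scoping and substitution handled by the framework itself.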
This paper describes a draft visual notation for meta-programming. The main contributions of this work include specialized data structures (lists, tuples, trees), data-item associations that allow the creation of arbitrary graphs, visualization of data structures and data flows, graphical notation for pattern matching (list, tuple, and tree patterns, graphical notation for context-free grammars, streams), encapsulation means for hierarchical rule design, two-dimensional data-flow diagrams for rules, visual control constructs for conditionals and iteration, default mapping of results to reduce real-estate requirements for diagrams, and dynamic data attributes. Two-dimensional data-flow diagrams improve the readability of a meta-program. The abstract-syntax type definitions for common programming languages and the related default mappings (parsing and de-parsing) provide for practically feasible reuse of those components.
Applications of interactive expert-system techniques to the domain of automated cartography were studied in the context of the SRI Image Understanding Testbed, extended to include Symbolics Lisp Machines and the SRI ImagCalc(TM) software support system. As a sample domain, we chose to work mainly on the problem of locating generic cultural objects in aerial imagery. The discovery of such objects was accomplished by defining a generic model for rectilinear objects, along with rules for parsing the image geometry and correcting probable errors of the segmentation algorithm. These tools permit the semantic resegmentation of an initial syntactic scene partition to yield well-delineated buildings. The method owes its success to the combined utilization of both high-level and low-level knowledge about the target object context and the image. Keywords: Spatial reasoning; Cultural object detection; Image segmentation.
This paper describes a particular approach to parsing that utilizes recent advances in unification-based parsing and in classification-based knowledge representation. As unification-based grammatical frameworks are extended to
- by Jerry Hobbs
- Algorithms, Syntax, Matching, Parsers