Efficient Beam Tree Recursion

Related papers:

Haiqin Yang. "TreeNet: Learning Sentence Representations with Unconstrained Tree Structure." Proceedings of the Twenty-Seventh International Joint Conference on Artificial Intelligence (IJCAI), 2018.

Christopher D. Manning. "Improved Semantic Representations From Tree-Structured Long Short-Term Memory Networks." Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing (Volume 1: Long Papers), 2015.

Sara Stymne. "Proceedings of the First NLPL Workshop on Deep Learning for Natural Language Processing (DL4NLP 2019)." 2019.

Mahtab Ahmed. "Improving Tree-LSTM with Tree Attention." 2019 IEEE 13th International Conference on Semantic Computing (ICSC), 2019.

Jishnu Ray Chowdhury. "Modeling Hierarchical Structures with Continuous Recursive Neural Networks." 2021.

Phong Le. "Compositional Distributional Semantics with Long Short Term Memory." Proceedings of the Fourth Joint Conference on Lexical and Computational Semantics, 2015.

Aaron Courville. "Ordered Neurons: Integrating Tree Structures into Recurrent Neural Networks." arXiv, 2019.

Virginia de Sa. "Exploring Asymmetric Encoder-Decoder Structure for Context-based Sentence Representation Learning." 2017.

Virginia de Sa. "Speeding up Context-based Sentence Representation Learning with Non-autoregressive Convolutional Decoding." Proceedings of The Third Workshop on Representation Learning for NLP, 2018.

Kaisheng Yao. "Span-Based Neural Buffer: Towards Efficient and Effective Utilization of Long-Distance Context for Neural Sequence Models." Proceedings of the AAAI Conference on Artificial Intelligence.

Nal Kalchbrenner. "A Convolutional Neural Network for Modelling Sentences." Proceedings of the 52nd Annual Meeting of the Association for Computational Linguistics, 2014.

Issa Ayoub. "Memory-Efficient Backpropagation for Recurrent Neural Networks." Advances in Artificial Intelligence, 2019.

Christopher D. Manning. "Tree-Structured Composition in Neural Networks without Tree-Structured Architectures." 2015.

Ricardo Henao. "Unsupervised Learning of Sentence Representations using Convolutional Neural Networks." 2016.

Christopher D. Manning. "Learning Continuous Phrase Representations and Syntactic Parsing with Recursive Neural Networks." 2010.

Akshay Badola and Thomas Cherian. "Multi-cell LSTM Based Neural Language Model." arXiv, 2018.

Phong Le. "Quantifying the Vanishing Gradient and Long Distance Dependency Problem in Recursive Neural Networks and Recursive LSTMs." Proceedings of the 1st Workshop on Representation Learning for NLP, 2016.

Mohit Guru. "Network in Sequential Form: Combine Tree Structure Components into Recurrent Neural Network." IOP Conference Series: Materials Science and Engineering, 2021.

Nal Kalchbrenner. "Grid Long Short-Term Memory." International Conference on Learning Representations (ICLR), 2016.

Tharindu Fernando. "Tree Memory Networks for Modelling Long-term Temporal Dependencies."

Anson Bastos. "Learning Sentence Embeddings using Recursive Networks."

Asman Sadino. "An Attention-Gated Convolutional Neural Network for Sentence Classification." Intelligent Data Analysis, 2019.

Bernd Bohnet. "Recursive LSTM Tree Representation for Arc-Standard Transition-Based Dependency Parsing." Proceedings of the Third Workshop on Universal Dependencies (UDW, SyntaxFest 2019), 2019.

Robert Mercer. "You Only Need Attention to Traverse Trees." Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, 2019.

Abdul Nazeer K. A. "Recognizing Semantic Relation in Sentence Pairs using Tree-RNNs and Typed Dependencies." 2020 6th IEEE Congress on Information Science and Technology (CiSt), 2020.

Anssi Yli-Jyrä. "Sentence Embeddings in NLI with Iterative Refinement Encoders." Natural Language Engineering, 2019.

Salma Jamoussi. "A Feature-Level Attention-Based Deep Neural Network Model for Sentence Embedding." International Journal of Intelligent Systems Technologies and Applications.

Sumit Chopra. "Sequence Level Training with Recurrent Neural Networks." CoRR, 2016.

Richard Socher. "Recursive Deep Models for Semantic Compositionality Over a Sentiment Treebank." Proceedings of the 2013 Conference on Empirical Methods in Natural Language Processing, 2013.

Seba Susan. "Context- and Sequence-Aware Convolutional Recurrent Encoder for Neural Machine Translation." Proceedings of the 36th Annual ACM Symposium on Applied Computing.

Xiaoqi Jiao. "Convolutional Neural Network for Universal Sentence Embeddings." 2018.

Moshe Wasserblat. "TangoBERT: Reducing Inference Cost by using Cascaded Architecture." arXiv, 2022.

Sheng-yi Kong. "Universal Sentence Encoder." 2018.

Christopher D. Manning. "Recursive Neural Networks Can Learn Logical Semantics." Proceedings of the 3rd Workshop on Continuous Vector Space Models and their Compositionality, 2015.

Manjuan Duan. "OCLSP at SemEval-2016 Task 9: Multilayered LSTM as a Neural Semantic Dependency Parser." Proceedings of the 10th International Workshop on Semantic Evaluation (SemEval-2016), 2016.