Efficient Beam Tree Recursion
Related papers
TreeNet: Learning Sentence Representations with Unconstrained Tree Structure
Proceedings of the Twenty-Seventh International Joint Conference on Artificial Intelligence, 2018
Improved Semantic Representations From Tree-Structured Long Short-Term Memory Networks
Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing (Volume 1: Long Papers), 2015
Improving Tree-LSTM with Tree Attention
2019 IEEE 13th International Conference on Semantic Computing (ICSC)
Modeling Hierarchical Structures with Continuous Recursive Neural Networks
2021
Compositional Distributional Semantics with Long Short Term Memory
Proceedings of the Fourth Joint Conference on Lexical and Computational Semantics, 2015
Ordered Neurons: Integrating Tree Structures into Recurrent Neural Networks
ArXiv, 2019
Exploring Asymmetric Encoder-Decoder Structure for Context-based Sentence Representation Learning
2017
A Convolutional Neural Network for Modelling Sentences
Memory-Efficient Backpropagation for Recurrent Neural Networks
Advances in Artificial Intelligence, 2019
Tree-Structured Composition in Neural Networks without Tree-Structured Architectures
2015
Unsupervised Learning of Sentence Representations using Convolutional Neural Networks
2016
Learning continuous phrase representations and syntactic parsing with recursive neural networks
2010
Multi-cell LSTM Based Neural Language Model
ArXiv, 2018
Network in Sequential Form: Combine Tree Structure Components into Recurrent Neural Network
IOP Conference Series: Materials Science and Engineering, 2021
Grid Long Short-Term Memory
ICLR, 2016
Tree Memory Networks for Modelling Long-term Temporal Dependencies
Learning sentence embeddings using Recursive Networks
An attention-gated convolutional neural network for sentence classification
Intelligent Data Analysis, 2019
Recursive LSTM Tree Representation for Arc-Standard Transition-Based Dependency Parsing
Proceedings of the Third Workshop on Universal Dependencies (UDW, SyntaxFest 2019), 2019
You Only Need Attention to Traverse Trees
Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, 2019
Recognizing semantic relation in sentence pairs using Tree-RNNs and Typed dependencies
2020 6th IEEE Congress on Information Science and Technology (CiSt), 2020
Sentence embeddings in NLI with iterative refinement encoders
Natural Language Engineering, 2019
A feature-level attention-based deep neural network model for sentence embedding
International Journal of Intelligent Systems Technologies and Applications
Sequence Level Training with Recurrent Neural Networks
CoRR, 2016
Recursive Deep Models for Semantic Compositionality Over a Sentiment Treebank
Proceedings of the 2013 Conference on Empirical Methods in Natural Language Processing, 2013
Context- and sequence-aware convolutional recurrent encoder for neural machine translation
Proceedings of the 36th Annual ACM Symposium on Applied Computing
Convolutional Neural Network for Universal Sentence Embeddings
2018
TangoBERT: Reducing Inference Cost by using Cascaded Architecture
ArXiv, 2022
Recursive Neural Networks Can Learn Logical Semantics
Proceedings of the 3rd Workshop on Continuous Vector Space Models and their Compositionality, 2015
OCLSP at SemEval-2016 Task 9: Multilayered LSTM as a Neural Semantic Dependency Parser
Proceedings of the 10th International Workshop on Semantic Evaluation (SemEval-2016), 2016