Thang Luong's academic page
I am a research scientist at Google Brain, working to solve language understanding through deep learning.
I enjoyed 5 wonderful years as a PhD student in the Stanford NLP group, learning a great many things from my advisor, Prof. Christopher Manning.
I am from Vietnam and in case you haven't heard about it, my country has recently discovered Son Doong, the world's largest cave!
Here is my (generally outdated) CV.
News
Sep 2016 — Sep 26 marks my first day at Google! Exactly five years ago, I started my first day as a PhD student at Stanford. Time flies! Looking forward to a new journey; come say hi to me at Google!
Aug 2016 — We (Christopher Manning, Kyunghyun Cho, and I) gave a tutorial on Neural Machine Translation at ACL 2016 to an audience of over 200 people.
June 2016 — Three papers published at ACL and CoNLL, in Berlin, Germany.
Our paper "Achieving Open Vocabulary Neural Machine Translation with Hybrid Word-Character Models" has "conquered" English-Czech translation.
We have released our Matlab code too!
May 2016 — Busy but fun time! Talks at Google, OpenAI, Semantic Machines, Baidu, and Facebook. Guest lecture on Neural Machine Translation in the Stanford Deep Learning for NLP class CS224D. More importantly, I have defended my thesis!
Mar 2016 — Excited to be in Puerto Rico this May for ICLR 2016!
Our paper "Multi-task Sequence to Sequence Learning" has achieved state-of-the-art results in constituent parsing with 93 F1.Feb 2016 — Talks at Microsoft Research and USC, ISI.
Dec 2015 — In Da Nang, Vietnam, for a panel discussion on "New trends in Spoken Language Translation" at IWSLT.
We also have a winning entry for English-German TED talk translation, a 26% error reduction over the 2nd place.
Nov 2015 — Guest lecture on Neural Machine Translation in the Stanford NLP class CS224N.
Oct 2015 — Talk at IBM Watson Research.
Sep 2015 — Three papers published at EMNLP and CogACLL, in Lisbon, Portugal.
Our paper "Effective Approaches to Attention-based Neural Machine Translation" achieves the best result in English-German translation.
We have released our Matlab code too!
Jul 2015 — Three papers published at ACL and CoNLL, in Beijing, China.
Our paper "Addressing the Rare Word Problem in Neural Machine Translation" describes the very first state-of-the-art neural translation system.May 2015 — Two papers at the NAACL VSM workshop, in Colorado, USA.
Our paper "Bilingual Word Representations with Monolingual Quality in Mind" achieves the best performance in the cross-lingual document classification task.
Our code, which improves and extends word2vec, has been released too!
Oct 2014 — I had a fruitful summer internship at Google hosted by Quoc Le. Together with other great members of the Google Brain team (Ilya Sutskever, Oriol Vinyals, and Wojciech Zaremba), we built a neural machine translation system that, for the first time, outperforms the state-of-the-art system in the English-to-French WMT'14 translation task. See our ACL'15 paper.
Jul 2013 — Our TACL'13 paper "Parsing entire discourses as very long strings: capturing topic continuity in grounded language learning" with Prof. Mark Johnson and Prof. Michael C. Frank is now available. This work was done in Fall 2012 at Macquarie University, Australia.
Jun 2013 — Our CoNLL'13 paper "Better Word Representations with Recursive Neural Networks for Morphology" with Richard Socher and Prof. Christopher Manning is now available. See you in Sofia!
Apr 2012 — This Winter quarter, I am officially aligned with Prof. Christopher Manning, i.e., my advisor!
Jan 2012 — Happy new year, everyone! This Spring quarter, I am back to Gates, joining Prof. Christopher Manning's NLP group as part of the research rotation program.
Sep 2011 — This Fall quarter, I am having a research rotation with Prof. Noah D. Goodman, exploring computational models for human reading time prediction.
Aug 2011 — Paper accepted at ASRU'11: "A Trajectory-based Parallel Model Combination with a Unified Static and Dynamic Parameter Compensation For Noisy Speech Recognition" with Prof. Khe Chai Sim.
Apr 2011 — I will be joining the Stanford CS department this Fall as a PhD candidate (Sep 2011).
Research
I am now generally interested in deep learning approaches to natural language understanding. In recent years (2014 & 2015), I have worked extensively on neural machine translation.
In the past, I built parsers used in psycholinguistics applications, worked on scholarly digital library systems, and investigated model-based techniques for robust speech recognition systems.
Publications
2016
Minh-Thang Luong and Christopher D. Manning. A Hybrid Word-Character Approach to Open Vocabulary Neural Machine Translation. ACL’16.
[ Paper ] [ Bib ] [ Project page ] [ SOTA English-Czech translation in WMT ]
Joern Wuebker, Spence Green, John DeNero, Sasa Hasan, and Minh-Thang Luong. Models and Inference for Prefix-Constrained Machine Translation. ACL’16.
[ Paper ] [ Bib ] [ SOTA in the prefix completion task for interactive machine translation ]
Abigail See*, Minh-Thang Luong*, and Christopher D. Manning. Compression of Neural Machine Translation Models via Pruning. CoNLL’16.
[ Paper ] [ Bib ] [ Can prune models up to 80% without loss of performance. ]
Minh-Thang Luong, Ilya Sutskever, Quoc V. Le, Oriol Vinyals, and Lukasz Kaiser. Multi-task Sequence to Sequence Learning. ICLR’16.
[ Paper ] [ Bib ] [ Poster ] [ SOTA results in constituent parsing, 93 F1 ]
2015
Minh-Thang Luong and Christopher D. Manning. Stanford Neural Machine Translation Systems for Spoken Language Domain. IWSLT’15.
[ Paper ] [ Bib ] [ Slides ] [ IWSLT result overview ] [ SOTA English-German system for TED talks, 26% better than the 2nd place. ]
Minh-Thang Luong, Hieu Pham, and Christopher D. Manning. Effective Approaches to Attention-based Neural Machine Translation. EMNLP’15.
[ Paper ] [ Bib ] [ Slides ] [ Project page ] [ SOTA results in WMT English-German translation ]
Jiwei Li, Minh-Thang Luong, Dan Jurafsky, and Eduard Hovy. When Are Tree Structures Necessary for Deep Learning of Representations? EMNLP’15.
[ Paper ] [ Bib ]
Minh-Thang Luong, Timothy J. O'Donnell, and Noah D. Goodman. Evaluating Models of Computation and Storage in Human Sentence Processing. CogACLL’15.
[ Paper ] [ Bib ] [ Slides ] [ Code (Earleyx parser) ]
Minh-Thang Luong*, Ilya Sutskever*, Quoc V. Le*, Oriol Vinyals, and Wojciech Zaremba. Addressing the Rare Word Problem in Neural Machine Translation. ACL’15.
[ Paper ] [ Bib ] [ Slides ] [ SOTA results in WMT English-French translation ]
Jiwei Li, Minh-Thang Luong, and Dan Jurafsky. A Hierarchical Neural Autoencoder for Paragraphs and Documents. ACL’15.
[ Paper ] [ Bib ]
Minh-Thang Luong, Michael Kayser, and Christopher D. Manning. Deep Neural Language Models for Machine Translation. CoNLL’15.
[ Paper ] [ Bib ] [ Poster ] [ Project page for code. ]
Minh-Thang Luong, Hieu Pham, and Christopher D. Manning. Bilingual Word Representations with Monolingual Quality in Mind. NAACL’15 VSM workshop.
[ Paper ] [ Bib ] [ Project page for code and trained embeddings. ] [ SOTA results in cross-lingual document classification ]
Hieu Pham, Minh-Thang Luong, and Christopher D. Manning. Learning Distributed Representations for Multilingual Text Sequences. NAACL’15 VSM workshop.
[ Paper ] [ Bib ]
2013
Minh-Thang Luong, Richard Socher, and Christopher D. Manning. 2013. Better Word Representations with Recursive Neural Networks for Morphology. CoNLL’13.
[ Paper ] [ Bib ] [ Dataset ] [ Project page for word vectors and other information. ]
Minh-Thang Luong, Michael C. Frank, and Mark Johnson. 2013. Parsing entire discourses as very long strings: Capturing topic continuity in grounded language learning. TACL’13.
[ Paper ] [ Bib ] [ Code (Earleyx parser) ]
2011
Khe Chai Sim and Minh-Thang Luong. 2011. A Trajectory-based Parallel Model Combination with a Unified Static and Dynamic Parameter Compensation For Noisy Speech Recognition. ASRU’11.
[ Paper ] [ Poster (.pdf) ]
Minh-Thang Luong, Thuy Dung Nguyen, and Min-Yen Kan. 2011. Logical Structure Recovery in Scholarly Articles with Rich Document Features. IJDLS 1(4), pp. 1-23. Invited paper.
[ Paper ]
2010
Minh-Thang Luong, Preslav Nakov and Min-Yen Kan. 2010. A Hybrid Morpheme-Word Representation for Machine Translation of Morphologically Rich Languages. In EMNLP’10.
[ Paper ]
Minh-Thang Luong and Min-Yen Kan. 2010. Enhancing Morphological Alignment for Translating Highly Inflected Languages. In COLING’10.
[ Paper ]
Thuy Dung Nguyen and Minh-Thang Luong. 2010. WINGNUS: Keyphrase Extraction Utilizing Document Logical Structure. In SemEval-2. Ranked 3rd in task #5 (Automatic Keyphrase Extraction from Scientific Articles).
[ Paper ]
Thuy Dung Nguyen, Min-Yen Kan, Dinh-Trung Dang, Markus Hänse, Ching Hoi Andy Hong, Minh-Thang Luong, Jesse Prabawa Gozali, Kazunari Sugiyama, and Yee Fan Tan. 2010. ForeCite: Towards a Reader-centric Scholarly Digital Library. In JCDL’10.
[ Paper ] [ Poster (.png) ]