Sida Wang's academic homepage
Bio (CV)
I am a research scientist at Facebook AI Research (FAIR). Previously, I was an instructor at Princeton University and the Institute for Advanced Study. In 2017, I completed a PhD in computer science at Stanford, where I worked on machine learning and natural language processing, co-advised by Chris Manning and Percy Liang. Before that, I received a BASc at the University of Toronto and worked on capsules with Geoffrey Hinton.
The best way to contact me is by email: sidawxyz [at] gmail.com
Selected work
Code LLMs: SWE-RL, eval-arena, LEVER, Coder-reviewer, InCoder, MBR-exec.
Benchmarks: SWE-bench M, Spider 2.0, LiveCodeBench, SAFIM, CRUXEval, DS-1000.
These works also reflect my interests and style:
- Accessing higher dimensions: does unsupervised translation with < 100MB of data + a 90s method
- Naturalizing a programming language via interactive learning
- Data Noising as Smoothing in Neural Network Language Models
- Learning language games through interaction
- Estimating mixture models via mixtures of polynomials
- Fast dropout training and dropout as adaptive regularization
- Baselines and bigrams (NBSVM): showed the power of bag-of-bigrams features, later adopted by fastText
- Capsules / transforming autoencoders
Publications
Services
- Area chair for NeurIPS, ICLR, ICML
- Reviewer for TACL, ARR, NeurIPS, ICLR, EMNLP, JMLR, ICML
- Spring 2018: COS495 Natural language processing
- Fall 2015: Head TA for CS224 Natural language processing
- Winter 2013: TA for CS229T Statistical machine learning