My research interests are mainly in the algorithmic and formal aspects of computational linguistics (esp. parsing and machine translation) and artificial intelligence in general.
The key questions that motivate my research are:
Why are computers so bad at understanding and processing natural language?
Can we teach computers to process natural language the way we humans do,
that is, both quickly and accurately?
Or, can computers process natural language the way they process programming languages in spite of the inherent ambiguity of the former?
So recently I have been focusing on linear-time algorithms for parsing and translation inspired by both human processing (psycholinguistics)
and compiler theory.
On the other hand, I also work on theoretical and practical problems in structured learning with inexact search, which arise from NLP but
also apply to other structured domains such as computational biology.
I have also worked on structural biology (esp. protein folding) using
dynamic programming inspired by computational linguistics (see below).
Tutorial: Advanced Dynamic Programming in Semiring and Hypergraph Frameworks. NAACL 2009 and COLING 2008.
(based on my candidacy exam and Chapter 2 of my thesis)
Binarization of Synchronous Context-Free Grammars (and its connection to the Graham Scan for Convex Hulls).
talks given at CAS/ICT and HKUST (2007).
Fast Decoding with Synchronous Grammars and n-gram Models ("forest rescoring" paper). video and slides from the Microsoft Research talk (Dec 2006).
Latest and Current Work
Kai Zhao and Liang Huang (2015).
Type-Driven Incremental Semantic Parsing with Polymorphism.
To appear in Proceedings of NAACL 2015.
Haitao Mi and Liang Huang (2015).
Shift-Reduce Constituency Parsing with Dynamic Programming and POS Tag Lattice.
To appear in Proceedings of NAACL 2015.
I. Naim, Y. Song, Q. Liu, L. Huang, H. Kautz, J. Luo, and D. Gildea (2015).
Discriminative Unsupervised Alignment of Natural Language Instructions with Corresponding Video Segments.
To appear in Proceedings of NAACL 2015.
Liang Huang, Hao Zhang, Daniel Gildea, and Kevin Knight (2009).
Binarization of Synchronous Context-Free Grammars. Computational Linguistics, 35 (4). Conference version appeared at NAACL 2006.
(The core linear-time synchronous binarization algorithm
was inspired by the Graham Scan for Convex Hull.
It was a rather unexpected connection.)
grammars and dynamic programming for computational biology
Back in China, I also co-authored a popular textbook for
Algorithmic Programming Contests:
The Art of Algorithms and Programming Contests,
with the legendary Rujia Liu (Tsinghua University Press, 2003).
It was a national best-seller in computer science, 2005--2006,
and has been widely adopted as the standard textbook for NOI, IOI, and ACM/ICPC contests.
I love teaching. Currently I teach PhD-level courses
in Computer Science
at the CUNY Graduate Center,
as well as undergraduate and Master's courses at CUNY Queens College.
Current Teaching at CUNY:
Fall 2014: CS 71010, Programming Languages (Functional Programming in Haskell, Operational Semantics, Lambda Calculus, and Type Theory), Graduate Center.
Fall 2014: CS 3813/780, Advanced Programming (training for programming contests and industrial interviews), Queens College.
Spring 2014: CS 3813/780, Advanced Programming (training for programming contests and industrial interviews), Queens College.
Fall 2013: CS 71010, Programming Languages (Functional Programming in Haskell, Operational Semantics, Lambda Calculus, and Type Theory), Graduate Center.
Modeled after: CIS 500, Fall 2003 at Penn (the best class I've ever taken, taught by the best instructor I've ever had).
I'm looking for PhD students, postdocs, and visiting PhD students to join my group. Drop me a note if you're interested. The CUNY PhD application information is available online.
I prefer students with a solid background in algorithms, math, and programming
(e.g., experience with ACM/ICPC or similar contests).
My research group is called
the Algorithms for Computational Linguistics (ACL) Group.
(Note that we are a group, not a lab, although we do have a physical lab space.
Our lab space is unfortunately free of windows, but is also proudly free of Windows.)
Former Ph.D. and M.S. Students at USC
Ashish Vaswani, Ph.D., June 2014 (co-advised by David Chiang). first job: Computer Scientist, ISI.
Theerawat "Dome" Songyot, M.S. student, Fulbright scholar.
We are also part of a larger family of NLP faculty and students at
NLP @ CUNY.
In particular, we have a wonderful NLP seminar series.
You are more than welcome to attend our talks if you happen to be in the city.
I am from Shanghai, China and speak Wu as my native language.
Ph.D., Computer Science, University of Pennsylvania, 2008. (old old homepage)
B.S., Computer Science, Shanghai Jiao Tong University, 2003. summa cum laude. (minor studies in French and English)
Research Scientist (part-time), IBM T. J. Watson Research Center, 2014/6--present. (Bowen Zhou's group)
Doctoral Faculty, The Graduate Center, City University of New York (CUNY), 2012/8--present.
Assistant Professor, Queens College, City University of New York (CUNY), 2012/8--present.
Research Assistant Professor, University of Southern California (USC), 2010/7--2012/8.
Computer Scientist, USC Information Sciences Institute, 2009/7--2012/8.
Research Scientist, Google Research (Mountain View), 2009/1--7. (Fernando Pereira's group)
Visiting Scholar, Hong Kong Univ. of Science and Technology, 2008/10--2008/11.
Visiting Scholar, Institute of Computing Technology, Chinese Academy of Sciences, 2007/10--2008/1.
Summer Intern, USC/ISI, 2005/5--10 and 2006/5--10.
Liang Huang is currently an Assistant Professor at the City University of New York (CUNY) and a part-time Research Scientist at IBM T. J. Watson Research Center. He graduated in 2008 from Penn and has worked as a Research Scientist at Google and a Research Assistant Professor at USC/ISI. Most of his work develops fast algorithms and provable theory to speed up large-scale natural language processing and structured machine learning. He has received a Best Paper Award at ACL 2008, several best paper nominations (ACL 2007, EMNLP 2008, and ACL 2010), two Google Faculty Research Awards (2010 and 2013), and a University Teaching Prize at Penn (2005).
Note: since 2006, almost all my code has been written in Python 2.7, with some Python extension libraries in C.
I also love functional programming and declarative programming in general
(OCaml, Haskell, and Prolog), but hate C++ and Perl which are too ugly.
Compared to Python/Haskell/OCaml, languages like C/C++ and Java are
stone-age artifacts; don't use them unless absolutely necessary (such as for Python extension libraries).
Written in C as a Python extension module based on collections.defaultdict.
Much faster and slimmer (about 1/4 the memory usage) than David Chiang's svector.
Built-in support for averaged parameters in online learning (e.g., perceptron, MIRA).
Note: for decoding (e.g. parsing), defaultdict is fast enough (mine is even faster by doing the dot-product in C, which is also possible via Cython), but for learning (e.g. perceptron), defaultdict
becomes terrible on big data because Python floats/ints are immutable, which causes too many unnecessary hash operations. Using my hvector can make your learner up to 5 times faster.
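To illustrate the bookkeeping that hvector moves into C, here is a plain-Python sketch of sparse dot-products and averaged-perceptron updates over defaultdicts. This is not the hvector API; the function names (`dot`, `perceptron_update`, `averaged`) are hypothetical, and the lazy-averaging trick shown is the standard one that recovers the averaged weights in a single final pass.

```python
from collections import defaultdict

def dot(weights, features):
    # sparse dot-product: iterate over the (much smaller) feature vector;
    # use .get() to avoid inserting missing keys into the defaultdict
    return sum(weights.get(f, 0.0) * v for f, v in features.items())

def perceptron_update(weights, totals, features, direction, step):
    # structured-perceptron update (direction = +1 toward gold, -1 away
    # from the wrong prediction); 'totals' accumulates step-weighted
    # updates so the average can be recovered at the end (lazy averaging).
    # In plain Python every += below costs two hash lookups and allocates
    # a fresh float (floats are immutable) -- exactly the overhead that a
    # C-level vector avoids.
    for f, v in features.items():
        weights[f] += direction * v
        totals[f] += direction * v * step   # step is 0-based

def averaged(weights, totals, num_steps):
    # averaged weights over steps 1..num_steps: w_avg = w - totals/num_steps
    return {f: w - totals[f] / num_steps for f, w in weights.items()}
```

For example, two identical +1 updates on a unit feature at steps 0 and 1 give a final weight of 2.0 but an averaged weight of 1.5, matching the running average (1 + 2) / 2.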
This parser/reranker is described in the following paper:
Liang Huang (2008). Discriminative Parsing with Non-Local Features.
Proceedings of ACL 2008. (Best Paper Award)
Erratum: following Charniak, the dev set was section 24, not section 22.
This software has three components:
The forest-dumping version of Charniak parser.
The forest reranker.
The perceptron trainer.
Currently part 1 is downloadable as a standalone package.
Parts 2 and 3 are being packaged for release.
Important: if you're using 64-bit Ubuntu, it is recommended that you install Python from source (see Python.org).
The default Python 2.7 in those Ubuntu releases (at least 12.04) has an obscure floating-point problem
that gives inconsistent results.
We gratefully acknowledge the support from funding agencies.
PI, NSF EAGER, $135k for one year, 2014--2015.
PI, Google Faculty Research Award, unrestricted gift, $88k for one year, 2013--2014.
PI, PSC-CUNY Enhanced Research Award, $12k for one year, 2013--2014.
co-PI, DARPA DEFT Program, $2M for 4.5 years, 2012--2016. PI: Andrew Rosenberg.
PI, Google Faculty Research Award, unrestricted gift, $75k for one year, 2010--2011.
Computer Science Department, CUNY/QC
Science Building A-202
65-30 Kissena Blvd., Queens, NY 11367.
huang at cs dot qc dot cuny dot edu.
I am also at
Computer Science Department, CUNY/GC
365 Fifth Avenue, New York, NY 10016.
I don't check voice messages.
Disclaimer: I am known to be highly opinionated, and some points below might sound offensive to some readers.
I am a big fan of Classical Music.
The composers I admire most are
Johann Sebastian Bach (whose music is so mathematical),
Peter Ilyich Tchaikovsky (whose melodic talent almost rivals that of Mozart),
and Antonin Dvorak (whose music blends Bohemia with America).
I also love, among others, (in chronological order)
Wolfgang Amadeus Mozart,
Ludwig van Beethoven,
Felix Mendelssohn, and Sergei Rachmaninoff.
Yes, I do have a preference for Baroque, Slavic, and melodic beauty.
On the other hand, I don't have a taste for, or much respect for, Richard Wagner
(whom I find distasteful),
nor do I like Franz Liszt.
Compared to Frederic Chopin or Niccolo Paganini,
Liszt has almost nothing original of his own
(it is like comparing Clementi to Mozart).
A Personal History of Languages
I grew up speaking Wu,
but in a multilingual environment.
Back in the old days, Shanghai was just as multicultural as New York City today
with speakers and communities of all kinds of languages.
When I grew up, my parents spoke Shanghainese,
and my grandparents Ningbonese,
which I understood perfectly but could not speak well;
the majority of our neighbors, however,
spoke another distinctive language called
Lower Yangtze Mandarin,
which is an "interpolation" of Wu and Northern Mandarin,
and because of that I am still fluent in it today.
I started learning Standard Mandarin rigorously, as a de facto first foreign language,
in elementary school,
but ended up speaking it with a heavy Wu accent.
During college I took up French seriously
but forgot all of it after moving to the US.
On the other hand, living in the US helped me
get rid of my heavy Wu accent in Mandarin,
since finally the "training data" around me
had more native samples than non-native ones.
The US also exposed me to other Chinese languages and dialects
that I had never heard back in Shanghai,
such as Upper Yangtze Mandarin (aka "Sichuanese") and Cantonese,
but most importantly, various English dialects and Spanish.
I still enjoy learning new languages and dialects today.