Notes from the 4th Machine Translation Marathon of the Americas

History never repeats itself, but the kaleidoscopic combinations of the pictured present often seem to be constructed out of the broken fragments of antique legends.

Mark Twain

Warren Weaver gave the first introduction to machine translation, claiming that our goal was “erecting a new Tower of Anti-Babel.” He talked about a “mythical” time when people could communicate simply with one another.


Graham Neubig is incredibly stylish. Or certainly modish.

Lane Schwartz presented DLVM (a play on LLVM), a compiler infrastructure for deep learning computation graphs.
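I don't know DLVM's actual API (it's an LLVM-style infrastructure, not Python), but the core idea of treating a computation graph as a compiler IR that you can run optimization passes over can be sketched in a few lines. Everything below (`Node`, `const_fold`) is a made-up toy, not DLVM code:

```python
# Toy illustration of "compiling" a computation graph: represent ops as an
# IR and run an optimization pass (constant folding) over it. This is NOT
# DLVM's API; all names here are hypothetical.
from dataclasses import dataclass

@dataclass
class Node:
    op: str            # "const", "var", "add", or "mul"
    args: tuple = ()   # child nodes
    value: float = 0.0 # payload for "const" nodes

def const_fold(n: Node) -> Node:
    """Recursively fold subtrees whose inputs are all constants."""
    if not n.args:  # leaves (constants, variables) are already folded
        return n
    args = tuple(const_fold(a) for a in n.args)
    if all(a.op == "const" for a in args):
        fn = {"add": lambda x, y: x + y, "mul": lambda x, y: x * y}[n.op]
        return Node("const", value=fn(args[0].value, args[1].value))
    return Node(n.op, args)

# (2 * 3) + x folds to (6 + x) before any data flows through the graph.
x = Node("var")
graph = Node("add", (Node("mul", (Node("const", value=2.0),
                                  Node("const", value=3.0))), x))
print(const_fold(graph))  # the mul subtree collapses to a const node (6.0)
```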


There seems to be a consensus that OpenNMT-py underperforms: the same models with the same hyperparameters rather consistently get lower BLEU scores than their counterparts in other toolkits.
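For what it's worth, comparisons like that are only meaningful if BLEU is computed identically for every toolkit. A minimal sketch with sacreBLEU, which fixes the tokenization for you (the file names are placeholders, not anything from the marathon):

```python
# Sketch: score two toolkits' outputs against the same detokenized reference
# with sacreBLEU, so tokenization differences can't skew the comparison.
# File names are placeholders.
import sacrebleu

with open("ref.detok.txt", encoding="utf-8") as f:
    refs = [line.strip() for line in f]

for name in ("opennmt_py.out", "other_toolkit.out"):
    with open(name, encoding="utf-8") as f:
        hyps = [line.strip() for line in f]
    bleu = sacrebleu.corpus_bleu(hyps, [refs])  # expects detokenized text
    print(f"{name}: BLEU = {bleu.score:.2f}")
```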


People seem to enjoy toying with things here, but not necessarily going start-to-finish with their projects. I wonder how many hackathon projects have lain fallow.


Visualizing and understanding how weights change in a neural network is tricky.
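One cheap proxy I've seen is tracking how far each layer's weights drift from their initialization as training proceeds. A hedged sketch in PyTorch; the tiny model and dummy objective are stand-ins for a real NMT system:

```python
# Sketch: record per-layer weight drift ||W_t - W_0|| over training and plot it.
# The model, objective, and step count are placeholders for illustration.
import torch
import torch.nn as nn
import matplotlib.pyplot as plt

model = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 4))
opt = torch.optim.SGD(model.parameters(), lr=0.1)

# Snapshot initial weights so drift is measured against step 0.
init = {n: p.detach().clone() for n, p in model.named_parameters()}
history = {n: [] for n in init}

for step in range(200):
    x = torch.randn(8, 16)
    loss = model(x).pow(2).mean()  # dummy objective
    opt.zero_grad()
    loss.backward()
    opt.step()
    for n, p in model.named_parameters():
        history[n].append((p.detach() - init[n]).norm().item())

for n, drift in history.items():
    plt.plot(drift, label=n)
plt.xlabel("step")
plt.ylabel("||W_t - W_0||")
plt.legend()
plt.show()
```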


Having a young, engaged, present advisor seems to make a huge difference in people's productivity. People like Graham or Ben van Durme run huge groups whose members benefit from one another.


I had a really good idea for NMT, so I asked Graham about it. It turns out he had already done it for an EMNLP submission. I can't say what it is because of the anonymity period, but it's comforting to know that I'm asking the right questions.


The Cathedral of Learning is huge and wonderful. Such a nice view.

Written on May 25, 2018