I am a graduate student in the Paul G. Allen School of Computer Science & Engineering at the University of Washington, where I am very fortunate to be advised by Noah Smith.

In my research I try to better understand the building blocks of neural NLP—the embedding/softmax layers, and the LSTM and self-attention modules—in order to make them faster, smaller, and more accurate.

Previously, I completed my Bachelor’s and Master’s degrees in Computer Science at Tel Aviv University and briefly worked as a software developer.

Papers (Google Scholar)

Partially Shuffling the Training Data to Improve Language Models
Ofir Press
Technical Report
[paper] [code] [bib]

You May Not Need Attention
Ofir Press, Noah A. Smith
[paper] [code] [summary] [bib]

Language Generation with Recurrent Generative Adversarial Networks without Pre-training
Ofir Press*, Amir Bar*, Ben Bogin*, Jonathan Berant, Lior Wolf
(* equal contribution)
1st Workshop on Learning to Generate Natural Language at ICML 2017
[paper] [code] [summary] [bib]

Using the Output Embedding to Improve Language Models
Ofir Press, Lior Wolf
EACL 2017
[paper] [code] [blog post] [summary] [bib]
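The paper above proposes weight tying: the input embedding and the output (softmax) projection share a single matrix, shrinking the model and improving perplexity. A minimal PyTorch sketch, where the class name and dimensions are my own illustration:

```python
import torch
import torch.nn as nn

class TiedLM(nn.Module):
    """Toy LSTM language model with tied input/output embeddings."""

    def __init__(self, vocab_size=1000, d_model=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        self.rnn = nn.LSTM(d_model, d_model, batch_first=True)
        self.out = nn.Linear(d_model, vocab_size, bias=False)
        # Tie the output projection to the input embedding:
        # both now refer to one (vocab_size x d_model) parameter.
        self.out.weight = self.embed.weight

    def forward(self, tokens):
        h, _ = self.rnn(self.embed(tokens))
        return self.out(h)  # logits over the vocabulary
```

Note that tying requires the embedding dimension to match the decoder input dimension, and gradients from both the input and output sides flow into the shared matrix.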


As of November 2018, this site has been accessed by more than 52,000 people from 167 countries.

Contact me