I am a graduate student in the Paul G. Allen School of Computer Science & Engineering at the University of Washington, where I am very fortunate to be advised by Noah Smith. In addition, I am currently a visiting researcher at Facebook AI Research in Seattle.

In my research I try to better understand the building blocks of neural NLP—the embedding/softmax layers, and the LSTM and self-attention modules—in order to make them faster, smaller, and more accurate.

In the summer of 2019 I interned at Facebook AI Research with Omer Levy.

Previously, I completed my Bachelor’s and Master’s degrees in Computer Science at Tel Aviv University (where I was advised by Lior Wolf and also worked with Jonathan Berant) and briefly worked as a software developer.

My brother Ori Press is a computer vision researcher.

Contact me

ofirp@cs.washington.edu

Papers (Google Scholar)

Shortformer: Better Language Modeling using Shorter Inputs
Ofir Press, Noah A. Smith, Mike Lewis
Preprint
[paper] [code]

Improving Transformer Models by Reordering their Sublayers
Ofir Press, Noah A. Smith, Omer Levy
ACL 2020
[paper] [summary] [code] [bib]
[ACL video (summarizes the important bits, 12 min)] [video (detailed overview, 35 min)]

You May Not Need Attention
Ofir Press, Noah A. Smith
Preprint
[paper] [summary] [code] [bib]

Language Generation with Recurrent Generative Adversarial Networks without Pre-training
Ofir Press*, Amir Bar*, Ben Bogin*, Jonathan Berant, Lior Wolf
1st Workshop on Learning to Generate Natural Language at ICML 2017
[paper] [summary] [code] [bib]

Using the Output Embedding to Improve Language Models
Ofir Press, Lior Wolf
EACL 2017
Introduced the weight tying method, which is now used in BERT and many other state-of-the-art language and translation models (a minimal sketch of the idea follows this entry).
[paper] [summary] [blog post] [code] [bib]
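For readers unfamiliar with weight tying: the output softmax layer shares its weight matrix with the input embedding, so the model learns a single vocab-by-hidden matrix instead of two. The PyTorch snippet below is an illustrative sketch (not the paper's released code); the class name, the LSTM backbone, and the dimensions are assumptions for the example.

```python
import torch
import torch.nn as nn

class TiedLM(nn.Module):
    """Toy language model illustrating weight tying (sketch, not the paper's code)."""
    def __init__(self, vocab_size: int, hidden_dim: int):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden_dim)
        self.lstm = nn.LSTM(hidden_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size, bias=False)
        # Weight tying: the output projection reuses the input embedding matrix,
        # so only one (vocab_size x hidden_dim) embedding matrix is learned.
        self.out.weight = self.embed.weight

    def forward(self, tokens: torch.Tensor) -> torch.Tensor:
        hidden, _ = self.lstm(self.embed(tokens))
        return self.out(hidden)  # logits over the vocabulary
```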

Technical Reports

Partially Shuffling the Training Data to Improve Language Models
Ofir Press
[report] [code] [bib]