I am a PhD student in the Department of Computer Science at Bar-Ilan
University, under the supervision of Prof. Ido Dagan and Prof. Jacob Goldberger.
I develop computational models that capture the semantic properties of
words and phrases and identify semantic inference relations between them.
In particular, my research focuses on how different concrete contexts
affect semantic meanings and inferences (e.g., "Joe _runs_ a
business" versus "Joe _runs_ marathons").
I also enjoy collaborating with
other researchers to develop educational NLP applications that utilize
components of my semantic models.
Prior to starting my PhD in March 2012, I worked for several years in the
hi-tech industry as a software engineer and later as a system architect.
During this period I also earned an M.Sc. (Magna Cum Laude) in Computer
Science from Bar-Ilan University, Israel.
Before that, I received a B.Sc. in Mathematics and Physics from the Hebrew
University of Jerusalem.
Publications

- PhD Dissertation: Improving Lexical Inference using
Context-sensitive Distributional Models with Rich Context Representations. BIU.
- Oren Melamud, Jacob Goldberger, Ido Dagan. context2vec: Learning Generic
Context Embedding with Bidirectional LSTM. CoNLL, 2016.
- Michael Wojatzki, Oren Melamud, Torsten Zesch. Bundled Gap Filling: A New
Paradigm for Unambiguous Cloze Exercises. Workshop on Innovative Use of
NLP for Building Educational Applications (BEA), 2016.
- Oren Melamud, David McClosky, Siddharth Patwardhan, Mohit Bansal. The Role of
Context Types and Dimensionality in Learning Word Embeddings. NAACL, 2016.
- Vasily Konovalov, Ron Artstein,
Oren Melamud, Ido Dagan. The Negochat Corpus
of Human-agent Negotiation Dialogues. LREC, 2016.
- Oren Melamud, Omer Levy, Ido Dagan. A Simple Word Embedding Model for Lexical
Substitution. Workshop on Vector Space Modeling for NLP (VSM), 2015.
- Oren Melamud, Ido Dagan, Jacob Goldberger. Modeling Word Meaning in Context
with Substitute Vectors. NAACL, 2015.
- Oren Melamud, Ido Dagan, Jacob Goldberger, Idan Szpektor and Deniz Yuret.
Probabilistic Modeling of Joint-context in Distributional Similarity. CoNLL, 2014. Best paper runner-up.
- Torsten Zesch and Oren Melamud. Automatic Generation of Challenging Distractors
Using Context-Sensitive Inference Rules. Workshop on Innovative Use of
NLP for Building Educational Applications (BEA), 2014.
- Oren Melamud, Jonathan Berant, Ido Dagan, Jacob Goldberger and Idan Szpektor.
A Two Level Model for Context Sensitive Inference Rules. ACL, 2013. Best
- Oren Melamud, Ido Dagan, Jacob Goldberger and Idan Szpektor. Using Lexical
Expansion to Learn Inference Rules from Sparse Data. ACL, short paper, 2013.
- Yonatan Aumann, Moshe Lewenstein, Oren Melamud, Ron
Pinter, Zohar Yakhini. Dotted Interval Graphs.
ACM Transactions on Algorithms (TALG), 8(2):9, 2012.
Talks

- context2vec: Learning Generic
Context Embedding with Bidirectional LSTM. CoNLL. August 2016.
- The Role of Context Types and Dimensionality in Learning Word Embeddings. NAACL. June 2016.
- Probabilistic Modeling of Joint-context in Distributional Similarity. CoNLL. June 2014.
- A Two
Level Model for Context Sensitive Inference Rules. ACL. August 2013.
Software and Datasets