21–24 Feb 2018
Bonn
Europe/Zurich timezone

Deep Sentence Embedding Using Long Short-Term Memory Networks

Not scheduled
15m
50 (Bonn)

Speaker

Mr Hamid Palangi

Description

This paper develops a model that addresses sentence embedding, an active topic in current natural language processing research, using recurrent neural networks (RNNs) with Long Short-Term Memory (LSTM) cells. The proposed LSTM-RNN model sequentially takes each word in a sentence, extracts its information, and embeds it into a semantic vector. Owing to its ability to capture long-term dependencies, the LSTM-RNN accumulates increasingly richer information as it processes the sentence; when it reaches the last word, the hidden layer of the network provides a semantic representation of the whole sentence.
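The core mechanism described above — feed words one at a time through an LSTM cell and take the final hidden state as the sentence embedding — can be illustrated with a minimal sketch. This is not the authors' implementation; it is a toy LSTM with randomly initialized weights (`W`, `U`, `b` and the dimensions are illustrative assumptions), intended only to show the sequential accumulation into a fixed-size semantic vector:

```python
import numpy as np

def lstm_step(x, h, c, W, U, b):
    """One LSTM step: update hidden state h and cell state c from word vector x."""
    z = W @ x + U @ h + b                # stacked pre-activations for the four gates
    d = h.size
    i = 1 / (1 + np.exp(-z[:d]))         # input gate
    f = 1 / (1 + np.exp(-z[d:2 * d]))    # forget gate
    o = 1 / (1 + np.exp(-z[2 * d:3 * d]))  # output gate
    g = np.tanh(z[3 * d:])               # candidate cell update
    c = f * c + i * g                    # long-term memory accumulates here
    h = o * np.tanh(c)
    return h, c

def embed_sentence(word_vectors, W, U, b, hidden_dim):
    """Process the sentence word by word; the hidden state after the
    last word serves as the embedding of the whole sentence."""
    h = np.zeros(hidden_dim)
    c = np.zeros(hidden_dim)
    for x in word_vectors:
        h, c = lstm_step(x, h, c, W, U, b)
    return h

# Toy setup: 8-dim word vectors, 4-dim sentence embedding (assumed sizes).
rng = np.random.default_rng(0)
d_in, d_hid = 8, 4
W = rng.normal(scale=0.1, size=(4 * d_hid, d_in))
U = rng.normal(scale=0.1, size=(4 * d_hid, d_hid))
b = np.zeros(4 * d_hid)
sentence = [rng.normal(size=d_in) for _ in range(5)]  # 5 random "words"
emb = embed_sentence(sentence, W, U, b, d_hid)        # fixed-size semantic vector
```

In the paper the weights would be learned (e.g. from click-through data in a retrieval task) so that the final hidden state captures sentence semantics; here they are random, so `emb` only demonstrates the data flow.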

Author

Presentation materials

There are no materials yet.