
GloVe

Could we learn GloVe word vectors using SVD?

If so, what modifications would we have to make? For example (see the sketch after this list):

  • filling the missing entries with zeros?
  • setting all the weights to 1?
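
To make the variant in the question concrete, here is a minimal sketch (not course-provided code) of that plain-SVD approach: missing co-occurrence entries are treated as zeros and every entry gets the same weight. The toy matrix, the dimension `K`, and the `U * sqrt(S)` convention for extracting word vectors are my own assumptions.

```python
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.linalg import svds

K = 20  # embedding dimension (hypothetical choice)

# toy word-word co-occurrence counts; unobserved pairs stay 0 in the sparse matrix,
# which is exactly the "fill the missing entries with zeros" choice from the question
cooc = csr_matrix(np.array([[10., 2., 0.],
                            [ 2., 5., 1.],
                            [ 0., 1., 3.]]))

# log-transform only the observed (nonzero) counts; zeros are left as zeros
logc = cooc.copy()
logc.data = np.log(logc.data)

# truncated SVD with uniform weights on all entries ("all weights set to 1");
# rows of U * sqrt(S) can then serve as word vectors
U, S, Vt = svds(logc, k=min(K, min(logc.shape) - 1))
word_vectors = U * np.sqrt(S)
```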

When classifying (as in Project 2: Text classification), if the embeddings have the same length, can we assume that the test time (to classify) will be the same regardless of the method used to create the vectors?
In other words, at test time, assuming the same embedding length, will GloVe, bag-of-words and word2vec have similar running times?

In Week 13, part 2, at minute 23:51, Professor Jaggi mentions that we could learn word vectors by filling the missing entries with zeros and then doing SVD. However, these word vectors are not GloVe word vectors (and they don't perform as well).
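
For reference, GloVe is a weighted least-squares factorization of the log co-occurrence counts (notation as in the original GloVe paper by Pennington et al.; the course slides may drop the bias terms):

$$ \min_{w,\tilde{w},b,\tilde{b}} \;\sum_{i,j:\; n_{ij} > 0} f(n_{ij})\left( w_i^\top \tilde{w}_j + b_i + \tilde{b}_j - \log n_{ij} \right)^2, \qquad f(n) = \min\!\left(1,\, (n / n_{\max})^{\alpha}\right). $$

If you replace $f$ by the constant $1$, drop the biases, and sum over all pairs $(i,j)$ with the missing $\log n_{ij}$ entries set to $0$, this becomes an unweighted Frobenius-norm factorization, which truncated SVD solves exactly (Eckart–Young). That is the approach mentioned in the lecture, and it is precisely the weighting that gets lost.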

So to learn GloVe word vectors, we must use the weights together with SGD or ALS?
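
Yes, the weights enter the training loop directly. Below is a minimal sketch (not the course solution) of one SGD pass over the weighted GloVe objective, with bias terms omitted for brevity; the vocabulary size, dimension, learning rate, and the toy `cooc` list of observed (i, j, count) triples are assumptions for illustration.

```python
import numpy as np

V, K = 1000, 20                 # vocabulary size and embedding dimension (assumptions)
alpha, n_max, lr = 0.75, 100.0, 0.05

rng = np.random.default_rng(0)
W  = rng.normal(scale=0.1, size=(V, K))   # target word vectors
Wc = rng.normal(scale=0.1, size=(V, K))   # context word vectors

cooc = [(0, 1, 12.0), (1, 2, 3.0)]        # toy observed co-occurrence counts

for i, j, n in cooc:
    weight = min(1.0, (n / n_max) ** alpha)   # GloVe weighting function f(n)
    err = W[i] @ Wc[j] - np.log(n)            # residual on the log count
    grad = 2.0 * weight * err                 # weight scales the gradient
    # simultaneous update of both factors using the old values
    W[i], Wc[j] = W[i] - lr * grad * Wc[j], Wc[j] - lr * grad * W[i]
```

ALS works on the same objective by fixing one factor (say all `Wc`) and solving the resulting weighted least-squares problem for the other, then alternating.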
