How To Build Word Embeddings


Word embeddings are a type of word representation that allows words with similar meanings to have similar representations.
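To make "similar meanings, similar representations" concrete, here is a minimal sketch using hand-picked toy vectors (not learned ones, and the words and values are invented for illustration): related words point in nearby directions, so their cosine similarity is high.

```python
import math

# Toy 3-dimensional vectors, hand-picked for illustration (not learned):
# words with related meanings are given nearby directions.
vectors = {
    "king":  [0.9, 0.8, 0.1],
    "queen": [0.8, 0.9, 0.1],
    "apple": [0.1, 0.1, 0.9],
}

def cosine(a, b):
    """Cosine similarity: 1.0 means same direction, 0.0 means unrelated."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

print(cosine(vectors["king"], vectors["queen"]))  # high: similar meanings
print(cosine(vectors["king"], vectors["apple"]))  # low: unrelated meanings
```

A real embedding algorithm learns such vectors from text instead of having them written by hand.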

They are a distributed representation for text that is perhaps one of the key breakthroughs for the impressive performance of deep learning methods on challenging natural language processing problems.

In this post, you will discover the word embedding approach for representing text data.

After completing this post, you will know:

(1) What the word embedding approach for representing text is and how it differs from other feature extraction methods.

(2) That there are 3 main algorithms for learning a word embedding from text data.

(3) That you can either train a new embedding or use a pre-trained embedding on your natural language processing task.
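As a rough sketch of how an embedding can be learned from raw text, the snippet below builds a simple count-based representation: each word's vector is its co-occurrence counts with other words within a small context window. The corpus and window size are invented for illustration; the algorithms the post covers learn dense vectors by prediction rather than counting, but the underlying distributional idea (a word is characterized by its neighbors) is the same.

```python
from collections import Counter, defaultdict

# A tiny invented corpus, purely for illustration.
corpus = [
    "the cat sat on the mat",
    "the dog sat on the rug",
    "the cat chased the dog",
]

window = 2  # count neighbors within +/- 2 positions
cooc = defaultdict(Counter)
for sentence in corpus:
    tokens = sentence.split()
    for i, word in enumerate(tokens):
        for j in range(max(0, i - window), min(len(tokens), i + window + 1)):
            if j != i:
                cooc[word][tokens[j]] += 1

vocab = sorted(cooc)
# Each word's "embedding" is its row of co-occurrence counts over the vocabulary.
embeddings = {w: [cooc[w][v] for v in vocab] for w in vocab}
print(embeddings["cat"])
```

Words that appear in similar contexts ("cat" and "dog" here) end up with overlapping count vectors, which is the property the learned, dense embeddings preserve in far fewer dimensions.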


https://machinelearningmastery.com/what-are-word-embeddings/


How to Use Word Embedding Layers for Deep Learning with Keras

https://machinelearningmastery.com/use-word-embedding-layers-deep-learning-keras/
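Conceptually, an embedding layer like the one Keras provides is a trainable lookup table: a matrix with one row per vocabulary id, indexed by integer-encoded tokens. The sketch below imitates that lookup with plain NumPy (the sizes and token ids are invented for illustration; in Keras the table's rows would be updated during training).

```python
import numpy as np

vocab_size, dim = 10, 4
rng = np.random.default_rng(0)
# The "embedding layer" is just a matrix: one dim-sized row per token id.
table = rng.normal(size=(vocab_size, dim))

token_ids = np.array([2, 5, 2])      # an integer-encoded sentence
sentence_vectors = table[token_ids]  # shape (3, 4): one vector per token
print(sentence_vectors.shape)
```

Note that the repeated token id (2) retrieves the identical row both times, which is exactly how an embedding layer shares one representation per word across a whole corpus.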