How To Build Attention Mechanism From Scratch


The attention mechanism was introduced to improve the performance of the encoder-decoder model for machine translation. The idea was to let the decoder draw on the most relevant parts of the input sequence in a flexible manner, by taking a weighted combination of all of the encoded input vectors, with the most relevant vectors receiving the highest weights.
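The weighted combination can be sketched in a few lines of NumPy. The encoder states and decoder state below are toy values chosen for illustration; a real model would produce them from the input sequence:

```python
import numpy as np

# Toy example: 4 encoder hidden states, each a 3-dimensional vector.
# (Values are arbitrary; a trained encoder would produce these.)
encoder_states = np.array([
    [1.0, 0.0, 0.5],
    [0.2, 1.0, 0.1],
    [0.4, 0.3, 0.9],
    [0.8, 0.7, 0.2],
])

# A hypothetical decoder hidden state used to score each encoder state.
decoder_state = np.array([0.5, 0.4, 0.1])

# Alignment scores via dot product, turned into attention weights with a
# softmax so they are positive and sum to 1.
scores = encoder_states @ decoder_state
weights = np.exp(scores) / np.sum(np.exp(scores))

# The context vector is the weighted sum of all encoder states, with the
# most relevant states (highest weights) contributing the most.
context = weights @ encoder_states

print(weights.sum())   # 1.0 (up to floating-point error)
print(context.shape)   # (3,)
```

Because the weights form a probability distribution over the input positions, the decoder can attend strongly to one position or spread its attention across several.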

In this tutorial, you will discover the attention mechanism and its implementation. 

After completing this tutorial, you will know:

(1) How the attention mechanism uses a weighted sum of all of the encoder hidden states to flexibly focus the decoder's attention on the most relevant parts of the input sequence.

(2) How the attention mechanism can be generalized for tasks where the information may not necessarily be related in a sequential fashion.

(3) How to implement the general attention mechanism in Python with NumPy and SciPy. 


https://machinelearningmastery.com/the-attention-mechanism-from-scratch/