Converting each word of a sentence into a vector is called word embedding.
For example, to embed the words of the sentence "Today is sunny", each word is first assigned an ID and then expressed as a one-hot vector.
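A minimal sketch of this encoding step, assuming the hypothetical ID assignment Today=0, is=1, sunny=2:

```python
import torch
import torch.nn.functional as F

# Hypothetical word-to-ID mapping for "Today is sunny"
word_to_id = {"Today": 0, "is": 1, "sunny": 2}
ids = torch.tensor([word_to_id[w] for w in ["Today", "is", "sunny"]])

# Each ID becomes a one-hot vector of length vocab_size (3 here)
one_hot = F.one_hot(ids, num_classes=3)
print(one_hot)
# tensor([[1, 0, 0],
#         [0, 1, 0],
#         [0, 0, 1]])
```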
The embedding layer then acts as a perceptron operation (a fully connected layer without bias). For example, for the word "sunny", a 1 is fed into the third node of the input layer, and the perceptron operation produces the output vector.
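To see this equivalence concretely, the following sketch (with assumed sizes vocab_size=3, embed_dim=4) checks that multiplying a one-hot vector by the embedding weight matrix returns the same vector as the embedding lookup:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
vocab_size, embed_dim = 3, 4
embed = nn.Embedding(vocab_size, embed_dim)

# One-hot vector for "sunny" (ID 2): a 1 at the third input node
one_hot = torch.zeros(vocab_size)
one_hot[2] = 1.0

# The matrix product selects row 2 of the weight matrix,
# which is exactly what the embedding lookup returns
via_matmul = one_hot @ embed.weight
via_lookup = embed(torch.tensor(2))
print(torch.allclose(via_matmul, via_lookup))  # True
```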
■Concrete example of embedding function
The perceptron parameters are initialized to random values and are updated during training.
import torch
import torch.nn as nn

vocab_size = 3  # number of words in the vocabulary
embed_dim = 4   # dimension of each embedding vector

embed = nn.Embedding(vocab_size, embed_dim)
emb = embed(torch.tensor([2]))  # valid input IDs range from 0 to vocab_size - 1 (0 to 2)
print(emb)
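The claim that the parameters are updated by learning can be sketched with a toy training step; the target vector and loss below are illustrative assumptions, not part of the original example:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
embed = nn.Embedding(3, 4)
optimizer = torch.optim.SGD(embed.parameters(), lr=0.1)

before = embed.weight.detach().clone()

# Toy objective: push the embedding of word ID 2 toward an assumed target vector
target = torch.ones(4)
loss = ((embed(torch.tensor([2])) - target) ** 2).mean()
loss.backward()
optimizer.step()

# Only the row for ID 2 receives a gradient, so only that row changes
print(torch.equal(before[2], embed.weight.detach()[2]))  # False
print(torch.equal(before[0], embed.weight.detach()[0]))  # True
```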