ai-learn
· 2 min read
- https://github.com/search?q=word2id+batch_size+seq_len+zip_longest&ref=opensearch&type=code
- https://community.openai.com/t/foundational-must-read-gpt-llm-papers/197003
Reference sites
- https://github.com/mikaizhu/Notes/blob/f04d80456dde566abbd7e8cd1e3528cef1bc6f74/Deep_learning/%E6%B7%B1%E5%BA%A6%E5%AD%A6%E4%B9%A0%E5%9F%BA%E7%A1%80%E7%9F%A5%E8%AF%86/TencentAds.md
- https://www.sharetechnote.com/html/NN/Handbook_MachineLearning_Index.html
- https://pytorch.org/tutorials/
- https://pytorch.org/tutorials/beginner/basics/intro.html
- https://pytorch.org/tutorials/beginner/nlp/word_embeddings_tutorial.html#sphx-glr-beginner-nlp-word-embeddings-tutorial-py
- AI for Beginners: https://github.com/microsoft/AI-For-Beginners
- https://www.baeldung.com/cs/category/ai
To remember
- one-hot encoding
- torch.nn.Linear
- torch.nn.Embedding
```python
import torch

torch.tensor([[1, 2, 4, 5]])      # a 1x4 tensor of token ids
weights = torch.rand(10, 3)       # e.g. a lookup table: 10 words, embedding dim 3
net = torch.nn.Linear(10, 2)      # fixed typo: nn.Linear, not nn.liner
print(net, net.weight)            # Linear exposes .weight and .bias (no .h attribute)
torch.mm                          # matrix multiplication of two 2-D tensors
```
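These notes connect: multiplying a one-hot vector by a weight matrix with `torch.mm` is the same as an embedding row lookup. A minimal sketch (variable names are my own):

```python
import torch
import torch.nn.functional as F

vocab_size, embed_dim = 10, 3
weights = torch.rand(vocab_size, embed_dim)   # embedding table

idx = torch.tensor([4])                       # a single word id
one_hot = F.one_hot(idx, num_classes=vocab_size).float()  # shape (1, 10)

via_matmul = torch.mm(one_hot, weights)       # one-hot @ table -> shape (1, 3)
via_lookup = weights[idx]                     # same row, no matmul needed

print(torch.allclose(via_matmul, via_lookup))  # True
```

This is why one-hot encoding rarely appears explicitly in PyTorch code: the lookup form is cheaper and is exactly what `nn.Embedding` does internally.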
AI > machine learning > deep learning
torch.nn.Embedding
- https://www.jianshu.com/p/63e7acc5e890
- https://pytorch.org/tutorials/beginner/nlp/word_embeddings_tutorial.html#sphx-glr-beginner-nlp-word-embeddings-tutorial-py
- https://discuss.pytorch.org/t/how-does-nn-embedding-work/88518/3
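A minimal check of how `nn.Embedding` behaves, based on the linked tutorials (the shapes and ids here are just an illustration):

```python
import torch
import torch.nn as nn

embedding = nn.Embedding(num_embeddings=10, embedding_dim=3)  # 10 words, dim 3

ids = torch.tensor([[1, 2, 4, 5]])   # batch of 1 sequence with 4 token ids
out = embedding(ids)                 # shape (1, 4, 3)
print(out.shape)

# each output vector is simply a row of embedding.weight
print(torch.equal(out[0, 0], embedding.weight[1]))  # True
```

The layer is just a trainable lookup table: indexing picks rows of `embedding.weight`, and gradients flow only into the rows that were looked up.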