Paper Review/Natural Language Processing
[Attention Is All You Need] Transformer (Natural Language Processing)
🧑🏻‍💻 Glossary: Neural Networks, RNN, LSTM, Attention, Transformer, Generator, Discriminator, self-attention, layer normalization, multi-head attention, positional encoding

Attention Is All You Need (https://arxiv.org/abs/1706.03762): "The dominant sequence transduction models are based on complex recurrent or convolutional neural networks in an encoder-decoder configuration. The best performing models also connect…"
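Among the glossary terms above, self-attention is the core mechanism of the Transformer. As a minimal illustrative sketch (not code from the paper or this post), scaled dot-product attention computes softmax(QKᵀ/√d_k)V; the dimensions and example inputs here are assumptions for demonstration:

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of floats.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V.

    Q, K, V are lists of row vectors (lists of floats); K and V must
    have the same number of rows.
    """
    d_k = len(K[0])  # key dimension used for the 1/sqrt(d_k) scaling
    out = []
    for q in Q:
        # Similarity of this query against every key, scaled by sqrt(d_k).
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d_k) for k in K]
        weights = softmax(scores)
        # Output row is the attention-weighted sum of the value vectors.
        out.append([sum(w * v[j] for w, v in zip(weights, V))
                    for j in range(len(V[0]))])
    return out
```

Multi-head attention, also listed above, simply runs several such attention functions in parallel on learned linear projections of Q, K, and V and concatenates the results.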
2023. 5. 21. 20:43