[NLP] Word Embedding - GloVe [practice]


์•„๋ž˜์™€ ๊ฐ™์ด, GloVe์— ๋Œ€ํ•œ ์ฝ”๋“œ๋ฅผ ์‚ดํŽด๋ด…๋‹ˆ๋‹ค.

 

GloVe

 

์›Œ๋“œ ์ž„๋ฒ ๋”ฉ์€ ํ•˜๋‚˜์˜ ์›-ํ•ซ ์ธ์ฝ”๋”ฉ ๋ฒกํ„ฐ(ํ•œ ์š”์†Œ๋งŒ 1์ด๊ณ  ๋‚˜๋จธ์ง€๋Š” 0์ธ ๋ฒกํ„ฐ)๋ฅผ ํ›จ์”ฌ ์ž‘์€ ์‹ค์ˆ˜ ๊ฐ’์˜ ๋ฒกํ„ฐ๋กœ ๋ณ€ํ™˜ํ•ฉ๋‹ˆ๋‹ค. ์›-ํ•ซ ์ธ์ฝ”๋”ฉ ๋ฒกํ„ฐ๋Š” ํฌ์†Œ ๋ฒกํ„ฐ์ด๋ฉฐ, ์‹ค์ˆ˜ ๊ฐ’ ๋ฒกํ„ฐ๋Š” ๋ฐ€์ง‘ ๋ฒกํ„ฐ์ž…๋‹ˆ๋‹ค.

 

The most important concept in word embeddings is that words appearing in similar contexts should be located close together in the vector space. Here, context means the surrounding words. For example, in the two sentences "I purchased some items at the shop" and "I purchased some items at the store", the words 'shop' and 'store' appear in the same context, so they should be close to each other in the vector space.

 

์—ฌ๊ธฐ์„œ ์ด๋ฏธ GloVe ๋ฒกํ„ฐ๋กœ ๋ฏธ๋ฆฌ ํ•™์Šต๋œ ๋ฒกํ„ฐ๋ฅผ ์‚ฌ์šฉํ•  ๊ฒƒ์ž…๋‹ˆ๋‹ค. GloVe๋Š” word2vec๊ณผ ์œ ์‚ฌํ•˜์ง€๋งŒ ๋‹ค๋ฅธ ์•Œ๊ณ ๋ฆฌ์ฆ˜์ž…๋‹ˆ๋‹ค. ์ด๋Ÿฌํ•œ ๋ฏธ๋ฆฌ ํ•™์Šต๋œ ์ž„๋ฒ ๋”ฉ์€ ๊ฑฐ๋Œ€ํ•œ ๋ง๋ญ‰์น˜์—์„œ ํ•™์Šต๋˜์—ˆ์œผ๋ฉฐ, ๋ชจ๋ธ ๋‚ด์—์„œ ์ด๋ฅผ ์‚ฌ์šฉํ•˜์—ฌ ๋‹จ์–ด์˜ ๋ฌธ๋งฅ์„ ์ด๋ฏธ ํ•™์Šตํ•œ ์ƒํƒœ์—์„œ ์‹œ์ž‘์ ์œผ๋กœ ์‚ฌ์šฉํ•  ์ˆ˜ ์žˆ์Šต๋‹ˆ๋‹ค. ์ด๋Š” ์ผ๋ฐ˜์ ์œผ๋กœ ๋” ๋น ๋ฅธ ํ•™์Šต ์‹œ๊ฐ„๊ณผ/๋˜๋Š” ํ–ฅ์ƒ๋œ ์ •ํ™•๋„๋ฅผ ์ œ๊ณตํ•ฉ๋‹ˆ๋‹ค.

 

ํŒŒ์ดํ† ์น˜์—์„œ๋Š” ๋‹จ์–ด ๋ฒกํ„ฐ๋ฅผ nn.Embedding ๋ ˆ์ด์–ด๋ฅผ ์‚ฌ์šฉํ•˜์—ฌ ์‚ฌ์šฉํ•ฉ๋‹ˆ๋‹ค. ์ด ๋ ˆ์ด์–ด๋Š” [๋ฌธ์žฅ ๊ธธ์ด, ๋ฐฐ์น˜ ํฌ๊ธฐ] ํ…์„œ๋ฅผ ๊ฐ€์ ธ์™€ [๋ฌธ์žฅ ๊ธธ์ด, ๋ฐฐ์น˜ ํฌ๊ธฐ, ์ž„๋ฒ ๋”ฉ ์ฐจ์›] ํ…์„œ๋กœ ๋ณ€ํ™˜ํ•ฉ๋‹ˆ๋‹ค. nn.Embedding ๋ ˆ์ด์–ด๋Š” ์ฒ˜์Œ๋ถ€ํ„ฐ ํ›ˆ๋ จํ•  ์ˆ˜๋„ ์žˆ๊ณ , ๋ฏธ๋ฆฌ ํ•™์Šต๋œ ์ž„๋ฒ ๋”ฉ ๋ฐ์ดํ„ฐ๋กœ ์ดˆ๊ธฐํ™”ํ•˜๊ณ  (์„ ํƒ์ ์œผ๋กœ ๊ณ ์ •์‹œํ‚ฌ ์ˆ˜๋„ ์žˆ์Œ) ์‚ฌ์šฉํ•  ์ˆ˜๋„ ์žˆ์Šต๋‹ˆ๋‹ค. nn.Embedding์˜ ํ•ต์‹ฌ์€ ๋ช…์‹œ์ ์œผ๋กœ ์›-ํ•ซ ๋ฒกํ„ฐ ํ‘œํ˜„์„ ์‚ฌ์šฉํ•˜์ง€ ์•Š์•„๋„ ๋œ๋‹ค๋Š” ๊ฒƒ์ž…๋‹ˆ๋‹ค. ๋‹จ์ˆœํžˆ ์ธ๋ฑ์Šค๋ฅผ ๋ฒกํ„ฐ์— ๋งคํ•‘ํ•˜๋Š” ๊ฒƒ์ž…๋‹ˆ๋‹ค. ์ด๊ฒƒ์€ ๊ณ„์‚ฐ์ ์ธ ์ธก๋ฉด์—์„œ ๋งค์šฐ ์ค‘์š”ํ•ฉ๋‹ˆ๋‹ค.

 

๊ตฌ์ฒด์ ์œผ๋กœ๋Š”, nn.Embedding์€ ์›-ํ•ซ sparse-๋ฒกํ„ฐ๋ฅผ ๋‚˜ํƒ€๋‚ด๋Š” ์ธ๋ฑ์Šค์— ํ•ด๋‹นํ•˜๋Š” ๊ฐ€์ค‘์น˜ ํ–‰๋ ฌ์˜ ์—ด์„ ์„ ํƒํ•˜์—ฌ ๋‚ฎ์€ ์ฐจ์› (dense) ์ถœ๋ ฅ์„ ์ƒ์„ฑํ•˜๋Š” ์„ ํ˜• ๋งต์ž…๋‹ˆ๋‹ค. ์ด๋ฒˆ ํŒŒํŠธ์—์„œ๋Š” ๋ชจ๋ธ์„ ํ›ˆ๋ จํ•˜์ง€ ์•Š๊ณ , ๋‹จ์–ด ์ž„๋ฒ ๋”ฉ์„ ์‚ดํŽด๋ณด๊ณ , ๊ทธ๊ฒƒ๋“ค๋กœ ํ•  ์ˆ˜ ์žˆ๋Š” ๋ช‡ ๊ฐ€์ง€ ํฅ๋ฏธ๋กœ์šด ๊ฒƒ๋“ค์„ ์กฐ์‚ฌํ•ด ๋ณผ ๊ฒƒ์ž…๋‹ˆ๋‹ค.

 

First, we load the pretrained GloVe vectors. The name field specifies what the vectors were trained on; here 6B refers to a corpus of 6 billion tokens. The dim argument specifies the dimensionality of the word vectors. GloVe vectors are available in 50, 100, 200, and 300 dimensions. There are also 42B and 840B GloVe vectors, but those are only available in 300 dimensions. The first time you run this, it takes a while to download the vectors.

 

 

import torchtext.vocab

# Load the already-trained vectors (downloaded on first run); they live in torchtext's vocab module.
glove = torchtext.vocab.GloVe(name='6B', dim=100)
# dim is essentially the hidden size we choose: V -> hidden (100 / 300 / 512 / 256).

print(f'There are {len(glove.itos)} words in the vocabulary')
# itos tells us the size of the dictionary;
# as shown below, 400,000 words are available to us.

 

.vector_cache/glove.6B.zip: 862MB [02:39, 5.40MB/s]                           
100%|█████████▉| 399999/400000 [00:19<00:00, 20515.34it/s]
There are 400000 words in the vocabulary

 

That is, we now have a vocabulary containing 400,000 words.

 

 

glove.vectors.shape

 

This outputs

torch.Size([400000, 100])

i.e., we get a 400,000 × 100 matrix.

 

 

๊ฐ ํ–‰์ด ์–ด๋–ค ๋‹จ์–ด์™€ ๊ด€๋ จ์ด ์žˆ๋Š”์ง€๋Š” itos(int to string) ๋ฆฌ์ŠคํŠธ๋ฅผ ํ™•์ธํ•˜์—ฌ ์•Œ ์ˆ˜ ์žˆ์Šต๋‹ˆ๋‹ค.

 

์•„๋ž˜์˜ ์˜ˆ์‹œ๋Š” 0๋ฒˆ ํ–‰์ด 'the'์™€ ๊ด€๋ จ๋œ ๋ฒกํ„ฐ, 1๋ฒˆ ํ–‰์ด ','(์‰ผํ‘œ)์™€ ๊ด€๋ จ๋œ ๋ฒกํ„ฐ, 2๋ฒˆ ํ–‰์ด '.'(๋งˆ์นจํ‘œ)์™€ ๊ด€๋ จ๋œ ๋ฒกํ„ฐ ๋“ฑ์œผ๋กœ ์ดํ•ดํ•  ์ˆ˜ ์žˆ์Šต๋‹ˆ๋‹ค.

 

 

glove.itos[:100]

 

์ด๊ฒƒ์„ ์ฐ์œผ๋ฉด ์•„๋ž˜์™€ ๊ฐ™์ด ๊ฒฐ๊ณผ๊ฐ€ ๋‚˜์˜ต๋‹ˆ๋‹ค.

 

['the',
 ',',
 '.',
 'of',
 'to',
 'and',
 'in',
 'a',
 '"',
 "'s",
 'for',
 '-',
 'that',
 'on',
 'is',
 'was',
 'said',
 'with',
 'he',
 ...]

(Some of the output is omitted.)

 

์œ„์™€ ๊ฐ™์ด ๋‚˜์˜จ ๊ฒƒ์„ ๋ณด์•„์„œ,

glove๋Š” ์‚ฌ์ „๊ณผ๋„ ๊ฐ™๋‹ค๋Š” ๊ฒƒ์„ ์•Œ ์ˆ˜ ์žˆ์—ˆ์Šต๋‹ˆ๋‹ค. ์–ด๋–ค ๋‹จ์–ด์— ๋Œ€ํ•œ embedding ๊ฐ’์„ ์ญ‰ ๊ฐ€์ง€๊ณ  ์žˆ๋Š” ๊ฒƒ์ž…๋‹ˆ๋‹ค.

 

 

glove.stoi['the']  # stoi (string to int): the position at which the word appears.
                   # stoi is a plain dict, so a word missing from the vocabulary raises a KeyError.
# The most frequently occurring words are placed at the front of the vocabulary.

 

0

 

So we can also find out where a word is located.

 

print(glove.stoi['the'])
glove.vectors[glove.stoi['the']]  # the 100-dimensional vector for 'the'

 

์šฐ๋ฆฌ์˜ ๋ชฉ์ ์€, ๋‹จ์–ด์˜ ์ธํ…์Šค์—์„œ ๋ฒกํ„ฐ๋ฅผ ๊ฐ€์ ธ์˜ค๋Š” ๊ฒƒ์ž…๋‹ˆ๋‹ค.

 

The matrix is 400,000 × 100 dimensional; to fetch the word 'the', we use the index 0 that was assigned to it.

 

A computer does not understand words themselves, so we represent them as numbers. That is what makes operations like man - woman possible.
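For example, a quick sketch of such an operation (the result is just another 100-dimensional tensor; no claim is made about its values):

man = glove.vectors[glove.stoi['man']]
woman = glove.vectors[glove.stoi['woman']]

diff = man - woman  # element-wise subtraction of two 100-dim vectors
print(diff.shape)   # torch.Size([100])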

 

 

 

์•„๋ž˜ ๊ตฌํ˜„๋œ ํ•จ์ˆ˜๋ฅผ ์‚ดํŽด๋ณด์ž.

def get_vector(embeddings, word):
    # Look up the word's index with stoi, then return that row of the vector matrix.
    return embeddings.vectors[embeddings.stoi[word]]

 

  • Glove๋ผ๋Š” embedding์— vectors๊ฐ’์„ ๋„ฃ์Œ.
  • ํŠน์ • embedding์— ๋Œ€ํ•œ ๋‹จ์–ด๋ฅผ ๋ฑ‰์–ด๋ผ.

 

import torch

def closest_words(embeddings, vector, n=10):
    # Measure the distance from `vector` to every word vector in the vocabulary,
    # then return the n (word, distance) pairs with the smallest distances.
    distances = [(w, torch.dist(vector, get_vector(embeddings, w)).item()) for w in embeddings.itos]
    return sorted(distances, key=lambda w: w[1])[:n]

 

  • Compute the distance between the given vector and every word in the vocabulary, and return the n words with the smallest distances.
  • That is, it finds the closest words.
  • Given an embedding object and a vector, it performs the search.
  • The distances are Euclidean distances, computed with torch.dist (a vectorized sketch follows below).

 

closest_words(glove, get_vector(glove, 'korea'))

 

  • The 10 words in glove closest to the word 'korea' are as follows.
[('korea', 0.0),
 ('pyongyang', 3.9039547443389893),
 ('korean', 4.068886756896973),
 ('dprk', 4.2631049156188965),
 ('seoul', 4.340494632720947),
 ('japan', 4.551243305206299),
 ('koreans', 4.615607738494873),
 ('south', 4.65822696685791),
 ('china', 4.8395185470581055),
 ('north', 4.986356735229492)]

 

 

def print_tuples(tuples):
    # Print each (word, distance) pair as "(distance) word".
    for w, d in tuples:
        print(f'({d:02.04f}) {w}')

 

์œ„ tuple์„ ์ด์˜๊ฒŒ ์ถœ๋ ฅํ•˜๋Š” ๊ฒƒ.

 

print_tuples(closest_words(glove, get_vector(glove, 'ai')))

 

We are after the word 'ai', but as shown below, quite different words can come out.

 

glove๋ผ๋Š” embedding์„ ํ•™์Šตํ•œ ๋ฐ์ดํ„ฐ๊ฐ€ generalํ•˜๊ณ , ๋Œ€ํ™” ํ˜•์‹์˜ ๋ฐ์ดํ„ฐ๋ฅผ ํ•™์Šตํ–ˆ๊ธฐ ๋•Œ๋ฌธ์— ์ด๋ ‡๊ฒŒ ๋‚˜์˜ด.

 

(0.0000) ai
(4.5332) hey
(4.5842) ok
(4.6785) fukuhara
(4.8145) fortunately
(4.8299) cause
(4.8935) yeah
(4.9061) hi
(4.9083) luckily
(4.9333) …

 

 

 

def analogy(embeddings, word1, word2, word3, n=5):
    # word1 : word2 = word3 : ?  ->  search near the vector (word2 - word1 + word3).
    candidate_words = closest_words(embeddings, get_vector(embeddings, word2) - get_vector(embeddings, word1) + get_vector(embeddings, word3), n+3)

    # Exclude the three input words themselves from the candidates.
    candidate_words = [x for x in candidate_words if x[0] not in [word1, word2, word3]][:n]

    print(f'{word1} is to {word2} as {word3} is to...')

    return candidate_words

 

 

  • Since the words are represented as vectors, shouldn't operations like addition and subtraction work?
    • king + woman - man =====> shouldn't queen come out?
    • The operation yields some embedding value, and the chance that it exactly matches one of the 400,000 vectors is essentially zero.
    • So among the 400,000 words, we pick the ones whose distance to the resulting vector is smallest.

 

print_tuples(analogy(glove, 'man', 'king', 'woman', n = 10))
print_tuples(analogy(glove, 'seoul', 'korea', 'india', n = 10))

 

 

It does reasonably well, but there are limits to getting exactly what we want.

 

man is to king as woman is to...
(4.0811) queen
(4.6429) monarch
(4.9055) throne
(4.9216) elizabeth
(4.9811) prince
(4.9857) daughter
(5.0641) mother
(5.0775) cousin
(5.0787) princess
(5.1283) widow
seoul is to korea as india is to...
(5.8343) pakistan
(6.2924) lanka
(6.5571) australia
(6.5643) bangladesh
(6.5883) africa
(6.6894) sri
(6.7463) indonesia
(6.7763) indian
(6.9396) japan
(6.9865) zealand

 

These embeddings are not trained in isolation; in practice they are plugged into an LSTM or another model and trained as part of it.
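As a minimal sketch of that idea (a hypothetical classifier; the class and parameter names here are illustrative, not from this post):

import torch.nn as nn

class Classifier(nn.Module):
    def __init__(self, pretrained_vectors, hidden_dim=256, num_classes=2):
        super().__init__()
        # Start from the pretrained GloVe weights and fine-tune them (freeze=False).
        self.embedding = nn.Embedding.from_pretrained(pretrained_vectors, freeze=False)
        self.lstm = nn.LSTM(pretrained_vectors.shape[1], hidden_dim, batch_first=True)
        self.fc = nn.Linear(hidden_dim, num_classes)

    def forward(self, token_ids):              # token_ids: [batch, seq_len]
        embedded = self.embedding(token_ids)   # [batch, seq_len, emb_dim]
        _, (hidden, _) = self.lstm(embedded)   # hidden: [1, batch, hidden_dim]
        return self.fc(hidden.squeeze(0))      # [batch, num_classes]

model = Classifier(glove.vectors)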