Kaomojis
⛓️👀∥
#self-attention
#parallel processing
#sequence processing
#transformer architecture
#multi-head attention
#positional encoding