Kaomojis

⛓️👀∥

#self-attention #parallel-processing #sequence-processing #transformer-architecture #multi-head-attention #positional-encoding