An early neural network model (introduced by Mikolov et al. in 2013) that learned word embeddings by training on a simple task: predicting the words that surround a given word in a sentence.
"Word2Vec captured analogies like 'king' minus 'man' plus 'woman' is close to 'queen', and in doing so it also revealed many biases in our language and training data."
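The analogy above is plain vector arithmetic over the learned embeddings. A minimal sketch of the idea, using tiny hand-picked 2-dimensional vectors (hypothetical values chosen so the analogy works; real Word2Vec embeddings have hundreds of dimensions learned from large corpora):

```python
import numpy as np

# Toy "embeddings": hypothetical values, not learned from data.
# Dimension 0 loosely encodes "royalty", dimension 1 "gender".
vectors = {
    "king":  np.array([1.0,  1.0]),
    "queen": np.array([1.0, -1.0]),
    "man":   np.array([0.0,  1.0]),
    "woman": np.array([0.0, -1.0]),
}

def analogy(a, b, c):
    """Return the word whose vector is closest (by cosine similarity)
    to vec(a) - vec(b) + vec(c), excluding the query words themselves."""
    target = vectors[a] - vectors[b] + vectors[c]
    def cos(u, v):
        return np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    candidates = [w for w in vectors if w not in (a, b, c)]
    return max(candidates, key=lambda w: cos(vectors[w], target))

print(analogy("king", "man", "woman"))  # queen
```

Libraries such as gensim expose the same operation over real pretrained vectors; the excluded-query-words rule mirrors the standard evaluation convention, since the nearest neighbor of the target vector is often one of the input words themselves.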