4 min read

Transformers Decoded

A look at how a recent development in AI is a game changer for language learners.

How Transformers Drive inContext’s Definitions

inContext provides tailored definitions by mimicking how humans use context to understand words. This post explores how inContext leverages recent advances in AI, particularly an architecture called the transformer found in large language models (LLMs), to provide definitions and contextual usage of words. The key insight is that transformers use context to understand words, weighting each surrounding word by how much it helps pin down the target word's meaning.

Benefits for Language Learners

For language learners, this technology means that when you use inContext to look up a word, you get the exact definition, not a list of possible definitions given by a traditional dictionary. In addition, there is an explanation tailored to how the word is used in that particular sentence or context. This deeper understanding can dramatically improve both vocabulary and comprehension. For instance, a regular dictionary might struggle with the word “chicken” in the phrase “Don’t be chicken.” Ordinarily, it would define “chicken” as a type of bird, since that is the most common usage of the word. But because transformers understand the context around the word, they will recognize that in this case, “chicken” actually means “cowardly” and not the animal.

Transformers

Transformers are at the core of the latest generation of AI technology. The transformer was first proposed in a landmark 2017 paper called "Attention Is All You Need" that has already accumulated over 100,000 citations. Essentially, the transformer analyzes text through millions of small mathematical computations between all of the different words in a document. Unlike earlier technologies that focused on counting words or reading them one at a time, the transformer keeps track of how each word relates to every other word. This broader view helps the AI determine which surrounding words are relevant to understanding a target word, which is crucial for accurate interpretation and translation.

The Magic of Embeddings and Self-Attention

The secret sauce of transformers is something called embeddings combined with a process known as self-attention.

When you look up a word in inContext, the AI doesn’t just see the word as a standalone unit. Instead, it sees an “embedding,” a kind of digital fingerprint of the word, influenced by all the other words around it. Each word is given an embedding that captures unique characteristics about the word in a series of numbers. These could be simple things like whether the word is a noun or verb, but they are usually much more abstract concepts like the “animal-ness” of the word (how much it relates to animals, e.g., a cat vs. an animal cracker vs. a cracker would all be placed on different parts of the animal-ness spectrum). These characteristics are qualities the model has determined are very important after analyzing trillions of words of training data.
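To make the idea of embeddings concrete, here is a toy sketch in Python. The vectors and the dimension meanings are entirely made up for illustration (real models learn thousands of abstract dimensions from training data), but the core idea holds: words with similar meanings end up with similar lists of numbers, which we can check with a standard similarity measure.

```python
import math

# Toy 4-dimensional embeddings (made-up numbers for illustration).
# Imagine the dimensions loosely encode traits like "animal-ness" or "food-ness".
embeddings = {
    "cat":     [0.9, 0.1, 0.2, 0.7],
    "dog":     [0.8, 0.2, 0.1, 0.6],
    "cracker": [0.1, 0.9, 0.8, 0.2],
}

def cosine_similarity(a, b):
    """Measure how closely two embeddings point in the same direction (1.0 = identical)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# "cat" sits much closer to "dog" than to "cracker" on the animal-ness spectrum.
print(cosine_similarity(embeddings["cat"], embeddings["dog"]))
print(cosine_similarity(embeddings["cat"], embeddings["cracker"]))
```

In a real transformer these fingerprints are then adjusted by the surrounding words, which is where self-attention comes in.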

Imagine you’re at a party, and you overhear someone talking about a “bat.” How do you know if they’re discussing an animal or a baseball tool? You listen to the other words in the conversation. AI does something similar with embeddings; it “listens” to the words surrounding the target word and uses this information to form a weighted combination that defines that word’s meaning in that specific instance.

Self-attention is the mechanism that transformers use to determine which words in the sentence are most important to understanding the meaning of the word you’re interested in. This process helps the AI focus on relevant words and ignore the rest, much like how you might focus on key words in a conversation to grasp the topic. In our example above, the word “a” may not be helpful in determining which “bat” we are talking about, but the words “cave,” “baseball,” or “pitcher” would probably give it away. Self-attention weighs these special words much more heavily to understand the word “bat.”
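The weighting step above can be sketched in a few lines of Python. This is a simplified stand-in for real self-attention (which uses learned query, key, and value matrices); the word vectors are invented for the "bat" example, but the mechanics are the same: score each word against the target, turn the scores into weights with a softmax, and blend the words accordingly.

```python
import math

def softmax(scores):
    """Turn raw scores into positive weights that sum to 1."""
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

# Toy 2-dimensional vectors for "the pitcher swung the bat" (made-up numbers).
words = ["the", "pitcher", "swung", "the", "bat"]
vectors = {
    "the":     [0.1, 0.1],  # a filler word, not very informative
    "pitcher": [0.9, 0.2],  # strongly baseball-flavored
    "swung":   [0.7, 0.3],
    "bat":     [0.5, 0.5],  # ambiguous on its own
}

# Score each word by how aligned it is with "bat" (a dot product,
# standing in for the query-key comparison in real self-attention).
target = vectors["bat"]
scores = [sum(t * v for t, v in zip(target, vectors[w])) for w in words]
weights = softmax(scores)

# Telling words like "pitcher" end up weighted more heavily than "the".
for word, weight in zip(words, weights):
    print(f"{word:8s} {weight:.2f}")

# The contextual meaning of "bat" is then a weighted blend of all the words.
contextual_bat = [
    sum(w * vectors[word][dim] for w, word in zip(weights, words))
    for dim in range(2)
]
```

Because "pitcher" and "swung" pull the blend toward the baseball sense, the resulting vector for "bat" looks more like sports equipment than a flying animal, which is exactly the disambiguation described above.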

Conclusion

The inContext browser extension is more than just a dictionary. It’s a sophisticated AI tool that brings context-sensitive understanding of language to your fingertips. By leveraging the power of AI, specifically transformers, inContext helps language learners grasp not just the meaning of words but their usage in real-world situations, enhancing both learning efficiency and engagement.

Whether you’re studying a new language or perfecting your fluency in one you already know, inContext offers a smart, intuitive way to deepen your understanding of language in context.
