
What Is an Apple in 12,288 Dimensions?

30.03.2025

Ask a child what an apple is, and you'll get an answer that's sweet, literal, and probably red. Ask a theologian, and you might get sin. Ask a tech analyst, and you might get Cupertino, quarterly earnings, and silicon.

Now, ask a large language model like GPT-4, DeepSeek, or Grok what an apple is, and you won't get a definition—you’ll get a vector with about 12,288 dimensions*—each one encoding a slice of meaning.

That’s not poetry. That’s architecture.

In the world of LLMs, every word, token, or fragment of language isn’t just stored—it’s located: embedded in a vast multidimensional space that captures not just what a word is but how it behaves, how it changes, how it flexes under pressure. The word “apple” doesn’t mean anything by itself. It means everything in context—and that context is calculated.

Let’s start at the beginning. When you type the word “apple” into an LLM, it’s first broken down into one or more tokens. Each token is then mapped to a unique vector in a roughly 12,288-dimensional space (the exact width varies by model). Think of it as the model’s first impression—a kind of static, …
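As a toy sketch of that lookup step (the vocabulary, the 8-dimensional width, and the random weights here are all made up for illustration; real models learn their embedding tables during training and use thousands of dimensions):

```python
import numpy as np

# Hypothetical miniature vocabulary: token string -> integer id.
vocab = {"apple": 0, "pie": 1, "computer": 2}

# Embedding table: one row of numbers per token. A real model's table
# would be learned and much wider (e.g. ~12,288 columns); we use 8
# random dimensions just to show the shape of the mechanism.
rng = np.random.default_rng(0)
embedding_table = rng.standard_normal((len(vocab), 8))

def embed(token: str) -> np.ndarray:
    """Map a token to its static embedding vector by simple table lookup."""
    return embedding_table[vocab[token]]

vec = embed("apple")
print(vec.shape)  # prints (8,)
```

The key point the sketch makes concrete: before any context is considered, “apple” is nothing more than a fixed row of numbers pulled from a table—the model’s “first impression” that later layers will reshape in context.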

© Psychology Today