[McKinsey Explainers] What is tokenization?

Tokenization is the process of creating a digital representation of a real thing. It can be used to protect sensitive data or to process large amounts of data efficiently.

Events of the past few years have made it clear: we’re hurtling toward the next era of the internet with ever-increasing speed. Several new developments are leading the charge. Generative AI (gen AI) is one; barely a week goes by without an important new breakthrough. Web3 is said to offer the potential of a new, decentralized internet, controlled by participants via blockchains rather than a handful of corporations. How we pay for things is also experiencing disruption: one in two consumers in 2021 used a fintech product, primarily peer-to-peer payment platforms and nonbank money transfers.

What do gen AI, Web3, and fintech all have in common? They all rely on a process called tokenization. But each case uses tokenization in a very different way.

In payments, tokenization is used for cybersecurity: it masks the sensitive details of a payment, essentially to prevent fraud. In Web3, by contrast, tokenization is a digitization process that makes assets more accessible. And in AI, it's something else entirely: tokenization is used to break data down into smaller units for easier pattern detection.
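To make the contrast concrete, here is a minimal Python sketch of the first and third senses. The vault dictionary, function names, and whitespace splitting are illustrative assumptions, not how any particular payment processor or AI model actually implements tokenization; real systems use hardened token vaults and subword tokenizers.

```python
import secrets

# Payment-style tokenization (illustrative sketch): replace a sensitive card
# number with a random surrogate token and keep the real value in a vault
# that only the payment processor can query.
vault = {}

def tokenize_card(card_number: str) -> str:
    token = "tok_" + secrets.token_hex(8)  # random surrogate, unrelated to the card number
    vault[token] = card_number             # only the vault maps the token back to the original
    return token

# AI-style tokenization (illustrative sketch): break raw text into smaller
# units ("tokens") that a model can process. Production models use subword
# tokenizers; simple whitespace splitting is enough to show the idea.
def tokenize_text(text: str) -> list[str]:
    return text.lower().split()

print(tokenize_card("4111 1111 1111 1111"))   # e.g. 'tok_3f9a1c...'
print(tokenize_text("Tokenization breaks data into smaller pieces"))
```

Note the difference in intent: the payment token is deliberately meaningless so that intercepting it reveals nothing, while the text tokens preserve meaning so that a model can detect patterns across them.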

Later in this Explainer, we’ll explore specific examples of how tokenization works differently in each context. First, let’s get the basics down.