Tokenization of Financial Assets: Context
Tokenization is an industry buzzword, widely used without a shared definition or a deep understanding. At its core, tokenization refers to the process of creating a digital representation of an asset on a blockchain or distributed ledger, where each token represents a claim, share, or fraction of the underlying asset and can be securely stored, transferred, and traded. Tokenization enables capabilities such as near-instant settlement and extended market access, but the definition alone understates the scale of change required to support it in public market infrastructure.
This article examines tokenization of financial assets through the lens of liquid public market infrastructure and addresses the five W’s: who, what, where, when, and why. The focus is on U.S. markets, where regulatory protections, settlement frameworks, and market structure set a high bar for change. Global markets will be subject to additional regulatory efforts and political considerations, but the same underlying benefits of tokenization are likely to be realized as public markets continue to digitize.
Historically, when large public markets modernize their infrastructure, others tend to follow. Many types of digital token technologies exist, including blockchain-based networks and other distributed ledger models (Ether, Smart Contracts, TapestryX). The specific technology selected is separate from the process of tokenization itself. This article focuses on the process and operating model implications, while agreement on industry-wide technology standards represents a separate challenge that must be addressed to fully realize the benefits of tokenized markets. Industry organizations such as ISITC and the World Economic Forum, among others, have begun to address these concerns.
What is Asset Tokenization?
Asset tokenization is the process of representing a real or financial asset as a digital token on a distributed ledger, where each token reflects a claim, share, or interest in the underlying asset. This digital representation enables assets to be securely recorded, transferred, and settled within a shared, controlled network.
Tokens are programmatic representations issued on distributed ledger technology (DLT) platforms. While public networks such as Bitcoin and Ethereum are the most well-known examples of DLT in practice, they primarily demonstrate the underlying technology rather than the model that public financial markets are expected to adopt. Institutional asset tokenization is more likely to rely on permissioned or private networks that support regulatory controls, governance, and investor protections.
Because tokenized assets are programmable, features such as income distribution, corporate actions processing, and compliance rules can be embedded directly into the asset record. To support timely settlement, tokenized assets also require digital forms of cash, such as stablecoins or central bank digital currencies (CBDCs), to enable delivery versus payment within a digital ecosystem.
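To make the delivery-versus-payment idea concrete, the following is a minimal sketch in Python. It is purely illustrative: the `Ledger` class, its fields, and the participant names are hypothetical assumptions for this article, not the API of any real DLT platform. The point it demonstrates is atomicity, meaning the asset leg and the cash leg either both move or neither does.

```python
from dataclasses import dataclass, field

@dataclass
class Ledger:
    """Toy shared ledger tracking token and cash balances per participant.

    Illustrative only -- real DLT platforms expose very different interfaces.
    """
    tokens: dict = field(default_factory=dict)   # participant -> token units
    cash: dict = field(default_factory=dict)     # participant -> digital cash

    def dvp_settle(self, seller, buyer, token_qty, cash_amt):
        """Atomic delivery-versus-payment: both legs settle, or neither does."""
        if self.tokens.get(seller, 0) < token_qty:
            raise ValueError("seller lacks tokens: delivery leg fails")
        if self.cash.get(buyer, 0) < cash_amt:
            raise ValueError("buyer lacks cash: payment leg fails")
        # Both preconditions hold, so the swap completes in a single step.
        self.tokens[seller] -= token_qty
        self.tokens[buyer] = self.tokens.get(buyer, 0) + token_qty
        self.cash[buyer] -= cash_amt
        self.cash[seller] = self.cash.get(seller, 0) + cash_amt

# Hypothetical participants: an asset manager selling tokens to a broker.
ledger = Ledger(tokens={"AssetMgrA": 100}, cash={"BrokerB": 1_000})
ledger.dvp_settle("AssetMgrA", "BrokerB", token_qty=10, cash_amt=500)
```

In a production network this check-and-swap logic would live in a smart contract enforced by the ledger itself, which is why tokenized settlement depends on digital cash existing on the same network as the asset.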
Who will Asset Tokenization affect?
The current participants in public financial markets are most likely to lead the transition from certificated or book-entry securities to digital token representations using modern distributed ledger technology. Major market participants have already experimented with digital asset processing and custody and, in many cases, have acquired or partnered with firms operating in this space. The regulatory environment and safeguards embedded in today’s market structure can be extended into a tokenized framework.
Participants across the investment ecosystem, including issuers, asset managers, custodians, clearing agencies, and broker-dealers, stand to benefit from a tokenized asset base. Potential benefits include reduced operational costs through streamlined processes, the ability to settle transactions in near real time, increased liquidity through the elimination of extended settlement cycles, support for on-demand trading across extended hours, and improved facilitation of cross-border activity. Collateral management and securities lending must be carefully incorporated into any rollout, as they were in prior settlement cycle reductions, to ensure alignment with the underlying settlement processes.
Where will Tokenization of Financial Assets occur?
For security, regulatory, and control reasons, tokenized assets are most likely to reside on private or permissioned networks that provide access only to authorized participants, rather than in the public domain. While public networks may offer greater flexibility and self-custody options, they provide fewer investor protections both legally and operationally.
In a public domain model, security responsibilities are more dispersed, introducing new risks, including the potential loss of assets without a clear recovery path. The regulatory protections embedded in existing market infrastructure would be difficult to replicate in such an environment. As a result, regulators are unlikely to allow many of these protections to lapse. Tokenized transactions will therefore require trusted networks, with closed or permissioned systems most likely to meet regulatory expectations and provide the necessary safeguards.
When will Asset Tokenization be in Daily Production?
Pilot tokenized asset systems are already in operation today. Cash and money market instruments have tokenized representations, and U.S. Treasuries represent the largest asset base currently tokenized, primarily for collateral purposes. Other markets, such as Hong Kong, are exploring native tokenized bond issuance.
In the United States, DTCC received an SEC no-action letter on December 11, 2025, to tokenize the Russell 1000, certain exchange-traded funds, and U.S. Treasuries, with services expected to launch in the second half of 2026. More widespread tokenization of publicly traded assets is likely to unfold over the next five years, though coordination across the market will be critical to maximizing its value.
The migration to a tokenized infrastructure does not need to occur all at once, nor does it imply that all settlements will immediately move to near real-time. Previous settlement cycle reductions, from T+5 to T+3, T+2, and T+1, demonstrated that industry-wide change is gradual and requires multiple components of the ecosystem to adapt. A key open question is whether regulatory action will be needed to drive coordinated adoption, or whether market participants will align organically. Uncoordinated initiatives may result in fragmented standards and increased reliance on interoperability solutions.
Why should I care about Asset Tokenization?
The securities settlement infrastructure in the United States is approximately 50 years old. While regulation has historically limited competition and preserved existing structures, technological advances and evolving business models continue to challenge established market frameworks. Tokenized infrastructure offers potential improvements in security, efficiency, and the delivery of related services such as collateral management.
Near real-time settlement differs fundamentally from prior settlement cycle reductions. Earlier changes optimized existing frameworks, while asset tokenization requires rethinking the framework itself. As public market infrastructure evolves, the question is no longer whether tokenization will influence future market design, but how prepared market participants will be to operate within it.
Conclusion
Tokenization of financial assets is often described as a technology shift, but in practice, it represents a broader operating model transformation. Asset tokenization changes how assets are recorded, settled, and governed, amplifying both strengths and weaknesses across data management, controls, and operational processes. In a near real-time settlement environment, fragmented data, manual reconciliation, and unclear ownership structures become more visible rather than less.
Tokenization does not simplify operations on its own; it raises the bar for operational maturity. Understanding asset tokenization as an infrastructure redesign, rather than a standalone technology implementation, provides clearer insight into how tokenized markets may evolve while maintaining resiliency, control, and regulatory confidence.
Gregg Weintraub
info@meradia.com
