Tokens and Tokenisation

This page explores the concept of tokens and tokenisation and gives some examples of how they are implemented.

Tokens

Tokens are digital representations of data or information. They enable their owners to access, participate in, or contribute to the development of a digital data ecosystem. Tokens can be owned and transacted; some hold monetary value, while others are associated with access and processes. There are different types of digital tokens, including:

  • Authorisation tokens – Used to grant or deny users and systems access to resources once their identity has been verified. An example is an OAuth access token.

  • Identity tokens – Used to represent a user's identity and may contain personal information such as name, email address, and postal address. An example is an OpenID Connect ID token (see the sketch after this list).

  • Utility tokens – Allow owners access (via a cryptographic key) to a particular network or blockchain. Owners can perform actions on the network, including assisting in its development, and can benefit from the network’s output.

  • Security tokens – Represent ownership of an asset and are fungible.

  • Governance tokens – Provide the owner with the rights to govern a decentralised organisation.

  • Value tokens – Hold value in the form of a digital object, for example art or music. They take the form of non-fungible tokens (NFTs).
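
The identity token entry above mentions OpenID Connect. As a minimal sketch, the Python snippet below shows how the claims carried inside a JWT-formatted ID token can be inspected. The token, issuer, and claim values are fabricated for illustration; a real implementation would also verify the signature, issuer, audience, and expiry before trusting any claim.

```python
# A fabricated OpenID Connect ID token, decoded with only the standard library.
# An ID token is a JSON Web Token (JWT): three base64url-encoded parts joined
# by dots (header.payload.signature).
import base64
import json


def b64url(obj: dict) -> str:
    """Base64url-encode a JSON object without padding, as JWTs do."""
    raw = json.dumps(obj, separators=(",", ":")).encode()
    return base64.urlsafe_b64encode(raw).rstrip(b"=").decode()


def decode_claims(id_token: str) -> dict:
    """Return the (unverified) payload claims of a JWT-formatted ID token."""
    payload = id_token.split(".")[1]
    padded = payload + "=" * (-len(payload) % 4)  # restore base64 padding
    return json.loads(base64.urlsafe_b64decode(padded))


# Build a fabricated token so the example is self-contained and runnable.
header = {"alg": "RS256", "typ": "JWT"}
claims = {
    "iss": "https://issuer.example",  # hypothetical identity provider
    "sub": "user-123",
    "name": "Ada Lovelace",
    "email": "ada@example.org",
}
sample_id_token = ".".join([b64url(header), b64url(claims), "fake-signature"])

print(decode_claims(sample_id_token))
# {'iss': 'https://issuer.example', 'sub': 'user-123', 'name': 'Ada Lovelace', 'email': 'ada@example.org'}
```

OAuth access tokens that are issued as JWTs can be inspected in the same way.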

Tokenisation

Tokenisation is the process of converting (sensitive) data into tokens. Tokenising data allows users and organisations to protect sensitive data while preserving its business utility. Data tokenisation provides a higher level of data security than encryption alone. When data is tokenised, all original sensitive personal, payment, or identifiable data is removed; the sensitive data is kept in secure storage and shared only when strictly necessary. This method of data security is founded on “Zero Trust” principles, whereby no user or device is trusted to access the stored data until its identity and authorisation have been verified.

“Tokenization simultaneously preserves the utility of sensitive data while allowing it to remain secure and compliant with most regulations. Tokenization allows sensitive data to remain secure in transit, in use, and at rest, enabling a flexible range of data use cases that would otherwise be unsafe or inadvisable.” Burchfiel, A. (2022, March 31). What is data utility and how can tokenization preserve it? TokenEx.
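
As a minimal sketch of this idea, the Python example below keeps sensitive values in a simple in-memory vault, hands out opaque random tokens in their place, and releases the original value only after an authorisation check. The TokenVault class, the is_authorised policy, and the service name are assumptions made for illustration; a production system would use a hardened vault and a real identity provider.

```python
# Minimal in-memory tokenisation vault (illustrative only). Sensitive values
# never leave the vault; callers store and exchange opaque random tokens, and
# detokenisation is gated by an explicit authorisation check in the spirit of
# "Zero Trust". TokenVault and is_authorised are assumed names, not a product API.
import secrets


def is_authorised(requester: str) -> bool:
    # Placeholder policy; a real system would consult an identity provider
    # and an access-control policy before releasing sensitive data.
    return requester == "payments-service"


class TokenVault:
    def __init__(self):
        self._store = {}  # token -> original sensitive value

    def tokenise(self, sensitive_value: str) -> str:
        """Replace a sensitive value with an opaque, randomly generated token."""
        token = secrets.token_urlsafe(16)
        self._store[token] = sensitive_value
        return token

    def detokenise(self, token: str, requester: str) -> str:
        """Return the original value only after the requester is verified."""
        if not is_authorised(requester):
            raise PermissionError(f"{requester} may not detokenise data")
        return self._store[token]


vault = TokenVault()
token = vault.tokenise("4111 1111 1111 1111")       # card number stays in the vault
print(token)                                        # opaque token, safe to pass around
print(vault.detokenise(token, "payments-service"))  # released only to an authorised caller
```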

Implementations

The benefits of tokenising data are becoming more widely known, in large part due to new regulations and standards. For example, the Health Insurance Portability and Accountability Act (HIPAA) in the United States and the General Data Protection Regulation (GDPR) in Europe require the special handling, anonymisation, and secure storage of personally identifiable information (PII). Because of these data governance requirements, companies across a variety of disciplines are tokenising business, employee, and customer data as a means to store and exchange PII. Common implementations of data tokenisation include:

  • Payment services (see the sketch after this list)

  • User authentication

  • Asset management

  • Exercising user rights in an ecosystem or platform
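
As an illustration of the payment-services case, the sketch below (with an assumed token format and helper name) replaces a card's primary account number with a token that keeps only the last four digits, so stored order records remain useful for support and reconciliation without holding the full card number.

```python
# Illustrative payment-card tokenisation that preserves some business utility:
# the token keeps the last four digits so support staff can still identify the
# card, while the full number is never stored with the order. The token format
# and the tokenise_pan helper are assumptions made for this sketch.
import secrets


def tokenise_pan(pan: str) -> str:
    """Replace a primary account number with a token retaining the last 4 digits."""
    digits = pan.replace(" ", "")
    return f"tok_{secrets.token_hex(8)}_{digits[-4:]}"


order_record = {
    "order_id": "A-1001",
    "card_token": tokenise_pan("4111 1111 1111 1111"),  # e.g. "tok_3fa9c1d27b6e84aa_1111"
}
print(order_record)
```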
