Technology
The inventors of the Internet set out to build a decentralized network for exchanging data that would continue to function even after a nuclear attack. Since then, the focus of security has been primarily on securing the channel between parties, not the data itself. The challenge of verifiable data has therefore gone unaddressed – within organizations we rely on process and trusted insiders to ensure the correctness of data, and we have few mechanisms to verify data as it crosses organizational boundaries.
This is illustrated in the figure above for satellite communication. A satellite generates an image, which is then passed across ERP systems and organizational boundaries until it is acted upon by an end user. This can be thought of as an information supply chain: data is created and then goes through many steps of verification, annotation and modification until it is eventually delivered to an actor who wishes to take action based on that data. Today we do not have the tools to make those supply chains cryptographically verifiable, such that anyone in the supply chain can verify the provenance of the data without relying on trusted parties.
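To make the idea concrete, here is a minimal Python sketch of an information supply chain as a hash-linked record of steps, where any downstream party can recompute the chain and detect a tampered or missing step. The names (append_step, verify_chain) are illustrative, not any real product API, and a real deployment would also anchor or sign each record rather than rely on the linkage alone.

```python
# A minimal sketch of a hash-linked information supply chain.
# Each step (create, verify, annotate, modify) appends a record that commits
# to the data's current hash and the hash of the previous record.
import hashlib
import json
import time


def sha256_hex(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()


def append_step(chain: list, actor: str, action: str, data: bytes) -> list:
    prev_hash = chain[-1]["record_hash"] if chain else "0" * 64
    record = {
        "actor": actor,                  # who touched the data
        "action": action,                # what they did
        "data_hash": sha256_hex(data),   # state of the data after this step
        "prev_hash": prev_hash,          # link to the previous record
        "time": time.time(),
    }
    record["record_hash"] = sha256_hex(json.dumps(record, sort_keys=True).encode())
    return chain + [record]


def verify_chain(chain: list) -> bool:
    """Anyone holding the chain can check its internal consistency."""
    prev_hash = "0" * 64
    for record in chain:
        body = {k: v for k, v in record.items() if k != "record_hash"}
        if record["prev_hash"] != prev_hash:
            return False
        if record["record_hash"] != sha256_hex(json.dumps(body, sort_keys=True).encode()):
            return False
        prev_hash = record["record_hash"]
    return True


image = b"raw satellite image bytes"
chain = append_step([], "satellite", "created", image)
chain = append_step(chain, "ground-station", "annotated", image + b" +geotag")
assert verify_chain(chain)
```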
The typical answer for the last 40 years has been PKI (Public Key Infrastructure). PKI works well for its original use case – sharing a secret across an insecure channel – but the complexity of key management makes it very challenging to apply to verifying data at rest, and all data in motion starts and finishes as data at rest. It requires an established trust infrastructure between parties, one that depends on the security of keys held by every participant in the information supply chain.
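To see where the difficulty lies, consider a minimal sketch using the widely used Python `cryptography` package (this is a generic illustration, not code from any product). The signature itself is easy to verify; the hard part is everything the verification silently assumes about key management.

```python
# Signing and verifying data at rest with Ed25519.
# The verification proves only that *some holder of this key* signed these
# bytes. Whether the key is genuine, current, and was never compromised is a
# key management problem that grows with every party in the supply chain.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric import ed25519

private_key = ed25519.Ed25519PrivateKey.generate()
public_key = private_key.public_key()

record = b"a record in a government database"
signature = private_key.sign(record)

try:
    public_key.verify(signature, record)
    print("signature valid")
except InvalidSignature:
    print("signature invalid")
```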
Blockchain and interoperability
The early iterations of blockchain involved architectures such as the one on the left below. Every party in the supply chain would run a consensus node, and participants would establish governance committees to control access to what is effectively a shared database.
This approach is failing universally – the idea that a hospital, an insurance company, a financial institution, or a logistics company needs to know anything about blockchain, let alone run consensus infrastructure, is fundamentally flawed.
Our technology takes a different approach. A public blockchain (alphabill) is a critical component in all our enterprise solutions, but only as the plumbing: infrastructure, deep underground, for generating and transferring tokens. There are no governance committees or shared databases – only software that solves a business problem.
The tokenization of data
Our view is that a blockchain is a censorship-resistant machine for the creation of tokens. Those tokens can represent digital currency, real-world assets or simply data itself; that is, for any type of data it is possible to create a token that provides cryptographic proof of its properties (time, integrity, provenance, identity, uniqueness, …) without reliance on a centralized trusted authority.
As such, blockchain is not a replacement for PKI but a way to make existing implementations practical for cross-boundary, multi-party processes. The blockchain creates tokens for data, which can then be verified and acted upon off-chain in the real world.
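As an illustration, a token for data might carry fields like these. The DataToken class and its field names are hypothetical, chosen to mirror the properties listed above (time, integrity, provenance, identity) – they are not the alphabill token format.

```python
# A minimal sketch of what a "token for data" might carry.
import hashlib
import time
from dataclasses import dataclass


@dataclass(frozen=True)
class DataToken:
    data_hash: str     # integrity: hash of the data being tokenized
    created_at: float  # time: when the token was created
    creator: str       # identity: who created the data
    parent_hash: str   # provenance: hash of the data this was derived from


def tokenize(data: bytes, creator: str, parent_hash: str = "") -> DataToken:
    return DataToken(
        data_hash=hashlib.sha256(data).hexdigest(),
        created_at=time.time(),
        creator=creator,
        parent_hash=parent_hash,
    )


# Off-chain verification: anyone holding the data and its token can check
# integrity without contacting a central authority.
config = b"router configuration v1"
token = tokenize(config, creator="vendor-build-system")
assert hashlib.sha256(config).hexdigest() == token.data_hash
```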
Digital currencies are just one example of tokenization. In their case, proof of uniqueness is essential to prevent double spending. However, the vast majority of information in the world today does not require proof of uniqueness – you don't need to transfer ownership of a configuration file in a 5G router or of a record in a government database. Token designers can select which properties of an asset they wish to be verifiable: where did it come from, has it changed, who created it and when, and so on. This enables a new model for security on the Internet – no longer are we searching for vulnerabilities; instead we have real-time situational awareness of the state of every digital component in a system or network, because each component has a verifiable supply chain.
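A sketch of what that situational awareness could look like in practice: each deployed component is compared against the hash recorded in its token, and anything that no longer matches its verifiable supply chain is flagged. The component names and contents below are invented for illustration.

```python
# Compare the observed state of running components against the hashes
# recorded in their data tokens, instead of scanning for vulnerabilities.
import hashlib

# Expected state, as recorded in each component's token (illustrative values).
expected = {
    "router.cfg": hashlib.sha256(b"router configuration v1").hexdigest(),
    "firmware.bin": hashlib.sha256(b"firmware build 7").hexdigest(),
}

# Observed state of the running system.
observed = {
    "router.cfg": b"router configuration v1",
    "firmware.bin": b"firmware build 7 (tampered)",
}

for name, data in observed.items():
    ok = hashlib.sha256(data).hexdigest() == expected[name]
    print(f"{name}: {'verified' if ok else 'TAMPERED'}")
```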
Digital twins and digital threads
Most of the focus of tokenization to date has been on financial assets. However, the universe of tokenization is much larger – every financial, physical and digital asset can be tokenized. Indeed, our vision is that every piece of data on the Internet, from an event in a syslog daemon to high-value content, can be tokenized and made verifiable as part of its lifecycle.
A token can travel across organizational and network boundaries; it can be verified and acted upon at each step without any of the parties in the supply chain needing to change behaviour. This represents a new model for interoperability.