Indeed, how did you come to trust this fabric to make decisions autonomously and without your knowledge?

For many, these are the very reasons why industries are reluctant to move to the cloud.

Now, what if we could answer these questions with complete visibility and proof – a chain of custody – into all important data interactions, regardless of volume, variety, or scale?

My job at Guardtime and indeed my goal for the last ten years has been to ensure that data silos and the connections and applications that manage them can be trusted.  In the short time I have, I will explore these questions. 

My remarks today will focus on cyber security from an industry perspective centered on cyber integrity: what these terms mean, and why integrity is important. Cyber integrity – the integrity of data, their storage repositories, and the networks that serve them – is in my opinion the most fundamental cornerstone of cyber security, and it will completely transform the industrial internet, the IoT markets, and our API economies over the next decade.

First, I'd like to differentiate the definitions of trust versus truth. These definitions are essential to ascertaining whether or not data and networks have integrity.

Trust is defined as "firm belief in the reliability or ability of someone or something". Trusting a network, or the data stored with an enterprise or cloud service provider, is meaningless without the basic instrumentation and metrics needed to develop formal situational awareness of how reliable these assets really are and of what they are doing with the data, services, and applications they host.

Truth, on the other hand, can be measured – it means undeniable, independent proof that can be established forensically in a court of law. Truth, not trust, is essential for any network, enterprise, or data storage asset: its operation and its interactions with the data it hosts should be independently verifiable, with forensic evidence that holds up in court.

The data should be the witness – and shown to be reliable – not the investigator.

Reliable integrity instrumentation, and the evidence it affords, should also work at the storage scales required for all the data being generated on public, industrial, and private networks – by all estimates a conservative 35 zettabytes by 2020. That amount of information is equivalent to every person on earth reading roughly 1,258 newspapers every day.
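As a back-of-envelope sanity check on that comparison (the arithmetic and its inputs are my own illustration: roughly 7 billion people, 35 ZB generated per year, and about 11 MB of text per newspaper):

```python
# Back-of-envelope check of the newspaper comparison.
# Assumptions (mine, for illustration): ~7 billion people, 35 ZB generated
# per year, and ~11 MB of text per newspaper.
ZETTABYTE = 10**21
bytes_per_person_per_day = 35 * ZETTABYTE / (7 * 10**9 * 365)  # ~13.7 GB/day
newspapers_per_day = bytes_per_person_per_day / (11 * 10**6)
print(round(newspapers_per_day))  # ~1245 -- in the ballpark of the cited ~1,258
```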

Further setting this stage: I recently read that Accenture estimates the cost of ineffective cyber security at three trillion USD by 2020. And if estimates are to be believed, even Accenture's analysis may be understated, given Ericsson's recent report highlighting that almost 50 billion new devices will be connected to the Internet by 2020 – the Internet of Things (IoT).

Ericsson's perspective is unique – close to half the world's traffic traverses their networks and equipment. My good friend Geoff Hollingsworth of Ericsson also recently highlighted, in a blog post titled 'Truth not Trust – the Importance of Data Integrity in the Networked Society', that…

“Every day another breach in online security is reported and on a macro level, companies are spending approximately 46 billion per year to secure their operations with 46 percent of them saying they will increase spending going forward. When breaches do happen, the average payout and cost of insurance averages about 1M USD and tops out at around USD 3.5 million when you factor in pending claims and self-insured retentions."

The industry has recoiled and reacted: IT security budgets have grown from 4–6 percent of total project budget and spend over the last two years and are on track to reach 8–10 percent, including NOC/SOC and CSIRT organizations.

On a micro level, cyber breach costs are deeply personal – individuals are losing control of their personal information, companies are losing control of their core assets and IPR, and some employees are even losing their jobs – just look at the CEO of Target. The larger cost to the economy is that trust in the digital society has plummeted.

Businesses and the economy need a predictable and deterministic environment to grow, where risk can be quantified and managed alongside investment and return. The World Economic Forum believes the lack of functioning cyber security threatens as much as USD 3 trillion of non-realized potential growth during this decade. If we are investing more but performing worse, something is fundamentally wrong with the approach we are taking as a society to cyber security.

How these new devices will communicate – and with which systems and enterprises – is not considered in Accenture's estimate. From my perspective, this growth has a couple of impacts. These new devices further abstract humans from the machines using them, as devices will increasingly communicate with each other according to automated information rules – rules that are written, monitored, and enforced by other machines.

To frame this: machine-generated data will exceed human-generated data by 2016, and most of this data will be stored in the cloud. This is important to understand – with this abstraction, the truth is that few analysis firms even understand how to weigh the security impact of such explosive growth, or the integrity of this data: how to verify its authenticity, and how to track where it has been stored and the transformations it undergoes, while still preserving privacy.

Consider that in 2014, more than 3 times as many devices as subscribers were added to carrier networks in the US market alone.

And yet, cyber security approaches are not working – massive breaches still occur almost monthly. It is my unique perspective and experience that the code these machines run will ALWAYS be subject to compromise. Exploiters, attackers, and hackers are a creative bunch. Applications and software always have problems – especially when delivery velocity has increased to the point of shipping new services daily. These vulnerabilities will inevitably result in exploitation.

The industry must accept this and always assume that at some point even the most 'secure' infrastructures will be compromised – by people (also known as the insider threat), by insecure code, or by poor implementations. So, for those within the sound of my voice: assume compromise.

In fact, much of the interaction I have – whether with US, EU, and Asian governments, their defense establishments, or multinational corporations – concerns how to actually bring transparency into how this data is being stored and manipulated, and how it can be trusted given the information rules outlined in outsourced service provider contracts.

Much of my professional interaction in the industrial, telecommunications, and banking sectors is with CIOs reluctant to embrace cloud migration and services due to cyber security concerns and a perceived loss of control over their IP and data – as well as a lack of situational awareness of where their data is, how it is being used and consumed, and the information rules that govern these critical digital assets.

My response to them is that without data integrity, “your concerns are well founded”.

The industry has been seeking this holy grail of cyber security for some time – undeniable truth (not trust) in the data, whose authenticity, identity, and proof of creation time can be verified independently of the hosting or service provider.

One of the dominant paradigms in cyber security research is the idea of “provable security”. This is the notion that to have confidence in security protocols it is necessary to have a mathematically rigorous theorem that establishes a conditional guarantee of security given certain assumptions.

The challenge – and the reason provable security has remained in the ivory towers of academia – is that it is completely useless if the assumptions require trusted humans or the security of cryptographic keys: secrets can be exposed, and people can be compromised.

Simply put, provability doesn’t matter much in the real world if there are other attacks that can defeat the security much more effectively.

There is however one area of security where provability can be meaningfully applied, namely integrity. When Whit Diffie proposed public key cryptography he wasn’t thinking about integrity – he was thinking about key exchange and how two parties could communicate securely across an insecure channel – the foundation for today’s e-commerce and identity systems.

At the time, it was perfectly natural to think about extending PKI to verify the integrity of messages, and the consequence is that today, when security professionals have an integrity problem, they naturally turn to PKI – it has been the only tool in the toolshed.

The challenge, however, is that PKI requires secrets and trusted parties – which cannot be proven and will always remain the weakest link in security. Managing keys is hard, and even the best security companies can't do it successfully.

For integrity, I would argue, keys are also completely unnecessary. By eliminating the need for keys and using widely witnessed consensus, it is possible to have provable security – and that is a really big deal for CISOs who want to secure their networks and data.

It is the difference between saying, “I know my network has integrity and I can mathematically prove it" and “our security is based on key management and trusting system administrators”.
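To make "widely witnessed consensus" concrete, here is a minimal sketch of the principle – my own illustration, not Guardtime's KSI implementation: hash each asset, aggregate the hashes into a Merkle tree, and publish the root widely. Verification then needs no keys and no secrets, only the sibling hashes and the published root.

```python
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves):
    """Aggregate leaf hashes pairwise up to a single root hash."""
    level = leaves[:]
    while len(level) > 1:
        if len(level) % 2:                # duplicate the last node if odd
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

# Hash three data assets and publish the root widely (e.g., in a newspaper).
assets = [b"invoice-001", b"audit-log-chunk", b"config-backup"]
leaves = [h(a) for a in assets]
root = merkle_root(leaves)

# Anyone holding the first asset and its sibling hashes can recompute the
# root and compare it to the widely witnessed value -- no keys involved.
h01 = h(h(b"invoice-001") + leaves[1])
h22 = h(leaves[2] + leaves[2])
assert h(h01 + h22) == root
```

Because the root is published and widely witnessed, forging a past asset would require finding a hash collision rather than stealing a key – which is the sense in which the security becomes provable.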

As we have seen with very public examples – the NSA, and more recently Target, Home Depot, and AT&T – the latter is not a winning strategy.

Truth under these definitions is now possible; definitive accountability can be realized and recovered from the service provider and the enterprise; and with this truth, you have the ability to identify indemnification responsibility in the event of compromise. Being able to do this in real time means tamper detection of the network, and of the data's state, against its creation baseline. Real-time tamper detection means enterprise integrity protection.

A basic level of instrumentation – a digital fingerprint, if you will – should provide proof of data creation time, authenticity (is it the same?), and identity, and it should work at the scales required for cloud computing. Its evidence should also be portable across all regions and across open, closed, and virtually closed boundaries.
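As a hedged illustration of the minimum such a fingerprint record might carry – the field names here are my own, not a KSI signature format:

```python
import hashlib, json, time

def fingerprint(data: bytes, signer_id: str) -> dict:
    """Illustrative fingerprint: what the data is, when it existed, who signed it."""
    return {
        "sha256": hashlib.sha256(data).hexdigest(),  # authenticity: is it the same?
        "created": int(time.time()),                 # proof of creation time
        "signer": signer_id,                         # identity of the originator
    }

record = fingerprint(b"contents of quarterly-results.xlsx", "backup-service-07")
print(json.dumps(record, indent=2))

# Later, anywhere, anyone can recompute the hash and compare: any change to
# the data yields a different digest, so the evidence travels with the data.
assert record["sha256"] == hashlib.sha256(b"contents of quarterly-results.xlsx").hexdigest()
```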

As an example, at the end of 2013 the Cloud Security Alliance (CSA) published its annual report, 'The Notorious Nine: Cloud Computing's Top Threats in 2013', highlighting the shift from server-based to service-based thinking.

The top threats outlined in the report include data breaches, data loss, account or service hijacking, insecure interfaces and APIs, denial of service, malicious insiders, abuse of cloud services, insufficient due diligence, and shared technology vulnerabilities.

Handing over competition-sensitive information, Personally Identifiable Information (PII), or related Intellectual Property to a Cloud Service Provider (CSP) is indeed an exercise in extreme trust when there is no ability to independently (and in real time) ensure that cloud service providers are acting in accordance with purported security guarantees, controls, and the information rules provided by their contracts.

I would argue that in 2014 – in light of the CSA assessment, our own analysis of threats to Cloud Service Providers, and governments' perceived nefarious interactions with the telecommunications, data storage, social media, and search industries – it has been widely witnessed that trust in these providers, and in the insiders who run them, is dead.

For multinationals, outsourcing business trust to today's largely unregulated cloud service provider industry (regardless of the contract guarantee) ultimately rests on belief: belief in the constraints on that provider's trusted insiders (and indeed on any government's) interactions with your data, in the integrity of purported technical security controls, and in adherence to best practices and the associated policies and processes.

The CSA and world standards bodies have pioneered a number of policies and best practice tenets to manage cloud computing and data risks and security threats. But these best practice frameworks for businesses, organizations, and governments are merely risk management frameworks; they do not address the very fundamental integrity problems, or the technology solutions, that should be associated with cloud models.

We can begin to change the dialog and emphasize…

CIOs should assume that any outsourced infrastructure will at some point be compromised (if it is not already).

You can't outsource trust – not with the complexity of today's offerings, nor with the people operating those resources on your behalf.

Also assume that your own internal infrastructure is already compromised, or soon will be.

The more important and valuable your intangible assets are (your intellectual property, customer and supplier base, etc.), the more likely you are to be compromised and to become – no pun intended – a TARGET (or a Home Depot).

If the siren song has become 'we implement best practices!' to assuage concerns…

Our response should be: "So prove it, in a way that lets me independently verify the integrity of your systems any time I want" – go beyond compliance. Prove where my data is, how it is being managed, where geographically it is maintained, and the chain of custody highlighting its transformations.

…Prove, in an independently verifiable way, the integrity of my data and of the information rules that govern it, and prove that my service contract is being enforced – and let me do it whenever I want, without relying on YOU. Let me do this in real time.

With these assumptions, guaranteeing the integrity of these machines, of the data generated both by them and by their users, and of the rules used to enforce information policies and contracts is paramount.

With integrity guarantees, true cyber security for an organization is possible.

How did I come to this conclusion? I made a trip to Estonia as an advisor to senior US cyber policy makers, to inspect how integrity was guaranteed for the rules defined for the electronic services used by citizens and by the public and private sectors there – via an Estonian invention, Keyless Signature Infrastructure (KSI): a massive-scale data authentication technology that provides immutable evidence proving data provenance, integrity, and time.

I specifically wanted to understand how a country could offer and mix such a rich set of citizen services in health, finance, energy, and e-governance across a common platform while still preserving data differentiation, privacy, accountability and transparency into all service interactions.

Instead of relying solely on secrets – and as you heard earlier from Jaan Priisalu – Estonia implemented integrity and attribution while, importantly, preserving privacy. In Estonia, the citizen identity management systems are backstopped by data integrity generation and monitoring. Every critical security function, audit record, and critical data repository imparts integrity evidence, with continuous monitoring to ensure tamper detection and resilience without reliance on stored secrets or trust anchors like system administrators. In fact, they have achieved mutual auditability between the state, the citizen, and the participating private sector.

As background, Estonia is perhaps the world's most advanced cyber-enabled nation state. Post-1992, it had the opportunity to create the world's first society built from the ground up after the Internet. To do so, it wanted to allow all citizens to interact with the government and the private sector electronically. All banking transactions in Estonia are done electronically; so is voting; so is all legislation.

Unlike the rest of the world, the government, the private sector, and the citizens who interact on these platforms can mutually audit each other's behavior. How do they do this? By imparting integrity instrumentation into literally every digital asset that is generated, and into the event log files that audit the interactions generated by the responsible systems and administrators.
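A hedged sketch of the audit-log half of that idea – my own minimal construction, not Estonia's actual implementation: chain each audit entry to the hash of the previous entry, so that altering any past record breaks every link that follows.

```python
import hashlib, json

def append_entry(log, event: str):
    """Append an audit entry chained to the hash of the previous entry."""
    prev = log[-1]["hash"] if log else "0" * 64
    body = {"event": event, "prev": prev}
    body["hash"] = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    log.append(body)

def verify(log) -> bool:
    """Recompute every link; any tampering with history breaks the chain."""
    prev = "0" * 64
    for entry in log:
        body = {"event": entry["event"], "prev": entry["prev"]}
        digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if entry["prev"] != prev or entry["hash"] != digest:
            return False
        prev = entry["hash"]
    return True

log = []
for e in ["admin login", "record 42 read", "record 42 updated"]:
    append_entry(log, e)
assert verify(log)

log[1]["event"] = "record 42 read (nothing to see here)"  # tamper with history
assert not verify(log)                                     # detected
```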

The integrity components integrate Keyless Signature Infrastructure in such a way that the context of time becomes immutable, forensic evidence for all signed data assets. It is mathematically infeasible to manipulate data that was generated in the past without the change being evident in the data's integrity. Change detection of these assets, and the ability to mutually audit each interaction, is the foundation of the government's, the private sector's, and the citizens' ability to actually trust their data in these systems.

What I discovered and tested is that Estonian scientists have built a near-perfect detection technology in Keyless Signature Infrastructure (KSI), one which scales and can be used to allow the entire planet to verify EVERY event in cyberspace in such a way that the PRIVACY of each event is maintained but the integrity of the events cannot be denied.

These integrity technologies hold the promise of providing complete transparency – it becomes impossible for governments, corporations, or users to lie – everyone can verify the integrity of events independently from those presenting them.

I would further offer that, while we admire the efforts of the social networking, search, and cloud industries to increase transparency by issuing transparency reports – how can the insurance industry trust the message if it can't trust the messenger?

What KSI implements at the digital level is TRUST BUT VERIFY – independent verification of everything that happens in cyberspace.

Data integrity is the foundational instrumentation required to provide this detection mechanism, and it heralds a new dawn for cyber security: visibility (and accountability) into operations, bringing truth to network and data interactions. Instead of increasing security perimeter complexity and trusting more secrets, this is a fundamental paradigm shift in security – instrumentation from the inside out, at the data level, with real-time integrity reporting for critical organizational applications and assets.

This baseline instrumentation will allow the cyber security and cyber insurance industries, and the organizations they back, to better identify and visualize threats and changes to important intangible assets and data – copying and transfer, deletion, and manipulation of assets – in real time. Integrity instrumentation fundamentally gives you the ability to tag, track, and locate your assets in cyberspace. A GPS for data.

Cyber integrity proof allows the consumer, service provider, insurer, or data broker to finally and independently trust the provenance and integrity of any network interaction, as well as of the digital assets they are managing and/or consuming.

Solving integrity challenges at the scale required for cloud computing holds the greatest promise for overcoming industry hesitance to move to the cloud, and for enabling insurers to appropriately back those assets; it also provides a better way to identify malicious insider behavior and threats that take advantage of code and application vulnerabilities never imagined by the software vendor.

There is friction in adopting integrity technologies. Why? Cloud Service Providers have been hesitant, stonewalling integrity verification and transparency technologies. The reason? A compromise or exploitation of your data may or may not leave them on the hook for losses, and it has direct effects on the insurance and reinsurance of both the customer and their assets.

If they (or you) can't prove what was lost or compromised on their watch and how it occurred – if the evidence doesn't exist – they can claim they are not liable. "Prove it."

Moreover, such a paradigm shift puts an entire industry of audit and compliance personnel at risk. Why would you need these auditors if you have real-time insights into the assets you are backing or trying to protect?

President Toomas Hendrik Ilves of Estonia recently emphasized in Berlin that it is important to protect people from malevolent alterations of databases.

 “Online security consists of three factors: privacy or confidentiality; data integrity and accessibility. Recent scandals have brought the issue of digital privacy into public debates; however, violating the integrity of data is even more dangerous, threatening lives and infrastructure essential to the functioning of society – we can for example imagine the damage that can be done by changing people’s blood groups or the cycles of traffic lights by simple changes of the data in databases,” said the President of Estonia.

According to President Ilves, "European Union member states must ensure a safe system for their citizens that can fend off those kinds of malevolent attacks and protect online identity and data integrity. I would like to use the Estonian X-road as a good example – not necessarily as a technical solution, but rather as a way of thinking, a concept that data must be protected by a specific identification system. A crucial factor for all secure and functional e-services is secure online identity and integrity."

Transparent truth, not trust, is the Estonian mantra. With this technology, significant friction is introduced for those attempting to lie or cover their tracks, or who want to change the evidence of a past event. This architecture, and the integrity technologies that underpin it, preserves the historical provenance of all interactions – and indeed serves to preserve Estonian history electronically for the long term.

So for this industry weighing response and investment against cyber risks…

Data integrity technologies should be a fundamental requirement for anyone that cares about their data, protecting it, and ultimately insuring those assets.

Requiring data integrity instrumentation brings accountability to the service provider by highlighting the complete chain of custody and digital provenance of service provider interactions, which in turn identifies responsibility for compromise, tampering, malicious insider activity, or even misconfiguration.

With this ability, truth becomes widely witnessed evidence without disclosing the content of the underlying data. Estonia has done this with KSI in a way that also ensures privacy: KSI does not reveal the underlying data its signatures protect. Moreover, the evidence is portable and independently verifiable across infrastructures, and it travels with the data.
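A brief sketch of why hash-based evidence preserves privacy (illustrative only; the blinding mask is an assumption of this sketch, not a claim about any specific KSI deployment):

```python
import hashlib, os

document = b"patient 1042: blood group AB-"   # sensitive content never leaves the site

# A random blinding mask keeps low-entropy content from being guessed and
# then confirmed against the evidence alone.
mask = os.urandom(32)
evidence = hashlib.sha256(mask + document).hexdigest()

# The evidence is portable and reveals nothing about the document; only a
# party holding both the document and the mask can link the two.
assert hashlib.sha256(mask + document).hexdigest() == evidence
```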

With this fundamental form of data instrumentation, customers, auditors, data brokers, and investigators can for the first time truly and independently answer the critical questions:

What changed, what was eliminated, when did it occur, and who or what was responsible?

As my good friend Doctor Jason Hoffman of Ericsson once said, "You can’t be perfect at preventing crime, but you can be perfect at detecting crime".

Target still claims it is not responsible for the loss of over 100 million credit cards under its care, despite firing its CEO. That's quite a statement, and it betrays an industry willing to ignore consequences in the race to provide and implement low-cost competitive services and automated transaction systems.

I'll leave you with a quote from John Oltsik's vision for next-generation cyber security in Network World. In his article he focuses on four areas of data security; the implications of his point on data tagging and integrity should be carefully considered:

“Sensitive data identity tagging…. If some piece of data is classified as sensitive, it should be tagged as such so every host- and network-based security enforcement point knows it’s sensitive and can then enforce security policies accordingly. Are you listening to this requirement, Adobe and Microsoft? Cisco is making progress… but it would be incredibly helpful if the industry got together and agreed upon more universal standard data-tagging schemas and protocols. Think of the possible ramifications here as they are pretty powerful. With a standard tag, every DLP device could easily detect and block an email when a junior administrator uses Gmail to send a copy of the HR database to her home computer. Employees could use Box, Dropbox, or any other file-sharing services because the files themselves have their own ‘firewalls’.”
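To make that point concrete, here is a hedged sketch of what a universal data tag plus a distributed enforcement check might look like – the schema and function names are hypothetical, not any proposed standard:

```python
import hashlib

def make_tag(data: bytes, classification: str) -> dict:
    """Hypothetical universal tag: a classification bound to the data's hash."""
    return {"classification": classification,
            "sha256": hashlib.sha256(data).hexdigest()}

def egress_allowed(data: bytes, tag: dict, channel: str) -> bool:
    """Any enforcement point can check the tag without a central authority."""
    if hashlib.sha256(data).hexdigest() != tag["sha256"]:
        return False                      # tag doesn't match this data
    if tag["classification"] == "sensitive" and channel == "personal-email":
        return False                      # policy: no sensitive data via webmail
    return True

hr_db = b"name,salary,ssn\n..."
tag = make_tag(hr_db, "sensitive")
assert not egress_allowed(hr_db, tag, "personal-email")   # DLP blocks it
assert egress_allowed(hr_db, tag, "corporate-backup")
```

Binding the classification to the data's hash is what gives the file its own 'firewall': every enforcement point can verify the tag independently, and a tag copied onto different data fails the hash check.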

This is where the cyber security industry and the world's standards organizations are going, with our help: deploying Keyless Signature Infrastructure globally, making it available to all, and breathing new life and proof into digital assets – authenticity, time, and identity – so that consumers, governments, and multinational corporations can finally control and trust their assets.

That concludes my remarks highlighting these concepts; I am open to any questions at this time… Thank you.