To trust the messages these components send to your SIEM or log and audit aggregation solution, you must first trust the messenger.

At Guardtime, we talk a lot about machine-generated data, the complexity of machine interactions on the public and industrial Internet, and how to ascertain and guarantee the integrity of this information. Machine-generated data is growing explosively, overwhelming traditional IT security information and event management systems and log/audit analysis tools. Central to this issue are the reporting and validation systems that ingest and analyze machine-generated data from widely disparate sources and formats. Scalable, effective, efficient, and trustworthy analysis is essential to effective governance. System integrity and real-time tamper detection are essential for strong IT governance, especially as the complexity and abstraction of these systems increase across in-house and outsourced infrastructure and service providers.
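The tamper detection idea above can be illustrated with a minimal hash-chain sketch. This is an illustration only, not Guardtime's actual technology: each log entry is bound to its predecessor by a running SHA-256 digest, so editing, dropping, or reordering any entry breaks verification of everything that follows. The function names and the genesis value are our own conventions for this example.

```python
import hashlib

GENESIS = b"\x00" * 32  # arbitrary starting value for the chain

def chain_entries(entries):
    """Link each log entry to its predecessor via a running hash."""
    prev = GENESIS
    chained = []
    for entry in entries:
        digest = hashlib.sha256(prev + entry.encode()).digest()
        chained.append((entry, digest))
        prev = digest
    return chained

def verify_chain(chained):
    """Recompute the chain; any edited, dropped, or reordered entry breaks it."""
    prev = GENESIS
    for entry, digest in chained:
        expected = hashlib.sha256(prev + entry.encode()).digest()
        if digest != expected:
            return False
        prev = digest
    return True
```

A verifier holding only the final digest can detect tampering anywhere earlier in the log, which is the essence of making the "messenger" trustworthy.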

While humans still generate the most data on the public Internet, machines are quickly catching up.

Machine data contains the transaction histories of interacting devices: customer transactions, user activities, access, authentication and 'handoff' automation, virtualization and application launches, software deliveries, data from API interactions and their associated messages, and sensor data from remote devices – the Internet of Things (IoT) – the list goes on.

Global proliferation of devices means more machine-generated data. A recent IDC report estimates that machine-generated data accounted for approximately 40 percent of all data created in 2014, and that by 2020 it will account for the majority of all data created and consumed by 'things' like industrial machines, video delivery systems, medical devices, M2M infrastructure, and other devices such as sensors and vehicles.

IDC's analysis shows that the world's supply of what is commonly called "big data" (pools of analyzable and potentially useful digital information) is still relatively small, and that most consumer-generated data, like that episode of your favorite sitcom saved on your DVR, isn't very useful for analysis and is eventually deleted.

More promising for big-data analysis are the readings from the machines monitoring our world, from surveillance equipment to medical devices. To prepare for this explosive growth, trillions of gigabytes of newly created machine data, new scalable tools must be deployed to instrument, ingest, analyze, and report on this data within the enterprise or via a service provider, whether the data is generated by data centers, applications, network devices, security and perimeter defense systems, or M2M infrastructure.

Moreover, to meet SLAs for network robustness and operational availability, the reliability of these devices and the ability to validate their integrity are critical.

Abstraction of the administration and control systems that manage these interactions is increasing. Automation, Software Defined Networking (SDN), and API extensibility have become the only way to effectively negotiate the myriad disparate sources of these machine interactions. API and SDN security and integrity are therefore paramount for network coherence and service delivery assurance.
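One common building block for the kind of API message integrity described above is a keyed message authentication code: the sender tags each API call with an HMAC, and the receiver rejects anything whose tag does not verify. The sketch below assumes a shared secret provisioned out of band; the key, names, and payload are hypothetical and this is only one of several possible integrity mechanisms.

```python
import hashlib
import hmac

SECRET = b"shared-secret"  # hypothetical key, provisioned out of band

def sign(message: bytes) -> str:
    """Produce an HMAC-SHA256 tag for an API message."""
    return hmac.new(SECRET, message, hashlib.sha256).hexdigest()

def verify_tag(message: bytes, tag: str) -> bool:
    """Check a received message against its tag in constant time."""
    return hmac.compare_digest(sign(message), tag)
```

Any modification of the message in transit changes the expected tag, so a tampered SDN control instruction or API call fails verification on arrival.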