My name is Kevin Reed, and I’m the CISO at Acronis, a global leader in cyber protection and Singapore’s latest unicorn company. I have more than 20 years of experience in cyber security, and today I would like to share my thoughts on some of the industry’s most pressing topics.
One of the ways we use artificial intelligence to help our clients is protecting them from ransomware. Ransomware behaves very differently from benign software or normal user activity. One difference is how it accesses files: a benign application usually opens only one or two files over a minute or two, while ransomware, as it tries to encrypt everything it can reach, touches many more files, much faster. This gives us a way to distinguish ransomware from your typical word processor and stop it from propagating.
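To make the behavioural difference concrete, here is a minimal sketch of that kind of rate-based heuristic – a hypothetical per-process counter that flags anything touching an unusually large number of files within a short window. The window size, threshold and event source are illustrative assumptions, not the actual detection logic used in Acronis products.

```python
# Illustrative sketch only: a naive rate-based heuristic, not Acronis' detection model.
from collections import defaultdict, deque
import time

WINDOW_SECONDS = 60        # assumed observation window
MAX_FILES_PER_WINDOW = 30  # assumed threshold; benign apps rarely exceed a few files/minute

class FileAccessMonitor:
    def __init__(self):
        # per-process deque of (timestamp, path) write events
        self.events = defaultdict(deque)

    def record(self, pid: int, path: str, now: float | None = None) -> bool:
        """Record a file-write event; return True if the process looks ransomware-like."""
        now = now or time.time()
        q = self.events[pid]
        q.append((now, path))
        # drop events that fell out of the observation window
        while q and now - q[0][0] > WINDOW_SECONDS:
            q.popleft()
        distinct_files = {p for _, p in q}
        return len(distinct_files) > MAX_FILES_PER_WINDOW

monitor = FileAccessMonitor()
# a word processor saving one document is fine...
assert monitor.record(pid=101, path="report.docx") is False
# ...but a process rewriting hundreds of files within a minute gets flagged
suspicious = any(monitor.record(pid=202, path=f"doc_{i}.txt") for i in range(500))
print("ransomware-like behaviour detected:", suspicious)
```

A real engine would combine signals like this with entropy checks, file-type awareness and a trained model, but the core intuition is the access-rate difference described above.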
Talking about key safety processes like data backup and recovery, threat detection and so on – they seem to require human traits beyond accuracy and programmatic logical reasoning, perhaps professional intuition or a sixth sense that AI lacks. Do you think we can rely on AI to execute these tasks with the same diligence we expect from human beings, considering how cunning threat actors are?
Research shows that AI can actually be more accurate than humans when performing the defined tasks it has been trained for. For example, the best-of-breed anti-spam algorithms all use ML in some form. And, as anyone using a major webmail platform can confirm, spam protection is mostly a solved problem. False positives and false negatives are rare – in general it works pretty well.
While the variety of security attacks is much larger, the industry is not trying to solve them all at once; companies are working on detecting and protecting against specific threats.
For example, the Acronis ML-based solution fights ransomware, while other companies may try to prevent other kinds of attacks. Within a defined field, detection rates are excellent and probably exceed those that humans would show if we were to run a blind test for ransomware detection.
We actually believe in ease of use being our top competitive advantage. We know that security only works when it’s also the easiest path for users to take. We design our products so that the user is secured by default – and in many cases no interaction is needed from them to be protected.
For our enterprise products we are truly integrating various aspects of protection into a single product with a common management console. Many inefficiencies arise from the fact that what is called a “product family” is in fact a number of separate products with minimal or no integration, with different management consoles and sometimes even conflicting settings.
This is not the case for Acronis, where both the endpoint agent and the cloud console are tightly integrated, addressing all five vectors of cyber protection – data safety, accessibility, privacy, authenticity and security.
Acronis True Image is a personal cyber protection solution – the first of its kind to combine reliable backup with the innovative AI-based anti-ransomware protection. It also protects users from other kinds of threats, like cryptojacking. In 2019 alone, it stopped almost 500,000 attacks. Basically, with Acronis True Image your data is safe, no matter what happens to your device.
For training purposes, AI needs a massive amount of data. Owning and managing on-site systems for this can be highly expensive, and moving such huge volumes of data in and out of clouds is equally hard – some would say almost impractical – in the present ecosystem. Backup is equally difficult, and the recovery process might be even more complicated (or is it?). What are your thoughts on this, and how can the challenge be overcome?
It is true that AI requires a lot of data for training, yet I don’t see cost being the limiting factor. For example, processors designed specifically for ML are simpler, smaller and faster than their general-purpose counterparts. This makes it possible to package many more cores together – thousands instead of dozens – while keeping the cost affordable.
Speaking of data storage and data transfer, I also don’t think we will hit a limit anytime soon. Raw storage price per gigabyte continues to decline, and innovative algorithms for data deduplication and compression, like those used in Acronis storage, allow users to store data more efficiently at a lower cost.
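As a rough illustration of what a compression factor means, the snippet below compresses a redundant, backup-like blob with Python’s standard zlib module and reports the ratio. This is generic compression, not Acronis’ deduplication or storage format, so the numbers are only indicative.

```python
# Generic illustration of a compression factor using the standard library,
# not Acronis' storage algorithms.
import zlib

# Backup-like data tends to be highly redundant (logs, documents, repeated blocks).
data = b"2019-10-01 12:00:00 INFO backup chunk written, status=ok\n" * 10_000

compressed = zlib.compress(data, 6)
factor = len(data) / len(compressed)
print(f"original: {len(data)} bytes, compressed: {len(compressed)} bytes, "
      f"compression factor: {factor:.1f}x")
```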
For example, for a typical backup workload Acronis compression algorithms exceed a compression factor of 2. Still, data transfer over long distances may sometimes become an issue. Although we have not reached the theoretical limits of data transfer bandwidth, we are definitely seeing the physical limit of the speed of light affect latency-sensitive applications.
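To give a sense of that physical floor, here is a back-of-the-envelope calculation of the minimum round-trip time imposed by light propagation in optical fibre (roughly two thirds of its speed in vacuum); the route distances are rough figures assumed purely for illustration.

```python
# Back-of-the-envelope latency floor from the speed of light in fibre.
# Distances are rough great-circle figures, assumed for illustration.
SPEED_OF_LIGHT_KM_S = 299_792                    # in vacuum
FIBRE_SPEED_KM_S = SPEED_OF_LIGHT_KM_S * 0.67    # light in glass is roughly 2/3 c

routes_km = {
    "Singapore -> Frankfurt": 10_300,
    "Singapore -> US West Coast": 13_600,
    "within one metro region": 100,
}

for route, km in routes_km.items():
    rtt_ms = 2 * km / FIBRE_SPEED_KM_S * 1000
    print(f"{route}: >= {rtt_ms:.1f} ms round trip, before any routing or processing")
```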
Data transfer over long distances may also hit a legislative wall: some jurisdictions require certain types of data to be stored within their borders. To address this requirement, Acronis and others need to build datacenters across the world – in Europe, Asia, Oceania and the Americas – and in our case we can also run on the infrastructure of all major cloud providers.
I also don’t think the recovery process is complicated. First, many moving parts are hidden from the end user; more importantly, the maturity of backup and restoration technology, together with the availability of checksums and digital signatures to verify data integrity, makes restoration seamless.
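As a toy illustration of the checksum part of that story, the sketch below records a SHA-256 digest when a backup copy is written and verifies it again before restore. Real products layer digital signatures, versioning and much more on top, so treat this as a simplified sketch rather than how Acronis actually restores data.

```python
# Toy sketch of integrity checking with a checksum, not a real backup format.
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Stream the file through SHA-256 so large backups don't need to fit in memory."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1024 * 1024), b""):
            digest.update(chunk)
    return digest.hexdigest()

def backup(source: Path, destination: Path) -> str:
    destination.write_bytes(source.read_bytes())
    return sha256_of(destination)           # store this checksum alongside the backup

def verify_before_restore(backup_file: Path, expected_checksum: str) -> bool:
    return sha256_of(backup_file) == expected_checksum

# usage
src, dst = Path("document.txt"), Path("document.txt.bak")
src.write_text("quarterly report draft")
checksum = backup(src, dst)
print("safe to restore:", verify_before_restore(dst, checksum))
```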
The cooling approaches that exist today, like air conditioning, are rather expensive – and as of now, there are only a few approaches available to us.
Well, conventional data center cooling systems are designed to remove heat from the server space through an exhaust system and then introduce cooler air via air conditioning. However, there are some innovative new practices, like geothermal cooling, which relies on the temperature of the surrounding earth to create a natural heat sink.
Several data centers already use this method, including the American College Testing data center in Iowa City.
I have also heard of pumped two-phase cooling solutions, where liquid flows through a closed loop across a cold plate attached directly to the heat-generating component, and the heat then vaporizes the liquid.
And you have probably heard of immersion cooling solutions, where computer systems and servers are submerged in liquid.
There are also smart monitoring and IoT solutions being applied these days, designed to automate the cooling process with greater accuracy – much like a smart thermostat.
Efficient AI training depends on extremely fast processing and massive performance, requiring petabytes of capacity. Present-day storage and datacenters rely heavily on virtualization, but according to some experts that approach does not gel well with AI requirements; physical proximity of hardware is needed for real-time benefits. What are your thoughts on this? Are we moving towards bridging this crucial gap?
It’s true that there are different kinds of infrastructure solutions for different purposes and applications, and some of them are not universally applicable. However, building application-specific solutions is nothing new for our industry. We’ve built microcontrollers and processors for specific applications, like networking or video processing – so-called Application-Specific Integrated Circuits (ASICs). We are building GPUs and ML-optimized processors now. If there is a need to create purpose-built datacenters for AI, I am sure the industry will be able to address that.
Speaking of storage, we believe data center and storage vendors are currently innovating heavily to support modern requirements. I’ve mentioned modern approaches to cooling, but there are also innovations in power distribution, content delivery and retrieval, miniaturization, and the operational efficiency of data center equipment. It has already been proven that we can efficiently centralize data storage and processing, yet edge computing promises another increase in efficiency. So maybe the pendulum will swing again and we will see a new wave of decentralization for some applications.
A lot of research is being done in this area. One of the most exciting and revolutionary approaches is Secure Multi-Party Computation (SMPC). With this approach, the provider is entrusted with data it cannot read, yet it can still perform certain operations on that data at the request of the data owners.
I see a lot of potential in this approach. Consider an example: you need to perform computation-heavy analytics on sensitive data, say, medical records. You may not have the capacity in-house, so you transfer specially prepared data to an SMPC provider; they perform the computation and return the processed result – all without the critical information ever being revealed to them.
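A tiny toy example of one SMPC building block is additive secret sharing: each data owner splits its value into random shares, and parties holding only shares can still compute an aggregate without ever seeing the original values. This is a classroom illustration of the idea, not the protocol Acronis or any provider actually runs.

```python
# Toy additive secret sharing over a prime field - one SMPC building block,
# not a production protocol.
import secrets

PRIME = 2_147_483_647  # a large prime defining the field

def share(value: int, n_parties: int = 3) -> list[int]:
    """Split a secret into n random shares that sum to the secret mod PRIME."""
    shares = [secrets.randbelow(PRIME) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % PRIME)
    return shares

def reconstruct(shares: list[int]) -> int:
    return sum(shares) % PRIME

# Two hospitals each secret-share a patient count; no single party sees the raw numbers.
hospital_a, hospital_b = 1_250, 980
shares_a, shares_b = share(hospital_a), share(hospital_b)

# Each party adds the shares it holds; only the combined total is revealed.
sum_shares = [(a + b) % PRIME for a, b in zip(shares_a, shares_b)]
print("joint total:", reconstruct(sum_shares))   # 2230, without exposing either input
```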
Acronis, too, invests heavily in research and development in this area, and we are already seeing working applications of this technology.
For more information about Acronis, visit www.acronis.com.
Disclaimer: The opinions of Insercorp Water Cooler Bloggers are their own and do not reflect the official position(s) of Insercorp LTD. Acronis is not affiliated with Insercorp LTD and no incentive was offered or received related to this story.