The Human Error in Cybersecurity
Human error has been the primary reason for 95% of cybersecurity breaches.
On a warm November morning in 2017, Uber finally announced a breach it had concealed for an entire year: the personal information and license plate numbers of 600,000 drivers, along with the personal data of 57 million users, had been compromised. Uber had tried to retrieve the data quietly without notifying those affected, even paying a ransom of USD 100,000 (recorded as a bug bounty). One of the world's top unicorns had been exposed by two external individuals who gained access to information stored on a third-party cloud service. How did it happen? Should we blame the cloud provider? Definitely not.
Uber lacked the cloud access controls that would have prevented unauthorized access. The concealment of the breach alone cost the company $148 million, and, as in many such cases, the losses in revenue, market value, reputation, and legal fees are very difficult to calculate because they accumulate on a rolling basis. An astonishing statistic indicates that cybercrime collectively yields at least $1.5 trillion in profits for criminals annually.
Since 2010, we have seen a sharp rise in media coverage of cyberattacks, such as those on Mossack Fonseca, Equifax, Marriott, and the National Health Service, to name a few. Cybercrime has been a substantial threat to companies and individuals for a few decades now, yet it is only in the past few years that the topic has gained adequate coverage. For instance, in 2013 MI5 presented astounding statistics on Far Eastern espionage groups spying on established Western firms. One group had nearly 500 targets and had maintained access to compromised IT systems for an average of 365 days, the longest period being four years.
Companies often miss the learning point of such incidents, frequently assuming that some tool will come to their rescue. In a recent study of 50 major data breaches, inadequate technology solutions contributed to 28% of the attacks, while the remaining 72% of successful hacks stemmed from failures in people and processes: phishing emails, malicious insiders, and IT configuration errors. Even more strikingly, an IBM study revealed that human error was the primary reason for 95% of cybersecurity breaches. The main types of errors are skill-based and decision-based. Cyber training and practice can therefore make a crucial difference in the number of cybersecurity breaches.
Cyber training is fundamental to any cyber defense strategy. An estimated 45% of companies indicate that their personnel have a problematic shortage of cybersecurity skills. “Training employees on security will immediately bolster the cyber defenses at most companies,” says Lawrence Pingree, Research Director at Gartner, because most data breaches rely on “exploiting common user knowledge gaps to social engineer them to install malware or give away their credentials.”
At Wells Fargo, for example, susceptibility to phishing declined by more than 40 percent after cyber training. The training, like any learning exercise, needs to be repeated continuously, as the City of San Diego's experience shows: susceptibility declined after training but climbed again by the end of the year as the training effect “wore off.”
Cybercrime is on the rise: according to a global survey by Accenture, security breaches have increased by 67% over the last five years. Small companies are not safe either; a report by SCORE indicated that 43% of cyberattacks target small businesses. Personalized attacks are expected to prevail, and industries such as retail, oil & gas, utilities, media, and legal are expected to rank among the top 10 targets over the next two years. Some of these industries are perhaps the least protected of all, even though their companies hold an immense amount of sensitive information. Cybersecurity Ventures estimates that the total cost of cyberattacks will rise to $6 trillion per annum by 2021, roughly half the cost of the 2008 financial crisis.
Building the Culture of Cybersecurity
Organizations of every size and in nearly every industry are starting to realize that when it comes to cybersecurity, having an unlimited budget and spending most of it on new tools is probably not the best strategy. Such an approach distracts from more effective organizational and cultural improvements. To make consistent, long-lasting change and build a culture of cybersecurity, security personnel and other executives need to collaborate closely with the lines of business, addressing the challenges with a holistic approach.
People are crucial to establishing the successful cybersecurity program of an organization and building the resilience needed to defend against a potential breach. They are at the forefront of designing, testing, implementing, and operating defenses. Conversely, their failures, whether due to malicious intent, negligence, or ignorance, will likely be the source of an organization’s next breach.
Attackers focus on finding the weak link in a firm's defenses, the one flaw that will allow them undetected passage to the information. Why bother penetrating firewalls when there is a perfect opportunity to exploit human nature?
Understanding the prominent mechanisms of a healthy cybersecurity culture gives managers and directors specific pathways to increased organizational resilience. As shown in the diagram below, external influences, values, attitudes, and beliefs form the core of a culture of cybersecurity.
The key to establishing a strong culture of cybersecurity is ensuring that employees understand the importance of executing their daily tasks and activities while remaining cognizant of security. This may seem simple enough, but creating such a culture requires transformation from top to bottom: the way employees work, the way leaders lead, the way processes are executed, and the way issues are addressed.