Surge of cyber attacks: Security by design to stem the flaws
The security level of applications we use daily on our smartphones and computers is cause for alarm.
In a recent survey1, 77% of applications were found to have at least one vulnerability on first analysis. Fewer than one-third of companies have an effective strategy in place to deal with large-scale cyber-attacks, while last year, the “WannaCry” malware paralysed 300,000 computers in over 150 countries. And when critical bugs are detected, only one in three issues is resolved within a month…
Governance at every stage
Even when software developed in-house is faultless, applications remain vulnerable to security failures, since many of them incorporate open-source libraries whose safety cannot be guaranteed. Such components may make up as much as 75% of the code in a software package!
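One practical response to this dependency risk is a routine audit of third-party libraries against published vulnerability advisories. The sketch below shows the idea in minimal Python; the advisory list and package names are illustrative assumptions, not a real feed — in practice a team would use a dedicated tool or database rather than a hand-maintained dictionary.

```python
# Sketch of a dependency audit. KNOWN_VULNERABLE is a hypothetical advisory
# list (package name -> versions with known flaws), stands in for a real feed.
KNOWN_VULNERABLE = {
    "examplelib": {"1.0.0", "1.0.1"},
    "otherlib": {"2.3.0"},
}

def audit(dependencies):
    """Return the (name, version) pairs that match a known advisory."""
    return [
        (name, version)
        for name, version in dependencies
        if version in KNOWN_VULNERABLE.get(name, set())
    ]

deps = [("examplelib", "1.0.1"), ("safelib", "0.9.2")]
print(audit(deps))  # flags examplelib 1.0.1 as vulnerable
```

Run as part of the build, a check like this turns the 75%-of-code problem from an invisible liability into a visible, fixable list.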
Applications also now run on fast-evolving, increasingly hybrid infrastructures. Today’s businesses are turning to cloud-based platforms that offer micro-service technologies and repositories making access to applications easier and faster. But such components may carry faulty, or even malicious, code, and businesses that use them jeopardise not only the security of their own information systems, but also that of the members of their ecosystem.
Studies also show that in a cloud storage environment, one in six databases is wide open, with no security measures at all. Cloud platforms do, of course, offer protective measures, often very sophisticated ones, but if companies do not use them consistently, their code and data remain vulnerable. It is therefore urgent for companies to review their governance rules on the use of data centres and cloud platforms.
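Such governance reviews can partly be automated. As a rough illustration, the check below flags storage access policies that grant access to every principal — the “wide open” configuration the studies describe. The policy structure loosely mimics AWS-style JSON policies but is simplified here; field names and the example bucket are assumptions for the sketch.

```python
# Sketch: flag storage policies that allow anonymous ("public") access.
# The policy layout is a simplified, AWS-like structure for illustration.

def allows_public_access(policy: dict) -> bool:
    """True if any Allow statement grants access to every principal ('*')."""
    for statement in policy.get("Statement", []):
        if statement.get("Effect") != "Allow":
            continue
        principal = statement.get("Principal")
        if principal == "*" or (
            isinstance(principal, dict) and principal.get("AWS") == "*"
        ):
            return True
    return False

open_bucket = {
    "Statement": [{"Effect": "Allow", "Principal": "*", "Action": "GetObject"}]
}
print(allows_public_access(open_bucket))  # True: this store is wide open
```

Running a scan like this across every cloud resource turns a governance rule on paper into an enforceable control.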
From GDPR to the corporate culture
Beyond governance rules, which are often notable for their absence, could the problem of application security be solved by regulation? Security is at the heart of the General Data Protection Regulation (GDPR), which came into force on 25 May. But while the GDPR provides a stricter framework for the use of personal data, its Article 20 could well open a Pandora’s box for data security.
Centred on data portability, this article was designed to make the transfer of personal data between rival operators easier. For instance, if a customer wants to leave Apple Music and subscribe to Deezer, or vice versa, their playlists will be transferred automatically from one to the other. This is a real gain for consumer freedom, but it comes with a major risk: such “portability platforms” could soon act as intermediaries among different players on the market, passing personal data around in a more or less secure way. The risk is not so much that such aggregators have access to personal data without the customer’s consent (although…) but that the APIs used to send the data on its way may themselves be deficient in security.
Given all this, what can a company do? It needs a culture change! Many see security as a hindrance to reactivity, productivity and flexibility, but that is because security is often considered too late in the day, forcing code to be rewritten and delaying time-to-market. In the current context, such an approach is not sustainable: developers must make security the priority from the very first line of code. Like the “privacy by design” rule imposed by the GDPR, “security by design” must become the watchword for development teams. Company boards and directors, for their part, have to make security the cornerstone of their strategy for winning and retaining customers, because it is the foundation on which trust is built. And without trust, there is no growth…