In 2020, cyberattacks are going to get personal


4 Dec 2019


Image: © Андрей Яланский/Stock.adobe.com


Forrester’s Jeff Pollard foresees the measures enterprises will have to take in 2020 to secure users’ personal data and protect themselves from increasingly sophisticated attacks.

In 2020 and beyond, security and risk professionals will discover that cybersecurity decisions have broader societal implications than ever before. Our lives increasingly depend on technology to work, learn and socialise, and that dependence also makes technology a target.

The growing reliance on data when making key decisions will give malicious actors greater incentive to restrict access to large pools of data through the use of ransomware. Weaponisation of data gathered on populations will give authoritarian governments and shell organisations greater capacity to manipulate geopolitics and grow their influence outside their borders. And improvements made to artificial intelligence (AI) and machine learning (ML) in the past several years will result in use cases that improve cybersecurity but will also help attackers.

The intermingling of these trends creates the foundation for three of Forrester’s 2020 cybersecurity predictions.

1. Companies will collect and weaponise data through M&A activity

While the unravelling of the Cambridge Analytica scandal gave rise to mainstream anxiety over data collection, the ever-growing value of data remains too enticing a resource for companies – and governments – to ignore.

Laws aimed at limiting how organisations can share their large swathes of data will proliferate globally, but these measures will do little to stop the growing M&A market behind data consolidation.

The collecting of preference data, user location or medical information may be innocuous at first, but should the companies behind today’s leading apps be acquired by a government-owned entity, that data would be wielded by an adversary.

When Beijing-based engineers legally gained access to sensitive health information through the acquisition of Grindr, they illustrated how current legislation fails to mitigate the risks of data falling into the wrong hands, thus requiring companies to form their own consumer data governance strategies.

2. Costs associated with deepfake scams will exceed $250m

In what is possibly the first scam of its kind, social engineers defrauded a German energy company of $243,000 earlier this year through the use of AI-generated voice technology. Now that a precedent exists showing economic gains from AI-backed deepfake technology, expect more deepfake-based attacks to follow, fabricating convincing audio and video at a fraction of today's cost.

To mitigate risk, IT departments need to further invest in training and awareness programmes. Without savvy employees who understand the similarities and differences between deepfake-based attacks and legacy phishing schemes, the costs associated with the former will continue to rise.

3. Data privacy concerns will lead one in five enterprise customers to safeguard their data from AI

Despite the growing value of AI and ML solutions, companies that rely on enterprise customer data to improve their B2B product offerings will struggle to find customers willing to opt in to data-sharing agreements. As legislation such as GDPR and CCPA, along with consumer backlash, makes privacy slip-ups and accidental disclosures catastrophic to the short-term bottom line and long-term brand image, companies will prohibit handing over their data to third parties.

This data shortage will likely make AI and ML solutions less effective, which could in turn create a negative feedback cycle: companies that don't feel the gains of AI offset the increased risk of privacy-related expenses will further restrict the use of their data in the coming years.

By Jeff Pollard

Jeff Pollard is a VP and principal analyst at Forrester, serving security and risk professionals. He leads Forrester’s research on the role of the CISO, specialising in topics related to security strategy, budgets, metrics, business cases and presenting to the board.

Download Forrester’s Predictions 2020 guide for more information on the major dynamics that will impact firms next year.

A version of this article originally appeared on the Forrester blog.