Could robots pretend to be YOU? Cyber security experts warn that AI could mimic writing styles and habits of millions of users to launch devastating scams

Tuesday, February 27, 2018
By Paul Martin

Cyber security experts raise prospect of devastating scams using AI technology
AI could be used to impersonate individuals and spread malicious software
Potential for crime gangs to ‘scale up’ operations using cutting-edge technology

Robots could mimic writing styles and habits of millions of people to launch devastating scams, cyber security experts have warned.

Hackers could use AI programmes to impersonate individuals after malicious software harvests records and emails from their computers.

Such a scam could ‘explode’ as colleagues and contacts are tricked into opening files and infecting their own systems.

A House of Lords committee has also been told that criminal gangs could deploy AI to sift masses of material collected from hacked devices such as smart TVs at companies – and work out what intelligence can be used to make money.

The potential for organised gangs to ‘scale up’ their activities using developments in artificial intelligence was spelled out in evidence to peers by respected Cambridge-based cyber security experts Darktrace.

The written submission to the AI Committee, which is investigating the potential threat from the fast-shifting technology, says there are huge opportunities for motivated groups to ‘pursue new models of criminality’.

‘Imagine a piece of malicious software on your laptop that can read your calendar, emails, messages etc,’ it said.

‘Now imagine that it has AI that can understand all of that material and can train itself on how you communicate differently with different people.

‘It could then contextually contact your co-workers and customers replicating your individual communication style with each of them to spread itself.

‘Maybe you have a diary appointment with someone and it sends them a map reminding them where to go, and hidden in that map is a copy of malicious software.

‘Perhaps you are editing a document back and forth with another colleague; the software can reply whilst making a tiny edit, and again include the malicious software.

‘Will your colleagues open those emails? Absolutely. Because they will sound like they are from you and be contextually relevant.

‘Whether you have a formal relationship, informal, discuss football or the Great British Bake Off, all of this can be learnt and replicated.

‘Such an attack is likely to explode across supply chains. Want to go after a hard target like an individual in a bank or a specific individual in public life? This may be the best way.’

The submission by Dave Palmer, Director of Technology at Darktrace, gives a second example of an attack on smart TVs and video conferencing systems at a firm, pointing out that such devices are typically more vulnerable than conventional computers.

Mr Palmer said the criminals could ‘activate the microphones and stream the audio of all meetings held, to an AI driven translation and transcription service (already available from Google and Amazon)’.

‘Given transcripts of all meetings an additional simple AI model could automatically alert the criminal to topics of interest (perhaps unannounced Mergers & Acquisitions, or the details of preparations for a particular trial) and suddenly the criminal has easily scalable approaches for ambient surveillance of a company without having to actually listen to any meetings themselves,’ Mr Palmer said.
