Our beloved tester Matt posted the following to the company chat system this morning:
Are people really silly enough to plug their laptops into an unknown USB drive embedded in a wall by goodness knows who? See http://deaddrops.com/
(A particular tickler for Alec I’d imagine)
Matt is exactly right. If I caught someone doing this with company hardware, I would want to see justification, mitigation and a business case for why they did it without contacting me first. There are similar risks, too, such as phone-recharging stations which are not what they appear to be; in fact anything with a data cable is suspect.
Further: it is hypothetically possible for a motivated hacker to backdoor your laptop by sending you a replacement battery, because batteries have data connections over which they report their status to the CPU, and the host code that parses that data risks buffer overflows as a consequence.
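To make the battery point concrete, here is a hypothetical sketch in C; the function names, buffer sizes and reply layout are my own invention, loosely modelled on the SMBus-style block reads that smart batteries use, where the first byte of a reply is a length declared by the device itself. The point is simply that host code which trusts that length can smash its own stack:

```c
/*
 * Hypothetical illustration only, not real driver code: parsing a block
 * reply from a smart battery, where reply[0] is a length byte supplied
 * by the (potentially malicious) battery itself.
 */
#include <stdio.h>
#include <string.h>

#define SBS_BLOCK_MAX 32  /* classic SMBus block reads carry at most 32 data bytes */

/* Naive parser: trusts the battery-supplied length byte. */
static void parse_battery_name_unsafe(const unsigned char *reply)
{
    char name[16];
    unsigned char len = reply[0];     /* attacker-controlled */
    memcpy(name, reply + 1, len);     /* overflows 'name' whenever len > 15 */
    name[len] = '\0';
    printf("battery says: %s\n", name);
}

/* Defensive parser: clamps the length to the protocol maximum and sizes
 * the local buffer accordingly. */
static void parse_battery_name_safe(const unsigned char *reply)
{
    char name[SBS_BLOCK_MAX + 1];
    unsigned char len = reply[0];
    if (len > SBS_BLOCK_MAX)
        len = SBS_BLOCK_MAX;
    memcpy(name, reply + 1, len);
    name[len] = '\0';
    printf("battery says: %s\n", name);
}

int main(void)
{
    /* A well-behaved battery reply: length 4, then "ACME". */
    unsigned char honest_reply[] = { 4, 'A', 'C', 'M', 'E' };
    parse_battery_name_safe(honest_reply);
    parse_battery_name_unsafe(honest_reply);  /* only fine because this battery is honest */
    return 0;
}
```

Nothing clever about the safe version; it merely refuses to trust a length chosen by the peripheral.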
So the risk-averse bit of me wants to point all this out; but then the security-philosopher bit of me also wants to say this:
— an alternative perspective —
Wouldn’t it be wonderful if we could go around using hard disks stuck in walls? For treasure hunts, for photos, for casually sharing information with others? For anything?
Is it not an interesting opportunity?
And wouldn’t it be wonderful if we could charge our phones anywhere, and not live in fear of digital viruses in the same way that we no longer live in fear of human ones like smallpox?
In fact, better than that: better because we cannot re-engineer humans so that viruses and bacteria simply no longer work (not least because we need some bacteria, such as intestinal flora, to survive), but we could hypothetically engineer software to have no malware attack surface at all, no means for malware to attack it. That is:
- Humans don’t fear smallpox because we’ve (hopefully) eradicated smallpox.
- Rocks don’t fear smallpox because rocks don’t get diseases.
So wouldn’t it be great if our software were, in this sense, like a rock: impervious even while swimming in a sea of malware?
I think so; but we won’t get to that situation without evolution and a few risk-takers: the sort of people who will plug their computers into any connection and suffer the consequences, from delight through Goatse to having their identities stolen, their data lost and their hardware bricked.
To achieve this evolution we have to reduce our reliance upon security-through-legislation (our tendency simply to declare ‘bad’ things illegal) and upon anti-malware ‘immune system’ models of protection.
To repeat: Rocks don’t need immune systems because they are not prone to disease.
The other problem with anti-malware is that, as with some human diseases, a modern computer virus will switch off the immune system, whilst other malware will walk straight past the anti-malware defences; something which is again reflected in medicine.
So our software should be more like rocks.
To get there we have to evolve our wants, needs, functional requirements, and thinking.
And we need to stop whinging about how bad the threats are and instead go swim in the filth a little.
My late father used to say you’ll eat a peck of dirt before you die – and so I got to play in the dirt, get really muddy on farms and in woodland, and I have a massively good immune system as a consequence.
Computer security must do the same until we evolve software to a rock-like state.
So that’s what I really think.
But yeah: if anyone does this with company hardware, yes, I’ll spank them.
And any geologists who want to argue the metaphor over onion-skin weathering as a threat model or tectonics or whatever, contact me offline. 🙂
Written by: Alec Muffett