The Effect of Snakeoil Security

Saturday, September 04, 2010
Cross-Posted from Robert "RSnake" Hansen's Blog:
http://ha.ckers.org/blog/20100904/the-effect-of-snakeoil-security/

15 posts left…

I’ve talked about this a few times over the years during various presentations, but I wanted to document it here as well. It’s a concept I’ve been wrestling with for 7+ years, and I don’t think I’ve made any headway in convincing anyone beyond a few head nods. Bad security isn’t just bad because it allows you to be exploited. It’s also a long-term cost center. But more interestingly, even the most worthless security tools can be proven to “work” if you look at the numbers. Here’s how.

Let’s say hypothetically that you have only two banks in the entire world: banka.com and bankb.com. A Snakeoil salesman goes up to banka.com and convinces them to try his product. Banka.com is seeing increased fraud (as is the whole industry), and they’re willing to try anything for a few months. Worst case, they can always get rid of it if it doesn’t do anything. So they implement Snakeoil on their site. The bad guy takes one look at the Snakeoil and shrugs. Is it worth bothering to figure out how banka.com’s security works and potentially having to modify his code? Nah, why not just focus on bankb.com, double up the fraud there, and continue doing the exact same thing he was doing before?

Suddenly banka.com is free of fraud. Snakeoil works, they find! They happily let the Snakeoil salesman use them as a case study. So our Snakeoil salesman goes across the street to bankb.com. Bankb.com has strangely seen a twofold increase in fraud over the last few months (all of banka.com’s fraud plus their own), and they’re desperate to do something about it. The Snakeoil salesman is happy to show them how much banka.com has decreased their fraud just by buying his shoddy product. Bankb.com is desperate, so they say fine and hand over the cash.

Suddenly the bad guy is presented with a problem. He’s got to find a way around this whole Snakeoil software or he’ll be out of business. So he invests a few hours, finds an easy way around it, and voila - back in business. The bad guy again spreads his fraud across both banks. Banka.com sees fraud climb back to the old levels, which can’t be correlated to anything having to do with the Snakeoil product. Bankb.com sees their fraud drop immediately after installing the Snakeoil, thereby “proving” that the product works a second time, if you just look at the numbers.
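
To make the arithmetic concrete, here is a toy sketch (Python, with entirely made-up numbers) of the scenario above. The fraud totals are purely illustrative; the point is that each bank’s local statistics make the product look like a win while the global total never moves.

# Toy model of the two-bank Snakeoil story; all numbers are invented.
# Total fraud is held constant to show the product only displaces fraud.

TOTAL_FRAUD = 100  # hypothetical units of fraud per period

periods = [
    # (description,                        banka share, bankb share)
    ("Before anyone buys Snakeoil",        0.5, 0.5),
    ("banka.com deploys Snakeoil",         0.0, 1.0),  # attacker shifts to bankb
    ("bankb.com deploys; attacker adapts", 0.5, 0.5),  # bypass found, back to normal
]

print(f"{'Period':38} {'banka':>7} {'bankb':>7} {'total':>7}")
for label, a_share, b_share in periods:
    a, b = TOTAL_FRAUD * a_share, TOTAL_FRAUD * b_share
    print(f"{label:38} {a:7.0f} {b:7.0f} {a + b:7.0f}")

# banka.com's fraud drops to zero right after it buys, and bankb.com's fraud
# halves right after it buys, yet the total is always 100 -- both "wins"
# are just the attacker moving around.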

Meanwhile, what has happened? Are the users safer? No, and in fact, in some cases it may even make the users less safe (incidentally, we did finally manage to stop AcuTrust, as the company is completely gone now). Has this stopped the attacker? Only long enough for him to work around it. What’s the net effect? The two banks are now spending money on a product that does nothing, but they are convinced it is saving them from huge amounts of fraud. They have the numbers to back it up - although the numbers are only half the story. Now there’s less money to spend on real security measures. Of course, if you look at it from either bank’s perspective the product did save them, and they’ll vehemently disagree that it doesn’t work, but in bankb.com’s case it also created the very problem it solved (double the fraud).

This goes back to the bear-in-the-woods analogy, which I personally hate. The story goes that you don’t have to run faster than the bear, you just have to run faster than the guy next to you. That’s a funny story, but it only works if there are two people and you only encounter one bear. In a true ecosystem you have many, many people in the same business, and you have many attackers. Leaving your competitor(s) out to dry may seem good for you in the short term, but in reality you’re feeding your attacker(s). Ultimately you are allowing the attacker ecosystem to thrive by not reducing the total amount of fraud globally. Yes, this means that if you really care about fixing your own problem you have to help your competitors. Think about the bear analogy again. If you feed the guy next to you to the bear, the bear is satiated. That’s great for a while, and you’re safe. But when the bear is hungry again, guess who he’s going after? You’re much better off working together to kill or scare off the bear.

Of course, if you’re a short-timer CSO who just wants a quick win, guess which option you’ll be going for? Jeremiah had a good insight about why better security is rarely implemented and why sweeping security changes are rare inside big companies. CSOs are typically only around for a few years. They want to go in, make a big win, and get out before anything big breaks or they get hacked. After a few years they can no longer blame their predecessor, either. They have no incentive to make things right or to go for huge wins. Those wins come with too much risk, and they don’t want their name attached to a fiasco. No, they’re better off doing little to nothing, with a few minor wins they can put on their resume. It’s a little disheartening, but you can probably tell which CSOs are which by how long they’ve stayed put and by the scale of what they’ve accomplished.

Susan V. James: While a great strategy in theory, a unified approach to security across an industry requires trust and cooperation among competitors. When you consider that levels of trust and cooperation will vary (not everyone will be 100% "in", even if they say they are or *believe* they are), there are still opportunities for exploitation, both by the hackers and by the members of the "ring of trust" itself. At least you know the hacker is the bad guy, right? But to assume that the enemy of your enemy is your friend, well... maybe that's not entirely true.

And then there's the problem of scale to consider. Assuming you can get your industry to cooperate and make things safe among your own community, the predator will simply go sustain himself on the weaknesses of another industry. So then, do you annex the next community under attack and drive the hacker further away? If you do, realistically, you'll be caught up in this behavior until you've overextended your reach, and everyone in the ring of trust becomes vulnerable again because the ring has grown too large to manage the behavior of its members.