There is news that women’s clothing website Unique Vintage has sent notifications to its customers that the site was breached and customer information was exposed. What is interesting is that the website is fully PCI compliant, i.e. it follows all the security rules set forth by the credit card industry. And still, it appears, credit card numbers, among other information, were stolen. And this went on for more than a year and a half before being detected.
There is no substitute for proper design and security diligence. Following the rules set down in a book will only get you so far. The attackers do not follow any book strictly, so neither should you.
I do not often feel the urge to comment on the news, so today is a special day.
The first piece, an article on the popular subject of NSA Web surveillance quoting some well-known people, starts off in a good direction but somehow gets derailed into recommending obscurity for security. Strange as it sounds, we really should consider anonymizing our access to the Internet. The problem, though, is that we cannot anonymize the most important part of our Internet access, the part where we really need our real identity, and that is precisely the part that delivers the most information about us. Sorry, it is not going to work.
I was wondering earlier what Canada’s situation is in relation to the NSA scandal, and the article on Canada’s part in the NSA plan revealed that we cannot count on Canada to be impartial in the matter. They are in on it, and quite likely Blackberry is no better a choice than other U.S.-controlled mobile phones.
I cannot remember when I first heard that “passwords are dead”; it must have been years and years ago, but this same mantra is repeated over and over again every year. Now the passwords are dead at Google. Well, tell you what: long live passwords!
And suddenly Vint Cerf, one of the people present at the very beginnings of the Internet, is preaching for the devil. He works for Google, of course, so his opinion that we all should “give up a degree of privacy in order to be protected” is likely Google’s, not his own. On the other hand, if you ask me, he should watch what he says: people believe him more or less unconditionally, and he has a moral obligation not to peddle the loss of privacy for all of us.
Here you go. I seem to disagree with nearly all of the news today. Which is good news!
I see that HTC finally got whacked over the head for the lack of security in their Android smartphones. I will have to contain myself here and leave aside the inherent issues surrounding Android, its security and model of operation that will hurt … Ok, ok, I’ll stop now. So, HTC got dragged into court in the US for an improper implementation of software that allows remote attackers to steal various data from your smartphone. Big news. The problem is they settled and are not likely to actually do anything about it. Anyway, that’s not the interesting part.
The interesting thing is that the regulators complained that HTC did not provide security training to the staff and did not perform adequate security testing:
The regulator said in a statement that HTC America “failed to provide its engineering staff with adequate security training, failed to review or test the software on its mobile devices for potential security vulnerabilities (and) failed to follow well-known and commonly accepted secure coding practices.”
Most companies ignore security hoping that the problem never comes. This shortsighted view is so widespread that I feel like Captain Obvious repeatedly talking about it. But I suppose it bears repeating. Security risks are usually discarded because they are of low probability. However, their impact is usually undervalued, so the resulting risk analysis is not quite what it should be. The security problems prevalent in software are usually of such magnitude that they can easily cost even a large business dearly.
Ignoring security is not a good idea. It is like an elevator company ignoring the possibility of a person dying trapped in one of its elevators. An elevator company will do all it can to prevent even a remote chance of that happening, because if it does happen, they can easily be out of business in no time. The same approach should be taken for granted by software companies, and the sooner, the better. A security problem can put a company out of business. Be forewarned.
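The risk arithmetic behind this argument is worth spelling out. A minimal sketch, with entirely made-up numbers (the point is the shape of the calculation, not the figures): a “low probability” event with a very large impact still carries a substantial expected loss.

```python
def expected_annual_loss(probability: float, impact: float) -> float:
    """Classic single-event risk estimate: likelihood of occurrence
    in a year, multiplied by the cost if it occurs."""
    return probability * impact

# Hypothetical: a 2% yearly chance of a breach costing $10M to survive.
# "Only 2%" sounds ignorable -- until you multiply it out.
risk = expected_annual_loss(0.02, 10_000_000)
print(f"Expected annual loss: ${risk:,.0f}")  # Expected annual loss: $200,000
```

Underestimate the impact by an order of magnitude and the whole risk analysis is off by the same factor, which is exactly how security ends up discarded as “unlikely.”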
I am reading through the report of a Google vulnerability on The Reg and laughing. This is a wonderful demonstration of what attackers do to your systems – they apply the unexpected. The story is always the same. When we build a system, we try to foresee the ways in which it will be used. Then the attackers come and use it in a way we did not expect. So the whole security field can be seen as an attempt to expand the expected and narrow down the unexpected. Strangely, the unexpected always keeps getting bigger. And, conversely, the art of being an attacker is the art of the unexpected: shrugging off conventions and expectations and moving into the unknown.
Our task, then, is to do the impossible: to protect our systems from the unexpected and, largely, the unexpectable. We do it in a variety of ways, but we have to remember that we are always on the losing side of this battle. There is simply no way to turn all of the unexpected into the expected. So this is the zen of security: protect the unprotectable, expect the unexpectable, and keep doing it in spite of the odds.
The US House of Representatives published an interesting report about its concerns with Huawei and ZTE, large Chinese telecom equipment providers. The report states openly that there are concerns that the equipment, parts and software may be manipulated by Chinese government agencies, or on their behalf, in order to conduct military, state and business intelligence. The investigation behind the report did not dispel those concerns but, if anything, made them better founded. We have to keep in mind that this is a highly political issue, of course. But even so, citing such concerns underlines what we have been talking about for several years now: the supply chain is a really important part of your product’s security, and blindly outsourcing things anywhere is a security risk.
NIST has announced the end of the Secure Hash Algorithm competition the day before yesterday, naming Keccak as the winner and making it the SHA-3 algorithm. The complete announcement from NIST is here.
One thing of note is that since the algorithm was developed by a team from STMicroelectronics and NXP Semiconductors, it is heavily optimized for use in smart cards. According to the announcements, it is both compact and fast when implemented in hardware. Which makes it once again very well suited to some applications and difficult to use for others (like password hashing).
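The distinction matters in practice: a fast hash is exactly what you want for integrity checking and exactly what you do not want for password storage, where speed only helps the attacker. A quick sketch using Python’s hashlib (which gained SHA-3 support in Python 3.6; the salt and iteration count below are illustrative, not recommendations):

```python
import hashlib

# SHA-3 is designed to be compact and fast -- fine for checksums
# and signatures, wrong for storing passwords.
digest = hashlib.sha3_256(b"some document contents").hexdigest()

# For passwords, use a deliberately slow, salted derivation function
# such as PBKDF2, so that brute-forcing each guess is expensive.
salt = b"per-user-random-salt"  # in real code: os.urandom(16), stored per user
key = hashlib.pbkdf2_hmac("sha256", b"correct horse battery staple",
                          salt, iterations=600_000)
print(len(digest), len(key))  # 64 hex characters, 32 bytes
```

The asymmetry is the whole point: one SHA-3 call costs nanoseconds, while the 600,000-iteration derivation costs a noticeable fraction of a second per guess.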
“The world’s largest professional association for the advancement of technology” has been thoroughly embarrassed in an incident where it left log files containing user names and passwords open for FTP access to all on the Net for more than a month, according to a DarkReading report. Or, at least, I think they should be embarrassed, although they do not seem to be.
The data of at least 100,000 members were exposed, and IEEE took care to close the access. However, the exposure of the log files is not what I think they should be embarrassed about. As things go, mistakes in configuration happen and files may become exposed. That’s just life.
However, what is really troublesome is that IEEE, the “world’s largest professional association for the advancement of technology” (according to themselves), logged the usernames together with the passwords in plaintext. I mean, we know that’s bad, and it has been known to be bad for at least a couple of decades. They are definitely at least a couple of decades behind on good security practices. I think that’s really embarrassing.
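The decades-old baseline they missed has two parts: never write the credential itself to a log, and store only a salted, slow hash of it. A minimal sketch of both (the function names, salt size, and iteration count are my own illustrative choices, not any particular system’s):

```python
import hashlib, hmac, logging, os

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("auth")

def store_password(password: str) -> tuple[bytes, bytes]:
    """Keep only a per-user salt and a slow, salted hash -- never the password."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 600_000)
    return salt, digest

def check_login(username: str, password: str, salt: bytes, digest: bytes) -> bool:
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 600_000)
    ok = hmac.compare_digest(candidate, digest)
    # Log the attempt, not the credential: the username and outcome are enough.
    log.info("login %s for %s", "ok" if ok else "failed", username)
    return ok

salt, digest = store_password("hunter2")
print(check_login("alice", "hunter2", salt, digest))  # True
```

With this shape, an exposed log file leaks only usernames and outcomes, and an exposed database leaks only hashes that are expensive to reverse. Neither exposure hands out working credentials the way a plaintext log does.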
I stumbled across an article on car software viruses. I did not see anything unexpected, really. The experts “hope” to get it all fixed before the word gets out and things start getting messy. Which tells us that things are in pretty bad shape right now. The funny thing, though, is that the academic group that did the research into vehicle software security was disbanded after working for two years and publishing a couple of damning papers, demonstrating that “the virus can simultaneously shut off the car’s lights, lock its doors, kill the engine and release or slam on the brakes.” An interesting side note is that the car’s system can be used to “remotely eavesdrop on conversations inside cars, a technique that could be of use to corporate and government spies.” This stands in stark contrast to what car manufacturers are willing to disclose: “I won’t say it’s impossible to hack, but it’s pretty close,” said Toyota spokesman John Hanson. Basically, all you can hope for is that they are “working hard to develop specifications which will reduce that risk in the vehicle area.” I don’t know, mate, I think I’d better stick with the good old trustworthy mechanical stuff. I guess I know too much about software security for my own good. I can’t help feeling these cars will inevitably be hacked. Scared? If there is a manual override for everything – not so much, but… The second-hand car market suddenly starts looking very appealing by comparison…
Google has been fined $22.5 million for breaching its privacy commitment and bypassing Apple’s Safari users security settings. As the article in Mercury News comments, citing Consumer Watchdog, “the commission has allowed Google to buy its way out of trouble for an amount that probably is less than the company spends on lunches for its employees and with no admission it did anything wrong.”
The days when Google’s motto “don’t be evil” could be taken literally are long gone. Beware.
TechRepublic has an interesting article “Website and app security tips for software developers” that talks in a very short space about a whole bunch of things, from the “shelf life of software developers” to the advice on security for the website developer.
In particular, it provides an interesting insight into why a person thoroughly familiar with security made security mistakes again and again.
I know why I made those mistakes — it was either the hubris of “I can roll my own better than off-the-shelf,” or the idea that slapping something together quickly would be fine “for now” and I would pay the technical debt off later. I was wrong on both counts, every single time.
How often do we get trapped like that?