Martin Pool

First State Super shoots the security messenger?

with 2 comments

Patrick Webster reported a security vulnerability in First State Super’s web site which allowed any member to read any other member’s personal details.

The flaw allowed any logged-in member to access other members’ statements by changing a single digit in the URL in their browser’s address bar.
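This class of flaw is usually called an insecure direct object reference: the server trusts an identifier supplied by the client instead of checking it against the logged-in user. A minimal sketch of the pattern and its fix, in hypothetical Python (nothing here is First State’s actual code):

```python
# Hypothetical illustration of an insecure direct object reference,
# not First State's actual code.

STATEMENTS = {
    1001: "statement for member 1001",
    1002: "statement for member 1002",
}

def get_statement_vulnerable(url_member_id):
    # Vulnerable: the handler serves whatever ID appears in the URL,
    # so any authenticated member can read any statement just by
    # editing the number.
    return STATEMENTS.get(url_member_id)

def get_statement_fixed(session_member_id, url_member_id):
    # Fixed: authorize the request -- the requested statement must
    # belong to the member who is actually logged in.
    if session_member_id != url_member_id:
        raise PermissionError("not your statement")
    return STATEMENTS[url_member_id]
```

The fix is a one-line authorization check, which is part of what makes the original oversight, and the reaction to its disclosure, so striking.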

Although First State initially thanked Webster for the information, they later filed a complaint which sent police to Webster’s home at night, locked his own super account, threatened to recover from him the costs for dealing with the breach, and demanded he give them access to his computer.

Having such a vulnerability doesn’t show a superlative level of software development competence or auditing thoroughness, but security is hard and things slip through even the best organizations.  My superannuation (retirement savings) is with First State, and while I wouldn’t leave them for having the vulnerability, I am seriously considering leaving because of their ham-fisted handling of the disclosure.  FyberOptic paraphrases it as:

“Oh thank you sir for finding my wallet! Now please let me search your house to make sure you didn’t take anything of mine.”

and Troy Hunt says in his open letter to First State:

This is a particularly irrational and unreasonable response to someone whose intent was clearly to ensure the safety of your customers and the integrity of your reputation…  the message [First State has] sent clearly says it is better to leave vulnerable software exposed and at risk of truly malicious activity than it is to privately and [responsibly] inform those who have failed in their duty to properly secure it in the first place.

I can appreciate that Pillar, FSS’s trustee, are in a difficult position when they know that sensitive private customer data has been exposed.  We should also remember that someone who reports a vulnerability can’t immediately be assumed to be wearing a totally white hat.

Patrick Gray from risky.biz managed to get a good phone interview (mp3) with FSS’s CEO Michael Dwyer.

The key mistakes FSS seems to have made here are:

  1. Examining Webster’s computer to “prove the files have been deleted” is risible: if Webster wanted to retain the files he could trivially have copied them to another machine or to a USB stick, especially since he’d been given prior warning that the search might take place.  This is either deep cluelessness, harassment, or a fishing expedition.  Webster stated in his initial email that the files had been deleted, and that’s all the proof they’re likely to get.
  2. By threatening or (mildly) harassing Webster, they don’t reduce the chances that other vulnerabilities will be discovered, only the chance that other vulnerabilities will be reported to them.  Account holders ought to bear in mind that this makes FSS very slightly more risky than it would otherwise have been, and all Australian financial institutions more risky than they could be.
  3. Lawyers generally lean towards an aggressive, claim-the-moon position: it’s the business’s responsibility to strike the right balance, and saying this was “policy” or “legal advice” is a cop-out.  Possibly they would have done better contacting an experienced computer security lawyer, rather than one whose experience seems to be on the financial side.

risky.biz draws a comparison to Google’s system of rewards and recognition for people who report vulnerabilities. Webster did go beyond what is allowed in that program by downloading the records of many other members while demonstrating the flaw, but based on Google’s handling of security so far I think they would have taken a more balanced view.

It was my money and personal information at stake and what I would have liked to see is: thanks to Webster; whatever reporting is legally appropriate; and a serious audit for both previous unauthorized access, and currently open vulnerabilities.

I’m motivated to leave partly because I don’t like giving money to jerks, partly because I don’t want to support a policy that makes internet security worse, and also because I’m concerned that when future holes in First State are discovered, they’ll be exploited before they are reported.

First State, like other industry super funds, has good low fees and reliable returns, due to being not-for-profit and spreading costs across many members.  (The definition of “not for profit” is not entirely clear when both the investments and administration are outsourced, but the fees are low anyhow.)  First State’s customer service is only mediocre: statements arrive three months after the close of the financial year (what are they doing?) and their phone operators have never seemed particularly well informed.  Generally speaking it’s hard to beat industry super funds for value-for-money, though they do also tend to have slightly whiffy connections to the union old-boy network.  Sadly the best reason to stay would be pessimism that anyone else is any better.

updates:

FSS have reached an agreement with Webster and will not be taking legal action against him, though they’re still apparently chasing the mirage of proof that data has been deleted.

The NSW privacy commissioner is investigating FSS: “Any client where there was a potential for their data to be compromised should be advised,” said McAteer.

Asher Moses at the SMH also writes:

First State Super CEO Michael Dwyer said yesterday that there was no evidence that anyone other than Webster had gained unauthorised access to customer accounts. But several computer security consultants who are paid by companies to test their networks, speaking on condition of anonymity, said they highly doubted First State kept logs or had the ability to definitively check either way.

FSS say the statements were in PDF format and were viewed by the person responsible, but from other descriptions of the event this seems unlikely. By Webster’s and media accounts the statements were downloaded but (mostly?) not viewed. There’s a difference; FSS’s sloppiness here is again unimpressive.

Written by Martin Pool

October 17, 2011 at 1:25 pm

Posted in finance, software, tech


Windows 7 visual noise

with 8 comments

I just started up Windows 7 from the dual-boot partition of my new Thinkpad X201 (which is mostly running Ubuntu, of course.)

The amount of visual noise in the default browser window is really pretty shocking, for a release that’s supposed to be about giving a clean experience:

  • Five different fonts.
  • Two Bing search fields, with different icons.
  • Two controls with different icons to email the page.
  • Jarring misalignment between the controls.
  • Four different button decoration styles: rounded borders, no borders, square borders, and jelly-style round buttons.
  • Some icons are nearly-monochrome and some are brightly coloured without this conveying any information.
  • Two different divider styles: sloping s-curves vs raised dots.
  • If you press Alt, everything in the window jumps around as the menu bar appears.

Not great.

Written by Martin Pool

March 17, 2011 at 12:56 pm

Posted in software


duplicity SSL performance to Amazon S3

leave a comment »

I just rediscovered (and may now try to fix) duplicity bug 433970, which is basically pointing out that using SSL to talk to Amazon S3 within duplicity gives you about one-sixth the sustained throughput of plain HTTP. It’s quite an interesting result, and potentially has some consequences for Launchpad, which serves just about everything over SSL and is not as fast as one might like.

Using SSL is a bit redundant because the duplicity backups are typically gpg-encrypted anyhow. I think, without this, the worst exposure would be that people on the network between you and S3 could see that you were using duplicity, the bucket name, and the timestamps of your previous backups.

I’m not sure what the actual cause is, and it may not be directly comparable to Launchpad. I certainly see a lot of upstream traffic as, I suppose, the client does SSL-level synchronization. It may well be something about this particular implementation, either on the client or the Amazon side.
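To put the roughly 6x figure in concrete terms, here is a back-of-envelope sketch; the transfer sizes and times below are purely illustrative, not actual measurements from bug 433970:

```python
# Illustrative arithmetic only -- not measured data.

def throughput_mb_per_s(total_bytes, seconds):
    """Sustained throughput in megabytes per second."""
    return total_bytes / seconds / 1e6

# If (hypothetically) a 600 MB backup takes 100 s over plain HTTP
# but 600 s over SSL, the sustained rates differ by the ~6x factor
# described in the bug.
plain_http = throughput_mb_per_s(600e6, 100)  # 6.0 MB/s
over_ssl = throughput_mb_per_s(600e6, 600)    # 1.0 MB/s
```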

Written by Martin Pool

October 13, 2010 at 4:18 pm

Bazaar 2.1 retrospective

leave a comment »

bzr 2.1 retrospective on the list; it’s been a really good cycle and I feel much more in flow.

Written by Martin Pool

February 19, 2010 at 4:37 am

Posted in none


Subunit Tribunal

leave a comment »

I’ve been hacking recently on Tribunal and I’m very excited about what can be done here.

Tribunal, started ages ago by jml, was a graphical viewer for Python or Twisted test runs: you give it a class name, it runs the tests in it, and tells you which ones failed.

Some new and exciting things have been happening recently in pyunit-friends: a cluster of cooperating libraries and tools to improve testing in Python. In particular, subunit is a generic protocol for serializing/externalizing test results (similar to, and convertible to and from, junitxml).

A test result serialization protocol enables several interesting things (for instance, transparent remote or distributed test runs), and one of them is a GUI for test results that is able to

  • be well isolated from the process under test
  • save, load, and compare historic test runs — never worry again about unreproducible test failures
  • run and inspect tests in any language or on any platform, or tests run on one or several remote machines
  • click to edit or re-run a test
  • find interesting or similar failures

Anyhow, that’s the idea, but it’s early days. As of now, the trunk branch can read subunit output from a pipe or a file and filter the results.
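The core idea is small enough to sketch. The toy below is not the real subunit wire format, just an illustration of why serializing results matters: once each result is a line of text with a status and a test id, results can cross a pipe between processes, be saved to a file, or be filtered, exactly the things the list above depends on:

```python
# Toy illustration of a test-result serialization protocol.
# This is NOT the real subunit format, just the shape of the idea.

def serialize(results):
    """results: list of (test_id, status) pairs -> one line each."""
    return "\n".join(f"{status}: {test_id}" for test_id, status in results)

def parse(stream_text):
    """Inverse of serialize: recover (test_id, status) pairs."""
    out = []
    for line in stream_text.splitlines():
        status, _, test_id = line.partition(": ")
        out.append((test_id, status))
    return out

def failures(results):
    """Filter step a GUI like Tribunal would apply."""
    return [test_id for test_id, status in results if status == "failure"]
```

Because the intermediate form is just a stream, the producer (the test runner) and the consumer (the GUI) can live in different processes, on different machines, or run at different times, which is where the isolation and save/load/compare features come from.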

I’m planning to rip out much of the existing code, keeping the concepts, and keeping the ability to run Python or Twisted tests as a mode.

Tribunal screenshot

Written by Martin Pool

February 12, 2010 at 9:40 pm

Posted in tribunal


higher velocity in losing your luggage

leave a comment »

I’m in Terrassa, Spain, for the Canonical allhands meeting before UDS Karmic.

I brought my motorcycle helmet with me as special-handling checked luggage, for a ride around here next weekend.  I think it missed the connection in London, but it showed up today apparently unharmed so all is well.

But as it happens I read a great Malcolm Gladwell essay which mentions this topic in passing — a real example of how we can take stupid inefficient processes for granted when they’ve existed for a long time:

Ranadivé [founder of TIBCO] views this move from batch to real time as a sort of holy mission. The shift, to his mind, is one of kind, not just of degree. “We’ve been working with some airlines,” he said. “You know, when you get on a plane and your bag doesn’t, they actually know right away that it’s not there. But no one tells you, and a big part of that is that they don’t have all their information in one place. There are passenger systems that know where the passenger is. There are aircraft and maintenance systems that track where the plane is and what kind of shape it’s in. Then, there are baggage systems and ticketing systems—and they’re all separate. So you land, you wait at the baggage terminal, and it doesn’t show up.” Everything bad that happens in that scenario, Ranadivé maintains, happens because of the lag between the event (the luggage doesn’t make it onto the plane) and the response (the airline tells you that your luggage didn’t make the plane). The lag is why you’re angry. The lag is why you had to wait, fruitlessly, at baggage claim. The lag is why you vow never to fly that airline again. Put all the databases together, and there’s no lag. “What we can do is send you a text message the moment we know your bag didn’t make it,” Ranadivé said, “telling you we’ll ship it to your house.”

It would be nice if the steward could come up during the flight, tell me my bag hadn’t made it, and then ask for my hotel details to deliver it. It would have saved most of an hour waiting at the airport. (And if you count all the passengers waiting in line with their travel companions, several person-days just for that one flight…)

Written by Martin Pool

May 17, 2009 at 4:24 pm

Posted in none

Ross Gittins on the Carbon Pollution Reduction Scheme conundrum

with 3 comments

Ross Gittins explains why the draft Carbon Pollution Reduction Scheme legislation seems stuck: Labor doesn’t have the votes in the Senate without either the Greens (who won’t compromise), or the Liberals (who don’t know what they want) or the Nationals (“agrarian populism”).

Rudd’s initial proposal was purpose-built to be irresistible to the Coalition. It adopted the lowest possible go-it-alone emissions reduction target – 5 per cent – and a pathetically low 15 per cent reduction in the event of an international agreement in Copenhagen in December.

It accommodated the demands of business lobby groups to an extent Rudd’s own expert, Professor Ross Garnaut, found repugnant. … Rudd offered the Coalition a scheme little different to the one it took to the last election (both schemes having been designed by the same bureaucrats). What was Malcolm Turnbull’s reaction? Nothing doing. He rejected it, contriving to claim it was simultaneously too weak and too tough.

Clive Hamilton in Crikey believes that Labor could force it through the Senate if they had the balls. I don’t know. Maybe there is some brinksmanship here in the hope the Greens will at the last minute see high but realistic targets as a lesser evil, or that the power struggle in the Liberals will resolve.

The climate-skeptical position of the Nationals, though apparently firmly set, is bizarre to me, because their rural constituency may suffer more than anyone else from climate change. The few farmers I know personally are firmly convinced, because they have to adapt to changing temperatures and rainfall by destocking land or growing new crops.

Written by Martin Pool

May 6, 2009 at 12:50 am
