Meltdown and Spectre: ‘worst ever’ CPU bugs affect virtually all computers

Pre-Internet personal computers. It was a brief window in time.

There is still speculation about a bug found in iOS several years ago (the 2014 “goto fail” flaw) where, because a line of code was duplicated, certificate verification was skipped entirely. This is potentially something that someone working within Apple for a Three Letter Agency (TLA) could have slipped past quality control.

2 Likes

I was thinking …

3 Likes

Hehe! ~1300BC to ~1983 or thereabouts. Very brief indeed :wink:

But is that a hack I see?

[image]

4 Likes

The researchers are much cleverer than the computer criminals but, once the exploit is public, the computer criminals suddenly don’t need to be as clever.

There are two aspects to that.

  1. Straight-out release of proof-of-concept code by the researchers.
  2. Once the fix is released, the exploit can be reverse-engineered from the fix (this requires greater cleverness, but still nowhere near as much as finding the exploit in the first place).

Don’t forget that the computer criminals include a lot of governments, who are well-resourced and who employ lots of clever people.

That depends on what your metric for worst is.

If your metric is “largest number of computers on the planet that are vulnerable” then for some Speculative Execution defects, I think the answer is yes, for two reasons …

a) the vulnerability cuts across CPU implementations e.g. Intel and AMD and ARM (is that 99% of all general purpose computers on the planet?)

b) the vulnerability is at the CPU level, which means that all operating systems are vulnerable e.g. Windows and Linux and macOS.

In the early days of this vulnerability, even home computers that were not doing anything silly were still vulnerable.

If your metric is “cost” then maybe not, but

a) it is very hard to ascertain the full cost of real-world exploits because hackers don’t tend to register the event with the police or place ads in the newspaper, and

b) “cost” was prevented by focused and determined generation and application of patches. (Generating patches is itself a cost, and also an opportunity cost: while e.g. Linux kernel and compiler developers were tied up generating band-aids by the thousand, they weren’t doing something more productive.)

Yes, new exploit variants are still being found so we can’t even call time on this one yet.

The “worst ever” label will ultimately be seen to be “puffery”, not intended to have an objective, agreed metric.

1 Like

Again, I would argue that very few individuals were affected by this. Unless you have unknown users accessing your computer, it’s unlikely to be a problem. (One potential exception might be if you have provided sandboxes for your children, and they want to break free.)

While the Spectre/Meltdown flaws are CPU-based and can affect any OS, if a black hat is already in your system then this is the last tool they would turn to in order to gain administrator privileges.

If you’re an admin for an IT network, then you may have reason to worry. If you provide virtual machines to strangers, definitely worry. If you stay in your own world, then bad guys need to punch past your NAT firewall and your computer’s AV/firewall/other defences. Anyone who has the ability to do that doesn’t need these flaws.

There are plenty of flaws that hurt home users more than this class, but this class may be the ‘worst ever’ for certain kinds of business and individual. I would nominate Macromedia/Adobe Flash on websites as one of the ‘worst ever’ ideas. If you’re talking CPU flaws, then probably the FPU flaw in early Pentiums (Pentia?), i.e. the 1994 FDIV division bug, would have to rate fairly highly.

No problem. That’s a different metric that someone could choose - and different metrics get different answers. When it comes to “worst” there is no right or wrong answer.

At least one early variant of these exploits could be carried out with JavaScript i.e. you only needed to visit a web site that was either intentionally malicious (while seemingly harmless) or compromised, unknown to the operator of the web site (the web site having a completely legitimate actual purpose) - assuming that you had enabled JavaScript. That was ‘fixed’ fairly early on.

1 Like

A slideshow with the second image explaining how the term “computer bugs” originated.

https://www.9news.com.au/national/today-in-history-new-pictures-gallery-famous-historical-images-crime-sport-celebrity-world-news-global-events/349cc469-40ad-49f0-b748-c2f482c4b0b5#2

There appears to be a “bug” in your link - due to the design choices of the web site to which you have linked.

The story of the first computer bug is probably inaccurate anyway. https://en.wikipedia.org/wiki/Software_bug#History

1 Like

The second image is now the sixteenth image.

1 Like

Let’s keep it simple. The image is:

(There’s no copyright issue here with Channel 9 because they have just ripped it off from Wikipedia anyway.)

2 Likes

Are you stating that the company responsible for (Australian) 60 Minutes and A Current Affair actually got its facts wrong? I am shocked!

1 Like

Normally I would share your cynicism, but the story of the first computer bug is repeated so often that it might be excusable for Channel 9 in this case.

My point was only that the term “bug” in the sense in which it is now used (frequently) was already in use before the “first computer bug”.

Whether it was actually the first computer bug, or merely the first computer bug caused by an actual bug (insect), I can’t say.

If you watch ACA, that’s on you. :slight_smile:

1 Like