19-year-old security flaw in Windows finally fixed
MICROSOFT on Tuesday patched a critical Windows bug that researchers say could allow hackers to remotely take control of users' machines. But the bug wasn't some recent mistake.
The IBM researchers who found it said it had been around for nearly two decades, highlighting the difficulty of spotting and fixing bugs even in code that has undergone extensive review.
"This complex vulnerability is a rare, 'unicorn-like' bug found in code that Internet Explorer relies on but doesn't necessarily belong to," wrote IBM X-Force research manager Robert Freeman in a blog post on the problem.
"Significant vulnerabilities can go undetected for some time. In this case, the buggy code is at least 19 years old and has been remotely exploitable for the past 18 years."
The bug was present as far back as the original release code for Windows 95, he said.
The IBM team said it has not found any evidence that the bug has been exploited. Still, there is a thriving market for previously unknown software bugs, in which cyber criminals and even governments bid for ways to hack into computer systems.
IBM said that this newly discovered bug would have fetched six figures on this market, which occupies a legal grey area.
Windows powers about 90 per cent of computers worldwide.
This is not the first time major flaws have taken years to uncover. In 2010, a Google engineer uncovered a 17-year-old Windows bug that affected all 32-bit versions of the operating system and could be used to hijack PCs.
In September, another problem called Shellshock was discovered in a free software package built into some 70 per cent of all devices connected to the Internet. It could have been introduced as long as 22 years ago, said Chet Ramey, the long-time maintainer of the code.
There are other examples, such as the infamous Heartbleed bug, which emerged in April after going undiscovered for two years.
So why does it take so long for seemingly important problems in critical systems to be discovered and fixed?
Part of it has to do with how software is developed and reviewed. Writing code is not like a traditional engineering task such as building a bridge, where there are clear definitions of whether a project meets its technical specifications. Code is a far messier medium, and it can be hard to know how the individual pieces will behave when combined into a final product.
Developers also do their own assessments of products and, in many cases, hire testers to look for obvious flaws.
But the true test of the security of a piece of software often comes after it has been released. That is when code is exposed to outside security researchers and hackers who start to pick it apart, looking for weaknesses.
Many companies, including Microsoft, offer financial incentives through bug bounty programs to make the process go faster. There are people who make a living searching for bugs and collecting these bug bounties.
But despite all these efforts, no one knows just how many bugs are out there, waiting to be discovered. And, sometimes, it takes decades to find them.