Windows Update Problems

https://choice.community/t/forum-etiquette-site-feedback/14737/15

You might not be aware that that was how it worked for large scientific systems at the time. OS releases came on tapes, but updates were still distributed on microfiche and integrated into the source code that each site held. We checked updates (usually bug fixes) against our local modifications, decided what to add or leave out, often modified them to integrate, recompiled the affected modules and relinked the whole OS. The concept of a DLL was still new (although we had overlay loaders), and DLLs, like virtual memory, were deemed counter to high-performance computing. It was mostly assembler, with C just creeping into OS work. The particular project was the Master Measurements Database, used to recreate potential sources of problems encountered in an orbiting shuttle. It started with an effect and worked back to enumerate probable causes. The host machine, a modest CDC Cyber 74, was located in Johnson Space Center Building 30. We systems types spent a lot of time there, while the applications developers were off-site in an office complex with a dedicated link.

Security? This was in the days of ARPAnet; there was no internet. You were either on-site (or had an equivalent direct connection) or had a dial-up number, and if you were an ‘anointed one’ you might get access via a 9600 baud synchronous modem. For most it was 1200 baud async from a TI Silent 700, teletype, or similar. Quite different from subsequent generations, when (from my perspective) computer scientists essentially created problems through well-meaning but misguided attempts to improve reliability, but that would be a topic of its own, and not consumer related.

2 Likes

Our school actually connected to the university network, and I did work experience at the ANU computer centre in 1984 (which largely involved changing tapes). Later I worked on a team developing a replacement for the local ‘personnel’ system, and was somehow tasked with making repairs to the existing dBase III+ based system’s code (badly; I was never a coder, although I did type in the programs that computer magazines published).

Back in the 1980s and well into the '90s you had control over your computing environment. As soon as you connect to the Internet today you lose some of that control, and your operating system, along with all your other software, must be able to protect you against existing and new threats. Apple and Google do exactly the same as Microsoft, and sometimes their changes break certain software. Apple, for instance, is well known for forbidding Adobe Flash from going anywhere near iOS, because of its known (and unknown) flaws.

And yes, I have been known for my snark at times - I recognise this, and generally try to remain civil - except when it comes to topics such as vaccination and quackery.

1 Like

That is where it went wrong. Computer Science 101 taught that the OS’s number one job was to protect itself from all ‘users’, a seemingly lost art. The ‘how to’ was once no secret, but perverse IBM ‘technology’ and then the microprocessors’ original limitations led to shortcuts such as buffers that could be overrun and mailbox-style service requests to the kernel that evolved into chasms, not just holes, and here we are with magic in multi-layered libraries that use yet more libraries and functions. Make an error in one and, voilà, an exploit hook is deployed. Most contemporary programmers know how to call something like add(A, B, C) but have no idea what happens underneath, nor do they usually care, as long as the return value is as expected.
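For anyone who has never seen what one of those overrunnable buffers looks like, here is a minimal C sketch (the greet function, the 16-byte buffer and the command-line input are purely illustrative, not from any real product): copy untrusted input into a fixed buffer with no length check, and anything past the end tramples whatever the compiler placed next to it.

```c
#include <stdio.h>
#include <string.h>

/* Illustrative only: a fixed 16-byte buffer filled with no length check. */
static void greet(const char *name)
{
    char buffer[16];
    strcpy(buffer, name);   /* no bounds check: any input longer than 15
                               characters plus the terminator overruns it */
    printf("Hello, %s\n", buffer);
}

int main(int argc, char **argv)
{
    if (argc > 1)
        greet(argv[1]);     /* the caller controls the length of argv[1] */
    return 0;
}
```

The cure is as old as the disease: check the length before copying (strncpy, snprintf, or simply rejecting oversized input). The point is how easy the unchecked version is to write and how invisible the problem is to whoever calls it.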

I could continue, but I shall take my leave of this enticing OT direction.

2 Likes

Thanks for the memories. :wink:
The screech and dings from the 300/75 baud modem.
The lightning-fast downloads at 1200 baud from ARPAnet.
The second hand clicking away on the manual timer, tracking the cost of the 1,000 km STD call to the nearest dial-up server.
The long lost mental ability to transform machine code between mnemonic, binary, and hex/octal, with some chance the outcome was predictable. :flushed:

In comparison, for us customers of Microsoft a simple Windows update should be child’s play?

That assumes, firstly, that I still have the mind of a child, and secondly that MS still considers me a customer. I fear I fail on both counts, such is the rate of change and the redundancy of both brain cells and hardware more than a few years old.

I appreciated the previously shared advice on the potential issues with recent Windows Updates. :smiley:

For everything else I use an iPad! Although I suspect Android on a tablet can be equally effective?

4 Likes

Not sure if this is just ‘Windows Updating’ or a problem heading some people’s way, but once again Microsoft is supporting the need for more ‘technology’ through ‘Windows creep’.

3 Likes

It turns out that the part of the patch that broke enterprise AV was a change to a Windows DLL that had an escalation-of-privilege vulnerability.

In other words, it was a security fix. Again, AV makers should not be hooking into key Windows components without using existing APIs.

1 Like

This might not be exactly relevant, but it is a related war story. Back in the mid-1980s, in the early days of PCs, MS-DOS did not have universal video drivers, and every application had to provide its own if it was going to have enough video performance to be acceptable to users. One of my staff (a very clever young man) made a shareware screen saver called Dazzle that was ‘big time’ in its day. It produced unique graphics that would not have been possible using the MS-DOS ‘approved API’. He was getting $2-3,000 per month in donations from release through to the time I lost contact with him in the early '90s. Windows eventually relegated Dazzle to history.

To the topic: one of the evaluated criteria for AVs is overhead. It is quite possible the AV developers made conscious decisions to minimise overhead by going direct rather than being bogged down by what might be high-overhead APIs. You, I, or anyone might think using the approved API is the only way to go, but depending on the outcome, those who skirt the ‘approved API’ sometimes have legitimate reasons.

Now, that having been written, I agree with you that, once warned, they should have focused on adjusting as required.

1 Like

I played quite a few MS-DOS games back in the day, and the problem from my end wasn’t specifically the video drivers, it was the 640 KB memory limitation. Trying to write autoexec.bat and config.sys for a specific game so that all the needed drivers would fit into low memory and the game still had enough space to run was a nightmare!
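For anyone who never had the pleasure, here is a rough sketch of the sort of juggling involved, assuming MS-DOS 5 or later; the paths, driver names and /D: labels are only examples, and every game needed its own variation:

```
REM --- CONFIG.SYS (illustrative) ---
DEVICE=C:\DOS\HIMEM.SYS
DEVICE=C:\DOS\EMM386.EXE NOEMS
DOS=HIGH,UMB
DEVICEHIGH=C:\DRIVERS\CDROM.SYS /D:CDROM1
FILES=30
BUFFERS=20

REM --- AUTOEXEC.BAT (illustrative) ---
@ECHO OFF
LH C:\DOS\MSCDEX.EXE /D:CDROM1
LH C:\MOUSE\MOUSE.COM
```

HIMEM.SYS and DOS=HIGH,UMB pushed DOS itself out of the first 640 KB, while DEVICEHIGH and LH (LOADHIGH) crammed drivers and TSRs into upper memory blocks. And if the game wanted EMS rather than just upper memory, NOEMS had to become RAM and the whole dance started again; the boot menus that arrived with DOS 6 made it only slightly less painful.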

Speaking of backwards compatibility, I thought Windows 10 still had a config.sys. Actually, I just did a search, and unless it’s very well hidden (yes, I display hidden files by default), it has finally been abandoned.

2 Likes

Whilst this is going OT, the reason you did not have video problems is that the games of that era virtually all came with their own video drivers. Remember having to select one from the list during installation or configuration?

Bill Gates was sort of on board with that 640 KB limit, and this overview of the attribution reflects the times, so there you go.

Memories! Not just games, business applications too!

3 Likes

Update? What update?

3 Likes