Memories: The Days of More Secure Computing :D

It's this newfangled stuff that can be reprogrammed from the O/S that is terrifying - give me an ASR33 with paper tape, a system with front panel register switches, or a new tube of ROMs to replace by hand any day :wink:

2 Likes

Many times I remember getting a foot full of prongs from discarded ROM chips :slight_smile:

1 Like

I came after the dinosaur era.

My first work experience was in the ‘computer room’ of the local university. I would load and unload tapes based on student requests. We had a row of these IBM vacuum-sealed tape decks into which you would load the tape, then press three or four buttons, and then step back carefully in the hope that it all worked.

In fact, it was a bank of them, something like this (but without the woman). Fascinating work experience :roll_eyes:.

At the same time, I had an Apple ][+ at home, and was a member of the local ‘club’. Changing computer tapes had limited appeal for me.

4 Likes

Ah, tapes. My first full-time position was with a computer-output-to-microfilm lab.

Some customers sent us beautiful 6250 bpi tapes. You would load one of those, and a machine would sit happily for hours, producing an entire roll of microfilm.

For cost reasons, internal jobs came to us on 300 bpi tapes that had had long, hard lives. You would load/unload/reload these horrors over and over and over. I recall approximately half of the unloads would be followed by the operator throwing the offending spool clear across the room. Ah, happy days. :joy:

3 Likes

One of my first jobs was at NASA doing SCADA OS work on an SEL810 used for climate research instrumentation (lidar, ~1976). When they gave me the intro, they pointed out a gray spot on the otherwise government-cream cinder block wall. The 810 had a vacuum-column tape drive that was old enough that there were zero parts available; the hub clips were badly worn, and they had no budget for a replacement. On high-speed rewind the tape would spool off the take-up reel and out of the vacuum columns; the tape would then pop off the hub and go crashing into that spot on the wall, each and every time. Yet the researchers got their job done.

There was another 810 SCADA system controlling the 6x28 wind tunnel, also one I ‘cared for’. This is that machine, and some of my workmates. Amazing what you can find on the net!

6 Likes

I dug a few treasures out on the weekend, now on display in the living room: a couple of memory cards I salvaged, one core memory card of unknown size from an old ‘accounting machine’ I helped run before we replaced it with a Rainbow (from memory), and another 4 kB card from an HP box, plus an old hard drive from the same accounting machine, with a ceramic platter about 8 mm thick and 3 fixed heads, weighing around 5 kg without motor/belt/enclosure.


I don’t remember ever getting hacked or having a virus back then, when ‘connectivity’ meant a boot-load of tapes and transmission speed meant how fast you drove :wink: Mind you, it’s still hard to beat the transmission speed of a briefcase full of drives and an airline ticket, especially cross-country.


6 Likes

Glorious photos! Thanks for sharing those!

2 Likes

I remember when I was still in primary school (that wasn’t yesterday) my uncle taking me into the CBD in Melbourne to see the first computer room that had been set up by a major company. T&G Insurance, I think.

It seemed like something out of science fiction to me. This big glass-panelled room with all these consoles with tapes on them. He told me this was the way of the future. I remember thinking, "Oh yeh, will never happen." How wrong I was.

4 Likes

I also remember using these in the late 1970s to mid-1980s, learning to program at school. I also remember being frustrated when an accidental or unknowing mistake meant the reader refused the cards. It was impossible to transfer virus code using such systems.

3 Likes

Sometimes it was also impossible to transfer (install) code because of the fickle nature of the beasts! In the early '80s the Control Data SCOPE 2 operating system, although well aged, was still going strong on CDC 7600 systems, and the only way to get device drivers in was via punched binary cards! Nobody used the reader except us systems types, and we sometimes spent an entire morning getting a driver read in properly. Oh, the good old days!

4 Likes

When I started coding in COBOL in '82, my employer still insisted that programmers use coding sheets, which were then delivered unto the card punch pool. The card punch operators never bothered to replace the ribbons in the punch machines, which meant the cards' sequence numbers were never printed where you could read them.
There was many a trip (some deliberate) twixt the punch room and the card reader. Other staff thought it such fun to have a moaning programmer on the floor with hundreds of unsorted cards.

We had high-speed printers that used punched paper tape loops to control printing on various types of continuous forms.
One dastardly trick was to pause a print run, remove the loop, cut it, introduce a half-twist before taping it back together, and replace it inside the printer.
The fiend would then retire to a safe distance and wait for the return of the trainee operator, who would go ballistic when they discovered that the printer had been possessed by gremlins. :tired_face:

4 Likes

The rightmost punched columns were usually sequence numbers for programming text and always for binary decks. Those of us in geekdom of the era could pick up a few hundred randomly shuffled binary or ‘sequenced blank text’ cards, no ink needed, and sort them as easily as a deck of playing cards. Sometimes we were the emergency response squad when someone had a serious ink-free or binary dropsie. :wink:

3 Likes


Back as far as version 4!! And yes, it is interesting to see some of the critical places where ancient hardware and software are still in use.


1 Like

Once upon a time hacking was just figuring out how to use a computer. CDC NOS was pretty powerful in its day and the host for the vast majority of number-crunching applications from the late 1970s to the mid-1980s, before the world ran past it and Control Data quite quickly.

http://phrack.org/issues/18/5.html

1 Like

I started programming in 1965 on an English Electric Leo Marconi at the Shell Company in Melbourne. Graduated from there to the Olympic Tyre and Rubber Company, which installed the first ICT 1900 in Melbourne. In fact, we graciously agreed that ICT (later ICL) could install our intended ICT 1900 in their own offices so that they could learn about it before we took delivery of ours.


4 Likes

This is how to move real data!


> an Exabyte-scale data transfer service


But wait, there’s more.


> including dedicated security personnel, GPS tracking, alarm monitoring, 24/7 video surveillance, and an optional escort security vehicle while in transit.

Have you got your top-of-the-line NBN connection?

> With Snowmobile, you can move 100 petabytes of data in as little as a few weeks, plus transport time. That same transfer could take more than 20 years to accomplish over a direct connect line with a 1Gbps connection.
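A quick back-of-the-envelope check on that claim: 100 PB is roughly 8 × 10^17 bits, and a 1 Gbps link moves about 10^9 bits per second, so the transfer would take around 8 × 10^8 seconds, a little over 25 years, even before any protocol overhead. "More than 20 years" is, if anything, generous.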

4 Likes

Yes, those were the days. Not sure how that makes Choice more effective?

One point relevant to a little later in time, the 1990s, was that the predecessor to the internet as we know it relied on the remote client acting only as a dumb terminal. The host did all the work. There was limited scope for anyone to take over your computer by remote control; it required a deliberate act by the user to install and run code. Or perhaps you remember malware the user had collected by other means?

Microsoft et al. created the monster through the multitasking and parallel-processing abilities that came with Windows etc., and the functionality built into web browsers. This was all to give us a better consumer experience (including free cookies!). This new ecosystem allowed the remote end to use the computing power of the client, offloading work from the host and giving us all a richer experience. Also great for marketing, tracking and spying, as it has turned out.

Whether we asked for all of that does not matter. We accepted it. Ignorance is still no excuse. There might just be too many of us in that category.

1 Like

Historically it was a combination of losing sight of Operating Systems 101 (protect yourself from the user at all costs) and the limited power of the early microprocessors. Had OS 101 been properly implemented on those microprocessors, there would not have been anything left to ‘entertain’ the PC user, much less write a letter or play Adventure :wink:

So, instead of following the sacred norms of security, they opted for what I will, rightly or wrongly, call the IBM model of trusted global message passing and mailboxes in the name of ‘efficiency’, and in the case of a PC, just being able to get ‘it’ done. FWIW, decades ago IBM gave (exposed?) their OS code to Japan Inc. as IBM moved on to their next one, and probably set Japan back about 10 years. Clever!

An example I always found curious is double buffering being considered the standard; I never understood why, although it can be brutally efficient. Ever hear of circular buffering? With it, it was impossible for a buffer overrun to plant executable malicious code. Re overruns, do not get me going on the lack of security introduced by the lackadaisical use of getc and friends!

which is the rest of the story.
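For anyone who never met one, here is a minimal sketch of a circular (ring) buffer in C. The names and the capacity are illustrative, not taken from any particular system; the point is that every write goes through a wrapped index and a fullness check, so a producer can never write past the end of the array:

```c
#include <stddef.h>

#define RING_SIZE 16          /* illustrative capacity */

/* A minimal ring buffer: writes chase reads around a fixed array. */
struct ring {
    unsigned char data[RING_SIZE];
    size_t head;              /* next slot to write */
    size_t tail;              /* next slot to read */
    size_t count;             /* bytes currently stored */
};

/* Store one byte; refuses when full rather than overwriting memory. */
int ring_put(struct ring *r, unsigned char c)
{
    if (r->count == RING_SIZE)
        return -1;                        /* full: caller must drain first */
    r->data[r->head] = c;
    r->head = (r->head + 1) % RING_SIZE;  /* index wraps, never runs off the end */
    r->count++;
    return 0;
}

/* Fetch one byte; refuses when empty. */
int ring_get(struct ring *r, unsigned char *out)
{
    if (r->count == 0)
        return -1;                        /* empty: nothing to read */
    *out = r->data[r->tail];
    r->tail = (r->tail + 1) % RING_SIZE;
    r->count--;
    return 0;
}
```

Contrast that with getc's worst co-conspirator, gets(), which reads into a flat array with no length check at all; that unchecked write past the end of the buffer is exactly the overrun that lets someone plant executable code.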

getc has no friends - it has co-conspirators!! :wink:

It wasn’t always ‘secure’ back in the old days - if you’ve never read Ken Thompson’s Turing Award Lecture “Reflections on Trusting Trust”, I thoroughly recommend it.

2 Likes

You cannot blame Microsoft for the security problems we have now. The problems coming home to roost at the moment stem from a range of causes, but there is one central issue: security.

Until relatively recently, security was an afterthought - and it remains an afterthought for many IoT creators whose devices are subsequently hijacked. When the Internet was being designed, it was by nerds for nerds - nobody expected it to be so successful, or for there to be so many computers! Security was an afterthought even when developing standards like the Hypertext Transfer Protocol (HTTP) - HTTPS came only as an add-on. And there was a penalty for being secure: computers had to perform complex cryptographic calculations, and were very slow at doing this - again, until relatively recently.

The user wanted things to work, and to be fast - they got what they wanted. Even in the development of the replacement of the current HTTP standard, people are still arguing that it should be insecure by default!

On the desktop, Microsoft designed an operating system for stand-alone computers - and then added network functionality. If you look back at those early versions of Windows, everything was open by default. Nobody had firewalls until Shields-Up came along - and then a while later Microsoft realised that this was a function that should be performed in the OS. And of course this was in the days before NAT routers - so your entire network was visible.

Then there’s the idea of protecting the user from themselves, by having ‘user’ and ‘admin’ logins. Hands up if you access the Windows environment on your home machine with an account that does not have admin access. Congratulations if you fit in that group - you’re a small minority. I tried it for a day or two - but tend to change my system so often that it drove me crazy (as Choice Community members can see from my posts here).

Hopefully Windows will continue to improve its security, which has already improved in leaps and bounds but remains well behind Unix/Linux/BSD. That said, can we trust any closed-source operating system? Can we trust our Intel chips not to have back doors for the NSA? Apart from the highly publicised Active Management Technology, or AMT, of course. (By the way, if you want to limit the power of your motherboard to ‘phone home’ using the AMT, just use an external ethernet connection - AMT bypassed.)

In other words, technology was historically insecure. We are now in an age that has started to wake up to the dangers of insecurity and is gradually patching the holes as they are discovered. At some point in the future it may even be possible to move to a fully secure Internet, but that is currently a pipe-dream that is delayed further because of the need for ‘backward compatibility’.

Programmers are working towards ‘mathematically provable’ code - in which you can prove that your software is totally logically consistent and thus hopefully bug-free (don’t ask me how).
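For a toy taste of what ‘mathematically provable’ means, here is a one-line theorem in the Lean proof assistant. It is purely illustrative and vastly simpler than proving real software correct, but the idea scales: you state a property, and the machine refuses to accept the code until the proof checks out.

```lean
-- Toy example: state a property and let the proof checker verify it.
-- Nat.add_comm is Lean's library proof that addition commutes.
theorem my_add_comm (a b : Nat) : a + b = b + a :=
  Nat.add_comm a b
```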

So the future is secure. The past was insecure, but it didn’t matter. The present is a mess. It’s the story of human history.

2 Likes