Software going on-line slows down computing

How does that apply in this case?

That can be true, i.e. the software is optimised for the current hardware spec or the most frequently occurring hardware spec … so if you haven’t updated your hardware then you are behind the curve, the software is optimised for something that you don’t have … and the software may actively underperform on the hardware that you do have.

I think that’s what you are saying.

That could be true if caching is a feature implemented in the application (and a poorly written one at that). However, the standard file caching on Linux automatically and dynamically tunes itself to the available memory. (Because it is implemented in the operating system, that caching is automatically and transparently available to all applications.) On a computer with 8 GB RAM, the operating system would never allocate all 8 GB to file caching. Furthermore, on a computer with 8 GB RAM with, say, 6 GB doing nothing, it could allocate 6 GB to caching; but if some bloated application is suddenly started and needs 3 GB, the file cache would automatically be trimmed back to 3 GB.
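You can watch this dynamic tuning yourself. Here is a small Python sketch (Linux-only, since it parses `/proc/meminfo`) that reports how much RAM the kernel is currently using for the file cache versus what remains available to applications:

```python
# Observe the Linux page cache by parsing /proc/meminfo (values are in KiB).
# Linux-only: this file does not exist on Windows or macOS.
def meminfo():
    info = {}
    with open("/proc/meminfo") as f:
        for line in f:
            key, _, rest = line.partition(":")
            info[key] = int(rest.split()[0])  # first field is the KiB value
    return info

m = meminfo()
print(f"Total RAM : {m['MemTotal'] / 1048576:.1f} GiB")
print(f"File cache: {m['Cached'] / 1048576:.1f} GiB")
print(f"Available : {m['MemAvailable'] / 1048576:.1f} GiB")
```

Run it, open a large folder of files, and run it again: `Cached` grows, yet `MemAvailable` stays high, because the kernel counts cache as reclaimable the moment an application needs the memory.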

So the example given may not be a good one for how an application optimised for a certain hardware spec will actively underperform on weaker hardware than that spec.

I assume that Windows works similarly.

To add to that … SSDs come in two flavours, SATA and NVMe. SATA SSDs are several times faster than spinning rust. NVMe SSDs are, in turn, roughly an order of magnitude faster than SATA SSDs. All subject to the capability of the computer itself of course.

Even on a modest current CPU, the maximum memory bandwidth is still about an order of magnitude faster than a top-of-the-line NVMe SSD.
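To put rough numbers on that storage hierarchy, here is a quick comparison. The throughput figures are ballpark assumptions for illustration (real devices vary widely), but the ratios support the "order of magnitude" claims above:

```python
# Rough, order-of-magnitude sequential throughput figures in MB/s.
# These numbers are assumptions for illustration; real devices vary.
throughput_mb_s = {
    "5400 rpm HDD": 100,
    "SATA SSD": 550,              # capped by the ~600 MB/s SATA III link
    "NVMe SSD (PCIe 4.0)": 7000,
    "DDR4 RAM (dual channel)": 50000,
}

baseline = throughput_mb_s["5400 rpm HDD"]
for name, mbs in throughput_mb_s.items():
    print(f"{name:24s} {mbs:6d} MB/s  ({mbs / baseline:6.1f}x HDD)")
```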

Would that also be very specific to a particular type of program or application?
It’s obviously a factor in being able to play many of the latest in gaming.
It may be a factor if one is regularly creating high quality video productions.
It will be a factor in high end CAD and 3D related tasks.
For 90% or more of home users we may be hard pressed to find a program that is in common use that needs more than the most basic of specifications.

My more recent experiences include running MS Office (spreadsheets and presentations) on a very basic early Atom-class lightweight PC. An early, slow SSD allowed plenty of time to make a brew when booting from cold. Windows memory management with full-fat Win 7 installed met every need, albeit it was a challenge to edit anything other than basic graphics. That may answer any concern re Windows’ ability to manage resources effectively, at least within the MS ecosystem.


Not really. Just clicking on a folder of photos that you have not previously clicked on, where you have your “files” application set to “thumbnail view”, could easily pull a GB into the file cache. (Linux even gives you the option of overriding (suppressing) the thumbnail view when the folder is remote, so that you not only avoid pulling a GB into the file cache but also avoid hauling a GB across the network.)
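A back-of-the-envelope calculation shows how fast that adds up. The photo count and file size below are assumptions, but they are typical of a modern phone camera:

```python
# How one click on a photo folder can pull ~1 GB through the file cache.
# Photo count and average size are assumptions for illustration.
photos = 300          # one holiday folder
avg_photo_mb = 3.5    # a typical phone-camera JPEG
total_gb = photos * avg_photo_mb / 1024
print(f"Thumbnailing reads about {total_gb:.1f} GB of image data")
```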

But my point was that this is not a good example because the operating system won’t pull a GB into the file cache if memory is not sufficiently abundant to devote a GB to caching files.

That is mostly true.

The situation as originally raised by the OP (“slows down”) needs more detailed investigation. Is it hardware spec? Is it network? Is it software design?

I’m hoping that is tea or coffee, not beer. :joy:


My example is a good one, and for the reasons you have said about file caching.
Applications don’t know about real memory. They only know about virtual memory.

If an application can use many GB of data, let’s say it is a multimedia app, then the question of where that data resides is important for performance. It is much faster to have that data in RAM rather than on disk.

So, my application could allocate data buffers of sufficient size to hold, say, a DVD’s worth of data, which for a single-sided, single-layer disc would be 4.7 GB.

On a computer with 16 GB of RAM or more the OS paging system could have all of that in memory. The application runs fast. Or I don’t allocate data buffers, and just assume it is on disk. The OS can use file caching on behalf of all programs and keep my app’s data in memory. The result is the same.

Now run the same application on a computer with only 2 GB of RAM.
Only a small part of the data can be in memory; most is out on slow disk. The OS will spend a lot of time paging data between real memory in RAM and disk. Probably ‘thrashing’.
That is if the app has allocated its own data buffers in virtual memory.
In the case of file caching, the OS would simply consider that since there isn’t enough real memory for caching anything but small amounts of data, all the data the app uses just has to stay on disk. Slow.
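The arithmetic behind that contrast is simple. Here is a toy check using the figures from this discussion (the amount of memory reserved for the OS and other programs is an assumption):

```python
# Will a DVD-sized working set fit in memory? The reserved_gb figure
# (RAM kept for the OS and other programs) is an illustrative assumption.
def fits_in_ram(data_gb, ram_gb, reserved_gb=1.5):
    return data_gb <= ram_gb - reserved_gb

data = 4.7  # single-sided, single-layer DVD
for ram in (2, 8, 16):
    verdict = "fits in RAM" if fits_in_ram(data, ram) else "will page to disk"
    print(f"{ram:2d} GB machine: {data} GB working set {verdict}")
```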

The key point is that, for this problem to occur, caching must be implemented by the application itself and without sufficient regard to the available (real) memory.

If the application has no way of determining the available real memory then the next best option is to allow user configuration of the amount of virtual memory used for caching by the application (i.e. allow the user to improve the situation for the next time the application is used).
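For illustration, here is a minimal sketch of what such a user-configurable, application-level cache might look like. The class name and API are hypothetical, not from any particular application; it evicts the least-recently-used blocks once the user’s configured byte budget is exceeded:

```python
from collections import OrderedDict

class BlockCache:
    """A user-configurable data cache: evicts least-recently-used
    blocks once the configured byte budget is exceeded."""
    def __init__(self, max_bytes):
        self.max_bytes = max_bytes
        self.used = 0
        self.blocks = OrderedDict()  # block_id -> bytes

    def put(self, block_id, data):
        if block_id in self.blocks:
            self.used -= len(self.blocks.pop(block_id))
        self.blocks[block_id] = data
        self.used += len(data)
        while self.used > self.max_bytes:        # evict oldest entries
            _, old = self.blocks.popitem(last=False)
            self.used -= len(old)

    def get(self, block_id):
        if block_id not in self.blocks:
            return None                          # caller reads from disk
        self.blocks.move_to_end(block_id)        # mark as recently used
        return self.blocks[block_id]
```

Something like `BlockCache(max_bytes=256 * 2**20)` then lets the user dial the budget up on a 16 GB machine or down on a 2 GB one.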

But the best option in general is for the application not to do it at all and just let the operating system do it. It will be fast when your computer has 16 GB and it will be no slower than it needs to be on a computer that has 2 GB. That is not to say that it will ever be fast on the latter computer.

Is it possible for it to be better to cache in the application? Yes, there will be occasional situations in which the application can do more optimal caching because it has knowledge about its future pattern of access to the underlying file(s) that it doesn’t share with the operating system.
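There is actually a middle path on Linux: the application can share its access-pattern knowledge with the kernel rather than caching itself. A sketch using Python’s `os.posix_fadvise` (Linux/POSIX only; the file here is just a throwaway temp file):

```python
import os
import tempfile

# Sketch: hinting the kernel about future access patterns so the OS
# cache can use application knowledge (Linux/POSIX-only API).
fd, path = tempfile.mkstemp()
try:
    os.write(fd, b"x" * 4096)
    # Hint: we will read this file sequentially, so read ahead aggressively.
    os.posix_fadvise(fd, 0, 0, os.POSIX_FADV_SEQUENTIAL)
    # Hint: we are done with these pages; the kernel may drop them.
    os.posix_fadvise(fd, 0, 0, os.POSIX_FADV_DONTNEED)
finally:
    os.close(fd)
    os.unlink(path)
```

This way the application keeps its knowledge advantage while the OS keeps the memory-sizing problem.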

And my key point is that over time, the typical resource configuration of computers changes.
Ten years ago most home computers would have had maybe 2 GB of RAM.
So the OS and the applications would be tuned to that, with default settings on memory usage to reflect it.
Today the typical home computer would have at least 8 GB at the low end and more typically 16 GB.
The application and OS upgrades that come along adjust the default settings to reflect the increase in RAM and speed things up. Also, new features come along that use more data and memory.

You would probably be unaware of these changes, and if you stay on an old computer with only 2 GB of RAM the OS and applications are not tuned to your computer. Yes, it will still work. But the OS will certainly be working overtime to service virtual-to-real memory translation and paging, for one thing.


Occam’s razor check: Microsoft has enough trouble fixing security holes without making anything else more difficult for users.

While it makes for a nice conspiracy theory, the reality is likely to be more mundane - more features = more bloat and greater demand for faster hardware. So we keep getting more features that many of us will never use, and…


I wouldn’t agree with that.
Features many users want in software demand improvements in the hardware to make it doable. And improvements in hardware allow features to be implemented in applications that otherwise would not be doable.
They follow each other.

Very few users actually want more features at all. Take the example of the almost ubiquitous MS Word. It is capable of handling a complex report of thousands of pages by multiple authors, formatted with styles and magically updated fields; it can automate repeated operations, including importing data automatically, and can be controlled by other MS apps and control them, yada, yada.

Nine out of ten users don’t know what all that means, much less ever use it. What most want is: easy, simple, light, fast, stable, reliable, consistent, cheap, etc. Oh, and a format that is capable of being transferred to an all-singing, all-dancing monster should they ever need it.

What we get is aimed at the power user, and all those features obscure what the average Joe wants to do and make their learning harder.

If that is all they require they can use the built-in WordPad rather than the MS Office app, and it’s free to use. Built on a much earlier version of Word, it makes simple document prep easy enough for most home users.

It also saves in a variety of formats, such as docx and odt.

Most users are looking for more formatting and inclusions; the rest that comes along is rarely used. They could of course opt for free and competent products such as OpenOffice or LibreOffice, which with a few settings changes can provide a similar desktop feel. Both of those can save files in the MS formats for documents, spreadsheets, and presentations.


Perhaps. But sometimes there are features that many users really want. And sometimes the hardware needs to catch up to make it possible.
And sometimes, the software needs to catch up to what the hardware can do to give many users what they want.

Two examples.

In 1981 the IBM PC came out. It used a CPU from Intel limited to 1 MB of memory due to 20-bit addressing. It had an operating system from MS that limited all programs and data to 640 KB.
In 1984 the Apple Mac came out. It used a Motorola CPU that could address 24 bits of physical memory, and 32 bits architecturally. That is a big jump to 16 MB of memory, and an OS that could use that memory enabled Apple to deliver a graphical, mouse-driven interface that users loved. Really loved that mouse input and GUI.
It took the best part of a decade before IBM PC-type computers caught up, with a combination of 32-bit CPUs and operating systems like MS Windows or IBM OS/2 providing a GUI to compete with the Mac.
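The addressing limits mentioned above fall straight out of the bus widths:

```python
# Address-bus width determines the maximum directly addressable memory.
for bits in (20, 24, 32):
    print(f"{bits}-bit addressing -> {2**bits // 2**20:5d} MiB")
```

20 bits gives the IBM PC’s 1 MiB ceiling, 24 bits the Mac’s 16 MiB, and 32 bits the 4 GiB that later PCs grew into.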

A second example.
Gamers wanted more detail, more colours, more scenes on the display, and sound. This all involved lots of data.
Maybe hundreds of megabytes or more. A 3.5-inch floppy with a capacity of 1.44 MB just wouldn’t do. I can remember doing installs with a box of 30 or more floppies and it was a pain. It just wasn’t practical, so game features were limited to what the technology could provide.
So the hardware developers came out with an answer to what most users wanted. One disc. Everything on it. The data CD, with around 650–700 MB capacity. Game features took off big time.
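A quick calculation shows why the CD mattered so much (taking ~650 MB as a typical CD capacity):

```python
import math

# How many 1.44 MB floppies would one ~650 MB CD's worth of data need?
cd_mb, floppy_mb = 650, 1.44
floppies = math.ceil(cd_mb / floppy_mb)
print(f"{floppies} floppies to match one CD")  # over 450 disks
```

A box of 30 floppies was already a pain; no one was going to swap 450 of them.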

All good practice, and excellent advice.

For about a decade, I’ve typically run a system drive on SSD, and a data drive on a mechanical hard drive that’s much bigger than I really need. I partition the hard drive to ensure that I read and write to a relatively small portion of the drive, meaning the heads never have to seek edge to edge. On a slow disk (5400 rpm), average rotational latency is about 5.6 milliseconds per access (half a rotation). Track-to-track seek is a few msec. Network latency is typically measured in tens of milliseconds or more, even if the line “speed” is Mbits. The lines all run between queues at every data junction, where the data packets wait: routers, switches, MUXes, etc. Each is efficient, but they all add up.
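The rotational-latency figure is easy to derive: on average the platter must spin half a revolution before the sector arrives under the head.

```python
# Average rotational latency = time for half a revolution, in ms.
def rotational_latency_ms(rpm):
    return 60.0 / rpm / 2 * 1000

for rpm in (5400, 7200, 15000):
    print(f"{rpm:5d} rpm -> {rotational_latency_ms(rpm):5.2f} ms average latency")
```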

Unless the application is networked, like I assume multi-player gaming (I’ve never indulged), local is better.

If your browser has myriad web pages open, you can assume there’s high memory usage, even if the browser is not the active application. Antivirus is essential, but uses both memory and processor time routinely. I assume it sets up traps to trigger interrupt pre-emption of active programs when they attempt to access either a network or file resource. Pre-emption is when the OS stores the current program state and loads a new active program (e.g. the AV trap handler), then restores the program as it was afterwards.

The same is true if you have a lot of office files open.

For what it’s worth, going back to the original question, which was about cloud-based PC applications: these were fundamentally designed to benefit software vendors, not users.

I run LibreOffice on Windows, although I rarely use Windows these days, and LibreOffice on Ubuntu Linux otherwise. LibreOffice always runs locally. You control what it does, and where your data is stored.

Ubuntu doesn’t build in anything like the bloatware that Windows does, yet it has all the features I could want, and few of the really annoying things Microsoft foists on you. I don’t use any Apple products, because they are designed to lock users into the Apple ecosystem. macOS runs on BSD Unix under the covers anyway, so if it is going to be Unix, I prefer Ubuntu.

I find Ubuntu vastly more efficient than Windows, and much more user-friendly. LibreOffice has more than enough for my needs, and I’ve only found minor incompatibilities in MS Office files produced by LibreOffice, and then only when using advanced features, specifically legal-format numbering.

A new computer will come with the OS loaded. I still use an old laptop, which I leave at my daughter’s. It was very high end when new: ultralight, and had XP. I upgraded it to Win 7, then made it dual-boot with Ubuntu … then I replaced the hard drive with an SSD and only installed Ubuntu. I’ve done the same with every computer since. Once they’re out of warranty, or otherwise a few years old, I convert them to Ubuntu if still useful, or ditch them if they have no use. Don’t forget to blitz the hard drive with a decent overwrite data scrambler, unless it’s encrypted anyway.


All very informative.
From the OP

Not a lot to go on.
Something the user support staff in most businesses might resolve with a clean install/rebuild, assuming some basic hardware checks have first failed to resolve it.

Yes! My internet is much slower since lockdown, and I don’t know why.
For last five years I only have an iPad and my iPhone for internet.
I don’t use “the cloud” and wish I could delete it. With annoying frequency I’m informed my iCloud is full.
If anyone can explain to me in KIS what I can do I’d be very grateful.

The reason for the “iCloud is full” message:

iPads and iPhones use iCloud when, as part of setting up the phone or pad, you choose to back up your data. You get a free amount of storage, usually enough to store your keychain data, contacts and similar. This free storage amounts to 5 GB no matter how many devices you have. To turn off most of this backup, on your devices go into Settings → your account name (at the top) → iCloud and turn off all the items you do not want backed up. WARNING: if you turn off keychain backup and you lose the devices, you will not be able to restore passwords that you have only saved in your keychain.

Typically photos are also backed up. Because most Apple hardware has no expansion slots for extra memory modules (microSD type), Apple tries to lessen the load by using some of that iCloud space as photo storage. It is pretty seamless in usage, but often only thumbnails are stored on your device, and the full photos are downloaded when you need them for sharing etc. You can make it so the photos are stored on your devices, but on many devices that rapidly eats up the storage available.

So while you may think you don’t use iCloud, I think from the messages you are receiving you would find that you have and do use it. You can access your iCloud and delete data that resides there, like photos, using Safari or really any browser and your Apple ID + password. Phone and pad backups are a bit more secure and are not usually accessible that way.

For a relatively small fee of $1.49 per month you can increase your iCloud storage to 50 GB (which for many should be very adequate); this may be a better option than wiping most of the photos and documents stored in iCloud. $4.49 a month gets you 200 GB, and at $14.99 per month you get 2 TB of storage. I am not suggesting you use any of these plans, but they exist because many people can take advantage of the extra capacity to store information they want to keep.


Thanks. I do not want to store in iCloud so will backup photos to a USB.


As @grahroll suggests users can choose what to store through iCloud. Photos are optional.

It is worth carefully considering backing up both devices including contacts, messaging etc. The benefit comes if something goes wrong with the device or it’s misplaced. My iPhone is set to sync data and back up often. My first iPhone was synced to my laptop which required a physical cable. Using the internet is so much more reliable and convenient.

P.S
I pay the small monthly amount for 50GB of storage. I don’t expect to need more.


Best to back them up to at least two devices. The cardinal rule is to back up, and then back up the backup, as a minimum (i.e. two copies of a backup). When a solid-state device (e.g. a USB stick) fails, it tends to fail completely and anything on it is lost. When a rotating device (i.e. an HDD) fails, it is possible (sometimes easily and sometimes at high cost) to recover some, most, or all of the data.

While some avoid cloud services, the storage systems those services employ are significantly more reliable than anything consumers would invest in. The downside is that while that data is ‘secure’ it might not be ‘secured’, nor even hacker-proof at the end of the day. Then there are issues with access (the accounts) that can become frustrating quickly if something goes bad. Some avoid cloud services because they like hands-on control and responsibility, in preference to what they interpret as the equivalent of magic.


Apple unfortunately takes the approach that it knows what is best for its users, even if users don’t agree. You most likely agreed in the setup of the device to use iCloud, and you might not realise the device is using iCloud to do backups and store your data on Apple servers.

To turn off use of the iCloud as much as possible, see…

There are many reports online that the storage-full warnings are common and that many users purchase additional iCloud space just to get rid of them. There is also commentary suggesting that this is a deliberate attempt by Apple to make more money from its users by pushing them to buy iCloud storage subscriptions.