A different approach to Note Taking

I take a lot of notes. I use them for reference. I use them for speculation about topics of interest. I use them to note down ideas so that I don’t lose them.

Before the digital age I had many paper notebooks and boxes full of index cards.

In those days I used to carry an HP 200LX computer in my pocket and I thought it was an ideal note taking solution. Oh, if only I could buy one again; oh, the nostalgia …

Whilst I was doing my degree I used to take notes at lectures on paper despite having several digital solutions available. I found that writing on paper helped me recall the material that I was writing much better than if I typed it onto a computer.

The physical act of writing is more visceral, it connects with the consciousness at a more basic level than typing. When typing one can go into autopilot and concentrate on the sequence of letters rather than the meaning of the words, the material gets typed accurately but it leaves little lasting impression in the memory.

But computer solutions are better organised and more compact. If one relies on paper then one accumulates many scraps of paper and old notebooks which are difficult to keep organised or refer to.

What I am looking for is the best of both worlds: a paper notebook with unlimited pages which can transfer its content onto a computer, without many scraps of paper to keep track of, and hopefully without the paper.

Trees generate oxygen for our planet, we should not chop them down to be made into newspapers or chipboard furniture or paper notebooks.

For a long time I have relied on an application called ConnectedText, which has served me well. It is a wiki with many powerful features, but recently I have found it to be less satisfactory than it used to be.

This is because I bought a new laptop and monitor with very high resolution screens.

The advent of high DPI screens and Windows 10 screen scaling has meant that the icons on ConnectedText are now microscopic and the titles of topics are only partially displayed.

The development of ConnectedText has now ceased, so it will probably never be updated; it will continue to fall further behind as operating systems change until one day some update finally breaks it.

This is particularly annoying for me because some while ago I paid quite a lot of money for perpetual licenses. The developer sold me licenses which would be for life: if there were any new versions of the software I would get an update to my license, so I would get the new version for free.

He probably already knew that version 6 would be the last one and I already had a license to version 6.

If the developer has abandoned development it would be better if he were to release the source code as an open source project, but I suspect he is keeping it closed just to get a little more money from the current version.

I cannot now recommend ConnectedText to anyone wanting a new note taking solution.

Perhaps it is time for some lateral thinking.

It would be nice if one could have digital paper, a screen on which one could write and draw but which could send these images to a computer and/or recognise the handwriting. Like a paper notebook with unlimited pages, no more stray scraps of paper to keep track of.

There are several possibilities.

One could use an Android tablet.

In my experience handwriting on an Android device (a Sony Xperia mobile phone) is awful: the line drops out at random, and the screen is slick, with no friction, which tends to make my handwriting messier. Also, the note taking apps I have tried are cumbersome and awkward to use.

An Android device can also act as an e-reader for PDF and text files.

Despite this, Android is not a good solution.

Dr Andus recommends a Boogie board.

I have tried a Boogie board and writing on the screen is much better and it is more responsive than an Android phone. However the Boogie board is not a very good solution for other reasons.

It is a write-only solution, and this is not what I want.

Once you have written a page or drawn a diagram and moved on to a new page, you can never go back to the previous page. The device stores previous pages but cannot display them; uploading the stored pages onto a computer is the only way to see them again.

The Boogie board is cheap but it is not a solution to the problem.

There is another device by Sharp, the WG-N20, which seems more capable than the Boogie board. It is an electronic notepad: you can look at and edit any stored page. Sounds good, but there are problems.

The first big shock you get when buying one is the hidden costs. This is a Japanese import and so the price you see on the Amazon website is not the price you end up paying.

On the Amazon website it boasts free shipping to the UK but the UK Customs and Excise will open your package and impose an import duty on it. The shipping company will then demand this import duty plus an ‘administration charge’ before it will release your parcel for delivery.

The price you end up paying is about one and a half times the advertised price.

The manuals are in Japanese. So is all the text displayed by the machine, on the on screen buttons and in the dialog boxes.

The screen is slick and has little friction, but despite that it offers a better writing experience than an Android phone or tablet. The screen contrast is not very good: you are writing in slightly darker grey on a grey screen. This tablet needs good lighting to be usable, and there is no backlight.

The screen is a conventional LCD screen not e-paper. It is not an e-reader, it cannot import or display text files or PDFs.

It is not a good solution.

I have even been into the local Apple store to try out an Apple iPad.

I didn’t get along with it very well: the iPad suffers from a frictionless, slick screen, and the note taking application seemed to have some fundamental flaws. The iPads are expensive for what you get.

The staff in the Apple store are so full of artificial enthusiasm: everything about their products is wonderful, and apparently the fact that annotations can be in any colour you like more than compensates for the fact that if you insert text the annotations don’t move with it and end up in the wrong place.

I didn’t agree with the sales person!

I would rather have something in black and white that works properly than something multi coloured that doesn’t.

A random search (a clutching-at-straws exercise) pointed me at a potentially good solution for note taking: the reMarkable tablet. It is not available yet and it is expensive, but if the advertising on the website is to be believed they are trying to produce something which fits almost exactly with what I want.

It is an e-reader: it can display PDF files (and EPUB files, though I have none), but it cannot display plain text files, which I think is a bad decision on the part of the designers.

There are an awful lot of legacy text files out there. To be fair, text files could be printed to PDF, but this would increase their size.

Which brings us to the question of storage. The reMarkable tablet has 8 GB for storing documents, notes and drawings. This may seem like a lot, but it is only about 100,000 pages. I can envisage filling that, maybe not very quickly, but it is possible. There is no expansion: no SD card slot, and the USB socket seems to be only for charging.
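As a rough sanity check on that figure, here is a short Python sketch of the per-page budget implied by the advertised numbers (the 100,000-page figure is from reMarkable’s own marketing, so this is only an estimate):

```python
# Rough per-page storage budget implied by the reMarkable's
# advertised capacity: 8 GB shared by roughly 100,000 pages.
capacity_bytes = 8 * 10**9   # 8 GB, using decimal gigabytes
pages = 100_000

bytes_per_page = capacity_bytes // pages
print(bytes_per_page)  # 80000, i.e. about 80 KB per page
```

About 80 KB per handwritten page is generous, which suggests the 100,000-page claim is plausible for notes; large scanned PDFs would eat into it much faster.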

Once the storage is full you will have to either delete something or transfer something to a computer to make room for new items.

It is also big, just a little less than A4 size at 18 cm by 26 cm (7 inches by 10 inches). This is good for reading but definitely not pocket sized. What is needed is a small version which I could put in my pocket; 5 inches by 7 inches would be ideal. I wouldn’t use that as an e-reader, just as a notepad.

Although it is not ideal it is far better than any other solution I have yet discovered, so I ordered one. At the moment (June 2017) there is a 33% discount on pre-orders, but I will have to wait five months; the current delivery schedule is October, and that keeps slipping because demand is greater than their production rate.

If they had a pocket sized version then I would probably be ordering both the big and small versions, especially if they could transfer notes and documents between them.

I will write a review of it when I get it.





Ribbons, screens and links

Why ribbons?

A few years ago Microsoft started putting ribbons on most of their applications and promoting them as a good idea. “This is the future,” they said, and many people believed them. In a lot of applications the ribbon is optional and you can choose the traditional menus and toolbars, but in Microsoft’s own applications the ribbon is mandatory whether you like it or not. On a small screen a ribbon is a really bad idea: it takes up far too much room, and if you use keyboard shortcuts a lot it is just wasted space.

The reason Microsoft are so enthusiastic about ribbons is that they see the future of computing in small mobile devices with touch screens, like the Microsoft Surface. With a touch screen you prod the screen with your finger, and with a finger you have much less precision than with a mouse or even a stylus, so the icons have to be bigger and spaced further apart.

So the ribbon should have been reserved for mobile devices with touch screens, but instead Microsoft chose to impose it on everyone. It is puzzling why ribbons have caught on as much as they have. I think it is partly the novelty value and partly that Microsoft are such a big company, with such a disproportionately large influence over the computing community, that anything they do becomes a standard, so they do not have to pay any attention to common sense or ease of use.

How to tame the ribbon on Microsoft Office

You can make the ribbon less obnoxious in Microsoft Office programs. At the top far right of the screen, just below the window controls, is a blue circle with a white question mark in it, next to a white up arrow. If you click on this up arrow the ribbon goes away until you click on one of the menu tabs at the top of the screen; then the ribbon you have selected appears until you have used it, and then it goes away again.

There is also something called the ‘quick access toolbar’, which isn’t used very much by most people. It usually sits at the very top of the screen, but in the options there is a ‘quick access toolbar’ tab with a tick box to put it below the ribbon; from this screen you can also select which commands go onto the quick access toolbar.

I have put many commands on there: whenever I find I am having to use the ribbons a lot, I put the commands I need onto the quick access toolbar, and so it has grown until now it stretches almost all the way across the screen while taking up only a small amount of vertical space. Microsoft are very good at designing user interfaces, so I suspect this is deliberate and is how the interface is supposed to be used, but it is not obvious, and a lot of people just don’t use the quick access toolbar at all.

High DPI Screens

I recently had to buy a new laptop because Microsoft destroyed my old one in the upgrade to Windows 10 (an upgrade which I did not instigate or desire). The one I chose has a very high resolution screen, 3200 by 1800. I thought that having a high DPI screen would be a good idea; now that I have been using it for a while I think that perhaps it wasn’t. The picture on the laptop’s own screen is very clear and incredibly sharp, but at a scaling factor of 100% the text is unreadably small; currently I have it set to 200% and this is still a bit small.

The problem is the scaling of text in applications. If an application doesn’t scale its text properly you get microscopic text, or in some programs the text scales properly but the toolbar icons are microscopic. And some programs have not grasped the idea that a computer can have two screens with different resolutions: windows and dialog boxes are scaled correctly on the screen they were drawn on, but if you drag them to the other screen some programs re-scale them properly, some don’t (so the dialog box becomes very small), some make the window or dialog box disappear, while others just crash.

The problem lies in ‘Windows Presentation Foundation’, an API for rendering text and images on a computer screen. Somewhere between Windows 7 and Windows 10 it was updated with new features to handle the scaling of text and GUI elements, so programs need to be re-written to use them, or at least their GUIs do. The change is not trivial; it isn’t just a matter of compiling against a different library, and the changes cannot be made automatically, so the code needs to be edited manually to adopt the new features.
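For what it’s worth, one part of opting in is declarative: a Windows program states its DPI awareness in its application manifest. A minimal fragment looks something like the following sketch (the value `true/pm` requests per-monitor awareness, which is what a laptop-plus-external-monitor setup needs):

```xml
<!-- Fragment of a Windows application manifest declaring DPI awareness.
     "true" alone means system-DPI aware; "true/pm" also requests
     per-monitor DPI awareness on Windows 8.1 and later. -->
<application xmlns="urn:schemas-microsoft-com:asm.v3">
  <windowsSettings>
    <dpiAware xmlns="http://schemas.microsoft.com/SMI/2005/WindowsSettings">true/pm</dpiAware>
  </windowsSettings>
</application>
```

Declaring awareness is the easy part; the program must then actually query the DPI and lay out its windows accordingly, which is the manual re-writing described above.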

Of course all the Microsoft applications handle this correctly, as you might expect, but other programs sometimes don’t handle it quite as well. This has meant that some of my favourite programs either don’t work properly or are completely unusable on my new laptop.

I tried out a few of the programs I use now, and some which I have used in the past, on my laptop with its high DPI screen and a 1600 by 1200 monitor plugged into the laptop’s HDMI port.


Compendium

Compendium ignores any scaling factors you have set and draws its user interface at the native resolution of the screen. The text and icons are microscopic and the program is unusable without a magnifying glass. On the external monitor things are scaled to the same size, but the pixels are bigger, so that even with a magnifying glass it is unreadable.


WhizFolders

WhizFolders scales everything correctly and works as expected.


VUE

VUE ignores any scaling factors you have set and draws its user interface at the native resolution of the screen. The text and icons are microscopic and the program is unusable without a magnifying glass. On the external monitor things are scaled to the same size, but the pixels are bigger, so that even with a magnifying glass it is unreadable. This has left me looking for a new mapping program; I relied on VUE quite heavily.

CMAP Tools

Because I can’t use VUE on my laptop anymore I revisited CMAP Tools, a program I tried a while ago, but alas CMAP Tools has exactly the same problem: it ignores any scaling factors you have set and draws its user interface at the native resolution of the screen, so the text and icons are microscopic and the program is unusable without a magnifying glass. On the external monitor things are scaled to the same size, but the pixels are bigger, so that even with a magnifying glass it is unreadable.


Scrivener

Scrivener draws most of its user interface correctly, but the icons in the toolbar are now small, and the text in the binder panel looks cramped: it is drawn at the correct scale but too close together. This can be solved by switching to a font with larger line spacing; Calibri worked on my system. The toolbar icons in Scrivener were rather large to begin with, so having them much smaller is a little tiresome but not as bad as it would have been had they started out at normal size; this problem is trivial. Scrivener works well on a high DPI screen.


TheBrain

TheBrain scales its user interface correctly but cannot handle two screens with different scaling factors. If any of the panels is put into a floating window and dragged to the other screen, the program crashes if the scale factors differ. If the scale factor is the same on both screens then everything works as expected.


MyInfo

MyInfo scales everything correctly and works as expected. Embedded OLE objects are rendered at the correct scale.

Ultra Recall

Ultra Recall scales its user interface correctly and works as expected, apart from one problem: embedded OLE objects are rendered at a ridiculously large scale. The developer said that he uses Internet Explorer to render the objects within Ultra Recall and so cannot do anything about the scale at which they appear. However, the developers of some other programs seem to have managed it correctly.


ConnectedText

Unfortunately ConnectedText has some problems with high DPI screens: the icons on the toolbar become microscopic, and the titles of topics show only the top half of the text. Apart from those problems it works correctly, and I still use it despite them.

Essential PIM Pro

This is a curious one. I was using Essential PIM Pro 6, which had all sorts of scaling problems when I was forced onto Windows 10, so I wrote to the developer describing them. He wrote back saying ‘Unfortunately there is no way to overcome this problem’, which I took to mean that he wasn’t going to do anything about it, and I started looking for a new e-mail program. But just a couple of weeks later Essential PIM Pro 7 came out, and it solved almost all the problems. He could have told me that the new version was coming and to wait a little while, but for some reason he didn’t. There is still a problem with some of the text in some panels and dialog boxes looking too cramped; this could be solved by switching fonts, but you cannot change the interface font in Essential PIM Pro as you can in Scrivener.

So, which laptop should I have bought? Well, I think there is an optimum resolution for each screen size: high enough that the individual pixels are not visible, but not so high as to cause the scaling issues detailed above. For the external screen you want enough pixels that you can set the scaling factors to be the same on both screens, so the external monitor should also be high resolution. But I am stuck with the monitor that I have (1600 by 1200) unless I want to purchase another one.

For a screen which is 13 inches across the diagonal I think the optimum resolution would be 1920 by 1080. If the screen were bigger then the resolution could be higher, to keep the DPI (dots per inch) the same.
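The arithmetic behind that preference is simple enough to sketch in a few lines of Python; the function below computes pixel density from a resolution and a diagonal size (the 13-inch figure matches the screen size mentioned above):

```python
import math

def pixels_per_inch(width_px, height_px, diagonal_inches):
    """Pixel density of a screen from its resolution and diagonal size."""
    diagonal_px = math.hypot(width_px, height_px)  # pixels along the diagonal
    return diagonal_px / diagonal_inches

# A 13 inch screen at my suggested optimum resolution:
print(round(pixels_per_inch(1920, 1080, 13)))   # about 169 PPI

# The same 13 inch screen at my laptop's actual resolution:
print(round(pixels_per_inch(3200, 1800, 13)))   # about 282 PPI
```

Around 170 PPI is comfortably past the point where individual pixels are visible at laptop viewing distances, whereas 282 PPI buys little extra sharpness while forcing every application through the scaling machinery described above.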

Universal Links

I sometimes get e-mails about the blog and sometimes people put comments on my posts.  One thing that has been asked more than once is :-

“What is a universal link anyway?”

A universal link is a link to specific content within an application’s data file. For instance, Essential PIM Pro allows you to copy a link which points to a specific e-mail in a specific database created in Essential PIM Pro. This link can be activated from another application and will not only start up Essential PIM but open the specific e-mail to which it points.

There is a protocol string which the application registers with the operating system when it is installed; once registered, if the operating system receives a link of the correct format it will pass the link to the specified application.
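On Windows that registration is just a couple of registry keys. A minimal sketch might look like the following, where the `myapp` scheme name and the program path are hypothetical, invented purely for illustration (each real application registers its own):

```reg
Windows Registry Editor Version 5.00

; Register the hypothetical scheme "myapp://" (illustrative only).
[HKEY_CLASSES_ROOT\myapp]
@="URL:MyApp Protocol"
"URL Protocol"=""

; The command the OS runs when it receives a myapp:// link;
; %1 is replaced by the full link text.
[HKEY_CLASSES_ROOT\myapp\shell\open\command]
@="\"C:\\Program Files\\MyApp\\MyApp.exe\" \"%1\""
```

The application then receives the whole link on its command line and is responsible for interpreting everything after the `://` itself.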

As an example of what they look like a link to one of the e-mails in Essential PIM looks like :-


the bit up to the :// is the string which is registered with the operating system; the rest is application specific.

As another example a link to a topic in my ConnectedText notes looks like :-


again, the bit before the :// specifies the application to which the link points, but the rest of it is almost human readable once you realise that ‘%20’ is the space character.

So a universal link is like a URL but it points to specific content within a specific application on the local machine.
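To make the anatomy concrete, here is a small Python sketch that splits a universal link into the registered scheme and the application-specific part; the `ct://` link used in the example is made up in the ConnectedText style, not a real link from my notes:

```python
from urllib.parse import unquote

def split_universal_link(link):
    """Split a universal link into (scheme, target).

    The scheme (everything before ://) is what the application
    registered with the operating system; the rest is application
    specific.  %XX escapes such as %20 are decoded for readability.
    """
    scheme, sep, rest = link.partition("://")
    if not sep:
        raise ValueError("not a universal link: " + link)
    return scheme, unquote(rest)

# A made-up example in the ConnectedText style:
scheme, target = split_universal_link("ct://My%20Notes/Shopping%20List")
print(scheme)  # ct
print(target)  # My Notes/Shopping List
```
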

Windows 10 ate my Laptop

A while ago Microsoft announced the upgrade to Windows 10. At the time I had no strong feelings about it one way or the other; as far as I was concerned Windows 7 worked very well and Windows 8 was an unmitigated disaster. When I found out more about Windows 10, my impression was that it was technically very good, but that as far as user surveillance and tracking were concerned it was much more intrusive, and there were a lot of privacy issues that I was not happy with.

Now that I have used it for a while, my initial feelings have been confirmed. Before, it seemed like a Windows 7 computer was your computer, whereas a Windows 10 computer is Microsoft’s computer which they are kindly letting you use.

Despite my misgivings I eventually upgraded my desktop computer to Windows 10, and all went well. I chose not to get a Microsoft account and learned how to switch off as much of the tracking as I know about and have access to. I also do not use Cortana, which seems to do little more than refer every question to the Bing search engine; if I wanted to use a search engine I would use one without going through Cortana.

In a way my desktop machine was the least important of my computers. In years gone by it was the most important, but laptops have become ever more powerful, until now I do most of my work on my laptop and the desktop machine is just used for games.

The most important machine I have is my laptop: that is where most of my programs are installed and where most of my documents, spreadsheets and databases reside. In theory my old laptop was much more powerful than it needed to be to run Windows 10.

Because I have a lot of important documents and data on my laptop, I backed up my hard drive before trying to update to Windows 10: I took the hard disk out of the laptop and made a clone of it on another, slightly smaller hard disk.

It was fortunate that I did, because the upgrade turned my laptop into a brick. When I switched it on it would apparently boot, showing the screen with the blue window and the dots chasing one another round a circle, but then the screen would go black and the computer would become completely unresponsive.

When it became apparent that the upgrade hadn’t worked, I swapped the backup disk back into the laptop and investigated the problem. It seems that many people with Alienware MX series laptop computers were having exactly the same problem.

Alienware is really Dell. When they brought out the MX series, those laptops had Nvidia graphics chips, but I don’t think they are official Nvidia parts: certainly they were renamed, and when Windows 7 tried to update the graphics drivers they were misidentified, so several times I had to use the safe boot option to boot with vanilla drivers and roll back the graphics drivers to the previous version. I learned not to update anything to do with the graphics drivers via Windows Update.

That safe boot option no longer exists in Windows 10.

So I decided not to upgrade my Laptop to Windows 10, I was perfectly happy with the way Windows 7 worked and decided to leave it.

However I reckoned without Microsoft’s dirty tricks.

I should have cloned the disk again, because the spare disk now contained the corrupted version with Windows 10 installed; unfortunately hindsight is 20/20, and since I had decided not to install Windows 10 I thought I was safe. Cloning it again was one of the things I intended to do at some point but never got round to.

One day whilst using the laptop a dialog appeared saying that the upgrade to Windows 10 had been scheduled for a date I didn’t take note of. I just cancelled the dialog box, assuming that this would cancel the scheduled upgrade. However, unknown to me, Microsoft had scheduled the update, and cancelling the dialog did not cancel it.

A few days later I had left my laptop on doing a virus scan and had occasion to go out in the car to pick up my daughter from her boyfriend’s house. When I came back the laptop was in the middle of installing the update to Windows 10; apparently it had downloaded it and started installing it without my consent or instigation. I thought about switching it off in the middle of the installation but decided against it.

Right after the update Windows 10 appeared to work, for a short time, but then it updated its video drivers and the screen went to 800 by 600 in 256 colours. I thought maybe a reboot would clear the problem but instead it booted to the black screen after showing the screen with the blue window and the dots chasing one another round a circle. After booting up it was completely unresponsive.

So safe boot doesn’t work with Windows 10, and the option to go back to Windows 7 only works if you have a working computer which can respond to the command. I do not have a DVD from which I can install a fresh copy of Windows 7: my laptop was supplied without media. There was an emergency partition on the hard disk from which I might have been able to reinstall Windows 7, if the computer still worked and if the partition had not been erased by Windows 10.

As far as I have been able to find out, the latest video drivers from Nvidia have been deliberately tweaked to fail on non-Nvidia hardware; they say that their drivers work on all genuine Nvidia hardware. Alienware (aka Dell) say that they are not going to update the video drivers because the MX series laptops are obsolete.

Thanks a bunch Nvidia!   Thanks a bunch Alienware!

In the future I will probably have to purchase new computing equipment for myself, or recommend which equipment is to be purchased at work. You can be sure that none of this equipment will be from Dell or Alienware, and none of it will have Nvidia graphics cards. This is a pity, because Dell do make good monitors.

And of course Microsoft are a big enough company that they feel they can take a dump on individual customers with impunity; they couldn’t care less. And they are right, of course: there is little I can do to make my complaints heard, apart from expressing them in this blog.

Microsoft don’t seem to have any sort of complaints department, telephone line or e-mail address to handle unhappy customers. I think they know that if they opened anything like that it would get swamped immediately.

I have installed Ubuntu Linux on the old laptop, but of course it won’t run most of the Windows programs I know and love, so I ended up having to buy a new laptop. Some of the software on the old laptop was licensed to that computer and won’t activate on the new one. Only a few companies have licensing as pernicious as Microsoft’s, so only one other non-Microsoft program was shackled to the old laptop, but I have had to buy a new copy of Microsoft Office (2013, not the 365 rental version).

So this supposedly ‘Free’ update to Windows 10 has cost me £550 for a new laptop, £100 for a new copy of Microsoft Office and £60 for replacement of other software. Over £700!

Needless to say I am unimpressed with the way Microsoft have rolled out (some might say steamrollered out) Windows 10.


Where is the Computer industry headed?

If you’re a computer technology enthusiast who keeps an eye on developments in the field, especially someone who has been an enthusiast for many years then you’re probably not very happy with the way things are heading.

Everything that computers once stood for, everything that once made them great and exciting as a hobby has been hijacked by big business who are intent on controlling what goes on in your computer and turning you into a ‘user’, i.e. someone who doesn’t understand or even care what is happening inside their computer.

It’s not difficult to pick out the one phenomenon that people like to complain about, the one thing that people love to hate and accuse as being responsible for all the computer world’s biggest problems.  I speak, of course, of Microsoft, and more specifically, the Windows Operating System.

I had a first encounter with Windows 8 over Christmas, I was not impressed.  What were Microsoft thinking?  Don’t they have a quality control department?  Don’t they test the software before releasing it?  (perhaps not, look at Windows Vista!)

Can you imagine the conversation that took place?

Marketing guy : “We need something novel and innovative to differentiate this operating system from the previous one!”

Programmer (making a joke) : “We could give it a mobile phone interface, they are really popular these days”

Marketing guy (being serious) : “Great, that’s a really good idea!  We’ll do it!”

Programmer (in panic) :  “Hang on a minute, it wasn’t a serious suggestion!”

Marketing guy :  “Nonsense, I think it’s a great idea …”


Windows just keeps getting worse and worse. Far from benefiting by upgrading, you are taking a big risk every time you move to a newer version. You will find it takes up more disk space and RAM, the applications you use might not work (or they may work fine but you need to buy a new license because your hardware has changed), and the most absurd thing of all is that nothing will be different.

Windows 98 did not improve significantly on Windows 95, nor did Windows 2000 improve significantly on 98. XP was a little more stable, but Vista got back to the usual standard: a nice shiny graphical interface behind which the software was riddled with bugs, and apart from the bugs it took up far more computer resources and memory to do essentially the same job at the same speed.

Vista was effectively the alpha version of Windows 7, so the public could pay for the privilege of testing it and finding the bugs for Microsoft. Windows 7 was better, but now they need something new to make people want to upgrade their PCs.

What few people seem to realise is that this is just what the computer industry wants.  The hardware industry produces faster machines with more memory and more disk space whilst the software companies produce bigger, slower more bloated software to neutralise all these advances.  The users end up having to upgrade all the time just to continue doing the same things as they were doing with their old hardware and software.

You already know all this.  It has been repeated time and time again by many people in the industry, and so it would be rather fruitless to dwell upon it yet again.  So how did things get this way?

Windows has been a messy, bloated operating system from its very first version.  The very first versions of Windows (versions 1.0 and 2.0) were awful, they worked intermittently if at all.  But by the time it got to version 3.0 Windows was relatively stable and usable.

And back then it was understandable.

Understandable in a technical sense, that is.  In the early 1990s, in the age of Windows 3.0 and 3.1, Windows could be mostly understood.  A power user could identify every single file that Windows shipped with and what that file’s function was (and Windows came with a lot of files).  Windows 3.x was an operating system that a normal human being could comprehend. Furthermore, it did not do much ‘behind the scenes’ work, at least not nearly as much as Windows 95 and beyond.

When Windows 3.x did something, you probably already knew about it, because you would have ordered the computer to do it yourself.  At that time it was the user in control of the computer, not the other way round.

Windows 95 changed all that. It did a lot of things ‘behind the scenes’ in a way that was simply annoying: there were little things constantly going on inside your computer which you didn’t know about and hadn’t asked for. Sudden, brief periods of hard disk activity, even when nobody was using the computer, were a sure sign of this.

I still remember the day in August of 1995 when Microsoft released Windows 95, the advertising promised it would change the face of computing forever, and indeed, it did just that.

On that day, reading about the new features of this revolutionary OS, I felt an impending sense of doom for my hobby.  It seemed that computers, as a whole, were becoming ever more automated.  User friendliness is all well and good but it seemed to me that control was being taken away from the users, more and more of the inner workings of the machine was being hidden and ‘protected’ from the owner of the machine.

As time went on, the trend of increasing user-friendliness began to take on new and sinister facets, foremost among them the trend towards corporate domination.  As the Internet grew in mass popularity and operating systems became ever more elephantine and incomprehensible beasts, users seemed to be losing control over their computers.  The computers (or more specifically, the companies who wrote the operating systems running those computers) seemed to be reversing the roles, controlling the user rather than the other way around: spying on the user’s browsing habits and denying them direct control over many aspects of their own machines.

Although this was largely a software trend propagated by inflexible operating systems, it also affected the way hardware was designed: manufacturers began producing non-standard, under-documented hardware that was deliberately difficult to reverse-engineer and adapt to the user’s own purposes.

Before Windows 95 it was possible to put together a homebrew interface to some weird device you had built yourself and drive it from the computer with a little program you hacked together.

Most hardware you bought to connect to your computer had understandable, well-documented interfaces, so if you wanted to do something unusual with a device you could write a program to control it yourself.

After Windows 95, all hardware had to be controlled through a ‘device driver’, and writing one was not the domain of the home constructor.  If you tried to access your hardware directly, the processor generated an exception and halted your program.  Around the same time manufacturers increasingly began to hide the details of their interfaces.  Computers were moving away from being a hobbyist device, and enthusiasts were gradually forced out.

There was a general dumbing down of people’s knowledge.  One example is Microsoft’s ‘Internet Explorer’, which is just a web browser, yet in the minds of many people its name made the World Wide Web synonymous with the Internet.  There is a lot more to the Internet than the World Wide Web, but nowadays many people don’t even know that anything other than the web exists.

Microsoft, and people who side with Microsoft, will say that this is done to make the computer easier to use: the end user does not want to know about all those silly little technical details.

The user is just that: someone who uses the computer, and wants to use it with a minimum of complications, doing only the things that the software authors allow.  Yet the truth is that most people who used computers back in the early nineties had enough expertise to get into Windows 3.1 and use it and its applications without many complications.

Now there are many more people using computers, and users are increasingly excluded from the technical aspects of computing.  The people most eager to persuade you that learning about the technical aspects of your computer is ‘too hard’ are the people with a hidden agenda: the software companies that want to convince you to buy their new software because it is more user-friendly.

The problem is further inflamed by the software vendors’ interest in concealing the inner workings of their software, for fear that other companies and technically savvy users might copy their ‘intellectual property’.  They don’t want people to understand, and so documentation has become trivial and shallow: it mostly covers how to use the software, rather than anything that an enthusiast would want to know.

Apart from playing games, a five-year-old computer running a basic set of applications can do everything that a ‘normal’ person wants to do with a computer.  The typical user’s needs were surpassed long ago, and that leaves the computer industry with a big problem: it has to stimulate artificial demand through hype and advertising.  Novelty is king.  And so we end up with an operating system sporting a stupid mobile-phone interface which does a very good job of excluding the user from any of the technical aspects of the machine, whilst enforcing corporate DRM and controls on anything you do with it.

The role of advertising in the computer industry is exactly the same as in the rest of the retail industry: it is there to make you dissatisfied with what you already have, so that you will go out and buy something new even though it might not be necessary.

Linux is starting to look more and more inviting with every ‘innovation’ perpetrated by Microsoft!