What made the Molecular 18 and LOS special
At a time when most computer systems had a "use by" date of a year or two, many Molly owners held on to their systems for as long as possible - why? Perhaps because its capabilities were aimed precisely at their level and were wholly relevant to their requirements. Their system had been designed by people living in their world, not in some remote corporate or academic tower. When their needs grew, the Molly came up with a trick or two to break through what had previously been considered its limits.
In this environment assemblers, compilers, "high-level" languages and all such technicalities were not an issue. What mattered was performance. Better reliability would have helped (but this was before the Japanese showed us what could be done, so nobody expected perfection). It wasn’t until the 1990s that Unix boxes began to match the Molly for performance, and thereafter reliability became the issue.
LOS was designed on the principle of co-operation, both with the Molly’s hardware and with the programmer, to produce what was wanted. In return, it did not divert resources to protecting programmers from themselves (or from their colleagues). This didn’t just work; it often opened up possibilities well beyond what had originally been proposed. Of course, some programming developments were justified because they improved productivity - such as the ability to load or edit a program while seated at a keyboard, rather than flicking it in bit by bit standing at the control panel switches.
Elsewhere in the computer industry, manufacturers were rapidly evolving hardware to meet the perceived needs of the software developers. These people would always want more, no matter how much you gave them, so the industry was on to a sure thing, which continues to this day. At BCL we wrote our programs in pencil on foolscap coding sheets. Each of these sheets held 64 steps (instructions), and 32 sheets were needed to fill a whole Task Partition (2K words). If you laid these out on the desk, it looked a formidable amount of coding space (especially when you recall how much rubbing out you did before it was finished) - who could possibly need more?
At one extreme we might cram several little programs into one "module" and at the other a complex program (invoicing was usually considered "complex") could span several modules by fetching in the relevant code as required. Who could possibly need virtual memory when managing real memory was so easy? Also, the Task Partition had a specific layout, and we had to allow for workspace for our variables, so that not all of the partition could be filled with code. Incidentally, pre-defining the layout greatly reduced the amount of parameter passing between subroutines. Several operating system services took no parameters, e.g. the single instruction "JSBR IZ 1651" was sufficient to "spool and post" a simple job to a print queue. Knowing where our variables were in the partition meant that we could find them, watch them and even change them by running the programming program from another keyboard. We could modify the very code too, even whilst it was being executed! Others (encumbered by text editors, compilers and linkers) could only dream of such capabilities, but we had it all!
The processor’s instruction set barely changed and neither did its speed. Memory reference instructions took 2.4 microseconds in the 1970s and they still took 2.4 microseconds in the 1990s. Actually, 2.4 microseconds was pretty good for the 1970s. Although we didn’t realise it, we were already using RISC technology (albeit because the Molly never "progressed" to micro-code). The 1K word page-addressing scheme never changed either. It is this aspect which stymied the development of an effective Assembler (let alone a high-level language). Try it and you’ll soon run into big problems! Whilst other manufacturers implemented more flexible addressing schemes in hardware, BCL meandered occasionally down this software dead-end.
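To see why a 1K-word paging scheme fights an assembler, here is a sketch of how such addressing typically worked. The details below (a 10-bit in-page offset plus a current-page/page-zero choice, with everything else reached through indirection) are my assumptions modelled on comparable machines of the era, not a documented description of the Molecular 18.

```python
# Hedged sketch of 1K-word page addressing (assumed layout, not Molly spec):
# a memory-reference instruction carries only a 10-bit offset and a flag
# saying whether that offset is within the current page or within page 0.

PAGE_SIZE = 1024  # 1K words

def effective_address(pc, offset, current_page):
    """Resolve a memory-reference operand under the assumed scheme."""
    assert 0 <= offset < PAGE_SIZE
    page_base = (pc // PAGE_SIZE) * PAGE_SIZE if current_page else 0
    return page_base + offset

# A direct reference can only reach the instruction's own page or page 0:
assert effective_address(pc=3000, offset=5, current_page=True) == 2048 + 5
assert effective_address(pc=3000, offset=5, current_page=False) == 5
# Anything further afield needs an indirect pointer planted in reachable
# memory - exactly the bookkeeping that makes automatic code generation
# (an assembler, let alone a compiler) painful on such a machine.
```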
The "Metacode" character set was an interesting case of making full use of the seventeen-bit word format. The cube root of 2¹⁷ (131,072) is just over 50. This means you can pack three characters into one word if you restrict the character set to 50, which is enough to be useful (in the 1970s respectable computers disdained lower-case lettering anyway). To pack three into a sixteen-bit word you must restrict the set to 40 characters.
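For readers who like to see the arithmetic, here is a sketch of base-50 packing. The 50-symbol alphabet below is invented for illustration - the actual Metacode character assignments are not given here - but the packing itself (three base-50 digits per 17-bit word) follows directly from the figures above.

```python
# Sketch of Metacode-style packing: three base-50 characters per word.
# The alphabet is a made-up 50-symbol set; only the arithmetic matters.
ALPHABET = " ABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789.,;:()+-*/=?$"
assert len(ALPHABET) == 50

def pack3(chars):
    """Pack three characters into one word as a base-50 number."""
    assert len(chars) == 3
    word = 0
    for ch in chars:
        word = word * 50 + ALPHABET.index(ch)
    return word

def unpack3(word):
    """Recover the three characters from a packed word."""
    out = []
    for _ in range(3):
        out.append(ALPHABET[word % 50])
        word //= 50
    return "".join(reversed(out))

assert unpack3(pack3("CAT")) == "CAT"
# The largest packed value is 50**3 - 1 = 124,999: it fits in 17 bits
# (131,072 states) but not in 16 (65,536) - whereas 40**3 = 64,000 does.
assert 50**3 - 1 < 2**17 and 50**3 - 1 >= 2**16 and 40**3 <= 2**16
```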
The hardware implementation of the "Greater Than" flag deserves comment: it ignored the sign bit, in effect comparing words as unsigned bit patterns, which left software to do extra work whenever it compared signed numbers!
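A small sketch (my own illustration, not actual Molly code) shows the problem: if the compare ignores the sign bit, a two's-complement negative number looks like a large positive one, so software must inspect the sign bits itself before trusting the flag.

```python
# Why an unsigned "Greater Than" flag hurts signed arithmetic.
# Words are modelled as raw 17-bit two's-complement patterns.
BITS = 17
SIGN = 1 << (BITS - 1)

def to_raw(n):
    """Encode a signed value as a raw 17-bit two's-complement pattern."""
    return n & (2**BITS - 1)

def hw_greater(a, b):
    """Hardware-style test: raw patterns compared with the sign bit
    treated as just another magnitude bit (i.e. unsigned compare)."""
    return a > b

def sw_signed_greater(a, b):
    """The software workaround: check the sign bits first."""
    a_neg, b_neg = bool(a & SIGN), bool(b & SIGN)
    if a_neg != b_neg:
        return b_neg          # the non-negative operand is the greater
    return a > b              # same sign: the raw compare is now safe

# -1 encodes as all-ones, so the raw compare calls it greater than +1:
assert hw_greater(to_raw(-1), to_raw(1))              # wrong answer
assert not sw_signed_greater(to_raw(-1), to_raw(1))   # fixed, at a cost
```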
There are a couple of switches on the control panel that deserve "special mention". "Interrupt if MA=SW" allowed us to set a breakpoint whilst debugging a program. Alternatively, it could just as easily catch any read or write of a program variable. I have yet to meet a modern debugger offering this sometimes very useful service. The apparently useless (to software) "Continuous Interrupt" switch found its niche as a mains-off emulator. The real mains-off interrupt was supposed to trigger when there was enough juice left for maybe 300 instructions (I forget the exact timing). When it actually triggered depended on the setting of a variable resistor somewhere. Occasionally this resistor would be wrongly set and the juice ran short, so the Continuous Interrupt switch offered a stopgap means of switching off until it was fixed (a job very definitely left to an engineer). In the 1970s we always advised switching your Molly off at night: someone did leave one running on a first floor, and next morning the staff found it in the basement with their smouldering building around it.
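The elegance of "Interrupt if MA=SW" is that one address-match mechanism gives you both breakpoints and data watchpoints for free, since it fires on any memory cycle. A sketch (an illustration of the idea, not Molly hardware; the address used is arbitrary):

```python
# Illustration of an address-match interrupt: fire whenever the memory
# address register (MA) equals the panel switches (SW), regardless of
# whether the cycle is an instruction fetch, a data read, or a write.
class Panel:
    def __init__(self, switches):
        self.switches = switches
        self.hits = []

    def memory_cycle(self, address, kind):
        """Called once per memory cycle; kind is 'fetch'/'read'/'write'."""
        if address == self.switches:
            self.hits.append(kind)  # the real machine would interrupt here

panel = Panel(switches=0o1651)        # arbitrary address for illustration
panel.memory_cycle(0o1651, "fetch")   # acts as a breakpoint...
panel.memory_cycle(0o1651, "write")   # ...and as a data watchpoint
panel.memory_cycle(0o2000, "read")    # no match, no interrupt
assert panel.hits == ["fetch", "write"]
```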
Even with the mains-off interrupt triggering correctly, LOS had to keep all its interrupt handlers short in case a mains-off became pending (it was not practical to interrupt an interrupt handler). Watching the system pick up after a power cut was quite a thrill, so how did it work?