What about the software?
Computer equipment includes the computing machine (hardware) and the programs that run on the machine (software).
The computer itself, without software, is not designed to serve any specific use case or task. This makes software a very important, if not the most important, part of the computer, turning it into the most multifunctional device ever imagined by the human mind.
Software is a set of instructions for the automatic processing of data, in the form of linear, logically connected orders that the computer executes independently to solve a certain problem. With the use of software the user is freed from having to manage the machine hardware directly. Software is also an array of actions the machine performs on its own in a certain sequence, such as the functioning of a robot, the working of traffic lights, or the processing and recording of various data.
Many programs run on each computer’s hardware. Alongside user programs, which are meant to perform tangible tasks and with whose help the computer can do certain work for us (e.g. writing, playing a video or drawing), we must not overlook programs that have no visible effect for the user but are vitally important. These so-called system programs make the computer operational when turned on (taking care of the basic functions of the computer system) or run in parallel to user programs (e.g. saving data, managing the drivers for all the plugged-in components and communicating with the user).
For the curious:
It’s interesting that the word software itself carries no hint of what it describes. The American mathematician Paul Niquette coined the word software in 1953 in opposition to hardware (in 16th century English, hardware referred to manual metal tools and cutlery).
The British mathematician Ada Lovelace is the one we consider »the mother of software coding«. She published the first software program (more precisely, an algorithm for calculating the Bernoulli numbers), which was to have been executed on Charles Babbage’s mechanical Analytical Engine in 1843, but Ada never saw it run in her lifetime. The idea of a general-purpose computer and a related system for the theoretical development of a computer program was first proposed by the British mathematician Alan Turing in 1936. The first person to execute a computer program on a contemporary, electronic computer was the German inventor and computer scientist Konrad Zuse in 1941. Because software was inseparable from hardware at the time, Zuse is named the father of contemporary computers.
An algorithm is a set of precisely defined instructions for solving a certain problem: it takes input data and transforms it into the desired result, much like a lasagna recipe. If the algorithm is executed by a computer, we talk about a computer algorithm.
What makes a good algorithm
- The input and expected output are precisely defined.
- Each step of the algorithm is clear and unambiguous.
- Out of the many possible ways to solve the problem, the most efficient one should be chosen.
- The algorithm should not include program code; it needs to be written so that it can be implemented in any programming language.
The Turing machine is a simple model of a computing device (a mathematical thought experiment), envisioned by the British mathematician Alan Turing in 1936. He proved that with it one could obtain the result of any solvable calculation.
The Turing machine can perform any algorithm. It is built from an endless memory tape, divided into individual cells. A read-write head moves across the tape and can read from or write into any of the cells one of the preset symbolic values. The machine stores an internal state (a predefined set of responses to the contents of a cell) and can perform a limited array of operations on the cells of the tape: reading, writing, deleting and moving one cell left or right.
For the curious:
The word algorithm comes from the name of the 9th century Arab mathematician al-Khwarizmi, whose name was latinized into Algoritmi. He authored algorithms for the basic mathematical operations, and his most important book, Kitab al-Jabr wal-Muqabala (The Rules of Restoration and Reduction), served as the basis for adopting Arabic numerals in European mathematics. Part of the book’s title, al-Jabr, was later reinterpreted as algebra.
For better understanding of algorithms try programming without a computer (CS Unplugged).
The simplest version of the Turing machine uses only two possible symbols for inscription on the memory tape, represented by the numbers 0 and 1, and two states, A and B. The integers 4 and 3 can be written on the tape in such a way that we represent each of them with a corresponding number of ones and separate them with a zero:
| 1 | 1 | 1 | 1 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
With the use of basic commands we can then write down the algorithm for addition:
- If the machine is in state A and reads 1 -> it moves to the right and stays in state A
- If the machine is in state A and reads 0 -> writes 1, moves right, changes state to B
- If the machine is in state B and reads 1 -> moves right, stays in state B
- If the machine is in state B and reads 0 -> moves left, erases 1, stops
We can also write down the algorithm in a table representation:
| State / Symbol | 1 | 0 |
| A | RIGHT / STATE A | WRITE 1 / RIGHT / STATE B |
| B | RIGHT / STATE B | LEFT / ERASE / STOP |
Try on your own! Perform the algorithm step by step to get the result.
Begin executing the algorithm on the left: the machine is in state A and reads 1
| 1 | 1 | 1 | 1 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
According to instructions it moves to the right and stays in state A
| 1 | 1 | 1 | 1 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
…Repeats until it reads 0
| 1 | 1 | 1 | 1 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
When the machine is in state A and reads 0, it writes down 1, changes state to B and moves right
| 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
When the machine is in state B and reads 1, it moves right and stays in state B
| 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
…Repeats until it reads 0
| 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
When the machine is in state B and reads 0, it moves left, erases 1 and stops.
| 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
Now we can read out the result by counting the number of ones.
The machine calculated the correct result. What is important, though, is that the machine did not understand the concept of addition; it merely followed the instructions diligently.
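The four rules above are mechanical enough to simulate in a few lines of code. Here is a minimal sketch in C++; the function name and the string representation of the tape are our own choices, not part of the exhibit:

```cpp
#include <cassert>
#include <string>

// Simulate the two-state adding machine described above.
// The tape is a string of '1' and '0' characters; it must contain
// enough trailing zeros for the head to stay on the tape.
std::string runAdder(std::string tape) {
    std::size_t head = 0;   // start at the leftmost cell
    char state = 'A';
    while (true) {
        if (state == 'A') {
            if (tape[head] == '1') { ++head; }              // A,1: move right, stay in A
            else { tape[head] = '1'; ++head; state = 'B'; } // A,0: write 1, right, to B
        } else {                                            // state B
            if (tape[head] == '1') { ++head; }              // B,1: move right, stay in B
            else { --head; tape[head] = '0'; return tape; } // B,0: left, erase, stop
        }
    }
}
```

On the example tape with four and three ones, runAdder("111101110000000") leaves exactly seven ones on the tape, matching the manual walkthrough.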
With our working model of a traffic light you will quickly learn the programming basics as you write the code and execute your first program. Try adapting the code to make the yellow light blink continuously.
The development system is based on the educational single-board computer Raspberry Pi 4 (2019) running the Raspberry Pi OS (2022) Linux distribution. The 3 LED lights in the traffic light housing are switched on and off by 3 relays controlled by the Arduino Nano microcontroller platform (2008), to which your program is sent via USB. The Arduino has its own development environment where code is written in a dialect of the popular C++ programming language. Use the “digitalWrite” command to turn individual lights on (HIGH) or off (LOW) and the “delay” command to make the program wait before the next command – the delay time unit is milliseconds (1000 = 1 second). To run the program, press the round button with the arrow icon. If you want to reset the code to the original, close the Arduino IDE window first, then double-click on the desktop icon labeled “reset-code.sh”.
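A sketch of the blinking-yellow exercise might look like the code below. The pin number is an assumption (check the wiring constants in the original code on the exhibit), and the small stubs only log calls so the logic can be compiled and followed on an ordinary computer; on the real device, digitalWrite and delay are provided by the Arduino core.

```cpp
#include <cassert>
#include <string>

// Stand-ins for the Arduino core, so this sketch runs off-device.
// They only record what would happen on the real board.
const int HIGH = 1, LOW = 0;
std::string trace;                      // log of simulated pin activity
void digitalWrite(int pin, int level) {
    trace += "pin" + std::to_string(pin) + (level ? "=ON " : "=OFF ");
}
void delay(int ms) { trace += "wait" + std::to_string(ms) + " "; }

const int YELLOW = 3;                   // assumed pin of the yellow LED

// One pass of the Arduino loop(): the board calls this forever,
// so the yellow light blinks continuously, half a second on, half off.
void loop() {
    digitalWrite(YELLOW, HIGH);
    delay(500);                         // milliseconds: 1000 = 1 second
    digitalWrite(YELLOW, LOW);
    delay(500);
}
```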
Machine language was created with the first computers and is closest to how the computer itself operates. A machine-language program is written in the form of zeros and ones, which directly drive the building blocks of the control unit, so the computer can perform the written instructions without any translation. The instructions in machine language reflect the inner structure (architecture) of a computer, so each computer has a different machine language. Programming in machine language is time consuming and far from human-readable, and is thus only rarely used in practice.
Symbolic machine language – Assembly language
In Assembly the series of zeros and ones are replaced with mnemonics (easy-to-remember names of instructions), memory locations (instructions for finding information inside the computer) and symbols. Such linguistic accessibility gives the programmer a better overview of the basic structure of the program, but programming in Assembly is still relatively demanding. The first assembler was invented by the mathematician and computer programmer Kathleen Booth in 1947 for the ARC2 computer. Today Assembly is still in use where speed of execution is of top importance, for instance in the most basic operations of an operating system. Assembly also represents a tie between machine language and higher programming languages.
We call machine code and assembly languages the first and second generation of programming languages, because both are tailored to the measure of the machine, while the higher languages of later generations strive to be made with humans in mind first.
For the curious:
The set of machine commands available in an assembler is called the Instruction Set Architecture (ISA) and is highly dependent on the actual architecture of the central processor itself. An ISA consists either of many specialized commands (Complex Instruction Set Computing or CISC) or of a few simple commands which execute much faster (Reduced Instruction Set Computing or RISC). The most common architectures today are Intel x86/x64 in personal computers, built on CISC principles, and ARM in mobile devices, which follows RISC principles.
The swift increase in hardware capacity triggered a growing complexity of software, and programming in machine code was quickly becoming practically unmanageable. Besides, software written for one computer wasn’t really transferable to other computers. So early on the idea arose of an abstract programming language, similar to human language but translatable into machine code by the computer and applicable to many different computers.
The first attempt at such an invention was Plankalkül, conceived by Konrad Zuse in 1941 but never actually implemented. The first practically useful programming languages came about in the 1950s, such as Short Code (1949), where each instruction was translated individually into a string of machine commands (the principle of »interpretation«), and Autocode (1952), where the whole program was translated in advance and only then executed (the principle of »translation«, used by compilers). In 1954 IBM developed Fortran (the name comes from Formula Translation) to serve the needs of scientific computing. This was the first widely used higher programming language. Then in 1959 the CODASYL consortium created COBOL for the needs of business computing. Both of these languages are still in extensive use today.
At this point the development of software became independent from the development of hardware and at the same time a shift was beginning to take place in the relationship between the computer and the user. The working of the machine began to adapt more and more to the human users and not the other way around as it was common before.
Many languages created in this early period are still being used today, partly for historic reasons, partly because of the tried and tested reliability of their execution. Besides Fortran and COBOL, Algol, LISP and BASIC stem from the 1950s and 1960s (C followed in the early 1970s), and practically all of them still influence the development of programming languages today.
For the curious:
Software written in higher programming languages needs to be translated into machine language before running. This can be done in two ways.
- The compiler translates source code written in a higher programming language into machine code. This creates a working executable program that we can run as many times as we wish. The first compiler was written by the mathematician and United States Navy rear admiral Grace Hopper in 1951. She also coined the term »compiler«.
- The interpreter continuously translates and executes command after command from the higher-level language into machine language. This process is slower than using a compiler, but has the benefit of not needing to wait for the whole program to be translated (compiled) before execution starts. Interpreting takes place each time we run the program anew.
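To make the difference concrete, here is a toy interpreter for a two-command mini-language invented purely for illustration: it fetches one command at a time and executes it immediately, never seeing the rest of the program in advance (a compiler, by contrast, would translate the whole text before anything runs).

```cpp
#include <cassert>
#include <sstream>
#include <string>

// Interpret a program such as "add 5 add 3 sub 2": each command is
// read and executed on the spot, one after another.
int interpret(const std::string& program) {
    std::istringstream in(program);
    std::string op;
    int value = 0, n = 0;
    while (in >> op >> n) {             // fetch the next command...
        if (op == "add") value += n;    // ...and execute it right away
        else if (op == "sub") value -= n;
    }
    return value;
}
```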
BASIC (Beginner’s All-purpose Symbolic Instruction Code) is a programming language for beginners, developed in 1964 as a tool for teaching non-science students how to program. Because its interpreter took up very little space, it later became a very popular built-in language in the home computers of the 1980s: it could be “baked” into the read-only memory (ROM) and was thus available immediately at the startup of the computer.
Try your hand at the basics of computer programming in the original ZX Spectrum BASIC, run by the ZX-UNO emulator. For help with entering commands ask those born before 1980, because commands are not entered letter by letter but with unique combinations of keys on the keyboard. The simplest example is Hello world!:
10 PRINT "Hello world!"
20 GOTO 10
Slightly more complex examples can be found in the book Mirko tipka na radirko (Slovene Mirko typing on rubber), published by Moj Mikro in 1985.
Websites are not just text and pictures, as it might seem at first glance, but complex software programs, written in many programming languages. The web browser (Chrome, Safari, Firefox, Edge …) runs them on our local machine and shows the result on our screens. And all software can be reconfigured and altered, for good and for bad.
Did you know that web browsers contain options with which you can alter the currently shown content of a web page? Right-click to get a menu where you select »View source« to see the source code in the HTML language. Even more powerful tools hide under the mysterious »Inspect«. Web developers use this tool to aid their development work, but nothing can stop us from, for instance, editing the news on a technology forum like Slo-Tech to include the latest amazing achievement of the local astronauts and showcase their landing on the Moon.
Dare to explore?!
Right-click on the title of the first piece of news on the page and then select »Inspect« from the menu. A complex toolbox for viewing and editing various components of the website, including the HTML code, opens on your right. Find the code segment that holds the content of a news title and double-click on it to enter a new version. Then hit »Enter«. It’s that simple!
Of course, the change is only active for the current local display of the website on the computer in front of you. But if somebody takes a screenshot and shares it over social media, this can trigger a fake-news tsunami pretty easily, so it’s always wise to check the cited source of news directly.
Now try altering the body text of the article. After right-clicking and selecting »Inspect«, click on the down arrow to expand the HTML element hiding the content of the article. If you want to delete a certain element of the page, it suffices to press the »Delete« key.
For consideration: the ease of meddling with the integrity of a webpage means many things both for the user and for the web developer. If in the last few years you’ve been phoned by someone who in their bad English requests access to your computer, there is a danger you were the target of a fraud in which the fraudster used the »Inspect« function to fake your bank statements, showing you had received certain funds by mistake, and urged you to make up the difference in cash or coupons. Only by using the »Refresh« function in your browser, which loads the web content from the server anew, would you see you had been fooled.
More info: varninainternetu.si
We usually equate coding with endless lines of typed code glowing greenish on a black background, screen after screen. The biggest programs are built out of hundreds of millions of such lines of code in various text-based coding languages. But there are also programming languages where coding is done by visually sequencing building blocks that represent individual operations. The advantages of such programming are better visibility, a decreased chance of mistakes, real-time results and a simple learning process. All this is present in Scratch, a visual coding language with which we can create digital narratives, games and animations. It is intended for young audiences, from 8 to 16 years old, but its enthusiastic users include quite a few adults.
Let’s dive in. We start Scratch on a computer or open https://scratch.mit.edu/create in a browser. We are greeted by an interface divided into a bigger space containing the code editing area, graphic elements and sounds, and two smaller spaces. One of them hosts the stage, showing us the current result of the code; the other is meant for editing characters and backgrounds. A cat says hello from the stage and awaits our instructions.
To begin, find the fold labeled Videzi (appearances) in the code editor and in it search for the building block »Reci živijo za 2 sekundi« (say hello for 2 seconds). Use the mouse to drag it into the empty space for building the program. Click on it and observe what happens on the stage. We can change the block text by clicking on it and editing it. Similarly we can change the greeting duration. We can add movement to the cat by selecting »Pojdi 10 korakov« (go 10 steps) in the fold Gibanje (movement) and dragging it near the »Reci živijo« (say hello) block, so a gray shadow appears below it. Then we drop the block and make sure it sticks. Click on the connected blocks and observe the cat on the stage.
Let’s add a new character. Click on the cat head in the bottom right corner of the panel with characters. Each character has its own program, which we can access by selecting the desired character from the list. If we want the programs of several characters to execute simultaneously, we have to add the start block »Ko kliknemo na zeleno zastavico« (when the green flag is clicked) from the fold Dogodki (events) to the beginning of each program. The programs can then be started by clicking on the green flag above the stage.
Now that you have some orientation around the basics, feel free to explore the rainbow of content available in the menu Vadnice (exercises).
Nowadays it’s clear Linux has won. It runs on the best supercomputers, on billions of smartphones and on a sizable chunk of web servers. But two decades ago that was not so obvious. It took a lot of effort from the LUGOS society and the Pingo Linux team, in collaboration with the publishing house Pasadena, to make the Slovene, free and open source Linux a staple in our schools and homes.
Pingo Linux 1.0 (2000), with the KDE desktop in Slovene, was based on the Red Hat 6.2 distribution. Since 2002 Pingo Linux has been installed as the dual-boot option on all new computers the Ministry of Education co-finances for educational institutions. In the years to follow, versions 2.0, 3.0, 4.0 and 4.1 materialized. Today all the most popular distributions of Linux, e.g. Ubuntu, are available in Slovene from the get-go.
System specifications: AMD K6-2 @ 400 MHz, 32 MB RAM, 6 GB hard disk
Use an actual dial-up modem and surf the web archives from 1997 using Netscape Navigator 4.0, poke Clippy in Microsoft Word 97, browse articles in Encarta 97 or play the Slovenian platformers Willybald (1998) and SKB Papi (1998).
You will be able to hear the beeps and noises old dial-up modems made when connecting to the internet of yesteryear. Don’t worry, our nostalgia experience plan provides unlimited minutes. Wait around 60 seconds and you can visit archived websites from 1997 from The Internet Archive provided by the theoldnet.com proxy.
We installed Slovenian Windows 95 together with Slovenian Office 97 (the first version featuring Clippy) and put Encarta Encyclopedia 97 (the way schoolchildren and curious minds explored different topics before Wikipedia) in the CD-ROM drive.
For the little ones, there’s SKB Papi, one of the first Slovenian games made to advertise a product and the platformer Willybald, both designed by Dušan Kastelic.
System specifications: Pentium 133 MHz, 16 MB RAM, 1.19 GB hard drive, S3 graphics, Creative Sound Blaster 16 sound
Discover the unique EVA word processor, play the original Wordle – Lingo – and beat the opponent in Tarok for DOS. For the little ones there are fun math challenges, and you can also look through the Slovenian crossword dictionary or estimate the cost of placing a call from Ljubljana to Koper.
Primož Jakopin is a Slovenian computer scientist and linguist, known for his work on word processor software for multiple platforms: INES (ZX Spectrum), EVA (DOS) and STEVE (Atari ST). We have included a sample file alenga.eva which you can open in this program.
Many years before the popularity of Wordle there was a TV quiz show called LINGO, which was produced under license in many countries, including Slovenia, and was so popular that multiple homegrown video game implementations exist. The one presented here was made by Jože Starič. We were also able to dig up a forgotten DOS version of the card game Tarok, made in 1987 by an unknown author.
System specifications: IBM PS/1 model 2133-154, CPU 486SX @ 25 MHz, 8 MB RAM, 170 MB hard drive
Used to be copied from floppy to floppy, now presented together – Tarok (local trick-taking card game, 1995), Ena (card game similar to UNO, 1993, Robert Turnšek) and Vislice (Hangman word game, 1991, Miha Mazzini). We also installed the first Microsoft product in Slovenian, Words 6.0a (1995) and the demo version of Amebis’ “bookshelf” reference library (1997).
The installed Windows 3.1 operating system is the “Central and Eastern Europe” edition with support for the Slovenian locale together with our special characters ČŽŠ. You can find Word 6.0a in the “Microsoft Office” folder – it took 6 floppies to install it. Amebis’ reference library “ASP” used to be a popular way to access dictionaries and other resource libraries. In the “Games” folder you will find two locally popular card games – Tarok and Ena.
System specifications: IBM PS/1 model 2121-682, CPU 386SX @ 20 MHz, 6 MB RAM, 163 MB hard drive
Primož Jakopin’s word processors are legendary and we’re happy to display not only the MS-DOS version called EVA but also the Atari ST version, STEVE (1987). It was used in many educational and research settings, including its home base, the Faculty of Arts in Ljubljana.
The user interface of this program might seem simple at first, but it features many useful functions for efficient text editing. One can compress the text file size by up to 30%, which might not sound useful today, but at the time editing hundreds of pages on a computer was not trivial due to the limited RAM and storage space. It also supports the Slovenian characters ČŽŠ and many other local character sets, including custom ones if you want to write in some obscure script from the past. Using simple text markup one can also build a searchable database, and the features go on… Because of that, a 250-page user manual was included with the original distribution.
System specifications: Model 1040STF, 1 MB RAM, CPU Motorola 68000 @ 8 MHz, 3.5’’ 720K floppy disk drive
Did you know digital cameras are actually fully fledged computers that can even play video games? We got the classic first person shooter Doom (1993) running on one. A work landline phone also hides a computer inside and can play Doom. Why read books on your e-reader device when you can – play Doom? Scientific calculators can also be used more productively to play Doom. Still using your iPod mini to listen to music? Why not play Doom on it instead?
None of the devices on display have had their hardware modified in any way; they are being put to new uses by a dedicated community of hackers and modders. The original software is put to new use either by developing new programs for it or, if it’s locked down, by exploiting security vulnerabilities to execute arbitrary new code. Because the classic DOS game Doom is open source, it is a popular choice for enthusiasts who want to prove their programming skills by running it on previously unsupported hardware; among them it’s considered “Hello World 2.0”.
The type of people who get intrigued by programming are the ones who enjoy building new worlds, tinkering, playing with logic… But the ones who stay and thrive are the ones who can survive something much more mundane and grinding: looking for errors.
Even though they’re called “programmers,” when they’re sitting at the keyboard, they’re quite rarely writing new lines of code. Most of the time, they are finding bugs.
For the curious:
One early use of the term was in 1876, when Thomas Edison complained about malfunctioning telegraph equipment he was developing. (“Awful lot of bugs still,” he wrote in his notebook later, while working on glitchy incandescent lights.) The phrase entered the lore of programming on the 9th of September 1947, when the programmer Grace Hopper traced an error on the huge Mark II machine to a dead moth that, seeking the warmth of the machine’s internal components, got pinned inside a relay and prevented the electromagnetic switch from closing. The insect was carefully removed and taped into the logbook, with a note added saying “first actual case of bug being found”.
A bug is an error in your code, something mistyped or miscreated, that throws a wrench into the flow of a program. They’re often incredibly tiny, picky details.
The joy and magic of the machine is that it does precisely what you tell it to. When a coder’s instructions are in error, the machine will obediently commit the error. And when you’re coding, there are a lot of ways to mess up the commands. Perhaps you made a simple typo. Perhaps you didn’t think through the instructions in your algorithm very clearly. Maybe you referred to the variable numberOfCars as NumberOfCars and messed up a single capital letter. Or maybe you were writing your code by taking a “library” (a piece of code written by someone else) and incorporating it into your own software, and that code contained some hidden flaw. Or maybe your software has a timing problem: the code needs Thing A to take place before Thing B, but for some reason Thing B goes first, and all hell breaks loose. There are literally uncountable ways for errors to occur, particularly as code grows longer and longer and has chunks written by scores of different people, with remote parts of the software communicating with each other in unpredictable ways.
The World's Most Costly Software Mistakes
The software specification didn’t include a ‘-’ after ‘r’ (for average radius) in one equation, causing the rocket to steer off course; it had to be destroyed 212 seconds into flight. The science-fiction author Arthur C. Clarke described the error as “the most expensive hyphen in history”.
$169M in today’s dollars / 160M EUR
Bug in a virus
A Cornell University student, Robert Morris, created a worm as part of an experiment; due to a coding error it ended up spreading like wildfire and crashing tens of thousands of computers. Morris’s mistake, instructing the worm to replicate itself regardless of a computer’s reported infection status, transformed the worm from a potentially harmless intellectual and computing exercise into a viral denial-of-service attack. The Internet was partitioned for several days, as regional networks disconnected from the NSFNet backbone and from each other to prevent recontamination while cleaning their own networks.
$10 million estimated cost / 9.5M EUR
The design for the original Pentium didn’t download to the etching machine correctly, causing a 1-in-360-billion chance of miscalculation.
Only certain combinations of numerator and denominator trigger the bug. One commonly reported example is dividing 4,195,835 by 3,145,727. Performing this calculation in any software that used the floating-point coprocessor, such as Windows Calculator, would let users discover whether their Pentium chip was affected.
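The check can be reproduced in a few lines: on a correct FPU, x - (x/y)*y comes out as (essentially) zero, while a flawed Pentium reportedly returned 256 for this famous pair of operands. A minimal sketch:

```cpp
#include <cassert>
#include <cmath>

// The classic FDIV self-test: divide, multiply back, and look at the
// remainder. On correct hardware it is essentially zero (up to tiny
// floating-point rounding); on a flawed Pentium it was far from it.
double fdivCheck(double x, double y) {
    return x - (x / y) * y;
}
```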
$475M in replacement chips / 450M EUR
Metric vs. imperial units
The US vendor’s software for measuring the thrust firing impulse used imperial units, while NASA used the metric system for everything else. As a result the orbiter entered the Mars atmosphere at the wrong angle and crashed.
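The mismatch boils down to a single missing conversion: one pound-force second equals about 4.44822 newton-seconds, so impulse values reported in imperial units but read as metric were off by that factor. A sketch of the conversion that should have been applied (the function name is ours):

```cpp
#include <cassert>
#include <cmath>

// 1 lbf*s = 4.44822 N*s. Reading a pound-force-second figure as if it
// were newton-seconds understates the impulse by this factor.
const double LBF_S_TO_N_S = 4.44822;

double asNewtonSeconds(double lbfSeconds) {
    return lbfSeconds * LBF_S_TO_N_S;
}
```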
$320 million / 300M EUR
A famous trading firm developed a new feature to replace old functionality, but kept the old code in the program behind a different setting. Then someone deployed the new program to 7 servers and accidentally left the old code running on the 8th server – in a high-frequency trading environment.
$440 million and a bankruptcy / 415M EUR
Poor code management
Developers left some guidance code from Ariane 4 running; it wasn’t needed, and it eventually crashed in the new environment and blocked the whole system.
$370 million for rocket and satellites / 350M EUR
Early programmers had to save space wherever they could, so it was customary to store and process years using only their last two digits. Nobody knew exactly how much of that “optimized” software would still be around when the millennium turned, which would make all those programs behave as if they had gone back in time 100 years. This was a major topic in the media for years, but fortunately almost no serious problems occurred at New Year 2000.
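The arithmetic behind the scare is easy to demonstrate. With only two digits stored, any computation spanning the century rollover suddenly jumps by 100 years; a sketch (the function name is ours):

```cpp
#include <cassert>

// An age computed from two-digit years works fine within one century
// but goes wrong the moment the year rolls over from 99 to 00.
int ageFromTwoDigitYears(int currentYY, int birthYY) {
    return currentYY - birthYY;  // 99 - 70 = 29, but 00 - 70 = -70
}
```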
The USA spent vast quantities to address the issue, with some estimates putting the cost at $100 billion.
Software deliberately created to cause disruptions on a computer (server, client or network), harvest private data, gain or enable unauthorized access to information or systems, deny users access to information or obstruct safety systems is commonly referred to as malware. But in everyday conversation we most often talk about computer viruses.
A computer virus is a program which behaves similarly to a biological virus: it needs a host program; it requires some activity from the user to spread from one system to another (so it often camouflages itself as a different type of program or data); and it attaches bits and pieces of its own code to other files or replaces files with copies of itself.
There are also worms, which in contrast to viruses do not require users to do anything before they activate. Worms are entirely self-sufficient malware, which copies itself and propagates from the moment it penetrates a system. A trojan horse with back doors is one of the simplest, yet most dangerous trojan programs. It can load other malicious software onto a system or at least make the system susceptible to external attacks. The backdoor entry is often used by botnets, which turn a computer into part of a zombie network used for further attacks, all without the user’s knowledge.
Malware has been part of our reality for over 60 years, but what once was just cyber vandalism quickly grew into cybercrime that can affect just about anyone.
The first mention of a computer virus came at the end of the 1940s in lectures by the mathematician John von Neumann, which weren’t published until 1966, in an article titled Theory of Self-Reproducing Automata. He speculated about the ability of a mechanical organism, such as a piece of computer code, to reproduce and infect hosts just like a biological virus.
The first computer virus, Creeper, was coded by Bob Thomas of BBN in 1971. Creeper was designed as a safety test to see whether self-replicating software was a feasible idea. With each newly infected hard disk, Creeper removed itself from the previous host. Creeper had no malicious intent; it only flashed a message saying “I am Creeper. Catch me if you can!”.
The first Trojan horse, ANIMAL, came about in 1975 and was authored by John Walker. In those days, games in which the computer guessed an animal through 20 questions were very popular. Walker’s version of the game was gaining a following, and to ease his distribution work, which meant a lot of copying and mailing of magnetic tapes, he wrote a companion program called PERVADE. PERVADE installed itself alongside the game without the users’ knowledge and copied ANIMAL into every directory where the game was not yet present. The intent was harmless, but the mechanism is known today as the dreaded Trojan horse.
SAFETY FIRST: We can protect ourselves from computer viruses by behaving carefully and responsibly. This means not opening suspicious files or answering unexpected emails, especially if the sender is asking us for information. We should check files with antivirus programs and keep our antivirus software updated to the newest version, because new viruses and new variants of old ones pop up daily.
WHO CAN INFECT US: Malware is made for fast, non-selective propagation, so chances are small that someone was targeting us specifically. Rather than spending time trying to figure out where we got infected, we should make sure we don’t spread the infection onwards. If we suspect intentional unlawful behavior, however, we should inform the police.
Apple, Microsoft, Amazon, Alphabet, Meta, Alibaba, Tencent … The biggest companies in the world make money with software.
In the 1960s the number of computers in the world rose from 4,400 to 63,000, and with it rose the demand for various types of software.
In 1969 IBM, at that time by far the largest producer of large computers, unbundled software from hardware sales. That tectonic shift in computer business opened the floodgates for a whole new industry of software programs.
Software programs were no longer written for each customer and use case individually. Instead, they started to be developed in a way that facilitated easier installation and use in different environments, and shipped with instruction manuals for users. By that time Moore’s law (1965) had already kicked in as well – the observation that roughly every two years the number of transistors on a chip doubles, while computers become smaller and more affordable. By the early 1980s research institutions and big companies already owned at least one microcomputer, and an array of home computers and game consoles was being born, triggering giant new markets for home-use software. This brought on an era of mass distribution methods for software, such as boxed software in stores and mail-order catalog sales. The decades to follow saw an era of Free and Open Source software, the era of the World Wide Web, a Microsoft decade and the era of today’s mobile app stores.
In Slovenia we trace the beginning of accelerated and systematic development of computing to the period after 1971, when the 5th IFIP congress was organized in Ljubljana and higher education programs for computer science began to be set in place. The early 1980s brought a double revolution: (1) the local producer Iskra Delta began selling its own software for mainframes and minicomputers, and (2) many households could afford a home computer. The latter fueled the success of the games industry, which nevertheless suffered a steep downturn, rebounding only after 1990. That is also the time to which we trace the first local creative studios.
Microcomputer software was created largely by enthusiasts. A vibrant exchange of software went on through computer magazines and even at flea markets. The biggest software distributors were well-organized pirates.
Because software for home computers was stored on regular cassettes in the form of an audio track, it was possible to distribute software on vinyl records and even over the air. The best-known example of this in Slovenia is the game Kontrabant, broadcast by the radio station Radio Študent.
After Slovenia gained independence from Yugoslavia in 1991, we again witnessed two revolutions simultaneously: (3) Iskra Delta disintegrated and the liberalization of the market enabled a large number of small companies offering, among other services, software development; soon after, (4) the World Wide Web appeared.
We can state that after the year 2000 Slovenia was on par with global trends:
- Around the year 2000 the dot-com bust caused a market crash in the USA, largely because of overpriced web service companies. Just one month later, Hermes SoftLab, our biggest software company, acquired Zaslon, a small company known mostly for NLB Klik, the first Slovene online banking system.
- In 2008 Apple launched the App Store; one year later Outfit7 was incorporated in Ljubljana. Their mobile game became so popular that some international sources credit it with driving a segment of mobile phone sales as well.
- In 2011 Bitstamp was established in Kranj as one of the first and leading exchanges for the Bitcoin cryptocurrency.
For the curious:
The first computers were delivered to users, mostly scientists, without any software, only with machine language. For each analysis the computer then had to be programmed separately, much like we enter an equation into a calculator. These programs were designed by hand, and that is how the saying that »you only need a pen and notebook to create a software program« came about. Later the notes were typed onto punch cards. Storing a larger program could require over 100,000 punch cards.
Most of today’s timekeeping software counts seconds from January 1st 1970 (Unix time), and on many systems the counter is a signed 32-bit integer. All systems which fail to upgrade to 64-bit counters by January 19th 2038 will thus run into the same problem as with Y2K – they will virtually travel back in time, to December 13th 1901.
We do not know how many such programs will still be in use by then. We do not know what damage this can cause. What we do know is that the ubiquitous presence of software will be felt much, much more than it was in the year 2000.
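The arithmetic behind the 2038 limit can be checked in a few lines (a minimal sketch; how a real legacy system actually wraps around depends on its hardware and libraries):

```python
from datetime import datetime, timedelta, timezone

EPOCH = datetime(1970, 1, 1, tzinfo=timezone.utc)

# A signed 32-bit counter tops out at 2**31 - 1 seconds after the epoch ...
last_tick = EPOCH + timedelta(seconds=2**31 - 1)
# ... and one more second wraps it around to -2**31 seconds before the epoch.
wrapped = EPOCH + timedelta(seconds=-2**31)

print(last_tick)  # 2038-01-19 03:14:07+00:00
print(wrapped)    # 1901-12-13 20:45:52+00:00
```

The two dates printed are exactly the January 19th 2038 deadline and the December 13th 1901 destination mentioned above.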
Nintendo video game consoles are not just for playing games and beating every level – check out the Mario Paint cartridge, which is operated with a mouse. There’s something for everyone: draw the next masterpiece, create an animated short or instantly become a music composer.
The Super Nintendo Entertainment System console (1991) supports an official mouse controller besides the usual gamepads. It was bundled with the Mario Paint (1992) cartridge, which unlocks new creative possibilities in each of us. It is presented on real hardware from our collection! And because playing together is more fun, we added a second seat using an SNES emulator running on a Raspberry Pi 4. Telling you how to get to all the features would not be as fun as letting you explore on your own!
Check out our DOS gaming hits picks!
Travel back in time to an era when games were not riddled with in-app purchases for virtual items, full of ads or promoted by social media influencers – to a time when fun and challenging gameplay was king.
We’re confident that you will find your favorite game on our list. Relive Prince of Persia (1989), save the Earth from an alien invasion in Duke Nukem 3D (1996) and build the best logistics network in Transport Tycoon Deluxe (1994)!
Using a menu system visually similar to Norton Commander, you can quickly select one of our offerings and experience the nostalgia. For games that are tricky to quit back to DOS, you will find instructions printed next to the machine (or would you have guessed Alt+Q was the right key combination?). Some games might not be suitable for children and are password protected – the password is IBM.
System specifications: Pentium 133 MHz, 16 MB RAM, MS-DOS (from Windows 95), SD-to-IDE adapter + 8 GB memory card, S3 graphics.
The IBM PC model 5150 was the first Personal Computer, launched in 1981. The effects of it dominating the personal computer market (due to using open interfaces and standards available to the competing companies that wanted to build compatible machines) are still felt today as I type this text on my Intel x86 (with modern 64-bit extensions) machine.
This specific machine was donated by the International Atomic Energy Agency (IAEA) to the “Jožef Stefan” Institute and was at the time the most powerful computer at the Podgorica Reactor Centre in Dol pri Ljubljani. It was used to run the DMR043 program, which measured reactor core reactivity levels at Krško, the only Slovenian nuclear power plant. What you see is a playback of recorded data in a loop. Reactivity is the rate at which neutrons multiply in the reactor: it is positive when the power of the reactor is increasing, negative when the power is decreasing, and zero when the power is stable.
This early PC runs at 4.77 MHz and doesn’t contain a hard drive – we first boot PC-DOS 2.10 from one 5.25″ floppy and then replace it with another containing the homegrown DMR043 program – the version demonstrated was developed by professor Andrej Trkov.
These days the stereotype of a coder is a young man in a hoodie (the stereotypical geek), with vaguely anti-establishment views or motivated to make a quick million.
What type of person becomes a coder has changed over the history of computing, alongside the development of the technology itself and its role in society.
The first generation of programmers were the ones who didn’t yet necessarily think of coding as a career and who were part of big serious teams working on big serious projects.
The first generation of computers was barely more than a giant collection of binary switches, each representing a single bit, with 0 and 1 standing for off and on. With a specified number of bits in a row, you could write numbers (1101, for instance, represents the number 13).
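The positional arithmetic behind that example can be spelled out in a couple of lines (a sketch in Python, a language the machines of that era of course did not run):

```python
# Each switch contributes its positional weight: 1101 = 1*8 + 1*4 + 0*2 + 1*1
bits = "1101"
value = sum(int(b) << (len(bits) - 1 - i) for i, b in enumerate(bits))
print(value)           # 13
print(int("1101", 2))  # 13 – Python's built-in base-2 parser agrees
```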
Programming took place in assembly language and demanded not only precision but also frugality. The memory of most computers of the time was limited to no more than 4,000 coding “words”. To ease programming, the computer languages Fortran and COBOL were developed in the late 1950s, enabling programmers to use English words to write code.
Employing programmers meant taking on candidates with no previous experience, because only a select few had ever had the chance to code before. Stanford University only formed a computer science department in 1965, the University of Ljubljana in 1973.
The most important characteristic of a computer programmer was a highly selective, precise mind, and it was commonplace to consider women natural programmers. Quite a few women worked as cipher coders during World War II, breaking codes at Bletchley Park in the UK, while in the USA the first ENIAC programmers computed ballistic trajectories. In the 1960s, however, the attractive, highly paid jobs were in hardware engineering, developing the technology that fueled faster reading of data from computer memory. Programmers were thus often viewed as little more than “typists”.
The second generation of programmers is embodied by the rebellious hackers of the 1960s and early 70s. They understood their mission as liberating computers from the shackles of grim, strict research organizations. The second generation of computers (like the PDP-1) was “conversational”, equipped with displays and keyboards. During the day these computers crunched numbers for scientific work at the institute, but at night (as was soon discovered at the MIT Artificial Intelligence Laboratory) they were often free and available to various interested individuals, who enjoyed the freedom of no one telling them what to code and what not to code. They were the first generation of programmers free to code something purely for fun or entertainment, such as playing music or computing chess moves. For the first time, programming became a direct interaction with the computer, which responded to human commands in milliseconds.
Programming skills were learned through radically open knowledge sharing. Good code was admired, regardless of who programmed it. But the most important characteristic of the generation was the non-commercial “hacker ethic”, which later gave birth to the Free and Open Source Software movement. They firmly believed that computer programs are a form of artistic expression and that direct contact with computers must be available to everyone.
But because this mostly male group tended to congregate late at night in the dim light of cathode displays, they were also the first generation of programmers who, with their unkempt, rebellious, fraternity-house subculture, managed to systematically push women away.
The third generation of programmers sprouted from a period in which manufacturers decided to bring computers closer to the masses. In those days you could literally stumble upon computing.
Computers became more and more accessible to the budgets of middle-class families. You could connect them to a home TV screen and instantly start programming, just like the hackers at MIT. The allure was even greater for the young generation because fairly good music and graphics capabilities made these computers ideal for simple games and entertainment programs.
Programming took place in BASIC (Beginner’s All-purpose Symbolic Instruction Code), which was even simpler to use than its predecessors Fortran and COBOL. A beginner’s instruction manual to help you learn programming came with most computers at purchase.
The most important characteristic of this generation was an endless amount of uncontrolled time for random experiments from the home living room and an environment in which the number of programmers around the world increased dramatically. This triggered frequent software exchanges and a new distribution medium of computer magazines with faithful and responsive audiences.
Because the video-games scene mostly belonged to boys, this grassroots computer culture also tended to be geared more towards boys than girls. The popular pre-internet BBS (Bulletin Board System) exchange of messages with strangers on the other side of the planet was another thing parents tolerated more in boys than in girls. But BBSs were an inevitable step in technology development and taught this generation the value of open exchange of information and global connectedness.
The fourth generation of coders, which still rules the world, grew up in an era of mobile phones and web technologies, which more than any technology before enabled unrestricted entry into the world of programming for complete beginners.
Software attracted and still attracts large sums of investment, so programming is seen as a viable path for more and more socially indifferent young people, mostly men, who in past decades would have sought their “fast millions” on Wall Street instead.
This new generation of programmers will have a groundbreaking role in the future of humanity, and at the same time the role of programmers themselves will be fundamentally transformed by the use of artificial intelligence (AI) technologies.
The programmer’s input will be redirected into prompt engineering – creating the basic parameters and narrative for AI to actually write the code. This means a key skill will suddenly become the correct formulation of the initial prompt for AI to finish the code properly. Teaching AI includes feeding it large samples of program code, and much of AI’s behavior will be determined by the specific intricacies of the samples used for training. The programmer will thus need to pay extra attention to understanding the genesis of the technology used for completing the job at hand.
AI will take over some of the programming tasks an experienced programmer would now assign to junior assistants, and will in every practical sense become a member of the programming team. Within the workflow itself, programmers will be much closer to their AI toolbox, but on the other hand potentially limited by the direction built into it. Software creation will become faster, but understanding of the complete software stack will become even more elusive. Mistakes and assumptions will be transferred from program to program without deliberate human intervention, and without our knowledge or ability to identify them during the creative process. Software quality assurance and ethics monitoring will become important parts of software development. The world of programmers will split into those who create AI and those who work with AI. There will be significantly more of the latter, and with each of their coding contributions they will become teachers of a technology that feeds, in a feedback loop, on all created software. It is therefore important for this generation to take on the mission of consciously shaping the future of society, not only at the level of technological progress and business excellence, but also by developing socially responsible solutions and willfully addressing existing and future pressing issues. The next generation will only be able to do this if we as a society familiarize ourselves with technology and take it as our social responsibility to engage and contribute to the future development of software.
Artificial intelligence is the name for a technical system (computer or a group of computers and sensors), which can independently adjust behavior based on the analysis of the effect of its previous actions. In other words: it can learn.
An artificial intelligence system takes into consideration available information, uncertainties, possible advantages and weaknesses, and responds logically with the best possible response. The latter is determined based on a large set of data available for the learning process. When choosing a response, the artificial intelligence system makes decisions under known unknowns (unknowns in the environment the system understands logically) as well as unknown unknowns (new, not yet analyzed aspects). The result of such a process can bear similarity to natural animal intelligence and enables the computer to use algorithmic judgment and a pile of statistical calculations to approximate human intellectual work.
Artificial intelligence as an academic research field began in 1956, in Dartmouth (USA) where a small group of researchers from Dartmouth College, Harvard University, IBM Corporation and Bell Telephone Laboratories gathered at a Summer Research Program on Artificial Intelligence.
Researchers tried and abandoned many different approaches to artificial intelligence, from simulating the human brain, to modeling human problem solving, formal logic, big knowledge databases and imitating animal behaviour. The initial theoretical optimism was followed by disappointment in the mid-1970s, a period also known as the AI Winter. From the mid-1990s there was steady progress, enabled by increased computing power. In recent years, however, we have witnessed a previously unimaginable leap forward, mostly made possible by vast amounts of data gathered through the mass use of, among other things, free web services. In the near future, previously unused or even not yet discovered applications of artificial intelligence will cause big changes in society. But in fact AI is already very present in our everyday lives.
What is more, some of what we once considered artificial intelligence has lost its AI title (a phenomenon known as the AI effect), because we realized it is not »real intelligence«, just the next programming tool.
For the curious:
The Turing test is a test of a computer’s ability for independent thought and decision making, described in 1950 by the British mathematician Alan Turing. The user converses through a keyboard simultaneously with a human at one computer and a machine at the other. If the user cannot determine which of the two conversation partners is a machine, we say the machine passed the Turing test. In the article introducing this test, Turing foresaw that by the year 2000 machines would be able to convince 30% of judges of their humanity. At a competition in 2014 the chatbot Eugene Goostman, presented as a thirteen-year-old Ukrainian boy, managed to convince 33% of judges that it was human. An example of a (reverse) Turing test we meet in everyday computer use is CAPTCHA, the »Completely Automated Public Turing test to tell Computers and Humans Apart«. With the help of a simple test which is hard for a machine to solve, we determine whether a user is a computer or a person, and thus prevent malware from, for instance, filling out web forms.
A segment of a conversation with Artificial Intelligence:
Is AI a computer program?
No, AI is not a computer program. AI is a field of computer science that deals with the creation of intelligent agents.
Is AI a computer programmer?
No, AI is not a computer programmer. AI is a field of computer science that deals with the creation of intelligent agents.
How does AI relate to computer programming?
AI is a field of computer science that deals with the creation of intelligent agents, which are systems that can reason, learn, and act autonomously. Computer programming is a process of writing instructions for a computer to execute.
Not everything that appears like artificial intelligence is actually intelligent. The first program designed as a conversational robot was Eliza in 1964; many people still remember Clippy from Microsoft Office in the 1990s, and even Vida, the friendly virtual assistant of the Slovene tax authority.
All these programs were based on some form of a predefined »if / then« scenario. Today we speak of real artificial intelligence only when it comes to programs which have the capacity to learn by themselves, and even to learn more than the programmer could foresee in advance. This field is called machine learning.
At first we spent decades teaching machines only with logical rules, resulting in a limited scope of knowledge, which brought on the research freeze known as the AI Winter. With increased computing speed, the 1980s brought a new wave of AI popularity with »expert systems«, algorithms for generating very large sets of logical rules, but here too researchers hit another developmental limit and a new Winter unfolded. After 1990 the first serious success stories appeared: in 1997 Deep Blue beat Kasparov in chess, in 2011 Siri came along, and in 2016 a computer first beat a human in the game of Go.
The biggest breakthroughs of the past years happened in the field of neural networks and deep learning. Similar to the human brain, a neural network program is built out of a myriad of interconnected identical subprograms (threshold functions), which self-modify through the process of learning in a way that allows each one of them to transform the input information and pass it onto its neighbors. If we simplify this, we can imagine each neuron casting a vote for one small part of a picture, while the decision on whether this is an image of a Chihuahua or a cookie is made by group consensus.
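The »vote« of a single artificial neuron can be sketched in a few lines (an illustrative toy, not any real network's code; real networks combine millions of such units with learned weights):

```python
def neuron(inputs, weights, threshold):
    """A classic threshold unit: weighted sum of the inputs, then a yes/no vote."""
    total = sum(x * w for x, w in zip(inputs, weights))
    return 1 if total >= threshold else 0

# Three pixel-like inputs; the weights say how much each one matters to this neuron.
print(neuron([1, 0, 1], [0.5, 0.9, 0.4], threshold=0.8))  # 1 – the neuron "fires"
print(neuron([0, 1, 0], [0.5, 0.9, 0.4], threshold=1.0))  # 0 – below threshold
```

Learning consists of nudging the weights and thresholds of all the units until the group consensus comes out right, whether the input is a Chihuahua or a cookie.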
For the curious:
Machine learning requires vast amounts of data, for example 100 million photos with descriptions, or all the translations of Wikipedia entries. In this batch of data the program looks for patterns. There are three basic machine learning models in use:
- supervised learning, where some input and output data is labeled, so the machine knows exactly when a correct conclusion is reached
- unsupervised learning, where we let the machine try to discover groups of related data on its own and label them
- reinforcement learning, where an algorithm is taught by reward and punishment
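A minimal illustration of the first model, supervised learning, is the classic perceptron rule: labeled examples nudge the weights until the answers come out right (a toy sketch, not tied to any system described here):

```python
# Supervised learning in miniature: learn the logical AND from labeled examples.
samples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w = [0.0, 0.0]
bias = 0.0

for _ in range(10):                     # a few passes over the labeled data
    for (x1, x2), label in samples:
        guess = 1 if w[0] * x1 + w[1] * x2 + bias > 0 else 0
        error = label - guess           # the label tells us exactly how wrong we are
        w[0] += 0.1 * error * x1        # perceptron update rule
        w[1] += 0.1 * error * x2
        bias += 0.1 * error

def predict(x1, x2):
    return 1 if w[0] * x1 + w[1] * x2 + bias > 0 else 0

print([predict(x1, x2) for (x1, x2), _ in samples])  # [0, 0, 0, 1]
```

After a handful of passes the program answers all four cases correctly, without ever being given the rule for AND explicitly.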
Today’s most extensive models include Microsoft’s Turing with 17 billion parameters and OpenAI’s DALL-E and GPT-3, the first with 12 and the second with 175 billion parameters.
Dream with Dall-e
Dall-e is a version of OpenAI’s GPT-3 artificial intelligence with 12 billion parameters, trained specifically for image creation. For comparison: the number of neurons in a human brain is estimated at 86 billion. When scientists tried Dall-e (named after the surrealist artist Salvador Dalí), they discovered it can create images of many things, from anthropomorphic animals to combinations of seemingly incompatible objects, and can draw text as well as imitate painting techniques. None of this was taught directly to the underlying AI system; on the contrary, the neural network trained itself through the analysis of large collections of images and their descriptions. Try some of Dall-e’s capabilities by choosing prompts for the program:
- Drawing multiple objects: it is able to draw, for instance, a hedgehog with a red hat, yellow gloves, a blue t-shirt and green pants. It turns out Dall-e can reliably draw up to three items, but tends to perform less well as complexity increases.
- Drawing perspective and space: Dall-e can for instance draw a head from every angle, so that combined images smoothly connect into animated movement.
- Drawing internal and external structure: Dall-e can draw an X-ray or a cross section.
- Extrapolating related features: if we prompt it to draw a »capybara staring at a sunset«, it will correctly add the long shadow, specific to evening hours, when the sun is low on the horizon.
- Merging unjoinable concepts: it can design products like an »avocado chair«.
- Knowledge of geographical specifics and famous buildings
- Knowledge of historical periods and styles.
- Creating text from a prompt
- Reshaping text (translating, summarizing)
- Responding with facts
- Clearly state what you expect, feel free to use examples.
- Input quality data.
- Check the temperature and “top_p” settings to define how deterministic an answer you would like. If you want only the single most likely answer, set them to lower values.
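What the temperature setting does can be illustrated with a softmax with temperature, the mechanism such language models commonly use when sampling the next word (a standalone sketch, independent of any specific API):

```python
import math

def softmax(scores, temperature):
    """Lower temperature sharpens the distribution; higher temperature flattens it."""
    exps = [math.exp(s / temperature) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

scores = [2.0, 1.0, 0.5]     # raw model scores for three candidate words
print(softmax(scores, 0.2))  # first word gets ~99% – nearly deterministic output
print(softmax(scores, 2.0))  # much flatter – more varied, "creative" output
```

The top_p setting works on the same distribution from the other side, by cutting off the unlikely tail before sampling.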
OpenAI’s mission is to ensure that artificial general intelligence (AGI)—by which we mean highly autonomous systems that outperform humans at most economically valuable work—benefits all of humanity. We will attempt to directly build safe and beneficial AGI, but will also consider our mission fulfilled if our work aids others to achieve this outcome. To that end, we commit to the following principles:
Broadly Distributed Benefits
- We commit to use any influence we obtain over AGI’s deployment to ensure it is used for the benefit of all, and to avoid enabling uses of AI or AGI that harm humanity or unduly concentrate power.
- Our primary fiduciary duty is to humanity. We anticipate needing to marshal substantial resources to fulfill our mission, but will always diligently act to minimize conflicts of interest among our employees and stakeholders that could compromise broad benefit.
- We are committed to doing the research required to make AGI safe, and to driving the broad adoption of such research across the AI community.
- We are concerned about late-stage AGI development becoming a competitive race without time for adequate safety precautions. Therefore, if a value-aligned, safety-conscious project comes close to building AGI before we do, we commit to stop competing with and start assisting this project. We will work out specifics in case-by-case agreements, but a typical triggering condition might be “a better-than-even chance of success in the next two years.”
- To be effective at addressing AGI’s impact on society, OpenAI must be on the cutting edge of AI capabilities—policy and safety advocacy alone would be insufficient.
- We believe that AI will have broad societal impact before AGI, and we’ll strive to lead in those areas that are directly aligned with our mission and expertise.
- We will actively cooperate with other research and policy institutions; we seek to create a global community working together to address AGI’s global challenges.
- We are committed to providing public goods that help society navigate the path to AGI. Today this includes publishing most of our AI research, but we expect that safety and security concerns will reduce our traditional publishing in the future, while increasing the importance of sharing safety, policy, and standards research.
- In bus stations
- In camera lenses
- In stickers on the shelves at the pharmacy
- In place of Queen Elizabeth II
- In bread maker machines
- In your pocket (smartphone)
- In electricity plugs
- In phone cables
- In doorbells
- In closets (server)
- In bank cards
- In smoke detectors
Software ate the world!
Online shopping and advertising
Artificial intelligence is widely used to provide personalized recommendations to people, based for example on their previous searches and purchases or other online behavior. AI is hugely important in commerce: optimising products, planning inventory, logistics etc.
Digital personal assistants
Smartphones use AI to provide services that are as relevant and personalized as possible. Virtual assistants answering questions, providing recommendations and helping organize daily routines have become ubiquitous.
Machine translations
Language translation software, based on either written or spoken text, relies on artificial intelligence to provide and improve translations. This also applies to functions such as automated subtitling.
Smart homes, cities and infrastructure
Smart thermostats learn from our behavior to save energy, while developers of smart cities hope to regulate traffic to improve connectivity and reduce traffic jams.
Cars
While self-driving vehicles are not yet standard, cars already use AI-powered safety functions, such as automated sensors that detect possible dangerous situations and accidents. Navigation is largely AI-powered.
Cybersecurity
AI systems can help recognize and fight cyberattacks and other cyber threats based on the continuous input of data, recognizing patterns and backtracking the attacks.
Fighting disinformation
Certain AI applications can detect fake news and disinformation by mining social media information, looking for words that are sensational or alarming and identifying which online sources are deemed authoritative.
Health
Researchers are studying how to use AI to analyze large quantities of health data and discover patterns that could lead to new discoveries in medicine and ways to improve individual diagnostics.
For example, researchers developed an AI program for answering emergency calls that promises to recognise a cardiac arrest during the call faster and more frequently than medical dispatchers.
Food and farming
Many farms across the EU already use AI to monitor the movement, temperature and feed consumption of their animals.
Nothing Beats the Humble Command Prompt
Many power users still prefer text-based user interfaces today, as they enable efficient, quick and accurate control of local and remote computers. You can use our well-preserved example of a Gorenje Delta PAKA 3000 terminal (1985) to browse our museum inventory system. And just because we can, we invite you to take a photo with the included webcam, which will be transformed into text-based ASCII art that you can then send via email if you wish.
The industry-standard protocol for serial communication – RS-232 – goes back decades and can still be interfaced with using USB adapters. Through it, the terminal talks with a Raspberry Pi 4 small form factor computer running Linux, where it takes just a few lines of code to enable communication between the two. The command prompt presented to the user is a modern Node.js-based program. It even supports the special Slovenian characters ČŽŠ. Photos taken with the webcam are converted to letters, numbers and other characters on screen using a special algorithm – take a few steps back and the picture becomes even more recognizable!
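The core of such a brightness-to-characters algorithm fits in a few lines (an illustrative sketch in Python; the museum’s actual Node.js program surely differs in detail):

```python
# Map pixel brightness (0-255) onto characters of increasing visual density.
RAMP = " .:-=+*#%@"

def ascii_art(pixels):
    """pixels: rows of grayscale values; returns one string of characters per row."""
    return ["".join(RAMP[p * (len(RAMP) - 1) // 255] for p in row) for row in pixels]

image = [
    [0, 64, 128, 192, 255],   # a tiny gradient standing in for a webcam frame
    [255, 192, 128, 64, 0],
]
for line in ascii_art(image):
    print(line)  # prints " :=*@" and "@*=: "
```

Squinting (or stepping back) blurs the characters back into shades of gray, which is why the picture becomes more recognizable from a distance.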
This rare BETA version was last seen 27 years ago. It was distributed to select beta testers and the press but they were prohibited from keeping a copy of it. Experience the classic desktop games: Solitaire, Minesweeper, Freecell and Hearts. Even though it is not the final Windows 95 build released to manufacturing, it’s close enough that you can experience the wonderfully illustrated interactive picture book “Gal in the Gallery” (1996, Svetlana Makarovič, National Gallery of Slovenia) and the CD-ROM-based point & click adventure Neverhood (1996, Steven Spielberg).
We were lucky enough to obtain the installation files for this Slovenian BETA version from an anonymous donor from abroad. We knew of its existence due to a review article in the September 95 issue of the Slovenian computer magazine Win.ini, but the CD was thought to be lost forever – until now! It is visually and functionally very similar to the final release build, though one can spot a typo or grammatical error if one looks hard enough – which is understandable for a beta.
System specifications: Pentium 133 MHz, 32 MB RAM, 809 MB hard drive, S3 graphics, Sound Blaster Vibra 16 sound
IBM's OS/2 Warp 4 (1996) was one of the few non-Microsoft operating systems to be localized into Slovenian in the 1990s. The task was entrusted to ZOP-CR d.o.o., a company specializing in localization, and the result was a Slovenian OS/2 Warp 4 that also included a Slovenian translation of the bundled Windows 3.1. This is an extremely rare version of OS/2, with which IBM tried to compete with Windows 3.1/95 but lost out in the end. We obtained and restored a copy during our local software archival efforts.
The original OS/2 Warp 4 install media is not compatible with larger hard drives and newer chipsets, so the 486 PC in front of you was a perfect fit: it is from the era and is 100% supported, including the graphics drivers.
Because OS/2 offered great compatibility with DOS and 16-bit Windows, and the whole environment was localized, this exhibit also includes Windows 3.1 in Slovenian – a translation the standalone Windows 3.1 never received.
System specifications: 486DX2, 16 MB RAM, 400 MB hard drive, Cirrus Logic graphics @ 800×600 256 colors and a blazing fast quad speed (4x) CD-ROM drive!
Microsoft seems to have nailed it with the Windows XP operating system (2001): even a decade after its release, users were reluctant to upgrade to the next version. Windows XP has not received support or security updates for many years now, yet many critical government and infrastructure systems worldwide still rely on it. This is a cause for concern, as such systems can be compromised merely by being connected to the internet.
Try the Professional edition with Service Pack 3 (2008) and the Slovenian language pack installed. This was the first version of Windows to require product activation. Even though the machine in front of you has a valid XP license, we were unable to activate it online – the activation servers are long gone – and had to call the Microsoft call center to obtain a long string of numbers and activate it manually.
You will find some classic software from the period installed: the music player Winamp and the bundled pinball video game Space Cadet. In the CD-ROM drive we have put the Slovenian localization of Amazing Animals (1998, DZS Multimedija), a beautiful piece of edutainment.
System specifications: HP Pentium Dual-Core E2200 @ 2.2 GHz, 2 GB RAM, 74.5 GB hard drive, integrated Intel graphics, integrated sound card
We installed the Slovenian Windows 98 Second Edition, turned on one of its over-the-top desktop themes – complete with a cool screensaver and sound effects – and added some black magic in the form of a 3dfx Voodoo2 (1998) 3D accelerator card. Now one can experience Half-Life (1998), Quake II (1997), Thief (1998) and Gex (1998) the way they were meant to be played. If you want to try something different, may we suggest looking up your old phone number in the Slovenian electronic phone directory from 1997.
System specifications: IBM Personal Computer 300GL, Celeron 500 MHz, 256 MB RAM, 1.19 GB hard drive, integrated sound card, S3 Tru3D + Voodoo2 graphics
The Iskra Delta Partner computer from 1983 is a homegrown desktop development and business machine based on the Z80 processor, also used in the popular ZX Spectrum home computer. We have four such exhibits in our collection, and a fair number of private collectors still enjoy them too. One such group put its newfound free time during the pandemic to use developing new software for the platform – from toolkits to video games that push the boundaries of this venerable old computer system.
The PARTNER (Part Time Nerds) group wants to make it easier for everyone to develop for the system and has created a new Software Development Kit (SDK) for it. Its members also repair and restore non-working Partner computers using modern tools and techniques (among them floppy emulators and more reliable ATX power supplies). They are building an archive of old storage media (floppy disks, hard drives and ROM chips), collecting and scanning user and service manuals, and developing advanced video games that use the Partner's high-resolution graphics mode. You can help the project by supplying new information and resources to aid their cause. All Z80 programmers are welcome, beginners included!
FOSS vs. Proprietary Software
Free and Open Source Software:
YOU CAN complain if the program wants you to agree to a GPL license
YOU CAN use the source code of each version of the program
YOU CAN use the software for free
YOU CAN explore and see how the software is constructed
YOU CAN change the software as you please
YOU CAN help a friend install a free copy
YOU CAN sell adjusted versions of the software
YOU CAN contribute with your fixes and further development of the software
Proprietary Commercial Software:
YOU MUSTN’T open the box if you don’t agree with the contract that’s in it
YOU MUSTN’T use the program before reading 20 pages of small print all the way to the end and ticking “I agree with the terms”
YOU MUSTN’T use the program if you don’t enter the 25-character activation key correctly
YOU MUSTN’T go offline, because then we can’t check your authentication anymore
YOU MUSTN’T look into how the program is built
YOU MUSTN’T lend the software to a friend
YOU MUSTN’T fix the bugs you find in the software
YOU MUSTN’T use any good trick from this software in your own work
YOU MUSTN’T complain if your usage data is sold to a third party
YOU MUSTN’T complain if we open a secret backdoor for the state
YOU MUSTN’T complain even if we’re no longer around and no longer support this software