We live in a digital world. Everything now is digital, from photography and communication to medicine and banking. Everyone is aware of the current IT trends, but it took a long time for us to get here. Here, I'll talk about the electronic age of IT. If you're wondering what else IT could consist of if not electronics, have a look at this article.
The modern IT industry is built entirely on electronics. It all began when we learned about electrical energy and how to use it. After that, the IT industry changed dramatically. The electronic era of information technology can be said to have started around 1940, when the first vacuum-tube computing machines were built. It took some time for us to put them to use, but not too long. In 1945, ENIAC was completed: a calculator (to be precise) with no mechanical parts, weighing about 30 tons, developed for the US Army. One thing this machine could not do was store a program, so the next big step was to develop a machine that could. Then the Manchester Mark 1 was born. It could store its programs and data, and it made a successful run in June 1949. These, however, were machines only able to perform in a lab. Then came LEO I, developed by J. Lyons and Co., which was commercially applicable and was used for business purposes, running its first business application in 1951. The next big step was the UNIVAC I mainframe computer, widely used for calculation and prediction; most notably, it predicted the result of the 1952 US presidential election won by Eisenhower.
The following generation of computers became smaller and more compact. Vacuum tubes were replaced by transistors, and magnetic-core memory was used for internal storage. Along with the hardware, the software was also getting an overhaul: programmers moved from machine language to assembly language, and compilers made higher-level programming practical. This marked the second generation of computers, which kept getting smaller and smarter. More expressive languages like COBOL and FORTRAN were also developed during this time, giving computers more kinds of work to do. The next phase of IT development was the semiconductor integrated circuit, used for logic and memory; integrated circuits were compact, faster, and more reliable. On the software side, operating systems were developed. In 1964, BASIC was created by John Kemeny and Thomas Kurtz at Dartmouth College in New Hampshire, and it can be said that this language changed the computing world. The next stage brings us to where we are today: large-scale and very-large-scale integrated circuits were developed, which led to the microprocessor, putting the control unit, the arithmetic and logic unit, and registers together on a single chip.
Though the Programma 101, the Soviet MIR series, and IBM's SCAMP were available on the market, they were aimed at commercial users, not consumers. However, this changed with the release of the Apple I. Though it was just a motherboard, it established the idea of a "personal computer". The first mass-produced and commercially successful personal computer was the Commodore PET, developed by Commodore International. Then Steve Jobs and Steve Wozniak introduced the Apple II, which revolutionized personal computing. It was developed with ordinary people in mind: it was cheap, reliable, convenient to use and, most important of all, very attractive. It turned out to be extremely successful. Another big hit was the Windows operating system from Microsoft, with its GUI (graphical user interface). It, too, was a smashing hit. From then on, things in the IT world kept getting smaller, more sophisticated, and more impressive.
Well, this is where our modern IT infrastructure comes from. Compared with how long it took us to reach this point, the electronic era unfolded remarkably quickly, but it is fascinating nonetheless and took an enormous amount of work.
So, when you fancy your latest gadget, don’t forget to admire the history behind it.