The History of Information Technology

Past and present IT

An Introduction to IT

In a modern context, the term ‘IT’ is commonly used to describe computers and networks within a business environment. It refers to their applications in generating, manipulating, storing, retrieving, transmitting, handling, exchanging, studying and securing data or information in an electronic format. IT is also used as an umbrella term covering television, telecommunications equipment, software, e-commerce and the internet.

When thinking about IT, you need to consider IT support in both your personal and professional life, especially given the increasingly sophisticated cyber crime we see every day. Good support ensures that when you are surfing the web or receiving an email, your personal and business data is kept safe. IT support also covers technical problems you may come across, ensuring you are using the most up-to-date software and the best tools available to complete tasks effectively.

Humanity has been manipulating, storing, and communicating information since the early Sumerians pioneered the written word in ancient Mesopotamia, circa 3000 BC. The term ‘IT’ did not appear until the mid-20th century, however, when an influx of early office technology arrived. It was first published in the 1958 Harvard Business Review, when authors Harold J. Leavitt and Thomas C. Whisler wrote: “the new technology does not yet have a single established name. We shall call it Information Technology.”

Timeline of important IT milestones

Although this section could go as far back as 2400 BC with the production of the first known calculator (abacus) in Babylonia, it will focus on the information technology boom in recent centuries.

The first mechanical computer device was conceptualised and invented by English mechanical engineer and polymath Charles Babbage in the early 19th century. Called the ‘Difference Engine’, it was originally created to aid navigational calculations. Often referred to as the ‘Father of the Computer’, Babbage went on to design the more general ‘Analytical Engine’ in 1833, which could be applied to fields beyond navigation. Funding constraints meant that Babbage died without seeing his machine completed; however, his son Henry completed a much simpler version of the machine in 1888, which was successfully demonstrated to the public in 1906.

Practical computers did not appear until the mid-1900s, when a more compact analogue electromechanical computer that used trigonometry was installed on submarines to solve the problem of firing torpedoes at moving targets.

The Z2, the first electromechanical digital computer, was invented by engineer Konrad Zuse in 1939 and used electromechanical relays to perform its calculations. Devices like the Z2 had very low operating speeds and were eventually succeeded by faster machines, such as the Z3 of 1941, the first fully automatic programmable computer, also created by Zuse.

Colossus, a set of computers created between 1943 and 1945, is widely recognised as the world’s first programmable electronic digital computer. Put to work during World War II, the Colossus machines were used to decipher encrypted German teleprinter communications produced by the Lorenz cipher machine. English computer scientist, mathematician, and theoretical biologist Alan Turing had conceptualised the modern computer in his seminal 1936 paper ‘On Computable Numbers’, whereby programmable instructions are stored in the memory of a machine.

Another early programmable computer was the Manchester Mark 1, developed at the Victoria University of Manchester. Frederic C. Williams, Tom Kilburn, and Geoff Tootill began working on the machine in August 1948, but the first operational version of the computer was not available for use until 1949. The Manchester Mark 1 caused controversy when British media outlets referred to it as an ‘electronic brain’, provoking a long-running debate with the Department of Neurosurgery at Manchester University over whether an electronic computer could ever be truly creative.

It was not until 1951, when electrical engineering company Ferranti International plc created the Ferranti Mark 1, that the world’s first general-purpose computer became commercially available. Also called the Manchester Electronic Computer, the Ferranti Mark 1 was first used by the Victoria University of Manchester.

The first computer used for processing commercial business applications, LEO I, was developed by British catering company J. Lyons and Co. in 1951 to increase business output.

A brief timeline of some other important events is listed below:

1835 – Morse Code invented by Samuel Morse

1838 – Electric Telegraph invented by Charles Wheatstone and Samuel Morse

1843 – Typewriter invented by Charles Thurber

1877 – Microphone invented by Emile Berliner

1888 – Hertz produces radio waves

1893 – Wireless communication invented by Nikola Tesla

1895 – Radio signals first transmitted by Guglielmo Marconi

1898 – Remote control invented by Nikola Tesla

1907 – Radio amplifier invented by Lee DeForest

1919 – James Smathers develops the first electric typewriter

1923 – Electronic Television invented by Philo Farnsworth

1933 – FM radio is patented by inventor Edwin H. Armstrong

1937 – Alan Turing conceptualises the computing machine

1948 – The Manchester Mark 1, one of the first programmable computers, is designed by Frederic C. Williams, Tom Kilburn, and Geoff Tootill

1951 – MIT’s Whirlwind becomes the first computer in the world to allow users to input commands with a keyboard

1956 – Optical fibre invented by Basil Hirschowitz, C. Wilbur Peters, and Lawrence E. Curtis

– The hard disk drive invented by IBM

1958 – Silicon chip: the first integrated circuits are produced independently by Jack Kilby and Robert Noyce

1959 – The first photocopier, the Xerox Machine enters the consumer market

1961 – Optical disc invented by David Paul Gregg

1963 – Computer mouse invented by Douglas Engelbart

– Cloud computing invented by Joseph Carl Robnett Licklider

1967 – Hypertext software invented by Andries Van Dam and Ted Nelson

1971 – E-mail invented by Ray Tomlinson

– Liquid Crystal Display (LCD) invented by James Fergason

– Floppy Disk invented by David Noble

– First commercially available microprocessor, the Intel 4004 is invented

1972 – The first video game console designed for use with TVs, the Magnavox Odyssey, is invented

1973 – Ethernet invented by Bob Metcalfe and David Boggs

– Personal computer invented by Xerox

1976 – The inkjet digital printer is invented by Hewlett-Packard

1982 – WHOIS (pronounced ‘who is’) is released as one of the earliest tools for looking up domain registration records

1984 – The first laptop computer enters the commercial market

1989 – World Wide Web invented by Sir Tim Berners-Lee

1990 – A student at McGill University in Montreal develops the first search engine named Archie

1992 – Complete I.T. Founded

1993 – Benny Landa unveils the E-Print 1000 as the world’s first digital colour printing press


1996 – The Nokia 9000 Communicator is released in Finland as the first internet enabled mobile device

1998 – Google established

– PayPal is launched, enabling large-scale payments via the internet

2000 – Microsoft develops the first tablet computer

2001 – Digital Satellite Radio

– Apple releases the iPod

2003 – WordPress, an open source website content management system is launched by Mike Little and Matt Mullenweg

– LinkedIn is established

2004 – Emergence of Web 2.0 – users move from passively consuming internet content to actively participating in creating it

– Facebook established by Mark Zuckerberg

2005 – USB flash drives replace floppy disks

– Google Analytics established

– YouTube is launched as a video platform

2006 – Twitter is launched to the public

2007 – Apple Inc. debuts the iPhone

– Amazon releases the Kindle, marking a new era in reading and book technology

2009 – Bitcoin is developed by an unknown programmer, or group of programmers, under the name Satoshi Nakamoto

2010 – Apple debuts the iPad

– The beginning of responsive website design

2011 – 22 nanometre computer chips enter mass production

2012 – Quad-core smartphones and tablets are released, offering faster processing power

2014 – 14 nanometre computer chips are released

– The market for smart watches reaches 5 million

2015 – Apple releases the Apple Watch

2016 – Supercomputers reach 100 petaflops

– Mobile devices overtake wired devices as a means of using the internet

2017 – 10 nanometre chips enter service

2018 – AI tools gain mainstream public attention alongside the rollout of 5G technology

2019 – Google claims ‘quantum supremacy’ with its Sycamore quantum processor, completing in minutes a calculation that it said would take the world’s fastest supercomputer thousands of years

Sharp acquires Complete I.T.

2020 – GPT-3, a text-generating AI capable of powering chatbots, is released by OpenAI.

– The COVID-19 pandemic accelerates digital transformation, leading to remote work and online education

2021 – GitHub Copilot, a programmer assistant AI, was released

– Continued development of electric vehicles (EVs) with advancements in battery technology.

2022 – ChatGPT, a chatbot and text-generating AI, is released; metaverse concepts continue to expand

2023 – Microsoft releases ChatGPT-powered Bing

– Sharp launches first-to-market Virtual Showroom – an immersive and interactive experience

The Virtual Showroom is an award-winning 3D virtual environment that allows users to view Sharp’s products and services in a ‘real-life’ setting. Interactive elements, such as video and the ability to engage with different hotspots and touchpoints, let visitors view and learn more about each product or solution.

Implications of IT in the workplace

No matter the size of a company, IT systems have had tangible and intangible applications and implications across all areas of a business’ operations. Company communications, efficiency, mobility, and culture have all been affected by the introduction of information technology.

Businesses will either have an internal support team or outsource their IT to a Managed Service Provider, ensuring systems are working effectively and any problems that arise are fixed.


Employees are no longer limited to inter-office written mail or phone calls. Electronic mail (e-mail) allows for the instant communication of information without interrupting the recipient. Modern digital communication tools such as e-mail also allow for quick and clear communication with customers and clients, a particularly useful capability in a world where people now want things instantly. Tools such as websites also allow customers to interact with a business outside usual office opening hours, offering a place to provide feedback and testimonials or to order products and services. Additionally, live chat services allow instant messages to be sent between co-workers, businesses and consumers.


Information technology increases workflow efficiency. By utilising technology such as e-mail and faster hardware such as laptops and tablets, businesses save time and increase productivity. Digital filing conserves office space and reduces annual paper and print costs, with any alterations able to be made immediately, at the click of a button. By connecting all of these systems together, working life is made easier, quicker, and more efficient, saving time and money.

Microsoft 365 is an integrated solution, bringing together the best-in-class productivity of Microsoft 365 Business Standard (formerly Office 365) with advanced security and device management capabilities. Microsoft Teams, for example, allows your team to communicate and collaborate in real time, from anywhere, at any time, and on any device.



The original computers of decades past required whole teams of people to operate them simultaneously, making them costly and completely immobile. Thanks to a drastic reduction in the size of devices, information technology today can be transported at will and operated by a single person. This increases productivity by allowing staff to work from any location, away from office distractions, and the ability to work anywhere can help attract employees to a company by reducing travel costs.

Aside from the mobility benefits to employees, a business can directly profit from the movement brought about by IT. An organisation can establish a global presence very easily, and at a fraction of the cost, by setting up small offices in several countries and keeping them connected by storing data in the Cloud. It can also use modern functions such as video conferencing to enable staff to communicate as if they were in the same room. IT enables growth and expansion, quickly, and brings even more benefits.


IT can vastly improve company culture. If file sharing technology is utilised, employees can work collaboratively, get to know each other and improve the general feel of a business from anywhere in the world. Additionally, employees no longer have to wait for other members of staff to send them work, as it can be shared instantly using modern IT systems, reducing inefficiencies and inter-staff frustrations.

What did people think future IT would look like when computers were first introduced?

Since the introduction of digital tools, the pace of work and life has irreversibly changed. When the first computers were introduced, many futurists believed that the human workforce would drop to 2%. They believed that humanity would return to the Ancient Greek, Hellenic concept of leisure, whereby slaves did the hard work while citizens focussed on challenging their minds; in the modern version of the concept, machines would do the work.

Other scientists thought that the coming of the millennium in 2000 would change everybody’s lives forever. The belief was that the computers everyone had grown dependent on would malfunction, taking humanity back to a time without electricity. These predictions added to a growing ‘computerphobia’ that formed in the 1980s, when personal computers became household mainstays. People believed they could and would be replaced by machines, or that they might damage a computer’s internal mechanisms simply by touching it.


What are modern computer systems like now?

Computer systems today are vastly different to their predecessors. Devices of the past filled entire rooms and required whole teams to operate them because of their size. The first personal computers were not so different, usually taking up entire desks and leaving little room for anything else. Machines today are much more compact and lighter. Devices like tablets and smartphones are small enough to hold in your hand, making them much easier to transport. The storage capabilities of these devices are massively increasing whilst the devices themselves are decreasing in size, making them a lot more useful in both business and personal environments.

The first personal computers were made of heavy metals and dense plastics such as Acrylonitrile Butadiene Styrene (ABS), were susceptible to damage, and could not be moved easily. Today, driven in part by increasing calls for recyclable materials to combat climate change, computing devices are built from materials that are both lighter and sturdier, making them easier to transport.

When computers first appeared, their functions were quite limited, only really able to perform basic computational tasks such as typing text and saving data. When personal computers arrived in the late 20th century, they were more capable than their predecessors, but lacked the conveniences of devices today. Modern computers have a wide array of uses, and the inter-connectivity of devices brings a convenience that earlier computer users did not have.

The functions of a computer are no longer seen as limited; instead, computers are high-tech and effective productivity tools. Microsoft 365, for example, is an integrated solution bringing together the best-in-class productivity of Microsoft 365 Business Standard (formerly Office 365) with advanced security and device management capabilities.


How do modern systems compare to their predecessors?

The pace of technology growth in recent years is staggering. Businessman and co-founder of the Intel Corporation Gordon Moore predicted the rate at which computer hardware would grow in capability. Under what became known as ‘Moore’s Law’, the number of transistors in a given computer circuit doubles roughly every two years, meaning that tomorrow’s computers could outdate current technology in just two years!
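As a rough back-of-the-envelope sketch (a simplification: the two-year doubling period is the commonly quoted figure, and real chips deviate from it), Moore’s Law can be expressed as a simple exponential:

```python
# Illustrative sketch of Moore's Law: transistor counts doubling every two years.
def projected_transistors(initial_count: float, years: float,
                          doubling_period: float = 2.0) -> float:
    """Project a transistor count forward assuming a fixed doubling period."""
    return initial_count * 2 ** (years / doubling_period)

# Starting from 1,000,000 transistors, a decade of two-year doublings
# multiplies the count by 2**5 = 32.
print(projected_transistors(1_000_000, 10))  # 32000000.0
```

A decade of doubling every two years is five doublings, which is why the count grows 32-fold rather than merely fivefold.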

Modern devices are better than their predecessors in every way, including size, weight, mobility, input and output, capacity, and functionality.

Size and Weight

As previously mentioned, the first computers, such as the Electronic Delay Storage Automatic Calculator, which was used to solve differential biology equations, filled an entire room. Modern computers are made with lightweight polycarbonates and synthetic plastics, and it is rare to find computers today that are made with mostly metal. Internal components are also getting smaller (in accordance with Moore’s Law), which therefore makes the devices smaller and more lightweight, some of which can fit into a pocket or onto a wrist.
Furthermore, whilst devices have become smaller, their storage capacity has grown much bigger. Over the decades, hard drives have been continuously innovated, transforming them from refrigerator-sized units whose stacked disks held around 5 MB (5,000,000 bytes) of data into 3.5-inch drives capable of storing a terabyte (1,000,000,000,000 bytes) or more.
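Taking the figures above at face value (about 5 MB for an early drive versus 1 TB today, using decimal units), the capacity growth works out to a 200,000-fold increase:

```python
# Capacity growth from an early ~5 MB hard drive to a modern 1 TB drive,
# using the decimal figures quoted in the text.
early_drive_bytes = 5_000_000              # 5 MB
modern_drive_bytes = 1_000_000_000_000     # 1 TB

growth_factor = modern_drive_bytes // early_drive_bytes
print(growth_factor)  # 200000, i.e. a 200,000-fold increase in capacity
```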




Because devices today are shrinking in size and the technology driving their components is becoming more advanced, they are growing more mobile. In 2016, mobile browsing overtook desktop browsing for the first time in history. Personal machines in the past had very little battery life, so hefty battery packs had to be carried around with them just to keep them running. Today’s computers can be charged in an hour by a small, easily transportable charger and can then run for an entire day. This increase in mobility allows such devices to be used anywhere in the world, whereas previous devices tied the user to a single location.


Early personal computers had few, if any, input and output ports; they were only really compatible with the floppy disk (an early form of storage media), a keyboard, and a printer. Machines today can work in conjunction with many different pieces of equipment, including:

• Graphics tablets
• Cameras
• Video capture hardware
• Trackballs
• Barcode readers
• Joysticks
• Microphones
• MIDI keyboards
• Mice
• Webcams
• Touchpads
• Headphones
• Universal Serial Bus (USB) devices


The performance and speed of computation can be measured in millions of instructions per second (MIPS). The first commercial computer produced in the US in 1951, the UNIVAC I, could perform tasks at 0.002 MIPS, whilst Intel’s Polaris prototype processor can work at 1,800,000 MIPS. Additionally, the first computers were only capable of performing tasks entered directly by human operators; today’s devices are completely programmable, storing vast numbers of instructions on the hard drive and freeing the computer to perform vastly different functions.
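Taking the two MIPS figures quoted above at face value (they come from very different benchmarks and eras, so this is only an illustration), the relative speed-up is simple to compute:

```python
# Relative speed-up from the UNIVAC I (0.002 MIPS) to the Intel Polaris
# prototype (1,800,000 MIPS), using the figures quoted in the text.
univac_mips = 0.002
polaris_mips = 1_800_000

speedup = polaris_mips / univac_mips
print(f"{speedup:.0f}x faster")  # 900000000x faster
```

In other words, the quoted figures imply a roughly 900-million-fold increase in raw instruction throughput.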

The Evolution of Cyber Security

Cyber security has evolved significantly over the years in line with ever-growing threats in our digitally interconnected world. As our reliance on digital technologies expands across industries and everyday life, the importance of cyber security cannot be overstated. It serves as a safeguard against malicious actors who are actively looking to exploit vulnerabilities in systems, networks, and data. With the increase in cyber threats, including malware, phishing attacks, ransomware, and data breaches, the evolution of cyber security has been crucial for organisations and individuals alike. IT companies offer IT support services to help bolster an organisation’s cyber security defences.

Cyber security has transitioned from mere anti-virus software to sophisticated, multi-layered defence systems incorporating artificial intelligence (AI), machine learning, and behavioural analytics. As our digital footprint expands with the arrival of the Internet of Things (IoT), cloud computing, and mobile devices, cyber security has needed to adapt to protect a broader range of technological touchpoints.

The sudden rise of remote work over the years, triggered by 2020’s global pandemic, and the increase of interconnected supply chains due to globalisation also emphasises the need for robust cyber security measures. Cyber security’s evolution highlights its essential role in preserving privacy, maintaining trust in digital transactions, and safeguarding critical information technology infrastructure in the face of relentless cyber threats.


The Rise of AI

Artificial Intelligence (AI) is a subset of information technology that has undergone a remarkable evolution, transforming from a concept of science fiction into a prevalent force in our daily lives. Initially, AI was primarily focused on rule-based systems and symbolic reasoning, which limited its capabilities to narrow domains. However, with advancements in machine learning and deep learning algorithms, AI has made significant strides in understanding and processing complex data patterns. One notable breakthrough was the development of GPT-3 (Generative Pre-trained Transformer 3), a language model capable of generating human-like text and performing a wide range of natural language processing tasks. GPT-3 represents a shift in AI capabilities, demonstrating the potential of large-scale pre-trained models to understand and generate human-like text at an unprecedented level.

AI applications have now expanded into diverse areas such as healthcare, finance, transportation, and entertainment, revolutionising industries and enhancing efficiency and productivity. As AI continues to evolve, researchers are exploring new possibilities such as reinforcement learning, neuro-symbolic AI, and AI ethics, paving the way for even more remarkable developments in the future.

Microsoft Copilot

GitHub Copilot, released as a technical preview in June 2021, is an AI-powered code completion tool designed to help developers write code more efficiently. Developed by GitHub in collaboration with OpenAI, Copilot uses machine learning models to analyse context and suggest relevant code snippets, function definitions, and even entire blocks of code in real time.

Integrated seamlessly with popular code editors like Visual Studio Code, Copilot aims to streamline the coding process by providing intelligent suggestions based on the developer’s input and the project’s requirements. Its release marks a significant advancement in the field of developer tools, promising to enhance productivity and reduce the cognitive load associated with writing software.

In November 2023, Microsoft’s Copilot for Microsoft 365 was introduced, initially targeting enterprise customers who committed to a minimum of 300 Copilot licenses. A significant portion of the Fortune 100 joined Microsoft’s Early Access Program, demonstrating strong interest across various industries. Thousands of professionals embraced this AI-powered assistant for their work processes.

From January 15th, 2024, Microsoft’s AI-powered Copilot became widely available to organisations of all sizes without any minimum license requirements. The decision was driven by the substantial demand and enthusiasm from small and medium-sized businesses seeking to leverage the benefits of this AI-driven tool. Additionally, individual users not associated with business plans can now access Copilot through the Copilot Pro subscription.