Computer Basics: What is a Computer?


A computer is a programmable digital electronic machine that executes a series of instructions to process input data, producing information that is then sent to the output units.

A computer is physically made up of numerous integrated circuits and various support, extension, and accessory components, which together can perform various tasks extremely quickly and under the control of a program (software).

It is made up of two essential parts: the hardware, which is its physical structure (electronic circuits, cables, cabinet, keyboard, etc.), and the software, which is its intangible part (programs, data, information, documentation, etc.).

From a functional point of view, it is a machine that has at least one central processing unit (CPU), a memory unit, and an input/output unit (peripheral).

The input peripherals allow the entry of data, the CPU is in charge of processing it (arithmetic-logical operations), and the output devices communicate the results to the outside world.

Thus, the computer receives data, processes it, and emits the resulting information, which can then be interpreted, stored, transmitted to another machine or device, or simply printed; all at the discretion of an operator or user and under the control of a computer program.

The fact that it is programmable allows it to perform a wide variety of tasks based on input data since it can perform operations and solve problems in various areas of human activity (administration, science, design, engineering, medicine, communications, music, etc.).

Basically, the capacity of a computer depends on its hardware components, while the diversity of tasks it can perform lies mainly in the software it supports and has installed.

Although this machine can be of two types, analog or digital, the first type is used only for a few very specific purposes; the most widespread, used, and best known is the general-purpose digital computer, so that in general (even popular) terms, when you speak of “the computer” you are referring to a digital computer.

There is also a mixed architecture, the hybrid computer, which is likewise special-purpose.

In the Second World War, mechanical analog computers were used for military applications, and during the same period the first digital computer, called ENIAC, was developed; it took up a huge space and consumed energy equivalent to the consumption of hundreds of current computers (PCs).

Modern computers are based on integrated circuits, billions of times faster than the first machines, and they take up a small fraction of their space.

Simple computers are small enough to fit in mobile devices. Portable computers, such as laptops, tablets, netbooks, notebooks, and ultrabooks, can be powered by small batteries.

Personal computers in their various forms are icons of the so-called information age and are what most people think of as a “computer.”

However, embedded systems also constitute computers, and are found in many current devices, such as MP4 players, cell phones, fighter jets, toys, industrial robots, etc.

Definition of Computer by GetOnlineSolution

“A computer is an electronic machine that primarily calculates. The word computer is derived from the Latin word ‘computare’, which means to calculate. For this reason, a computer is also called a calculating machine.”

In other words,

“A computer is a programmable machine, designed to solve or complete numerical, mathematical, or logical operations automatically by means of computer programming.”

Another definition of Computer

“A computer is a machine or device that can perform calculations and operations based on the instructions given by hardware and software programs.”

Personal computer, view of typical hardware.
  1. Monitor
  2. Motherboard
  3. Microprocessor or CPU
  4. ATA ports
  5. RAM
  6. Expansion boards
  7. Power supply
  8. Optical disk drive
  9. Hard disk drive, Solid state drive
  10. Keyboard
  11. Mouse

Computer working process or functioning

A computer's work is mainly completed in four steps.

  1. Input data (take data as input).
  2. Process data (process or calculate the data).
  3. Store data (store the data in a storage device).
  4. Output data (give the resulting data as output after processing).

A computer has the ability to accept input data, process it, and then output it.

The computer can store the data in its storage device (memory) for later use and retrieve it when needed.
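The four steps above can be sketched in Python. This is a minimal illustration of the cycle, not a real API; the function names and the addition example are invented for the sketch.

```python
# A minimal sketch of the input -> process -> store -> output cycle.
# The function names are illustrative, not a real API.

def read_input():
    # Input step: take raw data (here, two numbers as text).
    return "3", "4"

def process(a, b):
    # Processing step: an arithmetic operation on the input data.
    return int(a) + int(b)

storage = []  # stands in for a storage device (memory)

def store(result):
    # Store step: keep the result for later retrieval.
    storage.append(result)

def output(result):
    # Output step: communicate the result to the user.
    return f"Result: {result}"

a, b = read_input()
result = process(a, b)
store(result)
print(output(result))  # -> Result: 7
```

The stored value remains in `storage` after the cycle finishes, mirroring how a computer retrieves saved data when needed.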

What is Input

Instructing a computer or sending it data is called input. The hardware devices used to give input to a computer are called input devices; examples include the following.

  1. Keyboard
  2. Mouse
  3. Microphone

What are process and processing

The sequence of events in which a program runs on a computer is called a process: any type of program that is currently running and working on your computer. “Process” is the noun and “processing” is the corresponding verb.

What is Output

The result which is given by the computer after completing a task is called output. The data given as the result is called output data. This output data can be in many forms such as Text, Video, Audio, Images, etc.

Components of a Computer (Architecture)

The basic model of the von Neumann architecture, on which all modern computers are based.

The technologies used in digital computers have evolved greatly since the appearance of the first models in the 1940s, although most still use the von Neumann architecture, published by John von Neumann in 1945, which other authors attribute to John Presper Eckert and John William Mauchly.

The von Neumann architecture describes a computer with four (4) main sections:

  • Arithmetic logic unit
  • Control unit
  • Primary or main memory
  • Input and output (I/O) unit

These parts are interconnected by conductive channels called buses.

Central processing unit

The Central Processing Unit (CPU) basically consists of the following three elements:

  • The arithmetic logic unit (ALU) is the device designed and built to carry out elementary operations such as arithmetic operations (addition, subtraction) and logic operations (AND, OR, XOR, inversions, shifts, and rotations).
  • The control unit (CU) follows the addresses of the memory positions that contain the instruction the computer will carry out at that moment; it retrieves the information and places it in the ALU for the operation to be carried out, then transfers the result to the corresponding locations in memory. Once this is done, the control unit moves to the next instruction, which may be the next one physically (via the program counter) or another one (via a jump instruction).
  • The registers: non-accessible registers (instruction register, data bus, and address bus) and accessible registers, either of specific use (program counter, stack pointer, accumulator, flags, etc.) or of general use.
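The control unit's fetch-decode-execute cycle described above can be sketched as a toy Python loop. The instruction format and opcodes here are invented for illustration; a real CPU works on binary machine code.

```python
# A toy sketch of the fetch-decode-execute cycle.
# Opcodes and instruction format are invented for illustration.

memory = [
    ("LOAD", 5),    # put 5 in the accumulator
    ("ADD", 3),     # ALU operation: accumulator += 3
    ("JUMP", 4),    # jump: set the program counter to 4
    ("ADD", 100),   # skipped because of the jump
    ("HALT", 0),
]

accumulator = 0     # an accessible, specific-use register
pc = 0              # program counter

while True:
    opcode, operand = memory[pc]   # fetch the instruction at the PC
    pc += 1                        # default: advance to the next instruction
    if opcode == "LOAD":
        accumulator = operand
    elif opcode == "ADD":          # elementary ALU operation
        accumulator += operand
    elif opcode == "JUMP":         # a jump instruction overrides the PC
        pc = operand
    elif opcode == "HALT":
        break

print(accumulator)  # -> 8
```

Note how the jump at address 2 makes the control unit skip the instruction at address 3, exactly the "next physically or another (through a jump instruction)" behavior described above.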

Computer Memory

The main memory, known as random access memory (RAM), is a set of storage cells organized so that each can be accessed numerically by its memory address.

Each cell corresponds to a bit, the minimum unit of information. Cells are typically accessed in 8-bit sequences (bytes).

An instruction is a certain operational action: a sequence that tells the ALU the operation to be carried out (addition, subtraction, logical operations, etc.).

The bytes of the main memory store both the data and the operation codes that are needed to carry out the instructions.

The memory capacity is given by the number of cells it contains, measured in bytes or multiples.
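As a sketch of the addressing just described, Python's `bytearray` behaves much like byte-addressable main memory: each numeric index is an address holding one byte, and the capacity is the number of cells. This is an analogy for teaching, not real RAM access.

```python
# Main memory modeled as numerically addressed byte cells.

ram = bytearray(1024)      # 1 KiB of byte-sized cells, all initially 0

ram[0] = 0b01000001        # write one byte (binary 01000001 = 65) at address 0
ram[1] = 0x42              # write hexadecimal 42 (= 66) at address 1

print(ram[0], ram[1])      # -> 65 66
print(len(ram))            # capacity in bytes -> 1024
```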

The technologies used to manufacture memories have changed a lot: from the electromechanical relays of the first computers, through mercury delay-line tubes in which acoustic pulses were formed, arrays of permanent magnets, and individual transistors, to today's integrated circuits with millions of cells on a single chip.

They are subdivided into static memories (SRAM, static random access memory), with six integrated transistors per bit, and the much more widely used dynamic memories (DRAM, dynamic random access memory), with one transistor and one capacitor integrated per bit.

RAM can be rewritten many millions of times, unlike ROM, which can only be recorded once.

Computer main memory comes in two principal varieties:

  • Random-access memory or RAM
  • Read-only memory or ROM

Input, output or input/output peripherals

The input devices allow the entry of data and information while the output devices are in charge of externalizing the information processed by the computer. There are peripherals that are both input and output.

As an example, a typical input device is a keyboard, an output device is a monitor, and an input/output device is a hard drive. There is a very wide range of I/O devices, such as keyboards, monitors, printers, mice, floppy drives, webcams, etc.


The three basic units in a computer, the CPU, the memory, and the I/O elements, communicate with each other via buses, or communication channels:

  • Address bus: selects the address of the data or peripheral to be accessed.
  • Control bus: controls the external and internal operation of the CPU.
  • Data bus: carries the information (data) that circulates through the system.
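As an illustrative sketch (not how buses work electrically), the three roles above can be modeled in a few lines of Python: a control signal selects the operation, an address selects the memory cell, and the data value travels in or out.

```python
# A teaching model of the three buses: control selects the operation,
# address selects the location, data carries the value.

class BusModel:
    def __init__(self, size):
        self.memory = [0] * size       # addressable cells, all initially 0

    def cycle(self, control, address, data=None):
        if control == "WRITE":          # control bus: write operation
            self.memory[address] = data  # data bus carries the value in
            return None
        elif control == "READ":         # control bus: read operation
            return self.memory[address]  # data bus carries the value out

bus = BusModel(16)
bus.cycle("WRITE", address=3, data=99)
print(bus.cycle("READ", address=3))     # -> 99
```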

The main parts of a computer

There are two main parts that make up a computer and allow it to function. Each is useless without the other.

Those two main parts of the computer are listed below.

  1. Hardware
  2. Software

What is Hardware

Hardware is the physical part of a computer: the interconnected physical components that make up the machine.

Hardware means all the physical parts that we can see, touch, and feel with our hands.

Computers consist of many types of hardware. We can divide this hardware into different categories based on its usage and function, which makes it easier for us to remember and identify them.

Categories of Computer Hardware

There are five main categories of hardware, based on function and capability.

  1. Input device
  2. Processing device
  3. Output Device
  4. Memory or Storage Device
  5. Other Internal Devices
What is Input device

The devices used to instruct, operate, or control the computer are called input devices. The following are examples of input devices.

List of input devices
  1. Keyboard
  2. Mouse
  3. Microphone
  4. Joystick
  5. Scanner
  6. Light Pen
  7. Barcode Reader
  8. Touch screen
  9. Digital camera
  10. Digital video
  11. Graphics tablet
  12. Image scanner
What is processing device

The device that processes the given data as input, and gives the output, is called a processing device.

The main example of a processing device is the CPU (Central Processing Unit).

What is Output Device

The device which provides the output of the data processed by the processing unit in the form of text, audio, video, graphics, or images is called an output device.

An example of an output device is a monitor, which shows the output as text, video, graphics, or images.

List of Output Devices
  1. Monitor
  2. Speaker
  3. Projector
  4. Headphone
  5. Printers
  6. Sound card
  7. Video card
Storage Device (Memory)
  1. RAM (Random Access Memory)
  2. ROM (Read Only Memory)
  3. HDD (Hard Disk Drive)
  4. SSD (Solid State Drive)
Other Computer Hardware Devices
  1. Power Supply
  2. Motherboard
  3. Cooling Fan
  4. Heat Sink

What is Software

Software is a set or collection of data, instructions, or programs that tell a computer what to do and how to do it.

Software is what makes the computer hardware usable.

The main example of software is the operating system, which controls the system and establishes communication between the user and the computer. With the help of the operating system, the user is able to interact with the computer.

There are mainly two types of software.

  1. System Software
  2. Application Software
List Of System Software
  1. Microsoft Windows
  2. macOS
  3. Linux
List of Application Software
  1. MS Word
  2. WordPad
  3. Notepad
  4. Real Player
  5. Media Player
  6. Apple Numbers
  7. Microsoft Excel
  8. Oracle
  9. MS Access etc
  10. Microsoft PowerPoint
  11. Keynotes

How many types of computers are there?

Computers can be classified in many ways: by usage, size, and processing power.

There are three main types of computers, and each has its own subcategories.

List of computer types

  1. Analog Computer
  2. Digital computer
    1. Mainframe Computer
    2. Mini Computer
    3. Microcomputer
      1. Desktop Computer or Personal Computer (PC)
      2. Notebook Computer or Laptop
      3. Netbook
      4. Tablet
      5. Handheld Computer or Personal Digital Assistant (PDA)
      6. Smartphone
    4. Workstation
    5. Supercomputer
  3. Hybrid Computer

Types of Computer

Computers can be classified in a number of different ways, including:

By architecture

  1. Analog computer
  2. Digital computer
  3. Hybrid computer
  4. Harvard architecture
  5. Von Neumann architecture
  6. Complex instruction set computer
  7. Reduced instruction set computer

By size, form-factor and purpose

  1. Supercomputer
  2. Mainframe computer
  3. Minicomputer
  4. Microcomputer
  5. Server
  6. Personal computer
    • Desktop Computer
    • Laptop Computer
      • Chromebook
      • Subnotebook
      • Netbook
    • Mobile Computer
      • Tablet computer
      • Smartphone
      • Ultra-mobile PC
      • Pocket PC
      • Palmtop PC
      • Handheld PC
    • Wearable computer
      • Smartwatch
    • Smartglasses

History of Computer

The computer is the most advanced version of a series of calculation tools invented since ancient times, including the abacus, the Antikythera mechanism, and Napier's bones.

The most famous examples of calculating machines in the modern age are perhaps Pascal's machine (1642) and Leibniz's machine (1671), but we should also remember Wilhelm Schickard's calculating machine, dated 1623, of which only the designs remain.

The transition from a calculating machine to a real computer (in the sense of a programmable device) is due to Charles Babbage: his Analytical Engine, designed in 1833, although never built, is considered the first example of a computer in history.

It was a colossal geared machine, steam-powered and equipped with input, output, a memory unit, a decimal calculating unit with a data accumulation register, and a connection system between the various parts; contrary to what you might think, it was entirely digital.

Far from being an invention of one person, in particular, the computer is the evolutionary result of ideas of many people related to areas such as electronics, mechanics, semiconductor materials, logic, algebra, and programming.

It was created out of practical need, because making mathematical tables by hand was a tedious and error-prone process. Konrad Zuse was the first to build a working computer.

Zuse's machine could store only 64 words. With the passage of time, capacity has grown enormously: today even a modest 30 GB drive can hold hundreds of millions of words.

Chronology of Computer

The main milestones in the history of computing, from the earliest hand tools for calculating to modern pocket computers.

  • 500 BC: the abacus is used in ancient civilizations such as the Chinese and Sumerian; it is the first tool for performing addition and subtraction.
  • Around 830: the Persian mathematician and engineer Muhammad ibn Musa al-Khwarizmi develops the concept of the algorithm, that is, the methodical solution of problems in algebra and numerical calculation by means of a well-defined, ordered, and finite list of operations.
  • 1614: the Scotsman John Napier invents logarithms, which simplified multiplication and division by reducing them to addition and subtraction.
  • 1620: the Englishman Edmund Gunter invents the slide rule, a manual instrument used since then until the appearance of the electronic calculator to perform arithmetic operations.
  • 1623: the German Wilhelm Schickard invents the first calculating machine, the prototype of which disappeared shortly after.
  • 1642: the French scientist and philosopher Blaise Pascal invents an adding machine (the pascaline), which used toothed wheels, and of which some original copies are still preserved.
  • 1671: the German philosopher and mathematician Gottfried Wilhelm Leibniz invents a machine capable of multiplying and dividing.
  • 1801: Frenchman Joseph Jacquard invents a perforated card for his brocade weaving machine that controls the pattern of operation of the machine, an idea that would be used later by the first computers.
  • 1833: British mathematician and inventor Charles Babbage designs and attempts to build the first mechanically operated computer, which he called the “analytical engine.” However, the technology of his time was not advanced enough to make his idea a reality.
  • 1841: the mathematician Ada Lovelace begins to work with Babbage in what would be the first algorithm destined to be processed by a machine, which is why she is considered the first computer programmer.
  • 1890: the American Herman Hollerith invents a tabulating machine taking advantage of some of Babbage’s ideas, which was used to prepare the United States census. Hollerith later founded the company that would later become IBM.
  • 1893: Swiss scientist Otto Steiger develops the first automatic calculator to be manufactured and used on an industrial scale, known as the Millionaire.
  • 1936: the English mathematician and computer scientist Alan Turing formalizes the concepts of algorithm and Turing machine, which would be key in the development of modern computing.
  • 1938: the German engineer Konrad Zuse completes the Z1, the first computer that can be considered as such. Electromechanical in operation and using relays, it was programmable (via perforated tape) and used the binary system and Boolean logic. It would be followed by the improved Z2, Z3, and Z4 models.
  • 1944: In the United States, IBM builds the Harvard Mark I electromechanical computer, designed by a team headed by Howard H. Aiken. It was one of the first computers created in the United States.
  • 1944: Colossus computers ( Colossus Mark I and Colossus Mark 2 ) are built in England, with the aim of deciphering the communications of the Germans during World War II.
  • 1946: at the University of Pennsylvania the ENIAC (Electronic Numerical Integrator And Computer) is put into operation; it worked with vacuum tubes (valves) and was the first general-purpose electronic computer.
  • 1947: at Bell Labs, John Bardeen, Walter Houser Brattain, and William Shockley invent the transistor.
  • 1950: Kathleen Booth creates assembly language, which made it possible to perform operations on the computer without changing the connection cables, using punched cards instead (a program or operation saved for use when necessary), although these were prone to damage. Around the end of this period, the development of programming languages begins.
  • 1951: EDVAC, conceived by John von Neumann, begins to operate; unlike ENIAC, it was binary rather than decimal, and it was one of the first computers designed to store its program in memory.
  • 1953: IBM manufactures its first industrial-scale computer, the IBM 650. The use of assembly language for computer programming expands. Transistors begin to replace vacuum tubes in computers, marking the beginning of the second generation of computers.
  • 1958: Jack S. Kilby builds the first integrated circuit.
  • 1964: The appearance of the IBM 360 marks the beginning of the third generation of computers, in which printed circuit boards with multiple discrete components are replaced by boards with integrated circuits.
  • 1965: Olivetti launches the Programma 101, considered the first desktop computer.
  • 1971: Nicolet Instruments Corp. launches the Nicolet 1080, a scientific-use computer based on 20-bit registers.
  • 1971: Intel introduces the first commercial microprocessor on a single chip: the Intel 4004.
  • 1975: Bill Gates and Paul Allen found Microsoft.
  • 1976: Steve Jobs, Steve Wozniak, and Mike Markkula found Apple.
  • 1977: Apple introduces the first personal computer to be sold on a large scale, the Apple II, developed by Steve Jobs and Steve Wozniak.
  • 1981: The IBM PC is launched on the market, which would become a commercial success, would mark a revolution in the field of personal computing and would define new standards.
  • 1982: Microsoft presents its MS-DOS operating system, commissioned by IBM.
  • 1983: ARPANET is separated from the military network that originated it, passing to civilian use and thus becoming the origin of the Internet.
  • 1983: Richard Stallman publicly announces the GNU project.
  • 1985: Microsoft introduces the Windows 1.0 operating system.
  • 1990: Tim Berners-Lee creates the World Wide Web (WWW), using hypertext as a new way of interacting with the Internet.
  • 1991: Linus Torvalds began developing Linux, a Unix-compatible operating system.
  • 2000: pocket computers (PDAs) become widespread at the beginning of the 21st century.
  • 2007: Apple presents the first iPhone, a smartphone.
  • 2016: emergence of virtual reality aimed at the general public, with devices from manufacturers such as Oculus, HTC, and Sony.

Other Computer data and concepts

In modern computers, a user has the impression that the computer can run several programs “at the same time”; this is known as multitasking.

In reality, the CPU executes instructions in one program and then after a short time, switches execution to a second program and executes some of its instructions.

Since this switching is very fast, it creates the illusion that several programs are running simultaneously; in reality, the CPU's time is divided among the programs, one at a time.

The operating system is what controls the distribution of this time. Truly simultaneous processing is done on computers that have more than one CPU, giving rise to multiprocessing.
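The time-slicing just described can be sketched as a simplified round-robin scheduler, one common way an operating system divides CPU time. The program list and quantum here are illustrative assumptions, not how any particular OS is implemented.

```python
# A simplified round-robin sketch of time-slicing: each "program" gets
# a short turn in order, creating the illusion of simultaneity on one CPU.

from collections import deque

def run_round_robin(programs, quantum=1):
    # Each program is a (name, remaining_time_units) pair.
    ready = deque(programs)
    trace = []                      # the order in which programs get the CPU
    while ready:
        name, remaining = ready.popleft()
        trace.append(name)          # this program runs for one quantum
        remaining -= quantum
        if remaining > 0:
            ready.append((name, remaining))  # not finished: back of the queue
    return trace

print(run_round_robin([("A", 2), ("B", 3), ("C", 1)]))
# -> ['A', 'B', 'C', 'A', 'B', 'B']
```

Each program advances a little on every pass, so from the user's point of view A, B, and C all appear to run at once even though only one holds the CPU at any instant.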

The operating system is the program that manages and administers all computer resources: it controls, for example, which programs are executed and when, manages memory and access to I/O devices, and provides interfaces between devices and between the computer and the user.

Currently, some widely used programs are usually included in the distributions of the operating system; such as Internet browsers, word processors, email programs, network interfaces, movie players, and other programs that previously had to be obtained and installed separately.

The first large and expensive digital computers were used primarily for scientific calculations. ENIAC was created for the purpose of solving the ballistics problems of the United States Army.

The CSIRAC, the first Australian computer, made it possible to assess rainfall patterns for a large hydroelectric generation project.

With the commercial manufacture of computers, governments and companies systematized many of their data collection and processing tasks, which were previously carried out manually.

In the academic world, scientists from all fields began to use computers to do their analyses and calculations.

The continuous decline in the prices of computers allowed their use by smaller and smaller companies. Businesses, organizations, and governments began to use large numbers of small computers to perform tasks that were previously done by large and expensive mainframes.

With the invention of the microprocessor in 1971, it became possible to make ever-cheaper computers.

The microcomputer was born, and then the PC appeared; the latter became popular for routine tasks such as writing and printing documents, calculating probabilities, performing analysis and calculations with spreadsheets, and communicating via email and the Internet.

The wide availability of computers and their easy adaptation to the needs of each person have led to their use for a huge variety of tasks across the most diverse fields of application.

At the same time, small fixed-program computers (embedded systems) began to make their way into applications for the home, automobiles, airplanes, and industrial machinery.

These integrated processors controlled the behavior of the devices more easily, allowing the development of more complex control functions, such as anti-lock braking systems (ABS).

At the beginning of the 21st century, most electrical appliances, almost all types of electrical transportation, and most factory production lines were controlled by a computer.

Towards the end of the 20th century and the beginning of the 21st, personal computers are used both for research and for entertainment (video games), while large computers are used for complex mathematical calculations, technology, modeling, astronomy, medicine, etc.

Perhaps the most interesting “descendant” of the cross between the personal computer and the supercomputer is the workstation.

This term, originally used for equipment and machines for digital sound recording and processing, now refers to workstations: high-capacity computing systems normally dedicated to scientific calculation tasks or real-time processes.

A workstation is, in essence, a personal work computer with computation, performance, and storage capacity superior to conventional personal computers.
