Friday, 20 January 2012
Classification of Computers
Until recently computers were classified as microcomputers, minicomputers, mainframes, and supercomputers. Technology, however, has changed, and this classification is no longer relevant. Today almost all computers use microprocessors as their CPU, so classification is meaningful only in terms of mode of use. Based on mode of use we can classify computers as Palms, Laptop PCs, Desktop PCs and Workstations. Based on how computers are interconnected, we can classify them as distributed computers and parallel computers.
Palm PCs or Simputer
With miniaturization and high-density packing of transistors on a chip, computers with capabilities nearly those of a PC, small enough to be held in a palm, have emerged. Palm PCs accept handwritten input using an electronic pen that writes on the Palm's screen (besides a tiny keyboard), have small disk storage, and can be connected to a wireless network. One has to train the system on the user's handwriting before it can be used. A Palm can also serve as a mobile phone, fax, and e-mail machine. A version of Microsoft's operating system called Windows CE is available for Palms.

An Indian initiative to meet the needs of the rural population of developing countries is called the Simputer. The Simputer is a mobile handheld computer that takes input through icons on a touch-sensitive overlay on its LCD display panel. A unique feature of the Simputer is its use of the free open source OS GNU/Linux; the cost of ownership is thus low, as there is no software cost for the OS. Another unique feature of the Simputer, not found in Palms, is a smart card reader/writer, which increases its functionality, including the possibility of personalizing a single Simputer for several users.

Laptop PCs
Laptop PCs (also known as notebook computers) are portable computers weighing around 2 kg. They have a keyboard, a flat liquid crystal display screen, and a Pentium or PowerPC processor. Colour displays are available. They normally run the Windows OS. Laptops come with a hard disk (around 40 GB), a CDROM drive and a floppy disk drive. They are meant to run on batteries and are thus designed to conserve energy by using power-efficient chips. Many laptops can be connected to a network, and there is a trend towards providing wireless connectivity so that they can read files from large stationary computers. The most common uses of laptop computers are word processing and spreadsheet computing. As laptops use miniature components which have to consume little power and be packaged in small volumes, they are more expensive than desktop PCs of comparable capability.
Personal Computers (PCs)
The most popular PCs are desktop machines. Early PCs had Intel 8088 microprocessors as their CPU. Currently (2004), the Intel Pentium 4 is the most popular processor. The machines made by IBM are called IBM PCs. Other manufacturers use IBM's specifications and design their own PCs; these are known as IBM-compatible PCs. IBM PCs mostly use MS Windows, Windows XP or GNU/Linux as the operating system. IBM PCs nowadays (2004) have 64 to 256 MB of main memory, 40 to 80 GB of hard disk, and a floppy disk drive or flash ROM. Besides these, a 650 MB CDROM drive is also provided in PCs intended for multimedia use. Another company, Apple, also makes PCs. Apple PCs are known as the Apple Macintosh. They use Apple's proprietary OS, which is designed for simplicity of use. Apple Macintosh machines used Motorola 68030 microprocessors but now use the PowerPC 603 processor. IBM PCs are today the most popular computers, with millions of them in use throughout the world.

Workstations
Workstations are also desktop machines. They are, however, more powerful, providing processor speeds about 10 times those of PCs. Most workstations have a large colour video display unit (19 inch monitors). Normally they have a main memory of around 256 MB to 4 GB and a hard disk of 80 to 320 GB. Workstations normally use RISC processors such as MIPS (SGI), RIOS (IBM), SPARC (SUN), or PA-RISC (HP). Some manufacturers of workstations are Silicon Graphics (SGI), IBM, SUN Microsystems and Hewlett Packard (HP). The standard operating system of workstations is UNIX and its derivatives such as AIX (IBM), Solaris (SUN), and HP-UX (HP). Very good graphics facilities and large video screens are provided by most workstations. A system called X Windows is provided on workstations to display the status of multiple processes during their execution. Most workstations have built-in hardware to connect to a Local Area Network (LAN). Workstations are used for executing numeric- and graphics-intensive applications such as those which arise in Computer Aided Design, simulation of complex systems, and visualization of simulation results.

Servers
While manufacturers such as IBM, SUN and Silicon Graphics have been making high performance workstations, the speed of Intel Pentium processors has been going up. In 2004, Pentiums with a clock speed of 3 GHz are available, and they can support several GB of main memory. Thus the difference between high end PCs and workstations is vanishing; today companies such as SUN make Intel-based workstations. While workstations are characterized by high performance processors with large screens for interactive programming, servers are used for specific purposes such as high performance numerical computing (so-called compute servers), web page hosting, database storage, printing, etc., for which large interactive screens are not necessary. Compute servers have high performance processors with large main memories, database servers have big on-line disk storage (100s of GB), and print servers support several high speed printers.
Mainframe Computers
Organizations such as banks and insurance companies process a large number of transactions on-line. They require computers with very large disks to store several terabytes of data and to transfer data from disk to main memory at several hundred megabytes per second. The processing power needed from such computers is of the order of a hundred million transactions per second. These computers are much bigger and faster than workstations and several hundred times more expensive. They normally use proprietary operating systems, which usually provide services such as user accounting, file security and control, and which are normally much more reliable than the operating systems on PCs. These types of computers are called mainframes. There are only a few manufacturers of mainframes (e.g., IBM and Hitachi), and the number of mainframe users has reduced as many organizations rewrite their systems to use networks of powerful workstations.
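The disk figures above translate directly into scan times. A back-of-envelope sketch in Python makes this concrete; the specific numbers are illustrative values picked from the ranges mentioned in the paragraph, not specifications of any particular machine:

```python
# Back-of-envelope estimate: time to scan a large transaction database.
# Figures are illustrative values from the text above, not real specs.

database_size_bytes = 4 * 10**12        # "several terabytes" of on-line data
transfer_rate_bytes = 300 * 10**6       # "several hundred megabytes/sec"

scan_time_seconds = database_size_bytes / transfer_rate_bytes
print(f"Full scan at {transfer_rate_bytes / 10**6:.0f} MB/s: "
      f"{scan_time_seconds / 3600:.1f} hours")   # about 3.7 hours
```

Even at hundreds of megabytes per second, reading the whole database takes hours, which is why mainframes pair very large disks with very high disk-to-memory transfer rates.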
Supercomputers

Supercomputers are the fastest computers available at any given time and are normally used to solve problems which require intensive numerical computation. Examples of such problems are numerical weather prediction, design of supersonic aircraft, design of drugs, and modeling of complex molecules. All of these problems require around 10^16 calculations to be performed. Such a problem would be solved in about 3 hours by a computer which can carry out a trillion (10^12) floating point calculations per second; such a computer is classified as a supercomputer today (2004). By about the year 2006, computers which can carry out 10^15 floating point operations per second on 64 bit floating point numbers would be available, and these would be the ones called supercomputers. Supercomputers are built by interconnecting several high speed computers and programming them to work cooperatively to solve problems. Recently the applications of supercomputers have expanded beyond scientific computing; they are now also used to analyze large commercial databases, produce animated movies, and play games such as chess.

Besides arithmetic speed, a computer should have a large main memory of around 16 GB and a secondary memory of around 1000 GB to be classified as a supercomputer, and the speed of transfer of data from secondary memory to main memory should be at least a tenth of the memory-to-CPU data transfer speed. All supercomputers use parallelism to achieve their speed.
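The 3-hour figure is simple arithmetic, made explicit in this minimal Python sketch:

```python
# Verify the claim: 10^16 calculations on a machine performing 10^12
# floating point operations per second takes about 3 hours.

total_operations = 10**16            # e.g. one weather-prediction run
machine_speed_flops = 10**12         # one trillion operations per second

elapsed_seconds = total_operations / machine_speed_flops
print(f"{elapsed_seconds / 3600:.2f} hours")   # 2.78 hours, i.e. about 3
```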
Computer History and Generations

The computers that you see and use today did not come about through any single inventor at one go. Rather, it took centuries of rigorous research work to reach the present stage. And scientists are still working hard to make them better and better. But that is a different story.
First, let us see when the very idea of computing with a machine or device, as against conventional manual calculation, was first given shape.
Though experiments were going on even earlier, it was in the 17th century that the first such successful device came into being. Edmund Gunter, an English mathematician, is credited with its development in 1620. Yet it was too primitive to be recognized even as a forefather of computers. The first mechanical digital calculating machine was built in 1642 by the French scientist-philosopher Blaise Pascal. Since then, the ideas and inventions of many mathematicians, scientists, and engineers paved the way for the development of the modern computer in the following years.
But the world had to wait another couple of centuries to reach the next milestone in the development of the computer. It was the English mathematician and inventor Charles Babbage who worked wonders during the 1830s. In fact, he was the first to work on a machine that could use and store the values of large mathematical tables. The most important aspect of this machine was its use in recording impulses, coded in the very simple binary system, with the help of only two kinds of symbols.
This was quite a big leap closer to the basics on which computers work today. However, there was still a long way to go: compared to present-day computers, Babbage's machine could be regarded as more of a high-speed counting device, for it could work on numbers alone.
Boolean algebra, developed in the 19th century, removed the numbers-only limitation of these counting devices. This branch of mathematics, invented by George Boole, helped correlate binary digits with our language of logic: the value 0 is associated with false statements and 1 with true ones. The British mathematician Alan Turing made further progress with his theory of a computing model, and the technological advancements of the 1930s helped further the development of computing devices.
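The correspondence Boole established is easy to see with a small illustrative Python sketch (the truth table below is a demonstration added here, not part of the original account):

```python
# Boolean algebra relates 1 to "true" and 0 to "false", and logical
# connectives to operations on binary digits.

for a in (0, 1):
    for b in (0, 1):
        # a & b is 1 only when both statements are true;
        # a | b is 1 when at least one of them is true.
        print(f"a={a} b={b}  AND={a & b}  OR={a | b}  NOT a={1 - a}")
```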
But the direct forefathers of present-day computer systems evolved around the 1940s. The Harvard Mark I, designed by Howard Aiken, was the world's first digital computer to make use of electro-mechanical devices. It was developed jointly by International Business Machines (IBM) and Harvard University in 1944.
But the real breakthrough was the concept of the stored-program computer. This came when the Hungarian-American mathematician John von Neumann introduced the Electronic Discrete Variable Automatic Computer (EDVAC). The idea that instructions as well as data should be stored in the computer's memory made this device totally different from its counting-device forerunners. Since then, computers have become increasingly faster and more powerful.
Still, compared to the present day's personal computers, they had the simplest of designs: a single CPU performing various operations, like addition, multiplication and so on, following an ordered set of instructions, called a program, to produce the desired result.
This form of design was followed, with a little change, even in the advanced versions of computers developed later. The changed version saw a division of the CPU into memory and arithmetic logic unit (ALU) parts, with separate input and output sections.
In fact, the first four generations of computers followed this basic form of design. It was chiefly the type of hardware used that made the difference between generations. For instance, the first generation was based on vacuum tube technology. This was upgraded with the arrival of transistors and printed circuit board technology in the second generation. It was further upgraded by the arrival of integrated circuit chips, where little chips replaced a large number of components; thus the size of the computer was greatly reduced in the third generation, while it became more powerful. But the real marvel came during the 1970s, with the introduction of very large scale integration (VLSI) technology in the fourth generation. Aided by this technology, a tiny microprocessor can store millions of pieces of data.
Based on this technology, IBM introduced its famous Personal Computer. Since then IBM itself, and other makers including Apple, Sinclair, and so forth, have kept on developing more and more advanced versions of personal computers, along with bigger and more powerful machines like mainframes and supercomputers for more complicated work.
Meanwhile, tinier versions like laptops and even palmtops have come up with more advanced technologies over the past couple of decades. But the advancement of hardware technology alone cannot take full credit for the amazing progress of computers over the past few decades. Software, the inbuilt logic that runs the computer the way you like, kept being developed at an equal pace, and the arrival of famous software makers like Microsoft, Oracle, and Sun has helped pace up the development. The result of all this painstaking research is a device that helps us solve complex problems at lightning speed and is easy to use and operate: the computer.
The history of computer development is often discussed in terms of the different generations of computing devices. Each generation of computer is characterized by a major technological development that fundamentally changed the way computers operate, resulting in increasingly smaller, cheaper, more powerful, and more efficient and reliable devices. Read about each generation and the developments that led to the current devices we use today.
First Generation (1940-1956) Vacuum Tubes
The first computers used vacuum tubes for circuitry and magnetic drums for memory, and were often enormous, taking up entire rooms. They were very expensive to operate and, in addition to using a great deal of electricity, generated a lot of heat, which was often the cause of malfunctions.
First generation computers relied on machine language, the lowest-level programming language understood by computers, to perform operations, and they could only solve one problem at a time. Input was based on punched cards and paper tape, and output was displayed on printouts.
The UNIVAC and ENIAC computers are examples of first-generation computing devices. The UNIVAC was the first commercial computer delivered to a business client, the U.S. Census Bureau in 1951.
Second Generation (1956-1963) Transistors
Transistors replaced vacuum tubes and ushered in the second generation of computers. The transistor was invented in 1947 but did not see widespread use in computers until the late 1950s. The transistor was far superior to the vacuum tube, allowing computers to become smaller, faster, cheaper, more energy-efficient and more reliable than their first-generation predecessors. Though the transistor still generated a great deal of heat that subjected the computer to damage, it was a vast improvement over the vacuum tube. Second-generation computers still relied on punched cards for input and printouts for output.
Second-generation computers moved from cryptic binary machine language to symbolic, or assembly, languages, which allowed programmers to specify instructions in words. High-level programming languages were also being developed at this time, such as early versions of COBOL and FORTRAN. These were also the first computers that stored their instructions in their memory, which moved from a magnetic drum to magnetic core technology.
The first computers of this generation were developed for the atomic energy industry.
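To illustrate the shift from binary machine language to symbolic assembly described above, here is a toy assembler sketch in Python. The three mnemonics and their opcodes are invented purely for demonstration and do not correspond to any real second-generation machine:

```python
# Toy illustration of symbolic assembly: mnemonics (words) are
# translated into the numeric opcodes a machine actually executes.
# The instruction set here is invented purely for demonstration.

OPCODES = {"LOAD": 0b0001, "ADD": 0b0010, "STORE": 0b0011}

def assemble(program):
    """Translate (mnemonic, address) pairs into 12-bit machine words."""
    return [(OPCODES[op] << 8) | addr for op, addr in program]

source = [("LOAD", 10), ("ADD", 11), ("STORE", 12)]
for (op, addr), word in zip(source, assemble(source)):
    print(f"{op:5} {addr:3}  ->  {word:012b}")
```

The programmer writes "LOAD 10" rather than memorizing the bit pattern 000100001010; the assembler does the translation, which is exactly the convenience symbolic languages introduced.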
Third Generation (1964-1971) Integrated Circuits
The development of the integrated circuit was the hallmark of the third generation of computers. Transistors were miniaturized and placed on silicon chips, called semiconductors, which drastically increased the speed and efficiency of computers.
Instead of punched cards and printouts, users interacted with third generation computers through keyboards and monitors and interfaced with an operating system, which allowed the device to run many different applications at one time with a central program that monitored the memory. Computers for the first time became accessible to a mass audience because they were smaller and cheaper than their predecessors.
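The idea of "a central program that monitored the memory" and ran many applications at once can be sketched as a toy cooperative scheduler in Python. This is a simplified illustration of the concept, not how any real third-generation operating system was implemented:

```python
# Toy illustration of multitasking: a central program (the scheduler)
# shares the processor among several applications by running each one
# a little at a time. Purely a conceptual sketch.

from collections import deque

def application(name, steps):
    """A pretend application that needs several slices of CPU time."""
    for step in range(1, steps + 1):
        print(f"{name}: step {step} of {steps}")
        yield  # give control back to the central program

scheduler = deque([application("editor", 2), application("payroll", 3)])
while scheduler:
    task = scheduler.popleft()
    try:
        next(task)              # run one time slice of this application
        scheduler.append(task)  # not finished: queue it to run again
    except StopIteration:
        pass                    # this application has finished
```

Running it interleaves the two applications' steps, which is the effect users saw for the first time in this generation: several programs apparently running at once under one supervising program.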
Fourth Generation (1971-Present) Microprocessors
The microprocessor brought the fourth generation of computers, as thousands of integrated circuits were built onto a single silicon chip. What in the first generation filled an entire room could now fit in the palm of the hand. The Intel 4004 chip, developed in 1971, located all the components of the computer—from the central processing unit and memory to input/output controls—on a single chip.
In 1981 IBM introduced its first computer for the home user, and in 1984 Apple introduced the Macintosh. Microprocessors also moved out of the realm of desktop computers and into many areas of life as more and more everyday products began to use microprocessors.
As these small computers became more powerful, they could be linked together to form networks, which eventually led to the development of the Internet. Fourth generation computers also saw the development of GUIs, the mouse and handheld devices.
Fifth Generation (Present and Beyond) Artificial Intelligence
Fifth generation computing devices, based on artificial intelligence, are still in development, though there are some applications, such as voice recognition, that are being used today. The use of parallel processing and superconductors is helping to make artificial intelligence a reality. Quantum computation and molecular and nanotechnology will radically change the face of computers in years to come. The goal of fifth-generation computing is to develop devices that respond to natural language input and are capable of learning and self-organization.