March 30, 2007

National Science Day – Lecture on 5th Generation System & Artificial Intelligence

Fifth Generation System & Artificial Intelligence

Invited talk on the occasion of "National Science Day – February 28, 2007", The Institution of Engineers (India), Madhya Pradesh State Centre, Bhopal, by R C Chakraborty, Visiting Prof., JIET, Guna, and Former Dir., DTRL & ISSA (DRDO).

Artificial intelligence (AI) : Researchers are creating systems that can mimic human thought, understand speech, beat the best human chess players, and perform other feats never before possible. AI has a unique place in science, sharing borders with mathematics, computer science, philosophy, psychology, biology, cognitive science and other fields. While there is no clear definition of artificial intelligence, or even of intelligence, it is an attempt to build machines that, like humans, can think, act, learn and use knowledge to solve problems on their own.

Roots of AI : About 5,000 years ago the abacus, a machine with memory, emerged in the Far East for accounting crops, land and population. Millennia later, Pascal invented a calculating machine in 1642, followed by Charles Babbage's programmable mechanical calculating machine in 1822. George Boole in 1854 published 'An Investigation of the Laws of Thought', on which the calculus of logic (Boolean algebra) and of probabilities are based. A century later, in 1936, Alan Turing conceived the 'Turing machine', a hypothetical device representing a computing machine.

In 1946 the University of Pennsylvania unveiled ENIAC (Electronic Numerical Integrator and Calculator), the first completely electronic general-purpose computer, built by Eckert and Mauchly. It contained 17,468 vacuum tubes, 7,200 crystal diodes, 1,500 relays, 70,000 resistors, 10,000 capacitors and around 5 million hand-soldered joints. It weighed 30 tons, measured roughly 8 x 3 x 100 feet, occupied about 1,800 sq. ft of floor space, and consumed 150 kW of power. Since then, electronic computer technology has evolved fast. John von Neumann, at Princeton in 1945, conceived the stored-program architecture for the digital computer. Magnetic memory, magnetic tape and "drum" disks were realized in the early 1950s. The first American commercial computer, UNIVAC I (UNIVersal Automatic Computer I), designed for business use, hit the market in 1951. UNIVAC I used 5,200 vacuum tubes, consumed 125 kW, ran at a 2.25 MHz clock, had a memory of 1,000 words of 12 characters, and performed 1,905 operations per second. The central complex (processor and memory unit) measured 4.3 m x 2.4 m x 2.6 m; the complete system occupied more than 350 sq. ft of floor space and weighed 13 tons. Just a year earlier, in 1950, Alan Turing had published 'Computing Machinery and Intelligence', a work aimed at capturing what the human mind can do when carrying out a procedure. His work is regarded as the foundation of computer science and of the artificial intelligence program.

The notion of computer 'generations' : The timeline of computer development is often described in terms of different generations, each characterized by a major technological development that fundamentally changed the way computers operate.

  • First Generation (1940 – 1956) Computers : Used vacuum tubes for circuitry and magnetic drums for memory; relied on machine language to perform operations; input on punched cards and paper tape, output on printouts; large power consumption, heat, size and cost. Examples : UNIVAC and ENIAC.
  • Second Generation (1956 – 1963) Computers : Used transistors, invented in 1947, which replaced vacuum tubes; computers became smaller, faster, cheaper, more energy-efficient and more reliable; moved from magnetic drum to magnetic-core memory; continued input on punched cards and paper tape, and output on printouts; moved from machine language to assembly languages (instructions specified in words), with instructions stored in memory. High-level programming languages such as COBOL and FORTRAN were being developed. Examples : IBM 7030, IBM 7094. The first two computers of this generation were developed for the atomic energy industry.
  • Third Generation (1964 – 1971) Computers : Used the integrated circuit (IC), developed in 1959; users interacted through keyboards and monitors, which replaced punched cards and printouts; an operating system allowed the machine to run many different applications at one time, with a central program monitoring memory. Computers became smaller and cheaper, and accessible to many more people. Examples : IBM 370, CDC 7600.
  • Fourth Generation (1971 – 1984 to present day) Computers : Used the microprocessor, in which thousands of integrated circuits are built onto a single silicon chip. Intel released the first microprocessor, the 4004 chip, in 1971; it located all the components of the computer (the central processing unit (CPU), arithmetic logic unit (ALU), memory, and I/O controls) on a single chip, and could process four bits of data at a time. Structured programming languages such as Pascal and C were developed. Computers became small and more powerful, and could be linked together to form networks, which eventually led to the development of the Internet. Fourth-generation computers also saw the development of GUIs, the mouse and handheld devices. Examples : the IBM PC/AT with MS-DOS as the operating system, other personal computers (desktop and laptop) using the UNIX OS, and high-performance parallel processors containing thousands of CPUs.
  • Fifth Generation (Present & Beyond) Computers : Use artificial intelligence and are still in development. The conventional computers built during all four previous generations follow an operational design known as the von Neumann architecture. Such a computer can do only what it is instructed to do in a detailed program; it cannot assimilate new facts that were not included in the program.

Artificial intelligence computers operate in a fundamentally different fashion: they are primarily symbolic processors. Artificial Intelligence (AI) is the science that automates intelligent behaviour. An AI system is one that thinks and acts like humans, and rationally. AI is the study of mental faculties through the use of computational methods: the use of computers to do symbolic reasoning, pattern recognition, learning, and some forms of inference. An AI program sorts through its stored memory to determine its own sequence of steps. AI systems use "heuristics" instead of merely preset algorithms; heuristics enable machines to recognize promising approaches to solving problems. We expect AI computers to do things which, at the moment, people do better.
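The contrast between preset algorithms and heuristics can be made concrete with a small search example. The sketch below (not from the lecture; the graph, coordinates and names are invented for illustration) is a greedy best-first search in Python: instead of exhaustively trying nodes in a fixed, preset order, it uses a heuristic estimate (straight-line distance to the goal) to decide which node looks most promising to expand next.

```python
import heapq
import math

# A toy map: edges between nodes and 2-D coordinates for each node.
# All values are invented for illustration.
edges = {
    "A": ["B", "C"],
    "B": ["D"],
    "C": ["D", "E"],
    "D": ["E"],
    "E": [],
}
coords = {"A": (0, 0), "B": (1, 2), "C": (1, -1), "D": (2, 1), "E": (3, 0)}

def heuristic(node, goal):
    """Estimated remaining cost: straight-line distance to the goal."""
    (x1, y1), (x2, y2) = coords[node], coords[goal]
    return math.hypot(x2 - x1, y2 - y1)

def greedy_best_first(start, goal):
    """Always expand the node the heuristic judges most promising."""
    frontier = [(heuristic(start, goal), start, [start])]  # priority queue
    visited = set()
    while frontier:
        _, node, path = heapq.heappop(frontier)
        if node == goal:
            return path
        if node in visited:
            continue
        visited.add(node)
        for nxt in edges[node]:
            heapq.heappush(frontier, (heuristic(nxt, goal), nxt, path + [nxt]))
    return None  # no route found

print(greedy_best_first("A", "E"))  # finds A -> C -> E without trying B or D
```

The heuristic here is only an estimate, so the search is not guaranteed to find the shortest route in general; its value is exactly what the paragraph above describes: it steers the machine toward promising options rather than enumerating every possibility.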

For the complete lecture slides, see the website URL :


By R C Chakraborty, February 28, 2007.
