Lecture 36

CS101

Midterm & Final Term Short Notes

Roots of Computing



Important MCQs
Midterm & Final Term Preparation
Past papers included

  1. Who is considered the father of modern computing? a. Charles Babbage b. Alan Turing c. Tim Berners-Lee d. John Mauchly Answer: b. Alan Turing


  2. What was the first computer program? a. Algorithm for calculating Bernoulli numbers b. Algorithm for playing chess c. Algorithm for searching the internet d. Algorithm for solving quadratic equations Answer: a. Algorithm for calculating Bernoulli numbers


  3. What was the name of the first general-purpose electronic computer? a. UNIVAC b. ENIAC c. IBM 701 d. EDVAC Answer: b. ENIAC


  4. Who invented the first mechanical calculator? a. Ada Lovelace b. Charles Babbage c. Blaise Pascal d. John Napier Answer: c. Blaise Pascal


  5. What is the significance of the invention of the transistor in computing history? a. It paved the way for the development of smaller and more efficient electronic devices b. It led to the invention of the first computer c. It enabled the development of the World Wide Web d. It allowed computers to connect to the internet Answer: a. It paved the way for the development of smaller and more efficient electronic devices


  6. What is the difference between a computer and a calculator? a. Computers are more powerful than calculators b. Calculators are more specialized than computers c. Computers use vacuum tubes instead of transistors d. Calculators are easier to use than computers Answer: b. Calculators are more specialized than computers


  7. Who invented the World Wide Web? a. Tim Berners-Lee b. Bill Gates c. Steve Jobs d. Mark Zuckerberg Answer: a. Tim Berners-Lee


  8. What is the significance of Moore's Law in computing history? a. It predicts that the number of transistors on a microchip doubles every two years b. It predicts that computers will become obsolete within five years of their manufacture c. It predicts that computers will never become smaller than a certain size d. It predicts that the internet will continue to grow exponentially Answer: a. It predicts that the number of transistors on a microchip doubles every two years


  9. What is the difference between software and hardware? a. Software refers to the physical components of a computer, while hardware refers to the programs that run on it b. Hardware refers to the physical components of a computer, while software refers to the programs and instructions that run on it c. Software refers to the programs and instructions that run on a computer, while hardware refers to the data stored on it d. Hardware refers to the data stored on a computer, while software refers to the physical components of it Answer: b. Hardware refers to the physical components of a computer, while software refers to the programs and instructions that run on it


  10. What is the significance of the invention of the GUI in computing history? a. It made computers more powerful than ever before b. It made it easier and more intuitive to use computers c. It made it possible to connect computers to the internet d. It made it possible to send emails from computers Answer: b. It made it easier and more intuitive to use computers



Subjective Short Notes
Midterm & Final Term Preparation
Past papers included

  1. What is the significance of the Analytical Engine in the roots of computing?

Answer: The Analytical Engine, designed by Charles Babbage in the mid-19th century, was one of the earliest mechanical general-purpose computers, and it laid the foundation for modern computing.

  2. Who is considered the father of modern computing?

Answer: Alan Turing is widely considered the father of modern computing for his foundational contributions to theoretical computer science and his role in breaking the German Enigma cipher during World War II.

  3. What was the first computer program?

Answer: The first computer program was written by Ada Lovelace in the 19th century for Charles Babbage's Analytical Engine. It was an algorithm for calculating Bernoulli numbers.
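For intuition, here is a minimal Python sketch of one standard way to compute Bernoulli numbers, using the classical recurrence B_m = -1/(m+1) * sum of C(m+1, j) * B_j for j < m. It is purely illustrative and does not reproduce Lovelace's original method or notation.

```python
from fractions import Fraction
from math import comb

def bernoulli(n):
    """Return the Bernoulli numbers B_0 .. B_n as exact fractions,
    via the classical recurrence:
        B_m = -1/(m+1) * sum_{j=0}^{m-1} C(m+1, j) * B_j,  with B_0 = 1.
    Illustrative only; not Lovelace's original notation."""
    B = [Fraction(0)] * (n + 1)
    B[0] = Fraction(1)
    for m in range(1, n + 1):
        acc = sum(comb(m + 1, j) * B[j] for j in range(m))
        B[m] = Fraction(-1, m + 1) * acc
    return B

print([str(b) for b in bernoulli(8)])
# ['1', '-1/2', '1/6', '0', '-1/30', '0', '1/42', '0', '-1/30']
```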

  4. Who invented the first electronic computer?

Answer: The first general-purpose electronic computer was the ENIAC (Electronic Numerical Integrator and Computer), built by John Mauchly and J. Presper Eckert and completed in 1945.

  5. What is the significance of the invention of the transistor in computing history?

Answer: The invention of the transistor in 1947 by John Bardeen, Walter Brattain, and William Shockley paved the way for the development of smaller, faster, and more efficient electronic devices, including computers.

  6. What is the difference between a computer and a calculator?

Answer: A calculator is a small, specialized device designed to perform mathematical calculations, while a computer is a more general-purpose device that can perform a wide variety of tasks, including mathematical calculations.

  7. Who invented the World Wide Web?

Answer: The World Wide Web was invented by British computer scientist Tim Berners-Lee in 1989.

  8. What is the significance of Moore's Law in computing history?

Answer: Moore's Law is a prediction made by Intel co-founder Gordon Moore in 1965 that the number of transistors on a microchip would double every two years, leading to exponential growth in computing power. It has proven to be remarkably accurate and has driven much of the rapid progress in computing over the past few decades.
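As a rough illustration of the arithmetic behind Moore's Law, the short Python sketch below projects a transistor count under a two-year doubling assumption. The starting figure of 2,300 transistors (roughly the scale of early-1970s microprocessors) is an illustrative assumption, not quoted data.

```python
def projected_transistors(base_count, years, doubling_period=2):
    """Project a transistor count under Moore's Law:
    the count doubles every `doubling_period` years."""
    return base_count * 2 ** (years / doubling_period)

# Hypothetical starting point: 2,300 transistors, projected 20 years
# ahead -> 10 doublings -> roughly 2.36 million transistors.
print(f"{projected_transistors(2300, 20):,.0f}")  # 2,355,200
```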

  9. What is the difference between software and hardware?

Answer: Hardware refers to the physical components of a computer, such as a processor, memory, and storage devices, while software refers to the programs and instructions that run on the hardware to perform specific tasks.

  10. What is the significance of the invention of the GUI in computing history?

Answer: The invention of the graphical user interface (GUI) by Xerox PARC in the 1970s revolutionized the way people interact with computers by making it easier and more intuitive to use. It led to the widespread adoption of personal computers and has influenced the design of many other electronic devices, such as smartphones and tablets.

Roots of Computing

Computing, as we know it today, is the culmination of centuries of progress and innovation in mathematics and science. Its roots can be traced back to ancient civilizations such as the Babylonians, who developed an intricate system of calculation in base 60, and the Greeks, who made significant contributions to geometry. However, it was not until the 19th century that the foundations of modern computing were laid. Charles Babbage's design of the Analytical Engine in 1837, an early programmable mechanical computer, marked a significant milestone: the machine was intended to perform complex mathematical calculations using punched cards and is considered a precursor to the modern computer. Another important figure from this period is Ada Lovelace, a mathematician and writer who is widely regarded as the world's first computer programmer. Lovelace worked closely with Babbage and wrote the first algorithm for his Analytical Engine, recognizing that the machine could be used for a wide range of tasks beyond mathematical calculation.

The early 20th century saw the development of electronic computers, which replaced the mechanical and electromechanical devices used previously. The first electronic digital computer, the Atanasoff-Berry Computer, was developed by John Atanasoff and Clifford Berry in the late 1930s. It was followed by the first programmable electronic computer, the Colossus, which the British used during World War II to decrypt coded messages. The invention of the transistor in 1947 by William Shockley, John Bardeen, and Walter Brattain was a major breakthrough: transistors were smaller, faster, and more reliable than the vacuum tubes used in earlier electronic computers, and they paved the way for smaller and more powerful machines.

The 1960s brought integrated circuits, which combined many transistors on a single chip and led to the first minicomputers, smaller and more affordable than earlier computers. The 1970s brought the first microprocessors, which placed an entire processor on a single chip and paved the way for the personal computers that became increasingly popular in the 1980s. The 1990s saw the development of the World Wide Web, which revolutionized the way we access and share information, and the growth of the internet and mobile computing has since transformed the way we communicate, work, and live. Today we rely on computers for everything from sending email and browsing the web to conducting business and managing our finances.

In conclusion, the roots of computing reach back to ancient civilizations, but the foundations of modern computing were laid in the 19th century. Electronic computers, the transistor, integrated circuits, and microprocessors have produced ever smaller, faster, and more powerful machines; the internet and mobile computing have transformed how we communicate and access information; and computers have become an integral part of daily life. The history of computing is a fascinating subject that continues to evolve and shape the world we live in today.