Thursday, January 30, 2014

Third Generation (1964-1971): Integrated Circuits

The development of the integrated circuit was the hallmark of the third generation of computers. Transistors were miniaturized and placed on silicon chips, called semiconductors, which drastically increased the speed and efficiency of computers. Instead of punched cards and printouts, users interacted with third-generation computers through keyboards and monitors and interfaced with an operating system, which allowed the device to run many different applications at one time with a central program that monitored the memory. Computers became accessible to a mass audience for the first time because they were smaller and cheaper than their predecessors.

Fourth Generation (1971-Present): Microprocessors

The microprocessor brought about the fourth generation of computers, as thousands of integrated circuits were built onto a single silicon chip. What in the first generation filled an entire room could now fit in the palm of the hand. The Intel 4004 chip, developed in 1971, located all the components of the computer, from the central processing unit and memory to input/output controls, on a single chip. In 1981 IBM introduced its first computer for the home user, and in 1984 Apple introduced the Macintosh. Microprocessors also moved out of the realm of desktop computers and into many areas of life as more and more everyday products began to use them. As these small computers became more powerful, they could be linked together to form networks, which eventually led to the development of the Internet. Fourth-generation computers also saw the development of GUIs, the mouse, and handheld devices.

Fifth Generation (Present and Beyond): Artificial Intelligence

Fifth-generation computing devices, based on artificial intelligence, are still in development, though some applications, such as voice recognition, are already in use today. The use of parallel processing and superconductors is helping to make artificial intelligence a reality. Quantum computation and molecular and nanotechnology promise to change the face of computers radically in years to come. The goal of fifth-generation computing is to develop devices that respond to natural-language input and are capable of learning and self-organization.
Second Generation (1956-1963): Transistors

Transistors replaced vacuum tubes and ushered in the second generation of computers. The transistor was invented in 1947 but did not see widespread use in computers until the late 1950s. The transistor was far superior to the vacuum tube, allowing computers to become smaller, faster, cheaper, more energy-efficient, and more reliable than their first-generation predecessors. Though the transistor still generated a great deal of heat that subjected the computer to damage, it was a vast improvement over the vacuum tube. Second-generation computers still relied on punched cards for input and printouts for output. They moved from cryptic binary machine language to symbolic, or assembly, languages, which allowed programmers to specify instructions in words. High-level programming languages were also being developed at this time, such as early versions of COBOL and FORTRAN. These were also the first computers to store their instructions in memory, which moved from magnetic-drum to magnetic-core technology. The first computers of this generation were developed for the atomic energy industry.
First Generation (1940-1956): Vacuum Tubes

The first computers used vacuum tubes for circuitry and magnetic drums for memory, and were often enormous, taking up entire rooms. They were very expensive to operate, and in addition to using a great deal of electricity they generated a lot of heat, which was often the cause of malfunctions. First-generation computers relied on machine language, the lowest-level programming language understood by computers, to perform operations, and they could solve only one problem at a time. Input was based on punched cards and paper tape, and output was displayed on printouts. The UNIVAC and ENIAC computers are examples of first-generation computing devices. The UNIVAC was the first commercially produced computer; its first unit was delivered to the U.S. Census Bureau in 1951.
The history of computer science began long before the modern discipline of computer science emerged in the 20th century, and was hinted at in the centuries prior. The progression from mechanical inventions and mathematical theories towards the modern concepts and machines formed a major academic field and the basis of a massive worldwide industry.[1]

Early history

The earliest known tool for use in computation was the abacus, developed in the period 2700-2300 BCE in Sumer. The Sumerians' abacus consisted of a table of successive columns which delimited the successive orders of magnitude of their sexagesimal number system.[2] Its original style of usage was by lines drawn in sand with pebbles. Abaci of a more modern design are still used as calculation tools today.[3]

The Antikythera mechanism is believed to be the earliest known mechanical analog computer.[4] It was designed to calculate astronomical positions. It was discovered in 1901 in the Antikythera wreck off the Greek island of Antikythera, between Kythera and Crete, and has been dated to c. 100 BCE. Technological artifacts of similar complexity did not reappear until the 14th century, when mechanical astronomical clocks appeared in Europe.[5]

Mechanical analog computing devices appeared a thousand years later in the medieval Islamic world. Examples of devices from this period include the equatorium by Arzachel,[6] the mechanical geared astrolabe by Abū Rayhān al-Bīrūnī,[7] and the torquetum by Jabir ibn Aflah.[8] Muslim engineers built a number of automata, including some musical automata that could be 'programmed' to play different musical patterns. These devices were developed by the Banū Mūsā brothers[9] and Al-Jazari.[10] Muslim mathematicians also made important advances in cryptography, such as the development of cryptanalysis and frequency analysis by Alkindus.[11]

When John Napier discovered logarithms for computational purposes in the early 17th century, there followed a period of considerable progress by inventors and scientists in making calculating tools. In 1623 Wilhelm Schickard designed a calculating machine but abandoned the project when the prototype he had started building was destroyed by a fire in 1624. Around 1640, Blaise Pascal, a leading French mathematician, constructed the first mechanical adding device,[12] based on a design described by the Greek mathematician Hero of Alexandria.[13] Then in 1672 Gottfried Wilhelm Leibniz invented the Stepped Reckoner, which he completed in 1694.[14]

In 1837 Charles Babbage first described his Analytical Engine, which is accepted as the first design for a modern computer. The Analytical Engine had expandable memory, an arithmetic unit, and logic-processing capabilities able to interpret a programming language with loops and conditional branching. Although never built, the design has been studied extensively and is understood to be Turing complete. The Analytical Engine would have had a memory capacity of less than 1 kilobyte and a clock speed of less than 10 hertz.

Considerable advancement in mathematics and electronics theory was required before the first modern computers could be designed.

Binary logic

In 1703, Gottfried Wilhelm Leibniz developed logic in a formal, mathematical sense with his writings on the binary numeral system. In his system, the ones and zeros also represent true and false values or on and off states. But it took more than a century before George Boole published his Boolean algebra in 1854, with a complete system that allowed computational processes to be mathematically modeled.

By this time, the first mechanical devices driven by a binary pattern had been invented. The Industrial Revolution had driven forward the mechanization of many tasks, and this included weaving. Punched cards controlled Joseph Marie Jacquard's loom in 1801, where a hole punched in the card indicated a binary one and an unpunched spot indicated a binary zero. Jacquard's loom was far from being a computer, but it did illustrate that machines could be driven by binary systems.
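Boole's system can be stated directly in code: treating 1 as true and 0 as false, the logical connectives become ordinary arithmetic. A minimal Python sketch of this correspondence (the function names are just illustrative):

```python
# Boolean algebra over {0, 1}: binary digits double as truth values,
# so logical reasoning becomes ordinary arithmetic.
def NOT(a):
    return 1 - a

def AND(a, b):
    return a * b            # conjunction behaves like multiplication

def OR(a, b):
    return a + b - a * b    # inclusive disjunction

print(NOT(1), AND(1, 0), OR(1, 0))  # -> 0 0 1

# De Morgan's law, checked over every input pair:
assert all(NOT(AND(a, b)) == OR(NOT(a), NOT(b))
           for a in (0, 1) for b in (0, 1))
```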
Birth of the computer

Before the 1920s, computers (sometimes spelled computors) were human clerks who performed computations. They usually worked under the direction of a physicist. Many thousands of computers were employed in commerce, government, and research establishments; many of them were women with training in calculus. Some performed astronomical calculations for calendars.

After the 1920s, the expression computing machine referred to any machine that performed the work of a human computer, especially those in accordance with effective methods of the Church-Turing thesis. The thesis states that a mathematical method is effective if it could be set out as a list of instructions able to be followed by a human clerk with paper and pencil, for as long as necessary, and without ingenuity or insight.

Machines that computed with continuous values became known as the analog kind. They used machinery that represented continuous numeric quantities, like the angle of a shaft rotation or a difference in electrical potential. Digital machinery, in contrast to analog, was able to render a state of a numeric value and store each individual digit. Digital machinery used difference engines or relays before the invention of faster memory devices.

The phrase computing machine gradually gave way, after the late 1940s, to just computer as the onset of electronic digital machinery became common. These computers were able to perform the calculations that had previously been performed by human clerks.

Since the values stored by digital machines were not bound to physical properties the way analog devices were, a logical computer based on digital equipment was able to do anything that could be described as "purely mechanical." The theoretical Turing machine, created by Alan Turing, is a hypothetical device theorized in order to study the properties of such hardware.

Emergence of a discipline

Charles Babbage and Ada Lovelace

Charles Babbage is often regarded as one of the first pioneers of computing. Beginning in the 1810s, Babbage had a vision of mechanically computing numbers and tables.
Putting this vision into practice, Babbage designed a calculator to compute numbers up to 8 decimal digits long. Continuing with the success of this idea, Babbage worked to develop a machine that could compute numbers with up to 20 decimal places. By the 1830s, Babbage had devised a plan to develop a machine that could use punched cards to perform arithmetical operations. The machine would store numbers in memory units, and there would be a form of sequential control: one operation would be carried out before another in such a way that the machine would produce an answer and not fail. This machine was to be known as the "Analytical Engine", the first true forerunner of the modern computer.[15]

Ada Lovelace (Augusta Ada Byron) is credited as the pioneer of computer programming and is regarded as a mathematical genius, a result of the mathematically heavy tutoring regimen her mother assigned to her as a young girl. Lovelace began working with Charles Babbage as an assistant while Babbage was working on his Analytical Engine, the first mechanical computer. During her work with Babbage, Ada Lovelace became the designer of the first computer algorithm, which could compute the Bernoulli numbers. Moreover, her work with Babbage led her to predict that future computers would not only perform mathematical calculations but also manipulate symbols, mathematical or not. While she was never able to see the results of her work, as the Analytical Engine was not built in her lifetime, her efforts were finally recognized in later years, beginning in the 1940s.[16]

Alan Turing and the Turing machine

The mathematical foundations of modern computer science began to be laid by Kurt Gödel with his incompleteness theorem (1931). In this theorem, he showed that there were limits to what could be proved and disproved within a formal system. This led to work by Gödel and others to define and describe these formal systems, including concepts such as mu-recursive functions and lambda-definable functions.

1936 was a key year for computer science. Alan Turing and Alonzo Church independently, and also together, introduced the formalization of an algorithm, with limits on what can be computed, and a "purely mechanical" model for computing. These topics are covered by what is now called the Church-Turing thesis, a hypothesis about the nature of mechanical calculation devices, such as electronic computers. The thesis claims that any calculation that is possible can be performed by an algorithm running on a computer, provided that sufficient time and storage space are available.

In 1936, Alan Turing introduced his idea of what are now referred to as Turing machines and, anticipating the modern stored-program computer, described what became known as the universal Turing machine. These Turing machines were designed to formally determine, mathematically, what can be computed, taking into account limitations on computing ability. If a Turing machine can complete the task, it is considered Turing computable.[17] Turing machines are not physical objects but mathematical ones. They show if and how any given algorithm can be computed. Turing machines are state machines, where a state represents a position in a graph. State machines use various states, or graph positions, to determine the outcome of the algorithm. To accomplish this, a theoretical one-dimensional tape is divided into an infinite number of cells, each containing a binary digit, 1 or 0. As the read/write head of the machine scans the value in the current cell, it uses this value to determine what state to transition to next. To do so, the machine follows a set of rules, usually given in the form of tables, that contain logic similar to: if the machine is in state A and a 0 is read in, the machine is going to go to the next state, say, state B. The rules that the machine must follow are considered the program. These Turing machines helped define the logic behind modern computer science. Memory in modern computers is represented by the infinite tape, and the bus of the machine is represented by the read/write head.[17]
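The tape-and-rules model described above is easy to simulate. Below is a minimal sketch in Python (the state names and rule table are illustrative, not drawn from Turing's paper); it decides whether the binary number on the tape is even by inspecting its last bit, anticipating the "IsEven"-style decision function discussed further on:

```python
# A minimal Turing machine simulator (illustrative sketch).
# Rules map (state, symbol) -> (new state, symbol to write, head move).
RULES = {
    ("scan", "0"): ("scan", "0", +1),   # keep moving right over the input
    ("scan", "1"): ("scan", "1", +1),
    ("scan", "_"): ("back", "_", -1),   # hit the blank: step back to the last bit
    ("back", "0"): ("accept", "0", 0),  # last bit 0 -> the number is even
    ("back", "1"): ("reject", "1", 0),  # last bit 1 -> the number is odd
}

def run(tape, state="scan", head=0):
    tape = list(tape) + ["_"]           # finite stand-in for the infinite tape
    while state not in ("accept", "reject"):
        state, tape[head], move = RULES[(state, tape[head])]
        head += move
    return state

print(run("10110"))  # -> accept: binary 10110 is 22, which is even
```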
Turing focused heavily on designing a machine that could determine what can be computed. He concluded that as long as a Turing machine exists that can compute a precise approximation of a number, that value is computable. This includes constants such as pi. Furthermore, functions can be computable when a machine can determine TRUE or FALSE for any given parameters. One example would be a function "IsEven": if this function were passed a number, the computation would produce TRUE if the number were even and FALSE if the number were odd. Using these specifications, Turing machines can determine whether a function is computable, terminating if it is.[17] Turing machines can also interpret logic operators, such as "AND, OR, XOR, NOT, and IF-THEN-ELSE",[17] to determine if a function is computable.

Turing is so important to computer science that his name is also featured on the Turing Award and the Turing test. He contributed greatly to British code-breaking successes in the Second World War, and continued to design computers and software through the 1940s until his untimely death in 1954. At a symposium on large-scale digital machinery in Cambridge, Turing said, "We are trying to build a machine to do all kinds of different things simply by programming rather than by the addition of extra apparatus."

In 1941, Konrad Zuse developed the world's first functional program-controlled Turing-complete computer, the Z3. Zuse was also noted for the S2 computing machine, considered the first process-controlled computer. He founded one of the earliest computer businesses in 1941, producing the Z4, which became the world's first commercial computer. In 1946, he designed the first high-level programming language, Plankalkül.[18] In 1969, Zuse suggested the concept of a computation-based universe in his book Rechnender Raum (Calculating Space).

In 1948, the first practical computer that could run stored programs, based on the Turing machine model, was built: the Manchester Baby. In 1950, Britain's National Physical Laboratory completed the Pilot ACE, a small-scale programmable computer based on Turing's philosophy.
Shannon and information theory

Up to and during the 1930s, electrical engineers were able to build electronic circuits to solve mathematical and logic problems, but most did so in an ad hoc manner, lacking any theoretical rigor. This changed with Claude Elwood Shannon's publication of his 1937 master's thesis, A Symbolic Analysis of Relay and Switching Circuits. While taking an undergraduate philosophy class, Shannon had been exposed to Boole's work and recognized that it could be used to arrange electromechanical relays (then used in telephone routing switches) to solve logic problems. This concept, of utilizing the properties of electrical switches to do logic, is the basic concept that underlies all electronic digital computers, and his thesis became the foundation of practical digital circuit design when it became widely known among the electrical engineering community during and after World War II.

Shannon went on to found the field of information theory with his 1948 paper A Mathematical Theory of Communication, which applied probability theory to the problem of how best to encode the information a sender wants to transmit. This work is one of the theoretical foundations for many areas of study, including data compression and cryptography.
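The central quantity of that 1948 paper is entropy, H = -Σ p(x)·log2 p(x), the average number of bits per symbol needed to encode a source. A small Python sketch of the formula (the example messages are arbitrary):

```python
from collections import Counter
from math import log2

def entropy(message):
    """Shannon entropy in bits per symbol: H = -sum(p * log2(p))."""
    counts = Counter(message)
    total = len(message)
    return -sum((n / total) * log2(n / total) for n in counts.values())

# A source that emits 'a' half the time needs fewer bits per symbol
# than a uniform source over four symbols (which needs exactly 2).
print(round(entropy("aaaabbcd"), 2))  # -> 1.75
print(round(entropy("abcdabcd"), 2))  # -> 2.0
```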
Wiener and cybernetics

From experiments with anti-aircraft systems that interpreted radar images to detect enemy planes, Norbert Wiener coined the term cybernetics from the Greek word for "steersman." He published Cybernetics in 1948, which influenced artificial intelligence. Wiener also compared computation, computing machinery, memory devices, and other cognitive similarities with his analysis of brain waves.

The first actual computer bug was a moth. It was stuck in between the relays of the Harvard Mark II.[1] While the invention of the term 'bug' is often but erroneously attributed to Grace Hopper, a future rear admiral in the U.S. Navy, who supposedly logged the "bug" on September 9, 1945, most other accounts conflict at least with these details. According to these accounts, the actual date was September 9, 1947, when operators filed the 'incident', along with the insect and the notation "First actual case of bug being found" (see software bug for details).

John von Neumann and the von Neumann architecture

In 1946, a model for computer architecture was introduced, which became known as the von Neumann architecture. From 1950 on, the von Neumann model provided uniformity in subsequent computer designs. The von Neumann architecture was considered innovative as it introduced the idea of allowing machine instructions and data to share memory space. The von Neumann model is composed of three major parts: the arithmetic logic unit (ALU), the memory, and the instruction processing unit (IPU). In von Neumann machine design, the IPU passes addresses to memory, and memory, in turn, is routed either back to the IPU if an instruction is being fetched or to the ALU if data is being fetched.[19]

Von Neumann's machine design uses a RISC (reduced instruction set computing) architecture, meaning the instruction set uses a total of 21 instructions to perform all tasks. (This is in contrast to CISC, complex instruction set computing, instruction sets which have more instructions from which to choose.) Addresses, operations, and data types make up this instruction set. With von Neumann architecture, main memory and the accumulator (the register that holds the result of logical operations)[20] are the two memories that are addressed. Operations can be carried out as simple arithmetic (performed by the ALU: addition, subtraction, multiplication, and division), conditional branches (more commonly seen now as "if" statements or "while" loops; the branches serve as 'go to' statements), and logical moves between the different components of the machine, i.e., a move from the accumulator to memory or vice versa. Von Neumann architecture accepts fractions and instructions as data types. Finally, as the von Neumann architecture is a simple one, its register management is also simple. The architecture uses a set of seven registers to manipulate and interpret fetched data and instructions. These registers include the "IR (instruction register), IBR (instruction buffer register), MQ (multiplier quotient register), MAR (memory address register), and MDR (memory data register)."[19] The architecture also uses a program counter (PC) to keep track of where in the program the machine is.[19]
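The defining feature, instructions and data sharing one memory while a program counter fetches and executes them in sequence, can be sketched in a few lines of Python. The three-instruction toy machine below is illustrative only, not von Neumann's actual 21-instruction set:

```python
# A toy stored-program machine: instructions and data live in the same memory,
# a program counter (PC) fetches them, and an accumulator holds ALU results.
memory = [
    ("LOAD", 6),     # 0: accumulator <- memory[6]
    ("ADD", 7),      # 1: accumulator <- accumulator + memory[7]
    ("STORE", 8),    # 2: memory[8] <- accumulator
    ("HALT", None),  # 3: stop
    None, None,      # 4-5: unused
    40, 2, 0,        # 6-8: data words share the address space
]

pc, acc = 0, 0
while True:
    op, addr = memory[pc]   # fetch the instruction the PC points at
    pc += 1                 # step to the next instruction
    if op == "LOAD":
        acc = memory[addr]
    elif op == "ADD":
        acc += memory[addr]
    elif op == "STORE":
        memory[addr] = acc
    elif op == "HALT":
        break

print(memory[8])  # -> 42
```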
See also

  • 2006 in information technology
  • Computer Museum
  • History of computing
  • History of computing hardware
  • History of software
  • List of computer term etymologies, the origins of computer science words
  • List of prominent pioneers in computer science
  • Timeline of algorithms

Notes

1. History of Computer Science
2. Ifrah 2001:11
3. Bellos, Alex. "Abacus adds up to number joy in Japan". Retrieved 2013-06-25.
4. The Antikythera Mechanism Research Project. Retrieved 2007-07-01.
5. "In search of lost time", Jo Marchant, Nature 444, #7119 (November 30, 2006), pp. 534-538, doi:10.1038/444534a, PMID 17136067.
6. Hassan, Ahmad Y. "Transfer Of Islamic Technology To The West, Part II: Transmission Of Islamic Engineering". Retrieved 2008-01-22.
7. "Islam, Knowledge, and Science". University of Southern California. Retrieved 2008-01-22.
8. Lorch, R. P. (1976). "The Astronomical Instruments of Jabir ibn Aflah and the Torquetum". Centaurus 20 (1): 11-34. Bibcode:1976Cent...20...11L. doi:10.1111/j.1600-0498.1976.tb00214.x.
9. Koetsier, Teun (2001). "On the prehistory of programmable machines: musical automata, looms, calculators". Mechanism and Machine Theory (Elsevier) 36 (5): 589-603. doi:10.1016/S0094-114X(01)00005-2.
10. A 13th Century Programmable Robot, University of Sheffield.
11. Simon Singh, The Code Book, pp. 14-20.
12. Short history of the computer.
13. History of Computing Science: The First Mechanical Calculator.
14. Kidwell, Peggy Aldritch; Williams, Michael R. (1992). The Calculating Machines: Their History and Development. USA: Massachusetts Institute of Technology and Tomash Publishers, pp. 38-42; translated and edited from Martin, Ernst (1925). Die Rechenmaschinen und ihre Entwicklungsgeschichte. Germany: Pappenheim.
15. "Charles Babbage". Encyclopedia Britannica Online Academic Edition. Encyclopedia Britannica Inc. Retrieved 2013-02-20.
16. Isaacson, Betsy. "Ada Lovelace, World's First Computer Programmer, Celebrated With Google Doodle". The Huffington Post. http://www.huffingtonpost.com/2012/12/10/google-doodle-ada-lovelace_n_2270668.html. Retrieved 2013-02-20.
17. Barker-Plummer, David. "Turing Machines". The Stanford Encyclopedia of Philosophy. Retrieved 2013-02-20.
18. Talk given by Horst Zuse to the Computer Conservation Society at the Science Museum (London) on 18 November 2010.
19. Cragon, Harvey G. (2000). Computer Architecture and Implementation. Cambridge: Cambridge University Press. pp. 1-13. ISBN 0521651689.
20. "Accumulator", def. 3. Oxford Dictionaries.

Sources

  • Ifrah, Georges (2001), The Universal History of Computing: From the Abacus to the Quantum Computer, New York: John Wiley & Sons, ISBN 0-471-39671-0

Further reading

  • Alan Turing
  • A Very Brief History of Computer Science
  • Computer History Museum
  • Computers: From the Past to the Present
  • The First "Computer Bug" at the Online Library of the Naval Historical Center, retrieved February 28, 2006
  • Bitsavers, an effort to capture, salvage, and archive historical computer software and manuals from minicomputers and mainframes of the 1950s, 1960s, 1970s, and 1980s
  • Gordana Dodig-Crnkovic. "History of Computer Science". Mälardalen University.
  • Matti Tedre (2006). The Development of Computer Science: A Sociocultural Perspective. Doctoral thesis, University of Joensuu.

External links

  • Oral history interview with Albert H. Bowker at Charles Babbage Institute, University of Minnesota. Bowker discusses his role in the formation of the Stanford University computer science department, and his vision, as early as 1956, of computer science as an academic discipline.
  • Oral history interview with Joseph F. Traub at Charles Babbage Institute, University of Minnesota. Traub discusses why computer science has developed as a discipline at institutions including Stanford, Berkeley, University of Pennsylvania, MIT, and Carnegie-Mellon.
  • Oral history interview with Gene H. Golub at Charles Babbage Institute, University of Minnesota. Golub discusses his career in computer science at Stanford University.
  • Oral history interview with John Herriot at Charles Babbage Institute, University of Minnesota. Herriot describes the early years of computing at Stanford University, including formation of the computer science department, centering on the role of George Forsythe.
  • Oral history interview with William F. Miller at Charles Babbage Institute, University of Minnesota. Miller contrasts the emergence of computer science at Stanford with developments at Harvard and the University of Pennsylvania.
  • Oral history interview with Alexandra Forsythe at Charles Babbage Institute, University of Minnesota. Forsythe discusses the career of her husband, George Forsythe, who established Stanford University's program in computer science.
  • Oral history interview with Allen Newell at Charles Babbage Institute, University of Minnesota. Newell discusses his entry into computer science, funding for computer science departments and research, the development of the Computer Science Department at Carnegie Mellon University, including the work of Alan J. Perlis and Raj Reddy, and the growth of the computer science and artificial intelligence research communities. Compares computer science programs at Stanford, MIT, and Carnegie Mellon.
  • Oral history interview with Louis Fein at Charles Babbage Institute, University of Minnesota. Fein discusses establishing computer science as an academic discipline at Stanford Research Institute (SRI) as well as contacts with the University of California—Berkeley, the University of North Carolina, Purdue, International Federation for Information Processing and other institutions.
  • Oral history interview with W. Richards Adrion at Charles Babbage Institute, University of Minnesota. Adrion gives a brief history of theoretical computer science in the United States and NSF's role in funding that area during the 1970s and 1980s.
  • Oral history interview with Bernard A. Galler at Charles Babbage Institute, University of Minnesota. Galler describes the development of computer science at the University of Michigan from the 1950s through the 1980s and discusses his own work in computer science.
  • Michael S. Mahoney Papers at Charles Babbage Institute, University of Minnesota. Mahoney was the preeminent historian of computer science as a distinct academic discipline. Papers contain 38 boxes of books, serials, notes, and manuscripts related to the history of computing, mathematics, and related fields.
WiFi - Wireless Fidelity

Wireless Fidelity is a brand originally licensed by the Wi-Fi Alliance to describe the underlying technology of wireless local area networks (WLANs). A person with a WiFi-enabled device can connect to the internet or to a local area network through an access point, provided that the device is within range of that wireless access point. A wireless access point (WAP) connects a group of wireless devices to an adjacent wired LAN.

CDMA - Code Division Multiple Access

Code Division Multiple Access is a digital cellular technology that uses spread-spectrum techniques and a special coding scheme (each transmitter is assigned a code) to allow multiple access. Multiple access means that several signals can be multiplexed over the same physical channel.
Each user in a CDMA system uses a different pseudo-random digital sequence to modulate their signal. Because it uses a spread-spectrum technique, the modulated signal has a much higher bandwidth than the data being communicated; a worked example of the synchronous case follows the list below.
Categories of CDMA:
  • Synchronous (orthogonal codes)
  • Asynchronous (pseudorandom codes)
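To make the synchronous case concrete, here is a small Python sketch (the codes and bits are illustrative) in which two users share one channel using orthogonal Walsh codes; each receiver recovers its own bit by correlating the summed signal with its own code:

```python
# Synchronous CDMA with orthogonal codes: two users transmit at once
# over one channel, and correlation separates their signals.
code_a = [+1, +1, +1, +1]          # orthogonal Walsh codes:
code_b = [+1, -1, +1, -1]          # dot(code_a, code_b) == 0

def spread(bit, code):
    """Spread one data bit (+1 or -1) over the chips of a code."""
    return [bit * chip for chip in code]

def despread(signal, code):
    """Correlate with a user's code to recover that user's bit."""
    corr = sum(s * c for s, c in zip(signal, code))
    return +1 if corr > 0 else -1

bit_a, bit_b = +1, -1
# The channel simply adds the two spread signals together.
channel = [a + b for a, b in zip(spread(bit_a, code_a), spread(bit_b, code_b))]

print(despread(channel, code_a))  # -> +1 (user A's bit)
print(despread(channel, code_b))  # -> -1 (user B's bit)
```

Note how the spread signal occupies four chips per data bit, which is exactly the sense in which the transmitted bandwidth is much higher than the data rate.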

Friday, January 24, 2014

Roll Numbers - 2014

SR NO. | ROLL NO. | STUDENT NAME | D.O.B | FATHER NAME
1 | 1327 | AJAY KUMAR YADAV | 10/1/1985 | JAGDEESH PRASAD
2 | 1328 | ASHUTOSH UMRAO | 13/11/1995 | ARVIND UMRAO
3 | 1329 | ABHAY RAWAT | 28/08/1991 | LAXMI SHANKAR
4 | 1330 | ANKIT KUMAR THAUR | 14/01/1995 | MATHUBAR THAKUR
5 | 1331 | SONAM KUMARI | 19/10/1995 | TAKECHANDRA
6 | 1332 | RAJESH KUMAR SHARMA | 1/6/1988 | KAMESWAR SHARMA
7 | 1333 | NEHA KUMARI | 7/10/1995 | MAHENDRA KUMAR
8 | 1334 | ROHIT AHUJA | 4/23/1997 | MAHESH KUMAR AHUJA
9 | 1335 | KM. RAGINI SHARMA | 3/5/1995 | MR. KAMESHWAR SHARMA
10 | 1336 | ANANYA SHUKLA | 1/6/1996 | ALOK SHUKLA
11 | 1337 | RAJNISH KUMAR DIXIT | 11/11/195 | KARUNA SHANKAR DIXIT
12 | 1338 | RASHMI SHARMA | | PRADEEP SHARMA
13 | 1339 | KM ANJALI SHARMA | 8/7/1994 | KAMESWAR SHARMA
14 | 1340 | RAHUL THAKUR | 12/4/1997 | RAMANAND THAKUR
15 | 1341 | POONAM CHAUHAN | 20/08/1997 | KISHAN CHAUHAN
16 | 1342 | SHIVA TIWARI | 18/11/1997 | MOHAN TIWARI
17 | 1343 | ABHIRAJ KUMAR | 14/09/1997 | AWADHESH KUMAR
18 | 1344 | ANUJ KUMAR MATHUR | 15/10/1994 | MORDHWAJ MATHUR
19 | 1345 | RAHUL MAURYA | 21/03/1994 | SANTOSH MAURYA
20 | 1346 | SHIKHA BAJPAI | 27/12/1990 | RAMBABU BAJPAI
21 | 1347 | SHIVBARAN SINGH | 30/05/1980 | RAMSAJEEVAN
22 | 1348 | SANGEETA DEVI | 10/10/1995 | PANNALAL SINGH
23 | 1349 | GAURAV SHUKLA | 1/8/1996 | ANIL KUMAR SHUKLA
24 | 1350 | SHAKTI PRAKASH | 5/5/1993 | SUBHASH CHANDRA SHARMA
25 | 1351 | KAPIL KUMAR SHUKLA | 2/5/1987 | PALESH KUMAR
26 | 1352 | RAJ DWIVEDI | 5/7/1998 | AKHILESH KUMAR DWIVEDI
27 | 1353 | KHUSHBU PAL | 1/7/1998 | RAMKHILAVAN PAL
28 | 1354 | SHRUTI SACHAN | 20/04/1996 | DILIP KUMAR SACHAN
29 | 1355 | RENUKA GUPTA | 17/06/1992 | DEVENDRA GUPTA
30 | 1356 | PRATIBHA SINGH | 16/11/1996 | DINESH SIGH
31 | 1357 | APARNA AGNIHOTRI | 15/06/1996 | UMAKANT AGNIHOTRI
32 | 1358 | SARIKA PRAJAPATI | 22/06/1994 | MATHURA PRASAD PRAJAPATI
33 | 1359 | ARCHANA SHUKLA | 15/01/1993 | LATE ASHOK KUMAR SHUKLA
34 | 1360 | KOMAL NIGAM | 17/07/1994 | DAYA SHANKAR NIGAM
35 | 1361 | YOGITA CHOURASIA | 8/3/1986 | LATE RAMPAL CHAURASIA
36 | 1362 | SUMAN SINGH | 17/07/1996 | BEERENDRA SINGH
37 | 1363 | AMRITA SIGH | 1/1/1993 | KEDAR SINGH
38 | 1364 | VIDYA DHARA | 20/06/1996 | SHIV KUMAR DHARA
39 | 1365 | SHALINI MISHRA | 15/02/19 | ANIL MISHRA
40 | 1366 | ANKITA MISHRA | 3/3/1995 | MR. CHATUR PRASAD MISHRA
41 | 1367 | SHAILESH SINGH | 15/4/1997 | GAJRAJ SINGH
42 | 1368 | SURABHI SINGH | 25/10/1996 | TEKBHADUR SINGH
43 | 1369 | VISWAS KUMAR VERMA | 18/02/1996 | UMESH PRASAD VERMA
44 | 1370 | ADITYA PANDEY | 26/04/1993 | RAM CHAND PANDEY
45 | 1371 | SHITAL GUPTA | 1/1/1993 | VIJAY SHANKER GUPTA
46 | 1372 | PINKY YADAV | 18/10/1997 | SHIV NARAYAN YADAV
47 | 1373 | MANISHA JHA | 18/11/1996 | DILEEP JHA
48 | 1374 | SHUBHI SINGH | 16/06/1995 | GYAN SINGH
49 | 1375 | NILAM DEVI | 30/12/1981 | RAJJAN LAL
50 | 1376 | DIKSHA MISHRA | 15/05/1994 | PRATAP BHAN MISHRA
51 | 1377 | PRIYA PANDEY | 6/7/1993 | RAMLAKHAN PANDEY
52 | 1378 | ARPITA TRIPATHI | 13/07/1992 | ANAND KUMAR TRIPATHI
53 | 1379 | SIVANI BAJPAY | 16/05/1997 | SUSHIL KUMAR BAJPAY
54 | 1380 | SOUMYA SINHA | 19/09/1996 | MAHESH SINHA
55 | 1381 | KIRAN GUPTA | 7/3/1994 | VINOD GUPTA
56 | 1382 | RAJ SHARMA | 15/07/1997 | CHANDRA SHEHAR SHARMA
57 | 1383 | AKASH PANDEY | 5/10/1996 | RAJENDRA PANDEY
58 | 1384 | ANSHU KUMAR | 26/10/1996 | RAMESH CHANDRA
59 | 1385 | KANCHAN SINGH | 28/08/1993 | RAJKUMAR
60 | 1386 | NIHARIKA | 6/9/1994 | DEEPAK CHANDRA
61 | 1387 | SUMIT SHARMA | 5/9/1995 | UMAKANT SHARMA
62 | 1388 | VERSHA PATEL | 1/7/1994 | SRI RAJKUMAR PATEL
63 | 1389 | ROHIT KUMAR SAVITA | 22/05/1996 | RAJ NARAYAN SAVITA
64 | 1390 | ANKIT UTTAM | 19/05/1995 | VIRENDRA KUMAR UTTAM
65 | 1391 | VIJAY CHAK | 3/5/1988 | VEER SINGH CHAK
66 | 1392 | AKANSHA PANDEY | 22/09/1996 | SATYENDRA PANDEY
67 | 1393 | GAURAV KUMAR | 8/12/1995 | RAM PRAKASH
68 | 1394 | SAUMYA TIWARI | 22/05/1992 | SANTU TIWARI
69 | 1395 | SHOBHA VERMA | 10/10/1996 | PRADEEP VERMA
70 | 1396 | KM RINKI GAUTAM | 7/9/1991 | ASHOK KUMAR
71 | 1397 | KM. SUJATA SANKHAWAR | 14/04/1990 | AWADESH UMAR
72 | 1398 | KAJAL DIXIT | 24/03/1999 | ANIL DIXIT
73 | 1399 | JYOTI TIWARI | 5/7/1990 | SURENDRA KUMAR TIWARI
74 | 1400 | PRANSHI | 6/1/1997 | MANOJ KUMAR TRIPATHI
75 | 1401 | RUPALI KUSHWAHA | 4/8/1996 | KRISHNA KUSHWAHA
76 | 1402 | RAJNIKANT YADAV | 7/11/1997 | LATE RAMESHVAR YADAV
77 | 1403 | SHRUTI SINHA | 18/12/1994 | MAHESH CHANDRA SINHA
78 | 1404 | PRAGYA SAXENA | 4/12/1995 | AMIT KUMAR SAXENA
79 | 1405 | SHUBHI SINGH | 23/09/1993 | BRIJESH SRIVASTAV
80 | 1406 | JAYA VISWAKARMA | 15/07/1997 | DURGA PRASAD
81 | 1407 | ASTHA PRAJAPATI | 26/05/1998 | JAI BAHADUR PRAJAPATI
82 | 1408 | SANGEETA DEVI | 6/3/1975 | DUKHAN JHA
83 | 1409 | AKANSHA MISHRA | 16/04/1994 | JAGDEESHWAR PRASAD MISHRA
84 | 1410 | ABHISHEK TIWARI | 25/08/1993 | SRI SANTOSH TIWARI
85 | 1411 | ANAND KUMAR | 16/04/1997 | ROSHAN LAL
86 | 1412 | KM. SHWETA YADAV | 7/6/1993 | VISHNU KUMAR YADAV
87 | 1413 | KM. SHIKHA YADAV | 12/1/1994 | VISHNU KUMAR YADAV
88 | 1414 | KM. SHIVANI PRAJAPATI | 26/07/1998 | VIRENDRA PRATAP
89 | 1415 | ARCHANA GAUTAM | 15/04/1993 | RAJESHSWARI GAUTAM
90 | 1416 | KIRTI TIWARI | 14/04/1993 | KAMLESHWAR NATH TIWARI
91 | 1417 | ALKESH PANDEY | 1/7/1998 | PRADEEP KUMAR PANDEY
91 | 1418 | POOJA JHA | 25/08/1993 | TRILOK NATH JHA
92 | 1418 | PRATIBHA AWASTHI | 8/3/1992 | SURESH AWASTHI
93 | 1419 | AKASH SINGH | 2/1/1997 | PREM SHANKAR
94 | 1420 | SAGAR SINGH | 14/05/1996 | PREM SHANKAR
95 | 1421 | MUKUL SHARMA | 5/11/1998 | ANIL KUMAR SHARMA
96 | 1422 | SIVAM SHARMA | 18/10/1995 | ANIL KUMAR SHARMA
97 | 1427 | RAM PRATAP SINGH | 30/07/1990 | RAJU SINGH
98 | 1424 | SONAM JAIN | 10/12/1994 | NEERAJ JAIN
99 | 1426 | ASWANI YADAV | 22/06/1989 | RAJA RAM YADAV