
American computer pioneer David Andrew Patterson was a professor of computer science at the University of California, Berkeley. He is best known for his contributions to reduced instruction set computers (RISC), redundant arrays of inexpensive disks (RAID), and networks of workstations.

Personal Life and Education

David Patterson was born on 16th November 1947 in Evergreen Park, Illinois, in the United States. He graduated from South High School in Torrance and was the first person in his family to graduate. He earned a Bachelor of Arts in Mathematics in 1969, a Master's in Computer Science in 1970, and a Ph.D. in 1976, all from the University of California, Los Angeles. His thesis was on the verification of microprograms.

David married his high-school sweetheart, Linda, who is the founder of East Bay IMpRoV. They have two sons.

In his spare time, David likes soccer, mountain biking, body surfing, and weight lifting. He used to talk about his hobbies, his work, and his team in his lectures, and he always encouraged students to work in teams.

David was also a top fundraiser from 2006 to 2012. The Turing Award laureate retired from Berkeley in 2016 after 40 years of service, and subsequently joined Google as a Distinguished Engineer.

Research and Investigation

David's main areas of research are:

  • Parallel computing
  • Computer architecture
  • Distributed computing
  • Workload
  • Embedded system

Contributions to the World

  • David coined the term RISC (reduced instruction set computer) and led the Berkeley RISC project.
  • He is famous for his research on redundant arrays of inexpensive disks (RAID) storage.
  • His books on computer architecture are extensively used in computer science education.
  • Article: "The Case for the Reduced Instruction Set Computer"
  • Article: "A Case for Redundant Arrays of Inexpensive Disks (RAID)"
  • Article: "The Design of XPRS"
  • Article: "A Case for Networks of Workstations"

Contributions in the Field of the Internet

  • Computer security
  • Systems design
  • Server
  • Real-time computing
  • Software deployment
  • Elasticity and information technology
  • Storage area network
  • Workstation

David's work also dealt with software engineering, focusing on themes such as benchmarking, data science, and big data where they intersect with software.

In his computer hardware research, David mostly worked with elements such as array data structures, multi-core processors, and caches.

Awards and Honors

David Andrew Patterson received 35 awards for his research, teaching, and service, as well as several fellowships. Some of them are listed below:

  • In 2006 and 2007, David received fellowships from the American Association for the Advancement of Science (AAAS) and the American Academy of Arts and Sciences.
  • In 2006, he became a member of the National Academy of Sciences and of the National Academy of Engineering.

Final Words

The outstanding work of David Patterson is a boon to the modern world. Since 2018, 99% of all new chips have used a RISC architecture. His systematic and quantitative approach to the evaluation and design of computer architectures has had a lasting impact on the microprocessor industry. Today's computer engineers and computer scientists read his books and continue to adopt and develop his ideas.

Bill Gates, the founder of Microsoft Corporation, rightly said when the Turing Award was announced that Patterson's contributions had proved to be a fundamental foundation on which the entire industry flourished.

The post David Patterson Biography appeared first on The Crazy Programmer.




Sentiment analysis is considered one of the most popular strategies businesses use to identify clients' sentiments about their products or services. But what is sentiment analysis?

For starters, sentiment analysis, otherwise known as opinion mining, is the technique of scanning words spoken or written by a person to analyze what emotions or sentiments they’re trying to express. The data gathered from the analysis can help businesses have a better overview and understanding of their customers’ opinions, whether they’re positive, negative, or neutral.

You may use sentiment analysis to scan and analyze direct communications from emails, phone calls, chatbots, verbal conversations, and other communication channels. You can also use this to analyze written comments made by your customers on your blog posts, news articles, social media, online forums, and other online review sites.

Businesses in customer-facing industries (e.g., telecom, retail, finance) are the heaviest users of sentiment analysis. With a sentiment analysis application, one can quickly analyze the general feedback on a product and see whether customers are satisfied.

How does Sentiment Analysis Work?

To perform sentiment analysis, you use artificial intelligence or machine learning techniques, typically through a programming language such as Python, to run natural language processing algorithms, analyze the text, and evaluate the emotional content of the textual data. Python is a general-purpose programming language commonly used for data analysis tasks such as sentiment analysis. It is also gaining popularity because its code segments for analysis are widely considered fast and easy to learn.

Because many businesses nowadays extract their customers' reviews from social media or online review sites, most of the textual data they get is unstructured. So, to gain insight from the data's sentiments, you'll need the Natural Language Toolkit (NLTK) in Python to process and, hopefully, make sense of the textual information you've gathered.

How to Perform Sentiment Analysis in Python  

This blog post will show you a quick rundown on performing sentiment analysis with Python through a short step-by-step guide. 

Sentiment Analysis In Python

Install NLTK and Download Sample Data 

First, install and download the NLTK package in Python, along with the sample data you’ll use to test and train your model. Then, import the module and the sample data from the NLTK package. You can also use your own dataset from any online data for sentiment analysis training. After you’ve installed the NLTK package and the sample data, you can begin analyzing the data.
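
A minimal sketch of this step, assuming NLTK's bundled twitter_samples corpus as the sample data (any labeled dataset works):

```python
# pip install nltk
import nltk

# Sample data plus the resources used by the later steps
nltk.download('twitter_samples')                # labeled sample tweets
nltk.download('punkt')                          # tokenizer models
nltk.download('wordnet')                        # lemmatizer dictionary
nltk.download('averaged_perceptron_tagger')     # part-of-speech tagger
nltk.download('stopwords')                      # common noise words

from nltk.corpus import twitter_samples

positive_tweets = twitter_samples.strings('positive_tweets.json')
negative_tweets = twitter_samples.strings('negative_tweets.json')
print(len(positive_tweets), len(negative_tweets))   # 5000 5000
```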

Tokenize The Data 

As the sample text in its original form cannot be processed by the machine, you need to tokenize the data first to make it easier for the machine to analyze and understand. Tokenizing data (tokenization) means breaking strings (large bodies of text) into smaller parts: lines, hashtags, words, or individual characters. These small parts are called tokens.

To begin tokenizing the data, create a file such as nlp_test.py and import your sample data into it. For the bundled sample corpus, NLTK provides a default tokenizer through the .tokenized() method, so you don't have to split the strings yourself; a short sketch follows.
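
Continuing with the twitter_samples corpus from the previous step (the file names are the corpus's bundled JSON files):

```python
# nlp_test.py
from nltk.corpus import twitter_samples

# .tokenized() returns each tweet already split into a list of tokens
positive_tokens = twitter_samples.tokenized('positive_tweets.json')
negative_tokens = twitter_samples.tokenized('negative_tweets.json')

print(positive_tokens[0])   # a list of tokens such as ['#FollowFriday', ...]
```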

Normalize The Data

Words can be written in various forms. For example, the word 'sleep' can appear as sleeping, sleeps, or slept. Before analyzing the textual data, you must normalize the text and convert each word to its base form: whether the word is sleeping, sleeps, or slept, you convert it to 'sleep.' Without normalization, the unconverted words would be treated as different words, eventually causing misinterpretation during sentiment analysis.
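
One common way to normalize tokens is lemmatization. A sketch using NLTK's WordNet lemmatizer, which needs a part-of-speech hint to reduce verbs like 'sleeping' to 'sleep':

```python
from nltk.tag import pos_tag
from nltk.stem.wordnet import WordNetLemmatizer

def lemmatize_tokens(tokens):
    """Reduce each token to its dictionary form, e.g. 'sleeping' -> 'sleep'."""
    lemmatizer = WordNetLemmatizer()
    lemmas = []
    for word, tag in pos_tag(tokens):
        # Map Penn Treebank tags to the single-letter codes WordNet expects
        if tag.startswith('NN'):
            pos = 'n'       # noun
        elif tag.startswith('VB'):
            pos = 'v'       # verb
        else:
            pos = 'a'       # treat everything else as an adjective
        lemmas.append(lemmatizer.lemmatize(word, pos))
    return lemmas

print(lemmatize_tokens(['He', 'was', 'sleeping']))   # ['He', 'be', 'sleep']
```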

Eliminate The Noise From The Data

Some of you may wonder what counts as noise in textual data. Noise refers to words or any parts of the text that don't add meaning to the whole, such as the stop words 'is', 'a', and 'the.' They're considered irrelevant when analyzing the data.

You can use regular expressions in Python to find and remove noise such as:

  • Hyperlinks 
  • Usernames 
  • Punctuation marks 
  • Special characters 

You can add a remove_noise() function to your nlp_test.py to eliminate the noise from the data, as sketched below. Overall, removing noise from your data is crucial to making sentiment analysis more effective and accurate.
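
A minimal sketch of such a remove_noise() function, combining the regex cleanup with the lemmatization above (the exact patterns are illustrative, not the only reasonable ones):

```python
import re
import string

from nltk.corpus import stopwords
from nltk.stem.wordnet import WordNetLemmatizer
from nltk.tag import pos_tag

def remove_noise(tokens, stop_words=()):
    """Drop hyperlinks, @usernames, punctuation, and stop words; lemmatize the rest."""
    lemmatizer = WordNetLemmatizer()
    cleaned = []
    for word, tag in pos_tag(tokens):
        word = re.sub(r'https?://\S+', '', word)      # hyperlinks
        word = re.sub(r'@[A-Za-z0-9_]+', '', word)    # usernames
        pos = 'n' if tag.startswith('NN') else 'v' if tag.startswith('VB') else 'a'
        word = lemmatizer.lemmatize(word, pos)
        if word and word not in string.punctuation and word.lower() not in stop_words:
            cleaned.append(word.lower())
    return cleaned

stop_words = stopwords.words('english')
positive_cleaned_tokens = [remove_noise(t, stop_words) for t in positive_tokens]
negative_cleaned_tokens = [remove_noise(t, stop_words) for t in negative_tokens]
```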

Determine The Word Density

To determine the word density, you'll need to analyze how frequently words are used. To do this, add a get_all_words() function to your nlp_test.py file.

This code will compile all the words from your sample text. Next, to determine which words are most commonly used, you can use NLTK's FreqDist class with the .most_common() method. This extracts a list of the words most commonly used in the text; you'll then prepare and use this data for the sentiment analysis, as sketched below.
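
A sketch of get_all_words() together with FreqDist, assuming the positive_cleaned_tokens list from the noise-removal step:

```python
from nltk import FreqDist

def get_all_words(cleaned_tokens_list):
    """Yield every word from every tokenized sample as one flat stream."""
    for tokens in cleaned_tokens_list:
        for token in tokens:
            yield token

all_pos_words = get_all_words(positive_cleaned_tokens)
freq_dist_pos = FreqDist(all_pos_words)
print(freq_dist_pos.most_common(10))   # the ten most frequent words with counts
```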

Use Data For Sentiment Analysis

Now that your data is tokenized, normalized, and free from noise, you can use it for sentiment analysis. First, convert the tokens into a dictionary form. Then, split your data into two sets: the first set is used to build the model, and the second tests the model's performance. By default, the data after splitting would contain all the positive and negative samples in sequence; to prevent bias, shuffle the data randomly (for instance with random.shuffle()) before splitting, as sketched below.
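
A sketch of the dictionary conversion, shuffling, and splitting described above (the 7,000/3,000 split assumes the 10,000-tweet sample corpus):

```python
import random

def tokens_to_dicts(cleaned_tokens_list):
    """NLTK classifiers expect each sample as a {token: True} feature dict."""
    for tokens in cleaned_tokens_list:
        yield dict([token, True] for token in tokens)

positive_data = [(d, 'Positive') for d in tokens_to_dicts(positive_cleaned_tokens)]
negative_data = [(d, 'Negative') for d in tokens_to_dicts(negative_cleaned_tokens)]

dataset = positive_data + negative_data
random.shuffle(dataset)        # remove the positive-then-negative ordering bias

train_data = dataset[:7000]    # used to build the model
test_data = dataset[7000:]     # held out to test the model's performance
```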

Build and Test Your Sentiment Analysis Model

Lastly, use the NaiveBayesClassifier class to create your analysis model. Use the .train() method for the training and the .accuracy() function for testing against the held-out data. At this point, you can retrieve the most informative features: a list of words along with the sentiment they signal. For example, words like 'glad,' 'thanks,' or 'welcome' are associated with positive sentiment, while words like 'sad' and 'bad' are analyzed as negative.
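
A sketch of the final step, training and testing with NLTK's NaiveBayesClassifier and then classifying a new sentence through the same cleaning pipeline:

```python
from nltk import NaiveBayesClassifier, classify
from nltk.tokenize import word_tokenize

classifier = NaiveBayesClassifier.train(train_data)
print('Accuracy:', classify.accuracy(classifier, test_data))
classifier.show_most_informative_features(10)   # words most tied to each label

# Classify a new piece of text
tokens = remove_noise(word_tokenize("Thanks, I'm glad you liked it!"), stop_words)
print(classifier.classify(dict([t, True] for t in tokens)))   # e.g. 'Positive'
```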

The Bottom Line

The point of this quick guide is only to introduce you to the basic steps of performing sentiment analysis in Python. Use this brief tutorial to help you analyze textual data from your business's online reviews or comments through sentiment analysis.

The post Sentiment Analysis in Python – A Quick Guide appeared first on The Crazy Programmer.




John Carmack is an American programmer and video game developer. He is the co-founder of id Software and was the lead programmer of the popular games Commander Keen, Wolfenstein 3D, Doom, and Quake. His company pioneered shareware and Internet distribution channels that revolutionized how computer games were sold. John Carmack is credited with making 51 games.

On August 19, 2022, Carmack announced that he had raised $20M for his new AGI company, Keen Technologies.

John Carmack Biography

Personal Life & Education

John was born on August 20, 1970, in Shawnee Mission, Kansas, United States. His father is Stan Carmack and his mother is Inga Carmack.

He received his education at the University of Missouri-Kansas City Volker Campus, Raytown South High School, and Shawnee Mission East High School. He has also studied judo and jujitsu.

John married Katherine Anna Kang in 2000. The couple ended their relationship, announcing their divorce on Twitter on May 27, 2022.

Journey of the Game Era

Carmack attended a few semesters of computer science classes at the University of Missouri but dropped out of college to pursue contract programming jobs. He accepted a job at Softdisk, a software publishing firm in Louisiana, where he joined John Romero, Adrian Carmack, and Tom Hall. Together they made the first Commander Keen game, which was released as shareware in 1990. Following the game's success, they left Softdisk in 1991 and founded id Software.

The group released Wolfenstein 3D in 1992 and then Doom in 1993. This was a turning point in the history of computer gaming. In 1996, they released Quake with further advancements. Quake offered multiplayer gaming on the internet. This feature gave immense popularity to online gaming.

Carmack continued making sequels to Doom and Quake, including Doom II: Hell on Earth in 1994, Final Doom in 1996, Quake II in 1997, Quake III: Arena in 1999, Doom 3 in 2004, and Quake 4 in 2005. Carmack left id Software in 2013 and became Chief Technology Officer at Oculus, a virtual reality company.

To program parts of the video game Quake, Carmack used QuakeC, a compiled language he himself developed at id Software in 1996.

Awards and Recognitions

Date | Award or Recognition
1996 | Named one of the most influential people in computer gaming of the year
1997 | Named one of the most influential people of all time
1999 | Listed among the 50 most influential people in technology
2001 | Award for community contribution for the Quake 3 engine
2001 | Inducted into the Academy of Interactive Arts and Sciences Hall of Fame
2002 | Named to the MIT Technology Review TR100
2003 | Subject of the book Masters of Doom
2007 | Two Emmy Awards
2008 | Honored at the 59th Annual Technology & Engineering Emmy Awards
2008 | Won the $350,000 Level One prize in the X-Prize Lunar Lander Challenge
2010 | Game Developers Conference Lifetime Achievement Award
2016 | BAFTA Fellowship Award
2017 | Honorary doctorate

Carmack developed a unique speech impediment. In addition to his work as a game designer, John is also the founder and lead engineer of Armadillo Aerospace, which builds crewed suborbital spacecraft.

Online gaming has encouraged the growth of the 3D rendering sector of the computer hardware market. The Quake engine has been licensed for use in numerous other games; one example is the very successful Half-Life.

The post John Carmack Biography appeared first on The Crazy Programmer.




Donald brought a transformation to the field of computing. He created the WEB and CWEB computer programming systems, and he is rightly known as the Father of the Analysis of Algorithms.

An American mathematician and computer scientist, Donald Ervin Knuth is a professor at Stanford University. He is known for his book The Art of Computer Programming, as well as TeX, METAFONT, Computer Modern, MMIX, the LR parser, literate programming, and much more. Donald is an author and a scholar, too.

Donald's motto in life is to organize and summarize what is known about computer methods, to give it a mathematical and historical foundation, and to show that the connection between computers and mathematics is far deeper and more intimate than these traditional relationships would imply.

Knuth also has a Chinese name, Gao Dena, given to him in 1977 by Frances Yao just before his first visit to China. Knuth was very happy to receive this name and said he felt close to the Chinese people even though he could not speak their language.

Donald Knuth Biography

Birth and Education

Donald Knuth was born on January 10th, 1938, in Milwaukee, Wisconsin, in the United States. As a child, Donald was not interested in mathematics; he showed enthusiasm for a career in music instead. Donald graduated from the Case Institute of Technology in the United States in 1960. In 1963, he completed his Ph.D. in mathematics at the California Institute of Technology.

Married Life

When he was a graduate student, Donald married Nancy Jill Carter on June 24, 1961. She published a book on liturgy titled ‘Banner without Words’ in 1986. Jennifer Sierra Knuth and John Martin Knuth are their two children.

Books

Knuth is famous for his multivolume series of books, The Art of Computer Programming (TAOCP). Still, no other book in computer science compares with this encyclopedic work.

His 1963 doctoral thesis was Finite Semifields and Projective Planes. Donald has written numerous books and articles in the field of computing.

Contributions

  • At the age of 22, in 1960, Donald wrote ALGOL compilers.
  • He invented the TeX typesetting system in 1977. It is used to produce high-quality technical books and papers, and it formats complex mathematical expressions.
  • He designed the SOL simulation language.
  • At the age of 30, Donald published the first volume of his book, The Art of Computer Programming.
  • He published the important paper "An Empirical Study of FORTRAN Programs" in 1971.
  • In 1974, Donald published his article "Structured Programming with go to Statements".
  • He made crucial contributions to the methodology of program correctness proofs.
  • He introduced and refined the LR parsing algorithm.

Retirement & Health Issues

Donald Knuth is retired from Stanford University, but he is still very active and working on the next volume of his book. He has kept a close eye on the evolution of computer science.

Donald was diagnosed with prostate cancer and underwent surgery in 2006. He reported this in his autobiography, noting "a little bit of radiation therapy... as a precaution but the prognosis looks pretty good".

Influenced by

Donald Knuth drew inspiration from a 300-year-old algorithm by the Swiss mathematician Leonhard Euler, who famously sought a route through the Prussian city of Konigsberg that would cross each of its seven bridges exactly once. In the first volume of his treatise, Knuth addressed Euler's classic problem. Knuth once applied Euler's method to a computer-controlled sewing machine. In 1962, Donald also computed 1,271 digits of Euler's constant using Euler-Maclaurin summation.

Awards and Recognitions

  • Grace Murray Hopper Award in 1971.
  • The prestigious Turing Award in 1974, for his outstanding contributions to the analysis of algorithms and the design of programming languages.
  • National Medal of Science in 1979.
  • SIGCSE Award for Outstanding Contribution to Computer Science Education in 1986.
  • John von Neumann Medal and the Harvey Prize in 1995.
  • Kyoto Prize in 1996.
  • BBVA Foundation Frontiers of Knowledge Award in 2010.
  • Faraday Medal in 2011.
  • Turing Lecture in 2011.
  • Flajolet Lecture Prize in 2014.

Knuth also encapsulated the idea of literate programming in his WEB system. Donald Knuth is one of the most distinguished computer scientists of the 20th century. Throughout his career, he has played a major role in advancing computer software and programming languages, and he designed a number of computing interfaces and early typefaces. He worked primarily in computer science and software development. Donald Knuth's life is a good example of what a single talented individual can accomplish.

The post Donald Knuth Biography appeared first on The Crazy Programmer.




Alan brought a revolution to the field of computers. He was one of the first scientists to recognize that a computer could represent objects as pictures. He is well known as the Father of Personal Computers.

Alan Kay is an American computer scientist. He is a Fellow of the National Academy of Engineering, the American Academy of Arts and Sciences, and the Royal Society of Arts. He is popularly known for his outstanding work on object-oriented programming and windowing graphical user interface design. Alan worked for Xerox PARC and invented the concept of the laptop computer, which he named the Dynabook. He is also famous as one of the first researchers of mobile learning.

Alan Kay Biography

Birth and Childhood

Alan Kay was born on May 17, 1940, in Springfield, Massachusetts, USA. His father, Hector W. Kay, designed arm and leg prostheses, and his mother, Katherine Kay, was a musician. Alan was brought up in an environment of arts and science, and his childhood was a blessed one: he had read 150 books by the time he reached the first grade and read very fluently.

Education

Alan completed his schooling at Brooklyn Technical High School in New York. He earned a bachelor's degree in mathematics and molecular biology from the University of Colorado in 1966, a master's in electrical engineering in 1968, and a Ph.D. in computer science in 1969, the latter two from the University of Utah. His thesis described his invention of FLEX, a computer language.

Music Love

Alan Kay is a former professional jazz guitarist, composer, and theatrical designer, and an amateur pipe organist. He taught guitar in Denver, Colorado, for a year. His musical career ended after he became busy with his research work.

Married Life

Alan Kay married Bonnie MacBird in 1983. She is a writer, actress, playwright, and producer of screen and stage. They have four children.

Career

  • In 1968, Alan learned Logo, a programming language, and worked on its enhancement and modification for educational purposes.
  • In 1969, he joined the Stanford Artificial Intelligence Laboratory as a visiting researcher.
  • In 1970, he joined Xerox PARC, where he and the research staff developed Smalltalk. There he formulated the ideas behind laptops, tablets, and e-books.
  • From 1981 to 1984, Kay was Chief Scientist at Atari Corporation, a console game company.
  • In 1984, he joined Apple and, for his extraordinary contributions, became an Apple Fellow; the Fellows group was closed in 1997.
  • In 1997, a friend recruited him as head of research and development at Walt Disney Imagineering, where he became a Disney Fellow.
  • When Disney halted its Fellows program, Alan joined his friend Bran Ferren, who had started Applied Minds Inc. with Danny Hillis.
  • In 2001, Alan founded a non-profit organization, the Viewpoints Research Institute, in California, dedicated to children, learning, and advanced software development to improve "powerful ideas education".
  • Kay and his Viewpoints group worked on various projects at Applied Minds in Glendale, California.
  • Alan Kay was a senior fellow at Hewlett-Packard until July 2005.

"The Computer Revolution Has Not Happened Yet!"

In his lectures on reinventing programming, Alan Kay says that the computer revolution is still very new and that all the good ideas have not yet been implemented universally. This view is informed by his experiences with Sketchpad, Simula, Smalltalk, and his own commercial software code.

Awards and Recognitions

Awards and honors flooded in the name of Alan Kay. Some of them are as under:

  • In 2001, the UdK 01-Award in Berlin, Germany, for pioneering the graphical user interface.
  • In 2002, the Telluride Tech Festival Award of Technology, Colorado.
  • In 2003, the ACM Turing Award.
  • In 2004, the Kyoto Prize.
  • In 2012, the UPE Abacus Award.

Alan's contributions to small computing systems have helped fulfill the dream of One Laptop per Child, a boon to children and modern education. He is rightly called the architect of the modern overlapping-window graphical user interface.

The post Alan Kay Biography appeared first on The Crazy Programmer.




John Backus was a mathematician known for inventing FORTRAN and for the BNF notation used to describe the syntax of programming languages. John developed the Formula Translator, or FORTRAN, the first high-level programming language. It became a widely used language that made computers accessible and practical for developers and scientists without any need for in-depth knowledge of the machinery.

Restless as a young man, John Backus found his place in mathematics; he earned a B.S. in 1949 and an M.A. in 1950 from Columbia University in New York City. He then joined International Business Machines (IBM) in 1950.

Drained by laborious hand-coding, he was permitted to assemble a team at IBM to work on improving efficiency. His team developed the programming language FORTRAN for numerical analysis. FORTRAN produced programs nearly as good as those written by professional developers.

John Backus Biography

Personal Life

John Backus was born on 3rd December 1924 in Philadelphia and lived in Delaware for a long time. His parents were Elizabeth Warner Edsall and Cecil Franklin Backus. Backus was married twice, first to Marjorie Jamison, whom he divorced in 1966, and later to Barbara Una in 1968. John had two children, Paula and Karen. Barbara died in 2004, whereupon John moved to Ashland, Oregon, to be near Paula. John Backus died on 17th March 2007 in Ashland.

Education

His family sent him to the Hill School in Pennsylvania, though he wasn't very studious and got poor grades. Nevertheless, he graduated in 1942 and attended the University of Virginia. During his freshman year, he was expelled for poor attendance and joined the US Army.

While in the US Army, Backus did well on medical aptitude tests and was sent to Haverford College to study medicine. In the meantime, he was treated for a brain tumor. He continued at the Flower and Fifth Avenue Medical School in New York, but dropped out after nine months. Unsure about his career, he rented an apartment in New York City and built a hi-fi set. He then enrolled at a radio school, where he discovered his skill in mathematics.

Soon, Backus gained admission to Columbia University and completed his studies in mathematics. He received his degree in 1949, and in 1950 IBM hired John to work with the team on the Selective Sequence Electronic Calculator (SSEC), the first electronic computer made by the company. He worked on this project for almost three years. Part of his work was attending to the machine and fixing it when it stopped running. Programming the machine was tough, since there was no organized system for it. Backus invented a program he named Speedcoding to streamline the process. The program included a "scaling factor" that allowed numbers of various sizes to be stored and manipulated easily.

Working at IBM

After his graduation, Backus joined IBM, staking out territory in the then-emerging field of computer science. Backus did not know much about computers (few people did), yet he soon found himself at the cutting edge. In 1952 he led the group of researchers that produced the Speedcoding system for the IBM 701 computer, and a year later he wrote what would prove to be a historic memo. In it, he outlined to Cuthbert Hurd the need for a general-purpose, high-level programming language. It was the origin of FORTRAN.

In 1953, IBM approved John's outline for a software language for the 704. Together with a team of professional developers, mathematicians, and programmers, he designed the language and its translator.

John felt a translator would be important in making the machine faster and easier to work with: it would eliminate the time-consuming and tedious hand-coding that was then typical of programming. Fundamentally, the translator could convert instructions written in a language made for users into the binary language the computer could understand.

Career

FORTRAN was a prototype for modern compilers: programs that translate a high-level language into a form the computer hardware can read. Before FORTRAN, programmers were forced to endure logging rows of ones and zeroes, the binary language of computers.

FORTRAN gave programmers greater creativity and freedom and enabled wide-ranging developments, not least because, before its arrival, around three-quarters of the cost of running a computer came from programming and debugging. IBM published the first FORTRAN specification in 1954, and in the following years Backus, with I. Ziller and R. A. Nelson, worked the bugs out of the first version of the language.

In 1954, John and his teammates published the paper "Preliminary Report, Specifications for the IBM Mathematical FORmula TRANslating System, FORTRAN." Completing the language's compiler, however, took two more long years; it was then provided with every IBM 704 installation. More than 50 years later, Fortran is still widely used. With time and user feedback, the system became quite efficient, and even the early "bugs" were removed.

After his work on the FORTRAN system, John moved on to the Backus-Naur Form (BNF), a standard notation for describing the grammar of high-level languages. It is still used for various programming languages at present. He continued working on language simplification, primarily at IBM's Almaden Research Center and San Jose Research Laboratory. In 1963 he was named one of the first IBM Fellows.

Founding of FORTRAN

When John and his small team of IBM colleagues started their hunt in 1954 for a programming system that would enable computers to produce their own machine-language programs, they were not always sure what they would come up with. As Backus remembered in 1967, when they started to solve one problem, it split into others they had not foreseen. In 1955, they thought they had less than a year of work left, but they finally finished in 1957.

What John and his co-workers had made was FORTRAN, known as the daddy of programming systems. Many people think FORTRAN's key contribution was enabling the programmer to write programs as algebraic formulas rather than machine language. However, it isn't: what FORTRAN really did was mechanize the organization of loops. The loop, heavily used in scientific work and in computing payrolls, is a series of instructions repeated many times until a specific result is reached.

FORTRAN did increase programmer creativity and productivity. What had earlier taken over 1,000 machine instructions could now be written in just 47 statements. As intended, many engineers and scientists learned to do their own programming. The language was slow to catch on initially, however; users simply found it tough to believe that a machine could write programs as efficient as their own.

By 1958, over half of the machine instructions for the 704 were being generated by FORTRAN, and it was soon used on various other machines too. In a way, FORTRAN was a big boon to competitors: with programs tied to machine language, IBM users were not about to reprogram another computer, but if a competitor could come up with a program that translated FORTRAN into the language of his own machine, he had a selling point.

Awards

This achievement won John Backus the W. W. McDowell Award in 1967, given by the Institute of Electrical and Electronics Engineers for outstanding contributions to the computer field.

He received the National Medal of Science in 1975 for pioneering contributions to computer programming languages.

In 1977, John was awarded the prestigious Turing Award. He was also the 1993 recipient of the Charles Stark Draper Prize. In 1991, he retired to Ashland, Oregon, where he died on 17th March 2007.

The post John Backus Biography appeared first on The Crazy Programmer.




The degree of a relationship is the number of entities that participate in the relationship in a relational model. Based on the number of participating entities, we have the following degrees of relationship.

Unary Relationship

In this, only a single entity participates in the relationship. It is mostly used to represent a recursive relationship, in which an entity relates to instances of itself. For example, many employees work in a company, and a manager appointed to manage them is also counted as an employee: the manager is an employee first and a manager second. We can represent this relationship like this:

Unary Relationship in DBMS
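
As a concrete sketch (the table and column names are illustrative), a unary relationship maps to a self-referencing foreign key; here in Python with the built-in sqlite3 module:

```python
import sqlite3

conn = sqlite3.connect(':memory:')
# Unary (recursive) relationship: 'manages' relates EMPLOYEE to itself,
# so manager_id is a foreign key back into the same table.
conn.execute("""
    CREATE TABLE employee (
        emp_id     INTEGER PRIMARY KEY,
        name       TEXT NOT NULL,
        manager_id INTEGER REFERENCES employee(emp_id)
    )
""")
conn.execute("INSERT INTO employee VALUES (1, 'Asha', NULL)")   # the manager
conn.execute("INSERT INTO employee VALUES (2, 'Ravi', 1)")      # managed by Asha

# Join the table to itself to list each employee with their manager
for row in conn.execute("""
        SELECT e.name, m.name FROM employee e
        LEFT JOIN employee m ON e.manager_id = m.emp_id"""):
    print(row)   # ('Asha', None), then ('Ravi', 'Asha')
```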

Binary Relationship

In this, two entities participate in the relationship. This is the most commonly used degree, as it is easy to describe and keeps the model simple.

Binary Relationship in DBMS

Here, in the above relation, two entities participate in a single relationship, so the degree of the relationship is 2; hence it is a binary relationship.

Ternary Relationship

In this, three entities are associated with a relationship.

Ternary Relationship in DBMS

Here, in the above relation, an employee works in an office, but the employee must also work in a specific department. So the employee works in a department in an office, and all three entities participate in one relationship, as sketched below.
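
A hypothetical sketch of how a ternary relationship becomes a table with three foreign keys, again using sqlite3 (names are illustrative):

```python
import sqlite3

conn = sqlite3.connect(':memory:')
# The ternary relationship 'works_in' links one EMPLOYEE, one DEPARTMENT,
# and one OFFICE per row, so all three foreign keys live in one table.
conn.executescript("""
    CREATE TABLE employee   (emp_id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE department (dept_id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE office     (office_id INTEGER PRIMARY KEY, city TEXT);
    CREATE TABLE works_in (
        emp_id    INTEGER REFERENCES employee(emp_id),
        dept_id   INTEGER REFERENCES department(dept_id),
        office_id INTEGER REFERENCES office(office_id),
        PRIMARY KEY (emp_id, dept_id, office_id)    -- one fact per triple
    );
""")
```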

N-ary Relationship

In this, more than three entities are associated with a relationship.

N-ary Relationship in DBMS

Here, in the above relation, a college has multiple associated entities like staff, departments, students, etc. An N-ary relationship may have N entities associated in a single relationship.

  • The unary relationship is used very rarely.
  • The binary relationship is used frequently and is the most common.
  • Ternary and N-ary relationships are counted as higher-degree relationships and are used least. Most of the time, a higher-degree relationship is decomposed into binary relationships to make the model easier and simpler.

The post Degree of Relationship in DBMS appeared first on The Crazy Programmer.




A database is a very complex mechanism for handling and controlling a large amount of data. As we all know, the overall design of a database is known as the database schema, which is further divided into three categories, as given below:

  1. Physical schema
  2. Logical schema
  3. View schema

The database or relational schema looks quite simple and effective from the top, but as we go toward the bottom, we find a massive collection of data. A database has to face many problems, which we call anomalies. An anomaly is an unwanted situation that may impact the integrity or consistency of the database.

The anomalies a database may face are listed below:

  1. Redundancy
  2. Update Anomalies
  3. Insertion Anomalies
  4. Deletion Anomalies

Consider the following Student table, which we will use to illustrate each anomaly:

SID | Name (Not Null) | Subject (Not Null) | Mobile
1 | Raj | English | 65468154
2 | Jyoti | Home Science | 87668545
3 | Vikash | Maths | 26865948
1 | Raj | Maths | Null
3 | Vikash | Science | Null

Redundancy

Storing duplicate data in a database is called redundancy. As we can see in the above table, the students' names and subject names are repeated; this is redundancy. It wastes a lot of memory space, and redundancy also gives rise to the other three anomalies.

Update Anomalies

These are anomalies generated while updating a database. If a record has multiple copies and we update only some of them, leaving the rest with old values, then a search for that record may return misleading information, creating inconsistency. In the table above, the student "Raj" appears twice, so if we update only the second row with a new mobile number, Raj ends up with two different mobile numbers, and it is unclear which one is correct.

Insertion Anomalies

These are anomalies generated while inserting data into a database. In the table above, if a new student enrolls in the college but has not selected any subject yet, then we can't insert that student's record, because we can't leave the Subject column blank when NULL is not allowed. The student is enrolled, but no data about them can be stored.

Deletion Anomalies

These are anomalies generated while deleting data from a database. In the table above, if we delete the subject "Home Science" because it is no longer offered, then the record of the student "Jyoti" is also deleted, even though Jyoti is still an active student of the college.
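
As a hypothetical sketch of the usual remedy, the table can be decomposed so student facts are stored once and enrollments separately; then none of the four anomalies can occur (Python with sqlite3, names illustrative):

```python
import sqlite3

conn = sqlite3.connect(':memory:')
conn.executescript("""
    -- Each student stored exactly once: no redundancy or update anomaly
    CREATE TABLE student (
        sid    INTEGER PRIMARY KEY,
        name   TEXT NOT NULL,
        mobile TEXT
    );
    -- Enrollments kept separately: a student can exist with no subject yet
    -- (no insertion anomaly), and deleting a subject's rows does not
    -- delete the student (no deletion anomaly)
    CREATE TABLE enrollment (
        sid     INTEGER REFERENCES student(sid),
        subject TEXT NOT NULL,
        PRIMARY KEY (sid, subject)
    );
""")
conn.execute("INSERT INTO student VALUES (2, 'Jyoti', '87668545')")
conn.execute("INSERT INTO enrollment VALUES (2, 'Home Science')")
conn.execute("DELETE FROM enrollment WHERE subject = 'Home Science'")
# Jyoti's record survives the subject's removal:
print(conn.execute("SELECT * FROM student").fetchall())   # [(2, 'Jyoti', '87668545')]
```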

The post Anomalies in DBMS with Example appeared first on The Crazy Programmer.




A fourth-generation language (4GL) is a high-level programming language used by database users to access a database.

It is also called a non-procedural language because, unlike earlier procedural languages, it does not require a fixed procedure or sequence of steps for execution. Instead, it allows users to simply pass commands in plain English-like text with a simple syntax that any user can easily understand, for example: create table, select data, insert data, etc.

SQL, Informix 4GL, and Oracle are examples of 4GLs.
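
A small illustrative sketch of this "what, not how" style, using SQL through Python's built-in sqlite3 module (the table and data are made up for the example):

```python
import sqlite3

conn = sqlite3.connect(':memory:')
conn.execute("CREATE TABLE student (sid INTEGER PRIMARY KEY, name TEXT, marks INTEGER)")
conn.executemany("INSERT INTO student VALUES (?, ?, ?)",
                 [(1, 'Raj', 78), (2, 'Jyoti', 91), (3, 'Vikash', 64)])

# One declarative 4GL statement states WHAT we want (students scoring above 70),
# not HOW to loop over rows, compare values, and collect results.
for row in conn.execute("SELECT name FROM student WHERE marks > 70 ORDER BY marks DESC"):
    print(row[0])   # Jyoti, then Raj
```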

Advantages of 4GL

  1. Nowadays databases are used everywhere to manage data, so 4GLs make it very easy to create, manage, and operate databases.
  2. A single-line command can perform a task for which other languages need a series of commands (sometimes a huge segment), with syntax and keywords that are not easily understandable.
  3. This type of language focuses only on "what is required".
  4. Users need not worry about or define "how it needs to be performed".
  5. It is very easy and simple to use, even for beginners or end users.
  6. It reduces overall cost, time, and effort.

Disadvantages of 4GL

  • This language is database-oriented, which means we can use it only for databases.
  • It is easy for users, but in the backend each query executes a sequence of commands, which makes execution slower (though the delay and its effect are rarely noticeable).

Main Components of 4GL

  1. Database Language and Queries
  2. Report Generators
  3. Analysis and reporting
  4. GUI creators
  5. Mathematical optimization
  6. Spreadsheets
  7. To create an interface application for end users

The post Fourth Generation Language (4GL) – Advantages & Disadvantages appeared first on The Crazy Programmer.




Grace Hopper brought a revolution to the field of computers. She was one of the first to explore computer programming and is rightly known as the Queen of Computer Code. Her motto in life was "Dare and Do".

Grace Brewster Murray Hopper was a computer scientist, mathematician, and rear admiral in the U.S. Navy.

Grace Hopper Biography

Birth and Education

Born on December 9, 1906, in New York City, USA, Grace was brought up with great care by her parents. Her father, Walter Fletcher Murray, and mother, Mary Campbell Van Horne, provided all sorts of books to develop her curiosity. They always motivated her to be self-reliant and hard-working in pursuing her dreams. Grace was their eldest child.

Grace completed her schooling in New York and New Jersey. She graduated in mathematics from Vassar College, and earned her master's (1930) and Ph.D. (1934) in mathematics from Yale University.

Married Life

Grace Murray married Vincent Hopper, a professor at New York University, in 1930. The couple separated in 1945. They had no children.

Career

Grace Hopper’s success journey began when she joined as a Mathematics Assistant at Vassar College in New York. 

Second World War

Grace tried to join the military during the Second World War in 1941 but was initially refused because of her low weight-to-height ratio. However, she was a good mathematician, and mathematicians were badly needed, so she was eventually selected.

Later, in 1943, Grace joined the Naval Reserve for training and became a battalion commander. In 1944, Lieutenant Grace Hopper joined the team working on the Harvard Mark I computer, which was approximately 51 feet long and built by IBM. She used to sleep beside the machine so she could repair it if anything went wrong. Hopper learned a lot on this team.

After the War

In 1949, Grace became a senior mathematician at the Eckert-Mauchly Computer Corporation, the company founded by the builders of the ENIAC electronic computer. There she started working toward computers that could do their own programming.

Invention

In those days, computers used machine-coded language, which the general public could not use. So Grace started working toward computers that would accept human-friendly languages and translate them into machine code. Her team succeeded in inventing a TRANSLATOR, now known as a COMPILER. The credit for inventing FLOW-MATIC and the A-0 system goes to Grace.

For many years, many young computer scientists joined her team and learned a lot from her.

Books

Murray published books including A History of Programming Languages, The Education of a Computer (1984), and Computers and People: A Reflection (1991).

Retirement

At the age of 79, "Amazing Grace" retired from the Navy. But she never fully retired: she continued working as an industry consultant.

Death

On January 1, 1992, at the age of 85, Grace Hopper left this world in Arlington, Virginia. Her body is interred at Arlington National Cemetery.

Influenced by

In her life, Grace Hopper was influenced by the American physicists Howard Hathaway Aiken and John Mauchly, and by Richard Courant, a German-American mathematician.

Awards and Recognitions

During her lifetime and also after her, awards and recognitions flooded in the name of Grace. She received 40 honorary degrees, 9 military awards, and 26 other awards and recognitions. Also, many colleges, parks, streets, buildings, awards, bridges, meeting halls, a supercomputer, missiles, etc are named in her honor.

Even a minor planet discovered by Eleanor Helin is named ‘5773 HOPPER’  in her honor. 

In 2020, Google also did not stay back in honoring Grace. It named an undersea network cable ‘GRACE HOPPER.’

Famous Quotes

Grace Murray Hopper’s famous quotes which still inspire the world are as under:

  1. "A ship in port is safe, but that's not what ships are built for."
  2. "It is often easier to ask for forgiveness than to ask for permission."
  3. "You manage things; you lead people."
  4. "One accurate measurement is worth a thousand expert opinions."
  5. "I have always been more interested in the future than in the past."
  6. "Humans are allergic to change."

Her most disliked phrase was "We've always done it that way."

Grace Hopper's inventions gained international attention. She once told her biographer that teaching was her greatest joy.

The life of Grace Hopper is one of the most inspiring. She worked like a hero, a noble scientist, a magician, and a revolutionary. Her life has a positive and daring impact on every woman. Truly, Grace was a boon to the universe.

The post Grace Hopper Biography appeared first on The Crazy Programmer.




Although these fields initially appear similar, their differences quickly become apparent on closer study. The most important question is: which career is better in the long run? Both software engineers and computer scientists are concerned with computer programs, software development, and various related fields. Professionals in this space usually specialize in one of two recognized disciplines: computer science or software engineering (often called software development). The two are related, but they are not synonymous.

What is Software Engineering?

Software is more than just program code; it is executable code that serves several computational purposes. Software is understood as a collection of executable programming code, related libraries, and documentation. Software engineering is the engineering branch concerned with developing software products using well-defined scientific principles, methods, and procedures. The final result of software engineering is an effective and reliable software product.


Key Skills to Be a Software Engineer

Developing a career as a software engineer requires a minimum of formal education, usually a bachelor's degree, most commonly in software engineering, computer science, or mathematics. Upon entering the world of advanced software engineering, you have several career paths to choose from, the most popular of which are:

  • Blockchain Engineer
  • Security Engineer
  • Embedded Systems Engineer
  • Data Engineer
  • Backend Engineer

What is Computer Science?

Computer science is the study of computers and computational systems. Unlike electrical and computer engineers, computer scientists deal mainly with software and software systems: their theory, design, development, and application. The main areas of study in computer science include artificial intelligence, computer and network systems, security, database systems, human-computer interaction, vision and graphics, numerical analysis, programming languages, software engineering, bioinformatics, and the theory of computing. The most essential element of computer science is problem-solving, a critical life skill.


Difference between Software Engineering and Computer Science


A Subdivision of Mathematics

Both software engineering and computer science are concerned with computer programs, software development, and various related fields. The primary distinction is that computer science originally emerged as a subdivision of mathematics. It deals with the basic structure of a computer and leans toward theory, so it is much more flexible in terms of specialization, with an emphasis on mathematics and science. Software engineering, by contrast, is the discipline of applying engineering methods to the creation, maintenance, and design of software for many different purposes. A software engineer designs customized programs according to the needs of an organization.

Technology, Design, and Development


Software engineering also deals with the interaction of software with hardware. For example, a software engineer may choose an approach to building software that is well suited to the computer's hardware. The software engineer's focus is on delivering useful software: its development, maintenance, validation, and testing.

In addition, a software engineer works from precise application requirements when constructing a software design, while a computer scientist works with computer languages and mathematical models to decide how the application should be designed.

Programming – Theory vs Practical

Another difference between computer science and software engineering lies in how each approaches programming and software development. A computer scientist works out exact computational approaches for application programs and finds the algorithms that allow engineers and designers to construct software that meets product needs. Software engineers, in turn, use the computer scientists' analyses, assessments, and descriptions as resources to develop and create new software frameworks and applications.

Computer science is also distinct from software engineering in that it focuses strictly on the scientific theories behind computer operations, computational structures, and software design. Software engineering applies these theories to the design and construction of systems, hardware and software packages, and applications. Thus, while computer science studies and develops theories related to computer operations, software engineering applies those theories to build real-world computer programs. A software engineer must generally also understand software complexity and algorithms, along with various kinds of analysis, visualization and graphics systems, and user interaction. The engineer's stewardship of a product includes deploying software and evaluating test automation and software quality assurance.

A software engineer needs knowledge of computer coding and languages, and studies coding closely in order to apply it when developing and designing programs. A computer scientist also knows coding as it relates to computer languages, and may use it, for instance, to evaluate compatibility between hardware and software. The computer scientist specializes in the underlying workings of computer systems, applying scientific and mathematical ideas to building, designing, and implementing software and hardware.

Software Engineer Job Responsibilities & Education

Software engineering deals with systems and information as the most practical approach to building and using a computer. Although software engineers focus on programs, a computer engineer also needs to know the hardware. Computer engineering combines electrical engineering with computing knowledge, focusing on hardware-software interaction.

Some of the most common subjects in computer engineering include CPU interfacing, digital logic design, thermodynamics, power management, solid-state physics, and magnetic fields. Degree programs are often highly customizable, since they cover a wide variety of interests. Ultimately, once you know your interests, a customizable degree can steer your career toward your preferred profession.

Computer Science Job Responsibilities & Education

Computer science deals with the basic structure of a computer and leans toward theory. As a result, computer science degrees are very flexible in terms of specialization, emphasizing mathematics and technological expertise. When computers were first introduced, computer science degrees were offered within mathematics departments. As computers became an essential part of society, the degree branched out and grew, but its focus remains the same. Check out this PGP in software development from Great Learning to kick-start your career in software engineering and land your dream job.

The post Difference between Software Engineering and Computer Science appeared first on The Crazy Programmer.




How wonderful to have the whole world of knowledge and information at our fingertips.

Well, this is due to the wonderful applications of computer science. This course has become the most sought-after course at both the graduate and postgraduate levels in India. Computer Science Engineering (CSE) encompasses a variety of topics like programming languages, software, hardware, robotics, computation, analysis of algorithms, etc.

In India, there are more than 3800 Computer Science Engineering colleges. Of these, about 2005 are private, 323 are government, and about 36 are semi-government colleges. They offer BE/B.Tech degrees through full-time, part-time, and distance learning programmes.

The colleges are ranked by the National Institutional Ranking Framework (NIRF), Nature Index, India Today, Atal Ranking of Institutions on Innovation Achievements (ARIIA), Times Higher Education (THE), etc.

Parameters for Ranking

The ranking organizations shortlist more than 5000 colleges every year and divide them into different groups according to their cumulative scores, which are based on parameters such as the following:

  • Students’ most preferred choices during choice-locking after the entrance exam
  • Qualities of Faculties
  • Application-to-seat selection ratio
  • Quality accreditation given by NAAC, NBA, NIRF, India Today, etc.
  • Opinions given by experts
  • Feedback by alumni 
Best Colleges in India for Computer Science

The data is then divided into nine clusters and then the colleges are awarded ratings as per the following slab:

Rating | Score | Description
AAAAA | 95-99 percentile | Exceptional
AAAA+ | 90-94 percentile | Outstanding
AAAA | 80-89 percentile | Very Good
AAA+ | 70-79 percentile | Good
AAA | 60-69 percentile | Above Average
AA+ | 50-59 percentile | Fair
AA | 40-49 percentile | Average
A+ | 35-39 percentile | Satisfactory
A | < 35 percentile | Pass


Best Private Colleges

College | Entrance Exam | Total Fees for 4 Years (Approx.) | Average Salary | Highest Package | Rated by | Rating
VIT, Vellore | VITEEE | INR 7.8 Lakhs | INR 8.19 LPA | INR 1.2 crore PA | No.4/218 (India Today); No.1 (ARIIA); No.1 (NIRF 2022, 2021, 2020) | AAAA+
Birla Institute of Technology and Sciences, Pilani | BITSAT | INR 18.8 Lakhs | INR 9.2-13 LPA | INR 1.3 crore PA | No.1/238 (India Today); No.3 (NIRF 2022) | AAAA+
Thapar Institute of Engineering and Technology, Patiala | JEE Main | INR 11-13 Lakhs | INR 9.2-13 LPA | INR 43 LPA | No.6/238 (India Today); No.6 (NIRF 2022) | AAAA+
SRM Engineering College, Chennai | JEE Main | INR 10-12 Lakhs | INR 9.2-13 LPA | INR 29.5 LPA | No.8/238 (India Today); No.1 (Nature Index); No.6 (NIRF 2022) | AAAA+
Manipal Institute of Technology (MIT), Bangalore | MET, TNEA | INR 14.4 Lakhs | INR 9.2-13 LPA | INR 22.38 LPA | No.4/238 (India Today) | AAAA+

Best Government Colleges

College | Entrance Exam | Total Fees for 4 Years (Approx.) | Average Salary | Highest Package | Rated by | Rating
IIT, Madras | JEE Main, JEE Advanced | INR 8.08 L - 8.19 L | INR 21.48 LPA | INR 1.98 crore PA | No.1 (NIRF 2020) | AAAA
IIT, Mumbai | JEE Main, JEE Advanced | INR 8.33 L - 10.39 L | INR 22.7 LPA | INR 1.8 crore PA | No.3 (NIRF 2020) | AAAA
IIT, Delhi | JEE Main, JEE Advanced | INR 8.47 L - 8.66 L | INR 18-19 LPA | INR 1.25 crore PA | No.2 (NIRF 2020) | AAAA
IIT, Roorkee | JEE Main, JEE Advanced | INR 8.58 L - 10.71 L | INR 12-18 LPA | INR 50 LPA | No.6 (NIRF 2020) | AAAA+
NIT, Trichy | JEE Main, JEE Advanced | INR 5.63 L | INR 17-22 LPA | INR 42 LPA | No.8 (NIRF 2020) | AAAA+

FAQs

Q1. Why do we consider private and government colleges separately?

Government colleges receive aid in the form of good infrastructure facilities, special grants, and full financial support (loans, scholarships, interest subsidies, technical assistance, insurance, etc.), while private colleges have to capitalize themselves by charging students high fees and collecting donations from private organizations and individuals. They also have to struggle to hire expert faculty by paying them handsome salaries.

Q2. What eligibility is required generally for CSE colleges?

Eligibility required for these colleges – 10+2 with PCM and English from a recognized board.

Q3. Which are the top 10 entrance exams conducted to qualify for B.Tech CSE Colleges in India?

The various entrance exams accepted by B.Tech CSE colleges in India are :- JEE Main, TNEA, MHT CET, AP EAMCET, UPSEE, KCET, TS EAMCET, KEAM, GUJCET, COMEDK UGET, etc.

Q4. Are the JEE Main and JEE Advanced exams compulsory for a Computer Science course?

No, it is not compulsory to take the JEE to pursue a B.Tech in Computer Engineering, but many colleges and universities require JEE scores for admission. Many universities conduct their own entrance exams instead.

Q5. Is there any Special Offer for admission in such top colleges?

Yes, many colleges offer DIRECT ADMISSION for Class XII Board toppers.

Q6. What skills must candidates pursuing a CSE course have?

Candidates pursuing a Computer Science Engineering course must have skills such as:

  • Analytical skills
  • Mathematical and statistical skills
  • Basic web development knowledge
  • Knowledge of algorithms
  • Knowledge of software tools
  • Problem-solving skills
  • Reasoning skills
  • Understanding of software design

Q7. What are the career opportunities after CSE?

Ans: CSE graduates can start their career as a Hardware Engineer, Software Engineer, System Hardware Engineer, Computer Scientist, Data Warehouse Analyst, Database Administrators, Data Mining Engineer, Computer Operator, Computer Programmer, Software Developer, Systems Analyst, Computer and Information Research Scientist, Computer Network Architect, Roboticist, Cloud Engineer, UNIX System Administrator, Engineering Support Specialist, Front End Developer, Back End Operator, etc.

Q8. Which companies are the top recruiters of CSE candidates?

Ans: Top recruiters are Amazon, Tata Consultancy Services, Big Bazaar, Bosch, Taj Hotels, Toyota, Cisco, Accenture, Cognizant, JSW, IndiaMART, Adobe, Infosys Technologies, Microsoft, Wipro, IBM Services, Deloitte, Google, Indian Oil, Reliance, Citibank, Samsung, Yahoo, Jaguar, TATA, Dell, Standard Chartered, Apple, Capgemini, Intel, Uber, Myntra, ONGC, etc.

Q9. What parameters are used for World University Rankings Methodology?

Ans:  The parameters used for World University Rankings Methodology are- Teaching (30%), Research (30%), Citations (30%), International Outlook (7.5%), and Industry Income (2.5%).

Conclusion

All the CSE colleges in India play an outstanding role in bringing up their students. They leave no stone unturned for the overall growth of their students.

And thus, India’s CSE graduates are held in high regard in the US and Europe. India can rightly claim enormous technical strength: its engineers have successfully launched many satellites and made the country self-reliant in many technologies, and it stands ready to compete with the world’s superpowers.

Despite these achievements, the Indian government still needs to take crucial steps to tackle the problem of ‘brain drain’, which demands immediate attention. Improving infrastructure, salary norms, and research publications in this field is the need of the hour.

The post 10 Best Colleges in India for Computer Science [Private & Government] appeared first on The Crazy Programmer.




John McCarthy was a pioneer of artificial intelligence (AI) and a renowned computer scientist. He made major contributions to mathematics and computer science, and significant inventions in the fields of artificial intelligence and interactive computing systems.

Early Life

McCarthy was born in 1927 in Boston, Massachusetts. His father, Patrick McCarthy, was a Catholic who became a labor organizer and business manager of the Daily Worker, a national newspaper owned by the Communist Party of the USA. John’s mother, Ida Glatt, was a Lithuanian Jewish immigrant who worked for a wire service, later for the Daily Worker, and eventually as a social worker.

John and his family lived through the Great Depression during the 1930s, and it forced them to relocate to Los Angeles. His brilliance became apparent early, though poor health hindered his schooling; John was largely self-taught, and when his health permitted, he entered public school and skipped several grades on the strength of his self-study.

Being a sickly child, John turned to books for solace. The family’s move did improve his health, and he proved to be a prodigious scholar: having skipped three grades, he entered the California Institute of Technology in 1944. Although he took some time off from school for several reasons, including a stint in the army as a clerk, John graduated four years later with a degree in mathematics.


Education

John received a bachelor’s degree in mathematics from the California Institute of Technology in 1948 and a doctorate in mathematics from Princeton University in 1951. He held professorships at Dartmouth College from 1955 to 1958 and at the Massachusetts Institute of Technology from 1958 to 1962, where he worked on the earliest time-sharing systems. In 1962 he moved to Stanford University, where he founded SAIL, the Stanford Artificial Intelligence Laboratory, one of the primary centres for research in the field. Though trained as a mathematician, he joined Stanford’s newly formed computer science department and stayed there until his retirement.

Personal Life

Dr John McCarthy was married three times. His second wife, Vera Watson, a member of the American Women’s Himalayan Expedition, died in a climbing accident on Annapurna in 1978.

Besides his daughter Sarah, he is survived by his wife, Carolyn Talcott; another daughter, Susan, of San Francisco; and a son, Timothy.

He remained an independent thinker his entire life. Some years before his death, his daughters gave him a license plate bearing one of his favourite adages: “Do the arithmetic or be doomed to talk nonsense.”

Career

Dr John’s career followed the arc of modern computing. Trained as a mathematician, John was responsible for seminal advances in the field and was often called the father of computer time-sharing, a major development of the 1960s that allowed many people and companies to draw on a single computer, such as a mainframe, without owning one.

By reducing costs, time-sharing allowed many more people to make use of computers and laid the groundwork for interactive computing.
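
The sketch below, a hypothetical Python toy rather than anything resembling period code, shows the core idea: the machine repeatedly gives each user’s job a short slice of work, so several users appear to run simultaneously on one computer.

    from collections import deque

    # Toy sketch of time-sharing: one machine rotates short slices
    # of work among several users' jobs.
    def run_time_shared(jobs):
        queue = deque(jobs)                     # each entry: (user, job generator)
        while queue:
            user, job = queue.popleft()
            try:
                print(user + ":", next(job))    # give the job one time slice
                queue.append((user, job))       # then send it to the back
            except StopIteration:
                pass                            # job finished; drop it

    def job(name, steps):
        for i in range(steps):
            yield "step %d/%d of %s" % (i + 1, steps, name)

    # Two hypothetical users appear to run "at the same time".
    run_time_shared([("alice", job("payroll", 2)), ("bob", job("compile", 3))])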

Although he did not foresee the rise of the personal computer, John was prophetic in describing the implications of various other technological advances years before they gained currency. In the 1970s, he presented a paper in France on buying and selling by computer, what is now termed electronic commerce.

In the study of artificial intelligence, nobody was more influential than McCarthy. While teaching mathematics at Dartmouth in 1956, Dr John organized the first Dartmouth Artificial Intelligence Conference.

His idea of simulating intelligence was discussed for years, but the term “artificial intelligence”, which he originally used partly to help raise funds to support the conference, stuck.

Dr John moved to the Massachusetts Institute of Technology in 1958, where he and Marvin Minsky founded the Artificial Intelligence Laboratory. It was at MIT that he started working on the List Processing Language (Lisp), a programming language that became a standard tool for AI research and design.

Evolution of AI

During the summer of 1956, John started working on a program that would allow a computer to play chess. To limit the number of moves considered and speed up the game, John developed a method termed the alpha-beta heuristic, which lets the computer cut off lines of play that cannot affect its final choice of move. This was part of the evolution of artificial intelligence (AI), a term John coined that year while organizing the first conference on modelling intelligence in computers.
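
To make the pruning idea concrete, here is a minimal Python sketch of alpha-beta search, not McCarthy’s original chess program: the game tree is a made-up nested list of leaf scores, and branches that cannot change the final choice are cut off.

    # A toy game tree: inner lists are positions, numbers are leaf scores.
    def alphabeta(node, maximizing, alpha=float("-inf"), beta=float("inf")):
        if isinstance(node, (int, float)):      # leaf: the position's score
            return node
        best = float("-inf") if maximizing else float("inf")
        for child in node:
            score = alphabeta(child, not maximizing, alpha, beta)
            if maximizing:
                best = max(best, score)
                alpha = max(alpha, best)
            else:
                best = min(best, score)
                beta = min(beta, best)
            if beta <= alpha:                   # cut off: this line cannot
                break                           # change the final choice
        return best

    # The 9 in the second branch is never examined: once the opponent can
    # hold that branch to 2, it cannot beat the 3 already guaranteed.
    print(alphabeta([[3, 5], [2, 9]], maximizing=True))  # prints 3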

John became an associate professor of mathematics at the Massachusetts Institute of Technology in 1958 and founded his first artificial intelligence laboratory there. He began creating a computer language that would be called the List Processing Language, or LISP; it remains a commonly used language, especially in AI research. While at MIT, John also started developing ways of time-sharing computers that would make networks possible by allowing several people to share data on the same large computer. Besides, he initiated work on the concept of giving computers “common sense”, an idea that would perplex programmers for years.

AI Accomplishments 

Programming languages, the Internet, the web, and robots are some of the technological innovations that McCarthy paved the way for. He coined the term “artificial intelligence”, created LISP, the first computer language for symbolic computation (still one of the preferred languages in AI), and established time-sharing. Human-level AI and commonsense reasoning were among his major contributions to the field. Furthermore, Dr John wrote many papers on the theory of computation that forms the basis of software science today.

In the field of engineering, he proposed some of the basic concepts of time-sharing systems and was involved in their development. This work opened the way for the development of today’s large-scale computer systems.

Besides his academic contributions, Dr John established an early AI research project at the Massachusetts Institute of Technology and co-founded its Artificial Intelligence Laboratory. When he moved to Stanford University, he established an AI laboratory there as well.

Awards

His contributions to artificial intelligence were recognized worldwide throughout his remarkable career, and he received several prestigious awards, including:

  • He received the Turing Award from the Association for Computing Machinery in 1971.
  • He was awarded the Kyoto Prize in 1988.
  • He was awarded the United States National Medal of Science in Mathematical, Statistical, and Computational Sciences in 1990.
  • In 2003, he was awarded the Benjamin Franklin Medal in Computer and Cognitive Science by the Franklin Institute.

The post John Mccarthy Biography – Father of AI appeared first on The Crazy Programmer.



