When Was Artificial Intelligence Born, And How Did It Evolve?

Definition Of Artificial Intelligence

Let’s start by immediately dashing the hopes of anyone looking for a single, universally shared definition of the term “artificial intelligence”: it is a concept that spans a very large number of topics across different disciplines, from computer science to neurology, neurobiology and neurophysiology (and, in general, all the disciplines that study the human brain), to mathematics and beyond.

Therefore, the more one tries to give it an all-encompassing scientific definition, the more one is forced to simplify it (so as not to neglect aspects that are fundamental from one point of view or another). A plain working definition remains: the discipline that studies the design, development and implementation of systems capable of simulating human skills, reasoning and behaviour.

To understand its fields of action, then, it is necessary to look at the history of what only in recent years has come to be considered a true scientific discipline: when it first appeared and where it comes from.


In this article we trace the history of AI, summarising its main evolutionary turning points in schematic form and, since we are talking about systems that simulate human behaviour, using human life as a metaphor for the story.

The Birth Of Artificial Intelligence

  • The first sparks that would lead to the birth of this discipline were lit in the century that revolutionised science and started modern science: the 1600s (although, conventionally, the scientific revolution is dated to 1543, with the publication of Copernicus’s De revolutionibus orbium coelestium). It was in the 17th century that the first machines capable of performing automatic calculations were built (Blaise Pascal, Gottfried Wilhelm von Leibniz).
  • Charles Babbage, whose “analytical engine” anticipated the characteristics of modern computers in the first half of the nineteenth century, and of course Alan Turing, considered the father of computer science, from the second half of the 1930s onwards, represent the amniotic fluid in which the gestation of artificial intelligence continued.
  • It was in 1943, however, with the work of the neurophysiologist Warren Sturgis McCulloch and the mathematician Walter Harry Pitts, that this gestation drew to a close. Based on neurophysiological observation, the two scientists theorised that the signal between two cells follows an all-or-nothing behaviour: the transmission of the nerve impulse can only be complete or null (on/off). By thus treating the neuron as a binary unit, McCulloch and Pitts showed, with a mathematical model, how simple neurons can be combined to compute the three elementary logical operations NOT, AND and OR. It is from these premises that artificial neural networks would be born, and that science would come to give birth to artificial intelligence.
  • The term “artificial intelligence” has a precise date of birth: it was used for the first time by the mathematicians and computer scientists John McCarthy, Marvin Minsky, Nathaniel Rochester and Claude Shannon in a 1955 informal document proposing the Dartmouth conference to be held the following year. That conference would come to be considered the true “delivery room” of artificial intelligence: it was there that the first program explicitly designed to imitate the problem-solving skills of human beings was presented.
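The McCulloch-Pitts idea above can be sketched in a few lines of code: an all-or-nothing unit that fires only when the weighted sum of its binary inputs reaches a threshold, combined to realise NOT, AND and OR. The weights and thresholds below are illustrative choices, not values taken from the 1943 paper.

```python
# A minimal sketch of a McCulloch-Pitts neuron: inputs and output are
# binary (0/1), and the unit "fires" only when the weighted sum of its
# inputs reaches a fixed threshold (the on/off behaviour described above).

def mcp_neuron(inputs, weights, threshold):
    """Fire (return 1) iff the weighted input sum reaches the threshold."""
    total = sum(x * w for x, w in zip(inputs, weights))
    return 1 if total >= threshold else 0

# The three elementary logical operations as threshold units:
def NOT(x):    return mcp_neuron([x], [-1], 0)       # fires only when x == 0
def AND(x, y): return mcp_neuron([x, y], [1, 1], 2)  # fires only when both inputs fire
def OR(x, y):  return mcp_neuron([x, y], [1, 1], 1)  # fires when at least one input fires
```

Chaining such units (e.g. feeding the output of one gate into another) is exactly what lets simple neurons be combined into more complex computations, the observation at the root of artificial neural networks.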

The Evolution Of Artificial Intelligence

The period from 1950 to 1965 was one of great expectations and, like watching a child begin to talk or walk, the first activities around the newborn discipline were exciting.

  • In 1950 Turing proposed his famous “imitation game” in the equally renowned article Computing Machinery and Intelligence, in an attempt to answer the fundamental question: can machines think?
  • In 1958 the psychologist Frank Rosenblatt proposed the first neural network scheme, called the Perceptron, designed for the recognition and classification of shapes; it consists of a unit with inputs, an output and a learning rule based on error minimisation.
  • In 1958 McCarthy developed the Lisp language to study the computability of recursive functions on symbolic expressions; for a long time it was the reference language in artificial intelligence projects (and it was the first language to adopt the concepts of the virtual machine and virtual memory management).
  • McCarthy also described an ideal program, Advice Taker, designed to find solutions to problems of a not strictly mathematical kind.