How Artificial Intelligence Is Supporting Technologies In The Modern World

According to the data presented at the beginning of 2020 by the researchers of the Artificial Intelligence 2020 Observatory, the current artificial intelligence market is still taking its first steps.

Going into a little more detail, we see that Intelligent Data Processing initiatives constitute the main type of AI project, accounting for 33% of total expenditure. Natural Language Processing and Chatbot / Virtual Assistant development projects follow; taken together, they represent 28% of the market. As for the sectors, four have emerged that, more than the others, …

Since the 1980s, we have been experiencing a period of great ferment around artificial intelligence. We can compare it to a human being in the prime of youth: full of energy and the desire to experiment, but also with the ability to analyse what has been learned and to evaluate its potential and possible developments. And here our metaphor ends, because the next step, that of maturity, is still on the horizon. Below are the key junctions of the last 40 years, divided by strand.

Neural Networks

We will analyse this topic in detail later when we talk about deep learning, but here we give some basic historical background.

We saw in the previous paragraph that the first theorisation of artificial neural networks (the Rosenblatt perceptron) had been called into question, but in 1986 Jay McClelland and David Rumelhart published Parallel Distributed Processing: Explorations in the Microstructure of Cognition, laying the foundations of connectionism and giving new vigour to studies in this field. A neural network is a directed non-linear graph in which each processing element (each node of the system) receives signals from other nodes and in turn emits a signal to other nodes.

In practice, while in a conventional computer knowledge is concentrated in one precise place, the memory, in a neural network knowledge is not localisable: it is distributed across the connections of the network itself, allowing the system to learn from its own experience.
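To make this idea concrete, here is a minimal sketch (with purely illustrative weights and signal values) of a single processing element: it computes a weighted sum of its incoming signals and passes the result through a non-linear activation. All of the node's "knowledge" lives in its connection weights, not in any memory cell.

```python
import math

def node_output(inputs, weights, bias):
    """One processing element: a weighted sum of incoming
    signals followed by a non-linear (sigmoid) activation."""
    total = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-total))

# The same node behaves differently depending on its weights:
# the knowledge is stored in the connections.
signals = [0.5, -1.0, 2.0]
print(node_output(signals, [0.4, 0.3, 0.9], bias=0.1))
print(node_output(signals, [-0.2, 0.8, 0.1], bias=0.0))
```

Changing the weights changes what the node "knows"; learning, as we will see, is precisely the process of adjusting these weights.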

The multilayer neural network model (MLP, Multilayer Perceptron) was then developed, in which each layer of nodes is fully connected to the next. It uses a supervised learning technique called error back-propagation (which we will explain later in the deep learning chapter) to train the network, starting from an already known result for each specific sample.
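The scheme just described can be sketched as follows. This is a compact illustration, assuming NumPy, with illustrative hyperparameters (hidden-layer size, learning rate, iteration count): a small MLP is trained on XOR, a classic task a single-layer perceptron cannot solve, using error back-propagation from the known target outputs.

```python
import numpy as np

rng = np.random.default_rng(0)

# Training set: XOR inputs and their already-known results.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One hidden layer, fully connected to the input and output layers.
W1 = rng.normal(size=(2, 4))
b1 = np.zeros((1, 4))
W2 = rng.normal(size=(4, 1))
b2 = np.zeros((1, 1))

lr = 1.0
losses = []
for _ in range(5000):
    # Forward pass: signals flow layer by layer through the network.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    losses.append(np.mean((out - y) ** 2))

    # Backward pass: propagate the error from the known targets
    # back through the connections (error back-propagation).
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)

    # Update the weights: the network's knowledge changes.
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0, keepdims=True)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0, keepdims=True)

print("initial loss:", losses[0], "final loss:", losses[-1])
```

The training error decreases over the iterations as the weights are adjusted, which is exactly the sense in which the knowledge ends up distributed across the network's connections.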

Nanotechnologies

The evolution of nanotechnologies, i.e. the ever-increasing miniaturisation of microprocessors, has led to the development of a new generation of components (from general-purpose GPUs to Resistive RAMs to neuromorphic chips) that have given new impetus to artificial intelligence thanks to the enormous computing power they make available. We will see them in detail later, in the chapter on the technologies that enable and support artificial intelligence.

Innovative Algorithms

If the previous strands represent (to close the parallel with our metaphor) the physicality of artificial intelligence, the algorithms represent what comes closest to thought: that “magical” faculty that makes man a very special being compared to other animals.

It is precisely from the evolution of innovative methodologies and algorithms that artificial intelligence draws its lifeblood, and it is on these that we will focus in the parts of this article dedicated to machine learning, deep learning, natural language processing, and augmented and virtual reality.

Cognitive Computing

Using self-learning algorithms, data mining and big-data analytics, pattern recognition, natural language processing and signal processing, and by implementing the most advanced hardware technologies, technological platforms are created that try to imitate the human brain, starting from simpler activities and arriving at increasingly complex processing. (A signal is a temporal variation of the physical state of a system, or of a physical quantity, used to represent and transmit messages or information at a distance; the analysis of signals is therefore a component that supports cognitive computing.)
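As a minimal illustration of the signal definition above (with entirely illustrative values: a 1 Hz sine sampled at 50 Hz, with a small artificial disturbance), a discrete signal is simply a sequence of samples of a quantity varying over time, and signal analysis operates on those samples, here with a basic moving-average smoothing filter:

```python
import math

# A discrete signal: samples of a physical quantity varying over time.
fs = 50  # sampling rate in Hz
signal = [math.sin(2 * math.pi * n / fs) for n in range(fs)]
signal[25] += 0.5  # an artificial disturbance on one sample

def moving_average(samples, window=5):
    """Simple smoothing filter: each output sample is the mean
    of the surrounding window of input samples."""
    half = window // 2
    out = []
    for i in range(len(samples)):
        chunk = samples[max(0, i - half):i + half + 1]
        out.append(sum(chunk) / len(chunk))
    return out

smoothed = moving_average(signal)
# The disturbance at sample 25 is attenuated by the filter.
print(abs(signal[25]), abs(smoothed[25]))
```

Real cognitive-computing platforms of course use far more sophisticated signal-processing techniques, but the principle, extracting information from a time-varying sequence of samples, is the same.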

From the historic IBM Watson, the first commercial supercomputer of this type, to Google DeepMind and Baidu's Minwa, several examples are available today.

TechSmashers
Tech Smashers is a global platform that provides the latest reviews & news updates on Technology, Business Ideas, Gadgets, Digital Marketing, Mobiles, Updates On Social Media and many more upcoming Trends.
