If artificial intelligence were the history of aeronautics, it could be said that we are in the 1930s: we have already passed through the birth of the first airship and the motorized flight of the Wright brothers; we have founded the first airline and circumnavigated the globe. We will soon break the sound barrier and build the first passenger jet.
In other words, for the moment we have created machines with specific objectives that are beginning to assimilate information from raw data, in the same way a human child learns about the world around him.
“From the point of view of artificial intelligence, or AI, we are at the dawn of our post-biological future,” says Rodney Brooks, professor emeritus of the Computer Science and Artificial Intelligence Laboratory at the Massachusetts Institute of Technology (MIT) and president of the company iRobot.
The expert refers to the future in which machines, the systems we consider ‘non-living’, will behave as biological and conscious entities, as predicted by the English mathematician Alan Turing, the father of modern computing.
After decades of setbacks and breakthroughs, we are finally beginning to understand the challenges of this field, where computation, mechanical engineering, psychology, anatomy, biomimetics (technology inspired by nature) and neurology converge. Especially the latter.
In fact, more progress has been made in artificial intelligence in the last five years than in the previous 50. So much so that technology gurus are beginning to wonder what the hierarchy between humans and humanoids will be in the future. Moreover, some of them, like Bill Gates and Elon Musk, have even expressed concern that our creations could become a threat.
Hollywood has also played its part in fuelling the vision of a dangerous future, with artificial entities that look at us the way we look at dinosaur fossils: a species that dominated the world but whose lack of adaptation condemned it to extinction. Think of ‘Terminator’, ‘Ex Machina’, ‘Her’ and ‘The Matrix’, to name just a handful of films.
Along the same lines, the English physicist Stephen Hawking, one of the greatest science communicators of our time, refers to AI as “the greatest event in human history”, but warns that, at the same time, it “could be our worst error”.
Meanwhile, the hottest area of artificial intelligence remains so-called deep learning. It is nothing new, as researchers have been working on it since the 1970s. Now, with the development of GPUs, which optimize computer performance, deep learning has been reborn, especially in the last couple of years, since Google bought the research group DeepMind.
Deep learning, a branch of machine learning built on artificial neural networks, consists of developing algorithms capable of deciphering, among other things, natural language.
The basic idea is to take a computational model and feed it with information: for example, everything that exists on Wikipedia, or the CNN news of recent months. Then it is given multiple-choice or fill-in-the-blank tests, like the ones students take. It is in that process of being evaluated that the system ‘learns’ from experience, like a child when it begins to communicate.
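As a rough illustration, here is a minimal Python sketch of that feed-then-test loop. The tiny corpus and the word-statistics “model” are invented stand-ins for the Wikipedia-scale data and deep networks described above, so treat it as a toy, not a real deep-learning system.

```python
from collections import Counter, defaultdict

# Toy corpus standing in for "everything on Wikipedia": a few sentences only.
corpus = (
    "the cat sat on the mat . "
    "the dog sat on the rug . "
    "the cat chased the dog ."
).split()

# "Training": count which word tends to follow each word.
next_word = defaultdict(Counter)
for current, following in zip(corpus, corpus[1:]):
    next_word[current][following] += 1

def fill_in_the_blank(prompt: str) -> str:
    """Answer a fill-in-the-blank 'exam' question from the learned statistics."""
    last_word = prompt.split()[-1]
    candidates = next_word.get(last_word)
    if not candidates:
        return "<unknown>"
    return candidates.most_common(1)[0][0]

# "Exam": the system is scored on completions it was never told explicitly.
print(fill_in_the_blank("the dog sat on"))  # -> "the"
print(fill_in_the_blank("sat on the"))      # -> the word most often seen after "the"
```

The point of the exercise is the same as in the article: the system is never told the answers, it simply extracts them from the statistics of the data it was fed.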
Mustafa Suleyman, co-founder of DeepMind, says one of these systems was tested using an Atari video game from the 70s, which consisted of a racket and a ball (the famous ‘telebolito’). The system was not instructed on how to play. “It is only given raw pixels, and it must go through the frustrating experience of being ‘killed’ several times, without any guidance or element of comparison,” explains Suleyman. Eventually, the system hits the ball by accident and learns that that action brings a reward. “It is very similar to what a laboratory mouse does when it realizes that pulling a lever gives it access to food.”
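The Python sketch below captures only the spirit of that trial-and-error loop, under heavy simplification: a single state, two hand-written actions and a toy reward, rather than DeepMind’s deep network learning from raw pixels. Every name and number in it is illustrative.

```python
import random

ACTIONS = ["pull_lever", "do_nothing"]
q_values = {action: 0.0 for action in ACTIONS}  # agent's estimate of each action's value
LEARNING_RATE = 0.1
EXPLORATION = 0.2                               # probability of trying a random action

def reward(action: str) -> float:
    """The environment: only pulling the lever delivers food (reward)."""
    return 1.0 if action == "pull_lever" else 0.0

for step in range(500):
    if random.random() < EXPLORATION:
        action = random.choice(ACTIONS)           # explore blindly
    else:
        action = max(q_values, key=q_values.get)  # exploit what has been learned
    # Nudge the estimate toward the reward actually observed.
    q_values[action] += LEARNING_RATE * (reward(action) - q_values[action])

print(q_values)  # "pull_lever" ends up with a much higher estimated value
```

Like the mouse in Suleyman’s analogy, the agent starts with no instructions at all and gradually prefers the action that, by accident at first, turned out to be rewarded.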
Last year, DeepMind published the results of its latest research in the journal Nature, reporting that, after 500 games, the algorithm had mastered the game. “This new artificial agent surpassed the performance of previous algorithms and achieved a level comparable to that of a professional human player, able to excel in a diverse range of difficult tasks,” the scientists concluded.
However, getting there is not an easy task. During this long training process, Google’s algorithms have put the company through embarrassing moments, such as when Google Photos classified a black woman as a gorilla.
The other side of the coin is IBM’s Watson cognitive computing platform. Its programmers taught it to read medical literature, and now the supercomputer is able to evaluate millions of research papers on any disease and identify, within a few days, treatments that would take the best doctors months to find.
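A very rough sketch of that kind of task is ranking a pile of abstracts against a disease query and surfacing the most relevant ones. Watson’s real pipeline is far more sophisticated (natural-language parsing, evidence scoring across millions of papers); the hypothetical abstracts and the crude keyword score below are purely illustrative.

```python
# Toy "literature": three invented abstracts keyed by an invented paper id.
abstracts = {
    "paper_1": "drug A reduced tumour growth in melanoma patients",
    "paper_2": "drug B showed no effect on melanoma in phase II trials",
    "paper_3": "dietary changes improved outcomes for diabetes patients",
}

def relevance(query: str, text: str) -> int:
    """Crude relevance score: how many query words appear in the abstract."""
    query_words = set(query.lower().split())
    return sum(1 for word in text.lower().split() if word in query_words)

query = "melanoma treatment"
ranked = sorted(abstracts.items(), key=lambda item: relevance(query, item[1]), reverse=True)
for paper_id, text in ranked:
    print(paper_id, relevance(query, text), text)
```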
Strong and weak AI
Although the work of all these algorithms is remarkable, the question arises of how similar a human brain really is to an artificial neural network. Research on that question complements the effort of the US Government, through the BRAIN Initiative, to map the activity of every neuron in the human brain.
“Understanding how the brain creates intelligence is the biggest problem in science and technology,” adds Tomaso Poggio. “In order to have a computational understanding of what intelligence means, our centre emphasizes four areas of interdisciplinary research: the integration of intelligence, including vision, language and motor skills; the circuits for intelligence, which will require research in neurobiology and electrical engineering; the development of intelligence in children; and, finally, the study of social intelligence.”
For Patrick Winston of MIT, what makes human intelligence stand out from artificial and animal intelligence is our ability to tell and understand stories. That is why, for years, he has been working on the Genesis program, which attempts to replicate that quality.
Given a brief account of a conflict between two countries, the program tries to work out why things happened and what they mean. Genesis is able to detect concepts such as revenge and to assess a character’s personality. “What we are looking for is to create an artificial intelligence that, for example, can enter a restaurant where people are talking and eating and describe, in words and in detail, what is happening. In addition, it can sit down at the table without overturning the chairs or breaking the glasses. That is difficult,” Winston admits.
Hence the term ‘soft robotics’, used to group robots capable of interacting with humans, says David Hanson of Hong Kong-based Hanson Robotics. Social robots like Sophie, popular in Japan and Korea, have the mission of inspiring people to relate to them and helping them learn.
That is to say, ‘weak’ artificial intelligence is one thing and ‘strong’ artificial intelligence is another. Weak AI is a machine capable of simulating the behaviour of human cognition but incapable of experiencing that state of mind. Strong artificial intelligence, on the contrary, involves developing machines capable of having cognitive mental states.
Its promoters want to make machines that are aware of themselves, with emotions and a true consciousness: an artificial superintelligence. “When we get to that singularity, the landscape is going to be so unimaginable that today it’s hard to say things about it that make sense,” says Rodney Brooks.
Some will insist that the problem is not the enslavement of our conscious machines but the possibility that they will kill us, but most robotics specialists believe that this will not be the case. Rather, long before that, we will merge with them: we will graft human consciousness onto extraordinarily durable and efficient machines. In other words, ‘homo sapiens’ will vanish as a biological species, replacing itself with ‘robo sapiens’.
Contrary to many studies and opinion polls, the MogIA artificial intelligence system predicted Republican Donald Trump’s narrow triumph in the US presidential election, as it had done with Barack Obama in the previous elections.
The algorithm, created 12 years ago by the Indian company Genic.ai, had already predicted that the real estate mogul and former Secretary of State Hillary Clinton would be the nominees of their respective parties.
MogIA, named after Mowgli, the character from Rudyard Kipling’s ‘The Jungle Book’, learns as it explores its environment, just as the child in the story does. And according to its creator, Sanjiv Rai, it has been improving with every election.
The calculations are based on the volume of mentions of a candidate’s name on social networks. To generate the prediction for this latest contest, MogIA analysed more than 20 million data points on Trump and Clinton collected from Facebook, Google, YouTube and Twitter, in order to form a picture of public sentiment.
In the end, the candidate who achieves the highest degree of engagement with the audience wins the election.
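In its simplest form, the approach described above amounts to tallying how often each candidate is mentioned in a stream of posts and treating the higher count as a proxy for engagement. The Python sketch below illustrates only that core idea with a handful of invented posts; MogIA’s real system weighed some 20 million data points across several platforms and is certainly more elaborate.

```python
from collections import Counter

# Invented posts standing in for the real social-media stream.
posts = [
    "Trump rally drew a huge crowd tonight",
    "Clinton outlines her economic plan",
    "Can't believe what Trump said in the debate",
    "Trump vs Clinton: who won?",
]

candidates = ["trump", "clinton"]
mentions = Counter()
for post in posts:
    text = post.lower()
    for candidate in candidates:
        if candidate in text:
            mentions[candidate] += 1

# The candidate with the most mentions is treated as the most "engaging".
print(mentions.most_common())  # [('trump', 3), ('clinton', 2)]
```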
But even more interesting is that disruptive technologies like this have the potential to influence interactions on social networks without our realizing it. For example, both the Democratic and Republican campaigns used ‘chatbots’, or automated software agents, to create and upload more than a million tweets in short periods, creating the illusion that certain opinions were going viral.
For Rai and other industry leaders, artificial intelligence is beginning to change the architecture of electoral campaigns, at least those in the United States, in part because more and more people are making their opinions public on social networks.
And although the quality of the predictions depends on choosing the right platforms (Snapchat would probably not be the best), this medium is the perfect breeding ground to feed the insatiable curiosity of a system that learns automatically, all by itself.