The use of artificial intelligence (AI) in our daily lives has grown over the last few years, and for the most part we do not notice or consider that AI is behind the services we use. Whether people think of it as smart algorithms or simply progress, it is here to stay.
AI makes life easier by doing many simple tasks for us. One example is voice recognition. How many of us ask Amazon’s Alexa a question, or get it to turn the lights on or play music? However, AI is not limited to simple tasks, and we are likely to experience exponential growth in its use both at home and in the workplace. It is already used in manufacturing, medicine, banking, marketing and vehicles, and this is only the beginning.
The beginnings of AI
The ideas of artificial intelligence and machine learning (ML) go back over sixty years to the 1950s and the days of Alan Turing and others. Its foundation as an academic discipline followed a 1956 conference at Dartmouth College in New Hampshire.
The pioneers’ vision was to create intelligent machines that simulated or exceeded human intelligence. As with many things, the thinking was ahead of its time, and it is only since the 1990s that real-world computational power has made it possible.
The term artificial intelligence is widely used and has become generic. In computer science, AI is machine intelligence that seeks to replicate and exceed human intelligence through awareness of its environment. Unsurprisingly there is no single definition, but Wikipedia uses “any device that perceives its environment and takes actions that maximise its chance of successfully achieving its goals.”
Artificial Intelligence is already here
One example of AI that many people will be familiar with is voice recognition using natural language processing (NLP). It allows machines to communicate with humans using written and spoken language, and lets us control machines using voice commands at home and at work. For example, we can dictate to our computers rather than use a keyboard, and there are scores of applications in regular use.
There are many high-visibility applications of AI, such as autonomous cars and smart home controllers. Other less obvious applications include email spam filters, facial recognition systems, banking fraud detection and the curation of content on most social-media websites.
One interesting phenomenon is the AI effect: the more mainstream AI becomes, the more onlookers discount the behaviour as AI. Paradoxically, they argue that it is not real intelligence.
From algorithms to AI in manufacturing
This is also true in engineering. For many years, control engineers have been using PID controllers and fuzzy logic to control processes. Although the ways in which they work are not the same as AI, they can be thought of as artificial intelligence, as they do demonstrate a form of intelligence by using algorithms in response to multiple inputs.
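The PID loop mentioned above is simple enough to sketch in a few lines. The gains, setpoint and toy process model below are illustrative values chosen for the example, not tuned for any real plant:

```python
# Minimal discrete PID controller sketch (illustrative gains, not tuned).
class PID:
    def __init__(self, kp, ki, kd, setpoint):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.setpoint = setpoint
        self.integral = 0.0
        self.prev_error = None

    def update(self, measurement, dt):
        error = self.setpoint - measurement
        self.integral += error * dt          # accumulate past error
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Drive a simple first-order toy process toward the setpoint.
pid = PID(kp=2.0, ki=0.5, kd=0.1, setpoint=1.0)
value = 0.0
for _ in range(300):
    control = pid.update(value, dt=0.1)
    value += 0.1 * (control - value)         # toy process model
```

The controller responds to three fixed views of the error (present, accumulated and rate of change); it never rewrites its own rules, which is what separates it from the learning systems described next.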
An algorithm is a sequence of unambiguous instructions that a computer uses to complete a task or solve a problem. AI, however, uses many algorithms working together to change the outcome depending on a wider range of variables, inputs and data. An adaptive algorithm can change itself each time it runs. This is sometimes referred to as machine learning (ML).
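A minimal sketch of such an adaptive algorithm is a linear model that adjusts its own parameters every time it processes a new example. The data, learning rate and epoch count here are made-up illustrative values:

```python
# An adaptive algorithm: the update rule changes the model itself
# after every sample it sees -- a minimal form of machine learning.
def train_online(samples, lr=0.1, epochs=200):
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, y in samples:
            pred = w * x + b
            error = pred - y
            w -= lr * error * x   # adjust the weight from this sample's error
            b -= lr * error       # adjust the bias likewise
    return w, b

# Learn y = 2x + 1 from a handful of points.
data = [(0, 1), (1, 3), (2, 5), (3, 7)]
w, b = train_online(data)
```

After training, `w` and `b` converge close to 2 and 1: the program has recovered the rule from the data rather than having it written in by hand.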
The basis for machine learning is that we can build machines to process data and learn on their own, much as humans do. They learn to exceed their programming without constant human interaction. In this case the machine is a data processor, and it can operate in any industry or business. Only machines can handle huge volumes of data in real time and identify patterns in this way. For example, this can have a huge impact in areas like diagnostics, whether for health, banking or predictive maintenance in a plant.
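A tiny sketch of that kind of pattern spotting, in the spirit of predictive maintenance, is flagging sensor readings that drift far from the recent rolling average. The sensor trace, window size and threshold below are made-up illustrative values:

```python
# Flag readings more than k standard deviations from the rolling mean
# of the previous `window` readings (illustrative threshold values).
from statistics import mean, stdev

def anomalies(readings, window=5, k=3.0):
    flagged = []
    for i in range(window, len(readings)):
        recent = readings[i - window:i]
        mu, sigma = mean(recent), stdev(recent)
        if sigma and abs(readings[i] - mu) > k * sigma:
            flagged.append(i)
    return flagged

# A steady vibration signal with one suspicious spike at index 8.
trace = [10.0, 10.1, 9.9, 10.0, 10.2, 10.1, 9.8, 10.0, 14.5, 10.1, 9.9]
flagged = anomalies(trace)
```

Running this flags only the spike at index 8; real diagnostic systems apply far richer models, but the principle of learning what “normal” looks like from the data itself is the same.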
AI achieves machine learning through knowledge acquisition, reasoning, perception and problem solving. Robotics is a leading field for the use of AI, as it enhances productivity. The technology can also redesign and optimise control algorithms automatically in real time.
Conventional robot controls are designed to handle objects where conditions change in predictable and predetermined ways. New technology using multiple types of sensors and real-time controls handles items of different shapes, or situations where conditions are changing. It recognises changes by applying AI-based learning and makes repeated estimations based on deep learning techniques. Tests have shown that the technology can reduce learning times and adapt to changes in conditions in just 3-5 milliseconds.
IoT also plays an important part in the growth of AI in manufacturing. The internet, IoT, cloud computing and Big Data have developed to a level that supports machine learning and the growing feasibility of AI, and will fuel further research. Growth will be dynamic.
And what of the future?
According to the Stanford One Hundred Year Study on Artificial Intelligence, there is no imminent threat to humankind from self-perpetuating machines. AI will, however, affect the life and work of everyone on the planet.
The new digital economy provides both the data resources and the demand for manipulating them. Seeking to emulate the human brain has stimulated new research into the use of artificial neural networks. These are statistical models that process data in a non-linear way. AI is already a big part of our lives, and most people interact with artificial intelligence daily. And this is only the beginning. AI will replace some jobs, but it will also create new ones.
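To illustrate what “non-linear” buys, here is a tiny hand-wired neural network computing XOR, a function no single linear model can represent. The weights are chosen by hand for clarity; in a real network they would be learned from data:

```python
# A two-layer network with hand-picked weights that computes XOR.
def step(z):
    # Threshold activation: fire (1) if the weighted sum is positive.
    return 1 if z > 0 else 0

def xor_net(x1, x2):
    h_or  = step(x1 + x2 - 0.5)      # hidden unit: fires if either input is on
    h_and = step(x1 + x2 - 1.5)      # hidden unit: fires only if both are on
    return step(h_or - h_and - 0.5)  # output: OR but not AND = XOR
```

Stacking simple units like these, with weights set by training rather than by hand, is exactly what makes neural networks such flexible statistical models.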