Artificial Intelligence (AI) trends
Information and Communications Technology & Media | 17 Dec 2019
The future of AI seems bright, not just in industry and business, but in our everyday lives. Tech industry giants such as Apple, Amazon, Google, Facebook, IBM and Microsoft are investing billions into the research and development of AI, bringing consumers and AI closer. This is despite some of the myths we hold about AI.
We are seeing a significant increase in the use of AI around the world, impacting healthcare, education and other industries. From self-driving cars to intelligent robots that sow seeds and spray insecticide on crops, the list goes on. According to PwC’s analysis, global GDP will be up to 14% higher in 2030 as a result of the accelerating development and take-up of AI – the equivalent of an additional $15.7 trillion. By 2020, AI is projected to create 2.3 million new jobs worldwide while eliminating 1.8 million traditional jobs, according to research firm Gartner (link).
We discuss 5 trends to look out for.
1. AUTOMATED MACHINE LEARNING (AUTOML)
AutoML will fundamentally change the face of ML-based solutions and revolutionise business analytics. AutoML fits between cognitive APIs and custom ML platforms, delivering a level of customisation without forcing developers through an elaborate workflow. This allows business analysts to stay focused on the core business problem(s), as complex problems are solved without manually working through the typical model-training process.
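The core AutoML idea described above can be sketched in a few lines: automatically fit several candidate models, score each on held-out data, and keep the best one. The models and toy data below are hypothetical stand-ins, not a real AutoML framework.

```python
# Minimal sketch of automated model selection: fit every candidate model,
# then keep the one with the lowest validation error.

def fit_constant(xs, ys):
    """Baseline model: always predict the training mean."""
    mean = sum(ys) / len(ys)
    return lambda x: mean

def fit_linear(xs, ys):
    """Closed-form least-squares fit of y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    a = cov / var
    b = my - a * mx
    return lambda x: a * x + b

def mse(model, xs, ys):
    """Mean squared error of a fitted model on a dataset."""
    return sum((model(x) - y) ** 2 for x, y in zip(xs, ys)) / len(xs)

def auto_select(candidates, train, valid):
    """Fit each candidate on train data; return the best by validation MSE."""
    fitted = {name: fit(*train) for name, fit in candidates.items()}
    best = min(fitted, key=lambda name: mse(fitted[name], *valid))
    return best, fitted[best]

train = ([0, 1, 2, 3], [0.1, 1.9, 4.2, 5.8])   # roughly y = 2x
valid = ([4, 5], [8.1, 9.9])
name, model = auto_select({"constant": fit_constant, "linear": fit_linear},
                          train, valid)
print(name)  # linear
```

Real AutoML systems search over far larger spaces (feature pipelines, architectures, hyperparameters), but the select-by-validation-score loop is the same.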
2. INCREASED AUTOMATION
The use of AI will increase automation in industries globally. For example, Amazon already uses robots in its operations; they work faster and more accurately, reducing human error and improving the services provided.
While the rise of AI is increasing automation in the processes of our daily lives, more research is required to solve the problems that arise from it. AI may thus eat into the low-skilled job market, but demand for higher-skilled jobs (such as designing AI systems) will increase.
3. AI POWERED/ENABLED CHIPS
AI relies heavily on specialised processors complementing the CPU. Even a highly advanced CPU may not improve the speed of training an AI model, as AI needs extra hardware to perform the complex mathematical computations behind tasks like facial recognition and object detection.
Major chip manufacturers such as Intel, NVIDIA, AMD, ARM and Qualcomm will produce specialised optimised chips capable of executing AI-enabled applications such as computer vision, natural language processing and speech recognition. Hyperscale organizations like Microsoft, Amazon, Facebook, and Google will invest more in custom optimised chips based on field programmable gate arrays (FPGA) and application specific integrated circuits (ASIC) to run modern workloads based on AI and high-performance computing. Some of these chips will also help advanced databases to enhance query processing and predictive analytics. Future applications from the healthcare and automobile industries will rely on these chips to deliver intelligence to end-users.
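A back-of-envelope calculation illustrates why these workloads need dedicated silicon: a single convolution layer in a vision network can require hundreds of millions of multiply-accumulate (MAC) operations per image. The layer shape below is an assumed, typical example.

```python
# Estimate the multiply-accumulate (MAC) count of one convolution layer:
# one multiply-add per kernel weight, per input channel, per output element.

def conv_macs(out_h, out_w, out_ch, in_ch, k_h, k_w):
    """MACs for one conv layer with the given output size and kernel shape."""
    return out_h * out_w * out_ch * in_ch * k_h * k_w

# A 3x3 convolution producing a 112x112x64 feature map from 64 input channels:
macs = conv_macs(112, 112, 64, 64, 3, 3)
print(f"{macs:,} MACs")  # 462,422,016 MACs
```

Multiply that by the dozens of layers in a modern network and thousands of inferences per second, and the case for FPGA- and ASIC-based accelerators is clear.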
4. FACIAL RECOGNITION
Facial recognition is an AI application that identifies a person from their digital image or the patterns of their facial features, and its reliability and accuracy are improving greatly thanks to extensive research. This application is set to grow quickly because identification is non-invasive and deployment is simple. Already, we see examples from Facebook’s DeepFace program, which tags users’ family and friends in photos, to the use of facial recognition in the iPhone X as a digital password. As everything becomes personalised, e.g. shopping and targeted advertising, the ground is ripe for the use of biometric identification.
Some uses will include helping security agencies identify rogue elements, helping businesses provide more personalised service, payment processing by means of security checks, early detection and prevention of crime, and patient identification and follow-up in clinical trials and medical diagnostic procedures.
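At the core of most of these uses is a matching step: a trained network (not shown here) maps each face to a numeric embedding vector, and two faces are declared a match when their embeddings are close enough. The vectors and the 0.8 threshold below are made-up stand-ins for illustration.

```python
# Sketch of the embedding-comparison step in facial recognition.
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: 1.0 means identical direction."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def same_person(emb1, emb2, threshold=0.8):
    """Declare a match when similarity exceeds a tuned threshold (0.8 assumed)."""
    return cosine_similarity(emb1, emb2) >= threshold

# Hypothetical 3-dimensional embeddings (real systems use 128+ dimensions):
alice_photo1 = [0.9, 0.1, 0.4]
alice_photo2 = [0.8, 0.2, 0.5]
bob_photo    = [0.1, 0.9, 0.2]

print(same_person(alice_photo1, alice_photo2))  # True
print(same_person(alice_photo1, bob_photo))     # False
```

The threshold trades off false accepts against false rejects, which is why payment-grade deployments tune it far more conservatively than photo tagging.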
5. INTERNET OF THINGS (IOT) AND AI
IoT has positively disrupted many industries in recent years, and it is merging with AI to increase its useful impact on our lives. Industrial IoT is the top use case and the first to be integrated with AI, for outlier detection, root cause analysis and predictive maintenance of equipment. Deep neural networks will play a key role in improving industrial operations. Similarly, autonomous driving will rely heavily on AI and could help ease daily problems such as traffic congestion around the world.
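The outlier-detection use case mentioned above can be sketched with a simple statistical rule: flag a sensor reading as anomalous when it deviates more than three standard deviations from recent history. Industrial systems typically use learned models rather than this z-score rule; the vibration readings below are made up for illustration.

```python
# Minimal sketch of sensor outlier detection for predictive maintenance.
import statistics

def is_outlier(history, reading, z_threshold=3.0):
    """Flag a reading more than z_threshold std devs from the history mean."""
    mean = statistics.mean(history)
    std = statistics.stdev(history)
    return abs(reading - mean) > z_threshold * std

# Hypothetical vibration-sensor readings from a healthy machine:
vibration = [0.51, 0.49, 0.50, 0.52, 0.48, 0.50, 0.51, 0.49]

print(is_outlier(vibration, 0.50))  # False: within the normal range
print(is_outlier(vibration, 0.95))  # True: possible equipment fault
```

Catching such deviations early is exactly what lets predictive maintenance replace a part before it fails, rather than after.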
AI meets IoT at the edge computing layer, where most of the models trained in the public cloud will be deployed. Advanced ML models based on deep neural networks will be optimised to run at the edge, capable of dealing with video frames, speech synthesis, time-series data and unstructured data generated by devices such as cameras, microphones and other sensors.
IoT is set to become the biggest driver of AI, and edge devices will be equipped with special AI chips based on FPGAs and ASICs.
Data science will be increasingly operationalised, with a corresponding increase in demand for ML engineers. In the years ahead, the focus will be on operationalising data science models and using big data to train them. ML engineers will be crucial to the in-production data pipeline, overseeing data management and computational resources as well as the maintenance of such systems (link).
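A rough sense of what "operationalising" a model means in code: data cleaning, training and serving live behind one pipeline object, so the in-production path can be maintained and monitored as a unit. The class below is a hypothetical toy, with a trivial mean-predictor standing in for a real model.

```python
# Hypothetical sketch of an operationalised model pipeline.

class ModelPipeline:
    def __init__(self):
        self.mean = None  # the "model": just the mean of the cleaned data

    def clean(self, records):
        """Drop records with missing values (None simulates bad sensor data)."""
        return [r for r in records if r is not None]

    def train(self, records):
        """Fit the trivial model on cleaned data."""
        data = self.clean(records)
        self.mean = sum(data) / len(data)

    def predict(self):
        """Serve a prediction; refuse if the pipeline was never trained."""
        if self.mean is None:
            raise RuntimeError("pipeline not trained")
        return self.mean

pipeline = ModelPipeline()
pipeline.train([3.0, None, 5.0, 4.0])
print(pipeline.predict())  # 4.0
```

Keeping cleaning and training in one place is what makes the training-time and serving-time data paths stay consistent, a common failure point in production ML.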
The use of AI continues to increase dramatically through its integration with IoT and machine learning, and as the tech giants boost their AI research efforts. With these 5 trends, AI is poised to further expand its reach in the world, impacting industries and businesses on an international scale while lifting mankind to a new level of productivity.