Coding and artificial intelligence (AI) have made tremendous leaps in the last few years, and that growth is predicted to continue. Some forecasters even predict that by 2030, robots will be able to think and feel for themselves. e-Careers looks at 10 ways that AI is changing technology for the better.
1. Self-driving cars
A self-driving car is most drivers' dream. Imagine getting into your car and being able to have a snooze while it automatically takes you to your destination. Companies like Google, Uber and Tesla have been trying to make that dream a reality by testing self-driving cars in the western United States. We're already in the age of cars with automatic features, including gears and clutches (these have been around for a while now) and automatic parking.
One thing I do fear is that advertisers may be able to pump endless streams of adverts through the speakers and interfaces, like some kind of boxed Spotify free account.
2. Delivery robots
Delivery drones are rapidly becoming the method of choice for delivery companies. In China, JD.com have been aggressively developing their drone capabilities. Instead of delivering directly to a customer's home, the drones fly along a fixed route from the warehouse to a special landing pad, where a local contractor picks up the package and delivers it to the customer.
JD.com are now working on a delivery drone capable of carrying a 1,000kg payload.
It is hoped, and expected, that delivery drones will be used for disaster recovery, aid work and delivering medicines and vaccines to remote regions.
3. Sales and CRM Applications
SoftBank Robotics created Pepper, a shop floor robot designed to enhance the customer experience. The genius of Pepper lies in the way he interacts with people. He is programmed to recognise your voice, facial expression, body movements and vocabulary, using that information to interpret your emotions and offer appropriate solutions and content.
There are over 140 SoftBank Mobile shops in Japan using Pepper, and Nestle have equipped over 1,000 of their outlets in Japan with Pepper too. If you’re looking for somewhere a little closer to home utilising AI for Customer Relationship Management (CRM), Enfield Council in London use IPsoft’s Amelia to provide a more enjoyable, efficient experience.
4. Smart traffic lights
Smart traffic lights combine traditional traffic signals with sensors and AI to route vehicle and pedestrian traffic, providing a better experience with less congestion. Rather than running on set times, the systems being developed monitor vehicle numbers and adjust signal timings in real time. It is believed that smart traffic lights could reduce waiting time in cars by 40%, which could cut CO2 emissions by a staggering 6% - so they're useful for the environment, as well as for impatient motorists (disclaimer: of which I am one).
It is hoped that smart traffic lights will also contribute to the utopian dreams of smart cities that will use information and AI to increase operational efficiency – very Metropolis.
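To make the idea concrete, here is a minimal, hypothetical sketch of the kind of logic an adaptive signal might use: instead of fixed timings, the green time in each cycle is split in proportion to the queue each sensor detects. The function name, the 90-second cycle and the minimum green time are all illustrative assumptions, not details of any real deployment.

```python
# Hypothetical sketch of adaptive signal timing: green time is shared among
# approaches in proportion to the queue length reported by sensors, rather
# than following a fixed schedule.

def allocate_green_time(queues, cycle_seconds=90, min_green=10):
    """Split a fixed cycle among approaches in proportion to queue length.

    queues: dict mapping approach name -> vehicles waiting (from sensors).
    Every approach is guaranteed at least `min_green` seconds.
    """
    total = sum(queues.values())
    if total == 0:
        # No traffic detected: share the cycle evenly.
        return {a: cycle_seconds / len(queues) for a in queues}
    remaining = cycle_seconds - min_green * len(queues)
    return {
        a: min_green + remaining * (q / total)
        for a, q in queues.items()
    }

# Example: a busy north-south road receives most of the 90-second cycle.
print(allocate_green_time({"north-south": 30, "east-west": 6}))
```

A real controller would also coordinate neighbouring junctions and give priority to pedestrians and emergency vehicles, but the proportional-split idea above is the core of replacing set times with sensor-driven ones.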
5. Dangerous jobs
Bomb disposal, the deep sea, and volcanoes – just a few of the places where a robot can do the work of a human, without the mortality risk. The disappearance of Malaysia Airlines flight MH370 prompted authorities to use deep sea robots to scour the ocean floor looking for clues and potential wreckage from the doomed flight.
The water pressure at such depths makes it impossible for humans to investigate up close and personal – enter AI and technology.
6. Medical diagnosis
Have you seen Big Hero 6? Baymax is a "lite" version of what scientists and AI experts are trying to create – and to an extent, they already have. DeepMind Health partnered with Moorfields Eye Hospital to analyse OCT (optical coherence tomography) eye scans, providing faster diagnosis and treatment. Doctors and physicians have also used IBM's Jeopardy!-beating AI, Watson, to assist them in healthcare and help diagnose and treat patients.
How long will it be before we see a real-life Baymax?
7. Home Robots
Amazon have begun work on a domestic mobile robot that will help with household chores. Everyone can benefit from this, but especially the elderly and people with limited mobility. It is also expected that such robots will become assistants that can help people with restricted movement to get out of bed, meaning they will always have someone available to provide aid and support. Quite a heart-warming entry, really.
8. Assistive and cyborg technology
Hugh Herr is an engineer and biophysicist with qualifications and awards coming out of his ears. Before that, he was a rock-climbing prodigy. At the age of 17, he got into difficulty while climbing Mount Washington, where he and a fellow climber were caught in a blizzard for three nights in -29 degree temperatures. Both of Herr's legs had to be amputated below the knee as a result of the incident.
Drawing on his own expertise, Herr developed specialised robotic prostheses, and after months of surgery and rehabilitation he was able to start climbing again using his new limbs, becoming the first person with a major amputation to perform in a sport on a par with able-bodied, elite-level athletes. Herr expects that in the near future, cyborg technologies will help humans push their bodies to physical limits and achievements never reached before.
You can watch Herr's TED Talk, in which he discusses his bionic limbs, here.
9. Facial recognition
Facial recognition has already been implemented at major airports, including London's Heathrow, where the ePassport system scans your passport and your face to make the ID connection using facial recognition technology. It will be similar for other forms of identification and for use in retail; this will save on queuing times and free up customer service agents to carry out other, less… mundane jobs.
This type of technology has also been utilised by Apple in the iPhone X, where Face ID is used to unlock a user's device. Mainstream adoption like this provides a good foundation for the technology's further advancement.
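At their core, systems like ePassport gates and Face ID typically reduce each face image to a numeric "embedding" vector and accept a match when the two vectors are close enough. The sketch below is a simplified, hypothetical illustration of that matching step only; the function names, the toy four-number embeddings and the 0.8 threshold are assumptions for illustration (real systems use embeddings with hundreds of dimensions produced by neural networks).

```python
# Hypothetical sketch of the matching step in facial recognition: compare two
# face "embedding" vectors and accept the match if they are similar enough.
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two embedding vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def is_same_person(passport_embedding, camera_embedding, threshold=0.8):
    # The threshold trades off false accepts against false rejects.
    return cosine_similarity(passport_embedding, camera_embedding) >= threshold

# Toy 4-number embeddings standing in for real, high-dimensional ones.
print(is_same_person([0.9, 0.1, 0.3, 0.5], [0.88, 0.12, 0.31, 0.49]))
```

The hard part in practice is producing embeddings that stay close for the same face across lighting, angle and ageing while staying far apart for different people – that is where the AI comes in.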
10. Space robots
ROBOTS. IN. SPACE. How cool does that sound?
Robonaut is a robot astronaut (admittedly, obvious) that NASA developed to work alongside astronauts, carrying out tasks that require more dexterity than current space robotic systems which are designed to work with larger objects.
NASA are using the R2 model of Robonaut to explore its capabilities for possible deep space missions.
Have you heard of any of these ways AI is changing technology? Have you seen any of them at work?