It’s silly season as far as robotics innovation goes. At this year’s CES (the Consumer Electronics Show), a new robot was unveiled whose sole purpose is to bring you a roll of toilet tissue if you run out.
Perhaps unsurprisingly, the novelty robot was developed by a toilet paper brand. Other robots unveiled recently include one that plays ping pong and a companion robot for your dog.
But gimcracks and gewgaws like these aside, we’re actually in a new era of collaborative robots – or cobots, if you prefer. These particular types of AI are designed to work alongside humans to enhance the productivity of both. Rather than acting as servants delivering toilet rolls and the like, these machines are there to extend what humans can do.
Human enhancement
The most obvious application is a robot arm that mirrors the movement of a human operator – albeit with greater strength and resilience. Technology like this can enhance human capabilities while compensating for human fragility, meaning people can operate their robot avatars in environments of extreme heat, pressure or danger while watching from a safe distance.
It means operators can lift heavy machinery with greatly reduced risk of injury and the associated loss of working time. Cobots’ ability to enhance existing human capabilities means the technology offers a huge opportunity to increase productivity, boosting profits and GDP.
It’s arguably also the way for robotics to expand without massive detriment to human employment. In fact, opportunities to work in harmony with robots could bring new and exciting employment prospects to all kinds of workers. Cobots could help reduce workplace injury and mean that physical strength and resilience are less of a requirement for employment in some roles. They could also open up possibilities for working in challenging environments, such as underwater or in space.
Communication is key
For maximum productivity, it’s vital that humans and collaborative robots learn to communicate effectively. Many cobots are likely to be operated by specialists in fields such as heavy manufacturing, medicine and hazardous industries.
But there’s also a great deal of scope for cobots to serve non-specialists and help us in our everyday lives, such as assisting the elderly in their homes, serving the domestic cook, or helping travelers negotiate unfamiliar environments such as airports and foreign cities.
If humans and robots are to collaborate effectively, they need a relationship that’s managed through language. For the technology to really take off outside specialized industries, this communication probably needs to happen in human language, i.e. natural language. If cobots can function effectively using natural language then we’ll reap the best rewards from this technology. It’s about engaging in discussion rather than just giving orders.
True innovation in robotics – and the point at which the technology becomes seriously useful – occurs when AI can operate outside of pre-programmed pathways. A robot can certainly be taught to do a predictable and repetitive task – in fact, many are already doing this.
But it’s much harder to teach them to innovate outside their pre-programmed pathways. Communication with human operators could be the key to using robots in a broader range of situations than can be anticipated at the programming stage.
For this reason, technological progress in cobots is in large part about advances in natural language processing. We’ve already explored the many challenges of communication between humans and AI technology, particularly when you factor in different accents and dialects.
READ MORE: The Language Inequality of Chatbots
Collaborative robots will need to learn to adapt to the many quirks of human language in order to communicate effectively. But effective communication is about more than just understanding a broad local accent. Cobots also need to be taught brevity: how to distil the large volumes of data they handle into just the information a human needs from an encounter with them.
Take the example of autonomous robots that can move independently around a complex building, working out their own routes. They figure this out through calculations involving distance from walls, map co-ordinates and the most direct path, as well as factors such as remaining battery power.
But if a human asks for directions along the same route, the language response required is very different from the data the robots themselves use to manoeuvre. Essentially, the programmer’s job is to teach robots to understand human requirements as well as human language.
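To make that gap concrete, here is a minimal sketch (the `RouteStep` structure and `describe_route` function are hypothetical illustrations, not any real robot’s API) of how a planner’s numeric route data might be reduced to a one-sentence set of directions a visitor can actually follow:

```python
from dataclasses import dataclass

@dataclass
class RouteStep:
    """One leg of a planned route, as the robot's navigation stack sees it."""
    heading: str       # e.g. "straight ahead", "left", "right"
    distance_m: float  # metres to travel on this leg
    landmark: str      # nearest named feature on the building map

def describe_route(steps: list[RouteStep]) -> str:
    """Collapse the planner's numeric route into short spoken-style directions,
    dropping the co-ordinates, wall distances and battery data a visitor never asked for."""
    phrases = []
    for i, step in enumerate(steps):
        opener = "First" if i == 0 else "then"
        phrases.append(
            f"{opener} go {step.heading} for about {round(step.distance_m)} metres, past the {step.landmark}"
        )
    return ", ".join(phrases) + "."

route = [
    RouteStep("straight ahead", 12.4, "reception desk"),
    RouteStep("left", 30.1, "lifts"),
    RouteStep("right", 8.7, "meeting room corridor"),
]
print(describe_route(route))
# First go straight ahead for about 12 metres, past the reception desk,
# then go left for about 30 metres, past the lifts, ...
```

The design point is the filtering, not the phrasing: the robot keeps its full numeric model for navigation, but the human-facing answer discards almost all of it.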
A similar challenge would be getting a robot GP to explain a diagnosis and give health advice in a way a patient without specialist medical knowledge can understand. If we can teach robots to do this, it frees them to speak to patients directly. If we fail to impart this ability to speak a layman’s language, we’ll relegate robots to use within the professional sphere.