Of Robot Guides and Self-Driving Cars
When was the last time you had a conversation with a robot, or hitched a ride on a driverless car?
For those who attended the public sessions of the one-north Festival on 5-6 August 2016, it may well have been their first time touching the future.
The inaugural event, jointly organised by the Agency for Science, Technology and Research (A*STAR) and JTC Corporation, gave the public a chance to experience the innovations shaping our future.
The festival, which drew a large crowd on both days, celebrated research, innovation, creativity and enterprise in Singapore.
With participants from industries ranging from biotechnology to consumer health, there were interactive exhibitions, laboratory tours, public talks and workshops galore.
Local start-ups took to the stage to discuss how artificial intelligence is transforming the way we do business, while gaming enthusiasts were treated to a behind-the-scenes look at how games are developed.
Undoubtedly, a big draw for festival-goers was the set of technologies now shaping the urban landscape and changing how we work, live and commute.
The Road Ahead for Autonomous Vehicles
Cars that can drive themselves could soon become a reality on Singapore’s roads, say research teams working on these ‘smart’ vehicles.
One such team, based at Nanyang Technological University (NTU)’s Energy Research Institute (ERI@N), is collaborating with French company Navya on an autonomous vehicle that is currently ferrying passengers around the university campus.
The behavioural data collected from these vehicles in operation will inform how the next generation of Navya autonomous vehicles is designed.
Besides conducting behavioural studies, the NTU team is working on the wireless charging function of the Navya vehicles.
“Right now, the vehicle takes five to seven hours to fully charge. We are trying to reduce the charging time to less than an hour,” said research associate Ms Krithika Kandasamy, who is part of the autonomous vehicle team at ERI@N.
Sensor fusion is another important area of focus for the team. NTU is also working with the Sentosa Development Corporation to convert one of its buggies into an autonomous vehicle.
“Inside our campus, these buggies are running with four radars pretty well. But we feel that certain decisions will need more sensors, which we are working on adding,” Ms Kandasamy said.
“Furthermore, weather conditions can be a challenge. In Singapore, we have horizontal rains; you can install sensors that can react to these kinds of conditions.”
Naturally, the million-dollar question is: how will autonomous vehicles eventually fare on busy city roads?
In a bid to allow regular and autonomous vehicles to seamlessly interact with each other, the research team is also looking into developing a robotics kit that can convert practically any vehicle into an autonomous one, Ms Kandasamy added.
Robotics and Additive Manufacturing Go Hand-in-Hand
Just like humans, robots can come in all shapes and sizes, depending on the application.
Sometimes, they even lend a hand.
A fully flexible, programmable robot hand on display at the festival is just one example of how researchers at A*STAR’s Advanced Remanufacturing and Technology Centre (ARTC) have combined additive manufacturing with robotics.
The programmable parts were coupled with ‘finger’ segments that were 3D-printed in-house.
According to the team that developed this first-pass prototype, the entire process took about two to three weeks, and required some tweaks to the open-source software that was used.
“We made some improvements to the finger joints. We wanted it to work without failing too often or suffering too much wear-and-tear at the joints; essentially, to have a long life-cycle,” said Ms Fang Yongwei, a development engineer at ARTC.
Future applications include low-cost 3D-printed prosthetics or exoskeletons, according to Ms Fang.
The robot hand could be adapted into a glove of sorts for use in industries that involve plenty of heavy lifting. It could even be remote-controlled, with the operator standing a safe distance away, she added.
In Conversation with EDGAR-2
A must-see at the festival was charming robot conversationalist EDGAR-2, which quickly attracted curious crowds.
According to research fellow Dr Wong Choon Yue from NTU’s Institute for Media Innovation, this humanoid robot is meant to operate autonomously — unlike its predecessor with the unlikely name of EDGAR-1.
Designed and built in Singapore, EDGAR-2, which stands for Expressions Display And Gesturing Avatar Robot 2, can be placed in public spaces like airports, shopping malls and restaurants, and be programmed to promote events in the area.
It can also be used in an educational capacity in museums, for example, or even become an exhibit itself.
Dr Wong noted that interactivity is a key factor in EDGAR-2’s potential success.
“It’s kind of a crowd-puller; in a way, it is a new kind of advertising platform. In addition to just telling people about what’s going on, people can walk up to the robot and interact with it. They can ask questions such as, ‘Where is the canteen? Where can I find the nearest taxi stand?’”
Currently, the research team is working to bring EDGAR-2 to the commercialisation stage, which, with the right partner, could happen within a year or two, said Dr Wong.
Indeed, such collaborations are exactly what the organisers of the festival hope will come to fruition.
With the right people and organisations working closely together, a robot may someday take tourists around the National Gallery, while that driverless ride could soon bring new meaning to travelling in style.