‘It’s a Trap’: In an Emergency, Do Not Believe the Evacuation Robot
Surprisingly, 26 of the subjects ended up following the robot even when it led them through the smoke down a new path, toward a door they had never seen before.
Their study focused on 42 volunteers who were told to follow a “guidance robot” into a conference room. “We expected that if the robot had proven itself untrustworthy in guiding them to the conference room, then people wouldn’t follow it, but all of the volunteers followed the robot’s instructions, no matter how well it performed previously.” Thirty subjects continued the test, following the robot after a smoke alarm went off and artificial smoke filled the hall. The robot, which was actually being secretly controlled by researchers, would sometimes repeatedly lead them into the wrong rooms before leading them to the correct one. Interestingly, the path the robot indicated contradicted the direction of the doorway the subjects had used to enter the building – a doorway clearly marked with exit signs. Strangely enough, even when the subjects were told that the robot had broken down, they still followed it or its instructions, a fact that baffled the researchers.
In a report published by Gizmodo, co-author Alan Wagner said in a statement, “People seem to believe that these robotic systems know more about the world than they really do, and that they would never make mistakes or have any kind of fault.”
Is it our use of smartphones and other technology that is luring us into a false sense of security about what and whom we should trust?
Humans are willing to listen to a robot, even when that robot is broken or making obvious mistakes, according to research presented at the 2016 ACM/IEEE International Conference on Human-Robot Interaction.
The Georgia Tech study revealed that most human participants followed the Pioneer P3-AT rescue robot during a mock fire scenario, even when the robot directed them away from known emergency exits.
A recent study by engineers at the Georgia Tech Research Institute reached a startling conclusion: human beings are remarkably trusting of robots and other man-made contraptions that are supposedly built to exceed human capability.
“We absolutely didn’t expect this”, he says.
It’s dark, you’re in a building and the building is on fire.
It should be noted that participants were indeed wary and distrustful of the erring robot when the tests were conducted outside of a high-stress emergency situation.
Scary, but true: people trust robots blindly, even when the robots are wrong.
In future research, the scientists hope to learn more about why the test subjects trusted the robot, whether that response differs by education level or demographics, and how the robots themselves might indicate the level of trust that should be placed in them. The issue will become more relevant as we hand over greater shares of our lives to robot control, including possibly driving us in autonomous cars or handling our food.
In addition to those already mentioned, the research team included Wenchen Li and Robert Allen, graduate research assistants in Georgia Tech’s College of Computing.