Mechatronics Art: Interface
Throughout my life, I’ve been fond of robots. Just as many people grow far more emotionally attached to the dog in a movie than to its human characters, I feel the same way about robots. Below are a few clips of robots who, despite their good intentions, are punished. WARNING: the content is violent and saddening.
The piece: Meet AVL-4274
This piece envisions a future in which the issues of mass incarceration, global warming, and the ethics of artificial intelligence have been exacerbated to the point where they intersect. AVL is a robot who was part of a farming robot colony, but malfunctioned and ended up destroying a few crops. AVL is extremely fond of plants - they’re all he cares about. Unfortunately, due to his glitches, he isn’t able to take care of them well. Engineers have ceased to exist in this world; only unempathetic technicians remain, caring solely about efficiency. Outdated notions of corrections, built on raw imprisonment, still persist. So AVL was never fixed; he was simply locked up. Despite his good intentions, despite all his potential to sustain a dying world, he was locked up and forgotten. Now AVL watches whatever plants he can from within his cage. He monitors them, but he’s chained up and can’t water them. He’s grateful for the kind souls who water the plants, whether that’s the clouds above or a passerby - and far more grateful if that passerby shows empathy.
This story isn’t far-fetched. Over two million human beings are incarcerated in the United States, with a national recidivism rate of over 50%. Ours is a society that throws away what doesn’t work, even when fixing it would take less energy. That mentality applies directly to global warming, and to how our throwaway culture has polluted this Earth. It seems inextricable from our nature; perhaps robots are the only beings capable of absolutely pure intentions of maintenance and sustained care. The ethics of imprisonment remain deeply contested, and the ethics of intelligent robots have not been established at all; it would be unsurprising if the latter simply inherited the former’s philosophy: lock robots up instead of fixing them. Empathy is the key factor, the ultimate solution, to all of these issues; it’s in short supply today, so perhaps AVL can demonstrate what it’s like to see through a perspective of pure, sustained, focused, good intention behind bars.
The goal of the piece is to take empathy to its furthest extent: literally putting you inside the robot software’s perspective, and fixing the robot’s glitched software through empathy. Ever since I began practicing fundamental engineering principles, I’ve believed that empathy is the basis of every good engineering solution - complex problem solving requires understanding the perspectives of multiple stakeholders. It’s honestly what the world needs more of; unfortunately, given the issues described above, it’s in short supply. This interface implements empathy to the best of its ability. Ideally, I’d have some kind of transfer-of-consciousness-like technology that would allow motor neurons to control the robot and electrodes to stimulate sensory neurons (like retinal neurons, to see what the robot sees). Unfortunately, that level of empathy is far in the future. An immersive virtual reality interface like this one does its best to simulate empathy; the hardware connection between the VR game and the electronics in the cage allows the VR environment to affect the real world, making the simulation far more convincing.
As a sidenote: this robot was definitely inspired by these pure, innocent characters:
Wall-E - taking care of the plants and looking super cute
Chappie - creating a transfer-of-consciousness-like interface
BT-7274 - having pure intentions (3 protocols) and a cool name
The head is an old black-and-white TV, which connects to a camera mounted on top of the cage. The camera’s direction is controlled by a servo motor, and there’s a soil moisture sensor in each plant. The camera pans from plant to plant, “scanning” each one (reading its moisture level) and outputting its status (red = dry, yellow = good, green = currently being watered). The moment a plant is being watered, the camera turns to it and expresses joy (through LEDs or otherwise, in a future implementation). The code avoids blocking delay() calls, instead using the moisture readings themselves to pace the scan - a polling approach that services all the sensors in parallel without requiring interrupts.
The virtual reality interface is implemented with a Microsoft Mixed Reality headset, a VR-ready laptop, Unity, and the Mixed Reality Toolkit for Unity (on GitHub). The hardware connection is a serial link between Unity and an Arduino driving a servo. After the game asks the user to complete tasks to fix the robot, it instructs the user to open the cage by touching a handle on the back gate; when that handle is touched, a serial message telling the Arduino to open the cage is sent, and the Arduino turns the servo to let the gate fall.
(plus an additional servo motor, on another Arduino, exactly as wired below)
Code for VR Interface
Kudos to the Microsoft Mixed Reality Toolkit for Unity, and to the creators of the various free, non-commercial 3D models found online.