Atlas, built by Boston Dynamics, is a humanoid robot with an extraordinary talent. Based on video demonstrations that Boston Dynamics has posted online, Atlas has gone through several iterations over the past decade, with the current model standing about five feet tall. Atlas has two legs, two arms, a human-shaped body, and head-like sensors mounted on top, such as cameras that act as eyes and lidar, which uses laser light to help Atlas sense his position in space. What makes Atlas so astonishing is that he can walk and run like a human. Star Wars’ C-3PO shuffles along with a stiff gait, reminding us that at best he is just a mechanical man, whereas Atlas walks fluidly, swinging his arms back and forth. He can walk on uneven terrain and on different materials such as soft earth, snow, and slippery surfaces. Like a human, Atlas shifts his body weight and may look ungainly, but he remains upright and moving. It is truly uncanny watching him walk to a door, push it open with his hands, and walk through it just as a human would. Atlas climbs stairs whether the steps are evenly spaced or set at odd angles and different heights. The robot can also do backward and forward flips from a standing position and can jump higher than most humans.
Who said you can’t keep a good robot down? Videos show Atlas being tripped or pushed to the ground by his programmers; he then pulls himself up onto his knees before standing upright again. This scene made me appreciate that I was beginning to attribute human-like characteristics to this mechanical marvel, such as using the pronoun he instead of it and feeling that his engineers were acting cruelly by purposely tripping him, albeit a necessary step in improving his programming.
According to Wikipedia, the earliest Atlas model was created in 2013. I like that word, created, as, like a human, Atlas has a birth date. This early version was tethered by wires to an external power supply and to a computer for uploading and downloading data. Since then, Atlas has become autonomous, running on battery power and apparently connected by Wi-Fi to the mainframe where his programming is written. In their latest video, two Atlas robots dance in better unison than two Rockettes and even balance on one leg, with the other foot lifted as high as their heads, something most humans can’t do.
To appreciate how far Atlas has come in just the past seven years, it helps to go back two to three million years in human evolution. Our hominin ancestors looked more chimp-like than like Homo sapiens. Their distinguishing feature was the ability to stand and walk upright. If you saw Lucy, the Australopithecine, whose fossils date back about 3.2 million years, you would see a small, ape-like creature, with hands longer than her feet, but walking erect like a modern human. Bipedal ambulation appears to have evolved when some of our primate ancestors came down from the trees in search of food. Monkeys use their feet and tails much like hands to grab branches as they traverse the treetops. Anthropoid apes such as chimps and gorillas have retained some of these features; however, when they walk, they also use their hands to maintain equilibrium. A running chimp looks more like a horse than a man, as it uses its hands to help stabilize its body. Thus, standing upright and using just our feet for locomotion is something that took human evolution millions of years to accomplish, compared to Atlas’s seven years of intense computer programming.
My wife and I have vivid memories of when our daughter first learned to walk. I was in grad school and was putting up shelving in a spare room to use as my study. My wife was organizing our books, which were spread all over the floor. My daughter, barely ten months old, was apparently attracted by the colorful book covers, and seeing one she liked, tried to crawl to it but couldn’t because the other books blocked her path. So she stood upright and took her first tentative steps. The physical maturation needed to walk takes time. Other than squirming, newborn human infants can’t even roll over on their own. It takes between ten and fourteen months for most humans to develop the bones, muscles, and nervous system that allow walking to occur. And like a robot, it takes programming and practice. Much of the programming is in the form of DNA, but humans also need to learn, such as crawling before walking. Atlas’s maturation has taken a similar course: his physical maturation has included changes in the shape of his limbs and feet, and his neurological maturation has occurred through trial and error, in which his programming is modified to overcome failures.
Despite these similarities, there is one powerful difference between the maturation and learning needed to accomplish the physical movement Atlas displays and how we organic creatures do it. Every human is a unique being, so learning to move about is singular for each of us, despite the genetic coding that supports the process. Not so for Atlas and his successors. Once learned, the coding can be installed in any robot of similar design. As long as the robots share the same mechanical features, Atlas 2 through Atlas 2000 will be able to do everything Atlas has learned, without the learning curve.
One goal of this programming is proprioception: the ability to integrate movement with sensory input so that a robot knows the position of its own body and its location relative to its environment and other objects. This is an area where a robot can exceed human abilities, as robots are capable of seeing and hearing over a broader range of wavelengths and frequencies.
From the perspective of robotics, the programmers at Boston Dynamics are developing an autonomous nervous system. Just as our body functions to support the brain, which is the seat of human consciousness, a robot’s body will provide its computer brain a means of mobility. Thus Atlas fulfills two purposes. First, as currently built, an Atlas-type robot could in the near future supplant humans in a variety of jobs requiring preprogrammed, repetitive physical movement, such as the work done in an Amazon “fulfillment center”. Second, coupled with its own “AI brain”, Atlas’s body could help attain the goal of a truly autonomous being, possibly possessing its own conscious thought and self-awareness.
Ultimately, how robots are used will depend on whether they are preprogrammed for repetitive, predetermined multistep tasks or, if given artificial intelligence, granted activities with a higher level of autonomy. Health care settings provide an environment where both approaches will have a significant impact on the delivery of care.
The term robot originated with the 1920 play R.U.R. (Rossum’s Universal Robots) by Czech writer Karel Čapek. When we think of robots, we tend to visualize the skeletal frame of the T-800 from the Terminator movies, C-3PO from Star Wars, or Boston Dynamics’ Atlas. However, in R.U.R., the robots are actually artificial biological organisms (synthetic humans) grown in factories. In the play, the robots are bred to free humans from the drudgery of manual labor. Similarly, in the coming decade, we’ll see robots in the health care environment working at jobs that require mobility and involve repetitive tasks. For example, we’ll most likely see humanoid robots like Atlas work in hospital materials and supply departments, not only loading and unloading trucks but also delivering supplies throughout the hospital. Robots will also make excellent housekeeping workers, programmed to clean and sanitize patient rooms. Using infrared vision, these cleaning bots will be able to see spider webs in the corner of the ceiling that need sweeping, or they may even be able to use heat signatures to spot areas of bacterial contamination that need disinfecting. And they’ll be able to enter areas where patients are highly infectious without fear of contracting the illness.
Pharmacy tech bots will fill prescriptions and, in hospitals, deliver medications to the nursing station or even to patient rooms. Depending on the programming, a physical therapy assistant bot could help patients with their exercises, and nursing assistant bots will transport patients within the hospital, such as wheeling them from their rooms to the radiology department for tests.
With advanced programming and some A.I. to help them adjust to different situations, robots should, by the end of the 2020s, be able to complete some manual tasks now done by nurses, such as taking patients’ temperatures and blood pressure, giving shots, inserting IVs and catheters, and assisting doctors in the operating room. In nursing homes, CNA bots will help patients with activities of daily living. This will greatly reduce the risk of injury that human CNAs face when lifting patients, and it will help nursing homes confront a growing shortage of CNAs.
The incentive for using robots in place of people is primarily financial. Robots don’t need benefits such as paid vacations and sick leave, retirement pensions, or health insurance, which can account for 20% of the cost of a position. There is a one-time cost for purchasing the robot and probably an annual maintenance contract with the robot’s manufacturer, but otherwise they don’t earn a salary, so in the end they will cost much less than maintaining a large human workforce. And robots won’t complain when they are asked to work overtime or take another shift.
So far, I have described how humanoid robots will be used to replace humans at primarily manual tasks. But it isn’t just the hardware that is being developed. Artificial intelligence, whether limited to a computer or installed in a robot, will eventually be employed at more advanced tasks. For example, the manufacturer of the da Vinci surgical system claims it can improve the accuracy of a human surgeon’s hands and eyes. If so, wouldn’t the surgery be even more accurate if guided by the sensors and hands of a robot? Or consider that during the 1990s, tests were run using disease diagnostic algorithms to see if a computer could diagnose patients based on verbal descriptions of their problems. The computer-based analyses correlated highly with human doctors’ diagnoses. Thus, robots with A.I. may not replace physicians, but they will be used to augment their roles.
In The Case for Universal Health Care, I included a section on the impact of technology in medicine. Technology does not always lower costs or improve clinical outcomes. However, robots clearly have the potential to do both.
Many people point out that working conditions in Amazon “fulfillment centers” leave much to be desired. I believe one reason Jeff Bezos, the founder of Amazon, does not provide better working conditions and benefits for his employees is that he recognizes that by the end of this decade most of those employees will be replaced with robots not dissimilar to Atlas. If that’s the case, other businesses, including health care facilities, will join the move toward using robots to complete many manual, repetitive tasks, and as a result, America’s workforce will look radically different from today’s.