
Even here in Los Angeles, reputedly the land of perpetual summer, the days are not entirely immune to the Earth's annual migration around the sun. Recently all the usual symptoms have been appearing: the sun is setting later, clocks inexplicably advance an hour, flowers start peeking out among the palm trees. The rose bushes alongside my house have burst gloriously into bloom. Yes, spring has arrived once again, and with it come all the usual signs of renewal and rebirth: singing birds and pastel eggs, marshmallow peeps and decorations with rabbits, and of course cyborgs. Say what? OK, I'll admit that cyborgs are perhaps not exactly traditional harbingers of spring, but for that matter, when was the last time you saw an actual rabbit delivering eggs? In this season of new beginnings (with apologies to any readers in the southern hemisphere), let us peek into some laboratories and see what's up with the art and science of seriously rebuilding humanity. Hollywood may have killed off the Bionic Woman after only half a season this time, but out in the real world, you might be surprised by how many bionic neighbors you could already have.

The term "cyborg" itself was coined back in the 1960s, as part of a study on combined human-machine systems for space exploration, but the idea of merging technology with the human body extends much further back. At the dawn of science fiction, Frankenstein's monster comes immediately to mind, but there are other examples too. For instance, Edgar Allan Poe described in his short story "The Man That Was Used Up" a war hero who was rebuilt out of mechanical parts and prostheses, all the way back in 1839. Poe's story layers together several themes, from political satire of a possible presidential candidate to a sarcastic deconstruction of the idea of military manhood, but among them is a clear suggestion that the advance of technology will blur the distinctions between man and machine. Similar arguments continue to be advanced today, with some critics arguing that such technology is necessarily a dehumanizing influence, or that any technological enhancement must come at the cost of losing some essential human attributes. But there are also those, such as Ray Kurzweil, who argue that only through cybernetic technologies can we reach our full potential, and that the new opportunities of biotechnology and brain-computer interfaces do not require the sacrifice of any fundamental human nature.

Robot Hands for Skywalker: Cybernetics in the Hospital and Beyond

Of course the practical reality, in today's labs and hospitals, is not much concerned with such philosophical conundrums. Indeed, until recently most biocybernetics research has sought not to build superhumans, but instead to restore normal human abilities to those disabled by disease or trauma. Which is more of a technological miracle: upgrading a healthy person into a super-strong bionic fighting machine, or enabling amputees to walk and the blind to see?

Parts of this technology have already passed into the prosaic: today one would hardly think of a hearing aid as a bionic enhancement, nor a pacemaker, yet those two devices in their modern form date back barely 50 years. The transistor's invention enabled the first practical, portable pacemakers and hearing aids, replacing the massive, refrigerator-sized initial prototypes, and unsurprisingly the continued march of Moore's Law has allowed ever more sophisticated versions. Today, the state of the art in hearing restoration is the cochlear implant (even called the "bionic ear" by some), which applies direct electrical stimulation to auditory nerves to restore a sense of hearing to the completely deaf. I highly recommend Michael Chorost's provocative and moving memoir Rebuilt: How Becoming Part Computer Made Me More Human, in which Chorost narrates his own journey from deafness back to hearing, with the aid of a computer inside his skull. For Chorost, despite the odd side effects (such as magnets sticking to his head . . .), becoming a cyborg provided a way to regain a more normal existence after losing his hearing, to reconnect more easily to the rest of society. Yet not everyone thinks that opportunity is necessarily the right choice. In a fascinating example of technological culture shock, the cochlear implant has reignited debates in the deaf and Deaf communities about the preservation of Deaf culture, whether hearing is fundamentally desirable, and what the ethics are of surgically installing such devices in children below the age of consent. The ability to cure deafness may indeed be a miracle of high technology, but perhaps it's a miracle embedded in complex and murky moral quandaries.

Nor has blindness been neglected. Today there are well over a dozen research groups with varying kinds of artificial eyes currently in lab tests or early clinical trials. The most common approach involves electrical stimulation of the retina, achieved by carefully threading electrodes around to the back of the eye where they can interface with retinal neurons. However, some artificial eyes opt for a deeper connection, directly to the brain via a host of truly science-fictional technologies such as percutaneous pedestals (read: computer jacks in someone's skull) and wetware (miniaturized electronics embedded within the brain, with electronic tentacles extending throughout the patient's visual cortex). The Dobelle Institute's Artificial Eye made headlines a few years ago, when it enabled a blind man to drive a car (albeit slowly) guided by a digital video camera plugged into his brain. We're not quite yet at the level of Geordi La Forge's VISOR—but I suspect that we will be, long before the 24th century has arrived.

Similar strides have been made with artificial limbs. America's misadventures in the Middle East have tragically resulted in thousands of lost limbs—which today translates into thousands of new computerized prosthetics. One of the more high-profile developments in recent years has been the so-called "C-Leg", a computer-controlled knee and lower leg which allows above-the-knee amputees to walk, bike, and even Rollerblade. The even newer "Rheo Knee" features a full artificial intelligence system that learns and adapts to the wearer's motions in real time, adjusting stiffness and position a thousand times a second to automatically match the patient's sound leg. Currently on the drawing board are robotic ankles incorporating artificial muscle polymers to allow full natural motion, such as stair climbing, under wireless control from embedded sensors in the amputee's nervous system. Future "biohybrid limbs" will blur the lines between man and machine even further, integrating prostheses directly onto a patient's skeleton and potentially regenerating skin and neurons over a mechanical framework. The required neural interfaces are already under development, with perhaps the most advanced today being Cyberkinetics' BrainGate. In one of the first demonstrations, a quadriplegic controlled a computer with his thoughts and moved a robot arm and hand as if it were his own limb. Similarly, a New England man who lost both arms in an accident now has a bionic replacement arm, controlled by his own nervous system, rerouted through a computer on his chest.

Gentlemen, We Can Rebuild Him Better: Bionic Enhancement

But where does repair stop and enhancement begin? If you read any news story about C-Legs, chances are you'll come across a quote from some former soldier along the lines of "I'm even more active and athletic now than I was when I had two legs!" Technology can already allow a full range of normal motion, and there's no reason to suspect that advances won't continue even further. You may have read about Oscar Pistorius, the double-amputee runner with carbon fiber springs for legs. After blowing away the competition in the sprint events at the Paralympics, Pistorius was banned from competing in the regular Olympics because of fears that his artificial limbs give him an unfair competitive advantage. Indeed, in some sense they probably do, being lighter and springier than regular legs. Maybe we really are edging toward a slippery slope of human enhancement in athletics. It doesn't seem too likely that any runner would voluntarily have his or her legs amputated to get a carbon-fiber competitive advantage—but is it so inconceivable that, say, a distance runner might opt for hollow titanium bones for lightness, or a boxer could get extra cushioning installed around his brain? In some ways, the first confused wave of bionic athletics is already upon us, in the form of the unending succession of doping scandals. The original definition of the word 'cyborg' didn't refer only to purely mechanical enhancements of a human, but to a combined system of machine and man, working together to regulate the body in an artificial state—and by that definition, isn't an athlete who uses a synthetic chemical to change his body's state in some sense a cyborg?

Of course, there are other reasons one might seek out technological superpowers beyond the quest for athletic gold. The ongoing successes in artificial vision could extend to other wavelengths without much difficulty—and who wouldn't want X-ray vision? Meanwhile, some body modification artists have already opted to give themselves a sixth sense: direct awareness of electromagnetic fields through tiny magnets embedded within their fingers. Want to be able to feel your hard drive spinning up and sense where electrical wires are hidden inside walls? Apparently all it takes is the right kind of rare-earth magnets and a willingness to undergo unsanctioned and unorthodox surgery without anesthesia.

Or, if that's not weird enough for you yet, how does this sound: a functional cell phone embedded in one's arm, with a pixellated tattoo touchscreen for a display and a power supply that draws energy directly out of the wearer's own bloodstream. For now that particular upgrade remains just a conceptual design, not a real commercial product, but the bloodstream bio-power supply is a real device, one recently developed to provide an inexhaustible energy source for pacemakers and other embedded biochips.

But the killer application (literally!) for cybernetic enhancement is probably still super-strong fighting machines. No self-respecting 21st century military should be content merely with futuristic weaponry like anti-satellite missiles and microwave pain beams. What we need are super soldiers! Well, OK, perhaps need isn't the right word, but it appears that it may well be what we're going to get, given DARPA's continued funding of quite a few projects developing powerful robotic exoskeletons for soldiers. One of the first versions, the BLEEX system developed by engineers at UC Berkeley, was a legs-only exoskeleton designed to let soldiers easily carry massive backpacks with very heavy loads. Meanwhile, SARCOS Robotics (a subsidiary of Raytheon) plans to deliver its first shipment of a dozen prototype full-body exoskeletons to the U.S. Army this year. Think about that for a second: the army is already buying real live robotic exoskeletons to give soldiers super strength. You can even watch a video showing one of our future cybernetic overlords on YouTube. And if that's not enough to get you slightly alarmed, it gets even better: there's an actual company called Cyberdyne with a robotic exoskeleton of its own. (For double science-fiction-villain-reference bonus points, their exoskeleton is named HAL.) The rest of the country may have thought we were crazy here in California when we elected Arnold Schwarzenegger, but actually we've been planning long-term: we'll need the Governator to protect us from the coming uprising of cyborg supersoldiers!

More seriously, though, the lab developing BLEEX was just across the road from my former office, and last summer I spoke with one of the graduate students there about the project. His perspective was that the computer motion control is a solved problem, now easily capable of applying tremendous forces and supporting huge weights, while carefully synchronizing with the human's motions inside the suit. The biggest outstanding challenge is powering such suits. Battery power yields very limited lifetimes, while hydrogen fuel cells aren't yet up to surviving in rugged military conditions. (And the above-mentioned blood-based fuel cell can't deliver anywhere near enough oomph.) The best solution right now remains gasoline engines, meaning that our supersoldiers currently must sneak through the woods with all the subtlety of a guy with a lawnmower strapped to his back.

(And speaking of robots walking loudly through the woods, it appears that future cyborg soldiers may be accompanied by robot quadrupeds as well. If you haven't yet seen the recent videos of Boston Dynamics' uncanny robot BigDog, it's well worth a look. Scale one of those up a few times larger, and we'd be well on our way to a functional AT-AT walker. Now we just need to make sure none of our enemies ever develop snowspeeders with harpoon guns to take them down . . . Except given how well BigDog recovers from being kicked and slipping on ice, it seems like its designers may already be well ahead of whoever Darth Vader hired!)

Welcome to the Matrix: Neural Interfaces and Brain Upgrades

So, we can repair the deaf, blind, and lame, let a double amputee run faster than probably 99% of humanity, and are in the process of developing super-strength exoskeletons that could one day potentially be miniaturized into true bionic superhero limbs. Pretty impressive stuff. But I suspect there are readers out there whose first thought on reading about computer jacks into one's brain was "where can I get one?"

Most brain-computer interfaces thus far have only been used in desperate cases, like the above-mentioned sight to the blind and robot arms for the disabled, but as the technology becomes more mature all sorts of other possibilities beckon. Imagine being able to download new skills to your mind, from cooking to Kung Fu. Imagine race car drivers whose vehicles literally are an extension of their nervous systems, responding intuitively in a fraction of a second. Imagine controlling your household appliances and robot servants at the speed of thought. Perfect recall, enhanced cognition, instant mental access to Wikipedia: it's all very tempting. Many of these possibilities remain a distant dream, but others are closer to reality as scientists worldwide seek out steps to a better brain. Already there are debates starting within academia about whether it is ethical for researchers to use cognition-enhancing drugs to boost their research careers. Prominent scientists like Vint Cerf predict a continued blurring of the line between people and technology, leading to fully functional brain-computer interfaces perhaps as soon as 2020. Perhaps this enhancement of humanity will lead to an enlightened new world, as argued by the Cyborg Democracy project. Perhaps it will lead to an incomprehensible acceleration of progress leading up to a transformational Singularity, as predicted by Vernor Vinge and others. Or perhaps basic human nature will prove more resilient than all that, and life will go on much like it does today, with cyber-intellects leading normal everyday lives, working and playing and falling in love amidst a glittering augmented reality.

In the nearer term, there are ways to interface computers with the mind that do not involve brain surgery. So-called "noninvasive" brain/computer interfaces use scanning techniques such as EEG or fMRI to track brain activity in real time. Noninvasive BCIs have been demonstrated for manipulation of robot arms and control of avatars in virtual reality. More recently, a noninvasive BCI system was developed for pattern recognition in image search, taking advantage of the human brain's tremendous ability to rapidly recognize complex objects in arbitrary settings. A BCI-equipped human can scan rapidly through vast amounts of photos or videos in search of some specific target, with a computer flagging the data the instant recognition occurs. Already such systems allow users to sift through several times more data than an unaided human could. Just as computers are becoming part of humans, humans are becoming incorporated into computing systems.

On the other hand, neural interfaces operating in the other direction can potentially allow computer control of brains, and the minds within them. Somewhat disturbingly, the folks at DARPA are hard at work on developing neural implants for mind control of cyborg sharks. (No word on whether they have laser beams yet, but clearly it's a possibility!) Yet in some circumstances, mind control of a sort may be a potent force for good: "brain pacemakers" can provide electrical stimulation that counters depression at least as effectively as chemical antidepressants. (Remarkably, the stimulation is so effective and the response so instantaneous that patients report feeling their depression come and go literally at the flick of a switch.)

As the boundaries between man and machine blur from both directions, perhaps we should reconsider whether technological enhancement of the mind really even requires a direct interface with the brain. After all, we've already offloaded much of our collective memory to silicon. Quick, recite the actual phone numbers of the last five people you called. You don't need to, of course; your phone remembers for you. Remarkably, today's omnipresent cell phones are far more advanced than Star Trek communicators; you never saw Kirk texting Spock or browsing the net on his communicator. (Imagine how different the series would've been if away teams had had iPhones with Google Maps and satellite imagery . . .) Given that your phone is probably always within arm's reach, giving you near-instantaneous access to cyberspace and all its accumulated wisdom and inanity, does it really matter whether or not the communicator is physically linked into your body? I've frequently referred to my laptop as "my auxiliary brain," given that it holds my current projects, my future schedule, and a vast amount of my past memories. As I write this column, I'm traveling through Arizona, in a city I've never been to before, headed to a destination equally new to me. Yet I know precisely where I am and where I am going; my course is no mystery thanks to GPS. In a sense, I know where everything in the world is. Sure, it would be pretty slick to have all that knowledge only a thought away inside my brain—but I already have probably three-quarters of the functionality through my various handhelds.

The Internet's omnipresence is just as much a part of the future evolution of humanity as is the neurotechnological wizardry of direct brain-computer interfaces. The boundaries between man and machine are indeed blurring, from all directions. One can already control robot arms with one's mind, lift massive weights with a powered exoskeleton's aid, or plug a video camera into a socket in one's head. And what will it really mean to speak of being a cyborg in an era where high-tech drugs and brain pacemakers play an equal role in mental health and mind extension? When anyone can instantly access planetary-scale internet applications from a stylish communicator whose use is so unconscious it's almost like an extra body part? In the future there will be no clear lines between humanity and its technology. The cyborgs are coming, and we see them in the mirror every day. We are all becoming cyborgs.




Marshall Perrin (mperrin@bantha.org) is a professional astronomer living and working in Los Angeles. He thinks that it's almost as good a job as being an astronaut, but the commute is way shorter.