A Futurist on Why Lawyers Will Start Becoming Obsolete This Year
Karl Schroeder is one of the best of the current generation of hard science fiction writers. He's also an accomplished futurist who works for the design firm Idea Couture. In his new novel Lockstep, he presents the idea of a civilization that uses synchronized cryonics to maintain a thriving interplanetary society without the need for faster-than-light travel. This far-future civilization has also replaced its entire legal system with all-knowing AIs. But we won't have to wait thousands of years for technology to start replacing lawyers.
“We’re headed there in about six months in terms of contract law,” says Karl Schroeder in Episode 106 of the Geek’s Guide to the Galaxy podcast. “So if I’m claiming in Lockstep that at some point legal apparatus might be replaced by computerized systems, I’m only barely avoiding being out of date.”
Schroeder points to efforts like the Ethereum project, which uses blockchains, the technology behind bitcoin, to create smart contracts. Such contracts live online, beyond the control of any single entity, and anyone can check their operating parameters at any time.
“It’s a kind of automaton,” says Schroeder. “It will follow the rules that have been laid down for it to the letter. It will never cheat.”
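The "automaton" behavior Schroeder describes can be illustrated with a toy sketch. The Python class below is an invented example, not actual Ethereum code (real smart contracts are typically written in languages like Solidity and executed on-chain); it simply shows the core idea of a program that holds funds and enforces a fixed set of rules mechanically, with no way for any party to cheat:

```python
# Hypothetical sketch of a smart contract as an "automaton": once its rules
# are set, it applies them to the letter. ToyEscrow is an invented name.

class ToyEscrow:
    """Holds a buyer's payment and releases it only per its fixed rules."""

    def __init__(self, buyer: str, seller: str, price: int):
        self.buyer = buyer
        self.seller = seller
        self.price = price
        self.balance = 0          # funds currently held by the contract
        self.delivered = False    # flipped when the buyer confirms delivery

    def deposit(self, sender: str, amount: int) -> None:
        # Rule 1: only the buyer may fund the escrow, with the exact price.
        if sender != self.buyer or amount != self.price:
            raise ValueError("deposit rejected: rules enforced to the letter")
        self.balance += amount

    def confirm_delivery(self, sender: str) -> None:
        # Rule 2: only the buyer can confirm delivery.
        if sender != self.buyer:
            raise ValueError("only the buyer can confirm delivery")
        self.delivered = True

    def withdraw(self, sender: str) -> int:
        # Rule 3: the seller is paid only after funding and confirmed delivery.
        if sender != self.seller or not self.delivered or self.balance < self.price:
            raise ValueError("withdrawal conditions not met")
        payout, self.balance = self.balance, 0
        return payout
```

On a real blockchain, the key difference is that this logic and its state live on a public ledger replicated across many machines, so no single party can alter the rules after the fact, and anyone can inspect them.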
Listen to our complete interview with Karl Schroeder in Episode 106 of Geek’s Guide to the Galaxy. Then stick around after the interview as guest geeks D.E. Wittkower and Ashley Shew join host David Barr Kirtley to discuss Ender’s Game and Philosophy.
Karl Schroeder on why a lot of sci-fi is impossible:
“Star Wars, Star Trek, Battlestar Galactica, these are great stories. They only suffer from one problem—they’re all impossible. As far as we know, Einstein discovered a rule that’s ironclad across the cosmos—you cannot travel faster than light. If you cannot travel faster than light, then all of these stories become fantasies … It’s all very well to say that [FTL] could be invented, and in fact I will freely admit that we don’t know that it can’t be invented. You can’t prove that faster-than-light travel will never be invented, but you also can’t prove that Santa Claus doesn’t exist … And you can spend the rest of your life dreaming and wishing that faster-than-light travel could be invented—and I certainly think we should try to see if it can be—or you can actually get the same results that you would get from faster-than-light travel by other means.”
Karl Schroeder on backing up human civilization:
“Because the Lockstep’s always there, it’s developed into a kind of backup for human civilization. Some catastrophe will happen—rogue AIs become godlike and devour everything, or human civilizations fight wars and blow up each other’s planets, and everyone gets knocked back to the Stone Age. And then the Lockstep wakes up, they look around, and say, ‘Oh, it happened again,’ and they send their people in and they rebuild the civilization. And over tens of thousands of years this happens repeatedly, and they’re always there to pick up the pieces. So they literally do a backup and restore on human civilization repeatedly. One of the reasons they can do this is because they’re so insignificant as far as everyone else is concerned. They’re in suspended animation nearly all the time, and they’re in these places that no one else wants to go to, these little worlds between the stars, so no one has any incentive to go after them.”
Karl Schroeder on reconciling AI with creating a civilization:
“I encountered a problem—and I’ve encountered this with most of my books, actually—which Frank Herbert encountered when he was writing Dune, which of course is that he wanted to have a particular kind of civilization, but that civilization would be essentially ridiculous or impossible if AIs and robots existed, so in the case of Dune he used the ‘Butlerian Jihad,’ this holy war to destroy AIs … Basically a political reason why there would not be AIs in that particular universe. I did something similar with the technology in Lockstep … Even though technology advances spectacularly quickly around them, [the Lockstep] just basically draws a line in the sand and says, ‘If you’re going to live here, you’re going to live this way.’ The robot economy itself is essentially based on Rome. Rather than having hundreds of slaves, each person in the Lockstep has a number of robots. It’s illegal for corporations to own robots—they can only own single-purpose machines … So what people do is they send their robots out as a workforce—essentially as their slaves—to do their work for them, and they reap the profits.”
From our panel on Ender’s Game and Philosophy:
D.E. Wittkower: “I think [Peter Wiggin] is the worst person, and in some sense that’s not his fault. He kind of got a bad draw in terms of moral character. And what he does with that is actually really praiseworthy. It actually gets us into another weird issue in Kantian moral theory, where for Kant what’s morally praiseworthy is when you do the right thing because it’s the right thing. And so somebody who is just all smiles and sweetness and light, and just instinctually stops to help others because that’s what you do, is not actually terribly morally praiseworthy. Somebody who’s morally praiseworthy—for Kant—is somebody whose instincts are to be cruel and miserly, and who cares for others because it’s the right thing to do, rather than because it’s just instinctual … In saying Peter is the worst person, at the same time we might want to call him really morally praiseworthy, because not just [the consequences of his actions] are the best, but what he does with his horrible moral instincts is really transformative of himself ultimately, too.”
Ashley Shew: “I am made so angry in knowing about [Orson Scott Card’s] hate. I wish I had never known any of that … After reading [his books] I was really high on the idea that if you could love another that you don’t know, if you could know them, then you’re forced to love them. So Ender gets to know the hive queen and can’t help but love her and help her. I mean, Card has it all wrapped together—love, hate, war, peace, reconciliation. The idea of the Speaker for the Dead is that if you can tell a person’s life story as it happened, with all of the horrible parts included as well, that you can understand them and you are forced to love that person. I mean, this is part of what makes up all of Orson Scott Card’s novels in the Enderverse, the idea that knowledge is love. Forget this ‘children at war’ thing, the Ender books are about love and understanding and acceptance, and you can encounter something completely foreign to yourself, a true Other, and once you understand it, the love is there. That Orson Scott Card could be so hateful at all just makes no sense. If he reads his own books I think he’ll understand better what I mean here.”
You can listen to episodes of the Geek's Guide to the Galaxy podcast here.