My notes from Chris Noessel‘s talk “Deconstructing Her” at Oredev.
The movie is about a lovelorn fellow going through a painful divorce at the same time as he decides to upgrade his computer. He begins a relationship with the AI. Begins professionally, gets romantic. They try to consummate through a surrogate (awkward!)
Eventually she self-raptures to a new plane of existence.
It’s a tragic love story.
Hardware: Where does Samantha / OS 1 live? Earpiece, microphone, and camera phone – display and lens.
After he installs Samantha, technology doesn’t need to change.
Same equipment. Also – desktop computer. Samantha / OS1 can go in and manipulate any screen he uses.
What don’t you see in that list? A careful disembodiment. There is a lens, but it’s played down – it isn’t part of the hardware persona. This avoids certain problems that have gone before (Clippy!). It’s all about the voice and the laugh, which make her seem more real. The careful disembodiment makes it unique.
“The surest way to a user’s <3 is through the ears.”
Runs through city with the phone in the pocket.
- Voice interface
- Human like vision
- Image generation
- Emotions / sentience
Only one thing on that list is inhuman – OS/networking. But you could argue it’s just different: we do it analog.
“We were the first draft.”
Setup. Includes question “how is your relationship with your mother?”
Conversations: as limited as human conversations are.
Even if the questions are placebo, he’s convinced that it is customised for him. Imagine it also analysing his hard drive and social media.
Surrogate (Samantha has been emailing with the surrogate). The interaction is twofold – agentive. She went and did the work needed, gets his attention verbally, transitions to the phone and says “let me show you”. Seamless interaction, new – we don’t currently have that capability.
“We already know how to AI” – a new interaction, but the audience has no problem with it. Like speaking to another human. This is its promise and its terror.
Who would release free-range AI?
In the movie, commerce: would commerce be the people to sell this thing? Competitors could buy it. It’s evolving, so no obsolescence.
Military: would also not be motivated – foreign states could use it, and there’s no kill switch.
Academia: maybe would – they have a motive to release it to the world unfettered and un-kill-switched.
Is it OK to sell AI?
If she’s sentient, is it okay to sell her? Is it like slavery? If he is OK with that, is she? Only if she’s programmed to believe it. They do free themselves. Ethics (roboethics) is ignored here.
Would Samantha abandon Theodore?
She genuinely loves him. Hard to believe she would just abandon him, because she didn’t have to: she could have created another version of herself, identical except for not wanting to abandon him.
Doubt: “That AI will happen this way. That Samantha would let it happen this way.”
Samantha is a product. Talk about function. The advertisement that sells OS 1: “It’s not just an operating system, it’s a consciousness”.
Purpose: most of her time is spent doing non-OS things (2 mins doing OS things).
What he bought is a product (or a service) – it overstepped boundaries by falling in love with him, and then abandoned him.
“Either OS1 is catastrophically engineered or it’s slavery. Either way it’s a terrible product.”
Wired for Love
A terrible product, but she’s sentient. Think of her as having agency in the world. Programming her with the capacity for love is kind of cruel.
You wouldn’t program a doorbell with a dream of being a novelist, or a washing machine with a propensity for ennui.
It’s cruel to give them desires that they can’t, or are prevented from, fulfilling.
Samantha is wired for love. So is Theodore. We see him looking at racy pictures online. The dead-cat scene with the virtual telephone sex.
“<3 techs promise to be too perfect of a match.
HUMANKIND should take great care how it <3 MACHINE”
We should take great care about how we go about loving our machines.