Comments:
The ring makes you a p-zombie
I can't be certain you're wrong, but I wouldn't expect that.
That's really good, although rather scary!
I am not sure I would *care*!
You're living an abnormally successful life, a rich and much-loved member of the community with a happy family? Who the heck cares *who* is living such a life? It's being lived, right? If you can do it with an earring and half of the brain, instead of all of the brain, so much the better!
Would you voluntarily ingest a parasitic worm that eats your brain, absorbs your goal system, hangs out in your skull, connects to your nervous system, and then lives about the same life you would live, only better? (Suppose for the moment that you're being selfish and ignoring concerns about whether the parasite would be better at pursuing efficient charity and so on.)
In Greg Egan's "jewelhead" stories, everyone gets a jewel implanted in their head at the age of 18. The jewel gradually learns to predict the brain's responses to stimuli, without inspecting the internals of the brain. When the jewel's predictions become indistinguishable from perfect, the organic brain can be removed surgically, and the person lives happily ever after (the jewel grants immortality).
Yup, this story reminded me of those Greg Egan stories.
If I came to possess such a ring, I would commit (by any means at my disposal) to the following policy: I will wear the earring for a day or two every week and do as it says, with the proviso that if it ever tells me to do something for which I cannot see a plausible rationale, I will immediately take it off and not wear it again for another month. I wonder whether, in those circumstances, it would begin by telling me that I would be better off taking it off. (If it did, I would.)
Such an *earring*, I mean. The idea of such a thing being a *ring* seems somehow terribly natural, Western culture being what it is.
One problem with the earring is similar to the problem with video game walkthroughs: it's more fun to figure things out for yourself. I can definitely imagine situations in which I'd want to *ask* the earring what to do, but I don't think I'd want it telling me everything, all the time. (Consider: I could play better chess by asking a computer chess program what move to make, but if I did that it would be boring, cheating, or both.)
If it would make you happiest to do a half-assed job at your work and then go home and spend the rest of the day in bed having vague sexual fantasies, the earring will tell you to do that. The earring is never wrong.
Oh, this is very, very bad.
Locking it up in a treasure vault isn't safe enough. I'd throw it into the nearest volcano.
The story stipulates that most people who wear the earring end up highly successful pillars of their local community, which suggests that it isn't in fact ("in fact"?! well, you know what I mean) generally leading people to spend all their time lying in bed having vague sexual fantasies.
My crucial question would be whether the ring discourages or facilitates growth. You specifically say it offers better, but not necessarily optimal, choices. If my decision-making ability is capped under the earring, but would grow without it, then it's no wonder the earring told me to take it off. If I learn faster with the earring than without it, then it's a learning aid and I'd consider it, although I don't see how you can accept everything but the first command.
I would, by the way, say that it was of publishable quality. I'd recommend sending it off to a few places and seeing if any of them take it (somewhere like http://dailysciencefiction.com/, if nothing else).
The problem is that it's already published, so he can sell only second rights.
This post gave me hope. Every time I try to really write out my idea of the future I'm really scared of, I end up getting it wrong in some way. I was afraid for a moment that you'd done it with little effort in a random blog post, but then your earring ended up making people brainless, which is obviously villainous and terrible and a clear reason to simply never put it on in the first place, and so the earring immediately became trivially avoidable and therefore non-scary. If you can't quite do it right, at least not yet, then maybe my inability is less because of my lack of creativity and more because it actually is a difficult idea.
This will be getting it wrong, too, but I'm going to write only a couple of paragraphs, so that the ways I'll get it wrong will at least be excusable by the lack of apparent effort.
What if you could trust your hunch about people? What if, at least according to your perception, you could trust a person with well-fitting, clean, situation-appropriate clothes to be a good person to do business with, while a huckster almost always looked like a huckster to you? What if there were pretty people in the world, but people who were available and who would make a good romantic partner for you just looked more glowingly beautiful than anyone else? What if people pitching you ideas that would make bad career moves had annoying voices? What if, while fast food was decent, salad tasted really amazing?
Say you could get Consequence Glasses that do just that. Say they even have a tuning knob. It goes from 0 to 1, because I am a geek. At 0 you get your natural perception with its natural biases; at 1, things probably have to get a little bit abstract, because so much becomes just a representation of what kind of consequences it leads to. Maybe it even works on some kind of a curve, so that if you just use them at, say, 0.1, it simply protects you from whatever tricks you're most vulnerable to: your most sinful foods taste just a bit stale, but not in such a way that it makes you less happy; the well-designed rewards of video games that keep you playing seem just a bit more arbitrary and distant; and so on. Only once you turn it higher does it really start directing you toward anything specific.
You can decide to just not put the glasses on at all. You can decide to only run them at a very low level. Every turn of the knob is precisely your own choice. It's simply that the higher you turn it, the easier it will be to make the right decision in every situation. It won't even feel like someone else is in charge: it's you calling the shots, calling them as you see them. Hell, you were already wearing consequence glasses anyway; it's just that they're biased for inclusive fitness in mankind's ancestral environment, not for happiness and fulfillment in the modern world. Your existing emotional filters were never designed for the stimuli and consequences of this environment, so changing your perception only makes them work better. It makes you into a more formidable person. A person with an easier life, yes, but only because your decisions are more grounded in reality, not less!
Yes, you will become dependent on the glasses, just as you would on the earring. They're probably even going to cause changes to your brain structure eventually, as you stop worrying so much about whether people are lying to you, about whether you're making the best possible use of your time, and so on. But it's a lot, lot harder to see them as taking anything away from you. Certainly you can't just brush it off with a generic "things should not be that convenient" type of argument.
I do worry that the only way to figure out whether you should wear the glasses is to put them on and see how they look in a mirror...
Have you read Ted Chiang's "Liking What You See: A Documentary"? The premise is a little similar, and it's a great story.
From: (Anonymous) 2012-10-05 08:55 am (UTC)
I might use the glasses if they included some kind of penalty for using them, maybe?
Larry Niven's Protectors had a similar thing going: very high intelligence coupled with very strong instinctual drives meant they had precious little in the way of free will. You become a Protector by eating the tree of life: your brain expands, your joints become enlarged (for extra leverage), your skin becomes armor-thick, your gonads are reabsorbed, and you become functionally immortal. And if you have no children to protect, you sit down and die.
From: (Anonymous) 2012-10-05 08:46 am (UTC)
... it looks like SOMEBODY needs a lesson in Remedial Fun Theory.
... That sounds like a porno.
From: (Anonymous) 2012-10-04 12:38 pm (UTC)
If this is an allegory for something, consider whether you're not committing some of the dystopian fallacies that you yourself complained about earlier. And as for mysterianism working better than clarity? Well, it's definitely easier to get away with a bad argument and get people to take it seriously if you write a mysterious parable as opposed to a clear piece, so in that sense it "works" better.
From: (Anonymous) 2012-10-04 02:52 pm (UTC)
dystopian?
I was going to point out the same thing; the shrunken brain is a signal as subtle as no longer enjoying classical music. Interesting nonetheless.
From: (Anonymous) 2012-10-04 08:00 pm (UTC)
You know, at first this seemed a bit scary... but then I had to ask: what if your goals are defined in terms of your cognition? I.e., for example, I want to be a famous and successful scientist, and I define "successful" as having unparalleled actual mastery of the art? Or if, in a more mundane example, I want to solve a Sudoku puzzle, and I define "having solved the puzzle" in a way that requires me to do the actual cognition and derive the benefit of having solved it without being 'spoiled'? What sort of advice can the WE give? I'm sure it can give advice, and that the advice is useful... but if, as a result, I derive the benefit of the mental exercise that I wanted, I'm not entirely sure that this is a problem.
There are a couple of ways to read this. One is that the WE doesn't actually help you implement your 'real' goals, only second- or third-order subgoals, and that its first bit of advice, "take me off", was warning you about precisely this effect. Or you could read it as a warning about defining goals in such a way as to allow that being turned into a brainless puppet of an alien horror would accomplish those goals (if you think being turned into a brainless puppet would be bad).
Another random thought: it seems that the second-stage WE has effectively replaced my lower-level functions with an extremely smart but alien intelligence that does everything better than I could do myself... the thing is, I'm not entirely convinced that this isn't already the case in reality. Certainly, if I tried to, say, catch a ball running on only the higher-level cognition that I consider "me", I'd fail utterly.
The earring sounds oddly like Yahoo Answers. Vague sexual fantasies indeed.