Newcomb's paradox

Einfach (1,448 posts, Nomad)

Newcomb's Problem - Please read the first three paragraphs of the link before commenting

What do you do?

Do not say the Being does not exist, because its existence is assumed in the problem.

MageGrayWolf (9,462 posts, Farmer)

This, too, is missing the point of the thought experiment. You are to choose based on the evidence you have. The reason the box is closed is so that you can't assign a confidence level to its contents.


That's why I would try to gain more evidence. Since my knowledge of the closed box's contents is lacking, I would gather as much information as I could before deciding.
Maybe it does bypass what's being attempted here, but the setup seems rather poorly laid out to begin with.
Moegreche (3,826 posts, Duke)

Maybe it does bypass what's being attempted here, but the setup seems rather poorly laid out to begin with.


It's laid out well enough to do what it's meant to do. Again, we can't assess the paradox in a vacuum - we must assess it in light of decision theory.
And it's easy enough to establish in the setup that you can't touch or shake the closed box. I'm pretty sure that either Newcomb or Nozick included this condition. Even if they didn't, (a) it's easily rectified, and (b) shaking the box isn't true to the problem.

Really, including a condition that you can't touch the box isn't all that necessary. If you're going to shake the box and decide, then presumably the computer would have predicted that. The solution, from the computer's end, would be to treat shaking the box as taking the box. Thus, by deciding to shake it, you have basically decided to take it. The computer would simply leave the box empty.
You've advanced the problem, but only by one step. You're now trying to decide whether or not to shake the box when you go in. Interestingly, this is still representable as a decision problem.
Sonatavarius (1,322 posts, Farmer)

I would think the whole problem with semantics could be fixed if you changed things up a bit in the question so that it takes things like "shaking" the box out of the picture.

You're given the option of picking between two universal gift cards:

one red (box 1)

one blue (box 2)

and apply the question to those instead of boxes with money in them. That way you don't know whether the blue card has a million dollars tagged to it or zero dollars tagged to it.

Maybe I'm getting too far off topic. You can only choose between the red one, which has $1,000 tagged to it, or the blue one, which may or may not have $1 million.

I mean, touch them, shake them, look at them all you want (no phoning the card's sponsoring company; like they'd give you the confidential info if it's not in your name anyway).

There is no research you can do.

Now what do you do?

I think I would just take the million-dollar box, simply because at that point another $1,000 really doesn't make a difference. Or that would be my mindset. As for whether or not said being would predict my mindset correctly, I have no way of knowing.

thingthingjack (43 posts, Nomad)

I'd tell the deity no thanks and just walk away.

Sonatavarius (1,322 posts, Farmer)

I guess I'm now going to bring up what I just tried to weed out: semantics. The being is not a deity but a highly intelligent being that can predict what you'll do in such an instance.

Forgive me for not reading everything; I don't know if this was asked yet.

Is there such a thing as a $1,000 bill, or was there ever? I mean, if there's no such thing as a $1,000 bill, then I'd definitely take the million dollars.

Thearmedgamer (156 posts, Peasant)

If he's had 100% accuracy with his predictions, as is stated, then he would know I would pick just the closed box and would put a million bucks in it. As such, I pick just the closed box, validating his theory and getting my money.

I've seen paradoxes before, and I think I figured this one out.

Thearmedgamer (156 posts, Peasant)

In case you're wondering, the "paradox" is a self-causing effect.

Moe (1,714 posts, Blacksmith)

I will start by saying that I have not read the entire thread; to be honest, some of it confused me.

The being claims that he is able to predict what any human being will decide to do. If he predicted you would take only the closed box, then he placed a million dollars in it. But if he predicted you would take both boxes, he left the closed box empty. Furthermore, he has run this experiment with 999 people before, and has been right every time.


This paragraph is why this doesn't seem like a paradox to me. Being given this fact before making your decision would lead most (if not all) people to pick the closed box. Basically, how I'm reading this is that the being tells you how to get either $1 million or $1,000, and human nature would lean towards the $1 million.
Endscape (1,182 posts, Nomad)

... I did interpret the question wrong. Reading on a PSP is hard when you have impaired vision. Hmm, I'll think it over.

Moegreche (3,826 posts, Duke)

You guys are still missing the point of this. It may help to understand some of the commitments you must hold to even get the paradox off the ground.
Again, this is a paradox that deals with decision theory, and within decision theory, we often use dollars as a measure of utility.
So, for example, to be more confident in h than in g just means that I am willing to take the bet {$1 if h, $0 if ~h} over the bet {$1 if g, $0 if ~g}. But for this kind of utility to work, we have to accept certain commitments:

1) You value only money.
2) You value each dollar just as much as the next - that is to say, even if you were a billionaire, you would still value a dollar just as much as if you were broke.

This means that we want to maximize our utility - the number of dollars we can gain in a given scenario. So while on a practical level most of us would be happy with a million bucks, this doesn't match (2) from above. By (2), we would be happy with a million, but we wouldn't be satisfied if there's a chance to get another thousand on top of that.
So the response:

human nature would lean towards the 1 million

is correct, but it's not human nature we're assessing - it's decision theory.

In case you're wondering, the "paradox" is a self-causing effect.


No, that's not the paradox at all. An event that causes itself is logically incomprehensible, not a paradox. I will try to (briefly) explain what generates the paradox.

So I want the most money from this game because I value only money and I value each dollar as much as the next.

Solution 1: Take both boxes
The maximal utility here would be to have the opportunity to make $1,001,000, and the only way to do this is to take both boxes. Thus, I have overwhelming reason to take both boxes. But knowing the being has very likely predicted this, I have overwhelming reason to only take the closed box. But if the being has predicted this, I now have overwhelming reason to take both boxes - and so on.
So if this strategy returns "Take both boxes" as true, it ends up being false (and then true and then false again...)

Solution 2: Take the closed box

In this case, I would expect there to be $1,000,000 in the closed box. But there would also be another $1,000 in the other box. So if I decide to take the closed box to achieve maximum utility, I would then have to decide to take both boxes. Thus "Take both boxes" under this interpretation is false. But then it becomes true and false and true and so on...

And thus we have a paradox for decision theory.
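The two solutions above can be made concrete with a quick expected-value calculation. As a sketch only (modelling the predictor's accuracy as a probability p is my assumption, not part of the original problem, which stipulates a perfect predictor):

```python
def expected_value(strategy: str, p: float) -> float:
    """Expected dollars for a strategy, given predictor accuracy p."""
    if strategy == "one-box":
        # With probability p the predictor foresaw one-boxing
        # and filled the closed box with $1,000,000.
        return p * 1_000_000
    elif strategy == "two-box":
        # The open box's $1,000 is guaranteed; with probability
        # (1 - p) the predictor wrongly expected one-boxing and
        # filled the closed box anyway.
        return 1_000 + (1 - p) * 1_000_000
    raise ValueError(f"unknown strategy: {strategy}")

for p in (0.5, 0.999, 1.0):
    one = expected_value("one-box", p)
    two = expected_value("two-box", p)
    print(f"p={p}: one-box=${one:,.0f}, two-box=${two:,.0f}")
```

At p = 1.0 this prints one-box=$1,000,000 versus two-box=$1,000, so the expected-value reasoning favours one-boxing, while the dominance reasoning in Solution 1 favours two-boxing regardless of p. Both lines of reasoning are sound by their own lights, which is the tension that generates the paradox.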
Moe (1,714 posts, Blacksmith)

is correct, but it's not human nature we're assessing - it's decision theory.


Ok, that makes sense. However, we were told before making the choice that we would only get the million if the being predicted we would pick only the closed box, and that he is always right. So if he is always right, there is no way to get $1,001,000; it's always either $1,000 or $1,000,000.
Drace (3,880 posts, Nomad)

I don't see at all how this is a paradox. A more appropriate term is a dilemma.

And I don't see the complexity in this either.

Take the closed box, because if he guesses wrong, you win a million dollars.

But if he always guesses right, then there is no advantage to taking the closed box, and taking both boxes grants you $1,000.
It's the only way you can even profit from the ordeal.

Ghgt99 (1,890 posts, Nomad)

I agree this is not a paradox but a dilemma.

Einfach (1,448 posts, Nomad)

No - it is a paradox, and Moegreche explained it very well. I should have put more prereqs in the OP like Moegreche did, but they didn't come to mind.

sk8brder246 (740 posts, Nomad)

Duh, it's obvious to take the closed one.

Showing 31-45 of 78