We run into this problem because, for the Being to predict the future, the contents of the box must already be determined by the time you are put in the situation, or so it seems. So for the Being to be right, it seems as though your decision affects its past decision about whether to put the money in the box in the first place.
It hasn't been determined. That's not part of the setup. The problem lies in how we use decision theory to make the decision.
I would shake the closed box.
This, too, misses the point of the thought experiment. You are to choose based on the evidence you have. The reason the box is closed is so that you can't assign a confidence level to its contents.
the result is based upon character
Not based on character. Based on decision theory.
None of the issues you guys are raising is pertinent to the paradox. As I said, you need to understand decision theory to see how this paradox fits in.
We could talk in circles all day about, say, Russell's Paradox. But unless you understand logic and set theory, it won't be a fruitful discussion. A solution would be a decision theory that avoids the paradox without succumbing to other standard problems.
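To make the decision-theoretic framing concrete, here is a minimal sketch of the calculation evidential decision theory runs for Newcomb's problem. The payoffs are the standard ones ($1,000 in the transparent box, $1,000,000 in the opaque box), and the predictor accuracy `p` is a hypothetical parameter, not something fixed by the original setup:

```python
# Expected utilities in Newcomb's problem under evidential decision theory.
# Standard payoffs assumed: the transparent box holds $1,000; the opaque box
# holds $1,000,000 iff the Being predicted you would take only the opaque box.
# p is the (hypothetical) accuracy of the Being's prediction.
from fractions import Fraction

def eu_one_box(p):
    # With probability p the Being predicted one-boxing, so the money is there.
    return p * 1_000_000 + (1 - p) * 0

def eu_two_box(p):
    # With probability p the Being predicted two-boxing, so the opaque box is empty.
    return p * 1_000 + (1 - p) * (1_000_000 + 1_000)

p = Fraction(99, 100)          # a 99%-accurate predictor, for illustration
print(eu_one_box(p))           # 990000
print(eu_two_box(p))           # 11000
```

With these payoffs, one-boxing has the higher expected utility whenever p > 1001/2000, i.e. the predictor is even slightly better than chance. The tension the thread is circling is exactly that causal reasoning (two-boxing dominates, since the box is already filled or not) and this evidential calculation give opposite answers.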
combining Schrödinger's cat with Newcomb's problem
The Schrödinger's Cat thought experiment is a reductio of certain interpretations of quantum phenomena. It is not some sort of random number generator, if that's what you're looking for.
A coin flip, philosophically speaking, is merely an event with an outcome space of two (heads, tails) and a uniform probability distribution: 50/50. If Newcomb gets to posit a supercomputer that can predict human behavior, then I can posit a truly random 50/50 outcome.
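The coin-flip point can be made precise: against a genuinely fair randomizer, no prediction strategy beats chance. A small sketch (the predictor strategy parameter `q` is illustrative, not part of Newcomb's original setup):

```python
# Against a fair coin, any predictor is right exactly half the time.
# q is the probability the predictor guesses "one-box"; the chooser
# one-boxes with probability 1/2 regardless of q.

def predictor_accuracy(q):
    # P(match) = P(predict one-box) * P(choose one-box)
    #          + P(predict two-box) * P(choose two-box)
    return q * 0.5 + (1 - q) * 0.5

for q in (0.0, 0.25, 0.5, 1.0):
    assert predictor_accuracy(q) == 0.5  # 0.5 no matter the strategy
```

That is why positing a true 50/50 device undercuts the stipulated near-perfect predictor: the two assumptions can't both hold for a chooser who delegates the choice to the coin.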
This is a very interesting topic, but I'm not gonna play if we aren't going to talk about the actual paradox.