The Value Problem

Moegreche · 3,826 posts · Duke

There is an alarming lack of purely philosophical threads on this forum. Here is my attempt to rectify the situation. The issue I'm about to discuss is actually incredibly involved and encompasses a number of questions. I'm going to start with the stripped-down version lest I waste my time writing this for no one to respond.

The Value Problem

There is a long-held intuition in epistemology (the study of knowledge) that knowledge is more valuable than mere true belief. The difference between knowing that P and having a mere true belief that P is that a lucky guess counts as a true belief, but not as knowledge. This makes sense to most people. Knowing the answer seems intuitively more valuable than just correctly guessing the answer.
In the Meno (one of Plato's works), Socrates asks Meno this very question. In particular, Socrates wants to know what practical value knowledge has over a correct guess. Suppose you want to get to Larissa. You can either ask someone who knows the way, or someone who merely has a true belief on how to get there. But either way, you'll be given the right directions to Larissa. At least, this is Socrates' example. We can modernize it a bit.
Let's say you're a contestant on a game show. You are 1 question away from winning a cash prize. Is there any difference here between knowing the answer and just guessing correctly? Either way, you win the cash.
This is supposed to show that the added value of knowledge cannot be practical value. Knowledge is not any better at guiding our actions than mere true belief. Both will get us our desired, practical outcome.
So here's the question: where does the added value of knowledge come from?
Just to get your minds working, here are two options (though they are hardly the only two). Some philosophers have looked for the added value in some kind of justification. A justified true belief (that is, a belief you have some reason for holding) seems better than a lucky true belief. And some philosophers hold that the value problem is insoluble - that is, there is no additional value for knowledge over mere true belief.
But please, don't just limit yourself to the above two options. There are plenty of ways of thinking about the problem, these are just there to get the juices flowing.
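For anyone who wants the first option stated compactly, here is the standard justified-true-belief (JTB) schema from the textbooks - just a sketch to frame the discussion, not an endorsement:

```latex
% The classical JTB analysis: S knows that P iff
% P is true, S believes P, and S is justified in believing P.
\[
K_S(P) \iff P \,\wedge\, B_S(P) \,\wedge\, J_S(P)
\]
% A mere true belief satisfies only the first two conjuncts.
% On this option, the value problem asks what extra value the
% justification conjunct J_S(P) contributes once P is already true.
```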

  • 16 Replies
Moegreche

I think I get what you're saying, jeol. A belief could be wrong, whereas knowledge cannot (this is because most accounts of knowledge have a truth requirement; thus you can't know something that is false). Now, knowledge would definitely have value over a false belief, and that value would be cashed out as truth, which is considered by many to be epistemically valuable (and also by many to be the only epistemic good).
But the problem is accounting for the value of knowledge over that of a true belief that falls short of knowledge. The best way to think about this is the game show analogy, I think. If you guess the answer correctly and win the cash prize, does it matter that you didn't know the answer?

partydevil · 5,129 posts · Jester

I think I get what you're saying, jeol. A belief could be wrong, whereas knowledge cannot (this is because most accounts of knowledge have a truth requirement; thus you can't know something that is false). Now, knowledge would definitely have value over a false belief, and that value would be cashed out as truth, which is considered by many to be epistemically valuable (and also by many to be the only epistemic good).


well i haven't seen a topic answered so fast in a while xD

But the problem is accounting for the value of knowledge over that of a true belief that falls short of knowledge. The best way to think about this is the game show analogy, I think. If you guess the answer correctly and win the cash prize, does it matter that you didn't know the answer?


so with only belief you're playing the luck card.
i don't think hoping for good luck is better than or equal to getting it right because you know it.
i'd rather know the answer than make a lucky guess.
aknerd · 1,416 posts · Peasant

I suppose an added advantage of true knowledge might be that it could help you acquire more knowledge/benefits. The game show might be an example: if you guess correctly, that's it. You win, but that is all you win with that guess. But if you figure out a way to win, you could potentially use that method in the future, or sell the method to a future player.

A lucky guess adds nothing to your mental tool set, I guess is what I am trying to say.

Moegreche

so with only belief you're playing the luck card.


My example used a lucky guess, which is a bit misleading. It doesn't have to be a guess at all - just a true belief that falls short of knowledge. Some may argue that a guess doesn't even constitute a belief, which is probably right.
There are plenty of examples of beliefs that fall short of knowledge. Let's say you're trying to get to New York from Atlanta. You have a belief about how to get there, and that belief is true - so you're going to get there. It seems like that is less valuable than having knowledge about how to get there. The question, though, is why.
aknerd

There are plenty of examples of beliefs that fall short of knowledge. Let's say you're trying to get to New York from Atlanta. You have a belief about how to get there, and that belief is true - so you're going to get there. It seems like that is less valuable than having knowledge about how to get there. The question, though, is why.


Again, the knowledge tactic not only grants you the knowledge of how to get to New York, but also the knowledge of how you acquired it. This potentially enables you to learn how to get to other places as well.

The true belief tactic does not provide similar benefits.

How does one know whether one has true beliefs or true knowledge? How can you test such a thing?
Moegreche

I suppose an added advantage of true knowledge might be that it could help you acquire more knowledge/benefits... A lucky guess adds nothing to your mental tool set, I guess is what I am trying to say.


That's a really nice response. So knowledge might play a role in getting us to new truths. I'd really like you to expand on this, if you can. And do you think it's a problem for your response that not all knowledge has this feature? What I mean is that there are plenty of pointless truths. I could sit here and count the number of dust motes on my desk and thus come to know how many there are. But this kind of knowledge doesn't seem to help me at all in the way you suggest. Do you think that's a problem for your account?
Kasic · 5,552 posts · Jester

Going with the game show example...

While there would be no real difference in value between knowledge and a true belief on any single question, successive attempts at the game show would show that those who are more knowledgeable walk away with more, because a true belief could easily have been a false belief.

Strop · 10,816 posts · Bard

Give a man a fish and he'll eat for a day; teach a man to fish and he'll eat for a lifetime. Would you consider that relevant to the question at hand?

HahiHa · 8,256 posts · Regent

Let's say you'd answer the game's question correctly in both cases, knowing and guessing. Knowing might feel more satisfying, while guessing is 'simply' relieving when you hear you're correct. I think there might be less difference once you've reached the goal than during the process of reaching it (if that makes any sense at all).

In a way though, knowledge means you already learnt something that, through guessing, you might only be learning now; so in a way you're better prepared, which can be seen as more valuable, especially for crucial questions or emergencies. Which leads me to think that others might trust you more in an emergency if you know things than if you can merely guess correctly.

Moegreche

... a true belief could easily have been a false belief.


... knowledge implies truth and guesses can go awry.


I think these responses really capture the heart of what Socrates thinks the added value of knowledge amounts to. He likens belief to the statues of Daedalus. These statues are so lifelike that it's said they could wander off if not tethered down. Knowledge, says Socrates, tethers down belief. It makes the belief more stable.
Now, it's not quite clear what Socrates means here. At face value, he seems to suggest that beliefs can be lost while knowledge cannot. But we forget things that we once knew all the time, so this doesn't seem right. I think a more charitable interpretation is one along the lines of what you guys are saying: a belief that falls short of knowledge can be false (and guesses can very easily be false), but knowledge by definition cannot be false.

But here's the kicker. A true belief, even if it's a guess, cannot be false either. But this seems to miss the point of the argument. Enter Reliabilism.

Reliabilism is a theory of knowledge that tries to capture this idea that guesses can be wrong. For a reliabilist, knowledge is a reliably produced true belief. Guessing isn't reliable, so it's not knowledge. So by having a reliable method of belief formation, we make it far more likely that future beliefs will also be true. So Strop's suggestion is relevant - a reliable method will get you knowledge now and is far more likely to get you more knowledge in the future. And I think this captures the force of the above quotes as well.
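To put the reliabilist condition in shorthand (my own schematic rendering, just a sketch, not a formal definition from any particular paper):

```latex
% Reliabilism: knowledge is true belief produced by a reliable process.
\[
K_S(P) \iff P \,\wedge\, B_S(P) \,\wedge\, \exists R \,\big[\, R \text{ produced } B_S(P) \,\wedge\, \Pr(\text{truth} \mid R) \geq t \,\big]
\]
% Here R ranges over belief-forming processes and t is some high
% reliability threshold. A guess fails the third conjunct: the
% truth-ratio of guessing is no better than chance.
```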

But there's a problem for reliabilism. Actually, there are quite a few problems. The main argument against it is the new evil demon hypothesis, but that's not relevant to this discussion (it is, however, a fun thought experiment for anyone interested). The problem for reliabilism in this case is that we cannot account for the added value of knowledge. This is, after all, the question at hand.
Now the argument goes that reliably produced beliefs are more valuable than beliefs that aren't reliably produced. The reason there is no added value, however, becomes clear when you recognize that a mere true belief is already true! Whether or not it was reliably produced doesn't add any additional value. The belief isn't more likely to be true, because it already is true.
An analogy here might help. Let's suppose I value things that dissolve in water. Salt dissolves in water, so I value salt. Now along comes someone who offers me 'super salt' and says that it also dissolves in water. He says it's better, more valuable than my salt. But all I care about is that salt dissolves in water, so I couldn't care less about super salt. The stuff I already have does what I need it to do.

Simply put, reliabilism looks for the added value of knowledge in the fact that it's more likely to be true. But given that the belief is already true, there's no way to make it more likely to be true.
There are plenty of responses to this counterexample, but I'd like to hear what you guys think.
Moegreche

Knowing might feel more satisfying, while guessing is 'simply' relieving when you hear you're correct. I think there might be less difference once you've reached the goal than during the process of reaching it (if that makes any sense at all).


I think you're on to something here. This response has an entirely different flavour to it than the reliabilist response. There have been some papers written very recently that suggest this sort of thing. The idea is that knowledge is phenomenologically different from mere true belief. In other words, you're in a very different state of mind when you have knowledge than when you have mere true belief. What this response hinges on, though, is the idea that you don't 'know' that your belief is true. But this just sets the problem to one side or pushes it back a step. Now what you're claiming is that knowing is better than not knowing, which is a different question altogether. If I don't know that P, then P could be false, and thus we lose the value of truth that is already given in the value problem. This new question becomes philosophically uninteresting.
I've seen plenty of resistance to the response I've just given, but I'd like to see what you think. Does it avoid the question by asking a different and ultimately uninteresting question? If not, it looks like you're committed to the notion that the experience itself of having knowledge adds value over that of true belief. But how do we cash out this value? It becomes a sticky problem.
aknerd

... I could sit here and count the number of dust motes on my desk and thus come to know how many there are. But this kind of knowledge doesn't seem to help me at all in the way you suggest. Do you think that's a problem for your account?


No, not really. There is no such thing as trivial knowledge. Why did you think it was worth your time to count the dust motes in the first place? Could this reason be applied to future cases?

Additionally, since we are assuming this is true knowledge, this implies you were able to develop a system that enabled you to count every dust mote exactly once, without skipping any motes or counting one twice by accident.

If you were to skip one mote but count another twice by accident, you would end up with the right number only by luck: merely a true belief. Surely a system as accurate as the one you created would have practical use in a variety of fields.

BUT. Again, I ask: how do you know it is true knowledge? Back to the traveling example:

Say you wanted to travel to NYC. So you plot out a route using a map, and it turns out this route is accurate. Therefore, it would seem that you have acquired true knowledge.

But what if the mapmakers themselves were only guessing as to the location of NYC, and happened to get lucky? Then isn't your "knowledge" simply a belief? Even though you weren't really guessing, you only believed you possessed knowledge; you didn't know you possessed knowledge.

case two:
What if the mapmakers DID know the location of NYC (let's say they had been there before)? In fact, they make a map of the entire world, accurately placing every location using knowledge gained from their personal travels. Since this is a theoretical problem, let's say they correctly place an infinite number of locations. However, they make one mistake: one location is randomly placed. They are aware of their potential mistake but do not correct it. They do, however, mention that there is one error in their map.

So, if our traveler wanted to go to Dallas and used this map to plot a route, would this be considered using true knowledge? The likelihood of Dallas being incorrectly located is incredibly small. But at best, I think they can only believe that they possess true knowledge, even if Dallas was correctly placed by the mapmaker. Any potential for error negates any claim to knowledge one might glean from the map.

But, is there any source of information that is guaranteed to be completely accurate? No. So is there any such thing as true knowledge?
Moegreche

There is no such thing as trivial knowledge. Why did you think it was worth your time to count the dust motes in the first place? Could this reason be applied to future cases?


What if I was just really bored and decided to count them? No real reason behind it - I wasn't even that curious.
Now your point about developing a system for counting dust motes is interesting. I hadn't thought about that. But wouldn't you agree that there are two different propositions that are known:

1) There are x number of dust motes on my desk.
2) Such-and-such is a reliable method of counting dust motes.

Is there a value of (1) that isn't just cashed out in its contribution to coming to know that (2)?

So, if our traveler wanted to go to Dallas and used this map to plot a route, would this be considered using true knowledge? The likelihood of Dallas being incorrectly located is incredibly small. But at best, I think they can only believe that they possess true knowledge, even if Dallas was correctly placed by the mapmaker. Any potential for error negates any claim to knowledge one might glean from the map.


Now this is a very different, although also very interesting, problem for knowledge. If you want, we could definitely get another thread going to talk about this. But for our purposes, we're looking at the supposed difference in value between a true belief that falls short of knowledge that P and knowledge that P. Here, we need not worry about the possibility of error, since P's truth is already guaranteed. But the issues you raise are well worth thinking about.
Moegreche

Question: if someone stated something false as a fact, and you believed it as a fact and truth, would it be a belief or knowledge? I suppose it would be a belief, since you believe it to be a hard and true fact...

Correct, you would only have a belief, and a false one at that. I wasn't very clear about this in my opening post, so maybe I should clarify things now.
There is pretty much a consensus that truth is required for knowledge. In other words, truth is a necessary condition for knowledge. This also means that knowing P implies that P is true.
It is this truth requirement that ends up contributing a whole lot to the value problem. Since truth is already captured by our concept of knowledge, we have a hard time looking for additional value beyond truth. Truth is seen by most to be an epistemic good and by many to be the sole epistemic good. In other words, getting things right is the only thing we care about.
One move, then, is to remove the truth requirement for knowledge. But it turns out that this move is a complete disaster. We either end up with a very unintuitive account of knowledge or one that doesn't seem valuable in the right way (or both).
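For the logically inclined, the truth requirement is what epistemic logicians call factivity (often labelled axiom T); this is standard notation, offered just as a sketch:

```latex
% Factivity: knowledge implies truth.
\[
K P \rightarrow P
\]
% The converse fails: P can be true without being known by anyone.
% Belief has no such axiom; adopting B P -> P would collapse the
% distinction between believing something and its being true.
```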

And keep in mind that guesses aren't the only kinds of belief that fall short of knowledge. I could have a very firm belief in something that is true, but fail to actually know it for a variety of reasons. Perhaps the belief hasn't reached some threshold of justification needed for knowledge. Or perhaps the belief has been Gettiered (you'll just have to Google Gettier cases, as they're too involved to get into here and not really relevant to the discussion at this point, although they can be relevant).

Showing 1-14 of 16