Is irrationality a matter of internal conflict?

Simon Gaus
Requirements of narrow irrationality

INTENTION COMPATIBILITY: Rationality requires that you do not both intend to A and intend to B if you believe that you cannot both A and B.

BELIEF COMPATIBILITY: Rationality requires that you do not believe some proposition p and some other proposition q if you believe the negated conjunction ~(p&q).

MEANS-END-COHERENCE: Rationality requires that, if you intend to A and believe that some action B is necessary for you to be able to A, you intend to B.

ENKRASIA: Rationality requires that, if you believe that you ought to A, you intend to A.

MODUS PONENS: Rationality requires that you believe q if you believe p and believe that if p, then q.
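On their wide-scope reading, the five requirements can be rendered schematically as follows. This is only a notational sketch: the operators Bel (belief), Int (intention), R ("rationality requires") and the possibility operator ◇ are assumptions of this rendering, not part of the original formulations.

```latex
\begin{align*}
\text{Intention Compatibility:} &\quad \mathsf{R}\,\neg\bigl(\mathrm{Int}(A) \wedge \mathrm{Int}(B) \wedge \mathrm{Bel}\,\neg\Diamond(A \wedge B)\bigr)\\
\text{Belief Compatibility:}    &\quad \mathsf{R}\,\neg\bigl(\mathrm{Bel}(p) \wedge \mathrm{Bel}(q) \wedge \mathrm{Bel}\,\neg(p \wedge q)\bigr)\\
\text{Means--End Coherence:}    &\quad \mathsf{R}\bigl(\mathrm{Int}(A) \wedge \mathrm{Bel}(\text{$B$-ing is necessary for $A$-ing}) \rightarrow \mathrm{Int}(B)\bigr)\\
\text{Enkrasia:}                &\quad \mathsf{R}\bigl(\mathrm{Bel}(\text{you ought to }A) \rightarrow \mathrm{Int}(A)\bigr)\\
\text{Modus Ponens:}            &\quad \mathsf{R}\bigl(\mathrm{Bel}(p) \wedge \mathrm{Bel}(p \rightarrow q) \rightarrow \mathrm{Bel}(q)\bigr)
\end{align*}
```

On the wide-scope reading, R governs the whole conditional or negated conjunction, which matches the "Rationality requires that, if..., you..." phrasing above.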
Question: What is it in virtue of which the various phenomena mentioned by these requirements count
as instances of the same thing – irrationality? What motivates the cut between these requirements and
more substantial “requirements of reason”?
Bratman’s Lack-of-Standpoint Theory
“When I recognize inconsistency in my own intentions, I see that in this specific
case there is no clear answer to the question, “Where do I stand?” This question
about myself is, with respect to this domain, simply not settled; there is as yet no
fact of the matter...[FN:] I say “with respect to this domain” to indicate that I am
not claiming that any inconsistency or incoherence blocks taking a stand on all
matters. The claim I want to make is relativized to matters that are in the content
of the intentions that are inconsistent or incoherent.” (2009: 431)
The suggested account of narrow irrationality has four components:
(1) The idea that intentions and beliefs can be jointly inconsistent or incoherent in virtue of their
content.
(2) The idea that intentions and beliefs are jointly inconsistent or incoherent in virtue of their content
just in case they violate a rational requirement on intentions and beliefs.
(3) The claim that an intention ordinarily constitutes the agent’s practical standpoint, and a belief the
agent’s theoretical standpoint, on the matters that the intention/belief is about.
(4) The claim that if an agent has intentions that are, in virtue of their content, jointly inconsistent or
incoherent, the agent does not have a standpoint on the subject matters the jointly inconsistent or
incoherent intentions are about.
(1): good, though more work is needed; (2): doubtful, but not hopeless; (3): independently plausible;
(4): seems plausible
Overall account has at least 3 advantages:
• Unified;
• Understandable characterization: irrationality as mental malfunctioning;
• Good chances for vindicating the normativity of rationality: the account can explain why having
conflicting intentions and beliefs is irrational, but having conflicting desires, seemings or intuitions is not.
Not all irrational inconsistencies involve loss of standpoint (i.e. (4) = false)
Intentions-counterexample
Suppose you intend to go to the cinema at 8 pm, also intend to go to the theater at
7.30, and believe that you cannot both go to the cinema at 8 pm and go to the theater
at 7.30. You are disposed to answer “Yes!” when asked whether you intend to go to
the cinema. You are disposed to answer “Yes!” when asked whether you intend to
go to the theater. And you are disposed to answer “No!” when asked whether you
can do both. Seems possible.
⇒ Conflicting intentions, irrationality present; but you have not failed to take a stand on any of the
issues that are in the contents of your intentions – it’s just that your standpoints, too, conflict.
Belief-Counterexample:
Suppose you believe “grass is green”, “if grass is green, then grasshoppers are
green”, and “grasshoppers are gray”. Again, suppose you are disposed to answer
“Yes!” in response to whether grass is green, to whether grasshoppers are green if
grass is green, to whether grasshoppers are gray, and even to whether the conjunction
of these three claims is true. Seems possible.
⇒ Conflicting beliefs, irrationality present; but you have not failed to take a stand on any of the
issues that are in the contents of your beliefs – it’s just that your standpoints, too, conflict.
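The logical skeleton of the belief-counterexample is the classically inconsistent triple {p, p → q, ¬q} – taking “grasshoppers are gray” to entail “grasshoppers are not green”, which is an assumption of this sketch. A minimal Lean check of the inconsistency:

```lean
-- The pattern instantiated by the belief-counterexample:
-- p     = "grass is green"
-- p → q = "if grass is green, then grasshoppers are green"
-- ¬q    = "grasshoppers are not green" (assumed entailed by "gray")
-- The three together yield a contradiction.
example (p q : Prop) (hp : p) (hpq : p → q) (hnq : ¬q) : False :=
  hnq (hpq hp)
```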
Diagnosis: The intuition that inconsistency = loss of standpoint holds only if the inconsistent attitudes
are about the same subject matter. This intuitively seems to us to be so when one attitude’s content is
the negation of the other’s. But the individuation of subject matters does not track inconsistency: there
can be inconsistency relations among propositions that are not similar enough to each other to count as
being about the same subject matter. And in such cases, it seems that the agent has inconsistent
standpoints rather than having failed to take one.
Upshot: Get rid of (4), keep the rest. Resulting conception: Being irrational is having conflicting
standpoints (rather than failing to have one).
The imperfect mathematician and the problem of complex inconsistencies
Suppose a smart, but rather ill-informed and imperfect mathematician believes
some set of mathematical axioms but also believes the negation of something that
has been established, via a complex proof and unbeknownst to the mathematician,
to follow from these axioms. Suppose she believes the negation because her
usually very reliable supervisor has told her that the negation is true. Is she
irrational?
⇒ The imperfect mathematician does not seem to be irrational. But her beliefs conflict. So having
conflicting standpoints cannot be sufficient for irrationality.
⇒ Intuitively, the problem is that this kind of inconsistency is not sufficiently obvious. What to do?
Modifying the judgement
(a) Using the non-obviousness of the inconsistency to fashion an error theory about the non-irrationality judgement.
Objection: Considered opinion remains stable even when inconsistency is stipulated
(b) Re-interpreting our judgement: when we say that the mathematician is not irrational, we really
mean that the mathematician is not criticisably irrational – her irrationality is excused.
Objection: There is no reason to think that this irrationality judgement differs from ordinary irrationality
judgements. If this judgement is about criticisability, then criticisability is part of the concept of
irrationality we are interested in. Talking about “ideal rationality” just changes the subject matter.
Modifying the account or the notion of internal conflict employed.
(a) Irrationality is a matter of obvious internal conflict
Objection: Limited Wriggle Room (LWR): a theory counts as a modification of the idea that irrationality
is solely a matter of conflict among one’s standpoints only if, according to it, a) the primary bearers of
irrationality are combinations of standpoint-constituting attitudes and b) whether any given such
combination is irrational is fully determined by the intrinsic properties of the attitudes involved in that
combination.
⇒ Obviousness is not an intrinsic property of what is obvious, so (a) violates LWR.
(b) Irrationality is a matter of a severe degree of internal conflict.
Objection:
(i) It is hard to see how two directly inconsistent beliefs could be more or less inconsistent.
(ii) It is unlikely that any account of degrees of inconsistency purely in terms of the intrinsic properties
of the attitudes involved in the inconsistency will track our irrationality intuitions, as these seem to
depend on obviousness.
(c) Irrationality is a matter of internal conflict, but not all combinations of logically inconsistent beliefs
constitute internal conflicts
Objection: No good alternative. Most normal cases of inconsistency must still come out as involving
internal conflict, and the account of internal conflict is supposed to be exclusively in terms of intrinsic
properties and relations. But apart from the formal inconsistency, there do not seem to be intrinsic
relations that hold between most normal cases of inconsistent beliefs.
Relaxing LWR and adding transparency or accessibility conditions?
(1) Irrationality = internal conflict that is transparent to the agent
(a) Transparency in virtue of explicit awareness?
Objection: far too narrow
(b) Transparency in virtue of implicit awareness?
Objection: The marker for implicit awareness – in contrast to not being aware at all – is that, in most
circumstances, becoming explicitly aware does not set off ‘automated’ mental processes. But in most
cases of irrationality, becoming explicitly aware of the conflict immediately sets off an automated
attitude revision.
(2) Irrationality = internal conflict that is accessible to the agent
Suppose a somewhat less imperfect mathematician reflects on the complex
inconsistent propositions, develops various hunches, sketches proofs and finally
understands that the previously held beliefs are inconsistent. Plausibly, there is some
time prior to reaching the eventual understanding such that the mathematician could
have made the next step earlier (if she had skipped a meal, say) and would have
reached the conclusion earlier if she had made the step earlier. Suppose the actual
conclusion was reached at t3 because the mathematician started reasoning at t2 but
could have been reached at t2 if the mathematician had started reasoning at t1. In that
case, it is true at t2 (at the latest) that the agent would have ‘seen’ the incompatibility
if she had thought about it, and that she actually had the incompatible beliefs.
Consequently, at t2 – before the mathematician actually comes up with the brilliant
proof – the mathematician is irrational, whereas before t2, she was not.
⇒ General upshot: That an agent has the ability to recognize a very complex inconsistency does not
mean that she exercises that ability; and if she does not, it does not seem as if the mere fact that
she has the ability to recognize a very complex inconsistency suffices to make her complexly
inconsistent beliefs irrational.
Irrationality = mental conflict that could have been avoided by a specific kind of reasoning?
Idea: Being irrational is being in a standpoint-conflict that would have been avoided if one had
engaged in a very rudimentary form of reasoning, i.e. something like automated belief-updating.
Problem 1: Specifying the right kind of mental process is not easy
Problem 2: Whether there is a specific kind of reasoning that has the power to safeguard against
mental conflict seems to be an empirical question.
Problem 3: The resulting account looks hideously gerrymandered; in particular, whether one is
irrational at a time would turn out to depend on a combination of one’s mental states at that time and
one’s mental processes at earlier times.