



Objective Bayesianism is a methodological theory that is currently applied in statistics, philosophy, artificial intelligence, physics and other sciences. This book develops the formal and philosophical foundations of the theory, at a level accessible to a graduate student with some familiarity with mathematical notation. 

An examination of topics involved in statistical reasoning with imprecise probabilities. The book discusses assessment and elicitation, extensions, envelopes and decisions, the importance of imprecision, conditional previsions and coherent statistical models. 



Coping with uncertainty is a necessary part of ordinary life and is crucial to an understanding of how the mind works. For example, it is a vital element in developing artificial intelligence that will not be undermined by its own rigidities. There have been many approaches to the problem of uncertain inference, ranging from probability to inductive logic to nonmonotonic logic. This book seeks to provide a clear exposition of these approaches within a unified framework. The principal market for the book (...)

At least one of these conceptions of probability underlies any theory of statistical inference (or, to use Neyman's phrase, 'inductive behavior'). ... 

Additionally, the text shows how to develop computationally feasible methods to mesh with this framework. 



We report two issues concerning diverging sets of Bayesian (conditional) probabilities (divergence of "posteriors") that can result with increasing evidence. Consider a set P of probabilities, typically, but not always, based on a set of Bayesian "priors." Fix E, an event of interest, and X, a random variable to be observed. With respect to P, when the set of conditional probabilities for E, given X, strictly contains the set of unconditional probabilities for E, for each possible outcome X = x, call this (...)



A bounded formula is a pair consisting of a propositional formula φ in the first coordinate and a real number within the unit interval in the second coordinate, interpreted to express the lower-bound probability of φ. Converting conjunctive/disjunctive combinations of bounded formulas to a single bounded formula consisting of the conjunction/disjunction of the propositions occurring in the collection along with a newly calculated lower probability is called absorption. This paper introduces two inference rules for effecting conjunctive and disjunctive absorption and (...)
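A minimal sketch of what absorption rules of this kind might compute, assuming the standard Fréchet lower bounds for conjunction and disjunction (the paper's own rules may be sharper; the function and variable names here are illustrative):

```python
def conj_absorb(bf1, bf2):
    """Conjunctive absorption: combine two bounded formulas into one.

    A bounded formula is modeled as a (formula_string, lower_bound) pair.
    The Frechet lower bound for a conjunction is max(0, x + y - 1).
    """
    (phi, x), (psi, y) = bf1, bf2
    return (f"({phi} & {psi})", max(0.0, x + y - 1.0))

def disj_absorb(bf1, bf2):
    """Disjunctive absorption: the probability of a disjunction is
    at least the larger of the two lower bounds, max(x, y)."""
    (phi, x), (psi, y) = bf1, bf2
    return (f"({phi} | {psi})", max(x, y))
```

For instance, `conj_absorb(("p", 0.9), ("q", 0.8))` yields the single bounded formula `("(p & q)", 0.7)`: the conjunction's lower probability can drop below either conjunct's bound, but never below x + y − 1.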


This major work challenges some widely held positions in epistemology: those of Peirce and Popper on the one hand and those of Quine and Kuhn on the other.







(...) the symmetry of our evidential situation. If our confidence is best modeled by a standard probability function, this means that we are to distribute our subjective probability, or credence, sharply and evenly over possibilities among which our evidence does not discriminate. Once thought to be the central principle of probabilistic reasoning by great (...)





This essay presents results about a deviation from independence measure called focused correlation. This measure explicates the formal relationship between probabilistic dependence of an evidence set and the incremental confirmation of a hypothesis, resolves a basic question underlying Peter Klein and Ted Warfield's ‘truth-conduciveness’ problem for Bayesian coherentism, and provides a qualified rebuttal to Erik Olsson's claim that there is no informative link between correlation and confirmation. The generality of the result is compared to recent programs in Bayesian epistemology (...)





This paper concerns exchangeable analogical predictions based on similarity relations between predicates, and deals with a restricted class of such relations. It describes a system of Carnapian λγ rules on underlying predicate families to model the analogical predictions for this restricted class. Instead of the usual axiomatic definition, the system is characterized with a Bayesian model that employs certain statistical hypotheses. Finally the paper argues that the Bayesian model can be generalized to cover cases outside the restricted class of similarity (...) 
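The λγ rules mentioned have the familiar Carnapian predictive form: the probability that the next observation falls in cell i of a predicate family is (n_i + λ·γ_i)/(n + λ). A sketch under that assumption (parameter names are illustrative, and the paper's system layers such rules over families rather than using a single rule):

```python
def lambda_gamma_rule(counts, i, lam, gamma):
    """Carnapian lambda-gamma predictive probability that the next
    observation is of type i, given observed counts per type.

    counts: dict type -> observed count n_i
    gamma:  dict type -> prior weight gamma_i (weights sum to 1)
    lam:    lambda parameter trading prior weight against data
    """
    n = sum(counts.values())
    return (counts[i] + lam * gamma[i]) / (n + lam)
```

With symmetric weights the rule interpolates between the observed relative frequency (λ → 0) and the prior weights (λ → ∞), and the predictive probabilities across types always sum to one.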





We discuss several features of coherent choice functions, where the admissible options in a decision problem are exactly those that maximize expected utility for some probability/utility pair in a fixed set S of probability/utility pairs. In this paper we consider, primarily, normal form decision problems under uncertainty, where only the probability component of S is indeterminate and utility for two privileged outcomes is determinate. Coherent choice distinguishes between each pair of sets of probabilities regardless of the “shape” or “connectedness” of the sets of (...)
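The admissibility notion described, on which an option is choosable just in case it maximizes expected utility under at least one probability in the set, can be sketched directly. Here options are utility vectors over states and the set S is a finite list of probability vectors, a finite stand-in for a possibly non-convex set (all names are illustrative):

```python
def expected_utility(option, p):
    """Expected utility of an option (utilities per state) under p."""
    return sum(u * q for u, q in zip(option, p))

def e_admissible(options, probabilities):
    """Return indices of options that maximize expected utility under
    at least one probability in the given set (E-admissibility)."""
    chosen = set()
    for p in probabilities:
        best = max(expected_utility(o, p) for o in options)
        chosen |= {i for i, o in enumerate(options)
                   if abs(expected_utility(o, p) - best) < 1e-12}
    return sorted(chosen)
```

Note the contrast with maximizing against a single "representative" probability: an option dominated under every member of the set is never admissible, even if it hedges well on average.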

Coherentism maintains that coherent beliefs are more likely to be true than incoherent beliefs, and that coherent evidence provides more confirmation of a hypothesis when the evidence is made coherent by the explanation provided by that hypothesis. Although probabilistic models of credence ought to be well-suited to justifying such claims, negative results from Bayesian epistemology have suggested otherwise. In this essay we argue that the connection between coherence and confirmation should be understood as a relation mediated by the causal relationships (...)

Conditioning can make imprecise probabilities uniformly more imprecise. We call this effect "dilation". In a previous paper (1993), Seidenfeld and Wasserman established some basic results about dilation. In this paper we further investigate dilation on several models. In particular, we consider conditions under which dilation persists under marginalization and we quantify the degree of dilation. We also show that dilation manifests itself asymptotically in certain robust Bayesian models and we characterize the rate at which dilation occurs. 
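The textbook instance of dilation is easy to reproduce numerically: let A be a fair-coin event and B an event with P(B) = 1/2 but unknown dependence on A. The unconditional probability of B is sharp, yet conditioning on either outcome of A dilates it to the whole unit interval. A sketch (the grid over joint masses is illustrative):

```python
def dilation_intervals(steps=100):
    """Sweep joint distributions over (A, B) with P(A) = P(B) = 1/2 and
    unknown dependence: P(A and B) = t for t in [0, 1/2].
    Returns the sets of attained values of P(B), P(B|A), P(B|not A)."""
    uncond, given_a, given_not_a = set(), set(), set()
    for k in range(steps + 1):
        t = 0.5 * k / steps               # P(A and B)
        uncond.add(0.5)                   # P(B) is sharp at 1/2
        given_a.add(t / 0.5)              # P(B|A) = 2t, sweeps [0, 1]
        given_not_a.add((0.5 - t) / 0.5)  # P(B|not A) = 1 - 2t
    return uncond, given_a, given_not_a
```

Whichever way the coin lands, the posterior interval for B strictly contains the prior one, which is exactly the dilation phenomenon the paper quantifies.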

This paper analyzes concepts of independence and assumptions of convexity in the theory of sets of probability distributions. The starting point is Kyburg and Pittarelli’s discussion of “convex Bayesianism” (in particular their proposals concerning E-admissibility, independence, and convexity). The paper offers an organized review of the literature on independence for sets of probability distributions; new results on graphoid properties and on the justification of “strong independence” (using exchangeability) are presented. Finally, the connection between Kyburg and Pittarelli’s results and recent developments (...)

A common methodological adage holds that diverse evidence better confirms a hypothesis than does the same amount of similar evidence. Proponents of Bayesian approaches to scientific reasoning such as Horwich, Howson and Urbach, and Earman claim to offer both a precise rendering of this maxim in probabilistic terms and an explanation of why the maxim should be part of the methodological canon of good science. This paper contends that these claims are mistaken and that, at best, Bayesian accounts of diverse (...) 

Focused correlation compares the degree of association within an evidence set to the degree of association in that evidence set given that some hypothesis is true. A difference between the confirmation lent to a hypothesis by one evidence set and the confirmation lent to that hypothesis by another evidence set is robustly tracked by a difference in focused correlations of those evidence sets on that hypothesis, provided that all the individual pieces of evidence are equally, positively relevant to that hypothesis. (...) 





Many have claimed that unspecific evidence sometimes demands unsharp, indeterminate, imprecise, vague, or interval-valued probabilities. Against this, a variant of the diachronic Dutch Book argument shows that perfectly rational agents always have perfectly sharp probabilities.



The "traditional" view of normative decision theory, as reported (for example) in chapter 2 of Luce and Raiffa's [1957] classic work, Games and Decisions, proposes a reduction of sequential decision problems to nonsequential decisions: a reduction of extensive forms to normal forms. Nonetheless, this reduction is not without its critics, both from inside and outside expected utility theory. It is my purpose in this essay to join with those critics by advocating the following thesis.

While in principle probabilistic logics might be applied to solve a range of problems, in practice they are rarely applied at present. This is perhaps because they seem disparate, complicated, and computationally intractable. However, we shall argue in this programmatic paper that several approaches to probabilistic logic fit into a simple unifying framework: logically complex evidence can be used to associate probability intervals or probabilities with sentences.





Kyburg goes halfway towards objective Bayesianism. He accepts that frequencies constrain rational belief to an interval but stops short of isolating an optimal degree of belief within this interval. I examine the case for going the whole hog. 
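A sketch of the "whole hog" step at issue: given a frequency-calibrated interval [l, u] of permissible degrees of belief, the objective Bayesian selects the most equivocal point in the interval, the one closest to 1/2. The one-liner below assumes a single binary proposition (the general case equivocates over a full partition, typically via maximum entropy):

```python
def equivocal_degree(lower, upper):
    """Most equivocal degree of belief within the calibration interval
    [lower, upper]: 1/2 if the interval contains it, otherwise the
    endpoint nearest to 1/2."""
    return min(max(0.5, lower), upper)
```

So where Kyburg stops at the interval itself, this rule isolates a unique degree of belief within it.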

