2023 Jul 21;7:392–411. doi: 10.1162/opmi_a_00091

Figure 3. 

Mutual information calculations for a realized second-order model. (A) The actor observes a signal α = d + (2/d′)ϵα (red for L: d = −1; green for R: d = +1) and makes an unbiased decision a about d = ±1 at the boundary α = 0. (B, C) The rater receives two pieces of information: β = α + (2/B)ϵβ and γ = d + (2/G)ϵγ, where all ϵ are standard 𝒩(0, 1) and independent. γ is called unique since it carries information about d that is not shared with the actor. Here d′ = 1; B = 1; G = 5. (D) The density P(β, γ|a = −1) slightly favours the lower left quadrant, but with substantial noise. The distribution integrates to 1; the color scale is omitted for convenience. (E) The conditional probability P(d = −1|a = −1, β, γ) is the accuracy afforded by the rater’s information set (a = −1, β, γ). If β, γ ≪ 0, then the decision a = −1 is likely to be correct. The contour lines show the boundaries where this objective confidence crosses the values shown; the enclosed regions are where the corresponding objective confidence ratings would be given. (F) Considering the regions of β, γ that define these confidence bins, we can assess the expected accuracy, defined by the combination of the probability of landing in one of the confidence bins c(a = −1, β, γ) and the chance of being correct (white) or incorrect (black) in that bin. The mutual information 𝓘(r, c) between being correct and c (given a = −1) is 0.104 bits. (G–I) The same as (D–F), but for the case a = 1. Since the problem is symmetric, these panels are essentially the same as for a = −1.
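The generative model and the mutual information estimate described in this caption can be sketched numerically. The following is a minimal Monte Carlo illustration, not the paper's own code: it assumes the noisy-signal forms α = d + (2/d′)ϵα, β = α + (2/B)ϵβ, γ = d + (2/G)ϵγ with d′ = 1, B = 1, G = 5 as in the caption, computes the rater's posterior P(d = a | a, β, γ) in closed form, and estimates 𝓘(r, c) for a = −1 after discretizing confidence. The confidence-bin edges here are illustrative placeholders (the caption's actual contour values are not stated), so the resulting bit value need not reproduce the reported 0.104 bits exactly.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# Caption parameters: d' = 1, B = 1, G = 5; each noise s.d. is 2 / sensitivity.
sig_a, sig_b, sig_g = 2.0 / 1.0, 2.0 / 1.0, 2.0 / 5.0

n = 200_000
d = rng.choice([-1.0, 1.0], size=n)            # true stimulus
alpha = d + sig_a * rng.standard_normal(n)     # actor's signal
a = np.where(alpha < 0, -1.0, 1.0)             # unbiased decision at alpha = 0
beta = alpha + sig_b * rng.standard_normal(n)  # rater's noisy copy of alpha
gamma = d + sig_g * rng.standard_normal(n)     # rater's unique signal

def joint(beta, d_val, a_val):
    """P(a, beta | d): marginalize alpha over the half-line selected by a."""
    v_post = 1.0 / (1.0 / sig_a**2 + 1.0 / sig_b**2)
    m_post = v_post * (d_val / sig_a**2 + beta / sig_b**2)
    p_neg = norm.cdf(0.0, m_post, np.sqrt(v_post))   # P(alpha < 0 | d, beta)
    half = np.where(a_val < 0, p_neg, 1.0 - p_neg)
    return norm.pdf(beta, d_val, np.hypot(sig_a, sig_b)) * half

# Objective confidence: posterior that the decision a is correct,
# given the rater's information set (a, beta, gamma).
num = joint(beta, a, a) * norm.pdf(gamma, a, sig_g)
den = num + joint(beta, -a, a) * norm.pdf(gamma, -a, sig_g)
conf = num / den

# Discretize confidence (illustrative bin edges) and estimate I(r; c) for a = -1.
mask = a == -1.0
r = (d == a)[mask].astype(int)               # correct (1) vs incorrect (0)
c = np.digitize(conf[mask], [0.6, 0.7, 0.8, 0.9])

p_rc = np.zeros((2, 5))
np.add.at(p_rc, (r, c), 1.0)
p_rc /= p_rc.sum()
p_r = p_rc.sum(axis=1, keepdims=True)
p_c = p_rc.sum(axis=0, keepdims=True)
nz = p_rc > 0
mi = np.sum(p_rc[nz] * np.log2(p_rc[nz] / (p_r @ p_c)[nz]))
print(f"I(r, c | a = -1) = {mi:.3f} bits")
```

By symmetry of the model (panels G–I), running the same estimate conditioned on a = +1 gives essentially the same value.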