Abstract
The paper starts with a brief review of the literature about uncertainty in geological, geophysical and petrophysical data. In particular, we present the viewpoints of experts in geophysics on the application of Bayesian inference and subjective probability. Then we present arguments that the use of classical probability theory (CP) does not completely match the structure of geophysical data. We emphasize that such data are characterized by contextuality and non-Kolmogorovness (the impossibility of using the CP model), incompleteness as well as incompatibility of some geophysical measurements. These characteristics of geophysical data are similar to the characteristics of quantum physical data. At the same time, contextuality can be seen as a major deviation of quantum theory from classical physics. In particular, the contextual probability viewpoint is the essence of the Växjö interpretation of quantum mechanics. We propose to use quantum probability (QP) for decision-making during the characterization, modelling, exploration and management of the intelligent hydrocarbon reservoir. Quantum Bayesianism (QBism), one of the recently developed information interpretations of quantum theory, can be used as the interpretational basis for such QP decision-making in geology, geophysics and petroleum project design and management.
This article is part of the themed issue ‘Second quantum revolution: foundational questions’.
Keywords: quantum Bayesian inference, uncertainty, geophysical data, contextuality, intelligent hydrocarbon reservoir
1. Introduction
The main task of geosciences applied to the petroleum industry and to natural resources in general is to make accurate predictions about the structure and space distribution of the main geological and petrophysical patterns in the Earth's subsurface. This is a key point for making sound reservoir decisions related to exploration and management of hydrocarbon deposits (e.g. oil) and well-drilling, as well as for management of water resources and environmental activity, among others. However, it is practically impossible to create a detailed picture of the subsurface world. Surprisingly, we know less about this world than about the cosmos and probably even less than about the quantum micro-world.
We remark that, in recent years, various strategies for uncertainty assessment of data and model structure have been proposed for models related to environmental and groundwater resources; see e.g. [1,2]. Geological models are beginning to apply computational intelligence for reservoir management with a view to improving the reliability of reservoir predictions and modelling, with realistic tolerance to imprecision and uncertainty [3]. An intelligent oil reservoir identification approach deploying a quantum Levenberg–Marquardt neural network was accomplished by Liu et al. [4] as an improved alternative to common statistical identification methods in engineering applications. Geophysical data are characterized by variability, incompleteness and noise, and there is a mismatch between the scales of observations and numerical simulation, which biases the up- and down-scaling of multiscale and multiphysics data during static and dynamic characterization of a petroleum reservoir. Thus, decision-making in applied geosciences has a high degree of uncertainty and risk. This uncertainty is not only of an objective nature; it is also related to the subjectivity of decision-making. The results of measurements are analysed and modelled by human experts, and decisions (regarding, for example, the outcomes of oil well-drilling) are based on subjective probabilities assigned by the experts working on petroleum, geological or geophysical projects.
The viewpoint on the uncertainty which ‘plagues every effort to model subsurface processes and every decision made on the basis of such models’ was presented in the conceptual paper of Tartakovsky & Winter [5] and Tartakovsky [6], dealing with uncertainty quantification, probabilistic risk assessment (PRA) and decision-making under uncertainty. They concluded that the right tool for decision-making is Bayesian inference with subjective probabilities. More generally, these authors embedded geological decision-making into the general theory of decision-making [7,8].
The same viewpoint was presented in the works of some other geoscientists; for example, Sandersen [9] pointed out that: ‘A comprehensive assessment of the uncertainties of the geological model is, however, a complicated task. The nature of the datasets included in the geological model is normally very heterogeneous and every dataset has uncertainties of its own. In addition to this, the geological interpretations performed during the geological modelling have a high degree of subjectivity’.
All these efforts have resulted in intelligent completion technology designs, for instance, of the Schlumberger Company, and such designs manage uncertainties in reservoirs (for example, hydrocarbon deposits in rocks) by monitoring and controlling the individual zones within the wells directly by an operator. Intelligent completion technology is enabling operators to optimize production or injection programmes [10].
The perspectives of optimization and control of ‘intelligent wells’ will not result in an acceptable control of uncertainties unless the next important step, coupling the analysis of geological, geophysical and petrophysical data with the general theory of decision-making, is executed. A successful example of such coupling can be found in the article of Baddeley et al. [11]. The authors of that paper compared decision-making in geology with decision-making in economics, psychology and cognitive science. They especially emphasized the role of biases in experts' decisions about geophysical data and the subjectivity of such decisions.
Such studies of uncertainty in the analysis of geophysical data and their coupling with the general theory of decision-making motivated us to consider the possibility of the use of advanced methods of modern theory of decision-making for the purpose of sustainable oil/gas characterization and exploration designs and top-down intelligent reservoir modelling and managing. Recently, psychologists, sociologists and economists have started to use the mathematical apparatus of quantum probability theory (QP) for modelling the process of decision-making. This is a rapidly developing area of research; see e.g. [12–22] for monographs and a few representative papers (these works also contain relevant references). We speculate that, once QP is combined with advanced intelligent computation techniques, the design and management of the intelligent oil/gas reservoir in the petroleum industry will become a reality.
One can wonder why this mathematical apparatus developed specifically to describe the probabilistic behaviour of a very special class of physical systems, that is, quantum particles, works so well in other fields, as applied to a variety of problems outside of quantum physics. This phenomenon was analysed in great detail in the article [20]; see also §4c of this paper. Here we only stress that the keywords are contextuality and adaptivity. Both quantum particles and a variety of biological and geological systems are very sensitive to variability of contexts for their decisions, judgements and actions [18]. As was explained in a long series of works of one of the coauthors of this paper (see, e.g., pioneering paper [12] and monograph [14]), the main distinguishing feature of QP is its contextuality.1 Geological/geophysical data are as well very sensitive to context variations; see §2 for the analysis. It is natural to describe such data in terms of QP (as one of the best developed models of contextual probability theory). We remark that treatment of quantum theory as a special non-classical probability (non-CP) theory representing contextual probabilities is the essence of the Växjö interpretation of quantum mechanics [14].
Quantum (contextual, by the Växjö interpretation) probabilities can be interpreted in different ways, similarly to classical probabilities. The two main interpretations are the objective and subjective interpretations.2 The subjective probability perspective on quantum mechanics is known as quantum Bayesianism (QBism).3 In a recent article [20] QBism was considered as the basic interpretation of QP-based decision theory. It was proposed to use the QBist perspective on QP not only in quantum physics, but also in other areas of research, especially psychology and economics. In this paper we discuss the possibility of proceeding with the quantum subjective probability approach for decision-making related to the analysis of geological/geophysical data, in view of intelligent oil/gas reservoir design and sustainable management.
This paper is conceptual and its main aim is to attract the attention of experts making decisions about geoscientific projects (e.g. oil exploration or geohydrological applications) to a new type of decision-making theory, quantum Bayesian analysis, and, more generally, to applications of QP in the petroleum industry. This paper was motivated by the results of the authors' collaboration during the last 5 years of research on the fractal behaviour of a naturally fractured carbonate reservoir, performed with financial support of the Mexican programme ‘SENER-CONACYT-Hydrocarburos’. The data collected and analysed during these years demonstrate a high degree of heterogeneity, spatial variability and anisotropy, resulting in high uncertainty of subsurface geometry and topology modelling; see §4c. Taking into account that the design and application of an appropriate strategy for reservoir modelling in strongly heterogeneous and fractured reservoirs is still a controversial issue in reservoir modelling [29], we try to maximize integration of multiscale and multiphysical data inside the new QBism modelling approach in order to pass from the problem of data assimilation that is continuous in time/space to the more realistic case of measurements that are discrete in time/space and contaminated by errors of different origin [30]. Or, paraphrasing Bergeron [31], we try to show how passing from classical to quantum probability can help to translate geological and geophysical ideas into the mathematical and computer simulation language of uncertainty.
2. Uncertainty and risk in geological and geophysical characterization of an oil/gas reservoir
Subsurface systems are very complex. Modern measurement techniques have restricted applicability. Even when they are available for the reservoir's static or dynamic characterization, the bank of measured data can be extremely costly, e.g. in the case of oil/gas reservoir characterization under the seabed or across horizontal wells. There are numerous unknown configurations of the geological (especially sedimentary), tectonic and diagenetic conditions which occur and combine under the Earth's surface, resulting in high heterogeneity and nonlinear behaviour, depending crucially on geographical location.4 Therefore, decision-making about the exploration programme is done under conditions of high uncertainty, and such decisions are very risky. The degree of uncertainty and risk is not less (and is probably higher) than for decisions in the financial market. One has to rule out the most unlikely scenarios and make a correct decision, as the consequences of a wrong decision may be drastic and costly. For these strongly heterogeneous and fractured reservoirs, only the integration of petrophysical modelling [29] with a correctly selected analytical toolbox assures better performance of the production zone.
This is a good place to cite J. Caers, a professor of geological sciences at Stanford's School of Earth, Energy & Environmental Sciences (in 2015 he was interviewed by Than [32]): ‘If you're using very simple models that represent the subsurface as simply layered, or homogeneous, then you could be in for a rude shock when you start drilling for oil or water, or injecting cleanup chemicals into a contaminated aquifer’. He continued: ‘The wells you drill could be dry and the cleanup project that you thought would take a few months could take years. These things have actually happened, and it was due to faulty assumptions about uncertainties’.
It is important to point out that for a long time the role of uncertainty in geological decision-making was overshadowed by the use of deterministic models of the hydrodynamic type. The beauty of the deterministic equations of mathematical physics, say, the system of Navier–Stokes equations, gave the impression that the output of numerical simulation for these equations provides an adequate picture of subsurface processes. This problem was highlighted in the article of Tartakovsky & Winter [5], which can be considered as a manifesto of the importance of uncertainty for their study in hydrogeology:
We still lack both theory and practice—and perhaps, the will—to deal realistically with basic observational quantum systems limits. Groundwater hydrology would be deterministic if we knew conductivity and all other parameters at every point in an aquifer. For that matter, we could solve the Navier–Stokes equations directly if we knew the exact geometry of the porous space. Although raw computer power is enough to solve most practical problems in fluid mechanics, including those involving turbulence, the situation is different in hydrogeology. Our ability to model subsurface flow and transport is severely undermined by lack of information, a problem that cannot be resolved with more computational resources. Enhanced site characterization also has its limits. For the foreseeable future, we cannot know the detailed properties of an aquifer without destroying it, and even that might not be enough for a complete characterization. While the basis of hydrogeologic uncertainty is epistemological, it is no less fundamental in our field than is the Uncertainty Principle a (physical property) in particle physics. [The words in bold were added by the authors of this paper.]
This reference to the Heisenberg uncertainty principle is especially important for us and can be expressed in terms of Jorgensen [33]: we are looking for ‘the measure of a measurement’. We also point to the remark that ‘we cannot know the detailed properties of an aquifer without destroying it’; cf. footnote 1 comparing quantum and geophysical measurements. Starting with a debate about hydrological uncertainty, Tartakovsky & Winter [5] naturally concluded that the only possibility to make adequate decisions on geological projects is to use Bayesian inference endowed with subjective interpretation of probabilities: ‘subsurface models have to be probabilistic and the corresponding probabilities have to be subjective, i.e. they have to account for soft data such as expert knowledge’; see also Tartakovsky [34], and the references therein. We plan to extend this Bayesian approach to decision-making in geoscience applications by appealing to its quantum generalization.
3. Quantum probability
The general formalism of quantum theory is very complicated and mathematically advanced. However, for our purpose—to apply quantum Bayesian inference for problems of decision-making and problem judgement—we only need the basics regarding the measurement theory and quantum handling of probability; see books [14–16] for a non-physicist friendly presentation of these basics.
In the quantum formalism, states of systems are represented by normalized vectors belonging to a complex Hilbert state space H endowed with the scalar product denoted as ⟨·|·⟩. Observables are represented by Hermitian operators acting in H.
QP theory is based on a simple rule connecting the Hilbert space representation of states and observables with probability. This rule was invented by M. Born and is known as the Born rule.
For simplicity, let us consider an observable given by the operator X having a purely discrete spectrum. The set of its eigenvalues is given by the sequence of real numbers (xi). Let (ei) be the set of corresponding eigenvectors. Again, for simplicity, we assume that the spectrum is non-degenerate, i.e. each eigensubspace is one-dimensional. Let ψ be a normalized vector in H representing a state of the system: ‖ψ‖ = 1. Then the probability that the output of a measurement of X is given by the eigenvalue xi is represented in the Hilbert space terminology by the Born rule5:

p(xi) = |⟨ψ | ei⟩|².   (3.1)
This formula is the basis for QP. It couples the vector-state representation with probability. In contrast to CP, QP operates with state vectors (complex probability amplitudes) rather than with probabilities directly. In particular, the QP analogue of the probability update is based on the state update. Therefore, it is necessary to have not only a rule coupling the state vector and the probability of a measurement outcome, but also a rule determining the state transformation produced as the feedback of the measurement. The general formalism of such state transformations is presented by the quantum theory of instruments [35–37]. Its simplest form (sufficient for many important applications) is given by the von Neumann–Lüders projection postulate.
We shall formulate this postulate in the general operator setting. We consider the case of observables represented by Hermitian operators having in general a degenerate (but still discrete) spectrum, i.e. a few linearly independent eigenvectors can correspond to the same eigenvalue. Such a Hermitian operator X can be represented in the form

X = Σ_x x Fx,

where the x variable represents the spectrum of X and Fx denotes the orthogonal projector onto the subspace of eigenvectors corresponding to the eigenvalue x. Generalizing (3.1), the Born rule takes the form:

p(x) = ⟨ψ | Fx ψ⟩.   (3.2)
We remark that each orthogonal projector is Hermitian and is equal to its square. Hence

p(x) = ⟨Fx ψ | Fx ψ⟩ = ‖Fx ψ‖².   (3.3)
The X measurement with the result X = x generates the transformation of the pre-measurement state ψ into the post-measurement state ψx, which is given by the projection of ψ onto the eigensubspace corresponding to the eigenvalue x:

ψx = Fx ψ / ‖Fx ψ‖.   (3.4)

(The denominator is needed for the result to be again a quantum state, i.e. the normalized vector: ‖ψx‖ = 1.)
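To make the Born rule (3.2)–(3.3) and the projection postulate (3.4) concrete, here is a minimal numerical sketch (Python/NumPy); the operator X, its eigenvalues and the state ψ are invented purely for illustration, and the code simply computes p(x) and the post-measurement state for an observable with a degenerate eigenvalue.

```python
import numpy as np

# A toy 3-dimensional Hermitian observable X with a degenerate eigenvalue
# (all numbers are hypothetical, chosen only to illustrate (3.2)-(3.4)).
X = np.array([[1.0, 0.0, 0.0],
              [0.0, 2.0, 0.0],
              [0.0, 0.0, 2.0]])

# Normalized state vector psi (arbitrary illustrative amplitudes).
psi = np.array([1.0, 1.0j, 1.0]) / np.sqrt(3.0)

eigvals, eigvecs = np.linalg.eigh(X)

for x in np.unique(np.round(eigvals, 10)):
    # Orthogonal projector F_x onto the eigensubspace of the eigenvalue x.
    V = eigvecs[:, np.isclose(eigvals, x)]
    F_x = V @ V.conj().T

    # Born rule (3.2)-(3.3): p(x) = <psi, F_x psi> = ||F_x psi||^2.
    p_x = np.vdot(psi, F_x @ psi).real

    # Projection postulate (3.4): post-measurement state for the result X = x.
    psi_x = F_x @ psi / np.linalg.norm(F_x @ psi)

    print(f"x = {x:.0f}: p(x) = {p_x:.3f}, post-measurement state = {np.round(psi_x, 3)}")
```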
4. Quantum probability in geophysics
As was pointed out in the introduction, nowadays QP is widely applied outside of physics, especially for modelling decision-making. Here the main areas of application are cognitive psychology and economics.
(a). Contextual probability in physics and decision-making
One can be curious why the same mathematical formalism works well for so vastly different systems, quantum particles and humans [14] (as well as for other biological systems, e.g. cells and proteins [18]). Here the keyword is contextuality. Microsystems and humans (as well as other biological systems) are very sensitive to variations of contexts. QP can be considered as a calculus of contextual probabilities; this viewpoint on QP was presented in the series of works of Khrennikov; see monograph [14] (and known as the Växjö interpretation of quantum mechanics). Each context (physical, biological, cognitive) C determines its own probability space endowed with its own probability measure PC. In general it is impossible to represent all contextual probabilities as conditional probabilities with respect to the fixed probability measure P, i.e. to have the CP representation PC(A) = P(A | C), where conditional probability is defined with the aid of the Bayes formula:

P(A | C) = P(A ∩ C) / P(C),   P(C) > 0.   (4.1)
(We remark that CP is based on the set-measure-theoretic axiomatics of Kolmogorov (1933); here the Bayes formula (4.1) is the definition.)
In general, the family of contextual probabilities cannot be embedded into a single Kolmogorov probability space; contextual probability theory is hence non-Kolmogorovean. Thus, observables are context-dependent; they cannot be represented by random variables on the same probability space.
A variety of non-Kolmogorovean models can be developed. QP is just one of such models. It is characterized by simplicity, since probabilities are represented by vectors in a linear state space and observables by linear operators (and a linear model is always simpler than a nonlinear one). QP is also characterized by a high degree of testing: it works very well for an important class of context-sensitive systems, quantum particles. Recently, QP demonstrated applicability in modelling decision-making in cognitive psychology and economics. Therefore, it is natural to try to employ QP for formalization of contextual probability in other domains of science.
Now we turn to geophysics. Here measurements are characterized by the unavoidable variability of the results from one rock sample to another one, or from one scale to another one. Therefore, it is difficult (if at all possible) to handle this variability on the basis of common probability space. The use of contextual probability theory (as non-Kolmogorovean probability theory) and its QP model may be fruitful.
(b). Subjective perspective on decision-making based on geophysical data
At the same time, one can point to an important difference between the applicability of QP to observations on quantum systems and humans on the one hand and to geophysical observations on the other. It is commonly accepted that quantum systems do not have objective properties (this is the essence of the Copenhagen interpretation of quantum mechanics), i.e. there are no intrinsic system properties which can be ‘uncoupled’ from measurement devices and methodologies. A similar viewpoint is strongly represented in cognitive psychology: humans elaborate answers to questions and make decisions by adapting to the contexts in which these questions are presented and decisions are made [11]. In contrast to quantum physics, cognitive psychology and economics, it is commonly assumed that in geology one works with collected data corresponding to objective physical processes, e.g. flows of water or oil in porous media. However, even here the role of the subjective component of decision-making is very strong. We present here an extended citation from the paper of Sandersen [9]:
Geological models, however, need special attention because of the high degree of embedded subjectivity. A quantification of the uncertainties may be carried out on the individual datasets. But a quantification of the uncertainties of the geological model as a whole requires an assessment of contributors to the uncertainty that can not necessarily be described numerically. Typically, the subjective aspect of the geological interpretations is so dominant, that it will overshadow the uncertainties related to the datasets. A quantification of the uncertainties would be preferable because we thereby get an opportunity to perform automated calculations of the uncertainties. A quantification of the uncertainties of each dataset can be performed objectively by determining the uncertainties related to equipment, sampling, data interpretation etc. But when interpreting the geology of the area, different data types are combined and a high degree of subjectivity is introduced by the geologist. In the interpretation process, the uncertainties of the individual datasets are not additive, creating a higher model uncertainty.
Thus, it seems that in geology the subjective component of the process of decision-making is no less important than in cognitive psychology or economics. As was already pointed out, this viewpoint was presented in great detail in the paper of Baddeley et al. [11] devoted to the comparative analysis of decision-making in geophysics and the humanities.
In fact, the aforementioned subjective component in geological decision-making is closely coupled to contextuality of data collected on different scales and by multiphysics techniques. Observations cannot provide a complete picture of subsurface processes. The number of drillings is restricted. Each well gives some portion of information about these processes, but it is far from being complete.
(c). Geology: physical space reality or mathematical illusion?
The ideology of ‘objectivity of classical physical processes’ is heavily based on the idea about ‘really existing physical space’. The latter is mathematically modelled as the Cartesian product of three real lines, R3. Moreover, typically one does not distinguish the mathematical model R3 from its physical counterpart. Therefore, physical processes modelled with dynamical systems in R3, including stochastic processes, are considered as real physical processes.
In contrast to this classical realistic interpretation of processes in R3, processes modelled in complex Hilbert space (in quantum physics or applications to decision-making) are treated as non-objective, as epistemic, i.e. representing knowledge of observers. In relation to this distinction, philosophers speak about the ontic modelling of classical physical processes versus epistemic modelling of quantum ones (see Plotnitsky [38]).
Of course, even in classical physics, approachability of the ontic level is only theoretical and even here the primary goal of the observer is to collect knowledge, i.e. to create an epistemic representation. However, even a theoretical possibility to approach the ontic level, i.e. reality as it is,6 plays a crucial role in the experimental methodology of classical physics. In particular, apart from epistemological uncertainty related to collection of knowledge about a physical system, there is also ontic uncertainty of ‘real physical processes’. In quantum physics, at least in its modern information interpretations [14,23–28,31,33], quantum uncertainty is treated as purely epistemic uncertainty.
We remark that, surprisingly, some signatures of this discussion about ontic–epistemic modelling can be found in the geological literature. In particular, there is also a distinction of two classes of uncertainty, aleatory and epistemic; see e.g. Nadim [39].7 The latter is the aforementioned epistemic uncertainty. The former is the geophysical analogue of ontic uncertainty.
For geophysical systems, however, even the adequacy of the model R3 is questionable. Random porous media have fractal and multifractal features (see e.g. Oleschko et al. [40,41]) and geophysical processes are adequately modelled by dynamical systems not in R3, but in fractal and multifractal spaces. The fractal structure can vary from rock to rock. This variability should be modelled by complex multifractal spaces.
The authors of this paper also contributed to such fractal modelling of geophysical processes by using p-adic and more general ultrametric spaces representing the tree-like structure of capillary networks in random porous media; see Khrennikov et al. [42–45] and Oleschko et al. [46]. Probably, the fields of p-adic numbers Qp (here p > 1 is a prime number) are better candidates for ‘physical space representation’ of porous media than the field of real numbers R. The problem of variability of the fractal structure of porous media is treated in our works by using complex ultrametric trees. We illustrate the ultrametric (and hence multifractal) structure of subsurface configurations by images of tree-like skeletons extracted from real geophysical data; see figures 1 and 2.
Figure 1.
An example of a probability cube for porosity, extracted from real seismic 3D data of one oil reservoir. The values of porosity fluctuate between zero and near to 14% and are codified inside the rainbow colour spectrum. To find the conductive zone inside the reservoir on the macro scale, the continuity and tortuosity of each colour pattern are defined by the probability distribution of electromagnetic waves or fluid flow.
Figure 2.
An example of an X-ray microtomographic image of a fragment of the rock sampled inside the same reservoir as in figure 1. The black colour corresponds to the pore space. From this cube, the probability cube for porosity can be constructed for the micro scale. The main goal of petrophysical research is to integrate the micro-correlations to macro-correlations for each studied attribute of the reservoir, with special attention to PoroPerm: the relation between porosity and permeability. Note that each pixel from the seismic cube should be populated with the porosity data of the tomographic image (size in micrometres). The porosity probability cube is constructed from each tomographic slice.
In the situation when the ‘reality of physical space’8 is questionable, it is natural to switch to the operational description of the possible outputs of observations. The quantum model (treated operationally) provides such a description.
(d). The firefly box as illustration for measurements on subsurface configurations
Let us consider measurements in a firefly box setting. This example was proposed by quantum logician Foulis [57] and it played an important role in understanding the difference between quantum and classical logics of events. In this paper we do not have a possibility to discuss logical aspects of QP decision-making, but this example can also serve to illustrate the difference between CP and QP and especially the notion of incompatibility of observations.
There is a box, see figure 3, with two translucent (but not transparent) windows. One window is on the front of the box and the other window is on the side. Other sides of the box are opaque. There is a firefly flying inside of this box. At any given moment, the insect might or might not have its light on. If the light is on, it can be seen as a blip by looking at either of the two windows.
Figure 3.
The illustrative model of fuzzy observations (quantum logic) known as a ‘firefly in a box’.
An observer analysing firefly behaviour can perform two observations, one by looking through the front window and the other one by looking through the side window. Denote them by the symbols F and S. By performing the F-measurement when the light is on, one can tell by the position of the blip whether the firefly is in the left (F = −1) or right (F = +1) half of the box. In the same way by performing the S-measurement when the light is on, one can tell whether the firefly is in the front (S = −1) or back (S = +1) half of the box. Because the windows are not transparent, one cannot rely on depth perception to determine from the front window whether the firefly is in the front or back half of the box, nor from the side window, whether the firefly is in the left or right half of the box. The crucial point is that one cannot perform two measurements simultaneously, i.e. to look at the front and side windows at the same time. Thus physically these observables are incompatible. Moreover, we can construct the quantum-like mathematical model of these observations and their probabilistic outputs including conditional measurements, e.g. first F and then S and vice versa. In this model these observables are represented by non-commuting operators; see Khrennikov [58,59].
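As a rough illustration (our own toy construction, not the specific model of Khrennikov [58,59]), the sketch below represents F and S by non-commuting 2 × 2 Hermitian operators and computes, via the Born rule and the projection postulate, the probabilities of single and sequential observations; the state amplitudes are hypothetical, and the point is only that the order of the incompatible measurements matters.

```python
import numpy as np

# Observables F (left/right) and S (front/back) as non-commuting Hermitian operators.
# F is diagonal in the computational basis; S is diagonal in the rotated basis.
F = np.array([[1.0, 0.0], [0.0, -1.0]])          # eigenvalues +1 (right), -1 (left)
S = np.array([[0.0, 1.0], [1.0, 0.0]])           # eigenvalues +1 (back), -1 (front)
print("Commutator FS - SF:\n", F @ S - S @ F)    # non-zero => incompatible observables

def projector(observable, eigenvalue):
    """Orthogonal projector onto the eigensubspace of the given eigenvalue."""
    w, v = np.linalg.eigh(observable)
    V = v[:, np.isclose(w, eigenvalue)]
    return V @ V.conj().T

# Hypothetical state of the firefly (illustrative amplitudes only).
psi = np.array([np.sqrt(0.7), np.sqrt(0.3) * np.exp(1j * 0.4)])

def prob(obs, value, state):
    """Born rule: probability of obtaining 'value' when measuring 'obs' in 'state'."""
    return np.vdot(state, projector(obs, value) @ state).real

def sequential(obs1, v1, obs2, v2, state):
    """Probability of first obtaining obs1 = v1 and then obs2 = v2 (projection postulate)."""
    P1 = projector(obs1, v1)
    p1 = np.vdot(state, P1 @ state).real
    state1 = P1 @ state / np.linalg.norm(P1 @ state)
    return p1 * np.vdot(state1, projector(obs2, v2) @ state1).real

print("P(F=+1) =", round(prob(F, 1, psi), 3))
print("P(F=+1 then S=+1) =", round(sequential(F, 1, S, 1, psi), 3))
print("P(S=+1 then F=+1) =", round(sequential(S, 1, F, 1, psi), 3))  # order matters
```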
This pair of observables provides only incomplete knowledge about the probability distribution of locations of the firefly in this box. We remark that incompatibility of these observables does not prevent the possibility of determination of so-to-say hidden variables, the position of the firefly in the box given by the Cartesian coordinates (x,y,z). However, to determine these coordinates we need more windows. In fact, construction of such a net of windows may destroy the box (and, in any event, change the behaviour of this firefly). This situation is similar to an attempt to construct a complete picture of a water aquifer or an oil reservoir by drilling a dense net of wells, cf. the discussion in §2. In this comparison, wells play the role of windows for the firefly box.
In the case of the firefly box, one can propose to use less destructive measurement techniques, e.g. light photodetectors, to find the coordinates (x,y,z) of the firefly. However, their applicability depends on the degree of transparency of the sides. By using, for example, photodetectors one should take into account noise which is always present in space and box material. The situation is again similar to geophysical investigation of subsurface configurations. Here we also have non-transparency of the (Earth's) surface combined with noise.
As was already emphasized in §4c, the real physical situation is even more complicated (figures 1 and 2). All previous considerations about the firefly box were based on the explicit assumption about the Euclidean geometry of space inside the box. Suppose now that this geometry is fractal or multifractal, e.g., that a firefly or, better to say, some kind of a fire-caterpillar can move only through capillaries of a network in the random porous medium filling the box. Suppose that the surfaces of capillaries are transparent for light and/or permeable for fluid. In this situation collection of data through a few observables (and without pretending to determine the Cartesian coordinates of the fire-caterpillar) looks like a very natural strategy. The latter example of the fire-caterpillar fractal box mimics well the geophysical situation: drops of water or oil moving through capillary or fracture networks in random porous media.
5. Classical versus quantum updates of probabilities
Probability update is the rule for how we should revise our estimate for the probability of a hypothesis, given some new information, e.g. the output of a new measurement.
(a). Classical Bayesian inference
According to CP, our belief in different hypotheses should be updated using Bayes' law, which is a straightforward consequence of the Bayes formula defining conditional probability in CP. This probability update law states that
P(Hi | D) = P(D | Hi) P(Hi) / Σ_j P(D | Hj) P(Hj).   (5.1)
Here Hi is a particular hypothesis and D is the observed data. CP is based on the set-theoretical paradigm; here equation (4.1) is equivalent to
P(Hi | D) = P(Hi ∩ D) / P(D).

The term P(Hi ∩ D) = P(D | Hi) P(Hi) corresponds to the joint probability of the hypothesis and the data.
(b). The law of total probability and its violations in natural science and the humanities
We pay special attention to the denominator of (5.1). In fact, this is the expression for the probability of the observed data P(D). By using the formalism of CP we can prove that
P(D) = Σ_i P(D | Hi) P(Hi).   (5.2)
The proof is based on the Bayes set-theoretic definition of the conditional probability and additivity of probability. This formula is known as the law of total probability.
Bayes' rule is highly intuitive, indeed to the point that it is hard to envisage alternatives. The QP programme presents a particular alternative to Bayes' rule for the probability update. Our research programme has largely been concerned with exploring a particular alternative probabilistic framework, which reveals alternative intuitions about probabilistic inference. As is typical in development of science, the main driving force for reconsideration of the basic laws of some existing theory (in our case CP) is experimental violation of these laws. The law of total probability is the basic test which is used for such demonstrations, both in quantum physics and in applications to cognitive psychology and economics (as well as molecular biology).
In quantum physics this law is violated by data collected in interference experiments. The basic experiment of this type is the two-slit experiment. Here quantum particles (typically photons) are emitted by a source. Then they have to pass through a screen with two slits. And finally the particles hit another screen, covered by photo-emulsion, producing black dots on it. The collected data D is the probability distribution of dots. One can proceed with two hypotheses, Hi, i = 1,2, ‘the particle passed through the ith slit’. Then it can be shown that statistical data collected in the two-slit experiment violate the law of total probability; see Khrennikov [10,58–60], Asano et al. [18].
To demonstrate this violation, the two-slit experiment is performed in three different contexts: Ci, i = 1,2, ‘only the ith slit is open’; and C12, ‘both slits are open’. The probabilities P(Hi) and P(D | Hi) are collected for contexts Ci and the probability P(D) for context C12. If the corresponding experimental frequencies approximating these probabilities are placed in (5.2), we can see that the left-hand side does not equal the right-hand side.
The quantum formalism gives the additional ‘interference term’ in the right-hand side; see Khrennikov [14,60]. This term can be positive—constructive interference—or negative—destructive interference. This experimental violation of the law of total probability is an exhibition of contextuality of the observed data.
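A small numerical illustration of this interference term (with invented numbers, purely for illustration): the quantum formalism adds 2√(P(H1)P(D | H1)P(H2)P(D | H2)) cos γ to the classical right-hand side of (5.2), so whenever cos γ ≠ 0 the law of total probability is violated.

```python
import math

# Hypothetical two-slit numbers, for illustration only.
P_H1, P_H2 = 0.5, 0.5          # probabilities of passing through slit 1 or 2
P_D_H1, P_D_H2 = 0.2, 0.3      # probabilities of hitting a given region D of the screen

classical = P_H1 * P_D_H1 + P_H2 * P_D_H2   # right-hand side of (5.2)

for gamma in (0.0, math.pi / 2, math.pi):   # constructive, none, destructive
    interference = 2 * math.sqrt(P_H1 * P_D_H1 * P_H2 * P_D_H2) * math.cos(gamma)
    quantum = classical + interference
    print(f"gamma = {gamma:.2f}: classical P(D) = {classical:.3f}, "
          f"quantum P(D) = {quantum:.3f}")
```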
Similar violations were found for statistical data from cognitive psychology. The first violation of the law of total probability for recognition of ambiguous figures was obtained in the experiment performed by Conte et al. [61], which was designed on the basis of the scheme presented in an article of Khrennikov [62]. Later, due to the efforts of the professor of cognitive psychology J. Busemeyer and his students and collaborators, violation of the law of total probability was coupled to a number of well-known psychological effects, in particular the disjunction effect and the order effect [15,63]. In molecular biology, Asano et al. [18] coupled violation of the law of total probability to the lactose–glucose metabolism of E. coli bacteria. This is an example of destructive interference.
Another possibility to test the boundaries of application of CP is to check directly the validity of the Bayes rule for probability update [20].
Violations of the Bayes rule and the law of total probability for experimental statistical data motivate the use of other theoretical schemes for probability update. The quantum scheme is one of the most promising candidates for such an alternative.
(c). Quantum Bayesian inference
In QP we cannot operate with the set-theoretical representation of hypotheses. Here we operate only with observables. In CP observables are represented by random variables. Therefore, it is useful to rewrite the Bayes rule solely in terms of observables.
In CP, the set of hypotheses can be represented with the aid of a random variable Θ taking the values {θ1, θ2, … , θm} with the probabilities π(θi). More generally, when Θ is not necessarily discrete, we can consider the probability density π(θ) rather than discrete probabilities π(θi). However, in this paper we restrict our consideration to the discrete set of hypotheses.
Data in CP can be represented by another random variable X having the values {x1, x2, … , xn} and the probability distribution p(x | θ) for each state θ. We use two different letters π and p to distinguish between probability distributions for hypotheses and information X. Now, if the random variable X is measured, the prior probability distribution π(θ) can be updated with the aid of the information collected from the result of the measurement, say xi. The CP update generates π(θ | x) according to the Bayes rule
π(θ | x) = p(x | θ) π(θ) / Σ_{θ′} p(x | θ′) π(θ′).   (5.3)
Now we turn to QP. Consider two quantum observables represented by Hermitian operators Θ and X with eigenvalues θ1, θ2, … , θm and eigenvalues x1, x2, … , xn, respectively. The first one represents hypotheses and the second one information which can be used for probability update. Here
Θ = Σ_θ θ Eθ   and   X = Σ_x x Fx,   (5.4)
where (Eθ) and (Fx) are orthogonal projectors onto eigensubspaces of these operators and θ = θj, x = xj.
In QP endowed with the subjective interpretation of probability (QBism, see §6), for a particular decision-maker, the initial state representation relevant to a situation is given by a belief state ψ0 ∈ H. This state encodes information about subjective probabilities for all conceivable states of nature. They can be extracted by performing direct measurements of Θ:
π(θ) = ⟨ψ0 | Eθ ψ0⟩ = ‖Eθ ψ0‖².   (5.5)
This observation procedure corresponds to CP decision-making based on the prior probabilities for Θ, when the observer is in the state ψ0. However, QP is not about the update of probabilities. It is about the update of states. Gaining additional information from a measurement of the observable X generates a state update based on the projection postulate.
As in CP, the next step is to update the probabilities of θ on the basis of additional information from a measurement of X. By using the Born rule (3.2) and the projection postulate (3.4), we get

π(θ | x) = ‖Eθ Fx ψ0‖² / ‖Fx ψ0‖².   (5.6)
The result is not constrained to coincide with the Bayesian probability update (5.3). At the same time, this quantum rule for probability update is the natural generalization of CP rule (5.3). If observables X and Θ are compatible,9 then the quantum Bayesian rule (5.6) can be represented in the purely probabilistic form of CP Bayesian update (5.3); see [16] for details.
As was already remarked, the quantum Bayesian rule (5.6) can be in principle interpreted similarly to the classical Bayesian rule (5.3)—as the update of prior probability given by (5.5). However, its genuine interpretation is as the state update: the prior state ψ0 (the belief state of the decision-maker before the X measurement) is updated to the post-measurement state given by ψx = Fx ψ0 / ‖Fx ψ0‖, see (3.4), generated by the feedback on observation of the result X = x. All probabilities under consideration are subjective probabilities; the state update takes place in the head of the decision-maker.
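The following sketch (Python/NumPy, with a deliberately simple two-dimensional example and invented numbers) contrasts the classical update (5.3) with the quantum update (5.6): the prior π(θ) is extracted from a belief state ψ0 via (5.5), the X measurement updates the state as in (3.4), and the quantum posterior is read off from the post-measurement state; because Θ and X are chosen to be incompatible here, the two updates disagree.

```python
import numpy as np

def projectors(hermitian):
    """Return {eigenvalue: orthogonal projector onto its eigensubspace}."""
    w, v = np.linalg.eigh(hermitian)
    return {val: v[:, np.isclose(w, val)] @ v[:, np.isclose(w, val)].conj().T
            for val in np.unique(np.round(w, 10))}

# Hypothetical two-dimensional example: the hypothesis observable Theta and the
# data observable X do not commute (all numbers here are for illustration only).
Theta = np.diag([0.0, 1.0])                      # hypotheses theta = 0 and theta = 1
X = np.array([[0.0, 1.0], [1.0, 0.0]])           # data observable with outcomes x = -1, +1
E, F = projectors(Theta), projectors(X)

psi0 = np.array([np.sqrt(0.8), np.sqrt(0.2)])    # belief state psi_0 (illustrative)

# Prior (5.5): pi(theta) = ||E_theta psi0||^2.
prior = {th: np.linalg.norm(E[th] @ psi0) ** 2 for th in E}

# Suppose the X measurement gave the result x = +1: state update (3.4).
x = 1.0
psi_x = F[x] @ psi0 / np.linalg.norm(F[x] @ psi0)

# Quantum posterior (5.6): pi(theta | x) = ||E_theta F_x psi0||^2 / ||F_x psi0||^2.
quantum_post = {th: np.linalg.norm(E[th] @ psi_x) ** 2 for th in E}

# Classical Bayes (5.3), with likelihoods p(x | theta) = ||F_x e_theta||^2 taken from
# the same operators (eigenvectors e_0 = (1,0), e_1 = (0,1) of Theta), for comparison.
eig = {0.0: np.array([1.0, 0.0]), 1.0: np.array([0.0, 1.0])}
lik = {th: np.linalg.norm(F[x] @ eig[th]) ** 2 for th in eig}
norm = sum(lik[th] * prior[th] for th in prior)
classical_post = {th: lik[th] * prior[th] / norm for th in prior}

print("prior           :", {float(th): round(p, 3) for th, p in prior.items()})
print("quantum update  :", {float(th): round(p, 3) for th, p in quantum_post.items()})
print("classical update:", {float(th): round(p, 3) for th, p in classical_post.items()})
```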
6. Quantum Bayesianism and its role in decision-making for intelligent reservoir management
QBism is one of the information interpretations of quantum mechanics; it was created during the last 17 years due to the tremendous efforts (both scientific and disseminating) of C. Fuchs and his collaborators; see Fuchs & Schack [23,24] for the presentation of the modern version of QBism.10 The key statement of QBism is that quantum probabilities have to be interpreted as subjective probabilities. QBists have enlightened the private agent perspective to decision-making about the possible outputs of quantum measurements. By presenting some theory, it is always useful to appeal to the original works of its creators; here we cite Fuchs & Schack [65]:
The fundamental primitive of QBism is the concept of experience. According to QBism, quantum mechanics is a theory that any agent can use to evaluate her expectations for the content of her personal experience. … In QBism, a measurement is an action an agent takes to elicit an experience. The measurement outcome is the experience so elicited. The measurement outcome is thus personal to the agent who takes the measurement action. In this sense, quantum mechanics, like probability theory, is a single user theory. A measurement does not reveal a pre-existing value. Rather, the measurement outcome is created in the measurement action.
This statement has a really revolutionary character. It seems to be the first declaration of subjective interpretation of probability in quantum physics, where probabilities were traditionally interpreted as the objective entities, i.e. existing independently of an observer's actions.
This subjective probability approach to quantum physics has consequences for physics in general. One can start to think, see Khrennikov [20,66], that if the subjective probability perspective can be consistently used in quantum physics,11 then one can extend it to, for example, classical statistical physics, i.e. to treat even probabilities in thermodynamics as subjective probabilities. One can speak not only about QBism, but even about CBism (classical physics Bayesianism endowed with the subjective interpretation of probability); see [20,66] for a detailed discussion. The subjective probability viewpoint on probabilities in classical statistical mechanics and thermodynamics is not so exotic; for example, it is presented in the excellent introduction to classical and quantum thermodynamics written by Schrödinger [67]. Both classical and quantum theories are presented by using the subjective probability approach. And QBists also point out (see Fuchs & Schack [65]):
According to QBism, quantum mechanics can be applied to any physical system. QBism treats all physical systems in the same way, including atoms, beam splitters, Stern–Gerlach magnets, preparation devices, measurement apparatuses, all the way to living beings and other agents. In this, QBism differs crucially from various versions of the Copenhagen interpretation.
Thus QBists' perspective can be definitely applied outside of quantum physics. In particular, it can serve as a subjective QP-probabilistic basis of decision-making; see Khrennikov [20]. In particular, QBism supports our proposal to use QP in decision-making about geological projects.12
7. Classical versus quantum Bayesian approach to risk analysis in oil reservoirs
As an illustrative example of the application of our formalism, we consider a problem of great importance for the oil industry: decreasing the risk of premature water or gas breakthrough in wells. This task is the main feature of the ‘intelligent downhole production project’. Our aim is to compare the classical and quantum Bayesian approaches to probabilistic risk analysis (PRA) in oil reservoirs.13
(a). Classical probabilistic risk analysis for breakdown of efficiency of oil–water separation control
The aim of PRA is to find both the likelihood of system failure and the efficiency of alternative remediation strategies (alternative strategies for well diagnostics, water–gas control and protection strategies against fluid–gas invasion). From now on, we restrict our consideration to the problem of estimating the probability of oil well invasion by water; the problem of oil well invasion by gas is treated in the same way.
The first step of classical PRA (cf. with PRA for aquifer contamination [34]) is construction of a minimal Boolean algebra of events related to this problem. The main event of interest is ‘oil well invasion by water’ (WIW). It is composed of events representing the failure of its constitutive parts (basic events), such as
— SO: ‘oil-spill occurrence which is related to the invasion of the well by water’,
— FNA: ‘failure of natural attenuation’,
— FRE : ‘failure of a remediation effort’,
and their negations, NSO, NA and RE. We now make some clarification remarks about the terminology in use.
— Natural attenuation (NA): The key to water control is diagnostics—to identify the specific water problem at hand. For instance, in a naturally fractured reservoir the best practice is drilling perpendicular to the fracture system. In general, the water/oil ratios, production data and logging measurements can be considered as natural attenuation design.
— Remediation effort (RE): Design and installation of a plug, cement operation and gel treatment in a well.
Now we consider the negations of these events, namely, FNA and FRE. We remark that both are consequences of a lack of understanding of the problems at hand and the consequent application of inappropriate solutions.
— Failure of natural attenuation (FNA): The geometry and topology of the natural fracture pattern is not taken into account. The other source of FNA is the lack of knowledge on the three-dimensional porosity/permeability pattern of the reservoir.
— Failure of remediation effort (FRE): This can happen as the result of the incorrect design of a plug, cement operation or gel treatment in a well.
The Boolean operations ‘and’ (∩) and ‘or’ (∪) endowing a collection of basic events play the crucial role in the following calculations. Our next step is to represent the WIW event via a Boolean expression:
WIW = SO ∩ (FNA ∪ FRE) = (SO ∩ FNA) ∪ (SO ∩ FRE).   (7.1)
In PRA this expression is known as a cut set representation of the fault tree (see Bedford & Cooke [68]). This representation is used to compute probability P(WIW) by using the additivity of classical probability measure:
P(WIW) = P(SO ∩ FNA) + P(SO ∩ FRE) − P(SO ∩ FNA ∩ FRE).   (7.2)
By using conditioning on the event SO, this equation can be rewritten in the following form:

P(WIW) = P(FNA ∪ FRE | SO) P(SO)   (7.3)

or

P(WIW) = [P(FNA | SO) + P(FRE | SO) − P(FNA ∩ FRE | SO)] P(SO).   (7.4)

If spill has already occurred and everything about it is known, then we can set P(SO) = 1. Then

P(WIW) = P(FNA) + P(FRE) − P(FNA ∩ FRE).   (7.5)
Tartakovsky [34] pointed out that ‘in engineering applications, the probability of a component failure, e.g. P(FNA) or P(FRE), is typically small and the probability of a simultaneous failure of more than one component, e.g. P(FNA ∩ FRE), is often an order of magnitude smaller than that.’ (Of course, this observation is not universal!) Under the aforementioned condition, equation (7.5) can be simplified by neglecting the last term:

P(WIW) ≈ P(FNA) + P(FRE).   (7.6)
This approximate equality is known in engineering as a rare event approximation; see Bedford & Cooke [68]. In this approximation the probability of a system failure depends exclusively on the probabilities of failure of its constituent parts.
The rest of paper [34] is devoted to the analysis of conditions of applicability of the rare event approximation and its modification in the case of non-negligible probability P(FNA ∩ FRE). For us the crucial point is the appearance of the approximate equality (7.6) and the discussion about the possibility of its violation; see again [34]. Another problem discussed in the literature on applications of the Bayesian methods for geophysical data is that calculation of some probabilities (or obtaining good estimates) can be very time and resource consuming. In the case under consideration this is the probability P(FNA ∩ FRE). The main aim is to exclude from consideration the ‘problematic’ probabilities (which are typically probabilities of conjunctions).
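As a minimal numerical sketch of the classical computation (7.5)–(7.6) (all probabilities below are invented for illustration), the following lines compare the exact expression with the rare event approximation.

```python
# Hypothetical component-failure probabilities (illustration only).
P_SO = 1.0              # a spill has already occurred
P_FNA = 0.05            # failure of natural attenuation
P_FRE = 0.02            # failure of the remediation effort
P_FNA_and_FRE = 0.004   # simultaneous failure (often hard or costly to estimate)

# Classical formula (7.5) (with P(SO) = 1; cf. (7.4)).
P_WIW_exact = (P_FNA + P_FRE - P_FNA_and_FRE) * P_SO

# Rare event approximation (7.6): neglect the joint-failure term.
P_WIW_rare = (P_FNA + P_FRE) * P_SO

print(f"P(WIW), exact formula      : {P_WIW_exact:.4f}")
print(f"P(WIW), rare event approx. : {P_WIW_rare:.4f}")
print(f"error of the approximation : {abs(P_WIW_rare - P_WIW_exact):.4f}")
```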
(b). Towards quantum probabilistic risk analysis for breakdown of efficiency of oil–water separation control
We start QP analysis with the observation that the problematic probabilities, such as P(FNA ∩ FRE), are the probabilities of conjunction of two events (or, in the set-theoretic representation of events, the probability of intersection of the sets representing these events). QP is designed to exclude such conjunctions from consideration. In quantum physics there exist events for which conjunction is not defined.
We emphasize once again that in the quantum theory events have no objective, but only an observational meaning. For example, consider the event FNA, ‘failure of natural attenuation’. In the classical probabilistic analysis FNA has the objective meaning: the failure has really taken place. Here FNA is the event which happens in nature. The quantum formalism describes events related to outputs of measurements and nothing else.
Thus the quantum event FNA has the meaning: outputs of observations for this oil well were interpreted as ‘failure of natural attenuation’. A decision-maker (even in the form of a computer program) plays the important role in the interpretation of the results of measurement. This observational viewpoint on the event FNA leads to the following interpretation of its negation, the event NA: outputs of observations do not confirm ‘failure of natural attenuation’. (It might be better to use the symbol of negation of FNA, say NFNA, in place of NA.) This interpretation of the event NA implies the necessity to include it in the complete conditioning scheme for the event WIW. The event WIW, ‘oil well invasion by water’, can happen even if the expert made the NA decision (i.e. not FNA decision). Such a decision does not mean that natural attenuation of the oil well really happened in nature. It only means that on the basis of collected geophysical data the expert was not able to make the decision ‘failure of natural attenuation’. To conclude this discussion about classical versus quantum events, consider (just for a moment) the case of conditioning of the event WIW only on the pair of quantum events, FNA and NA, and the corresponding version of the formula of total probability:

P(WIW) = P(FNA) P(WIW | FNA) + P(NA) P(WIW | NA).

An expert or a computer program assigns (on the basis of collected data) the probability P(FNA) and sets P(NA) = 1 − P(FNA), since NA is negation of FNA. However, at the level of the complete (subsurface) description by values of the physical parameters, the events FNA and NA are represented by sets FNA* and NA* such that their intersection has non-zero probability. Thus

P(WIW) ≠ P(FNA) P(WIW | FNA) + P(NA) P(WIW | NA).

In this situation the formula of total probability is inapplicable.
The same can be said about the event FRE and its (observational) negation, which we denote by the symbol RE. In the observational framework the event FRE is interpreted in the following way. The collection of data led an expert to the decision ‘failure of a remediation effort’. And the event RE is just negation of the event FRE. The event RE has the following meaning: the collected data do not lead to the decision ‘failure of a remediation effort’. Again, the event WIW can happen even if the expert made the RE decision. (We again repeat that the use of symbols such as RE can be misleading; it may be better to use the symbol NFRE. But, to make closer analogy with classical PRA, we shall use the symbols NA and RE.)
Such an observational interpretation of events involved in conditioning for the event WIW on the system of events (FNA, NA, FRE, RE) blocks the standard classical probabilistic reasoning based on the set theoretical representation of events. Therefore, in general, there are no reasons to expect that the formula of total probability in the form:
P(WIW) = P(NA) P(WIW | NA) + P(FNA) P(WIW | FNA) + P(RE) P(WIW | RE) + P(FRE) P(WIW | FRE)   (7.7)

holds true.
And finally we get to the point. The previous long discussion was still just a classical allegory and the real situation is even more complicated (from the methodological and philosophical viewpoints). The above picture based on the interplay between the sets of values of parameters does not match the real physical situation. In general we cannot even determine the values of physical parameters (and even the parameters) leading to the decisions about the events WIW, FNA and FRE (or their negations). Subsurface configurations are so complex and variable that the set-theoretic approach is simply inapplicable.
In the QP-formalism conditioning on the system of observational events (FNA, NA, FRE, RE) is based on the quantum generalization of the formula of total probability [12–14]. Besides the classical conditioning block, the right-hand side of (7.7), this formula contains additional terms which are known in physics as the interference terms. The quantum version of the formula of total probability has the form [12–14]:
P(WIW) = Σ_a P(a) P(WIW | a) + 2 Σ_{a<b} √(P(a) P(WIW | a) P(b) P(WIW | b)) cos γ(a, b),   a, b ∈ {NA, FNA, RE, FRE},   (7.8)
where the indexed angle γ denotes the phase of ‘interference’ between the corresponding pair of events from the system (NA, FNA, RE, FRE) and the ‘decision event’ WIW, ‘oil well invasion by water’. (This modification of the classical formula of total probability will be derived in §7c, in the quantum framework. From the viewpoint of classical probabilistic formalism, the additional terms in (7.8) look very exotic. However, in quantum theory they appear naturally as the result of calculations based on linear algebra and the Born rule for probability.)
In terms of the theory of decision-making, ‘interference’ can be treated as a kind of dependence on conditions. Consider, for example, the interference coefficient cos γ(FNA, FRE). Its presence in the quantum formula of total probability shows that the contributions of the events FNA (‘failure of natural attenuation’) and FRE (‘failure of a remediation effort’) to the occurrence of the event WIW depend on each other.
Now we illustrate the application of the quantum formula of total probability by considering the case of maximal and minimal amplitudes of interference, i.e. corresponding to phases γ = 0 and π/2, π.
It is natural to assume that the interference coefficients cos γ(FNA, NA) and cos γ(FRE, RE) are equal to zero, i.e. the corresponding phases are equal to π/2. For example, the first coefficient encodes the contribution to the probability of ‘oil well invasion by water’ through interrelation of the events FNA and NA. The equality cos γ(FNA, NA) = 0 means that the observational events FNA and NA match very well with the real states of the oil well and the classical set-theoretic description can be used.
In this example the interference between FNA and FRE is maximal, i.e. cos γ(FNA, FRE) = 1 (the phase γ(FNA, FRE) = 0). Indeed, for an expert the events FNA (‘failure of natural attenuation’) and FRE (‘failure of a remediation effort’) are strongly dependent. This is a case of constructive interference.
Now, to make considerations simpler, let us restrict conditioning of WIW only to the pair of events FNA and FRE, i.e. turn to the scheme of classical PRA; see (7.6). The quantum formalism (under the assumption of maximal interference) gives the formula:

P(WIW) = P(FNA) P(WIW | FNA) + P(FRE) P(WIW | FRE) + 2√(P(FNA) P(WIW | FNA) P(FRE) P(WIW | FRE)).   (7.9)

We remark that this formula can be considered as a counterpart of the approximate equality (7.6), the rare event approximation. However, in contrast to the classical probabilistic formalism, in which the detailed analysis based on calculation of the probability P(FNA ∩ FRE) and the equality (7.5) would decrease the probability of contamination, QP analysis increases it. This shows why the above statement, referring to the possibility of representing observational events by some sets of values of geophysical parameters, was just an allegory.
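The following sketch (hypothetical probabilities only) evaluates the restricted quantum formula (7.9) and compares it with the classical-style estimate obtained without the interference term; with maximal constructive interference (γ = 0) the quantum estimate is larger, while γ = π/2 recovers the classical value.

```python
import math

# Hypothetical inputs (illustration only).
P = {"FNA": 0.05, "FRE": 0.02}            # subjective probabilities of the two failure events
P_WIW_given = {"FNA": 0.6, "FRE": 0.4}    # P(WIW | a), assigned by the expert

# Classical total-probability-style estimate restricted to FNA and FRE.
classical = sum(P[a] * P_WIW_given[a] for a in P)

def quantum(gamma):
    """Quantum estimate with interference between FNA and FRE, cf. (7.8)-(7.9)."""
    interference = 2 * math.sqrt(P["FNA"] * P_WIW_given["FNA"]
                                 * P["FRE"] * P_WIW_given["FRE"]) * math.cos(gamma)
    return classical + interference

print(f"classical estimate            : {classical:.4f}")
print(f"quantum, maximal interference : {quantum(0.0):.4f}")          # gamma = 0, cf. (7.9)
print(f"quantum, no interference      : {quantum(math.pi / 2):.4f}")  # reduces to classical
```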
(c). Derivation of quantum formula of total probability for risk analysis for oil well invasion by water
In QP, the formula of total probability is derived in the following formalization. Suppose that the state of the system is represented by the normalized vector ψ. This is the state of the subsurface configuration in the region of the oil well. The complete Hilbert state space H has high dimension corresponding to numerous geophysical characteristics of this subsurface configuration. However, we need not work in such extended multidimensional state space. Similarly to problems treated in quantum information technologies, e.g. quantum computing, we can extract the degrees of freedom corresponding to the basic observables.
We consider the four-dimensional state space with basis states corresponding to the events (NA, FNA, RE, FRE), i.e. the orthonormal basis \((|\mathrm{NA}\rangle, |\mathrm{FNA}\rangle, |\mathrm{RE}\rangle, |\mathrm{FRE}\rangle)\). We recall that these vectors represent the ‘observational events’. The orthogonality of the vectors represents the distinction between possible decisions of an expert (or expert system). The corresponding sets of geophysical parameters, (NA*, FNA*, RE*, FRE*), need not be disjoint. (As was stressed in §7a, in general such ‘subsurface’ sets are not even well defined; therefore the discussion about ‘disjoint’ versus ‘non-disjoint’ sets is merely allegorical.)
We remark that the assumption about orthogonality of these vectors is not crucial; we proceed under this assumption to simplify the mathematical framework.14 Consider the Hermitian operator X with eigenvectors \((|\mathrm{NA}\rangle, |\mathrm{FNA}\rangle, |\mathrm{RE}\rangle, |\mathrm{FRE}\rangle)\). In the quantum information approach the magnitudes of the eigenvalues \((x_{\mathrm{NA}}, x_{\mathrm{FNA}}, x_{\mathrm{RE}}, x_{\mathrm{FRE}})\) are not important; this can be any quadruple of distinct real numbers. We remark that the belief state ψ of a hydrologic expert (encoding beliefs about possible causes of water invasion into the oil well) can be expanded with respect to the basis consisting of the eigenvectors of the observable X:

\[
\psi = c_{\mathrm{NA}}|\mathrm{NA}\rangle + c_{\mathrm{FNA}}|\mathrm{FNA}\rangle + c_{\mathrm{RE}}|\mathrm{RE}\rangle + c_{\mathrm{FRE}}|\mathrm{FRE}\rangle, \qquad \sum_{C}|c_{C}|^{2} = 1.
\tag{7.10}
\]
We also consider another Hermitian operator Θ acting in this four-dimensional state space. It represents the dichotomous observable corresponding to oil well invasion by water; denote the eigenvalues of Θ by the symbols \((\theta_{+}, \theta_{-})\). The first one labels the event ‘water invasion into the oil well’ and the second labels the negation of this event. The magnitudes of these numbers again play no role; they are just labels for the corresponding events. We remark that, since Θ acts in the same state space as the operator X, its spectrum is degenerate, i.e. several linearly independent eigenvectors can correspond to one eigenvalue. For symmetry reasons it is natural to assume that the eigensubspaces have equal dimensions. The Hermitian operator Θ can be represented with the aid of the projectors \(P_{+}\) and \(P_{-}\) onto the eigensubspaces corresponding to its eigenvalues:

\[
\Theta = \theta_{+} P_{+} + \theta_{-} P_{-},
\tag{7.11}
\]
where the projectors are two-dimensional and orthogonal. By fixing the eigenvalues as \(\theta_{\pm} = \pm 1\) we obtain the very simple representation of this operator:

\[
\Theta = P_{+} - P_{-}.
\tag{7.12}
\]
By using the Born rule and the expansion (7.10) of the belief state ψ, we can find the probability of the event WIW:

\[
P(\mathrm{WIW}) = \|P_{+}\psi\|^{2} = \sum_{C}|c_{C}|^{2}\,\langle C|P_{+}|C\rangle + 2\sum_{C<C'}\mathrm{Re}\bigl[\bar{c}_{C}\,c_{C'}\,\langle C|P_{+}|C'\rangle\bigr],
\]

where C, C' run over the system of events (NA, FNA, RE, FRE). By using the exponential representation of the complex scalar products,

\[
\bar{c}_{C}\,c_{C'}\,\langle C|P_{+}|C'\rangle = |c_{C}|\,|c_{C'}|\,\bigl|\langle C|P_{+}|C'\rangle\bigr|\,e^{i\gamma_{\mathrm{WIW}\mid C,C'}},
\tag{7.13}
\]

and by identifying their absolute values with the corresponding probabilities,

\[
|c_{C}|^{2} = P(C), \qquad \langle C|P_{+}|C\rangle = P(\mathrm{WIW}\mid C), \qquad \bigl|\langle C|P_{+}|C'\rangle\bigr| = \sqrt{P(\mathrm{WIW}\mid C)\,P(\mathrm{WIW}\mid C')},
\tag{7.14}
\]

we obtain the equality (7.8), the quantum formula of total probability, where the phases are calculated as

\[
\gamma_{\mathrm{WIW}\mid C,C'} = \arg\bigl(\bar{c}_{C}\,c_{C'}\,\langle C|P_{+}|C'\rangle\bigr).
\tag{7.15}
\]
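The derivation above can be checked numerically. The following sketch (an illustration under our own assumptions: the probabilities, the belief-state phases and the randomly generated two-dimensional eigensubspace are all hypothetical) builds a four-dimensional belief state, a rank-2 projector P+, applies the Born rule and then reassembles the same probability from the classical block plus the pairwise interference terms underlying (7.8):

```python
import numpy as np

# Basis order: NA, FNA, RE, FRE (hypothetical probabilities for illustration)
p = np.array([0.45, 0.05, 0.45, 0.05])          # P(C) = |c_C|^2
phases = np.array([0.0, 0.3, 1.1, 0.7])          # arbitrary belief-state phases
psi = np.sqrt(p) * np.exp(1j * phases)           # expansion (7.10)

# A rank-2 projector P_+ onto a randomly chosen two-dimensional eigensubspace of Theta
rng = np.random.default_rng(1)
a = rng.normal(size=(4, 2)) + 1j * rng.normal(size=(4, 2))
q, _ = np.linalg.qr(a)                            # orthonormal columns
P_plus = q @ q.conj().T                           # projector, cf. (7.11)-(7.12)

# Born rule: P(WIW) = ||P_+ psi||^2
born = np.linalg.norm(P_plus @ psi) ** 2

# Reassemble the same number from the classical block plus pairwise
# interference terms, i.e. the decomposition behind (7.8)
cond = np.real(np.diag(P_plus))                   # P(WIW|C) = <C|P_+|C>
total = float(np.sum(p * cond))
for i in range(4):
    for j in range(i + 1, 4):
        cross = np.conj(psi[i]) * psi[j] * P_plus[i, j]   # cf. (7.13)
        total += 2 * float(np.real(cross))

print(born, total)   # the two values agree up to rounding
```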
Consider the simplest decision-making context in which all phases of the coefficients in the expansion (7.10), \(\phi_{\mathrm{NA}}, \phi_{\mathrm{FNA}}, \phi_{\mathrm{RE}}, \phi_{\mathrm{FRE}}\) (where \(c_{C} = \sqrt{P(C)}\,e^{i\phi_{C}}\)), are equal to zero, i.e. the state of knowledge preceding the decision-making is represented by the state vector:

\[
\psi = \sqrt{P(\mathrm{NA})}\,|\mathrm{NA}\rangle + \sqrt{P(\mathrm{FNA})}\,|\mathrm{FNA}\rangle + \sqrt{P(\mathrm{RE})}\,|\mathrm{RE}\rangle + \sqrt{P(\mathrm{FRE})}\,|\mathrm{FRE}\rangle.
\tag{7.16}
\]
This is in some sense the minimal quantum(-like) state representation of knowledge, using only the probabilities for the X observable. The main problem in applying the quantum generalization of the formula of total probability to decision-making is the selection of the decision operator Θ.
Of course, the main task for further study is to elaborate methods to determine the decision operator Θ for geological projects (e.g. in hydrology and petroleum geology). This problem reduces to the determination of the transition phases; see (7.15). One promising approach is to apply procedures of statistically based learning, including such advanced methods as deep learning. This approach has already been tested in applications of quantum-like models of decision-making in cognitive science and microbiology (e.g. gene expression and epimutations); see Asano et al. [18]. However, construction of the decision operator Θ on the basis of statistical learning methods is not a simple task. To proceed in this way, we need a sufficiently large database of probabilities and transition probabilities corresponding to previous cases of decision-making, say about breakdown of the efficiency of oil–water separation control; statistical data collected in petroleum projects have already led to the creation of large databases of this kind. In future work, the authors plan to proceed in this direction: the creation of proper databases and the design of algorithms for reconstruction of the decision operators.
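As a very rough sketch of what such statistically based learning might look like (this is our own toy assumption, not a method described in this paper or in [18]), one could fit a common interference coefficient so that (7.8) reproduces an observed frequency of WIW; all names and numbers below are hypothetical:

```python
import numpy as np
from itertools import combinations

def fit_common_interference(p_c, p_wiw_given_c, p_wiw_observed):
    """Fit one common interference coefficient cos(gamma) so that the quantum
    formula of total probability (7.8) reproduces an observed WIW frequency.
    Purely illustrative; realistic applications would estimate pair-specific
    phases from much richer historical data."""
    events = list(p_c)
    classical = sum(p_c[c] * p_wiw_given_c[c] for c in events)
    cross = sum(2 * np.sqrt(p_c[a] * p_wiw_given_c[a] * p_c[b] * p_wiw_given_c[b])
                for a, b in combinations(events, 2))
    # Closed-form single-parameter fit, clipped to the admissible range [-1, 1]
    cos_gamma = (p_wiw_observed - classical) / cross
    return float(np.clip(cos_gamma, -1.0, 1.0))

# Hypothetical frequencies (not data from the paper)
p_c = {'NA': 0.45, 'FNA': 0.05, 'RE': 0.45, 'FRE': 0.05}
p_wiw = {'NA': 0.01, 'FNA': 0.6, 'RE': 0.01, 'FRE': 0.5}
print(fit_common_interference(p_c, p_wiw, p_wiw_observed=0.08))
```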
8. Conclusion
Our main proposal is to use QP for decision-making during the characterization, modelling, exploring and management of an intelligent hydrocarbon reservoir. We plan to develop this approach in more detail in our further theoretical and practical research and to elaborate the presented example of PRA for oil well invasion by water and gas.
Footnotes
In the Växjö approach to quantum physics, quantum probabilities were interpreted as objective. However, in applications to decision making, they started to be treated as subjective [20].
It was developed by Fuchs et al. (see [23,24] for its modern presentation) as one of the purely informational interpretations of quantum mechanics (cf. Brukner and Zeilinger [25], D'Ariano et al. [26], Chiribella et al. [27] and Plotnitsky [28]). We remark that the appearance of QBism (as well as of other information interpretations) was stimulated by the quantum information revolution.
Subsurface structures are no less complex than quantum systems. This geo–quantum analogy can be extended by pointing to the restrictive character of measurement possibilities in both areas of science. A single electron cannot be ‘scanned’ to obtain all the parameters determining the electron's state. In the same way, we are able to drill (e.g., to find an oil reservoir) only at a few points on the surface and to collect data only from these few wells. In both cases the collected data are incomplete.
We remark that by the spectral postulate of quantum mechanics only the values belonging to the spectrum (in our case, the set of eigenvalues xi) can be measured for quantum observables. Note that, in order to construct the hydrodynamical model, we should know the whole spectrum of the diversity of permeability/porosity relations.
As an example, we can consider the possibility to measure both position and momentum of a classical particle with any degree of precision. (Of course, everybody understands that this is only a theoretical possibility.)
In particular, Nadim pointed out [39]: ‘Working with uncertainty is an essential aspect of engineering—the larger the uncertainty and the closer to critical, the greater the need for evaluating its effect(s) on the results. To characterize the uncertainties in soil and/or rock properties, the engineer needs to combine, in addition to actual data, knowledge about the quality of the data, knowledge on the geology and, most importantly, engineering judgment’.
We remark that the real space basis of physics was critically questioned in the challenging project of p-adic and, more generally, ultrametric mathematical physics [47–56].
Physically, compatibility means the possibility of joint measurement of such observables; mathematically, it means that these observables can be represented by commuting operators.
One of the first prominent public presentations of QBism was given by C. Fuchs in 2001 at one of the first conferences of the famous Växjö series on quantum foundations, ‘Quantum Theory: Reconsideration of Foundations’; see [64].
And QBists have demonstrated that this is really the case.
Once again we repeat that this is a conceptual paper; our aim is to motivate the use of QP decision-making based on geophysical measurements. Concrete applications may demand substantial additional effort.
The classical counterpart of our analysis is similar to PRA in hydrology as presented in the article by Tartakovsky [34] devoted to Bayesian PRA for contamination of a groundwater aquifer, where the author followed the work of Bedford & Cooke [68].
However, this framework can be easily generalized to take into account the possible non-orthogonality of some state vectors [69].
Data accessibility
This article has no additional data.
Authors' contributions
M.Á.L.A. contributed substantially to the conception, design and interpretation of data, and revising the article critically, made an important intellectual contribution and gave final approval. A.K. contributed substantially to conception and design, drafting the article and revising it critically as well as its final approval. K.O. contributed substantially to conception and design, acquisition, analysis and interpretation of data, drafting the article and revising it critically as well as its final approval. M.d.J.C. contributed substantially to acquisition, analysis and interpretation of data, drafting the article and revising it critically as well as its final approval.
Competing interests
We declare we have no competing interests.
Funding
This paper was partially supported by the project SENER-CONACYT-Hidrocarburos, Yacimiento-Petrolero como un Reactor Fractal, No 168638, and by the Consejo Nacional de Ciencia y Tecnologia (CONACYT), Mexico, under the grant 312–2015, Fronteras de la Ciencia.
References
1. Refsgaard JC, van der Sluijs JP, Brown J, van der Keur P. 2006. A framework for dealing with uncertainty due to model structure error. Adv. Water Resour. 29, 1586–1597. (doi:10.1016/j.advwatres.2005.11.013)
2. Nilsson B, Højbjerg AL, Refsgaard JC, Troldborg L. 2007. Uncertainty in geological and hydrological data. Hydrol. Earth Syst. Sci. 11, 1551–1561. (doi:10.5194/hess-11-1551-2007)
3. Nikravesh M. 2003. Computational intelligence for reservoir management. In Proc. IEEE Int. Conf. on Industrial Informatics (INDIN 2003), pp. 396–401. New York, NY: IEEE.
4. Liu N, Zheng F, Xi K. 2011. An intelligent oil reservoir identification approach by deploying quantum Levenberg–Marquardt neural network and rough set. Int. J. Comput. Sci. Eng. 6, 76–85. (doi:10.1504/IJCSE.2011.041215)
5. Tartakovsky DM, Winter CL. 2008. Uncertain future of hydrogeology. J. Hydrol. Eng. 13, 37–39. (doi:10.1061/(ASCE)1084-0699(2008)13:1(37))
6. Tartakovsky DM. 2013. Assessment and management of risk in subsurface hydrology: a review and perspective. Adv. Water Resour. 51, 247–260. (doi:10.1016/j.advwatres.2012.04.007)
7. Kahneman D, Tversky A. 1973. On the psychology of prediction. Psychol. Rev. 80, 237–251. (doi:10.1037/h0034747)
8. Gigerenzer G. 2002. Reckoning with risk: learning to live with uncertainty. London, UK: Penguin Books.
9. Sandersen PEB. 2008. Uncertainty assessment of geological models—a qualitative approach. In Credibility of modelling. Calibration and reliability in groundwater modelling (eds Refsgaard JC, Kovar K, Haarder E, Nygaard E), pp. 337–344. Wallingford, UK: IAHS.
10. Schlumberger Company. 2007. Intelligent completions. Middle East Asia Reserv. Rev. 8, 6–21.
11. Baddeley MC, Curtis A, Wood R. 2004. An introduction to prior information derived from probabilistic judgements: elicitation of knowledge, cognitive bias and herding. In Geological prior information: informing science and engineering (eds A Curtis, R Wood). Geol. Soc. Lond. Spec. Publ. 239, pp. 15–27. London, UK: Geological Society, London. (doi:10.1144/GSL.SP.2004.239.01.02)
12. Khrennikov A. 1999. Classical and quantum mechanics on information spaces with applications to cognitive, psychological, social and anomalous phenomena. Found. Phys. 29, 1065–1098. (doi:10.1023/A:1018885632116)
13. Khrennikov A. 2004. Information dynamics in cognitive, psychological, social, and anomalous phenomena. Fundamental Theories of Physics, vol. 138. Dordrecht, The Netherlands: Kluwer.
14. Khrennikov A. 2010. Ubiquitous quantum structure: from psychology to finances. Berlin, Germany: Springer.
15. Busemeyer JR, Bruza PD. 2012. Quantum models of cognition and decision. Cambridge, UK: Cambridge University Press.
16. Bagarello F. 2012. Quantum dynamics for classical systems: with applications of the number operator. New York, NY: Wiley.
17. Haven E, Khrennikov A. 2013. Quantum social science. Cambridge, UK: Cambridge University Press.
18. Asano M, Khrennikov A, Ohya M, Tanaka Y, Yamato I. 2015. Quantum adaptivity in biology: from genetics to cognition. Berlin, Germany: Springer.
19. Bagarello F, Haven E. 2015. Towards a formalization of a two traders market with information exchange. Phys. Scr. 90, 015203. (doi:10.1088/0031-8949/90/1/015203)
20. Khrennikov A. 2016. Quantum Bayesianism as the basis of general theory of decision-making. Phil. Trans. R. Soc. A 374, 20150245. (doi:10.1098/rsta.2015.0245)
21. Dzhafarov EN, Zhang R, Kujala JV. 2015. Is there contextuality in behavioral and social systems? Phil. Trans. R. Soc. A 374, 20150099. (doi:10.1098/rsta.2015.0099)
22. Dzhafarov EN, Kujala JV, Cervantes VH. 2016. Contextuality-by-default: a brief overview of ideas, concepts, and terminology. Lecture Notes in Computer Science, vol. 9535, pp. 12–23. Berlin, Germany: Springer. (doi:10.1007/978-3-319-28675-4_2)
23. Fuchs CA, Schack R. 2011. A quantum-Bayesian route to quantum-state space. Found. Phys. 41, 345–356. (doi:10.1007/s10701-009-9404-8)
24. Fuchs CA, Schack R. 2013. Quantum-Bayesian coherence. Rev. Mod. Phys. 85, 1693–1715. (doi:10.1103/RevModPhys.85.1693)
25. Brukner C, Zeilinger A. 1999. Operationally invariant information in quantum mechanics. Phys. Rev. Lett. 83, 3354–3357. (doi:10.1103/PhysRevLett.83.3354)
26. D'Ariano GM. 2007. Operational axioms for quantum mechanics. In Foundations of probability and physics - 4 (eds Adenier G, Fuchs C, Khrennikov A), AIP Conf. Proc. 889, pp. 79–105. Melville, NY: AIP.
27. Chiribella G, d'Ariano GM, Perinotti P. 2012. Informational axioms for quantum theory. In Foundations of probability and physics - 6 (eds M D'Ariano, S-M Fei, E Haven, B Hiesmayr, G Jaeger, A Khrennikov, J-A Larsson), AIP Conf. Proc. 1424, pp. 270–279. Melville, NY: AIP.
28. Plotnitsky A. 2002. Quantum atomicity and quantum information: Bohr, Heisenberg, and quantum mechanics as an information theory. In Quantum theory: reconsideration of foundations (ed. Khrennikov A), pp. 309–343. Växjö, Sweden: Växjö University Press.
29. Soleimani M, Shorki BJ, Rafiei M. 2016. Integrated petrophysical modeling for a strongly heterogeneous and fractured reservoir, Sarvak Formation, SW Iran. Nat. Resour. Res. 26, 75–88. (doi:10.1007/s11053-016-9300-9)
30. Foias C, Mondaini CF, Titi ES. 2016. A discrete data assimilation scheme for the solutions of the two-dimensional Navier–Stokes equations and their statistics. SIAM J. Appl. Dyn. Syst. 4, 2109–2142. (doi:10.1137/16M1076526)
31. Bergeron H. 2001. From classical to quantum mechanics: ‘How to translate physical ideas into mathematical language’. J. Math. Phys. 42, 3983–4019. (doi:10.1063/1.1386410)
32. Than K. 2006. Training computer models to accurately simulate nature's variability. Stanford News Service, December 17.
33. Jorgensen PE. 2007. The measure of a measurement. J. Math. Phys. 48, 1–15. (doi:10.1063/1.2794561)
34. Tartakovsky DM. 2007. Probabilistic risk analysis in subsurface hydrology. Geophys. Res. Lett. 34, 380. (doi:10.1029/2007GL029245)
35. Davies EB, Lewis JT. 1970. An operational approach to quantum probability. Commun. Math. Phys. 17, 239–260. (doi:10.1007/BF01647093)
36. Ozawa M. 1984. Quantum measuring processes of continuous observables. J. Math. Phys. 25, 79–87. (doi:10.1063/1.526000)
37. Ozawa M. 2004. Uncertainty relations for noise and disturbance in generalized quantum measurements. Ann. Phys. 311, 350–416. (doi:10.1016/j.aop.2003.12.012)
38. Plotnitsky A. 2009. Epistemology and probability: Bohr, Heisenberg, Schrödinger and the nature of quantum-theoretical thinking. Berlin, Germany: Springer.
39. Nadim F. 2007. Tools and strategies for dealing with uncertainty in geophysics. In Probabilistic methods in geotechnical engineering (eds Griffiths DV, Fenton GA). CISM International Centre for Mechanical Sciences Courses and Lectures, vol. 491, pp. 71–95. Wien, Austria: Springer.
40. Oleschko K, Parrot JF, Ronquillo G, Shoba S, Stoops G, Marcelino V. 2004. Weathering: toward a fractal quantifying. Math. Geol. 36, 607–627. (doi:10.1023/B:MATG.0000037739.43278.34)
41. Oleschko K, Korvin G, Figueroa B, Vuelvas MA, Balankin AS, Flores L, Carreon D. 2003. Fractal radar scattering from soil. Phys. Rev. E 67, 041403. (doi:10.1103/PhysRevE.67.041403)
42. Khrennikov A, Kozyrev SV, Oleschko K, Jaramillo AG, de Jesus Correa Lopez M. 2013. Application of p-adic analysis to time series. Inf. Dim. Anal. Quant. Prob. Relat. Top. 16, 1350030. (doi:10.1142/S0219025713500306)
43. Khrennikov A, Oleschko K, de Jesús Correa Lopez M. 2016. Modeling fluid's dynamics with master equations in ultrametric spaces representing the treelike structure of capillary networks. Entropy 18, 249. (doi:10.3390/e18070249)
44. Khrennikov AY, Oleschko K, de Jesús Correa López M. 2016. Applications of p-adic numbers: from physics to geology. Contemp. Math. 665, 121–131. (doi:10.1090/conm/665/13363)
45. Khrennikov A, Oleschko K, de Jesus Correa Lopez M. 2016. Application of p-adic wavelets to model reaction–diffusion dynamics in random porous media. J. Fourier Anal. Appl. 22, 809–822. (doi:10.1007/s00041-015-9433-y)
46. Oleschko K, Khrennikov A. 2017. Applications of p-adics to geophysics: linear and quasilinear diffusion of water-in-oil and oil-in-water emulsions. Theor. Math. Phys. 190, 154–163. (doi:10.1134/S0040577917010135)
47. Volovich IV. 1987. P-adic space-time and string theory. Theor. Math. Phys. 71, 574–576. (doi:10.1007/BF01017088)
48. Aref'eva IYa, Dragovich BG, Volovich IV. 1988. On the p-adic summability of the anharmonic oscillator. Phys. Lett. B 200, 512–514. (doi:10.1016/0370-2693(88)90161-X)
49. Vladimirov VS, Volovich IV, Zelenov EI. 1994. P-adic analysis and mathematical physics. Singapore: World Scientific.
50. Dragovich BG. 1995. Adelic harmonic oscillator. Int. J. Mod. Phys. A 10, 2349–2365. (doi:10.1142/S0217751X95001145)
51. Khrennikov AYu. 1994. P-adic valued distributions in mathematical physics. Dordrecht, The Netherlands: Kluwer.
52. Avetisov VA, Bikulov AH, Kozyrev SV. 1999. Application of p-adic analysis to models of breaking of replica symmetry. J. Phys. A Math. Gen. 32, 8785–8791. (doi:10.1088/0305-4470/32/50/301)
53. Parisi G, Sourlas N. 2000. P-adic numbers and replica symmetry breaking. Eur. Phys. J. B 14, 535–542. (doi:10.1007/s100510051)
54. Albeverio S, Cianci R, Khrennikov AYu. 2009. P-adic valued quantization. P-Adic Numbers Ultrametric Anal. Appl. 1, 91–104. (doi:10.1134/S2070046609020010)
55. Dragovich B, Khrennikov A, Kozyrev SV, Volovich IV. 2009. On p-adic mathematical physics. P-Adic Numbers Ultrametric Anal. Appl. 1, 1–17. (doi:10.1134/S2070046609010014)
56. Aref'eva Ya, Djordjevic GS, Khrennikov AYu, Kozyrev SV, Rakic Z, Volovich IV. 2017. p-Adic mathematical physics and B. Dragovich research. P-Adic Numbers Ultrametric Anal. Appl. 9, 82–85. (doi:10.1134/S207004661701008)
57. Foulis DJ. 1999. A half century of quantum logic: what have we learned? In Quantum structures and the nature of reality. Einstein meets Magritte: an Interdisciplinary Reflection on Science, Nature, Art, Human Action and Society, vol. 7, pp. 1–36. Dordrecht, The Netherlands: Springer.
58. Khrennikov A. 2009. Quantum-like representation of macroscopic configurations. In Quantum Interaction, Proc. 3rd Int. Symp., Saarbrücken, Germany, 25–27 March. Lecture Notes in Artificial Intelligence, vol. 5494, pp. 44–58. Berlin, Germany: Springer.
59. Khrennikov A. 2004. Contextual approach to quantum mechanics and the theory of the fundamental prespace. J. Math. Phys. 45, 902–921. (doi:10.1063/1.1645650)
60. Khrennikov A. 2001. Linear representations of probabilistic transformations induced by context transitions. J. Phys. A Math. Gen. 34, 9965–9981. (doi:10.1088/0305-4470/34/47/304)
61. Conte E, Todarello O, Federici A, Vitiello F, Lopane M, Khrennikov A, Zbilut JP. 2007. Some remarks on an experiment suggesting quantum-like behavior of cognitive entities and formulation of an abstract quantum mechanical formalism to describe cognitive entity and its dynamics. Chaos Solitons Fractals 31, 1076–1088. (doi:10.1016/j.chaos.2005.09.061)
62. Khrennikov A. 2004. On quantum-like probabilistic structure of mental information. Open Syst. Inf. Dyn. 11, 267–275. (doi:10.1023/B:OPSY.0000047570.68941.9d)
63. Busemeyer JR, Wang Z, Townsend JT. 2006. Quantum dynamics of human decision making. J. Math. Psychol. 50, 220–241. (doi:10.1016/j.jmp.2006.01.003)
64. Fuchs CA. 2002. Quantum mechanics as quantum information (and only a little more). In Quantum theory: reconsideration of foundations (ed. Khrennikov A), pp. 463–543. Växjö, Sweden: Växjö University Press.
65. Fuchs CA, Schack R. 2014. QBism and the Greeks: why a quantum state does not represent an element of physical reality. Phys. Scr. 90, 015104. (doi:10.1088/0031-8949/90/1/015104)
66. Khrennikov A. 2015. External observer reflections on QBism. (https://arxiv.org/abs/1512.07195)
67. Schrödinger E. 1989. Statistical thermodynamics (reprint). Mineola, NY: Dover Publications.
68. Bedford T, Cooke R. 2001. Probabilistic risk analysis: foundations and methods. Cambridge, UK: Cambridge University Press.
69. Khrennikov A. 2016. Analog of formula of total probability for quantum observables represented by positive operator valued measures. Int. J. Theor. Phys. 55, 3859–3874. (doi:10.1007/s10773-016-3015-x)