Abstract
The practice of medicine has evolved significantly over time, from a more holistic to a reductionist or mechanistic approach. This paper briefly traces the history of medicine and the transition to quantitative medicine, which has enabled more personalized and targeted treatments and improved understanding of the underlying biological mechanisms of disease. However, this shift has also presented some challenges and criticisms, including the danger of losing sight of the patient as a unique, whole individual. This paper explores the underlying principles and key contributions of quantitative medicine, as well as the context for its rise, including the development of new technologies and the influence of reductionist philosophies. We also discuss the challenges and criticisms of this approach, and the need to balance reductionist and holistic approaches in order to achieve a comprehensive understanding of human health. Ultimately, by integrating insights from philosophy, physics, and other fields, we may be able to develop new and innovative approaches that bridge the gap between reductionism and holism and improve patient outcomes with the new “quantitative holism.”
Keywords: Medicine, holism, quantitative medicine, evolution, philosophy
Introduction
Medicine has a rich and complex history, shaped by a variety of factors including cultural, social, and scientific developments. From the earliest forms of healing practiced by ancient civilizations to the cutting-edge medical technologies of today, medicine has undergone significant changes over time. One of the most profound shifts in the history of medicine has been the move from a more holistic approach to a reductionist or mechanistic approach. 1
In its early days, medicine was often practiced by spiritual or religious leaders who saw illness as a punishment from the gods or a result of spiritual imbalance. 2 These healers sought to restore balance to the body, mind, and spirit of their patients through a variety of methods including prayer, meditation, and herbal remedies. 2 Over time, these approaches gave way to more empirical and scientific methods, as doctors began to develop a better understanding of the human body and its functioning. 3
During the Renaissance, for example, anatomists began to dissect cadavers in order to gain a deeper understanding of the human body’s structure and function. 4 This led to the development of new surgical techniques and treatments, as well as a greater emphasis on empirical observation and experimentation. 5 In the 19th and early 20th centuries, medicine became increasingly specialized, with doctors focusing on specific areas of the body or specific diseases.6,7
However, the reductionist approach to medicine really came into its own in the mid-20th century, with the rise of the biomedical model. 8 This model sees the body as a machine, with individual parts that can be studied and manipulated independently. The goal of medicine became focused on identifying and treating specific diseases or conditions, often through the use of drugs or surgery. While this approach has led to significant advances in medical knowledge and treatment, it has also been criticized for the risk of ignoring the complex interactions between different systems in the body and for reducing patients to a collection of symptoms and diagnoses. 9
The shift toward a more reductionist approach to medicine was influenced by a variety of scientific and philosophical trends, as well as by the development of new technologies that allowed for more precise and targeted interventions. One of the key philosophical influences on the reductionist approach to medicine was the rise of Cartesian dualism in the 17th century.10,11 This philosophical perspective sees the mind and body as separate entities, with the body being a machine that can be studied and understood independently of the mind. This perspective influenced many early anatomists and physiologists, who saw the body as a collection of discrete parts that could be studied in isolation. 12
Another influential philosophical trend was logical positivism, which emerged in the early 20th century. 13 This approach emphasized the importance of empirical observation and experimentation in the development of scientific knowledge. In medicine, this led to a greater emphasis on quantitative data and measurable outcomes, as well as on the use of randomized controlled trials to test the efficacy of different treatments. 14
One key figure in the development of the reductionist approach to medicine was French physiologist Claude Bernard. In the mid-19th century, Bernard argued that the body could be understood as a collection of independent physiological systems, each of which could be studied and understood in isolation. 15 He also emphasized the importance of experimentation and measurement in the development of medical knowledge. Another important figure in the history of quantitative medicine was English physician and epidemiologist Austin Bradford Hill. Hill’s work in the mid-20th century helped to establish the importance of randomized controlled trials in the testing of new treatments. 16 He also emphasized the need for careful observation and data collection, and argued that medicine should be based on empirical evidence rather than intuition or tradition. 16
At the same time, advances in technology were also contributing to the shift toward a more reductionist approach to medicine. The development of new tools like microscopes and X-rays allowed doctors to see inside the body and study its structure and function in greater detail. The invention of new drugs and surgical techniques also allowed for more precise and targeted interventions. In recent years, the development of new technologies like genomics and proteomics has further fueled the rise of quantitative medicine. These fields allow researchers to study the complex interactions between genes, proteins, and other biological molecules, and to develop more personalized and targeted treatments based on this knowledge. 17 At the same time, the growing availability of digital health data has opened up new opportunities for using big data and machine learning to analyze patterns in health and disease.18,19
Despite these advances, however, there are still many challenges and limitations to the reductionist approach to medicine. Some critics argue that it ignores the complexity and interconnectedness of the human body, and that it can lead to a focus on treating individual symptoms rather than addressing underlying causes. Others argue that it can be dehumanizing, reducing patients to a collection of data points and diagnoses.
The rise of quantitative medicine
Quantitative medicine is a paradigm shift in the practice of medicine that emphasizes the use of quantitative data and mathematical models to understand and treat disease. 20 This approach is based on the idea that the human body can be studied as a complex system, with many interconnected parts that can be modeled and simulated using mathematical and computational tools. At its core, quantitative medicine is based on four key principles: precision, personalization, prediction, and prevention.
Precision Medicine: Precision medicine is a key tenet of quantitative medicine, and refers to the use of molecular and genetic information to guide the development of targeted treatments for individual patients. This approach recognizes that different patients may respond differently to the same treatment, and that a more personalized approach may be necessary to achieve optimal outcomes.
One example of precision medicine is the use of genomic sequencing to identify specific mutations or genetic markers that are associated with certain types of cancer. By identifying these markers, doctors can develop treatments that are tailored to the patient’s specific genetic profile, potentially improving outcomes and reducing side effects.
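To make this logic concrete, the following minimal Python sketch matches a patient’s detected variants against a small catalog of actionable markers and returns candidate targeted therapies. It is purely illustrative: the catalog entries, the patient data, and the function name are assumptions introduced here for demonstration, not a clinical decision tool.

```python
# Illustrative sketch only: matching sequencing results against a small,
# hypothetical table of actionable markers. All data here are invented.

# Hypothetical catalog of actionable genetic markers and associated therapy classes.
ACTIONABLE_MARKERS = {
    ("EGFR", "L858R"): "EGFR tyrosine kinase inhibitor",
    ("BRAF", "V600E"): "BRAF/MEK inhibitor combination",
    ("ERBB2", "amplification"): "HER2-targeted antibody",
}

def suggest_targeted_therapies(patient_variants):
    """Return therapy suggestions for variants found in the catalog."""
    suggestions = []
    for gene, variant in patient_variants:
        therapy = ACTIONABLE_MARKERS.get((gene, variant))
        if therapy:
            suggestions.append((gene, variant, therapy))
    return suggestions

# Hypothetical sequencing result for one patient.
patient_variants = [("TP53", "R175H"), ("BRAF", "V600E")]
for gene, variant, therapy in suggest_targeted_therapies(patient_variants):
    print(f"{gene} {variant}: consider {therapy}")
```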
Personalized Medicine: Personalized medicine builds on the principles of precision medicine, but takes a broader view of the patient as a whole person. This approach recognizes that patients may have different needs and preferences, and that a one-size-fits-all approach to treatment may not be effective.
Personalized medicine emphasizes the importance of patient-centered care, and involves working closely with patients to develop treatment plans that take into account their individual circumstances, including their medical history, lifestyle, and preferences.
Predictive Medicine: Predictive medicine involves using mathematical and computational models to predict the risk of disease and to identify patients who may be at higher risk. This approach can be used to develop more targeted screening and prevention strategies, potentially reducing the overall burden of disease.
One example of predictive medicine is the use of risk prediction models to identify patients who are at higher risk of developing cardiovascular disease. By identifying these patients early, doctors can develop targeted interventions, such as lifestyle changes or medication, to prevent or delay the onset of disease.
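A minimal sketch of what such a risk prediction model might look like is given below, assuming synthetic data and an arbitrary decision threshold; the features, the way the synthetic outcome is generated, and the 20% cutoff are illustrative only and are not derived from any validated clinical score.

```python
# Minimal sketch of a risk prediction model on synthetic data; illustrative only.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical cohort: age (years), systolic BP (mmHg), total cholesterol (mg/dL),
# smoking status (0/1), and a synthetic 10-year event indicator.
n = 1000
X = np.column_stack([
    rng.normal(55, 10, n),      # age
    rng.normal(130, 15, n),     # systolic blood pressure
    rng.normal(200, 35, n),     # total cholesterol
    rng.integers(0, 2, n),      # smoker
])
# Synthetic outcome constructed so that risk rises with each factor.
logit = -12 + 0.08 * X[:, 0] + 0.03 * X[:, 1] + 0.01 * X[:, 2] + 0.7 * X[:, 3]
y = rng.random(n) < 1 / (1 + np.exp(-logit))

model = LogisticRegression(max_iter=1000).fit(X, y)

# Estimate risk for a new patient and flag them for targeted prevention
# if the predicted probability exceeds an (arbitrary) 20% threshold.
new_patient = np.array([[62, 145, 240, 1]])
risk = model.predict_proba(new_patient)[0, 1]
print(f"Predicted event risk: {risk:.1%}", "-> high risk" if risk > 0.20 else "")
```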
Preventive Medicine: Preventive medicine involves identifying and addressing risk factors before they lead to disease. This approach emphasizes the importance of healthy lifestyle choices, such as exercise, diet, and smoking cessation, as well as targeted interventions, such as vaccination and screening, to prevent or reduce the risk of disease.
The principles of quantitative medicine are rooted in the use of quantitative data and mathematical models to understand and predict the behavior of complex systems. This approach has been enabled by advances in computing power and data analytics, as well as by the development of new tools and technologies, such as genomics and proteomics.
One key contributor to the rise of quantitative medicine has been the field of systems biology, which seeks to understand complex biological systems as networks of interacting components. 21 Systems biology approaches often involve the use of computational models to simulate the behavior of these systems, and can be used to identify new drug targets and to develop more personalized treatment strategies. 22 Another key contributor has been the field of digital health, which involves the use of technology, such as wearables and mobile apps, to collect and analyze health data. Digital health data can be used to develop predictive models of disease risk, to monitor patient outcomes in real-time, and to develop more personalized treatment plans.
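The systems-biology view of disease as a network of interacting components can be suggested with a small sketch. The example below builds a toy interaction network and uses a centrality measure to nominate highly connected “hub” nodes as candidate disease drivers or drug targets; the gene names and edges are invented for illustration and do not represent any real interactome.

```python
# Sketch of a network-based view of a biological system, using a toy
# interaction network; gene names and edges are hypothetical.
import networkx as nx

# Hypothetical protein-protein interaction edges.
edges = [
    ("GENE_A", "GENE_B"), ("GENE_A", "GENE_C"), ("GENE_A", "GENE_D"),
    ("GENE_B", "GENE_E"), ("GENE_C", "GENE_E"), ("GENE_D", "GENE_F"),
    ("GENE_E", "GENE_F"), ("GENE_F", "GENE_G"),
]
network = nx.Graph(edges)

# Centrality is one simple way to nominate highly connected "hub" nodes
# as candidate drivers of disease or potential drug targets.
centrality = nx.betweenness_centrality(network)
hubs = sorted(centrality, key=centrality.get, reverse=True)[:3]
print("Candidate hub genes:", hubs)
```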
The rise of quantitative medicine has also been driven by advances in genomic and proteomic technologies, which allow researchers to study the underlying biological mechanisms of disease in unprecedented detail. 23 By analyzing the expression patterns of genes and proteins, researchers can identify new drug targets and develop more targeted therapies.
The rise of quantitative medicine has been driven in part by advances in new technologies, such as imaging techniques and high-throughput sequencing, which enable the measurement of vast amounts of data on individual patients. However, these technologies alone cannot fully explain the shift toward a more quantitative approach. Rather, the rise of quantitative medicine can also be seen as a reflection of broader trends in science and philosophy toward reductionism and mechanistic thinking.
In this sense, the emergence of quantitative medicine has parallels with the emergence of theoretical physics in the early 20th century. 24 Just as theoretical physics sought to understand the underlying mechanisms of the physical world through the use of mathematical models and simulations, quantitative medicine seeks to understand the underlying biological mechanisms of disease through the use of complex data analysis and computational models.
One example of this connection can be seen in the use of network analysis to study biological systems. 25 In physics, network analysis has been used to study the properties of complex systems such as the internet or social networks. Similarly, in quantitative medicine, network analysis can be used to study the interactions between genes or proteins, and to identify key drivers of disease. Furthermore, the use of mathematical models and simulations in quantitative medicine is also reminiscent of the approach taken in theoretical physics. For example, mathematical models of the spread of infectious diseases have been used to predict the effectiveness of different interventions, such as vaccination campaigns or social distancing measures.
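As a simple illustration of this last point, a minimal susceptible-infected-recovered (SIR) model can be used to compare epidemic outcomes with and without a pre-vaccinated fraction of the population. The sketch below uses purely illustrative parameter values, not estimates fitted to any real outbreak.

```python
# Minimal SIR sketch comparing epidemic size with and without a vaccinated
# fraction; parameters are illustrative, not fitted to any real outbreak.
import numpy as np
from scipy.integrate import odeint

def sir(y, t, beta, gamma):
    s, i, r = y
    return [-beta * s * i, beta * s * i - gamma * i, gamma * i]

beta, gamma = 0.3, 0.1          # transmission and recovery rates (per day)
t = np.linspace(0, 300, 1000)   # days

for vaccinated in (0.0, 0.5):   # fraction immunized before the outbreak
    s0 = 1.0 - vaccinated - 0.001
    y0 = [s0, 0.001, vaccinated]
    s, i, r = odeint(sir, y0, t, args=(beta, gamma)).T
    print(f"Vaccination {vaccinated:.0%}: peak infected {i.max():.1%}, "
          f"final recovered/immune {r[-1]:.1%}")
```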
However, as with any reductionist approach, there are also limitations to the use of mathematical models and simulations in medicine. One potential danger is that these models may oversimplify complex biological systems, leading to inaccurate predictions and treatments. Additionally, as discussed earlier, there is a risk of losing sight of the patient as a unique, whole individual.
Challenges, criticisms, and their impact
One challenge is the sheer complexity of biological systems, which can make it difficult to develop accurate models and simulations. Another challenge is the need for large, high-quality datasets to train these models, which can be difficult to obtain in some cases. Critics of quantitative medicine have also argued that this approach can be reductionist, focusing too narrowly on individual components of biological systems and neglecting the broader context in which these systems operate. This criticism is rooted in the philosophical debate between reductionism and holism, which we discussed in the previous section.
Despite these challenges, the rise of quantitative medicine has transformed the practice of medicine, enabling more personalized and targeted treatments, and improving our understanding of the underlying biological mechanisms of disease. As we move forward, it will be important to continue to balance the benefits and limitations of this approach, and to ensure that our models and simulations are grounded in a holistic understanding of the complex systems that underlie human health.
While the rise of quantitative medicine has undoubtedly led to significant advances in our understanding of human health and disease, it is important to recognize that the practice of medicine involves much more than simply identifying and targeting isolated biological processes. At its core, medicine is a humanistic endeavor that involves treating patients as unique individuals with complex physical, emotional, and social needs.
One of the challenges of the quantitative approach to medicine is that it can lead to a fragmented view of the patient, focusing solely on isolated biological processes and neglecting the broader context in which these processes occur (the human being). This approach can be particularly problematic when it comes to treating chronic diseases or complex conditions that involve multiple biological systems and psychological and social factors.
Moreover, the super-specialization of medical practitioners can exacerbate the problem of fragmentation, as physicians become increasingly focused on specific sub-fields or areas of expertise, and lose sight of the broader context in which their patients exist. This can lead to a situation in which physicians view their patients as a collection of isolated parts, rather than as unique, interconnected individuals.
This fragmentation of the medical approach runs counter to the holistic view of medicine, which emphasizes the interconnectedness of biological systems and the importance of treating patients as whole individuals, rather than simply as the sum of their individual parts. This view recognizes the importance of the social, cultural, and psychological factors that contribute to human health and disease, and emphasizes the importance of patient-centered care that takes into account the unique needs and circumstances of each individual patient.
The challenge of treating patients as unique individuals requires a shift in the culture and mindset of the medical profession. This shift involves recognizing the limitations of a reductionist approach to medicine and embracing a more holistic view that recognizes the interconnectedness of biological systems and the importance of patient-centered care. By doing so, we can develop more effective treatments and interventions that improve the health and wellbeing of individuals and communities around the world.
A pressing requirement: from physics to medicine
The convergence and reconciliation of reductionism and holism is an unavoidable need in scientific research in general, and in medicine, in particular. It can correctly be presented as the necessary middle ground between macroscopic and microscopic description. If we take, for example, a biological system, which we can consider as the prototype of a complex system, its macroscopic description can be very varied and require a language with a very rich vocabulary: the multiplicity and diversity of these descriptions can be taken as an indicator of complexity and cannot be neglected, so that a traditional reductionist approach would be ineffective.
Equally true, however, is that a global perspective, in which the nature of interactions between constituents is neglected, also seems sterile, as this characteristic is crucial in determining overall behavior. A fundamental property of complex systems is therefore the possibility, indeed the necessity, to be described both at the microscopic level and at a higher level where different categories and concepts must be used, implementing what has been called an “intermediate point of view,” which concretely realizes the continuous crossing, the coming and going between the two mentioned levels.
A concrete example of this need is the current situation in brain studies. Molecular neurobiology has been extraordinarily successful and has gathered very detailed and fully satisfactory information on the functioning of individual neurons. However, this knowledge does not allow us to directly understand how a billion neurons can behave like a mammalian brain. At the opposite extreme we have psychology, for which the properties of individual neurons (and more generally the chemical-physical properties of the brain) are completely irrelevant. This science has laboriously forged its own conceptual categories to describe human thought. The meeting of these two extremes seems arduous, but fortunately even here intermediate approaches are emerging, such as cognitive psychology, which is dedicated to the detailed study of the mechanisms and processes through which human beings perceive the world and organize their knowledge and activities.
An effective example by Giorgio Parisi, who was awarded the 2021 Nobel Prize in Physics precisely for his studies on complex systems, illustrates why and how the macroscopic and the microscopic, the global and the local, interact in explaining the reality of these systems in such a way that they cannot be treated separately. A study, the result of an international collaboration, 26 including, among others, Pierfrancesco Urbani and Francesco Zamponi, dealt with the process of vitrification, that is, the transition from the liquid state at high temperature to the solid state on cooling, the molecular details of which had hitherto escaped notice. The authors managed to give a complete physical description of this process by showing, surprisingly, that the set of different configurations assumed by the glass particles when solidification occurs has a fractal structure. Recall that, in mathematical terms, a fractal is a geometric object endowed with scale invariance: in practice, it appears to have the same structure at whatever dimensional scale one considers it. Fractal structures are often found in nature, and they unite incredibly diverse objects, such as a romanesco broccoli, a stretch of coastline, and the edge of a leaf.
Phase transitions are processes that occur daily before our eyes: for example, when water reaches a temperature of zero degrees Celsius, it solidifies and becomes ice; a glass or wax sample in its liquid state, as it cools, becomes solid. However, these are two very different phase transitions because, in the case of water, solidification is sudden, whereas in the case of glass or wax, the process is gradual: as the liquid cools, it acquires greater and greater viscosity until it becomes a solid in its own right. To explain this difference, one cannot help but consider what happens at the microscopic level: when water cools to zero degrees, the initially disordered molecules arrange themselves neatly in a crystalline lattice, whereas in glass and wax, the atoms arranged in a completely disordered manner remain equally disordered even when the solid state is reached. From a physical point of view, the question is: why is there such a conspicuous macroscopic effect, the solidification of wax, even though the microscopic arrangement of the molecules changes very little?
The answer can be illustrated through a metaphor proposed by Parisi. Let’s think of an underground carriage at rush hour, where the travelers inside are very compressed. Usually, however, there are small gaps that allow one person to change position, because another person perhaps moves a little, pushing another, and momentarily vacates the space. Under these conditions it would only take four or five more people for any movement to be blocked, but seen from the outside, these two situations do not appear very different.
In the case of glass or wax, something similar happens. As the temperature goes down, the molecules decrease their vibrational motions and become more and more stuck in their position because neighboring molecules are stuck, and so on. The traditional idea was that there was only one way for molecules to get stuck. Instead, Parisi and colleagues were able to show that the phase transition occurs with different configurations of the molecules. Returning to the people compressed in the underground, there may be many similar situations, but they are slightly different. A person, for example, can lift an arm, or be able to turn 90 degrees: as they say in physics, different configurations are possible. The same happens with the molecules of a glass that is cooling: for a given value of pressure, the molecules have a certain freedom of movement; as the pressure increases and the temperature decreases, the space available for movement becomes smaller and smaller and is fragmented into smaller spaces, which are no longer in communication with each other. In the set of possible configurations and spaces available for movement, the scale invariance typical of fractals manifests itself.
It is interesting to understand how they arrived at this result. Real-world materials are extremely complex and diverse, and the many theories proposed for them make such extensive use of approximations that it often becomes difficult to establish whether a theory’s statements hold even within its own logical structure. Rather than dealing with these materials directly, the authors elaborated a simple, solvable mathematical model that is valid for everything that can be classified, generically, as “glass.” The explanatory hypotheses gradually proposed, being limited to this mathematical model, could be checked directly, establishing, in a well-defined mathematical sense, their correctness. The advantage of this approach is that the model shows an interesting capacity for expansion and unification, allowing it to be applied not only to the phenomena for which it was originally developed, but also to others that were thought to be somehow distinct from it.
The interesting aspect of this approach is that it allowed the authors to work on two parallel but distinct levels, the mathematical model and the real world. The model’s statements do not concern concrete objects belonging to the real world but specific abstract mathematical objects; its structure is deductive, consisting of a few postulates concerning its objects and a method for deriving a potentially infinite number of consequences. The model is developed mathematically, and at the conclusion of this process its applicability to the real world is verified through a series of “correspondence rules” between the abstract objects of the theory and those of reality, the object of study. This allows the theory to be extended, using the deductive method and introducing new rules of correspondence, to deal with situations that were not a priori included in the objectives for which it was initially developed.
This is what the back-and-forth between the abstract model and reality concretely consists of: by virtue of it, to return to the metaphor of the underground, one can grasp both the macroscopic similarities and the microscopic differences between the situations in which people squeezed into a carriage find themselves. The two levels, initially distinct and parallel, thus converge, interacting in a concrete and productive way: this is what the only apparently counterintuitive evolution of a reductionist approach into a holistic one consists of.
If we want to arrive at a medicine that is quantitative, precise, personalized, predictive, and preventive, this lesson, and its enrichment by the recent evolution from the classical concept of the model to that of the digital twin, must be taken into account. The model is, by definition, an artificial and simplified representation of reality. The relationship between a territory and all its possible maps, physical, political, geomorphological, hydrographic, nautical, economic, demographic, etc., exemplifies this nature well, as does the fact that the choice between one or the other depends on the specific problem to be addressed. If, for example, one wants to visualize the distribution of different climatic types and make a reliable weather forecast, one will use a weather map that focuses on these aspects, neglecting all others. This is what simplification consists of, as mentioned above, acting as a perceptive and cognitive filter that responds to the need to conveniently frame and resolve the issue under study. However, if the problem posed concerns not the individual and specific properties of the territory, but the relationships between its various aspects and their interactions, such a model is not functional and effective.
The digital twin compensates for this limitation in that it mimics not a single distinctive feature but the entire structure and nature of a phenomenon or process, and even its context, through sets of virtual information constructs that are dynamically updated with data derived from its physical twin, with which it is constantly connected throughout its entire life cycle, enabling informed decisions that generate value. The characterizing element of the digital twin is the two-way, continuous dialog with the physical entity represented: on the one hand, the digital twin provides information to actively monitor and control the physical twin; on the other hand, the information generated by the real twin feeds the simulation algorithms of the digital twin.
The concept of a “twin strategy” originated with NASA’s Apollo program, which built two identical space vehicles. One was launched into space; the other stayed on Earth to mirror the conditions of the launched one. The first mention of the term “digital twin” can be traced back to 2003, when Grieves used it in the context of manufacturing. 27 Initially, the space industry was primarily concerned with the topic of the Digital Twin (DT). In 2012, NASA and the U.S. Air Force jointly published a paper on the DT, stating that it was a key technology for future vehicles. Since then, research on DTs in aerospace has increased, and the DT has been introduced into further fields such as automotive, oil and gas, as well as health care and medicine. Examples are online operation monitoring of process plants, traffic and logistics management, dynamic data-assimilation-enabled weather forecasting, real-time monitoring systems to detect leakages in oil and water pipelines, and remote control and maintenance of satellites or space stations. For instance, Singapore is developing a digital copy of the entire city to monitor and improve utilities.
Grieves originally defined the DT in three dimensions 27: a physical entity, a digital counterpart, and a connection that ties the two parts together. In most definitions, the DT is considered a virtual representation that interacts with the physical object throughout its lifecycle and provides intelligence for evaluation, optimization, and prediction.
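The two-way loop described above can be suggested schematically in code. The minimal sketch below, in which all class, field, and threshold names are hypothetical, shows a digital counterpart ingesting measurements from its physical twin, updating its virtual state, and feeding information back for monitoring and control.

```python
# Minimal sketch of the two-way digital-twin loop: the digital counterpart
# ingests data from its physical twin, updates its internal state, and returns
# information used to monitor or adjust the physical system. All names are hypothetical.
from dataclasses import dataclass, field

@dataclass
class DigitalTwin:
    state: dict = field(default_factory=dict)    # virtual representation
    history: list = field(default_factory=list)  # record of past measurements

    def ingest(self, measurement: dict) -> None:
        """Physical -> digital: update the virtual state from sensor data."""
        self.state.update(measurement)
        self.history.append(measurement)

    def recommend(self) -> str:
        """Digital -> physical: feed information back for monitoring/control."""
        hr = self.state.get("heart_rate", 0)
        return "alert: review patient" if hr > 120 else "status: nominal"

twin = DigitalTwin()
for reading in [{"heart_rate": 78}, {"heart_rate": 131}]:  # hypothetical data stream
    twin.ingest(reading)
    print(twin.recommend())
```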
Taking into account all that has been said, those who fear this virtual duplication of the material world can be answered by pointing out that only by forcing the situation would we open the door to many different realities opposed to our real world. In essence, we are representing the latter, in all its folds and with all its extraordinary difficulties, accurately and with predictive potential, going beyond the present and simulating the future, thanks to the availability of Big Data, mathematical models, and AI algorithms.
The application to medicine
At this point in our analysis, some examples can be presented to better illustrate the ongoing process and its impact. Multiple sclerosis (MS) is a chronic autoimmune, degenerative, and lifelong disease of the central nervous system (CNS) and the most common cause of neurological disability in young adults. At a pathological level, the infiltration of immune cells into the CNS manifests as localized demyelinating lesions in the white and gray matter of the brain and spinal cord, observed in pathological specimens as well as in magnetic resonance imaging (MRI) sequences. In addition, the disease leads to a progressive destruction of myelin layers (demyelination) and progressive axonal injury, loss, and neurodegeneration, impairing the function of the CNS in several ways. MS has different clinical disease courses that have been classically described; beyond this coarse classification, each patient presents with a very individual course of the disease. Therefore, when quantifying MS, it is necessary to distinguish between different dimensions and perspectives.
An emerging approach toward personalized treatment is precision medicine, which takes into account individual variability in genes, environment, and lifestyle for each person. Precision medicine covers diagnosis, treatment, and management to achieve better patient outcomes. Through precision medicine and the twin strategy, it is possible to break down the complexity of the disease, so that its patterns and inter-individual variability can be better understood.
Recently, a research group from Sofia University in Bulgaria performed a first exercise in the simulation of DTs for MS. Petrova-Antonova et al. 28 developed a web-based DT platform for MS diagnosis and rehabilitation that consists of two components: a transactional application that automates tests for MS diagnosis and rehabilitation, and an analytic application that provides data aggregation, enrichment, analysis, and visualization, which can be used in any instance of the transactional application to generate new knowledge and support decision making. The analytical application is still under development and subject to further research.
Concrete implementations of digital twins can already be found for organs such as the heart. The Horizon 2020 project iHEART, proposed and implemented by a team at the Politecnico di Milano led by Prof. Alfio Quarteroni, is developing a virtual heart, made up of mathematical equations that describe the complex interaction of physical phenomena underpinning the heartbeat itself. 29 The aim is to construct a “digital twin” of the patient – a virtual replica of an individual’s heart, based on their biometric data and diagnostic tests. This would prove to be a fundamental tool for heart surgeons and cardiologists, who could use it to explore different treatment options or surgical strategies before treating the actual patient, thus optimizing and personalizing care according to the characteristics specific to each individual. The goal of iHEART is to construct a mathematical model of the human heart, that is, a virtual replica of the organ that allows its behavior to be studied and predicted by means of computer simulations. To build a model of this kind, Quarteroni and his team are seeking out mathematical equations capable of faithfully representing the behavior of the heart, from the scale of the cells all the way up to that of the atria and ventricles. This results in a system of equations, all coupled together according to a dense network of interactions. Specific, computer-implemented algorithms are used to solve these highly complex equations, yielding an approximate yet nonetheless very precise solution to the problem at hand. The model of the heart being developed could one day become a tool in the hands of cardiologists and heart surgeons. By using the biometric data and diagnostic tests of a specific patient, the virtual heart developed in the iHEART project could be personalized, effectively creating a “digital twin” of the patient’s heart. The doctor could then use this virtual replica to explore different treatment options or surgical strategies, tailored specifically to the individual patient, simply by interacting with the computer – and all before treating the actual patient. In addition, the model could assist doctors in interpreting the results of diagnostic tests, giving them the opportunity to replace invasive methods of measurement with indirect, less invasive ones.
The model could also be used for medical research purposes. Indeed, it makes it possible to run scenario analyses, study the interactions between the different components of the organ, or simulate the effects of diseases or innovative treatments by applying them—entirely virtually—to the digital heart.
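Although the iHEART equations themselves are far more elaborate, the flavor of this kind of numerical modeling can be suggested with a classic simplified model of an excitable cell, the FitzHugh-Nagumo system, solved numerically below. The parameters and the constant stimulus are illustrative, and the sketch is not the iHEART model; it only shows how a small system of coupled equations describing excitation and recovery can be simulated in a few lines.

```python
# Illustrative only: the FitzHugh-Nagumo excitable-cell model, solved numerically,
# as a toy stand-in for the much richer coupled equations used in cardiac modeling.
import numpy as np
from scipy.integrate import odeint

def fitzhugh_nagumo(y, t, a=0.7, b=0.8, tau=12.5, stimulus=0.5):
    v, w = y                                   # membrane potential, recovery variable
    dv = v - v**3 / 3 - w + stimulus           # fast excitation dynamics
    dw = (v + a - b * w) / tau                 # slow recovery dynamics
    return [dv, dw]

t = np.linspace(0, 200, 2000)
v, w = odeint(fitzhugh_nagumo, [-1.0, 1.0], t).T
print(f"Simulated {len(t)} time steps; membrane potential range "
      f"[{v.min():.2f}, {v.max():.2f}]")
```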
Potential evolution and new synthesis: A new holism?
Is it possible for the reductionist model we have experienced in the last century to transform into a new quantitative holistic approach, through digital and algorithmic evolution? Although it may appear to be a contradictory idea, could the evolution of the reductionist model aid in this transition?
The reductionist model has traditionally focused on breaking complex systems down into smaller, more manageable parts, in order to understand how they function. One possible evolution of the reductionist model could involve incorporating a quantitative holistic approach, which considers the entire system as a whole, while still utilizing quantitative methods to measure and analyze its components. This would require a shift in thinking, moving away from the reductionist perspective that sees the system as merely the sum of its parts, toward a more integrated approach that recognizes the importance of emergent properties and systemic interactions.
While it may seem counterintuitive for a reductionist approach to evolve into a holistic one, evolution is not always a straightforward process, and can involve unexpected transformations and adaptations. By incorporating new ideas and methodologies, the reductionist model could potentially evolve into a more comprehensive and accurate understanding of complex systems.
The concept of returning to a more holistic approach to healthcare through quantitative medicine could be considered as an example of a circular pattern in science, where ideas and approaches evolve over time, only to return to a starting point but at a new level of understanding. This circular pattern could be considered as the “spiral of science” or the “circle of knowledge.”
Philosopher and historian of science Thomas Kuhn proposed the concept of scientific paradigms, which are frameworks of understanding that guide scientific research and discovery. According to Kuhn, science undergoes periodic revolutions, in which existing paradigms are replaced by new ones, leading to a significant shift in scientific thinking and practice. However, Kuhn also noted that scientific progress is not always linear, and that there are often periods of stagnation or even regression before a new paradigm emerges. This circular pattern of progress and regression is sometimes referred to as the “Kuhnian circle.”
In the case of quantitative medicine, the use of mathematical and statistical methods together with the potentiality of artificial intelligence represents a new paradigm in healthcare that has the potential to revolutionize the way we understand and treat disease. However, this approach is not entirely new; holistic approaches to medicine have existed for thousands of years, and the concept of the body as a complex system has been recognized by many ancient medical traditions. This return could be achieved through the creation of a theoretical model of quantitative analysis that is so advanced and complex that it can “theoretically” simulate the human body as a digital twin.
However, the application of digital twin technology in medicine raises a number of unique challenges and considerations. One of the primary challenges is the need for large, high-quality datasets to accurately model complex biological systems. This requires the integration of data from a variety of sources, including medical imaging, genomics, and other forms of biological data. Additionally, the accuracy of digital twin models depends on the accuracy and completeness of the data used to create them.
Another challenge is the ethical and privacy considerations related to the use and storage of sensitive medical data. This is particularly important given the increasing prevalence of data breaches and cyber-attacks targeting medical records and other sensitive information. Researchers must ensure that appropriate measures are in place to protect the privacy and security of patient data.
Moreover, the use of digital twin technology in medicine raises important philosophical and ethical considerations about the nature of human identity and the relationship between the physical body and the digital realm. Philosophers and physicists have long debated the nature of reality and the relationship between physical and digital representations of the world; such debates also raise questions about the potential consequences of creating digital replicas of biological systems, and the extent to which these replicas can truly capture the complexity and richness of living systems.
Despite these challenges and considerations, the use of digital twins in medicine holds significant promise for advancing our understanding of human health and disease, and developing more personalized and targeted treatments. As researchers continue to refine and develop digital twin technology, it will be important to balance the benefits and limitations of this approach, and to ensure that ethical and privacy considerations are carefully considered and addressed.
In light of these challenges and criticisms, is it possible to reacquire a more holistic view of medicine? While the development of quantitative methods and technologies has undoubtedly led to significant advances in our understanding of human health and disease, it is important to recognize the limitations of this approach and to explore new modalities for understanding the complexity and richness of biological systems.
One possible way forward is to integrate the principles of holistic medicine with the advances of quantitative medicine, creating a new paradigm that acknowledges the interconnectedness of biological systems while still leveraging the power of quantitative methods and technologies. This approach would recognize the importance of individualized, patient-centered care, while also incorporating the latest advances in genomics, medical imaging, and other forms of biological data. Another way forward is to explore new technologies and approaches that can capture the complexity and richness of biological systems in a more comprehensive and nuanced way. This could involve the development of new simulation techniques that incorporate a wider range of biological processes and interactions, or the use of advanced imaging technologies that allow for a more detailed and precise understanding of the human body. Surprisingly, this process could give rise to a new model that we could define as “quantitative holism,” which implies the potential to regenerate a unifying model from the quantitative data analyzed by advanced AI models.
Ultimately, the evolution of medicine is an ongoing process, shaped by a wide range of social, cultural, and technological factors. While the transition from a holistic to a quantitative approach to medicine has been an important step forward in our understanding of human health and disease, it is important to continue exploring new modalities and approaches that can capture the full complexity and richness of biological systems. By doing so, we can develop more effective treatments and interventions that improve the health and wellbeing of individuals and communities around the world.
Conclusion
In conclusion, the rise of quantitative medicine has marked a significant transition in the history of medicine, from a more holistic approach to a reductionist or mechanistic approach. While this has brought many benefits, including more personalized and targeted treatments and improved understanding of the underlying biological mechanisms of disease, it has also presented some challenges and criticisms. In particular, there is the danger of losing sight of the patient as a whole, unique individual, and the need to ensure that our models and simulations are grounded in a holistic understanding of the complex systems that underlie human health.
Furthermore, the rise of quantitative medicine has not occurred in isolation, but rather reflects broader philosophical and scientific trends. The development of new technologies and the influence of reductionist philosophies have been key drivers of this shift. However, it is important to recognize that reductionism has its limitations and that a truly comprehensive understanding of human health will require a synthesis of reductionist and holistic approaches.
As we move forward, it will be important to continue to critically evaluate the benefits and limitations of quantitative medicine, and to ensure that we maintain a balanced approach that recognizes the unique complexity of each individual patient. By integrating insights from philosophy, physics, and other fields, we may be able to develop new and innovative approaches that bridge the gap between reductionism and holism in a model that we could define “quantitative holism,” and ultimately improve patient outcomes.
Footnotes
The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
Funding: The author(s) received no financial support for the research, authorship, and/or publication of this article.
ORCID iD: Silvano Tagliagambe
https://orcid.org/0000-0003-2388-6230
References
- 1. Yawar A. Medicine and the human story. Lancet 2010; 375: 546–547.
- 2. Puchalski CM. The role of spirituality in health care. Proc 2001; 14: 352–357.
- 3. Masic I, Miokovic M, Muhamedagic B. Evidence based medicine – new approaches and challenges. Acta Inform Med 2008; 16: 219–225.
- 4. Ghosh SK. Human cadaveric dissection: a historical account from ancient Greece to the modern era. Anat Cell Biol 2015; 48: 153.
- 5. Standring S. A brief history of topographical anatomy. J Anat 2016; 229: 32–62.
- 6. Rosenberg CE. The tyranny of diagnosis: specific entities and individual experience. Milbank Q 2002; 80: 237–260.
- 7. Leinster S. Training medical practitioners: which comes first, the generalist or the specialist? J R Soc Med 2014; 107: 99–102.
- 8. Ahn AC, Tewari M, Poon CS, et al. The limits of reductionism in medicine: could systems biology offer an alternative? PLoS Med 2006; 3: 0709–0713.
- 9. Rocca E, Anjum RL. Complexity, reductionism and the biomedical model. In: Anjum RL, Copeland S, Rocca E (eds) Rethinking causality, complexity and evidence for the unique patient. Cham: Springer, 2020, pp. 75–94.
- 10. O’Leary D. Medicine’s metaphysical morass: how confusion about dualism threatens public health. Synthese 2021; 199: 1977–2005.
- 11. Mehta N. Mind-body dualism: a critique from a health perspective. Mens Sana Monogr 2011; 9: 202.
- 12. Zampieri F, ElMaghawry M, Zanatta A, et al. Andreas Vesalius: celebrating 500 years of dissecting nature. Glob Cardiol Sci Pract 2015; 2015: 66.
- 13. Blumberg AE, Feigl H. Logical positivism. J Philos 1931; 28: 281.
- 14. Delgado A, Guddati AK. Clinical endpoints in oncology – a primer. Am J Cancer Res 2021; 11: 1121–1131.
- 15. Bernard C. The “Milieu Intérieur” and regulatory physiology. Hist Philos Life Sci 1986; 8(1): 3–25.
- 16. Hill AB. The environment and disease: association or causation? Proc R Soc Med 1965; 58: 295–300.
- 17. Hood LE, Omenn GS, Moritz RL, et al. New and improved proteomics technologies for understanding complex biological systems: addressing a grand challenge in the life sciences. Proteomics 2012; 12: 2773–2783.
- 18. Zhang A, Xing L, Zou J, et al. Shifting machine learning for healthcare from development to deployment and from models to data. Nat Biomed Eng 2022; 6(12): 1330–1345.
- 19. Dash S, Shakyawar SK, Sharma M, et al. Big data in healthcare: management, analysis and future prospects. J Big Data 2019; 6: 1–25.
- 20. Bollet AJ. Pierre Louis: the numerical method and the foundation of quantitative medicine. Am J Med Sci 1973; 266: 92–101.
- 21. Auffray C, Noble D, Nottale L, et al. Progress in integrative systems biology, physiology and medicine: towards a scale-relative biology. Eur Phys J A 2020; 56: 1–24.
- 22. Finley SD, Chu LH, Popel AS. Computational systems biology approaches to anti-angiogenic cancer therapeutics. Drug Discov Today 2015; 20: 187–197.
- 23. Hossain MU, Ferdous N, Reza MN, et al. Pathogen-driven gene expression patterns lead to a novel approach to the identification of common therapeutic targets. Sci Rep 2022; 12: 1–16.
- 24. Weisskopf VF. Physics in the twentieth century: selected essays. Cambridge: The MIT Press, 1974.
- 25. Ghiassian SD, Menche J, Chasman DI, et al. Endophenotype network models: common core of complex diseases. Sci Rep 2016; 6: 27414.
- 26. Parisi G, Urbani P, Zamponi F. Theory of simple glasses. Cambridge: Cambridge University Press, 2020.
- 27. Grieves M. Digital Twin: manufacturing excellence through virtual factory replication. White paper, 2015.
- 28. Petrova-Antonova D, Spasov I, Krasteva I, et al. A digital twin platform for diagnostics and rehabilitation of multiple sclerosis. Lect Notes Comput Sci 2020; 12249: 503–518. DOI: 10.1007/978-3-030-58799-4_37.
- 29. Quarteroni A. Un modello matematico del cuore umano. G Ital Cardiol 2020; 21: 907–914.
