Editorial
HFSP J. 2009 Oct 7;3(5):287–289. doi: 10.2976/1.3233933

Guided self-organization

Mikhail Prokopenko 1,2
PMCID: PMC2801529  PMID: 20057962

Typically, self-organization is defined as the evolution of a system into an organized form in the absence of external pressures. A broad definition of self-organization is given by Haken (2006):

“A system is self-organizing if it acquires a spatial, temporal, or functional structure without specific interference from the outside. By ‘specific’ we mean that the structure or functioning is not impressed on the system but that the system is acted upon from the outside in a non-specific fashion. For instance, the fluid which forms hexagons is heated from below in an entirely uniform fashion and it acquires its specific structure by self-organization.”

Another definition is offered by Camazine et al. (2001) in the context of pattern formation in biological systems:

“Self-organization is a process in which pattern at the global level of a system emerges solely from numerous interactions among the lower-level components of the system. Moreover, the rules specifying interactions among the system’s components are executed using only local information, without reference to the global pattern.”

These definitions capture three important aspects of self-organization. First, it is assumed that the system has many interacting components and advances from a less organized state to a more organized state dynamically over some time, while exchanging energy, matter, and/or information with the environment. Second, this organization is manifested via global coordination and the global behavior of the system is a result of the interactions among the agents. In other words, the global pattern is not imposed on the system by an external ordering influence (Bonabeau et al., 1997). Finally, the components, whose properties and behaviors are defined prior to the organization itself, have only local information and do not have knowledge of the global state of the system—therefore, the process of self-organization involves some local information transfer (Polani, 2003; Lizier et al., 2008).
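To make the last point concrete, transfer entropy is a standard way to quantify such directed, local information transfer. The following is a minimal sketch, not the local-in-time estimator of Lizier et al. (2008): a plug-in transfer entropy estimate with history length 1 for two binary time series, where the variable names and the toy driver/receiver setup are illustrative.

```python
import numpy as np
from collections import Counter

def transfer_entropy(src, dst):
    """Plug-in transfer entropy T_{src->dst} in bits, history length 1:
    T = sum_{x',x,y} p(x',x,y) * log2[ p(x'|x,y) / p(x'|x) ]."""
    triples = Counter(zip(dst[1:], dst[:-1], src[:-1]))  # (x_{n+1}, x_n, y_n)
    pairs = Counter(zip(dst[1:], dst[:-1]))              # (x_{n+1}, x_n)
    hists = Counter(zip(dst[:-1], src[:-1]))             # (x_n, y_n)
    past = Counter(dst[:-1])                             # x_n
    n = len(dst) - 1
    te = 0.0
    for (x1, x, y), c in triples.items():
        p_cond_xy = c / hists[(x, y)]          # p(x_{n+1} | x_n, y_n)
        p_cond_x = pairs[(x1, x)] / past[x]    # p(x_{n+1} | x_n)
        te += (c / n) * np.log2(p_cond_xy / p_cond_x)
    return te

rng = np.random.default_rng(0)
y = rng.integers(0, 2, 10000)      # source: fair coin flips
x = np.roll(y, 1)                  # destination copies the source with delay 1
x[0] = 0
print(transfer_entropy(y, x))      # ~1 bit: x's next state is determined by y
```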

Self-organization may seem to contradict the second law of thermodynamics, which captures the tendency of systems toward disorder. The “paradox” was explained in terms of multiple coupled levels of dynamic activity within the Kugler–Turvey model (Kugler and Turvey, 1987): self-organization and loss of entropy occur at the macro-level, while the system dynamics on the micro-level (which serves as an entropy “sink”) generates increasing disorder.
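In bookkeeping terms (a schematic restatement, not a formula from Kugler and Turvey), the second law constrains only the total entropy production across the coupled levels:

```latex
\Delta S_{\mathrm{macro}} + \Delta S_{\mathrm{micro}} \ge 0,
\qquad\text{so}\qquad
\Delta S_{\mathrm{macro}} < 0
\ \text{is admissible whenever}\
\Delta S_{\mathrm{micro}} \ge \lvert \Delta S_{\mathrm{macro}} \rvert .
```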

Kauffman (2000) suggested that the underlying principle of self-organization is the generation of constraints in the release of energy. According to this view, the constrained release allows for such energy to be controlled and channeled to perform some useful work. This work in turn can be used to build better and more efficient constraints for the release of further energy and so on. Adding and controlling constraints on self-organization opens a way to guide it in a specific way.

In general, one may consider different ways to guide the process (dynamics) of self-organization, achieving a specific increase in structure or function within a system. This guidance may be provided by limiting the scope or extent of the self-organizing structures/functions, by specifying the rate of the internal dynamics, or simply by selecting a subset of all possible trajectories that the dynamics may take. A formal definition of guided self-organization and its properties (robustness, adaptability, scalability, etc.) remains elusive, but there have been several recent attempts, specifically within information theory and dynamical systems: universal utility functions (Klyubin et al., 2005), information-driven evolution (Prokopenko et al., 2006a; 2006b), robust overdesign (Ay et al., 2007), reinforcement-driven homeokinesis (Martius et al., 2007), predictive-information-based homeokinesis (Ay et al., 2008), etc. However, the lack of agreement on what is meant by complexity, constraints, etc., and the absence of a common methodology across multiple scales leave any definition of (guided) self-organization somewhat vague, indicating a clear gap (Polani, 2007). Filling this gap and finding new and systematic ways to guide self-organization is the main theme of the GSO Workshops, and the works collected in this special issue aim to identify essential guiding principles.

The perspective by Polani (2009) argues that information (defined as a reduction in uncertainty, i.e., Shannon information) is a critical resource for biological organisms and that it trades off against the available metabolic energy. This leads to the parsimony principle: if an organism developed a suboptimal information-processing strategy, it would waste metabolic energy. The parsimony principle captures the amount of information necessary to achieve a particular utility and aims to provide an implicit measure of the cost per unit time of processing the sensoric information needed to generate a desired behavior: “an organism that realizes an evolutionarily successful behavior will at the same time attempt to minimize the required sensoric information to achieve this behavior.” Polani also discusses other information-theoretic principles as candidates for understanding the information dynamics of organisms, concluding that information may be a fundamental currency underlying the success of living organisms—the “currency of life.” If this is indeed the case, then a new level of “quantitative predictiveness” can be introduced into biology and artificial life.
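One common way to cast this principle formally, following Polani's earlier work on relevant information (a sketch, not necessarily the exact formulation of the perspective paper), is as a constrained optimization over the sensor-to-action channel: find the cheapest policy, in Shannon terms, that still meets a required utility level U*:

```latex
\min_{p(a \mid s)} \; I(S;A)
\quad\text{subject to}\quad
\mathbb{E}\left[\, U(S,A) \,\right] \;\ge\; U^{*},
```

where I(S;A) is the mutual information between sensor states S and actions A, i.e., the minimal sensoric information that the required behavior demands.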

The paper by Prokopenko et al. (2009) considers a simple information-theoretic model for evolutionary dynamics approaching the “coding threshold,” where the capacity to symbolically represent nucleic acid sequences emerges in response to a change in environmental conditions. The study argues that a coupling between a “proto-cell” and its proto-encoding becomes beneficial in terms of preserving the proto-cell’s information in a specific noisy environment; that is, the coupling becomes viable only at a certain “error threshold” level. A limited reduction in the information channel’s capacity, brought about by environmental noise, creates the selection pressure appropriate for the coupling between a proto-cell and its encoding; this is another example of the parsimony principle. As argued by Polani (2009), another important benefit of the information-theoretic view is the bookkeeping property of information: information acquisition by an organism increases the organization of its knowledge about the environment. Such knowledge is general—it abstracts away the details of the inner mechanism—and may typically be shared across different organisms. The high degree of universality found in the genetic code lends significant support to this view. Specifically, the paper by Prokopenko et al. (2009) investigates whether different proto-cells could horizontally transfer and share such proto-encodings via a joint encoding, even if they had slightly different individual dynamics.
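As a purely illustrative example of such a capacity reduction (assuming the simplest noise model, a binary symmetric channel, which is not necessarily the channel model of the paper): a channel that flips each transmitted bit with probability p has capacity

```latex
C(p) \;=\; 1 - H_2(p),
\qquad
H_2(p) \;=\; -\,p \log_2 p \;-\; (1-p)\log_2 (1-p),
```

so capacity degrades smoothly from 1 bit per symbol at p = 0 to 0 bits at p = 1/2; for instance, C(0.1) ≈ 1 − 0.469 = 0.531 bits per symbol. A “limited” reduction of this kind leaves room for an encoding to recover some of the information that would otherwise be lost.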

Furthermore, the parsimony principle discussed by Polani (2009) suggests that the adaptation processes operating on different time scales (including both the long-term evolutionary process and short-term lifetime learning dynamics) may “conspire” to minimize the cost of the complexity of a particular task, given the set of available sensors and actions. However, the overall complexity of neural dynamics may increase over evolutionary time. This is investigated in the paper by Yaeger (2009), who simulates an artificial life environment, Polyworld, in which agents use their neural networks to try to survive. The study utilizes a specific information-theoretic metric for capturing the complexity of the neural dynamics—“TSE complexity” (named for the initials of its authors, Tononi, Sporns, and Edelman). In particular, this work proposes that periods of complexity growth correspond to periods of behavioral adaptation. As the evolutionary process explores different forms of embodiment, the organism may need to rediscover the new information-processing limits that the changes make available. Thus, given the embodiment, the parsimony principle would suggest minimizing the processing cost by varying the available neural machinery while satisfying a suitable (and possibly new) utility function. One may speculate that in Polyworld, two constraints guide the self-organization of agents’ neural networks: (i) the utility function (i.e., reproduction), which may require more complex neural rewiring, and (ii) the parsimony principle (a lesser cost per task), which may keep the complexity required by the first constraint as low as possible, avoiding unnecessary complexity growth. Confirming this conjecture, Yaeger observes a tendency “to weakly stabilize complexity at a ‘just good enough’ level.”
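For reference, TSE complexity sums, over all subset sizes, the excess of the average subset entropy over its proportional share of the whole-system entropy; it vanishes for fully independent units. Below is a minimal plug-in sketch for small binary systems (illustrative only; the function names are ours, and estimating this quantity over Polyworld’s neural dynamics is considerably more involved).

```python
import numpy as np
from itertools import combinations

def entropy_bits(samples):
    """Plug-in Shannon entropy (bits) of the rows of a (T, k) array."""
    _, counts = np.unique(samples, axis=0, return_counts=True)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log2(p)))

def tse_complexity(data):
    """TSE complexity C(X) = sum_{k=1}^{n-1} [ <H(X_k)> - (k/n) H(X) ],
    with <H(X_k)> averaged over all subsets of k columns of data."""
    _, n = data.shape
    h_full = entropy_bits(data)
    c = 0.0
    for k in range(1, n):
        h_k = np.mean([entropy_bits(data[:, list(s)])
                       for s in combinations(range(n), k)])
        c += h_k - (k / n) * h_full
    return c

rng = np.random.default_rng(1)
independent = rng.integers(0, 2, (5000, 6))    # independent bits: ~0 bits
redundant = np.repeat(rng.integers(0, 2, (5000, 1)), 6, axis=1)
print(tse_complexity(independent))             # ~0
print(tse_complexity(redundant))               # (n-1)/2 = 2.5 bits for copies
```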

The paper by Boedecker et al. (2009) studies the reservoir computing paradigm, recently advanced in the field of recurrent neural networks. The study combines general and problem-specific methods, arguing for the correct balance between (i) a generic/universal way to prototype the system—in this case, a recurrent neural network whose reservoir connectivity is initialized with permutation matrices—and (ii) problem-specific guidance—in this case, a training method based on intrinsic plasticity that uses the input signal to shape the output of the reservoir neurons according to a desired (target) probability distribution.
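The following is a minimal sketch of how these two ingredients combine, assuming a tanh echo-state reservoir and the standard Gaussian-target intrinsic plasticity rule for tanh units (the gain/bias updates below); the sizes, rates, and toy input are our illustrative assumptions, and Boedecker et al.’s exact architecture and training differ in detail.

```python
import numpy as np

rng = np.random.default_rng(2)
n, steps = 100, 5000
eta, mu, sigma = 1e-3, 0.0, 0.2   # IP rate and target distribution N(mu, sigma^2)

# (i) Generic prototype: reservoir connectivity as a scaled permutation
# matrix, i.e., each neuron receives exactly one recurrent connection.
W = np.eye(n)[rng.permutation(n)] * 0.95   # spectral radius 0.95 < 1
w_in = rng.uniform(-0.1, 0.1, n)

gain, bias = np.ones(n), np.zeros(n)       # per-neuron IP parameters
y = np.zeros(n)

for t in range(steps):
    u = np.sin(t / 10.0)                   # toy scalar input signal
    x = W @ y + w_in * u                   # net input to each neuron
    y = np.tanh(gain * x + bias)
    # (ii) Problem-specific guidance: intrinsic plasticity adapts gain and
    # bias so each neuron's output distribution approaches N(mu, sigma^2).
    db = -eta * (-(mu / sigma**2)
                 + (y / sigma**2) * (2 * sigma**2 + 1 - y**2 + mu * y))
    gain += eta / gain + db * x
    bias += db

print(y.mean(), y.std())   # stats across neurons: a rough proxy for mu, sigma
```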

In general, one may suspect that a formal definition of guided self-organization would require an elegant and natural way to combine task-independent objectives (e.g., the parsimony principle or predictive information) with task-dependent constraints and drives. This, of course, remains a subject of future research—however, the studies presented in this issue provide some directions and examples.

ACKNOWLEDGMENTS

The support of the sponsors of the First International Workshop on Guided Self-Organization (GSO-2008, Sydney, Australia), including the CSIRO Complex Systems Science Theme, ARC COSNet, the CSIRO ICT Centre, ARC EEI, and the University of Sydney, is gratefully acknowledged.

REFERENCES

1. Ay, N, Bertschinger, N, Der, R, Guettler, F, and Olbrich, E (2008). “Predictive information and explorative behavior of autonomous robots.” Eur. Phys. J. B 63, 329–339. doi: 10.1140/epjb/e2008-00175-0
2. Ay, N, Flack, J, and Krakauer, D (2007). “Robustness and complexity co-constructed in multimodal signalling networks.” Philos. Trans. R. Soc. London, Ser. B 362, 441–447. doi: 10.1098/rstb.2006.1971
3. Boedecker, J, Obst, O, Mayer, N M, and Asada, M (2009). “Initialization and self-organized optimization of recurrent neural network connectivity.” HFSP J. 3(5), 340–349.
4. Bonabeau, E, Theraulaz, G, Deneubourg, J-L, and Camazine, S (1997). “Self-organisation in social insects.” Trends Ecol. Evol. 12(5), 188–193. doi: 10.1016/S0169-5347(97)01048-3
5. Camazine, S, Deneubourg, J-L, Franks, N, Sneyd, J, Theraulaz, G, and Bonabeau, E (2001). Self-Organization in Biological Systems, Princeton University Press, Princeton, NJ.
6. Haken, H (2006). Information and Self-Organization: A Macroscopic Approach to Complex Systems, Springer, Berlin, Heidelberg.
7. Kauffman, S A (2000). Investigations, Oxford University Press, Oxford, UK.
8. Klyubin, A S, Polani, D, and Nehaniv, C L (2005). “All else being equal be empowered.” Advances in Artificial Life, Eighth European Conference (ECAL-2005), Lecture Notes in Computer Science, Vol. 3630, Capcarrere M., Freitas A., Bentley P., Johnson C., and Timmis J., eds., Springer, New York, pp. 744–753.
9. Kugler, P, and Turvey, M (1987). Information, Natural Law, and the Self-Assembly of Rhythmic Movement, Lawrence Erlbaum, Hillsdale, NJ.
10. Lizier, J T, Prokopenko, M, and Zomaya, A Y (2008). “Local information transfer as a spatiotemporal filter for complex systems.” Phys. Rev. E 77(2), 026110. doi: 10.1103/PhysRevE.77.026110
11. Martius, G, Herrmann, M, and Der, R (2007). “Guided self-organisation for autonomous robot development.” Advances in Artificial Life, Ninth European Conference on Artificial Life (ECAL-2007), Lecture Notes in Artificial Intelligence, Vol. 4648, Almeida e Costa F., Rocha L., Costa E., Harvey I., and Coutinho A., eds., Springer, New York, pp. 766–775.
12. Polani, D (2003). “Measuring self-organization via observers.” Advances in Artificial Life, Proceedings of the Seventh European Conference on Artificial Life (ECAL), Banzhaf W., Christaller T., Dittrich P., Kim J. T., and Ziegler J., eds., Springer, Heidelberg, Germany, pp. 667–675.
13. Polani, D (2007). “Foundations and formalizations of self-organization.” Advances in Applied Self-Organizing Systems, Prokopenko M., ed., Springer, London, pp. 19–37.
14. Polani, D (2009). “Information: currency of life?” HFSP J. 3(5), 307–316.
15. Prokopenko, M, Gerasimov, V, and Tanev, I (2006a). “Evolving spatiotemporal coordination in a modular robotic system.” From Animals to Animats 9, Ninth International Conference on the Simulation of Adaptive Behavior (SAB 2006), Lecture Notes in Computer Science, Vol. 4095, Nolfi S., Baldassarre G., Calabretta R., Hallam J., Marocco D., Meyer J.-A., and Parisi D., eds., Springer, New York, pp. 558–569.
16. Prokopenko, M, Gerasimov, V, and Tanev, I (2006b). “Measuring spatiotemporal coordination in a modular robotic system.” Artificial Life X, Proceedings of the Tenth International Conference on the Simulation and Synthesis of Living Systems, Rocha L., Yaeger L., Bedau M., Floreano D., Goldstone R., and Vespignani A., eds., MIT Press, Bloomington, IN, pp. 185–191.
17. Prokopenko, M, Polani, D, and Chadwick, M (2009). “Stigmergic gene transfer and emergence of universal coding.” HFSP J. 3(5), 317–327.
18. Yaeger, L S (2009). “How evolution guides complexity.” HFSP J. 3(5), 328–339.
