Complexity science and machine learning are two complementary approaches to discovering and encoding regularities in irreducibly high-dimensional phenomena. Whereas complexity science represents a coarse-grained paradigm of understanding, machine learning is a fine-grained paradigm of prediction. Both approaches seek to solve the “Wigner-Reversal,” or the unreasonable ineffectiveness of mathematics in the adaptive domain, where broken symmetries and broken ergodicity dominate. In order to integrate these paradigms I introduce the idea of “Meta-Ockham,” which 1) moves minimality from the description of a model for a phenomenon to a description of a process for generating a model, and 2) describes low-dimensional features (schema) in these models. Reinforcement learning and natural selection are both parsimonious in this revised sense of minimal processes that parameterize arbitrarily high-dimensional inductive models containing latent, low-dimensional regularities. I describe these models...
Background: Using a statistical physics approach, we study the stochastic switching behavior of a model circuit of multisite phosphorylation and dephosphorylation with feedback. The circuit consists of a kinase and phosphatase acting on multiple sites of a substrate that, contingent on its modification state, catalyzes its own phosphorylation and, in a symmetric scenario, dephosphorylation. The symmetric case is viewed as a cartoon of conflicting feedback that could result from antagonistic pathways impinging on the state of a shared component. Results: Multisite phosphorylation is sufficient for bistable behavior under feedback even when catalysis is linear in substrate concentration, which is the case we consider. We compute the phase diagram, fluctuation spectrum, and large-deviation properties related to switch memory within a statistical mechanics framework. Bistability occurs as either a first-order or second-order nonequilibrium phase transition, depending on the network symmetries and the ratio of phosphatase to kinase numbers. In the second-order case, the circuit never leaves the bistable regime upon increasing the number of substrate molecules at constant kinase-to-phosphatase ratio. Conclusion: The number of substrate molecules is a key parameter controlling the onset of the bistable regime, the fluctuation intensity, and the residence time in a switched state. The relevance of the concept of memory depends on the degree of switch symmetry, as memory presupposes information to be remembered, which is highest for equal residence times in the switched states.
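To make the stochastic switching concrete, here is a minimal Gillespie-style simulation of a symmetric multisite switch. The rates k0 and kf, the feedback rule, and the order parameter are illustrative assumptions, not the paper's calibrated model:

```python
import numpy as np

rng = np.random.default_rng(0)

def gillespie_switch(N=100, J=4, k0=1.0, kf=5.0, T=50.0):
    """Gillespie simulation of a symmetric multisite switch (toy rates).

    Each of N molecules occupies a state 0..J (number of phosphorylated
    sites). Fully phosphorylated molecules (state J) boost every
    phosphorylation step, fully dephosphorylated ones (state 0) boost
    every dephosphorylation step; k0 is basal kinase/phosphatase activity."""
    state = rng.integers(0, J + 1, size=N)
    t, times, order = 0.0, [], []
    while t < T:
        nP, nD = np.sum(state == J), np.sum(state == 0)
        times.append(t)
        order.append((nP - nD) / N)             # switch order parameter
        up = (k0 + kf * nP / N) * (state < J)   # per-molecule phosphorylation rates
        down = (k0 + kf * nD / N) * (state > 0) # per-molecule dephosphorylation rates
        rates = np.concatenate([up, down])
        total = rates.sum()
        t += rng.exponential(1.0 / total)
        i = rng.choice(2 * N, p=rates / total)  # pick one reaction to fire
        state[i % N] += 1 if i < N else -1
    return np.array(times), np.array(order)

times, m = gillespie_switch()
# a bimodal trajectory of m (long stays near +1 or -1) signals bistability
print("order parameter mean, std:", m.mean().round(2), m.std().round(2))
```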
The original "Seven Motifs" set forth a roadmap of essential methods for the field of scientific computing, where a motif is an algorithmic method that captures a pattern of computation and data movement [1]. We present the Nine Motifs of Simulation Intelligence, a roadmap for the development and integration of the essential algorithms necessary for a merger of scientific computing, scientific simulation, and artificial intelligence. We call this merger simulation intelligence (SI), for short. We posit the motifs of simulation intelligence are interconnected and interdependent, much like the components within the layers of an operating system. Using this metaphor, we explore the nature of each layer of the simulation intelligence "operating system" stack (SI-stack) and the motifs therein:
Journal of the Association for Information Science and Technology, Nov 23, 2017
We explore how ideas from infectious disease and genetics can be used to uncover patterns of cultural inheritance and innovation in a corpus of 591 national constitutions spanning 1789-2008. Legal "ideas" are encoded as "topics" (words statistically linked in documents) derived from topic modeling the corpus of constitutions. Using these topics we derive a diffusion network for borrowing from ancestral constitutions back to the US Constitution of 1789, and reveal that constitutions are complex cultural recombinants. We find systematic variation in patterns of borrowing from ancestral texts and "biological"-like behavior in patterns of inheritance, with the distribution of "offspring" arising through a bounded preferential-attachment process. This process leads to a small number of highly innovative (influential) constitutions, some of which have yet to be identified as such in the current literature. Our findings thus shed new light on the critical nodes of the constitution-making network. The constitutional network structure reflects periods of intense constitution creation, and systematic patterns of variation in constitutional lifespan and temporal influence.
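As a rough illustration of the pipeline, the sketch below topic-models a toy corpus and links each text to its most topically similar predecessor as a crude proxy for borrowing. The placeholder texts, dates, and the nearest-ancestor rule are assumptions for illustration, not the paper's inference procedure:

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation
import numpy as np

# Placeholder texts and dates standing in for the 591-constitution corpus.
docs = ["we the people establish justice and secure liberty",
        "the republic guarantees liberty equality and justice",
        "citizens enjoy freedom of assembly and equality before law"]
years = np.array([1789, 1791, 1918])

# Topic mixtures: each row of theta is a document's distribution over topics.
X = CountVectorizer(stop_words="english").fit_transform(docs)
theta = LatentDirichletAllocation(n_components=2, random_state=0).fit_transform(X)

# Link each constitution to its most similar predecessor (cosine on topics),
# a crude proxy for a borrowing edge in the diffusion network.
theta_n = theta / np.linalg.norm(theta, axis=1, keepdims=True)
for i in range(len(docs)):
    earlier = np.where(years < years[i])[0]
    if earlier.size:
        j = earlier[np.argmax(theta_n[earlier] @ theta_n[i])]
        print(f"{years[i]} borrows most from {years[j]}")
```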
Despite the near universal assumption of individuality in biology, there is little agreement about what individuals are and few rigorous quantitative methods for their identification. Here, we propose that individuals are aggregates that preserve a measure of temporal integrity, i.e., "propagate" information from their past into their futures. We formalize this idea using information theory and graphical models. This mathematical formulation yields three principled and distinct forms of individuality: an organismal, a colonial, and a driven form, each of which varies in the degree of environmental dependence and inherited information. This approach can be thought of as a Gestalt approach to evolution, where selection makes figure-ground (agent-environment) distinctions using suitable information-theoretic lenses. A benefit of the approach is that it expands the scope of allowable individuals to include adaptive aggregations in systems that are multi-scale, highly distributed, and do not necessarily have physical boundaries such as cell walls or clonal somatic tissue. Such individuals might be visible to selection but hard to detect by observers without suitable measurement principles. The information theory of individuality allows for the identification of individuals at all levels of organization from molecular to cultural, provides a basis for testing assumptions about the natural scales of a system, and argues for the importance of uncertainty reduction through coarse-graining in adaptive systems.
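A minimal sketch of the guiding quantity, assuming a discretized time series and a simple plug-in estimator: the information a system propagates from its past into its future, I(S_t; S_{t+1}). The sticky two-state chain below is a toy stand-in for an aggregate with high temporal integrity:

```python
import numpy as np
from collections import Counter

def mutual_information(x, y):
    """Plug-in estimate of I(X;Y) in bits for paired discrete sequences."""
    n = len(x)
    pxy = Counter(zip(x, y))
    px, py = Counter(x), Counter(y)
    return sum((c / n) * np.log2((c / n) / ((px[a] / n) * (py[b] / n)))
               for (a, b), c in pxy.items())

rng = np.random.default_rng(1)
s = [0]
for _ in range(9999):  # sticky chain: state persists with probability 0.9
    s.append(s[-1] if rng.random() < 0.9 else 1 - s[-1])
s = np.array(s)

# High values indicate strong past-to-future information propagation.
print("I(S_t; S_{t+1}) =", round(mutual_information(s[:-1], s[1:]), 3), "bits")
```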
Organisms are able to partition resources adaptively between growth and repair. The precise nature of optimal partitioning, and how it emerges from cellular dynamics with insurmountable trade-offs, remains an open question. We construct a mathematical framework to estimate optimal partitioning and the corresponding maximal growth rate constrained by empirical scaling laws. We model a biosynthesis trade-off governing the partitioning of the ribosome economy between replicating functional proteins and replicating the ribosome pool, and also an energy trade-off arising from the finite energy budget of the cell. Through exact analytic calculations we predict limits on the range of values that partitioning ratios can take while sustaining growth. We calculate how partitioning and cellular composition scale with protein and ribosome degradation rates and organism size. These results reveal different classes of optimizing strategies corresponding to phenotypically distinct bacterial life...
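A toy numerical sketch of the partitioning problem, with assumed rates and a made-up viability bound on the ribosome-to-protein ratio standing in for the paper's empirical scaling constraints:

```python
import numpy as np

# Assumed parameters (illustrative, not calibrated): a fraction phi of
# ribosome capacity k replicates ribosomes, the rest makes other proteins;
# both pools degrade at rates d_R and d_P.
k, d_R, d_P = 2.0, 0.1, 0.05   # 1/h
r_max = 0.5                    # assumed viability bound on ribosome:protein ratio

best_lam, best_phi = -np.inf, None
for phi in np.linspace(0.01, 0.99, 981):
    lam = phi * k - d_R                    # balanced growth rate set by ribosomes
    # Balanced growth fixes the steady-state ratio R/P:
    # (1 - phi) * k * R = (lam + d_P) * P  =>  R/P = (lam + d_P) / ((1 - phi) k)
    ratio = (lam + d_P) / ((1 - phi) * k)
    if lam > best_lam and 0 < ratio <= r_max:
        best_lam, best_phi = lam, phi

# The viability bound caps phi: growth rate rises with phi until the
# ribosome:protein ratio constraint binds, bounding the partitioning ratio.
print(f"optimal phi ~ {best_phi:.2f}, max growth rate ~ {best_lam:.2f}/h")
```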
* Note that in this binary example, the new environmental configuration when switching is unique, enforcing a deterministic switch; in general, however, there may be a large number, K − 1, of options, such that the agent cannot easily guess at the results of environmental fluctuations.
We survey a current, heated debate in the AI research community on whether large pre-trained language models can be said to understand language, and the physical and social situations language encodes, in any humanlike sense. We describe arguments that have been made for and against such understanding, and key questions for the broader sciences of intelligence that have arisen in light of these arguments. We contend that an extended science of intelligence can be developed that will provide insight into distinct modes of understanding, their strengths and limitations, and the challenge of integrating diverse forms of cognition. What does it mean to understand something? This question has long engaged philosophers, cognitive scientists, and educators, nearly always with reference to humans and other animals. However, with the recent rise of large-scale AI systems, especially so-called large language models, a heated debate has arisen in the AI community on whether machines can now be said to understand natural language, and thus understand the physical and social situations that language can describe. This debate is not just academic; the extent and manner in which machines understand our world has real stakes for how much we can trust them to drive cars, diagnose diseases, care for the elderly, educate children, and more generally act robustly and transparently in tasks that impact humans. Moreover, the current debate suggests a fascinating divergence in how to think about understanding in intelligent systems, in particular the contrast between mental models that rely on statistical correlations and those that rely on causal mechanisms. Until quite recently there was general agreement in the AI research community about machine understanding: while AI systems exhibit seemingly intelligent behavior in many specific tasks, they do not understand the data they process in the way humans do. Facial recognition software does not understand that faces are parts of bodies, or the role of facial expressions in social interactions, or what it means to "face" an unpleasant situation, or any of the other uncountable ways in which humans conceptualize faces. Similarly, speech-to-text and machine translation programs do not understand the language they process, and autonomous driving systems do not understand the meaning of the subtle eye contact or body language drivers and pedestrians use to avoid accidents. Indeed, the oft-noted brittleness of these AI systems (their unpredictable errors and lack of robust generalization abilities) is a key indicator of their lack of understanding [59]. However, over the last several years, a new kind of AI system has soared in popularity and influence in the research community, one that has changed the views of some people about the prospects of machines that understand language. Variously called Large Language Models (LLMs), Large Pre
Armed conflict data display scaling and universal dynamics in both social and physical properties like fatalities and geographic extent. We propose a randomly branching, armed-conflict model that relates multiple properties to one another in a way consistent with data. The model incorporates a fractal lattice on which conflict spreads, uniform dynamics driving conflict growth, and regional virulence that modulates local conflict intensity. The quantitative constraints on scaling and universal dynamics that we use to develop our minimal model serve more generally as a set of constraints for other models of armed conflict dynamics. We show how this approach, akin to thermodynamics, imparts mechanistic intuition and unifies multiple conflict properties, giving insight into causation, prediction, and intervention timing.
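A minimal sketch of a randomly branching conflict avalanche on a lattice; the square grid, spreading probability, and seeding rule are illustrative stand-ins for the paper's fractal lattice and calibrated dynamics:

```python
import numpy as np

rng = np.random.default_rng(2)

def avalanche(p=0.25, L=200):
    """Branching spread on an L x L torus from a central seed.

    Each active site activates each unvisited neighbor with probability p;
    p = 0.25 on a 4-neighbor grid puts the branching ratio near 1, the
    critical regime where heavy-tailed conflict sizes appear."""
    active = {(L // 2, L // 2)}
    visited = set(active)
    while active:
        nxt = set()
        for (x, y) in active:
            for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nb = ((x + dx) % L, (y + dy) % L)
                if nb not in visited and rng.random() < p:
                    visited.add(nb)
                    nxt.add(nb)
        active = nxt
    return len(visited)  # conflict "extent": number of sites reached

sizes = np.array([avalanche() for _ in range(300)])
print("mean size:", sizes.mean().round(1), " max size:", sizes.max())
```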
Recent work suggests that collective computation of social structure can minimize uncertainty about the social and physical environment, facilitating adaptation. We explore these ideas by studying how fission-fusion social structure arises in spider monkey (Ateles geoffroyi) groups, asking whether monkeys use social knowledge to collectively compute subgroup size distributions adaptive for foraging in variable environments. We assess whether individual decisions to stay in or leave subgroups are conditioned on strategies based on the presence or absence of others. We search for this evidence in a time series of subgroup membership. We find that individuals have multiple strategies, suggesting that the social knowledge of different individuals is important. These stay-leave strategies provide microscopic inputs to a stochastic model of collective computation encoded in a family of circuits. Each circuit represents a hypothesis for how collectives combine strategies to make decisions, and how these produce various subgroup size distributions. By running these circuits forward in simulation we generate new subgroup size distributions and measure how well they match food abundance in the environment using transfer entropies. We find that spider monkeys decide to stay or go using information from multiple individuals, and that they can collectively compute a distribution of subgroup sizes that makes efficient use of ephemeral sources of nutrition. We are able to artificially tune circuits to produce subgroup size distributions that fit the environment better than the observed ones. This suggests that a combination of measurement error, constraint, and adaptive lag is diminishing the power of collective computation in this system. These results are relevant for a more general understanding of the emergence of ordered states in multi-scale social systems with adaptive properties, both natural and engineered.
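Transfer entropy, the score used to compare circuit outputs against food abundance, can be estimated with a simple plug-in formula. The sketch below assumes discrete series and one-step histories, which are simplifications:

```python
import numpy as np
from collections import Counter

def transfer_entropy(x, y):
    """TE(X -> Y) in bits with one-step histories: I(Y_{t+1}; X_t | Y_t)."""
    n = len(y) - 1
    cxyz = Counter(zip(y[1:], y[:-1], x[:-1]))  # (future-y, past-y, past-x)
    cyz = Counter(zip(y[:-1], x[:-1]))
    cxy = Counter(zip(y[1:], y[:-1]))
    cy = Counter(y[:-1])
    te = 0.0
    for (yf, yp, xp), c in cxyz.items():
        # p(yf|yp,xp) / p(yf|yp) expressed in raw counts
        te += (c / n) * np.log2((c * cy[yp]) / (cxy[(yf, yp)] * cyz[(yp, xp)]))
    return te

rng = np.random.default_rng(3)
food = rng.integers(0, 2, 2000)                      # toy food-abundance series
group = np.roll(food, 1) ^ (rng.random(2000) < 0.1)  # noisy subgroup proxy tracking food
print("TE(food -> group) =", round(transfer_entropy(food, group), 3), "bits")
```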
Conflicts of interest between members of a group can improve the accuracy of the collective computation they perform.
Sexual reproduction, typically conceived of as a puzzling feature of eukaryotes, has posed an extraordinary evolutionary challenge in terms of the two-fold replicative advantage of asexuals over sexuals. Here we show mathematically that a greater than two-fold cost is paid by retroviruses such as HIV during reverse transcription. For a retrovirus, replication is achieved through RNA reverse transcription and the effectively linear growth processes of DNA transcription during gene expression. Retroviruses are unique among viruses in that they show an alternation of generations between a diploid free-living phase and a haploid integrated phase. Retroviruses engage in extensive recombination during the synthesis of the haploid DNA provirus. Whereas reverse transcription generates large amounts of sequence variation, DNA transcription is a high-fidelity process. Retroviruses come under strong selection pressures from immune systems to generate escape mutants, and reverse transcription int...
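The bookkeeping behind the greater-than-two-fold claim can be sketched in a few lines. The following LaTeX fragment is a hedged reconstruction, not the paper's derivation; the burst size B, genome count g = 2, and success probability q are variables introduced here for illustration:

```latex
% A hedged sketch, not the paper's derivation. Assumptions: each diploid
% virion packages g = 2 RNA genomes; reverse transcription yields at most
% one provirus, succeeding with probability q <= 1.
\documentclass{article}
\begin{document}
Let each infected cell release $B$ virions, each packaging $g = 2$ RNA
genomes, of which reverse transcription converts at most one into an
integrated provirus, with success probability $q \le 1$. The effective
reproductive number per RNA genome is then
\[
  R_{\mathrm{RNA}} \;=\; \frac{qB}{g} \;=\; \frac{qB}{2} \;\le\; \frac{B}{2},
\]
so relative to a hypothetical haploid replicator with the same burst size
($R = B$), the diploid cycle pays a cost of $2/q \ge 2$: exactly two-fold
when reverse transcription never fails, and strictly greater otherwise.
\end{document}
```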
Institutions have been described as ‘the humanly devised constraints that structure political, economic, and social interactions.’ This broad definition of institutions spans social norms, laws, companies, and even scientific theories. We describe a non-equilibrium, multi-scale learning framework supporting institutional quasi-stationarity, periodicity, and switching. Individuals collectively construct ledgers constituting institutions. Agents read only a part of the ledger: the positive and negative opinions of an institution, its “public position,” whose value biases an agent’s preference for that institution over its rivals. These positions encode collective perception and action relating to laws, the power of parties in political office, and advocacy for scientific theories. We consider a diversity of complex temporal phenomena in the history of social and research culture (e.g. scientific revolutions) and provide a new explanation for ubiquitous cultural resistance to change and novelty: a systemic ...
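A minimal agent-based sketch of the ledger mechanism; the window length, logistic choice rule, and parameter values are illustrative assumptions rather than the paper's specification:

```python
import numpy as np

rng = np.random.default_rng(4)
ledgers = [[], []]          # opinion ledgers for two rival institutions
W, beta, T = 50, 0.05, 5000  # public-position window, choice sensitivity, steps

support = []
for t in range(T):
    # Public positions: agents read only the net opinion over the last W entries.
    pos = [sum(l[-W:]) for l in ledgers]
    # An agent endorses institution 0 with probability rising in its relative position.
    p0 = 1.0 / (1.0 + np.exp(-beta * (pos[0] - pos[1])))
    choice = 0 if rng.random() < p0 else 1
    ledgers[choice].append(+1)       # positive opinion for the chosen institution
    ledgers[1 - choice].append(-1)   # negative opinion for its rival
    support.append(choice)

# Positive feedback through the ledger yields long quasi-stationary runs of
# support with occasional switches, a toy analogue of resistance to novelty.
s = np.array(support)
print("fraction endorsing institution 0 per fifth of the run:",
      np.round(s.reshape(5, -1).mean(axis=1), 2))
```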
We construct and analyze a stochastic model of a population of N protein molecules, each of which can be phosphorylated and dephosphorylated at J sites by the same kinase and phosphatase, respectively. In addition, fully dephosphorylated (phosphorylated) proteins feed back to act as catalysts for all intermediate dephosphorylation (phosphorylation) reactions. The population is treated as a vector of occupancies of J + 1 sites in a one-dimensional lattice representing the phosphorylation states of each protein molecule. This model exhibits a continuous phase transition at any J ≥ 2, between a symmetric and a symmetry-breaking state. In addition, we find that the universality class of the phase transition at finite J is different from that at J → ∞.
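A mean-field sketch of the model's fixed points, assuming uniform hopping rates with feedback from the boundary occupancies (rates are illustrative); iterating the self-consistent stationary profile shows a symmetry-broken branch appearing for J ≥ 2 but not for J = 1:

```python
import numpy as np

def fixed_point(J=4, k0=0.1, kf=1.0, iters=2000):
    """Self-consistent occupancy vector p_0..p_J on the state lattice.

    Up-rate is boosted by the fully phosphorylated fraction p_J, down-rate
    by the fully dephosphorylated fraction p_0; for fixed rates the chain's
    stationary profile is geometric, p_j ~ (u/d)^j, which we iterate."""
    p = np.ones(J + 1) / (J + 1)
    p[-1] += 0.01                       # tiny bias to select a broken branch
    p /= p.sum()
    for _ in range(iters):
        u = k0 + kf * p[-1]             # phosphorylation rate (feedback from p_J)
        d = k0 + kf * p[0]              # dephosphorylation rate (feedback from p_0)
        q = (u / d) ** np.arange(J + 1)
        p = q / q.sum()
    return p

for J in (1, 2, 4):
    p = fixed_point(J)
    # Order parameter ~0 at J=1 (symmetric), nonzero for J >= 2 (broken).
    print(f"J={J}: order parameter p_J - p_0 = {p[-1] - p[0]:+.3f}")
```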
The array of physiological changes that occur when humans venture into space for long periods presents a challenge to future exploration. The changes are conventionally investigated independently, but a complete understanding of adaptation requires a conceptual basis founded in integrative physiology, aided by appropriate mathematical modeling. NASA is in the early stages of developing such an approach.