2001, Physical Review Letters
The evolution of many complex systems, including the world wide web and business and citation networks, is encoded in the dynamic web describing the interactions between the system's constituents. Despite their irreversible and non-equilibrium nature, these networks follow Bose statistics and can undergo Bose-Einstein condensation. Addressing the dynamical properties of these non-equilibrium systems within the framework of equilibrium quantum gases predicts that the 'first-mover-advantage', 'fit-get-rich' and 'winner-takes-all' phenomena observed in competitive systems are thermodynamically distinct phases of the underlying evolving networks.
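The mechanism behind this result (the fitness model) attaches new links to node i with probability proportional to η_i k_i, and the Bose-gas mapping assigns each node an energy ε_i = -(1/β) ln η_i. A minimal growth sketch under these assumptions follows; the function name and the uniform fitness distribution are illustrative choices, not taken from the paper:

```python
import random

def grow_fitness_network(n, m=2, seed=0):
    """Grow a network in which node i attracts new links with probability
    proportional to eta_i * k_i (fitness times degree). Fitnesses eta are
    drawn uniformly on (0, 1]; in the Bose-gas mapping each node carries an
    energy e_i = -(1/beta) * ln(eta_i)."""
    rng = random.Random(seed)
    eta = [rng.random() or 1e-12 for _ in range(n)]
    degree = [0] * n
    edges = []
    # start from a small complete core of m+1 nodes
    for i in range(m + 1):
        for j in range(i):
            edges.append((i, j))
            degree[i] += 1
            degree[j] += 1
    for new in range(m + 1, n):
        weights = [eta[i] * degree[i] for i in range(new)]
        total = sum(weights)
        targets = set()
        while len(targets) < m:          # m distinct targets by roulette wheel
            r = rng.random() * total
            acc = 0.0
            for i, w in enumerate(weights):
                acc += w
                if acc >= r:
                    targets.add(i)
                    break
        for t in targets:
            edges.append((new, t))
            degree[new] += 1
            degree[t] += 1
    return degree, eta
```

Tracking the largest entry of `degree` as a fraction of all link endpoints is one way to watch for the condensate (a node holding a finite fraction of the links) as n grows.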
Computing Research Repository, 2001
Complex networks describe a wide range of systems in nature and society. Frequently cited examples include the cell, a network of chemicals linked by chemical reactions, and the Internet, a network of routers and computers connected by physical links. While traditionally these systems have been modeled as random graphs, it is increasingly recognized that the topology and evolution of real networks are governed by robust organizing principles. This article reviews the recent advances in the field of complex networks, focusing on the statistical mechanics of network topology and dynamics. After reviewing the empirical data that motivated the recent interest in networks, the authors discuss the main models and analytical tools, covering random graphs, small-world and scale-free networks, the emerging theory of evolving networks, and the interplay between topology and the network's robustness against failures and attacks.
Journal of Statistical Mechanics: Theory and Experiment, 2014
We investigate choice-driven network growth. In this model, nodes are added one by one according to the following procedure: for each addition event a set of target nodes is selected, each according to linear preferential attachment, and the new node attaches to the target with the highest degree. Depending on the precise details of the attachment rule, the resulting network exhibits one of three possible outcomes: (i) a non-universal power-law degree distribution; (ii) a single macroscopic hub (a node whose degree is of the order of N, the number of network nodes), while the remaining nodes follow a non-universal power-law degree distribution; (iii) a degree distribution that decays as (k ln k)^-2 at the transition between cases (i) and (ii). These properties are robust when attachment occurs to the highest-degree node among at least two targets. When attachment is made to a target whose degree is not the highest, the degree distribution has the double-exponential form exp(-const × e^k), and the largest degree grows only as ln ln N.
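The greedy rule described above can be sketched directly. The function name is illustrative, and the variant in which attachment goes to a lower-ranked candidate is omitted for brevity:

```python
import random

def grow_choice_network(n, r=2, seed=0):
    """Choice-driven growth: each arriving node draws r candidate targets by
    linear preferential attachment (probability proportional to degree), then
    links to the candidate with the highest degree."""
    rng = random.Random(seed)
    degree = [1, 1]                      # start from a single edge
    for _ in range(2, n):
        total = sum(degree)
        candidates = []
        for _ in range(r):               # r independent preferential draws
            x = rng.random() * total
            acc = 0.0
            for i, k in enumerate(degree):
                acc += k
                if acc >= x:
                    candidates.append(i)
                    break
        best = max(candidates, key=lambda i: degree[i])
        degree[best] += 1
        degree.append(1)                 # the new node arrives with one link
    return degree
```

With r = 2 the greedy choice already biases growth strongly toward high-degree nodes; comparing the tail of the returned degree sequence for r = 1 (plain preferential attachment) versus r ≥ 2 illustrates the regimes discussed in the abstract.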
Statistical physics is the natural framework to model complex networks. In the last twenty years, it has brought novel physical insights into a variety of emergent phenomena, such as self-organisation, scale invariance, mixed distributions and ensemble non-equivalence, which cannot be deduced from the behaviour of the individual constituents. At the same time, thanks to its deep connection with information theory, statistical physics and the principle of maximum entropy have led to the definition of null models reproducing some features of empirical networks but otherwise as random as possible. We review here the statistical physics approach for complex networks and the null models for the various physical problems, focusing in particular on the analytic frameworks reproducing the local features of the network. We show how these models have been used to detect statistically significant and predictive structural patterns in real-world networks, as well as to reconstruct the network structure in the case of incomplete information. We further survey the statistical physics frameworks that reproduce more complex, semi-local network features using Markov chain Monte Carlo sampling, and the models of generalised network structures such as multiplex networks, interacting networks and simplicial complexes.

The science of networks has exploded in the Information Age thanks to the unprecedented production and storage of data on basically any human activity. Indeed, a network represents the simplest yet extremely effective way to model a large class of technological, social, economic and biological systems, as a set of entities (nodes) and of interactions (links) among them. These interactions represent the fundamental degrees of freedom of the network, and can be of different types, undirected or directed, binary or valued (weighted), depending on the nature of the system and the resolution used to describe it.
Notably, most networks observed in the real world fall within the domain of complex systems, as they exhibit strong and complicated interaction patterns and feature collective emergent phenomena that do not follow trivially from the behaviours of the individual entities [1]. For instance, many networks are scale-free [2–6], meaning that the number of links incident to a node (known as the node's degree) follows a fat-tailed distribution, sometimes a power law: most nodes have a few links, but a few nodes (the hubs) have many of them. The same holds for the distribution of the total weight of the connections incident to a node (the node's strength) [7, 8]. Similarly, most real-world networks are organised into modules, or feature a community structure [9, 10], and possess high clustering, as nodes tend to form tightly linked groups; they are also small-world [11–13], as the distance (in terms of number of connections) between node pairs scales logarithmically with the system size. The observation of these universal features in complex networks has stimulated the development of a unifying mathematical language to model their structure and to understand the dynamical processes taking place on them, such as the flow of traffic on the Internet or the spreading of diseases or information in a population [14–16]. Two different approaches to network modelling can be pursued. The first consists in identifying one or more microscopic mechanisms driving the formation of the network, and using them to define a dynamic model that reproduces some of the emergent properties of real systems. The small-world model [11], the preferential attachment model [2], the fitness model [5], the relevance model [17] and many others follow this approach, which is akin to kinetic theory. These models can handle only simple microscopic dynamics; thus, while they provide good physical insight, they need several refinements to give quantitatively accurate predictions.
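The small-world property mentioned above is easy to demonstrate numerically: adding a few random shortcuts to a ring lattice collapses the mean shortest-path distance. A minimal sketch in the spirit of the Watts-Strogatz construction (names and parameters are illustrative):

```python
import random
from collections import deque

def avg_distance(adj):
    """Mean shortest-path length, computed by BFS from every node."""
    n = len(adj)
    total, pairs = 0, 0
    for s in range(n):
        dist = {s: 0}
        q = deque([s])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        total += sum(dist.values())
        pairs += len(dist) - 1
    return total / pairs

def ring_with_shortcuts(n, p, seed=0):
    """Ring lattice (each node linked to two neighbours on each side) plus
    roughly p*n random shortcut edges."""
    rng = random.Random(seed)
    adj = [set() for _ in range(n)]
    for i in range(n):
        for d in (1, 2):
            j = (i + d) % n
            adj[i].add(j); adj[j].add(i)
    for _ in range(int(p * n)):
        a, b = rng.randrange(n), rng.randrange(n)
        if a != b:
            adj[a].add(b); adj[b].add(a)
    return adj
```

For a pure ring the mean distance grows linearly with n, whereas a small density of shortcuts brings it down toward the logarithmic scaling characteristic of small-world networks.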
The other approach consists in identifying a set of characteristic static properties of real systems, and then building networks that have the same properties but are otherwise maximally random. This approach is akin to statistical mechanics and is therefore based on rigorous probabilistic arguments that can lead to accurate and reliable predictions. The mathematical framework is that of exponential random graphs (ERG), first introduced in the social sciences and statistics [18–26] as a convenient formulation relying on numerical techniques such as Markov chain Monte Carlo algorithms. The interpretation of ERG in physical terms is due to Park and Newman [27], who showed how to derive them from the principle of maximum entropy and the statistical mechanics of Boltzmann and Gibbs. As formulated by Jaynes [28], the variational principle of maximum entropy states that the probability distribution best representing the current state of (knowledge of) a system is the one that maximises the Shannon entropy, subject to any prior information on the system itself. This means making self-consistent inference assum
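A standard instance of this maximum-entropy construction is the soft (binary) configuration model: connection probabilities take the form p_ij = x_i x_j / (1 + x_i x_j), and the hidden variables x_i are fixed so that expected degrees match an observed degree sequence. The sketch below solves for x_i with a damped fixed-point iteration; the function name and solver details are illustrative, not taken from this review:

```python
def fit_maxent_configuration(degrees, iters=5000, damp=0.5):
    """Solve k_i = sum_{j != i} x_i x_j / (1 + x_i x_j) for the hidden
    variables x_i of the maximum-entropy configuration model, by damped
    sequential fixed-point iteration x_i <- k_i / sum_j x_j/(1 + x_i x_j)."""
    n = len(degrees)
    x = [max(k, 0.1) / n for k in degrees]        # rough initial guess
    for _ in range(iters):
        for i in range(n):
            s = sum(x[j] / (1.0 + x[i] * x[j]) for j in range(n) if j != i)
            x[i] = damp * x[i] + (1.0 - damp) * (degrees[i] / s)
    return x
```

Once the x_i are fitted, the matrix p_ij defines an ensemble that reproduces the degree sequence on average while being maximally random otherwise, which is exactly the null-model role described in the text.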
A data-driven approach to analysing and characterizing complex networks outperforms simulations based on traditional network models.
Advances in Physics, 2002
We review the recent rapid progress in the statistical physics of evolving networks. Interest has focused mainly on the structural properties of random complex networks in communications, biology, the social sciences and economics. A number of giant artificial networks of this kind have come into existence recently, opening a wide field for the study of their topology, evolution, and the complex processes occurring in them. Such networks possess a rich set of scaling properties. A number of them are scale-free and show striking resilience against random breakdowns. In spite of the large sizes of these networks, the distances between most of their vertices are short, a feature known as the 'small-world' effect. We discuss how growing networks self-organize into scale-free structures and the role of the mechanism of preferential linking. We consider the topological and structural properties of evolving networks, and percolation in these networks. We present a number of models demonstrating the main features of evolving networks and discuss current approaches to their simulation and analytical study. Applications of the general results to particular networks in Nature are discussed. We demonstrate the generic connections of the network growth processes with general problems of non-equilibrium physics, econophysics, evolutionary biology, etc.

Contents (excerpt):
IX L. Eigenvalue spectrum of the adjacency matrix
IX M. Scale-free trees
X. Non-scale-free networks with preferential linking
XI. Percolation on networks
XI A. Theory of percolation on undirected equilibrium networks
XI B. Percolation on directed equilibrium networks
XI C. Failures and attacks
XI D. Resilience against random breakdowns
XI E. Intentional damage
XI F. Disease spread within networks
XI G. Anomalous percolation on growing networks
XII. Growth of networks and self-organized criticality
XII A. Linking with sand-pile problems
XII B. Preferential linking and the Simon model
XII C. Multiplicative stochastic models and the generalized Lotka-Volterra equation
XIII. Concluding remarks
Acknowledgements
References
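The resilience against random breakdowns mentioned in this review is often estimated with the Molloy-Reed criterion: a random network with degree moments ⟨k⟩ and ⟨k²⟩ keeps its giant component until a fraction f_c = 1 - 1/(κ - 1) of nodes is removed, with κ = ⟨k²⟩/⟨k⟩. A small sketch comparing a homogeneous (Poisson) degree sequence with a heavy-tailed one (sampling choices are illustrative):

```python
import math
import random

def breakdown_threshold(degrees):
    """Molloy-Reed estimate of the random-failure threshold
    f_c = 1 - 1/(kappa - 1), where kappa = <k^2>/<k>."""
    k1 = sum(degrees) / len(degrees)
    k2 = sum(k * k for k in degrees) / len(degrees)
    return 1.0 - 1.0 / (k2 / k1 - 1.0)

rng = random.Random(0)

def sample_poisson(lam):
    """Knuth's multiplicative Poisson sampler."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

# homogeneous network: Poisson degrees with mean 4 (kappa ~ 5, f_c ~ 0.75)
poisson_degrees = [sample_poisson(4) for _ in range(10000)]

# heavy-tailed network: P(k) proportional to k^-2.5 on 2..1000
ks = list(range(2, 1001))
weights = [k ** -2.5 for k in ks]
powerlaw_degrees = rng.choices(ks, weights=weights, k=10000)
```

Because the heavy tail inflates ⟨k²⟩, the power-law sequence yields a threshold much closer to 1, reproducing the striking resilience of scale-free networks against random failures (while intentional removal of hubs destroys them quickly).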
2008
A basic premise behind the study of large networks is that interaction leads to complex collective behavior. In our work we found very interesting and counterintuitive patterns for time-evolving networks, which change some of the basic assumptions that were made in the past. We then develop models that explain the processes governing network evolution, fit such models to real networks, and use them to generate realistic graphs or give formal explanations of their properties.
Physical Review E, 2008
A condensation transition was predicted for growing technological networks that evolve by preferential attachment with competing node qualities, as described by the fitness model. When this condensation occurs, a single node acquires a finite fraction of all the links of the network. Earlier studies, based on the steady-state degree distribution and on the mapping to Bose-Einstein condensation, were able to identify the critical point. Here we characterize the dynamics of condensation and present evidence that below the condensation temperature the dynamics slow down, and a single node (not necessarily the best node in the network) emerges as the winner for very long times. The characteristic time t* at which this phenomenon occurs diverges both at the critical point and as T → 0, when new links attach deterministically to the highest-quality node of the network. PACS numbers: 89.75.Hc, 89.75.Da, 89.75.Fb
Journal of Management, 1999
The development of the field of strategic management within the last two decades has been dramatic. While its roots lie in a more applied area, often referred to as business policy, the current field of strategic management is strongly theory based, with substantial empirical research, and is eclectic in nature. This review of the development of the field and its current position examines the field's early development and its primary theoretical and methodological bases through its history. Early developments include Chandler's (1962) Strategy and Structure and Ansoff's (1965) Corporate Strategy. These early works adopted a contingency perspective (fit between strategy and structure) and a resource-based framework emphasizing internal strengths and weaknesses. Perhaps one of the more significant contributions to the development of strategic management came from industrial organization (IO) economics, specifically the work of Michael Porter. The structure-conduct-performance framework and the notion of strategic groups, along with the research on competitive dynamics for which they provided a foundation, are currently flourishing. The IO paradigm also brought econometric tools to research on strategic management. Building on the IO economics framework, the organizational economics perspective contributed transaction cost economics and agency theory to strategic management. More recent theoretical contributions focus on the resource-based view of the firm. While it has its roots in Edith Penrose's work in the late 1950s, the resource-based view was largely introduced to the field of strategic management in the 1980s and became a dominant framework in the 1990s. Based on the resource-based view or developing concurrently were research on strategic leadership, strate-