1 - 8 of 8
  • 1.
    Borysov, Stanislav S.
    Stockholm University, Nordic Institute for Theoretical Physics (Nordita). KTH Royal Institute of Technology, Sweden.
    Roudi, Yasser
    Stockholm University, Nordic Institute for Theoretical Physics (Nordita). The Kavli Institute for Systems Neuroscience, NTNU, Norway.
    Balatsky, Alexander V.
    Stockholm University, Nordic Institute for Theoretical Physics (Nordita). Institute for Materials Science, Los Alamos National Laboratory, Los Alamos, USA.
    US stock market interaction network as learned by the Boltzmann machine. 2015. In: European Physical Journal B: Condensed Matter Physics, ISSN 1434-6028, E-ISSN 1434-6036, Vol. 88, no. 12. Article in journal (Refereed).
    Abstract [en]

We study the historical dynamics of the joint equilibrium distribution of stock returns in the U.S. stock market using a Boltzmann distribution model parametrized by external fields and pairwise couplings. Within the Boltzmann learning framework for statistical inference, we analyze the historical behavior of the parameters inferred using exact and approximate learning algorithms. Since the model and inference methods require binary variables, the effect of mapping continuous returns to the discrete domain is studied. The presented results show that binarization preserves the correlation structure of the market. Properties of the distributions of external fields and couplings, as well as of the market interaction network and the industry-sector clustering structure, are studied for different historical dates and moving-window sizes. We demonstrate that the observed positive heavy tail in the distribution of couplings is related to the sparse clustering structure of the market. We also show that discrepancies between the model's parameters might be used as a precursor of financial instabilities.
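
The pairwise model referred to here is the standard Ising/Boltzmann form P(s) proportional to exp(sum_i h_i s_i + sum_{i<j} J_ij s_i s_j) over binary spins s_i = +-1. Below is a minimal sketch of one common approximate learning scheme of the kind mentioned in the abstract (naive mean-field inversion) applied to binarized returns; the synthetic data, dimensions, and binarization rule are illustrative assumptions, not the authors' pipeline.

    # A minimal sketch, not the authors' code: naive mean-field inference of a
    # pairwise Ising model from binarized "returns". Synthetic data stand in
    # for real price series; sizes and names are illustrative assumptions.
    import numpy as np

    rng = np.random.default_rng(0)
    returns = rng.normal(size=(2000, 10))                      # T days x N stocks (synthetic)
    s = np.where(returns > np.median(returns, axis=0), 1, -1)  # map returns to +-1 spins

    m = s.mean(axis=0)                                         # magnetizations <s_i>
    C = np.cov(s, rowvar=False)                                # connected correlation matrix
    J = -np.linalg.inv(C)                                      # naive mean-field couplings
    np.fill_diagonal(J, 0.0)
    h = np.arctanh(np.clip(m, -0.999, 0.999)) - J @ m          # naive mean-field fields

    print("coupling matrix:", J.shape, "fields:", h.shape)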

  • 2. Dunn, Benjamin
    Roudi, Yasser
    Stockholm University, Nordic Institute for Theoretical Physics (Nordita).
    Learning and inference in a nonequilibrium Ising model with hidden nodes. 2013. In: Physical Review E: Statistical, Nonlinear, and Soft Matter Physics, ISSN 1539-3755, E-ISSN 1550-2376, Vol. 87, no. 2, p. 022127. Article in journal (Refereed).
    Abstract [en]

We study inference and reconstruction of couplings in a partially observed kinetic Ising model. With hidden spins, calculating the likelihood of a sequence of observed spin configurations requires performing a trace over the configurations of the hidden ones. This, as we show, can be represented as a path integral. Using this representation, we demonstrate that systematic approximate inference and learning rules can be derived using dynamical mean-field theory. Although naive mean-field theory leads to an unstable learning rule, taking into account Gaussian corrections allows learning the couplings involving hidden nodes. It also improves learning of the couplings between the observed nodes compared to when hidden nodes are ignored.
DOI: 10.1103/PhysRevE.87.022127
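
For context, a common convention for a synchronous-update kinetic Ising model of this kind (the paper's notation may differ in detail) is

    P\bigl(\mathbf{s}(t+1)\mid\mathbf{s}(t)\bigr)
      = \prod_i \frac{\exp\!\bigl[s_i(t+1)\,\theta_i(t)\bigr]}{2\cosh\theta_i(t)},
    \qquad
    \theta_i(t) = h_i + \sum_j J_{ij}\, s_j(t).

Splitting the spins into observed ones sigma and hidden ones tau, the log-likelihood of an observed history is

    \mathcal{L} = \log \sum_{\{\tau(1),\dots,\tau(T)\}}
      \prod_{t=1}^{T-1} P\bigl(\sigma(t+1),\tau(t+1)\mid\sigma(t),\tau(t)\bigr),

and it is this trace over hidden configurations that the paper represents as a path integral and evaluates with dynamical mean-field theory.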

  • 3.
    Hertz, John A.
    Stockholm University, Nordic Institute for Theoretical Physics (Nordita). University of Copenhagen, Denmark.
    Roudi, Yasser
    Stockholm University, Nordic Institute for Theoretical Physics (Nordita). Kavli Institute for Systems Neuroscience and Centre for Neural Computation, NTNU, Norway; Institute for Advanced Study, Princeton, NJ, USA.
    Sollich, Peter
    Path integral methods for the dynamics of stochastic and disordered systems. 2017. In: Journal of Physics A: Mathematical and Theoretical, ISSN 1751-8113, E-ISSN 1751-8121, Vol. 50, no. 3, article id 033001. Article, review/survey (Refereed).
    Abstract [en]

    We review some of the techniques used to study the dynamics of disordered systems subject to both quenched and fast (thermal) noise. Starting from the Martin-Siggia-Rose/Janssen-De Dominicis-Peliti path integral formalism for a single variable stochastic dynamics, we provide a pedagogical survey of the perturbative, i.e. diagrammatic, approach to dynamics and how this formalism can be used for studying soft spin models. We review the supersymmetric formulation of the Langevin dynamics of these models and discuss the physical implications of the supersymmetry. We also describe the key steps involved in studying the disorder-averaged dynamics. Finally, we discuss the path integral approach for the case of hard Ising spins and review some recent developments in the dynamics of such kinetic Ising models.
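
For orientation, the Martin-Siggia-Rose/Janssen-De Dominicis-Peliti construction reviewed here starts, in one common convention (prefactors and the handling of the response field \hat{x} vary between references), from a Langevin equation and its generating functional:

    \dot{x}(t) = f(x(t)) + \xi(t), \qquad
    \langle \xi(t)\,\xi(t') \rangle = 2T\,\delta(t - t'),

    \mathcal{Z} = \int \mathcal{D}x\,\mathcal{D}\hat{x}\;
      \exp\!\left( \int \mathrm{d}t \left[\, i\hat{x}(t)\bigl(\dot{x}(t) - f(x(t))\bigr)
      - T\,\hat{x}(t)^{2} \,\right] \right),

with correlation and response functions obtained by differentiating with respect to sources coupled to x and i\hat{x}.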

  • 4.
    Hertz, John
    Stockholm University, Nordic Institute for Theoretical Physics (Nordita).
    Roudi, Yasser
    Stockholm University, Nordic Institute for Theoretical Physics (Nordita).
    Thorning, Andreas
    Niels Bohr Institute, Copenhagen University, 2100 Copenhagen Ø, Denmark.
    Tyrcha, Joanna
    Stockholm University, Faculty of Science, Department of Mathematics.
    Aurell, Erik
    Department of Computational Biology, Royal Institute of Technology, 106 91 Stockholm, Sweden.
    Zeng, Hong-Li
    Department of Applied Physics, Helsinki University of Technology, 02015 TKK Espoo, Finland.
    Inferring network connectivity using kinetic Ising models. 2010. In: BMC Neuroscience (Online), ISSN 1471-2202, E-ISSN 1471-2202, Vol. 11, no. Suppl 1, p. P51. Article in journal (Refereed).
  • 5. Marsili, Matteo
    Mastromatteo, Iacopo
    Roudi, Yasser
    Stockholm University, Nordic Institute for Theoretical Physics (Nordita). Norwegian University of Science & Technology.
    On sampling and modeling complex systems. 2013. In: Journal of Statistical Mechanics: Theory and Experiment, ISSN 1742-5468, E-ISSN 1742-5468, p. P09003. Article in journal (Refereed).
    Abstract [en]

The study of complex systems is limited by the fact that only a few variables are accessible for modeling and sampling, and these are not necessarily the most relevant ones for explaining the system's behavior. In addition, empirical data typically undersample the space of possible states. We study a generic framework in which a complex system is seen as a system of many interacting degrees of freedom, only partially known, that optimize a given function. We show that the underlying distribution with respect to the known variables has the Boltzmann form, with a temperature that depends on the number of unknown variables. In particular, when the influence of the unknown degrees of freedom on the known variables is not too irregular, the temperature decreases as the number of variables increases. This suggests that models can be predictable only when the number of relevant variables is less than a critical threshold. Concerning sampling, we argue that the information that a sample contains on the behavior of the system is quantified by the entropy of the frequency with which different states occur. This allows us to characterize the properties of maximally informative samples: within a simple approximation, the most informative frequency distributions have power-law behavior, and Zipf's law emerges at the crossover between the undersampled regime and the regime where the sample contains enough statistics to make inferences about the behavior of the system. These ideas are illustrated in some applications, showing that they can be used to identify relevant variables or to select the most informative representations of data, e.g. in data clustering.
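
As a rough illustration on synthetic data (not the paper's estimator), the two sample-level quantities discussed in the abstract, the entropy of the frequencies with which states occur and the rank-ordered frequencies behind Zipf plots, can be computed as follows; the generating process, sizes, and names are arbitrary assumptions.

    # Illustrative sketch only: empirical entropy of observed state frequencies
    # and the rank-frequency (Zipf) curve for a synthetic sample of binary states.
    import numpy as np
    from collections import Counter

    rng = np.random.default_rng(2)
    sample = []
    for _ in range(5000):
        p = rng.beta(2.0, 5.0)                       # per-sample bias, chosen arbitrarily
        sample.append(tuple((rng.random(10) < p).astype(int)))

    counts = np.array(sorted(Counter(sample).values(), reverse=True))
    freqs = counts / counts.sum()                    # rank-ordered state frequencies
    H_hat = -np.sum(freqs * np.log(freqs))           # entropy of the empirical frequencies
    print(f"empirical entropy: {H_hat:.2f} nats")
    print("top-5 rank-ordered frequencies:", np.round(freqs[:5], 4))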

  • 6.
    Roudi, Yasser
    Stockholm University, Nordic Institute for Theoretical Physics (Nordita).
    Tyrcha, Joanna
    Stockholm University, Faculty of Science, Department of Mathematics.
    Hertz, John
    Stockholm University, Nordic Institute for Theoretical Physics (Nordita).
    Fast and reliable methods for extracting functional connectivity in large populations. 2009. In: BMC Neuroscience (Online), ISSN 1471-2202, E-ISSN 1471-2202, Vol. 10, no. Suppl 1, p. 09. Article in journal (Refereed).
  • 7.
    Tyrcha, Joanna
    Stockholm University, Faculty of Science, Department of Mathematics.
    Roudi, Yasser
    Stockholm University, Nordic Institute for Theoretical Physics (Nordita). Kavli Institute for Systems Neuroscience, NTNU, Norway.
    Marsili, Matteo
    Hertz, John
    Stockholm University, Nordic Institute for Theoretical Physics (Nordita). Niels Bohr Institute, Denmark.
    The effect of nonstationarity on models inferred from neural data. 2013. In: Journal of Statistical Mechanics: Theory and Experiment, ISSN 1742-5468, E-ISSN 1742-5468, article id P03005. Article in journal (Refereed).
    Abstract [en]

    Neurons subject to a common nonstationary input may exhibit a correlated firing behavior. Correlations in the statistics of neural spike trains also arise as the effect of interaction between neurons. Here we show that these two situations can be distinguished with machine learning techniques, provided that the data are rich enough. In order to do this, we study the problem of inferring a kinetic Ising model, stationary or nonstationary, from the available data. We apply the inference procedure to two data sets: one from salamander retinal ganglion cells and the other from a realistic computational cortical network model. We show that many aspects of the concerted activity of the salamander retinal neurons can be traced simply to the external input. A model of non-interacting neurons subject to a nonstationary external field outperforms a model with stationary input with couplings between neurons, even accounting for the differences in the number of model parameters. When couplings are added to the nonstationary model, for the retinal data, little is gained: the inferred couplings are generally not significant. Likewise, the distribution of the sizes of sets of neurons that spike simultaneously and the frequency of spike patterns as a function of their rank (Zipf plots) are well explained by an independent-neuron model with time-dependent external input, and adding connections to such a model does not offer significant improvement. For the cortical model data, robust couplings, well correlated with the real connections, can be inferred using the nonstationary model. Adding connections to this model slightly improves the agreement with the data for the probability of synchronous spikes but hardly affects the Zipf plot.
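
A hedged sketch of the independent-neuron baseline with a time-dependent external field that the abstract compares against coupled models: with +-1 spins and P(s_i(t) = +1) = (1 + tanh h_i(t))/2, the maximum-likelihood field is h_i(t) = arctanh m_i(t), where m_i(t) is the across-trial mean. The synthetic data below are an assumption; the paper's actual fitting of the retinal and cortical-model data may differ in detail.

    # Illustrative sketch only: recover a time-dependent field from trial-averaged
    # activity of independent +-1 "neurons" driven by that field.
    import numpy as np

    rng = np.random.default_rng(3)
    trials, T, N = 200, 100, 5
    h_true = np.sin(np.linspace(0, 4 * np.pi, T))[:, None] * rng.uniform(0.3, 1.0, N)
    p_up = 0.5 * (1 + np.tanh(h_true))               # (T, N) spiking probabilities
    s = np.where(rng.random((trials, T, N)) < p_up, 1, -1)

    m = s.mean(axis=0)                               # trial-averaged activity m_i(t)
    h_hat = np.arctanh(np.clip(m, -0.999, 0.999))    # ML field for independent spins
    print("mean absolute error:", float(np.abs(h_hat - h_true).mean()))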

  • 8. Zeng, Hong-Li
    Alava, Mikko
    Aurell, Erik
    Hertz, John
    Stockholm University, Nordic Institute for Theoretical Physics (Nordita).
    Roudi, Yasser
    Stockholm University, Nordic Institute for Theoretical Physics (Nordita).
    Maximum Likelihood Reconstruction for Ising Models with Asynchronous Updates. 2013. In: Physical Review Letters, ISSN 0031-9007, E-ISSN 1079-7114, Vol. 110, no. 21, p. 210601. Article in journal (Refereed).
    Abstract [en]

    We describe how the couplings in an asynchronous kinetic Ising model can be inferred. We consider two cases: one in which we know both the spin history and the update times and one in which we know only the spin history. For the first case, we show that one can average over all possible choices of update times to obtain a learning rule that depends only on spin correlations and can also be derived from the equations of motion for the correlations. For the second case, the same rule can be derived within a further decoupling approximation. We study all methods numerically for fully asymmetric Sherrington-Kirkpatrick models, varying the data length, system size, temperature, and external field. Good convergence is observed in accordance with the theoretical expectations.
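
A minimal sketch of the first case in the abstract (spin history and update times both known), assuming standard asynchronous Glauber dynamics: the likelihood then factorizes over recorded update events, and maximum likelihood reduces to a concave, logistic-regression-like gradient ascent per spin. The correlation-based averaged rule and the unknown-update-time case derived in the paper are not reproduced here; all sizes and names are illustrative.

    # Illustrative sketch only: exact ML for an asynchronous kinetic Ising model
    # when the updated spin and update times are recorded along with the history.
    import numpy as np

    rng = np.random.default_rng(4)
    N, n_updates = 10, 50000
    J_true = rng.normal(scale=1.0 / np.sqrt(N), size=(N, N))   # fully asymmetric couplings
    np.fill_diagonal(J_true, 0.0)

    s = np.where(rng.random(N) < 0.5, 1.0, -1.0)
    states, targets, updated = [], [], []
    for _ in range(n_updates):                                 # one spin updated at a time
        i = int(rng.integers(N))
        states.append(s.copy())
        updated.append(i)
        p_up = 0.5 * (1 + np.tanh(J_true[i] @ s))
        s[i] = 1.0 if rng.random() < p_up else -1.0
        targets.append(s[i])
    states, targets, updated = np.array(states), np.array(targets), np.array(updated)

    J = np.zeros((N, N))
    eta = 0.05
    for _ in range(300):                                       # concave, so plain ascent works
        for i in range(N):
            X, y = states[updated == i], targets[updated == i]
            J[i] += eta * (y - np.tanh(X @ J[i])) @ X / len(y)
    print("similarity to true couplings:",
          float(np.corrcoef(J.ravel(), J_true.ravel())[0, 1]))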
