1 - 28 of 28
  • 1. Battistin, Claudia
    et al.
    Hertz, John
    Stockholm University, Nordic Institute for Theoretical Physics (Nordita). Niels Bohr Institute, Denmark.
    Tyrcha, Joanna
    Stockholm University, Faculty of Science, Department of Mathematics.
    Roudi, Yasser
    Stockholm University, Nordic Institute for Theoretical Physics (Nordita). Kavli Institute for Systems Neuroscience/Centre for Neural Computation, Norway.
Belief propagation and replicas for inference and learning in a kinetic Ising model with hidden spins, 2015. In: Journal of Statistical Mechanics: Theory and Experiment, ISSN 1742-5468, E-ISSN 1742-5468, article id P05021. Article in journal (Refereed)
    Abstract [en]

    We propose a new algorithm for inferring the state of hidden spins and reconstructing the connections in a synchronous kinetic Ising model, given the observed history. Focusing on the case in which the hidden spins are conditionally independent of each other given the state of observable spins, we show that calculating the likelihood of the data can be simplified by introducing a set of replicated auxiliary spins. Belief propagation (BP) and susceptibility propagation (SusP) can then be used to infer the states of hidden variables and to learn the couplings. We study the convergence and performance of this algorithm for networks with both Gaussian-distributed and binary bonds. We also study how the algorithm behaves as the fraction of hidden nodes and the amount of data are changed, showing that it outperforms the Thouless-Anderson-Palmer (TAP) equations for reconstructing the connections.
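The synchronous kinetic Ising dynamics that the inference in this paper targets can be written down in a few lines. The sketch below is an illustration of the model itself, not of the belief-propagation algorithm the paper proposes; it simulates parallel Glauber updates for a toy network with Gaussian-distributed couplings:

```python
import numpy as np

def simulate_kinetic_ising(J, h, T, rng=None):
    """Synchronous (parallel-update) kinetic Ising model:
    P(s_i(t+1) = +1 | s(t)) = 1 / (1 + exp(-2 (h_i + sum_j J_ij s_j(t)))).
    Returns a T x N array of +/-1 spin histories."""
    rng = np.random.default_rng(rng)
    N = len(h)
    s = rng.choice([-1.0, 1.0], size=N)
    history = np.empty((T, N))
    for t in range(T):
        history[t] = s
        field = h + J @ s                        # local fields at time t
        p_up = 1.0 / (1.0 + np.exp(-2.0 * field))
        s = np.where(rng.random(N) < p_up, 1.0, -1.0)
    return history

# Toy network: 5 spins, Gaussian couplings scaled as J ~ N(0, 1/N)
rng = np.random.default_rng(0)
N = 5
J = rng.normal(0.0, 1.0 / np.sqrt(N), size=(N, N))
h = np.zeros(N)
spins = simulate_kinetic_ising(J, h, T=1000, rng=1)
print(spins.shape)  # (1000, 5)
```

A history generated this way (with some rows treated as hidden) is the kind of data the paper's BP/SusP machinery is designed to invert.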

  • 2.
    Hertz, John A.
    et al.
    Stockholm University, Nordic Institute for Theoretical Physics (Nordita).
    Roudi, Yasser
    Stockholm University, Nordic Institute for Theoretical Physics (Nordita).
    Tyrcha, Joanna
    Stockholm University, Faculty of Science, Department of Mathematics.
Ising model for inferring network structure from spike data, 2013. In: Principles of Neural Coding / [ed] Rodrigo Quian Quiroga, Stefano Panzeri, Boca Raton: CRC Press, 2013, p. 527-546. Chapter in book (Refereed)
    Abstract [en]

    Now that spike trains from many neurons can be recorded simultaneously, there is a need for methods to decode these data to learn about the networks that these neurons are part of. One approach to this problem is to adjust the parameters of a simple model network to make its spike trains resemble the data as much as possible. The connections in the model network can then give us an idea of how the real neurons that generated the data are connected and how they influence each other. In this chapter we describe how to do this for the simplest kind of model: an Ising network. We derive algorithms for finding the best model connection strengths for fitting a given data set, as well as faster approximate algorithms based on mean field theory. We test the performance of these algorithms on data from model networks and experiments.
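One widely used fast approximation for the inverse Ising problem described here is the naive mean-field (linear-response) inversion, in which the couplings are read off from the inverse of the correlation matrix, J_ij ≈ -(C^-1)_ij for i ≠ j. A minimal sketch of that general technique (not code from the chapter):

```python
import numpy as np

def naive_mean_field_couplings(spins):
    """Naive mean-field inverse Ising: given a T x N array of +/-1 spins,
    estimate couplings from linear response, J_ij = -(C^-1)_ij for i != j,
    where C is the connected correlation matrix, and fields from the
    mean-field equation m_i = tanh(h_i + sum_j J_ij m_j)."""
    m = spins.mean(axis=0)                      # magnetizations <S_i>
    C = np.cov(spins, rowvar=False)             # connected correlations
    J = -np.linalg.inv(C)
    np.fill_diagonal(J, 0.0)                    # no self-couplings in the model
    h = np.arctanh(np.clip(m, -0.999, 0.999)) - J @ m
    return J, h

# Toy input: independent random spins (true couplings ~ 0)
rng = np.random.default_rng(0)
spins = np.where(rng.random((1000, 4)) < 0.5, -1.0, 1.0)
J, h = naive_mean_field_couplings(spins)
print(J.shape, h.shape)  # (4, 4) (4,)
```

For independent data the estimated couplings fluctuate around zero at a scale set by the sampling noise in C, which is one way to calibrate significance thresholds.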

  • 3.
    Hertz, John
    et al.
    Stockholm University, Nordic Institute for Theoretical Physics (Nordita).
    Roudi, Yasser
    Stockholm University, Nordic Institute for Theoretical Physics (Nordita).
    Thorning, Andreas
Niels Bohr Institute, Copenhagen University, 2100 Copenhagen Ø, Denmark.
    Tyrcha, Joanna
    Stockholm University, Faculty of Science, Department of Mathematics.
    Aurell, Erik
    Department of Computational Biology, Royal Institute of Technology, 106 91 Stockholm, Sweden.
    Zeng, Hong-Li
    Department of Applied Physics, Helsinki University of Technology, 02015 TKK Espoo, Finland.
Inferring network connectivity using kinetic Ising models, 2010. In: BMC Neuroscience (Online), ISSN 1471-2202, E-ISSN 1471-2202, Vol. 11, no Suppl 1, p. P51. Article in journal (Refereed)
  • 4.
    Hertz, John
    et al.
    Stockholm University, Nordic Institute for Theoretical Physics (Nordita).
    Roudi, Yasser
    Stockholm University, Nordic Institute for Theoretical Physics (Nordita).
    Tyrcha, Joanna
    Stockholm University, Faculty of Science, Department of Mathematics.
Ising model for inferring network structure from spike data: In Principles of Neural Coding. Article in journal (Refereed)
  • 5.
    Jafari-Mamaghani, Mehrdad
    et al.
    Stockholm University, Faculty of Science, Department of Mathematics.
    Tyrcha, Joanna
    Stockholm University, Faculty of Science, Department of Mathematics.
Transfer entropy expressions for a class of non-Gaussian distributions. Manuscript (preprint) (Other academic)
    Abstract [en]

    Transfer entropy is a frequently employed measure of conditional co-dependence in non-parametric analysis of Granger causality. In this paper, we derive analytical expressions for transfer entropy for the multivariate exponential, logistic, Pareto (type I − IV) and Burr distributions. The latter two fall into the class of fat-tailed distributions with power law properties, used frequently in biological, physical and actuarial sciences. We discover that the transfer entropy expressions for all four distributions are identical and depend merely on the multivariate distribution parameter and the number of distribution dimensions. Moreover, we find that in all four cases the transfer entropies are given by the same decreasing function of distribution dimensionality.

  • 6.
    Jafari-Mamaghani, Mehrdad
    et al.
    Stockholm University, Faculty of Science, Department of Mathematics. Karolinska Institutet, Sweden.
    Tyrcha, Joanna
    Stockholm University, Faculty of Science, Department of Mathematics.
Transfer Entropy Expressions for a Class of Non-Gaussian Distributions, 2014. In: Entropy, ISSN 1099-4300, E-ISSN 1099-4300, Vol. 16, no 3, p. 1743-1755. Article in journal (Refereed)
    Abstract [en]

    Transfer entropy is a frequently employed measure of conditional co-dependence in non-parametric analysis of Granger causality. In this paper, we derive analytical expressions for transfer entropy for the multivariate exponential, logistic, Pareto (type I - IV) and Burr distributions. The latter two fall into the class of fat-tailed distributions with power law properties, used frequently in biological, physical and actuarial sciences. We discover that the transfer entropy expressions for all four distributions are identical and depend merely on the multivariate distribution parameter and the number of distribution dimensions. Moreover, we find that in all four cases the transfer entropies are given by the same decreasing function of distribution dimensionality.
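For discrete time series, the transfer entropy that the paper treats analytically can also be estimated directly from its definition, TE_{Y→X} = Σ p(x', x, y) log[ p(x' | x, y) / p(x' | x) ]. The plug-in estimator below illustrates the definition only; the paper itself derives closed-form expressions for continuous multivariate distributions:

```python
import numpy as np
from collections import Counter

def transfer_entropy(x, y):
    """Plug-in estimate of transfer entropy TE_{Y->X} (in nats) for two
    discrete time series, with history length 1."""
    triples = Counter(zip(x[1:], x[:-1], y[:-1]))   # (x_{t+1}, x_t, y_t)
    pairs_xy = Counter(zip(x[:-1], y[:-1]))
    pairs_xx = Counter(zip(x[1:], x[:-1]))
    singles = Counter(x[:-1])
    n = len(x) - 1
    te = 0.0
    for (x1, x0, y0), c in triples.items():
        p_joint = c / n
        p_cond_full = c / pairs_xy[(x0, y0)]        # p(x1 | x0, y0)
        p_cond_x = pairs_xx[(x1, x0)] / singles[x0]  # p(x1 | x0)
        te += p_joint * np.log(p_cond_full / p_cond_x)
    return te

# A series y that is copied into x one step later carries maximal information:
rng = np.random.default_rng(0)
y = rng.integers(0, 2, 5000)
x = np.roll(y, 1)               # x(t+1) = y(t)
te = transfer_entropy(x, y)
print(te)                        # ~ log 2 ≈ 0.693
```

Because x(t+1) is fully determined by y(t), the estimated TE approaches H(X) = log 2 nats for a fair binary source.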

  • 7. Lock, John G.
    et al.
    Jafari-Mamaghani, Mehrdad
    Stockholm University, Faculty of Science, Department of Mathematics. Karolinska Institute, Sweden.
    Shafqat-Abbasi, Hamdah
    Gong, Xiaowei
    Tyrcha, Joanna
    Stockholm University, Faculty of Science, Department of Mathematics.
    Strömblad, Staffan
Plasticity in the Macromolecular-Scale Causal Networks of Cell Migration, 2014. In: PLoS ONE, ISSN 1932-6203, E-ISSN 1932-6203, Vol. 9, no 2, p. e90593. Article in journal (Refereed)
    Abstract [en]

Heterogeneous and dynamic single cell migration behaviours arise from a complex multi-scale signalling network comprising both molecular components and macromolecular modules, among which cell-matrix adhesions and F-actin directly mediate migration. To date, the global wiring architecture characterizing this network remains poorly defined. It is also unclear whether such a wiring pattern may be stable and generalizable to different conditions, or plastic and context dependent. Here, synchronous imaging-based quantification of migration system organization, represented by 87 morphological and dynamic macromolecular module features, and migration system behaviour, i.e., migration speed, facilitated Granger causality analysis. We thereby leveraged natural cellular heterogeneity to begin mapping the directionally specific causal wiring between organizational and behavioural features of the cell migration system. This represents an important advance on commonly used correlative analyses that do not resolve causal directionality. We identified organizational features such as adhesion stability and adhesion F-actin content that, as anticipated, causally influenced cell migration speed. Strikingly, we also found that cell speed can exert causal influence over organizational features, including cell shape and adhesion complex location, thus revealing causality in directions contradictory to previous expectations. Importantly, by comparing unperturbed and signalling-modulated cells, we provide proof-of-principle that causal interaction patterns are in fact plastic and context dependent, rather than stable and generalizable.

  • 8. Mielniczuk, Jan
    et al.
    Tyrcha, Joanna
    Stockholm University, Faculty of Science, Department of Mathematics.
Consistency of multilayer perceptron regression estimators, 1993. In: Neural Networks, Vol. 6, p. 1019-1022. Article in journal (Refereed)
  • 9. Nellaker, Christoffer
    et al.
    Li, Fang
    Uhrzander, Fredrik
    Stockholm University.
    Tyrcha, Joanna
    Stockholm University, Faculty of Science, Department of Mathematics.
    Karlsson, Hakan
Expression profiling of repetitive elements by melting temperature analysis: variation in HERV-W gag expression across human individuals and tissues, 2009. In: BMC Genomics, ISSN 1471-2164, E-ISSN 1471-2164, Vol. 10, p. 532. Article in journal (Refereed)
    Abstract [en]

Background: Human endogenous retroviruses (HERV) constitute approximately 8% of the human genome and have long been considered "junk". The sheer number and repetitive nature of these elements make studies of their expression methodologically challenging. Hence, little is known of transcription of genomic regions harboring such elements. Results: Applying a recently developed technique for obtaining high resolution melting temperature data, we examined the frequency distributions of HERV-W gag elements across 13 Tm categories in human tissues. Transcripts containing HERV-W gag sequences were expressed in non-random patterns with extensive variations in the expression between both tissues, including different brain regions, and individuals. Furthermore, the patterns of such transcripts varied more between individuals in brain regions than in other tissues. Conclusion: Thus, regulated expression of non-coding regions of the human genome appears to include the HERV-W family of repetitive elements. Although it remains to be established whether such expression patterns represent leakage from transcription of functional regions or specific transcription, the current approach proves itself useful for studying detailed expression patterns of repetitive regions.

  • 10. Nellåker, Christoffer
    et al.
    Uhrzander, Fredrik
    Tyrcha, Joanna
Stockholm University, Faculty of Science, Department of Mathematics. Mathematical Statistics.
    Karlsson, Håkan
Mixture models for analysis of melting temperature data, 2008. In: BMC Bioinformatics, Vol. 9:370. Article in journal (Refereed)
    Abstract [en]

    Background

    In addition to their use in detecting undesired real-time PCR products, melting temperatures are useful for detecting variations in the desired target sequences. Methodological improvements in recent years allow the generation of high-resolution melting-temperature (Tm) data. However, there is currently no convention on how to statistically analyze such high-resolution Tm data.

    Results

    Mixture model analysis was applied to Tm data. Models were selected based on Akaike's information criterion. Mixture model analysis correctly identified categories in Tm data obtained for known plasmid targets. Using simulated data, we investigated the number of observations required for model construction. The precision of the reported mixing proportions from data fitted to a preconstructed model was also evaluated.

    Conclusion

    Mixture model analysis of Tm data allows the minimum number of different sequences in a set of amplicons and their relative frequencies to be determined. This approach allows Tm data to be analyzed, classified, and compared in an unbiased manner.
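The workflow described above, fitting mixture models to Tm data and selecting among them with Akaike's information criterion, can be sketched as follows. This is an illustrative reimplementation on simulated data, not the authors' code; the quantile-based initialization is my own choice:

```python
import numpy as np

def fit_gmm_1d(x, k, n_iter=200):
    """EM for a 1-D Gaussian mixture with k components (quantile init).
    Returns (log-likelihood, weights, means, stds)."""
    w = np.full(k, 1.0 / k)
    mu = np.quantile(x, np.linspace(0.1, 0.9, k))
    sd = np.full(k, x.std())

    def density(w, mu, sd):
        return w * np.exp(-0.5 * ((x[:, None] - mu) / sd) ** 2) / (sd * np.sqrt(2.0 * np.pi))

    for _ in range(n_iter):
        dens = density(w, mu, sd)
        r = dens / dens.sum(axis=1, keepdims=True)   # E-step: responsibilities
        nk = r.sum(axis=0)                           # M-step: update parameters
        w = nk / len(x)
        mu = (r * x[:, None]).sum(axis=0) / nk
        sd = np.sqrt((r * (x[:, None] - mu) ** 2).sum(axis=0) / nk) + 1e-6
    ll = np.log(density(w, mu, sd).sum(axis=1)).sum()
    return ll, w, mu, sd

def aic(ll, k):
    """AIC = 2p - 2 ln L with p = 3k - 1 free parameters per mixture."""
    return 2.0 * (3 * k - 1) - 2.0 * ll

# Simulated Tm readings from two well-separated melting-temperature classes:
rng = np.random.default_rng(1)
tm = np.concatenate([rng.normal(78.0, 0.3, 300), rng.normal(83.0, 0.3, 300)])
ll1, *_ = fit_gmm_1d(tm, 1)
ll2, _, mu2, _ = fit_gmm_1d(tm, 2)
print(aic(ll2, 2) < aic(ll1, 1), np.sort(mu2))  # two-component model wins
```

The recovered mixing proportions and means then identify the minimum number of distinct amplicon sequences and their relative frequencies, as in the abstract.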

  • 11.
    Roudi, Yasser
    et al.
    Stockholm University, Nordic Institute for Theoretical Physics (Nordita).
    Tyrcha, Joanna
    Stockholm University, Faculty of Science, Department of Mathematics.
    Hertz, John
    Stockholm University, Nordic Institute for Theoretical Physics (Nordita).
Fast and reliable methods for extracting functional connectivity in large populations, 2009. In: BMC Neuroscience (Online), ISSN 1471-2202, E-ISSN 1471-2202, Vol. 10, no Suppl 1, p. 09. Article in journal (Refereed)
  • 12. Roudi, Yasser
    et al.
    Tyrcha, Joanna
    Stockholm University, Faculty of Science, Department of Mathematics.
    Hertz, John
Ising model for neural data: Model quality and approximate methods for extracting functional connectivity, 2009. In: Physical Review E: Statistical, Nonlinear, and Soft Matter Physics, ISSN 1539-3755, E-ISSN 1550-2376, Vol. 79, no 5, p. 51915. Article in journal (Refereed)
    Abstract [en]

    We study pairwise Ising models for describing the statistics of multineuron spike trains, using data from a simulated cortical network. We explore efficient ways of finding the optimal couplings in these models and examine their statistical properties. To do this, we extract the optimal couplings for subsets of size up to 200 neurons, essentially exactly, using Boltzmann learning. We then study the quality of several approximate methods for finding the couplings by comparing their results with those found from Boltzmann learning. Two of these methods-inversion of the Thouless-Anderson-Palmer equations and an approximation proposed by Sessak and Monasson-are remarkably accurate. Using these approximations for larger subsets of neurons, we find that extracting couplings using data from a subset smaller than the full network tends systematically to overestimate their magnitude. This effect is described qualitatively by infinite-range spin-glass theory for the normal phase. We also show that a globally correlated input to the neurons in the network leads to a small increase in the average coupling. However, the pair-to-pair variation in the couplings is much larger than this and reflects intrinsic properties of the network. Finally, we study the quality of these models by comparing their entropies with that of the data. We find that they perform well for small subsets of the neurons in the network, but the fit quality starts to deteriorate as the subset size grows, signaling the need to include higher-order correlations to describe the statistics of large networks.
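The TAP inversion mentioned in the abstract admits a closed-form solution per coupling. The sketch below is my reading of the standard TAP linear-response inversion, not code from the paper; the guard against negative discriminants and the fallback to the naive mean-field limit are my own additions:

```python
import numpy as np

def tap_couplings(spins):
    """Inverse-Ising couplings from the TAP equations (sketch). For i != j,
    TAP linear response gives (C^-1)_ij = -J_ij - 2 J_ij^2 m_i m_j; the root
    that reduces to naive mean field as m -> 0 is
        J_ij = (-1 + sqrt(1 - 8 m_i m_j (C^-1)_ij)) / (4 m_i m_j)."""
    m = spins.mean(axis=0)
    Cinv = np.linalg.inv(np.cov(spins, rowvar=False))
    mm = np.outer(m, m)
    disc = np.maximum(1.0 - 8.0 * mm * Cinv, 0.0)   # guard: keep root real
    with np.errstate(divide="ignore", invalid="ignore"):
        J = (-1.0 + np.sqrt(disc)) / (4.0 * mm)
    J = np.where(np.abs(mm) < 1e-8, -Cinv, J)       # nMF limit when m_i m_j ~ 0
    np.fill_diagonal(J, 0.0)
    return J

# Independent random spins: estimated couplings should hover near zero
rng = np.random.default_rng(0)
spins = np.where(rng.random((2000, 6)) < 0.5, -1.0, 1.0)
J = tap_couplings(spins)
print(J.shape)  # (6, 6)
```

For weakly magnetized data this reduces to the naive mean-field estimate, with the quadratic term supplying the Onsager-style correction at larger magnetizations.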

  • 13.
    Tyrcha, Joanna
    Stockholm University, Faculty of Science, Department of Mathematics.
Age-dependent cell cycle models, 2001. In: Journal of Theoretical Biology, no 213, p. 89-101. Article in journal (Refereed)
  • 14.
    Tyrcha, Joanna
    Stockholm University, Faculty of Science, Department of Mathematics.
Asymptotic Stability of the Mass Distribution in the Case of the Linear and Exponential Growth in Probabilistic Models of the Cell Cycle, 2000. Report (Other academic)
  • 15.
    Tyrcha, Joanna
    Stockholm University, Faculty of Science, Department of Mathematics.
Cell cycle progression, 2004. In: Comptes Rendus Biologies, no 327, p. 193-200. Article in journal (Refereed)
  • 16.
    Tyrcha, Joanna
    Stockholm University, Faculty of Science, Department of Mathematics.
Dynamics of integrate and fire models, 2007. In: Mathematical Modeling of Biological Systems, Vol. II, p. 235-246. Article in journal (Refereed)
  • 17.
    Tyrcha, Joanna
    Stockholm University, Faculty of Science, Department of Mathematics.
Spike statistics for a high-conductance cortical network model, 2007. In: 5th Nordic Neuroinformatics Workshop, Life Science Center, Espoo, Finland, 2007. Conference paper (Other academic)
  • 18.
    Tyrcha, Joanna
    et al.
    Stockholm University, Faculty of Science, Department of Mathematics.
    Hertz, John
    Stockholm University, Nordic Institute for Theoretical Physics (Nordita). Niels Bohr Institute, Copenhagen, Denmark.
Network inference with hidden units, 2014. In: Mathematical Biosciences and Engineering, ISSN 1547-1063, E-ISSN 1551-0018, Vol. 11, no 1, p. 149-165. Article in journal (Refereed)
    Abstract [en]

We derive learning rules for finding the connections between units in stochastic dynamical networks from the recorded history of a visible subset of the units. We consider two models. In both of them, the visible units are binary and stochastic. In one model the hidden units are continuous-valued, with sigmoidal activation functions, and in the other they are binary and stochastic like the visible ones. We derive exact learning rules for both cases. For the stochastic case, performing the exact calculation requires, in general, repeated summations over a number of configurations that grows exponentially with the size of the system and the data length, which is not feasible for large systems. We derive a mean field theory, based on a factorized ansatz for the distribution of hidden-unit states, which offers an attractive alternative for large systems. We present the results of some numerical calculations that illustrate key features of the two models and, for the stochastic case, the exact and approximate calculations.

  • 19.
    Tyrcha, Joanna
    et al.
Stockholm University, Faculty of Science, Department of Mathematics. Mathematical Statistics.
    Hertz, John
Spike pattern distributions in model cortical networks, 2008. In: COSYNE - Computational and Systems Neuroscience 2008, Salt Lake City, 2008. Conference paper (Other (popular science, discussion, etc.))
    Abstract [en]

We can learn something about coding in large populations of neurons from models of the spike pattern distributions constructed from data. In our work, we do this for data generated from computational models of local cortical networks. This permits us to explore how features of the neuronal and synaptic properties of the network are related to those of the spike pattern distribution model. We employ the approach of Schneidman et al [1] and model this distribution by a Sherrington-Kirkpatrick (SK) model: P[S] = Z^-1 exp(½ Σ_ij J_ij S_i S_j + Σ_i h_i S_i). In the work reported here, we analyze spike records from a simple model of a cortical column in a high-conductance state for two different cases: one with stationary tonic firing and the other with a rapidly time-varying input that produces rapid variations in firing rates. The average cross-correlation coefficient in the former is an order of magnitude smaller than that in the latter.

To estimate the parameters J_ij and h_i we use a technique [2] based on inversion of the Thouless-Anderson-Palmer equations from spin glass theory. We have performed these fits for groups of neurons of sizes from 12 to 200 for tonic firing and from 6 to 800 for the case of the rapidly time-varying “stimulus”. The first two figures show that the distributions of J_ij's in the two cases are quite similar, both growing slightly narrower with increasing N. They are also qualitatively similar to those found by Schneidman et al and by Tkačik et al [3] for data from retinal networks. As in their work, it does not appear to be necessary to include higher order couplings. The means, which are much smaller than the standard deviations, also decrease with N, and the one for tonic firing is less than half that for the stimulus-driven network.

However, the models obtained never appear to be in a spin glass phase for any of the sizes studied, in contrast to the finding of Tkačik et al, who reported spin glass behaviour at N = 120. This is shown in the third figure panel. The x axis is 1/J, where J = N^(1/2) std(J_ij), and the y axis is H/J, where H is the total “field” N^(-1) Σ_i (h_i + Σ_j J_ij ⟨S_j⟩). The green curve marks the Almeida-Thouless line separating the normal and spin glass phases in this parameter plane. All our data, for N ≤ 800 (the number of excitatory neurons in the originally-simulated network), lie in the normal region, and extrapolation from our results predicts spin glass behaviour only for N > 5000.

    [1] E. Schneidman et al., Nature 440 1007-1012 (2006)

    [2] T. Tanaka, Phys Rev E 58 2302-2310 (1998); H. J. Kappen and F. B Rodriguez, Neural Comp 10 1137-1156 (1998)

    [3] G. Tkačik et al., arXiv:q-bio.NC/0611072 v1 (2006)

  • 20.
    Tyrcha, Joanna
    et al.
    Stockholm University, Faculty of Science, Department of Mathematics. Matematisk statistik.
    Hertz, John
Testing Algorithms for Extracting Functional Connectivity from Spike Data, 2008. In: 1st INCF Congress of Neuroinformatics: Databasing and Modeling the Brain, 2008. Conference paper (Other (popular science, discussion, etc.))
    Abstract [en]

We can learn something about how large neuronal networks function from models of their spike pattern distributions constructed from data. We do this using the approach introduced by Schneidman et al [1], modeling this distribution by an Ising model: P[S] = Z^-1 exp(Σ_ij J_ij S_i S_j + Σ_i h_i S_i). In the work reported here, we explore the accuracy of two algorithms for extracting the model parameters J_ij and h_i by testing them on data generated by networks in which these parameters are known.

Both algorithms use, as input, the firing rates and mutual correlations of the neurons in the network. The first algorithm is straightforward Boltzmann learning. It will yield the parameters correctly if the input statistics are known exactly, but it may be very slow to converge. The second, very fast, algorithm [2] is based on inversion of the Thouless-Anderson-Palmer equations from spin glass theory. It is derived from a small-J_ij expansion, but it is in principle correct for all J_ij when the network is infinitely large and densely connected.

    In practice, however, the rates and correlations used as inputs to the algorithms are estimates based on a finite number of measurements. Therefore, there will be errors in the extracted model parameters. Errors will also occur if the data are incomplete, i.e., if the rates and correlations are not measured for all neurons or all pairs. This case is highly relevant to the experimental situation, since in practice it is only possible to record from a small fraction of the neurons in a network.

    Two particular kinds of error statistics are of special interest: variances of the differences between true and extracted parameters, and variances of the differences between parameters extracted for two independent sets of training data. We study the relation between the two, since the first is what we are interested in but only the second can be computed in the realistic situation, where we do not know the parameters a priori. We also examine the variance of the difference between the true and extracted correlations.

    Finally, we apply the algorithms to the data of Schneidman et al from salamander retinal ganglion neurons.

    References


    1. E Schneidman et al, Nature 440 1007-1012 (2006); G Tkacik et al, arXiv:q-bio.NC/0611072 (2006)

    2. T Tanaka, Phys Rev E 58 2302-2310 (1998); H J Kappen and F B Rodriguez, Neural Comp 10 1137-1156 (1998)
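Boltzmann learning as described above adjusts the fields and couplings by gradient ascent until the model's means and pair correlations match those of the data. The sketch below illustrates the idea at toy scale by enumerating all 2^N states exactly, in place of the sampling needed for the larger networks treated in the abstract:

```python
import numpy as np
from itertools import product

def boltzmann_learn(m_data, C_data, n_steps=2000, eta=0.1):
    """Boltzmann learning for a small Ising model
    P[S] ∝ exp(½ Σ_ij J_ij S_i S_j + Σ_i h_i S_i), J symmetric, zero diagonal.
    Gradient ascent on the log-likelihood:
      dh_i  ∝ <S_i>_data - <S_i>_model,  dJ_ij ∝ <S_i S_j>_data - <S_i S_j>_model.
    Exact enumeration of 2^N states keeps the toy example deterministic."""
    N = len(m_data)
    states = np.array(list(product([-1.0, 1.0], repeat=N)))
    h = np.zeros(N)
    J = np.zeros((N, N))
    for _ in range(n_steps):
        E = states @ h + 0.5 * np.einsum("si,ij,sj->s", states, J, states)
        p = np.exp(E - E.max())
        p /= p.sum()
        m_model = p @ states
        C_model = states.T @ (p[:, None] * states)
        h += eta * (m_data - m_model)
        dJ = eta * (C_data - C_model)
        np.fill_diagonal(dJ, 0.0)
        J += dJ
    return J, h

# Recover a known 3-spin model from its exact moments:
rng = np.random.default_rng(0)
N = 3
J_true = rng.normal(0.0, 0.4, (N, N)); J_true = (J_true + J_true.T) / 2
np.fill_diagonal(J_true, 0.0)
h_true = rng.normal(0.0, 0.2, N)
states = np.array(list(product([-1.0, 1.0], repeat=N)))
E = states @ h_true + 0.5 * np.einsum("si,ij,sj->s", states, J_true, states)
p = np.exp(E); p /= p.sum()
m_exact = p @ states
C_exact = states.T @ (p[:, None] * states)
J_est, h_est = boltzmann_learn(m_exact, C_exact)
print(np.abs(J_est - J_true).max())  # small: moments identify the model
```

Because the Ising model is an exponential family, matching the exact moments recovers the generating parameters; with estimated moments from finite data, the error statistics discussed in the abstract take over.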

  • 21.
    Tyrcha, Joanna
    et al.
    Stockholm University, Faculty of Science. Stockholm University, Faculty of Science, Department of Mathematics.
    Hertz, John
    Nordita.
    Roudi, Yasser
    Kavli Institute for Systems Neuroscience.
Inferring network connectivity using kinetic Ising models, 2010. In: BMC Neuroscience (Online), ISSN 1471-2202, E-ISSN 1471-2202, Vol. 11, no 51. Article in journal (Refereed)
  • 22.
    Tyrcha, Joanna
    et al.
    Stockholm University, Faculty of Science, Department of Mathematics.
    Levy, William
Another contribution by synaptic failures to energy efficient processing by neurons, 2004. In: Neurocomputing, Vol. 58-60, p. 59-66. Article in journal (Refereed)
  • 23.
    Tyrcha, Joanna
    et al.
    Stockholm University, Faculty of Science, Department of Mathematics.
    Levy, William
Synaptic failures and a Gaussian excitation distribution, 2005. In: Neurocomputing, Vol. 65-66, p. 891-899. Article in journal (Refereed)
  • 24.
    Tyrcha, Joanna
    et al.
    Stockholm University, Faculty of Science, Department of Mathematics.
    Roudi, Yasser
    Stockholm University, Nordic Institute for Theoretical Physics (Nordita). Kavli Institute for Systems Neuroscience, NTNU, Norway.
    Marsili, Matteo
    Hertz, John
    Stockholm University, Nordic Institute for Theoretical Physics (Nordita). Niels Bohr Institute, Denmark.
The effect of nonstationarity on models inferred from neural data, 2013. In: Journal of Statistical Mechanics: Theory and Experiment, ISSN 1742-5468, E-ISSN 1742-5468, article id P03005. Article in journal (Refereed)
    Abstract [en]

    Neurons subject to a common nonstationary input may exhibit a correlated firing behavior. Correlations in the statistics of neural spike trains also arise as the effect of interaction between neurons. Here we show that these two situations can be distinguished with machine learning techniques, provided that the data are rich enough. In order to do this, we study the problem of inferring a kinetic Ising model, stationary or nonstationary, from the available data. We apply the inference procedure to two data sets: one from salamander retinal ganglion cells and the other from a realistic computational cortical network model. We show that many aspects of the concerted activity of the salamander retinal neurons can be traced simply to the external input. A model of non-interacting neurons subject to a nonstationary external field outperforms a model with stationary input with couplings between neurons, even accounting for the differences in the number of model parameters. When couplings are added to the nonstationary model, for the retinal data, little is gained: the inferred couplings are generally not significant. Likewise, the distribution of the sizes of sets of neurons that spike simultaneously and the frequency of spike patterns as a function of their rank (Zipf plots) are well explained by an independent-neuron model with time-dependent external input, and adding connections to such a model does not offer significant improvement. For the cortical model data, robust couplings, well correlated with the real connections, can be inferred using the nonstationary model. Adding connections to this model slightly improves the agreement with the data for the probability of synchronous spikes but hardly affects the Zipf plot.

  • 25.
    Tyrcha, Joanna
    et al.
    Stockholm University, Faculty of Science, Department of Mathematics.
    Sundberg, Rolf
    Stockholm University, Faculty of Science, Department of Mathematics.
Statistical modelling and saddle point approximation of tail probabilities for accumulated splice loss in fibre optic networks, 2000. In: Journal of Applied Statistics, ISSN 0266-4763, Vol. 27, no 2, p. 245-256. Article in journal (Refereed)
  • 26.
    Tyrcha, Joanna
    et al.
    Stockholm University, Faculty of Science, Department of Mathematics.
    Sundberg, Rolf
    Stockholm University, Faculty of Science, Department of Mathematics.
    Lindskog, Peter
    Sundström, Bernt
Statistical modelling and saddle point approximation of tail probabilities for accumulated splice loss in fibre optic networks, 1998. Report (Other academic)
  • 27. Wu, Xiangbao
    et al.
    Tyrcha, Joanna
    Stockholm University, Faculty of Science, Department of Mathematics.
    Levy, William
A neural network solution to the transverse patterning problem depends on repetition of the input code, 1998. In: Biol. Cybern., no 79, p. 203-213. Article in journal (Refereed)
  • 28. Wu, Xiangbao
    et al.
    Tyrcha, Joanna
    Stockholm University, Faculty of Science, Department of Mathematics.
    Levy, William
A special role for input codes in solving the transverse patterning problem, 1997. In: Computational Neuroscience: Trends in Research, p. 885-889. Article in journal (Refereed)