Sometimes Size Does Not Matter
Stockholm University, Faculty of Science, Department of Mathematics. ORCID iD: 0000-0003-2767-8818
Number of Authors: 3
2023 (English) In: Foundations of Physics, ISSN 0015-9018, E-ISSN 1572-9516, Vol. 53, no. 1, article id 1
Article in journal (Refereed) Published
Abstract [en]

Recently, Díaz, Hössjer and Marks (DHM) presented a Bayesian framework to measure cosmological tuning (either fine or coarse) that uses maximum entropy (maxent) distributions on unbounded sample spaces as priors for the parameters of the physical models (https://doi.org/10.1088/1475-7516/2021/07/020). The DHM framework stands in contrast to previous attempts to measure tuning that rely on a uniform prior assumption. However, since the parameters of the models often take values in spaces of infinite size, the uniformity assumption is unwarranted; this is known as the normalization problem. In this paper we explain why and how the DHM framework not only evades the normalization problem but also circumvents other objections to the tuning measurement, such as the so-called weak anthropic principle, the selection of a single maxent distribution and, importantly, the lack of invariance of maxent distributions with respect to data transformations. We also propose to treat fine-tuning as an emergence problem to avoid infinite loops in the prior distribution of hyperparameters (common to all Bayesian analyses), and explain that previous attempts to measure tuning using uniform priors are particular cases of the DHM framework. Finally, we prove a theorem explaining when tuning is fine or coarse for different families of distributions. The theorem is summarized in a table for ease of reference, and the tuning of three physical parameters is analyzed using its conclusions.
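The core quantity in the kind of framework the abstract describes is a tuning probability: the prior probability that a physical parameter falls inside a life-permitting interval, computed under a maxent prior rather than a (non-normalizable) uniform one. As a minimal illustrative sketch only, not the DHM method itself: on the unbounded space [0, ∞) with a fixed mean, the maxent distribution is the exponential, so the tuning probability of a hypothetical interval [a, b] has a closed form. The interval endpoints and mean below are made-up illustrative values.

```python
import math

def tuning_probability(a, b, mean):
    """P(a <= X <= b) for X ~ Exponential(1/mean), the maximum entropy
    distribution on [0, inf) subject to a fixed mean.

    This integrates the exponential density between a and b:
    exp(-a/mean) - exp(-b/mean).
    """
    rate = 1.0 / mean
    return math.exp(-rate * a) - math.exp(-rate * b)

# Hypothetical life-permitting interval [1.0, 1.1] for a parameter whose
# maxent prior has mean 5.0 (all values illustrative, not from the paper).
p = tuning_probability(1.0, 1.1, 5.0)
```

A small tuning probability under the maxent prior would indicate fine-tuning, a larger one coarse tuning; the paper's theorem classifies when each case occurs for different distribution families.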

Place, publisher, year, edition, pages
2023. Vol. 53, no 1, article id 1
Keywords [en]
Bayesian statistics, Constants of nature, Emergence, Fine-tuning, Fundamental constants, Infinites, Maximum entropy, Standard models, Weak anthropic principle
National Category
Mathematics
Identifiers
URN: urn:nbn:se:su:diva-213533
DOI: 10.1007/s10701-022-00650-1
ISI: 000887803700001
Scopus ID: 2-s2.0-85142246341
OAI: oai:DiVA.org:su-213533
DiVA, id: diva2:1724703
Available from: 2023-01-09 Created: 2023-01-09 Last updated: 2023-01-09 Bibliographically approved

Open Access in DiVA

No full text in DiVA

Other links

Publisher's full text
Scopus

Authority records

Hössjer, Ola
