The world of enlightenment is under attack. Fake news receive attention while well-founded scientific findings are discredited or ignored. It is easy to blame populists for this challenge to open societies. This, however, ignores that the scientific community allowed its own high standards to be tampered with. The recent reproducibility crisis helped (at least partially) to create a climate in which alternative facts can be communicated and acted upon. Empirical research relies on statistical epistemology.
Statistics is the magnifying glass that allows us to disentangle relevant effects from noise, and thus facts from fiction, under appropriate error control. Almost from the beginning of academic statistics, the field has been subject to misconception, if not outright fraud ("How to Lie with Statistics" was published in 1954). In addition to poor design and conduct of experiments, misuse and misinterpretation of statistical models are problematic (Randall and Welser, 2018).
On top of that, novel methods are mushrooming, and with more than 13'000 add-on packages available from the CRAN software repository, the statistically illiterate data analyst has many options to get it wrong but often only one to get it right. The call is out to statisticians to provide better ways to get it right. The common cookbook-style attitude in statistical teaching and research hides the core ideas and concepts underlying statistical reasoning, fails to highlight the connections between different methods, and has the potential to create misconceptions.
The cookbook only contains a limited number of recipes. To analyse experiments appropriately, it must be replaced by a box of Lego bricks from which the data analyst can build the method matching the experiment. This research project aims to provide data analysts and statisticians alike with a novel Lego system implementing a comprehensive way of understanding, designing, analysing, and communicating empirical investigations. We suggest leveraging the unifying spirit of transformation models to address this challenge.
These models help to understand the connections between many classics, such as the normal linear, the proportional hazards, or the proportional odds model. Transformation models consist of simple Lego bricks which can be reassembled in many different, problem-specific ways. There is no limit on the models we can build: highly complex, non-linear interaction models for probabilistic forecasting as well as very simple models giving rise to classical rank tests can be understood within the same framework.
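To make this unifying view concrete, here is a brief sketch in the generic notation of conditional transformation models (the symbols below are illustrative, not the project's definitive formulation). The conditional distribution function of a response Y given covariates x is written as a fixed distribution function F_Z applied to a transformed response:

\[
  \mathbb{P}(Y \le y \mid X = x) = F_Z\bigl(h(y) - x^\top \beta\bigr),
\]

where the monotone transformation function h and the choice of F_Z are the interchangeable bricks. Choosing F_Z = \Phi (the standard normal) together with a linear h recovers the normal linear model, the standard logistic F_Z(z) = (1 + \exp(-z))^{-1} yields the proportional odds model, and the minimum extreme value distribution F_Z(z) = 1 - \exp(-\exp(z)) yields the proportional hazards model.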
Based on recently published theoretical and computational core concepts in maximum-likelihood inference for conditional transformation models, this project develops a range of novel methods for specific types of experiments: transformation forests and boosting for probabilistic forecasts, and novel permutation score tests with corresponding confidence intervals for relevant parameters in complex designs. Open-source software will provide tools for building tailored analyses whose results will hopefully provide better answers to research questions.
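As a hedged sketch of the likelihood machinery underlying such methods (again in the generic notation of the model displayed above): the transformation function itself is treated as the parameter, and an exactly observed datum (y, x) contributes

\[
  \ell(h, \beta; y, x) = \log f_Z\bigl(h(y) - x^\top \beta\bigr) + \log h'(y)
\]

to the log-likelihood, while an interval-censored observation (\underline{y}, \bar{y}] contributes \log\bigl(F_Z(h(\bar{y}) - x^\top \beta) - F_Z(h(\underline{y}) - x^\top \beta)\bigr). Score contributions with respect to \beta derived from such terms are a natural starting point for the permutation score tests and confidence intervals mentioned above.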