A Vision of Metascience


Lots to praise and to ponder in this excellent piece by Michael Nielsen and Kanjun Qiu on improving the discovery ecosystem with metascience. The piece contains some provocative ideas to stimulate thinking, such as:

  • Fund-by-variance: Instead of funding grants that get the highest average score from reviewers, a funder should use the variance (or kurtosis, or some similar measure of disagreement) in reviewer scores as a primary signal: only fund things that are highly polarizing (some people love it, some people hate it). One thesis to support such a program is that you may prefer to fund projects with a modest chance of outlier success over projects with a high chance of modest success. An alternate thesis is that you should aspire to fund things only you would fund, and so should look for signals to that end: projects everyone agrees are good will certainly get funded elsewhere. And if you merely fund what everyone else is funding, then you have little marginal impact. (A toy sketch of this selection rule follows below.)
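
As a rough illustration of how such a rule differs from conventional scoring, here is a minimal Python sketch, assuming made-up proposals and reviewer scores (none of the names or numbers come from the essay):

```python
# Toy illustration (not from the essay): rank grant proposals by reviewer
# disagreement rather than by mean score. All names and scores are made up.
from statistics import mean, variance

# Hypothetical reviewer scores on a 1-10 scale.
proposals = {
    "A": [7, 7, 8, 7, 7],   # everyone agrees it is solid
    "B": [2, 9, 1, 10, 3],  # highly polarizing
    "C": [5, 6, 5, 4, 5],   # uniformly lukewarm
}

# Conventional rule: fund the proposal with the highest mean score.
by_mean = max(proposals, key=lambda p: mean(proposals[p]))

# Fund-by-variance rule: fund the proposal reviewers disagree about most.
by_variance = max(proposals, key=lambda p: variance(proposals[p]))

print("Highest mean:    ", by_mean)      # -> A
print("Highest variance:", by_variance)  # -> B
```

One could swap kurtosis or another disagreement measure in for variance; the point is only that the ranking key is disagreement rather than average enthusiasm.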

But it’s really not about any one idea; it’s about understanding why scientific tools are rarely applied to science itself and what we can do to improve metascience. Lots of bad news, but there are some positive examples. The replication revolution (no longer a crisis!) appears to be working:

There are encouraging signs that pre-registered study designs like this are helping address the methodological problems described above. Consider the following five graphs. The graphs show the results from five major studies, each of which attempted to replicate many experiments from the social sciences literature. Filled in circles indicate the replication found a statistically significant result, in the same direction as the original study. Open circles indicate this criterion wasn’t met. Circles above the line indicate the replication effect was larger than the original effect size, while circles below the line indicate the effect size was smaller. A high degree of replicability would mean many experiments with filled circles, clustered fairly close to the line. Here’s what these five replication studies actually found:

As you can see, the first four replication studies show many replications with questionable results – large changes in effect size, or a failure to meet statistical significance. This suggests a need for further investigation, and possibly that the initial result was faulty. The fifth study is different, with statistical significance replicating in all cases, and much smaller changes in effect sizes. This is a 2020 study by John Protzko et al. that aims to be a “best practices” study. By this, they mean the original studies were done using pre-registered study designs, as well as large samples and open sharing of code, data, and other methodological materials, making experiments and analysis easier to replicate…In short, the replications in the fifth graph are based on studies using much higher evidentiary standards than had previously been the norm in psychology. Of course, the results don’t show that the effects are real. But they’re extremely encouraging, and suggest the spread of ideas like Registered Reports contributes to substantial progress.
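
To make the criterion in the quoted passage concrete, here is a small sketch of how one replication outcome could be classified; the function name, threshold, and example numbers are hypothetical, not taken from the studies:

```python
# Illustrative only: classify a replication pair using the criteria described
# above (statistical significance in the same direction, plus relative effect
# size). The alpha threshold and example values are hypothetical.
def classify_replication(original_effect, replication_effect, replication_p,
                         alpha=0.05):
    """Return (filled_circle, above_line) for one original/replication pair."""
    same_direction = original_effect * replication_effect > 0
    # "Filled circle": significant result in the same direction as the original.
    filled_circle = replication_p < alpha and same_direction
    # "Above the line": replication effect larger in magnitude than the original.
    above_line = abs(replication_effect) > abs(original_effect)
    return filled_circle, above_line

# Hypothetical example: original d = 0.50, replication d = 0.20, p = 0.03.
print(classify_replication(0.50, 0.20, 0.03))  # (True, False): significant but smaller
```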

The story of Brian Nosek is very interesting:

Many people have played important roles in instigating the replication crisis. But perhaps no single person has done more than Brian Nosek. Nosek is a social psychologist who until 2013 was a professor at the University of Virginia. In 2013, Nosek took leave from his tenured position to co-found the Center for Open Science (COS) as an independent not-for-profit (jointly with Jeff Spies, then a graduate student in his lab). Nosek and the COS were key co-organizers of the 2015 replication paper by the Open Science Collaboration. Nosek and the COS have also been (along with Daniël Lakens, Chris Chambers, and many others) central in developing Registered Reports. In particular, they founded and operate the OSF website, which is the key infrastructure supporting Registered Reports.

…The origin story of the COS is interesting. In 2007 and 2008, Nosek submitted several grant proposals to the NSF and NIH, suggesting many of the ideas that would eventually mature into the COS. All these proposals were turned down. Between 2008 and 2012 he gave up applying for grants for metascience. Instead, he mostly self-funded his lab, using speaker’s fees from talks about his prior professional work. A graduate student of Nosek’s named Jeff Spies did some preliminary work developing the site that would become OSF. In 2012 this got some media attention, and as a result was noticed by several private foundations, including a foundation begun by a billionaire hedge fund operator, John Arnold, and his wife, Laura Arnold. The Arnold Foundation reached out and rapidly agreed to provide some funding, ultimately in the form of a $5.25 million grant.

To do this, Nosek had to leave the University of Virginia. Why? He was then attacked by many in the scientific community. Why? Read the whole thing for much more.


