January 11th, 2022
This is a revised version of the piece first published on Future, a media platform by Andreessen Horowitz.
Despite zero prior experience in infectious diseases, I created covid19-projections.com in April 2020 in the span of one week. Because it was one of the only accurate models at the time, I became known as an "expert" in COVID-19. Never could I have imagined that so many people and organizations would end up citing a COVID-19 model built by an untrained data scientist. It led me to think about what it truly means to be an expert.
The vast availability of information on the internet has enabled people to gain in-depth knowledge of a topic in a short amount of time, in a way that just wasn't possible, say, 20 years ago. In a span of a few days in April 2020, I was able to understand the basics of infectious diseases from the comfort of my room. In the months since, I was able to keep up with the fast pace of new research and release real-time analysis, which inevitably gave me an advantage in producing the latest insights.
Advancements in technology and the breakdown of communication barriers have given individuals like myself access to vast amounts of information and knowledge in a short amount of time. As technology continues to improve, the ability to quickly adapt to new information is crucial to successfully navigating the post-modern world.
Individuals today have a much greater breadth of knowledge than before, thanks in part to technology. For example, NBA players like LeBron James are no longer expected to just shut up and dribble. This breadth extends to forecasting: while there are always exceptions, the idea of leveraging the wisdom of the crowd to make accurate forecasts has long been an intriguing topic for me (see Superforecasting by Tetlock & Gardner).
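The statistical intuition behind the wisdom of the crowd is simple: when individual forecasts err independently, their errors tend to cancel out, so the average of the crowd is usually closer to the truth than a typical individual. The following is a minimal sketch of that effect with simulated forecasters (the quantity, noise level, and forecaster count are all assumptions for illustration, not anything from the actual COVID-19 model):

```python
import random

random.seed(42)

TRUTH = 100.0        # hypothetical quantity being forecast (illustrative)
N_FORECASTERS = 500  # assumed number of independent forecasters

# Each forecaster's estimate is the truth plus independent noise.
forecasts = [TRUTH + random.gauss(0, 20) for _ in range(N_FORECASTERS)]

# The crowd's estimate is the simple average of all forecasts.
crowd_estimate = sum(forecasts) / len(forecasts)

# Compare the crowd's error to the average individual's error.
crowd_error = abs(crowd_estimate - TRUTH)
avg_individual_error = sum(abs(f - TRUTH) for f in forecasts) / len(forecasts)

print(f"crowd error:          {crowd_error:.2f}")
print(f"avg individual error: {avg_individual_error:.2f}")
```

Run repeatedly with different seeds, the crowd's error is almost always far smaller than the average individual's, which is exactly why aggregating many independent views can outperform any single forecaster.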
I wanted to put this notion of "crowdsourced ideas" to the test. Reading bland research papers online could only get me so far; I wanted to know what people really thought. Using Twitter, I was able to receive real-time responses from experts and non-experts alike. For a time-sensitive phenomenon like a pandemic, this quick iteration pace turned out to be critical in helping me hone my model and adjust to fast-changing dynamics. Throughout the whole process, I heavily relied on another crowdsourced project.
For me, starting with a "blank slate" approach was extremely valuable when it came to modeling COVID-19. Because this virus is quite literally a novel coronavirus, many prior conceptions of how a virus operates may not apply in this case. Since I had no preconceptions about what a virus can or cannot do, I based my conclusions only on what the data was telling me. This turned out to be the right approach. I've seen many instances where traditional experts were unwilling to change their beliefs because the results disagreed with their priors, sometimes almost doggedly so.
During this time, my own perception of expertise has also changed. In any established field, I believe there can be a degree of "groupthink" in many expert communities. This can lead to a stagnation in innovation, as shown in this model by Bhattacharya and Packalen. In essence, the authors show that academic citations encourage too much work in crowded areas rather than the development of new ideas. This has the unintended consequence of what I call "delayed expertise" for novel phenomena, since a critical mass must first be reached for an idea to be fully developed.
To conclude, the breakdown of formerly closed information silos and the advancement of open data sources are lowering the barrier to entry for individual contributors, many of whom may not have prior experience. As long as these individuals possess the ability and resolve to continuously search for scientific truth and create testable hypotheses (such as by making future forecasts), they will always give traditional experts a run for their money.