In recent years, the application and impact of artificial intelligence (AI) in scientific research have become increasingly significant. From biomedicine and materials science to astronomy, AI4Science is changing how research is conducted and accelerating its pace. By enabling large-scale data analysis, simulated experimental environments, and optimized research workflows, AI gives scientists unprecedented support and has pushed scientific research to a new stage.

However, while AI opens new possibilities for research, it also raises problems. A research team from Yale University and Princeton University argues that AI may lead to the homogenization of knowledge production, namely scientific monocultures: cultures in which certain methods, questions, and perspectives dominate all others, making science less innovative and more prone to error. At the same time, the solutions AI provides may mask our cognitive limitations, leaving us susceptible to the illusion that we understand the world better than we actually do. Scientists therefore need to consider carefully the potential impact of AI on research so that knowledge is produced responsibly.

The paper, titled "Artificial intelligence and illusions of understanding in scientific research", was published in the journal Nature. From a large body of AI4Science literature, it distills four visions of AI in research (Oracles, Surrogates, Quants, and Arbiters) and discusses the applications and impact of each.

Table | Visions of AI in scientific research. The four visions were derived by analyzing recent publications on AI's potential to improve knowledge production across scientific disciplines.
The included papers either used the general term "AI" or named specific methods under the AI umbrella, most commonly machine learning, natural language processing, and large language models. The table summarizes how each vision responds to different research stages and to perceived limits on scientific capability and efficiency.

Why do human scientists trust AI?

In recent years, AI has been used increasingly in scientific research, and the solutions it provides are believed capable of revolutionizing research methods. But how does AI earn scientists' trust?

First, AI tools are viewed not merely as tools but as research collaborators, capable of providing solutions and insights beyond human limitations. This perception strengthens scientists' trust, because they regard these tools as partners in knowledge production. Second, AI tools are perceived to have qualities that enhance their trustworthiness, such as objectivity, deep understanding, and the ability to provide satisfying explanations. This portrayal can make AI tools seem more trustworthy than humans in certain situations. Third, AI tools supply solutions that match people's preference for simple, broad, and quantified explanations, thereby heightening the feeling of understanding.

However, integrating AI into scientific research also carries cognitive risks, particularly in the form of illusions of understanding.

Figure | Illusions of understanding in AI-driven scientific research.

As shown in panel a above, scientists who use AI tools for research may develop an illusion of explanatory depth. In the example, a scientist uses an AI Quant to model a phenomenon (X) and comes to believe that their understanding of X is deeper than it actually is.
In panel b, within a single cognitive culture, scientists can easily develop an illusion of exploratory breadth, mistakenly believing that they are exploring the space of all testable hypotheses when they are in fact exploring only the narrower space of hypotheses that can be tested with AI tools. In addition, as shown in panel c, within a monoculture of knowing, scientists can develop an illusion of objectivity, mistakenly believing that AI tools have no standpoint, or can represent all possible standpoints (for example, when used as substitutes for human research participants), when AI tools in fact embed the perspectives of their training data and their developers.

Illusions of understanding arise when individuals overestimate their own level of understanding, mistaking the knowledge and abilities of AI tools, or of other community members, for their own. This phenomenon can lead to errors and overoptimism in scientific claims, especially when AI tools are used outside one's own field of expertise.

Is the future of AI4Science looking good?

Because AI tools seem credible and promise to improve both the quality and the quantity of research, research that relies on these tools is likely to grow significantly. Citations of AI are already rising in publications and patents, and papers that use AI tools show an increasing citation trend both within and outside their disciplines. If this trend continues, what cognitive risks will arise as AI-assisted research gradually comes to dominate knowledge production?

The article offers an answer: science may form a monoculture. The authors illustrate the point with an analogy: in agriculture, monoculture is the practice of growing a single crop on the same piece of land. The practice increases efficiency, but it also leaves the crop more vulnerable to pests and disease.
The article argues that the efficiencies offered by AI tools can foster a scientific monoculture, in which certain forms of knowledge production dominate all others. This can happen in two distinct but complementary ways: first, by prioritizing the questions and methods best suited to AI assistance (a monoculture of knowing); second, by prioritizing the kinds of standpoints AI can express (a monoculture of knowers). Just as crop monocultures are more vulnerable to pests and disease, scientific monocultures leave our understanding of the world more susceptible to error, bias, and missed opportunities for innovation.

When new tools and technologies emerge, scientists tend to adopt them quickly. This risk may become especially acute as AI tools, which are predicted to spread across many fields, become super-collaborators embedded in knowledge communities. Beyond threatening the flourishing of science, a monoculture of knowledge production can create illusions of understanding, in which scientists mistakenly believe that AI tools are advancing human understanding rather than recognizing that these tools in fact narrow the scope of scientific knowledge production.

Understanding the risks is a key step toward addressing them. To mitigate the risks of applying AI in knowledge production, humans need a better grasp of the cognitive risks of scientific monocultures and greater awareness of cognitive illusions. The application of AI in science is genuinely exciting, from AI chemists accelerating catalysis research to generative AI assisting carbon capture. Before AI4Science can go further, however, scientists must weigh AI's technical limitations and potential, as well as its impact on the social practices of scientific knowledge production.
AI-assisted scientific research may portend a future in which both the participants in research and the topics pursued lack diversity. It may also induce illusions of understanding that prevent us from realizing how much our perspective has narrowed. While the widespread use of AI in science is inevitable, scientists bear a responsibility not only to produce knowledge but also to shape how this development unfolds.

Paper link: https://www.nature.com/articles/s41586-024-07146-0