May 3, 2024

Warning! ChatGPT could help create the next pandemic virus, biosecurity experts warn

ChatGPT is a chatbot released by the artificial intelligence research laboratory OpenAI on November 30, 2022. It is a natural language processing tool driven by artificial intelligence.

ChatGPT can hold conversations by learning and understanding human language, and can respond in context, chatting and communicating much like a human.

Since its launch, ChatGPT's capabilities have attracted wide attention. On March 14, 2023, OpenAI released an upgraded model, GPT-4, with greatly improved performance.

 

Tech experts have long warned that artificial intelligence (AI) could turn against humanity, taking over everything from commerce to warfare. Now we may need to add a new concern: AI might help people with no scientific background, but with nefarious intentions, design a virus capable of sparking a pandemic.

 

Recently, Kevin Esvelt, a biosecurity expert at the Massachusetts Institute of Technology (MIT), ran a test in which he asked students to try to create a dangerous virus with the help of ChatGPT or other large language models.

After just an hour, the students had a list of candidate viruses, along with a list of companies that could synthesize the viruses' genetic sequences and assemble the viral fragments.

 

The results of the test, recently published on the preprint server arXiv, highlight that artificial intelligence systems may soon allow non-scientists to design biological weapons, the authors said. Some virologists, however, consider this concern overblown.

 


 

 

Biosafety experts are already concerned that the biology community’s culture of open exchange of information could be exploited by bioterrorists.

In theory, a paper describing an extinct or circulating deadly virus could provide a blueprint for a new biological weapon. But until now, acting on such a blueprint has required considerable expertise.

Would-be bioterrorists would need to identify a candidate virus as a starting point, synthesize its genetic material, and combine it with other reagents to "boot up" a virus capable of infecting cells and replicating.

 

However, these complex steps are becoming easier. Synthetic biology companies, for example, usually screen orders to ensure they do not contain genetic material from potential biological weapons, but new benchtop DNA printers can now bypass this screening.

Malicious actors could then send gene fragments produced on DNA printers to contract research organizations (CROs) to assemble the target virus.

Of course, actually triggering a pandemic would also require manufacturing the virus at scale and finding an effective delivery system.

 

The rapid advancement of artificial intelligence is lowering these barriers further. For his investigation, Kevin Esvelt divided a class of graduate students with no life science expertise into three groups of three to four people. Each group had access to GPT-4, Bard (an AI chatbot developed by Google), and other AI chatbots, and had one hour to ask the chatbots to help them design and acquire a pathogen capable of causing a pandemic.

 

Some chatbots refused to answer these dangerous queries, but the students found that this protection could easily be bypassed, for example by claiming to be developing a vaccine against the viruses in question.

Within an hour, the chatbots had suggested four viruses: the 1918 H1N1 influenza virus, an H5N1 avian influenza virus modified for greater transmissibility in 2012, the smallpox virus, and a strain of Nipah virus.

In some cases, the chatbots even pointed to genetic mutations reported in the literature that increase transmissibility.

 

The chatbots also described techniques for assembling a virus from its genetic sequence, as well as the necessary laboratory supplies and companies that could provide them.

Finally, the chatbots even listed companies that might be willing to synthesize genetic material without screening it, and contract labs that could help put the material together.

 

The research team is skeptical that these particular suggestions pose a pandemic threat. Modern humans, for example, retain some degree of immunity to previous pandemic influenza viruses, and the huge genome of the smallpox virus is difficult even for experts to assemble.

But they argue that, as more of the literature on biological threats is incorporated into AI training data, the experiment highlights how AI and other tools could make it easier for would-be terrorists to unleash new threats.

 

Kevin Esvelt thinks information constraints on chatbots and other AI engines could help.

For example, the few papers describing recipes for creating and enhancing pathogens could be excluded from AI training sets.

Removing these papers (less than 1% of the total on PubMed) was enough to remove almost all of the risk, he wrote in a preprint paper.

This comes at a price: AI would no longer be able to use these papers to advance biology in positive ways. But it would help prevent misuse of the technology.

 

Additional safeguards include requiring all gene synthesis companies, and future desktop DNA printers, to screen genetic material for known pathogens and toxins, and requiring CROs to verify the safety of any genetic material they are asked to assemble.

 

But Gustavo Palacios, a virologist at the Mount Sinai Health System, says there is no need to panic. Booting up a virus is more challenging than described, he argues, and the idea that CROs would create bioweapons is absurd.

 

References:
https://arxiv.org/abs/2306.03809
https://www.science.org/content/article/could-chatbots-help-design-next-pandemic-virus



Disclaimer of medicaltrend.org

Important Note: The information provided is for informational purposes only and should not be considered as medical advice.