Microsoft Research | BioGPT: Text Generation
BioGPT is a language model trained for biomedical tasks. It was trained on a PubMed dataset of 15 million abstracts, and it outperforms larger, more general language models on biomedical language benchmarks.
Example use case: generating text within the biomedical domain.
Technology: GPT-2 backbone
Limitations: Sometimes produces false information
Citation:
Renqian Luo, Liai Sun, Yingce Xia, Tao Qin, Sheng Zhang, Hoifung Poon, Tie-Yan Liu, BioGPT: generative pre-trained transformer for biomedical text generation and mining, Briefings in Bioinformatics, Volume 23, Issue 6, November 2022, bbac409, https://doi.org/10.1093/bib/bbac409
Released:
Feb-23-2023