GPT-4 is a transformer-based model pre-trained to predict the next token in a document.
Source: GPT-4 Technical Report
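The sketch below illustrates the next-token (causal language modeling) pre-training objective mentioned above, using a generic PyTorch transformer. The layer sizes, toy data, and masking are illustrative assumptions only; this is not GPT-4's actual architecture or training code.

```python
# Minimal sketch of next-token prediction pre-training (assumption:
# a small generic transformer stands in for the real model).
import torch
import torch.nn as nn

vocab_size, d_model = 1000, 64                    # toy sizes, not GPT-4's
embed = nn.Embedding(vocab_size, d_model)
encoder = nn.TransformerEncoder(                  # self-attention stack
    nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True),
    num_layers=2,
)
lm_head = nn.Linear(d_model, vocab_size)

tokens = torch.randint(0, vocab_size, (8, 32))    # (batch, seq) toy batch
inputs, targets = tokens[:, :-1], tokens[:, 1:]   # target is the next token

# Causal mask so each position only attends to earlier tokens.
causal_mask = nn.Transformer.generate_square_subsequent_mask(inputs.size(1))

hidden = encoder(embed(inputs), mask=causal_mask)
logits = lm_head(hidden)                          # (batch, seq, vocab)
loss = nn.functional.cross_entropy(
    logits.reshape(-1, vocab_size), targets.reshape(-1)
)
loss.backward()                                   # one pre-training step's gradients
```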
Tasks most frequently addressed by papers that use GPT-4, with each task's share of those papers:

Task | Papers | Share |
---|---|---|
Language Modelling | 78 | 11.76% |
Large Language Model | 47 | 7.09% |
Question Answering | 38 | 5.73% |
Retrieval | 26 | 3.92% |
In-Context Learning | 24 | 3.62% |
Code Generation | 17 | 2.56% |
Benchmarking | 17 | 2.56% |
Decision Making | 15 | 2.26% |
Prompt Engineering | 15 | 2.26% |