GENERATIVE PRE-TRAINED TRANSFORMER 3

Authors

  • Oleksandr GOLUBENKO
  • Oleksandr PIDMOGYLNYI

DOI:

https://doi.org/10.53920/ITS-2022-2-2

Keywords:

artificial intelligence (AI), machine learning, natural language processing (NLP), generative pre-trained transformer (GPT), text generation, deep learning, neural network

Abstract

GPT (Generative Pre-trained Transformer) is a type of artificial intelligence (AI) that uses machine learning algorithms to generate text in natural language. The first version of GPT, released in 2018, was a revolutionary breakthrough in AI and natural language processing (NLP). However, it also had limitations and issues that were addressed in subsequent versions of the model.

One of the main problems with the first version of GPT was the lack of control over the content it generated. The model was trained on a large dataset of human-generated text and was able to generate coherent and seemingly human-like text on a wide range of topics. However, it often produced text that was biased, offensive, or otherwise inappropriate because it could not fully understand the context or meaning of the words used.

Another problem with the first version of GPT was its inability to handle more complex NLP tasks such as translation or annotation. Although it could produce coherent text, it could not understand the meaning or structure of the text as a human could.

Later versions of GPT, such as GPT-2 and GPT-3, addressed these issues and added new capabilities, such as the ability to perform more complex NLP tasks and generate more coherent and context-appropriate text. However, they still have limitations and can produce biased or inconsistent results if not used responsibly.
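As a brief illustration of the autoregressive text generation these models perform, the following sketch continues a prompt with the publicly released GPT-2 checkpoint via the Hugging Face transformers library. The library and checkpoint are illustrative choices, not tools named in this paper; GPT-3 itself is available only through OpenAI's hosted API, so GPT-2 stands in here to show the same decoder-only generation principle.

```python
# Minimal text-generation sketch (assumes the Hugging Face `transformers`
# library and the public "gpt2" checkpoint; neither is specified by the paper).
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

# Encode a prompt and let the model continue it token by token.
prompt = "Natural language processing is"
input_ids = tokenizer.encode(prompt, return_tensors="pt")

output_ids = model.generate(
    input_ids,
    max_length=50,                        # total length of prompt + continuation
    do_sample=True,                       # sample instead of greedy decoding
    top_p=0.95,                           # nucleus sampling to trim unlikely tokens
    pad_token_id=tokenizer.eos_token_id,  # silence the missing-pad-token warning
)

print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

Because decoding is sampled rather than deterministic, repeated runs produce different continuations, which reflects the controllability concerns the abstract raises.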

Published

2022-12-30

How to Cite

GOLUBENKO О. І., & PIDMOGYLNYI О. О. (2022). GENERATIVE PRE-TRAINED TRANSFORMER 3. ITSynergy, (2), 19–27. https://doi.org/10.53920/ITS-2022-2-2

Issue

No. 2 (2022)

Section

Presentation