Fun with OpenAI
evilrat
evilrat666 at gmail.com
Wed Dec 21 18:43:22 UTC 2022
On Wednesday, 21 December 2022 at 17:01:34 UTC, H. S. Teoh wrote:
>
> This sounds like a straight quote from Wikipedia... or a
> paraphrase.
>
What surprised you? It is trained on all kinds of available text,
including data scraped from the internet. All it does, basically,
is predict the next 'word' for the input.
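
If you want to see what that next-'word' prediction looks like in
code, here is a minimal sketch using the Hugging Face transformers
library, with GPT-2 as a small stand-in for GPT-3/GPT-J (assumes
pip install torch transformers; the prompt is just an example):

import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

prompt = "The D programming language is"
ids = tokenizer(prompt, return_tensors="pt").input_ids

with torch.no_grad():
    logits = model(ids).logits  # shape: (1, seq_len, vocab_size)

# The distribution over the next token is the last position's logits.
probs = torch.softmax(logits[0, -1], dim=-1)
top = torch.topk(probs, 5)
for p, tok in zip(top.values, top.indices):
    print(f"{tokenizer.decode(int(tok))!r}  p={p.item():.3f}")

Generation is just this step in a loop: pick a token, append it to
the input, and predict again.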
GPT-J generates something similar, but it is less stable and gets
confused easily.
The summary is likely the result of additional filtering and topic
extraction using another neural network such as BERT. They also seem
to do some extra pre- and post-processing and sampling, because GPT-J
on its own can output nonsense like "D doesn't have garbage
collection and is lower level than C++" or "D is very simple and
easy to debug".
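
To illustrate the sampling part (this is speculation about what the
hosted service actually does; the exact pipeline is not public), the
usual knobs look like this, reusing the model and ids from the
sketch above:

output = model.generate(
    ids,
    max_new_tokens=40,
    do_sample=True,
    temperature=0.7,   # below 1.0 sharpens the distribution
    top_k=50,          # sample only from the 50 most likely tokens
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(output[0], skip_special_tokens=True))

Lowering the temperature and restricting top_k makes the output more
conservative, which is one cheap way to cut down on the kind of
nonsense above.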