Fun with OpenAI

H. S. Teoh hsteoh at qfbox.info
Wed Dec 21 19:14:37 UTC 2022


On Wed, Dec 21, 2022 at 06:43:22PM +0000, evilrat via Digitalmars-d wrote:
> On Wednesday, 21 December 2022 at 17:01:34 UTC, H. S. Teoh wrote:
> > 
> > This sounds like a straight quote from Wikipedia... or a paraphrase.
> > 
> 
> What surprised you?

I wasn't surprised at all. Just skeptical.


> It is all trained on all kinds of available text, including data
> scraped from the internet. All it does, basically, is predict the
> next 'word' for the given input.

Exactly what I thought it was based on.
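
To make that concrete: strip away the scale, and a causal language
model is just a function from a token prefix to a probability
distribution over the next token; generation is sampling from that
distribution and appending, over and over. A toy sketch in D (the
vocabulary and logits here are made up for illustration; a real model
computes the logits from the whole preceding context with billions of
parameters):

import std.algorithm : map, sum;
import std.array : array;
import std.math : exp;
import std.random : uniform01;
import std.stdio : writeln;

// Turn raw logits into a probability distribution (softmax).
double[] softmax(const double[] logits)
{
    auto weights = logits.map!(x => exp(x)).array;
    immutable total = weights.sum;
    return weights.map!(w => w / total).array;
}

// Draw one index from the distribution by inverting the CDF.
size_t sampleToken(const double[] probs)
{
    immutable r = uniform01!double();
    double cumulative = 0.0;
    foreach (i, p; probs)
    {
        cumulative += p;
        if (r < cumulative) return i;
    }
    return probs.length - 1;
}

void main()
{
    // Toy vocabulary and hand-picked logits; a real model emits one
    // logit per entry in a vocabulary of tens of thousands of tokens.
    string[] vocab = ["D", "is", "garbage", "collected", "fast", "."];
    double[] logits = [0.1, 0.8, 1.2, 2.0, 1.5, 0.3];

    auto probs = softmax(logits);
    writeln("next token: ", vocab[sampleToken(probs)]);
}

Everything interesting, of course, is hidden in where those logits
come from.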


> GPT-J generates something similar, but it is less stable and gets
> easily confused.
> 
> The summary is likely the result of additional filtering and topic
> extraction using another neural network such as BERT. It also seems
> they do some extra pre- and post-processing and sampling, because
> GPT-J can output nonsense like "D doesn't have garbage collection and
> is lower level than C++" or "D is very simple and easy to debug".

IOW it's just a smarter way to dress up the same information that's
already available online and findable by traditional means.  Nothing
to see here, move along.
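
That said, the sampling step the quoted post mentions does matter:
picking naively from the raw distribution happily surfaces
low-probability junk. The usual knobs are a temperature rescale and
top-k truncation; a minimal sketch building on the toy softmax above
(what OpenAI actually runs isn't public, so this is purely
illustrative):

import std.algorithm : map, sort, sum;
import std.array : array;
import std.range : iota;

// Rescale logits before softmax: T < 1 sharpens the distribution
// (safer, blander output), T > 1 flattens it (more adventurous, more
// prone to nonsense).
double[] withTemperature(const double[] logits, double T)
{
    return logits.map!(x => x / T).array;
}

// Renormalize over only the k most probable tokens, so sampling can
// never pick from the long tail of junk.
double[] topK(const double[] probs, size_t k)
{
    auto order = iota(probs.length).array;
    order.sort!((a, b) => probs[a] > probs[b]);
    auto kept = new double[](probs.length);
    kept[] = 0.0;
    immutable total = order[0 .. k].map!(i => probs[i]).sum;
    foreach (i; order[0 .. k])
        kept[i] = probs[i] / total;
    return kept;
}

Keep k small and T below 1 and the output stays on the rails; loosen
either and you get the "D doesn't have garbage collection" grade of
howler.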


T

-- 
People tell me I'm stubborn, but I refuse to accept it!

