
>If you give GPT3 code with a bug in it, and ask it to find the bug, it can't really do that.

The hell are you talking about? I've been doing this literally any time I need something fixed and it does just fine.
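If you'd rather script it than paste into the chat UI, here's a minimal sketch using the OpenAI Python SDK; the model name, prompt wording, and buggy snippet are just placeholders, not a claim about what the GP was doing:

    # Rough sketch: ask a model to spot the bug in a snippet.
    # Assumes the OpenAI Python SDK (>=1.0) and OPENAI_API_KEY set in the
    # environment; the model name is a placeholder.
    from openai import OpenAI

    buggy_code = """
    def average(xs):
        return sum(xs) / len(xs) + 1   # stray "+ 1" skews the result
    """

    client = OpenAI()
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",  # placeholder model
        messages=[{
            "role": "user",
            "content": f"Find the bug in this code and explain it:\n{buggy_code}",
        }],
    )
    print(response.choices[0].message.content)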



It doesn’t solve non-trivial bugs. It can fix bugs that match patterns that have been asked about a lot on Stack Overflow or somewhere like that.
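The sort of thing it nails instantly is a Stack Overflow staple like Python's mutable default argument; purely illustrative:

    # Classic pattern bug: the default list is created once and shared
    # across calls, so items accumulate between calls.
    def append_item(item, items=[]):
        items.append(item)
        return items

    print(append_item(1))  # [1]
    print(append_item(2))  # [1, 2]  <- surprising if you expected [2]

    # The usual fix: use None as a sentinel and build a fresh list per call.
    def append_item_fixed(item, items=None):
        if items is None:
            items = []
        items.append(item)
        return items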


I love the sound of goal posts as they go whooshing past.


I think it's more likely that the GP poorly worded their initial statement rather than actually moving the goalposts. They were probably having trouble with a few thorny bugs, tried ChatGPT, got nowhere, and forgot to qualify their initial statement with "for the few non-trivial bugs I tried".

From the external point of view, the goalposts moved, but from within the GP's poorly expressed mental model, they haven't moved. But, that's just a guess.



