Toot by Jage Jage (mas.to)

@sclower @BorrisInABox Your case is one where you have a particular problem to solve, and using an LLM to do the work is the more efficient way to solve it. But you're not passing the result off as "Hey, I wrote this code," and I'm assuming that if you were to hand the script off to someone else, you'd make it clear that it came from ChatGPT or similar, and so you couldn't troubleshoot it if it didn't work exactly as expected.

Toot by Jage Jage (mas.to)

@sclower @BorrisInABox This isn't what I mean when I say "turn over your cognitive functions." I mean things like the following: when writing documentation intended for public use, you copy all your text straight from an LLM and don't check the result to make sure it's correct. You then pass that documentation on to people who depend on you for accurate information, and the information you've provided is wrong. Or, you're setting up a new computer and, instead of consulting the appropriate documentation, you go to ChatGPT or similar for the answer. Both of these are situations I've encountered within the last nine months or so, and it's only getting worse.