The "question" My new coworker is actually ChatGPT is clearly a piece of fiction and should be closed.
We shouldn't encourage this sort of fantasy writing.
It should be discussed as hyperbole and misunderstanding, rather than dismissed as fantasy.
There's no way the job is being done autonomously by ChatGPT, any more than a job is done autonomously by a chainsaw. That aspect is fantasy or hyperbole. But it's also an understandable mistake: the tech is new, people are going off fictional TV and movie portrayals, and this SE site is well placed to do that education and draw those distinctions.
It's entirely plausible that someone is relying so heavily on ChatGPT, say to run multiple jobs, that the symptoms are as reported in the question. Indeed, that's a more plausible explanation than the one given in the question, as ChatGPT wouldn't take hours to respond.
There are already news reports of people using ChatGPT to pull off grifty overemployment.
I would argue the question is relevant, and that these details should be thrashed out in the comments and answers rather than the whole thing being put out of bounds by closing the question just because someone misunderstands what the tech can do.
It could also be a legitimate piece of stupidity, on either the questioner's part or their coworker's.
I'm inclined to answer with "Even if you believe this, it's none of your business as long as they are being productive; leave them alone and do your job." That's a good general answer to most workplace fantasies, including the several romantic fantasies that have gone by in the past year.