All comments seem to assume the officers lied about not using AI, but the article doesn't actually say that:
> officers had found this material through a Google search
> the erroneous result concerning the West Ham v Maccabi Tel Aviv match arose as result of a use of Microsoft Co Pilot
> his force used fictional output from Microsoft Copilot
What this says is that the material originates from Copilot.
I suppose you can read that and interpret it as them lying about the Google search, but if you assume incompetence over malice, the more likely interpretation is that they didn't properly verify the source they found through Google. It could have been the source of the source of their source that used Copilot, not the officers themselves.
The takeaway here is that even if you don't use AI tools and do things as you did before AI, you may still be basing your work on AI content.
A parallel would be someone saying they don't use AI in their code because they don't use AI tools, but then it turns out that a dependency of a dependency was built with AI.