“I could be wrong, but it sounds like there’s a decent chance this means some bot managed to convince YouTube’s bots that some re-upload of that video (of which there has been an incessant onslaught) was the original, and successfully issue the takedown and copyright strike request.”

While Sanderson, who did not immediately respond to a request for comment, attributed the error to automation, Nikita Varabei, co-founder and CEO of ChainPatrol.io, claimed the takedown was the result of human error – an employee pasted the wrong URL into a takedown submission form.

“Honestly, it was just human error. There was no automated AI that submitted this takedown in any way. This specific thing wasn’t even reviewed by our system. What we see here is the case of human error where we actually need to have more automated checks in place that can prevent this kind of human error.”

False positives, said Varabei, are something his AI-powered company takes extremely seriously. He noted that ChainPatrol responded to Sanderson as soon as it saw his post. “False positives are something that should never happen,” he said.

Alas, false positives do happen. Game site itch.io, for example, said in December that its website was briefly taken down as a result of “AI powered” brand protection software.

Varabei said people often don’t understand that ChainPatrol handles millions of scam sites, fake domains, and fake YouTube videos.

“We try to keep an extremely low false positive rate,” he said. “We deal with them very quickly. In some cases, human error like this can happen. And we already have a plan of action that we’re starting to deploy and develop to make sure that this doesn’t repeat.” ®