News
MIT researchers say the influential ImageNet data set has "systematic annotation issues" when used as a benchmark for evaluating object recognition models.
The tool makes small changes to images that are imperceptible to the human eye but nevertheless hamper AI-generated alterations.
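The snippet above describes the general idea behind such tools: perturb every pixel by an amount too small for a human to notice, while still shifting a model's output. A minimal sketch of an L-infinity-bounded perturbation of this kind is below; this is an illustration of the general technique, not the specific tool's method, and `perturb`, `eps`, and the random stand-in image are all hypothetical.

```python
import numpy as np

def perturb(image, gradient_sign, eps=2.0 / 255.0):
    """Shift each pixel by at most +/-eps along the gradient sign,
    then clip back into the valid [0, 1] intensity range.

    With eps = 2/255, no pixel of an 8-bit image moves by more than
    two intensity levels -- below the threshold of human perception.
    """
    adv = image + eps * gradient_sign
    return np.clip(adv, 0.0, 1.0)

rng = np.random.default_rng(0)
img = rng.random((32, 32, 3))                    # stand-in for a real photo
signs = np.sign(rng.standard_normal(img.shape))  # stand-in for a loss gradient
adv = perturb(img, signs)

# The per-pixel change never exceeds eps:
print(float(np.max(np.abs(adv - img))) <= 2.0 / 255.0)  # True
```

Real cloaking tools choose the perturbation direction by optimizing against a model's loss rather than at random, but the bounded-change constraint shown here is what keeps the edit invisible.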
Researchers from MIT have 3D-printed a turtle that can fool image recognition algorithms into thinking it's a rifle. Here's why that's scary.
When a system does zero-shot image classification, it can make judgments about the contents of an image without the user needing to train it beforehand on what to look for.
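The mechanism behind zero-shot classification can be sketched in a few lines: image and free-text labels are mapped into a shared embedding space, and the label whose embedding is most similar to the image's wins, with no per-label training. The toy vectors below are hypothetical stand-ins for the output of a real encoder such as CLIP.

```python
import numpy as np

def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def zero_shot_classify(image_vec, label_vecs):
    """Return the text label whose embedding is closest to the image's."""
    scores = {label: cosine(image_vec, v) for label, v in label_vecs.items()}
    return max(scores, key=scores.get)

# Toy embeddings standing in for a real image/text encoder's output.
labels = {
    "a photo of a turtle": np.array([0.9, 0.1, 0.0]),
    "a photo of a rifle":  np.array([0.0, 0.2, 0.9]),
}
image_embedding = np.array([0.8, 0.2, 0.1])  # "looks like" the turtle vector

print(zero_shot_classify(image_embedding, labels))  # a photo of a turtle
```

Because the labels are plain text, the candidate set can be changed at query time without retraining anything; that is what makes the approach "zero-shot."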