System
UNLABELED develops camouflage patterns that actually work in the real world to protect us from labeling, such as a camouflage that prevents AI from recognizing a person.
The patterns utilize a technique called "Adversarial Patch", which induces false recognition by adding a specific pattern to an image or video. This causes the AI to misrecognize: it can miss objects entirely or recognize them as something different.
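As a rough illustration of the adversarial-patch idea (a minimal sketch, not UNLABELED's actual method), the following toy example optimizes a small patch against a tiny linear "detector". The detector, image, patch size, and learning rate are all illustrative placeholders; a real attack would target an actual object-detection model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "person detector": a linear score over an 8x8 grayscale image.
# (Stand-in for a real detector; weights are random for illustration.)
W = rng.normal(size=(8, 8))

def detect(img):
    """Return True if the toy detector 'sees a person' in img."""
    return float(np.sum(W * img)) > 0.0

# An image the detector currently flags as containing a person.
img = W * 0.1 + rng.normal(size=(8, 8)) * 0.05

# Optimize a 4x4 patch (pasted at the top-left corner) by gradient
# descent on the detection score, so the detector stops firing.
patch = np.zeros((4, 4))
for _ in range(100):
    grad = W[:4, :4]  # d(score)/d(patch) for a linear model
    patch = np.clip(patch - 0.1 * grad, -1.0, 1.0)

patched = img.copy()
patched[:4, :4] += patch

print(detect(img), detect(patched))  # the patch suppresses detection
```

Real adversarial patches work the same way in spirit: the patch pixels are adjusted along the gradient of the model's output until the detection is suppressed (or redirected to a wrong class), and the resulting pattern can then be printed and worn in the physical world.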