System

UNLABELED develops camouflage patterns that actually work in the real world to protect us from being labeled, such as a camouflage that prevents AI from recognizing a person.

The patterns utilize a technique called an “Adversarial Patch”, which induces false recognition by adding a specific pattern to an image or video. This method causes the AI to misrecognize the scene: it may miss objects entirely or recognize them as something different.
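The project's own training pipeline is not shown here; the following is a minimal sketch of how an adversarial patch can be optimized to suppress person detections, using torchvision's Faster R-CNN as a stand-in detector (the actual patterns target YOLOv2). The patch size, placement, and the `training_images` variable are illustrative assumptions.

```python
# Minimal adversarial-patch sketch: learn a pattern that lowers the detector's
# confidence that a "person" is present. Not the UNLABELED implementation.
import torch
import torchvision

detector = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
detector.eval()
for p in detector.parameters():
    p.requires_grad_(False)           # only the patch pixels are optimized

PERSON_CLASS = 1                      # COCO label index for "person"
patch = torch.rand(3, 100, 100, requires_grad=True)   # the learnable pattern
optimizer = torch.optim.Adam([patch], lr=0.01)

def apply_patch(image, patch, top=150, left=150):
    """Paste the patch onto a fixed region of the image (placement is a toy assumption)."""
    patched = image.clone()
    _, h, w = patch.shape
    patched[:, top:top + h, left:left + w] = patch.clamp(0, 1)
    return patched

def train_step(training_images):
    """One optimization step. `training_images` is assumed to be a list of
    3xHxW float tensors (values in [0, 1]) showing people."""
    optimizer.zero_grad()
    loss = torch.zeros(())
    for image in training_images:
        outputs = detector([apply_patch(image, patch)])[0]
        person_scores = outputs["scores"][outputs["labels"] == PERSON_CLASS]
        # Objective: push down the strongest "person" detection on each image.
        if person_scores.numel() > 0:
            loss = loss + person_scores.max()
    loss.backward()
    optimizer.step()
    return float(loss)
```

In practice the patch would also be rendered onto clothing with random scale, rotation, and lighting changes during training so that it keeps working in the physical world, which is why the disclaimer below notes that results vary with the environment.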

*The camouflage patterns are generated against YOLOv2, and results may vary depending on the environment, such as camera angle, distance, and brightness.