adversarial attack
DE: adverser Angriff

The deliberate use of adversarial examples to cause errors in the output of a model. Artificial neural networks in particular are prone to this kind of attack.
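A minimal sketch of how such an attack works, using a fast-gradient-sign-style perturbation against a hand-built logistic-regression classifier (the weights, input, and epsilon here are illustrative assumptions, not part of the definition): each input feature is nudged by a small amount in the direction that increases the model's loss, until the prediction flips.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def predict(w, b, x):
    """Class 1 if the linear score is positive, else class 0."""
    score = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1 if score > 0 else 0

def fgsm_example(w, b, x, y, eps):
    """Fast-gradient-sign perturbation for logistic regression:
    shift each feature of x by eps in the direction that
    increases the cross-entropy loss for the true label y."""
    score = sum(wi * xi for wi, xi in zip(w, x)) + b
    p = sigmoid(score)
    # d(loss)/d(x_i) = (p - y) * w_i for the cross-entropy loss
    grad = [(p - y) * wi for wi in w]
    return [xi + eps * (1.0 if g > 0 else -1.0)
            for xi, g in zip(x, grad)]

# Illustrative model and input (assumed values for the sketch)
w, b = [1.0, -2.0], 0.0
x, y = [0.5, 0.1], 1            # clean input, correctly classified
x_adv = fgsm_example(w, b, x, y, eps=0.2)
print(predict(w, b, x))         # 1 on the clean input
print(predict(w, b, x_adv))     # 0: a small perturbation flips the output
```

The perturbation is small relative to the input, yet the classifier's output changes; the same principle scales to neural networks, where the gradient is obtained by backpropagation.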

ISO/IEC TR 29119-11:2020: deliberate use of adversarial examples (3.1.7) to cause an ML model (3.1.46) to fail

ISTQB - CTAI Syllabus: The deliberate use of adversarial examples to cause an ML model to fail.