Adversarial Examples

  • Adversarial examples are inputs that a machine learning model fails to classify correctly. In other words, they are examples deliberately crafted to fool the classifier.

  • An adversarial example is usually created by adding a small amount of carefully chosen noise to an original image.
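
A common way to make this concrete, assuming the fast gradient sign method (FGSM) of Goodfellow et al. (2015), which the page does not name explicitly, is to nudge each pixel in the direction that increases the model's loss:

$$x_{\text{adv}} = x + \epsilon \cdot \operatorname{sign}\!\left(\nabla_x J(\theta, x, y)\right)$$

where $J$ is the loss, $\theta$ the model parameters, and $\epsilon$ a small step size chosen so that the added noise stays visually imperceptible.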

  • Suppose I have the MNIST dataset. If I train a classifier such as a Random Forest, I get an accuracy of around 95% on the original, unperturbed data.
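
A minimal sketch of that setup, assuming scikit-learn and the OpenML copy of MNIST (`mnist_784`); the page does not specify a library, so treat the names and numbers here as illustrative:

```python
from sklearn.datasets import fetch_openml
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# MNIST: 70,000 28x28 grayscale digit images, flattened to 784 features.
X, y = fetch_openml('mnist_784', version=1, return_X_y=True, as_frame=False)
X = X / 255.0  # scale pixel values to [0, 1]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=10000, random_state=0)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)

# Typically lands in the mid-90s, consistent with the ~95% quoted above.
print("Test accuracy:", clf.score(X_test, y_test))
```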

  • After that, I generate adversarial images from the test dataset.

    • How to generate adversarial images from the test dataset?
      • Take each test image and add a small, carefully chosen perturbation so that the classifier no longer predicts the correct label (a minimal sketch follows below).
        [Image: Adversarial Example]
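
One complication the page glosses over: FGSM needs input gradients, and a Random Forest provides none. A common workaround, assumed here rather than stated on the page, is to train a differentiable surrogate (softmax regression below) and transfer its adversarial images to the forest. `X_train`, `y_train`, `X_test`, `y_test`, and `clf` carry over from the sketch above; `epsilon` is an illustrative step size.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Differentiable surrogate trained on the same data as the forest.
surrogate = LogisticRegression(max_iter=500)
surrogate.fit(X_train, y_train)

def fgsm(X, y, model, epsilon):
    """One FGSM step against a softmax-regression model.

    For cross-entropy loss with logits z = X @ W.T + b, the gradient
    of the loss with respect to the input is (softmax(z) - onehot(y)) @ W.
    """
    W = model.coef_                           # shape (10, 784)
    probs = model.predict_proba(X)            # softmax outputs, shape (n, 10)
    onehot = np.zeros_like(probs)
    onehot[np.arange(len(y)), y.astype(int)] = 1.0
    grad = (probs - onehot) @ W               # dLoss/dX, shape (n, 784)
    X_adv = X + epsilon * np.sign(grad)       # step toward higher loss
    return np.clip(X_adv, 0.0, 1.0)           # keep pixels in valid range

X_adv = fgsm(X_test, y_test, surrogate, epsilon=0.2)
print("Forest accuracy, clean images:      ", clf.score(X_test, y_test))
print("Forest accuracy, adversarial images:", clf.score(X_adv, y_test))
```

How well the perturbations transfer varies; a linear surrogate is the simplest choice rather than the strongest attack, so expect a noticeable drop in forest accuracy rather than a collapse.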
  • **To be completed**
