Concept:

Red Team


Definition

Red Team

Humans who deliberately attempt to trick, exploit, or negatively influence a machine system in order to expose weaknesses and thereby improve its quality, accuracy, security, or consistency. Red teams in AI work to reduce the number of offensive, inaccurate, biased, or otherwise undesirable results users experience. In cybersecurity, red teaming is a form of 'white hat' hacking used to test defenses.

"The red team attempted to trick the AI tool into saying racist things so that developers could spot ways to prevent the model from coming to those false conclusions."
