A face database with a large number of high-quality attribute annotations
Updated Dec 14, 2021
An open-source tool to assess and improve the trustworthiness of AI systems.
Source code and notebooks to reproduce experiments and benchmarks on Balanced Faces in the Wild (BFW).
Oracle Guardian AI Open Source Project is a library consisting of tools to assess fairness/bias and privacy of machine learning models and data sets.
Evidence-based tools and community collaboration to end algorithmic bias, one data scientist at a time.
Julia toolkit with fairness metrics and bias-mitigation algorithms.
[Nature Medicine] The Limits of Fair Medical Imaging AI In Real-World Generalization
Official code of "Discover and Mitigate Unknown Biases with Debiasing Alternate Networks" (ECCV 2022)
CIRCLe: Color Invariant Representation Learning for Unbiased Classification of Skin Lesions
Demographic Bias of Vision-Language Foundation Models in Medical Imaging
[ICCV 2023] Partition-and-Debias: Agnostic Biases Mitigation via a Mixture of Biases-Specific Experts
Enforcing fairness in binary and multiclass classification
Explainable AI & fashion talk & experiments
Research POC on the mitigation of bias in large language models (FLAN-T5 and Bloomz) through fine-tuning.
Code implementation for BiasMitigationRL, a reinforcement learning-based bias mitigation method.
An ML competition on CodaLab to estimate age from images while mitigating bias.
CIRCLe: Color Invariant Representation Learning for Unbiased Classification of Skin Lesions. Mirror of https://github.com/arezou-pakzad/CIRCLe
This project implements a collaborative agent pipeline to detect and reduce biases in large language model outputs, focusing on improving pronoun inclusivity and fair queer representation.
Bias-detection toolkit: Chrome extension, Python package, and SOTA research-paper docs.
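Several of the toolkits above expose group-fairness metrics. As a minimal sketch of what such a metric computes, the snippet below implements the demographic parity difference: the gap in positive-prediction rates between demographic groups. The function name and example data are illustrative and not taken from any of the listed libraries.

```python
# Hedged sketch of a group-fairness metric (demographic parity difference).
# All names here are illustrative, not from any specific listed toolkit.

def demographic_parity_difference(y_pred, groups):
    """Largest gap in positive-prediction rate across groups.

    y_pred: iterable of 0/1 predictions.
    groups: iterable of group labels, aligned with y_pred.
    """
    rates = {}
    for g in set(groups):
        preds = [p for p, gg in zip(y_pred, groups) if gg == g]
        rates[g] = sum(preds) / len(preds)  # positive rate per group
    vals = list(rates.values())
    return max(vals) - min(vals)

# Group "a" receives positives at rate 2/3, group "b" at 1/3.
y_pred = [1, 1, 0, 1, 0, 0]
groups = ["a", "a", "a", "b", "b", "b"]
print(demographic_parity_difference(y_pred, groups))  # → 0.333...
```

A value of 0 means all groups receive positive predictions at the same rate; mitigation methods such as those in the repositories above aim to push this gap toward 0 without destroying accuracy.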