Unleashing the power of unbiased data with DGIST
In the fast-evolving world of artificial intelligence (AI), data collection serves as the bedrock for building advanced machine learning models. Yet this seemingly straightforward task is fraught with the potential to introduce unintended texture biases. When an AI model is trained on biased data and then applied to out-of-distribution data, its performance can take a dramatic hit, so the root causes and impact of these biases need to be carefully addressed. Countless studies have sought to mitigate or eliminate such biases. Earlier research efforts proposed techniques such as adversarial learning to extract bias-independent features, enabling models to perform their intended classification tasks without relying on biased attributes. However, despite these promising efforts, decoupling biased features thro...
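
To make the adversarial learning idea mentioned above more concrete, the sketch below shows one common way it is realized: a gradient reversal layer placed between a feature encoder and an auxiliary "bias" head, so that the encoder is trained to remove information that predicts the bias label while still supporting the main classification task. This is a minimal illustrative example in PyTorch, not the DGIST method itself; the tensor names (`x`, `y`, `b`), dimensions, and the two-head architecture are assumptions made for illustration.

```python
import torch
import torch.nn as nn


class GradReverse(torch.autograd.Function):
    """Identity on the forward pass; flips the gradient sign on backward."""

    @staticmethod
    def forward(ctx, x, lambd):
        ctx.lambd = lambd
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        # Reversed (and scaled) gradient flows back into the encoder.
        return -ctx.lambd * grad_output, None


class DebiasedClassifier(nn.Module):
    """Encoder with a main classification head and an adversarial bias head."""

    def __init__(self, feat_dim=128, num_classes=10, num_bias_groups=2):
        super().__init__()
        self.encoder = nn.Sequential(nn.Flatten(), nn.LazyLinear(feat_dim), nn.ReLU())
        self.classifier = nn.Linear(feat_dim, num_classes)     # main task head
        self.bias_head = nn.Linear(feat_dim, num_bias_groups)  # adversary

    def forward(self, x, lambd=1.0):
        z = self.encoder(x)
        logits = self.classifier(z)
        # The adversary sees gradient-reversed features, so minimizing its loss
        # pushes the encoder to *discard* bias-predictive information.
        bias_logits = self.bias_head(GradReverse.apply(z, lambd))
        return logits, bias_logits


# One hypothetical training step on dummy data.
model = DebiasedClassifier()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
ce = nn.CrossEntropyLoss()

x = torch.randn(32, 3, 32, 32)       # dummy images
y = torch.randint(0, 10, (32,))      # class labels
b = torch.randint(0, 2, (32,))       # bias labels (e.g., texture group)

logits, bias_logits = model(x)
loss = ce(logits, y) + ce(bias_logits, b)  # adversarial term is reversed inside
opt.zero_grad()
loss.backward()
opt.step()
```

In this setup, the bias head tries to predict the bias label from the encoded features, while the reversed gradient drives the encoder in the opposite direction, yielding features that are useful for classification but, ideally, independent of the bias.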