This repository contains the implementation for the ISBI 2025 paper "Fairness Analysis of CLIP-Based Foundation Models for X-Ray Image Classification" (arXiv:2501.19086). The project evaluates the fairness and performance of CLIP-based foundation models (CLIP, MedCLIP, and BiomedCLIP) on X-ray image classification tasks. We implement three fine-tuning approaches: Linear Probing, MLP, and LoRA (Low-Rank Adaptation).
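Fairness in this setting is typically quantified by comparing a performance metric across demographic subgroups (e.g., sex or age bins). As a minimal, illustrative sketch — not the metric code used in this repository, whose exact metrics may differ — the accuracy gap between subgroups can be computed like this:

```python
from collections import defaultdict

def group_accuracy_gap(y_true, y_pred, groups):
    """Return per-group accuracy and the max-min gap across groups.

    A smaller gap indicates more equitable performance across
    subgroups (hypothetical example metric, not the repository's).
    """
    correct = defaultdict(int)
    total = defaultdict(int)
    for t, p, g in zip(y_true, y_pred, groups):
        total[g] += 1
        correct[g] += int(t == p)
    acc = {g: correct[g] / total[g] for g in total}
    gap = max(acc.values()) - min(acc.values())
    return acc, gap

# Toy example with two hypothetical subgroups "F" and "M".
acc, gap = group_accuracy_gap(
    [1, 0, 1, 1, 0, 1],          # ground-truth labels
    [1, 0, 1, 1, 0, 0],          # model predictions
    ["F", "F", "F", "M", "M", "M"],  # subgroup of each sample
)
# acc["F"] == 1.0, acc["M"] == 2/3, so the gap is 1/3.
```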
## Installation

```bash
conda env create -f environment.yaml
conda activate base_clip

# Install CLIP and MedCLIP
pip install git+https://github.com/openai/CLIP.git
pip install git+https://github.com/RyanWangZf/MedCLIP.git
```

## Training

An example command to run LoRA fine-tuning with CLIP (B/16 variant):
```bash
python main.py --model_type clip --variant B16 --mode lora
```

You can modify these arguments according to `parse_arguments` in `utils.py`.
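For illustration, a parser supporting the flags in the command above might look like the following. This is a hypothetical sketch; the actual `parse_arguments` in `utils.py` — including its full set of flags, choices, and defaults — may differ:

```python
import argparse

def parse_arguments(argv=None):
    """Hypothetical sketch of a parser for the flags shown above."""
    parser = argparse.ArgumentParser(description="CLIP X-ray fine-tuning")
    # Choices below are assumptions based on the models and
    # fine-tuning approaches named in this README.
    parser.add_argument("--model_type", default="clip",
                        choices=["clip", "medclip", "biomedclip"])
    parser.add_argument("--variant", default="B16")
    parser.add_argument("--mode", default="lora",
                        choices=["linear_prob", "mlp", "lora"])
    return parser.parse_args(argv)

args = parse_arguments(["--model_type", "clip", "--variant", "B16", "--mode", "lora"])
```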
## Evaluation

```bash
python calculate_metrics.py
```

## Citation

If you use this code in your research, please cite our paper:
```bibtex
@inproceedings{sun2025fairness,
  title={Fairness Analysis of Clip-Based Foundation Models for X-Ray Image Classification},
  author={Sun, Xiangyu and Zou, Xiaoguang and Wu, Yuanquan and Wang, Guotai and Zhang, Shaoting},
  booktitle={2025 IEEE 22nd International Symposium on Biomedical Imaging (ISBI)},
  pages={1--5},
  year={2025},
  organization={IEEE}
}
```

## License

This project is licensed under the MIT License.