diff --git a/.DS_Store b/.DS_Store
new file mode 100644
index 0000000..3cbf5c4
Binary files /dev/null and b/.DS_Store differ
diff --git a/assets/.DS_Store b/assets/.DS_Store
new file mode 100644
index 0000000..5008ddf
Binary files /dev/null and b/assets/.DS_Store differ
diff --git a/assets/artbench_counter_1_wo.png b/assets/artbench_counter_1_wo.png
new file mode 100644
index 0000000..4ddc5dd
Binary files /dev/null and b/assets/artbench_counter_1_wo.png differ
diff --git a/assets/artbench_counter_5.png b/assets/artbench_counter_5.png
new file mode 100644
index 0000000..63ec166
Binary files /dev/null and b/assets/artbench_counter_5.png differ
diff --git a/assets/artbench_counter_9_wo.png b/assets/artbench_counter_9_wo.png
new file mode 100644
index 0000000..d0ce8b9
Binary files /dev/null and b/assets/artbench_counter_9_wo.png differ
diff --git a/assets/cifar2_counter_1.png b/assets/cifar2_counter_1.png
new file mode 100644
index 0000000..a695e85
Binary files /dev/null and b/assets/cifar2_counter_1.png differ
diff --git a/assets/cifar2_counter_21_wo.png b/assets/cifar2_counter_21_wo.png
new file mode 100644
index 0000000..d5eb894
Binary files /dev/null and b/assets/cifar2_counter_21_wo.png differ
diff --git a/assets/cifar2_counter_5_wo.png b/assets/cifar2_counter_5_wo.png
new file mode 100644
index 0000000..9eed072
Binary files /dev/null and b/assets/cifar2_counter_5_wo.png differ
diff --git a/assets/table_1.png b/assets/table_1.png
new file mode 100644
index 0000000..ea2bd9b
Binary files /dev/null and b/assets/table_1.png differ
diff --git a/assets/teaser_1.png b/assets/teaser_1.png
new file mode 100644
index 0000000..cdeb07a
Binary files /dev/null and b/assets/teaser_1.png differ
diff --git a/assets/teaser_2.png b/assets/teaser_2.png
new file mode 100644
index 0000000..361ab01
Binary files /dev/null and b/assets/teaser_2.png differ
diff --git a/index.html b/index.html
new file mode 100644
index 0000000..99702e0
--- /dev/null
+++ b/index.html
@@ -0,0 +1,623 @@
+
+
+ Data attribution seeks to trace model outputs back to training data.
+ With the recent development of diffusion models, data attribution has become a desired module to properly assign valuations for high-quality or copyrighted training samples, ensuring that data contributors are fairly compensated or credited.
+ Several theoretically motivated methods have been proposed to implement data attribution, in an effort to improve the trade-off between computational scalability and effectiveness.
+ In this work, we conduct extensive experiments and ablation studies on attributing diffusion models, specifically focusing on DDPMs trained on CIFAR-10 and CelebA, as well as a Stable Diffusion model LoRA-finetuned on ArtBench.
+ Intriguingly, we report counter-intuitive observations that theoretically unjustified design choices for attribution empirically outperform previous baselines by a large margin, in terms of both linear datamodeling score and counterfactual evaluation.
+ Our work presents a significantly more efficient approach for attributing diffusion models, while the unexpected findings suggest that, at least in non-convex settings, constructions guided by theoretical assumptions may lead to inferior attribution performance.
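+
+ The linear datamodeling score (LDS) mentioned above is the standard metric in this line of work. The following is a minimal sketch (ours, not the released implementation; variable names and helpers are illustrative) of how it checks whether summed attribution scores predict the outputs of models retrained on random training subsets:
+
+ import numpy as np
+ from scipy.stats import spearmanr
+
+ def lds(tau, subsets, retrained_outputs):
+     """tau: (n_train,) attribution scores for one sample of interest.
+     subsets: list of index arrays, the M random training subsets.
+     retrained_outputs: (M,) model output on the sample of interest,
+     measured after retraining on each subset."""
+     # Additive ("linear") prediction: sum the attributions of the
+     # training samples present in each subset.
+     predicted = np.array([tau[idx].sum() for idx in subsets])
+     # LDS is the Spearman rank correlation between predicted and
+     # actual outputs across the M retrained models.
+     rho, _ = spearmanr(predicted, retrained_outputs)
+     return rho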
+
+ [Figure: proponents and opponents on ArtBench-2]
+ Proponents and opponents visualization on ArtBench-2 using TRAK and D-TRAK with varying numbers of timesteps (10 or 100).
+ For each sample of interest, the 5 most positively influential and the 3 most negatively influential training samples are shown, together with their influence scores (below each sample).
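+
+ For readers unfamiliar with these scores, here is a simplified sketch (ours, with TRAK's weighting terms omitted; model, output_fn, and proj are assumptions) of how TRAK-style influence is typically computed for a diffusion model: project per-sample gradients of a chosen output function, average them over a small set of sampled timesteps (the "10 or 100" in the caption), then score training samples against a generated sample. D-TRAK keeps this pipeline and only swaps the output function, e.g. the squared norm of the predicted noise in place of the denoising loss.
+
+ import torch
+
+ def feature(model, output_fn, x, num_timesteps, proj, T=1000):
+     """Projected gradient of output_fn w.r.t. model parameters,
+     averaged over num_timesteps uniformly spaced diffusion steps."""
+     params = tuple(p for p in model.parameters() if p.requires_grad)
+     feats = []
+     for t in torch.linspace(0, T - 1, num_timesteps).long():
+         out = output_fn(model, x, t)  # scalar, e.g. denoising loss (TRAK)
+         grads = torch.autograd.grad(out, params)
+         g = torch.cat([gr.reshape(-1) for gr in grads])
+         feats.append(proj @ g)  # random projection down to k dims
+     return torch.stack(feats).mean(dim=0)
+
+ def influence_scores(train_feats, test_feats, lam=1e-2):
+     """Simplified TRAK-style scores: positive entries are proponents,
+     negative entries are opponents, as in the figure above."""
+     k = train_feats.shape[1]
+     kernel = torch.linalg.inv(train_feats.T @ train_feats + lam * torch.eye(k))
+     return test_feats @ kernel @ train_feats.T  # (n_test, n_train)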
+
+ [Figure: counterfactual comparison]
+ Counterfactual visualization on CIFAR-2 (left) and ArtBench-2 (right).
+ We compare the originally generated samples to those generated from the same random seed with the retrained models.
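+
+ The counterfactual protocol behind this figure can be summarized as follows; the sketch below is an illustration under our own assumptions, with train_model and generate as hypothetical helpers rather than functions from the released code:
+
+ import numpy as np
+
+ def counterfactual_eval(train_set, scores, k, seed, train_model, generate):
+     # Remove the k most positively influential training samples
+     # (the proponents identified by the attribution method).
+     proponents = np.argsort(scores)[::-1][:k]
+     kept = np.setdiff1d(np.arange(len(train_set)), proponents)
+     model_full = train_model(train_set)
+     model_ablated = train_model([train_set[i] for i in kept])
+     # Same seed -> same initial noise, so any change in the output is
+     # due to the removed training data; a larger distance means the
+     # attribution method found genuinely influential samples.
+     x_before = generate(model_full, seed=seed)
+     x_after = generate(model_ablated, seed=seed)
+     return float(np.linalg.norm(x_before - x_after))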
+ @inproceedings{zheng2023intriguing,
+   title={Intriguing Properties of Data Attribution on Diffusion Models},
+   author={Zheng, Xiaosen and Pang, Tianyu and Du, Chao and Jiang, Jing and Lin, Min},
+   booktitle={NeurIPS Workshop on Attributing Model Behavior at Scale},
+   year={2023}
+ }
+