<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="utf-8">
<title>Awesome Dataset Distillation</title>
<link rel="stylesheet" media="screen" href="css/styles.css">
<link rel="apple-touch-icon" sizes="180x180" href="images/apple-touch-icon.png">
<link rel="icon" type="image/png" sizes="32x32" href="images/favicon-32x32.png">
<link rel="icon" type="image/png" sizes="16x16" href="images/favicon-16x16.png">
<link rel="mask-icon" href="images/safari-pinned-tab.svg" color="#5bbad5">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<meta name="author" content="Li Longzhen">
<meta name="description" content="A curated list of awesome papers on dataset distillation and related applications.">
<meta name="color-scheme" content="light">
<link rel="preconnect" href="https://fonts.googleapis.com">
<link rel="preconnect" href="https://fonts.gstatic.com" crossorigin>
<link href="https://fonts.googleapis.com/css2?family=Outfit:[email protected]&display=swap" rel="stylesheet">
<script src="PaperCatcher.js"></script>
</head>
<body id="awesome-dataset-distillation" class="background">
<header class="top-header">
<div class="background" style="height: 8px;"></div>
<div class="top-bar background">
<div class="top-bar-left">
<span onclick="open_sidebar()" class="navigation-button on-background-text">
<svg xmlns="http://www.w3.org/2000/svg" height="24" viewBox="0 -960 960 960" width="24"><path d="M120-240v-80h720v80H120Zm0-200v-80h720v80H120Zm0-200v-80h720v80H120Z"/></svg>
</span>
<span class="top-bar-title title-medium on-background-text">
<span>Awesome Dataset Distillation</span>
</span>
</div>
</div>
</header>
<div onclick="close_sidebar()" class="overlay" id="sidebarOverlay"></div>
<div class="background" style="height: 72px;"></div>
<div class="top-title">
<div class="title-brief background">
<!--Here put the title and main content-->
<div class="show-on-large-screens"><h1 class="title display-large">Awesome Dataset Distillation</h1></div>
<div class="show-on-middle-screens"><h1 class="title display-large">Awesome<br> Dataset Distillation</h1></div>
<div class="show-on-small-screens"><h1 class="title display-large">Awesome<br> Dataset<br> Distillation</h1></div>
<div><p class="brief title-large">Awesome Dataset Distillation provides the most comprehensive and detailed information on the Dataset Distillation field.</p></div>
<div><p class="brief title-large">This project is curated and maintained by
<a style="color: black;" href="https://www-lmd.ist.hokudai.ac.jp/member/guang-li/">Guang Li</a>,
<a style="color: black;" href="https://www.bozhao.me/">Bo Zhao</a>,
and <a style="color: black;" href="https://www.tongzhouwang.info/">Tongzhou Wang</a>.
</p></div>
<div>
<button onclick="location.href='https://github.com/Guang000/Awesome-Dataset-Distillation'" type="button" class="github-homepage primary">
<span class="on-primary-text headline-small">
GitHub
</span>
</button>
</div>
</div>
</div>
<div class="sidebar surface-variant" id="quickContent">
<div class="top-bar" id="sidebarTitle">
<div class="surface-variant" style="height: 8px"></div>
<div class="top-bar-left surface-variant">
<span onclick="close_sidebar()" class="close-navigation-button on-background-text">
<svg xmlns="http://www.w3.org/2000/svg" height="24" viewBox="0 -960 960 960" width="24"><path d="M120-240v-80h520v80H120Zm664-40L584-480l200-200 56 56-144 144 144 144-56 56ZM120-440v-80h400v80H120Zm0-200v-80h520v80H120Z"/></svg>
</span>
<span class="top-bar-title title-medium on-background-text">
Awesome Dataset Distillation
</span>
</div>
</div>
<div class="surface-variant" style="height: 64px"></div>
<div class="navigation-list" id="navi"></div>
</div>
<!--statistics-->
<div class="overall-container">
<div class="count-container">
<div class="item-number-container" onclick="location.href='https://github.com/Guang000/Awesome-Dataset-Distillation'">
<p class="count-title on-background-text display-small">GitHub Stars:</p>
<p class="count-number on-background-text display-large" id="starCount"> </p>
</div>
<hr class="count-divider on-background-text">
<div class="item-number-container" onclick="location.href='https://github.com/Guang000/Awesome-Dataset-Distillation'">
<p class="count-title on-background-text display-small">Paper Statistics:</p>
<p class="count-number on-background-text display-large" id="counter"> </p>
</div>
</div>
<h2 class="section-title display-medium on-background-text">Background & Vision</h2>
<div class="background-vision-container surface-variant">
<!--Put the Background information here-->
<p class="background-vision body-large on-surface-variant-text">Dataset distillation is the task of synthesizing a small dataset such that models trained on it achieve high performance on the original large dataset. A dataset distillation algorithm takes as input a large real dataset to be distilled (training set) and outputs a small synthetic distilled dataset, which is evaluated by training models on the distilled dataset and testing them on a separate real dataset (validation/test set). A good small distilled dataset is not only useful for dataset understanding, but also has various applications (e.g., continual learning, privacy, and neural architecture search). The task was first introduced in the paper <a class="body-large on-surface-variant-text" href="https://www.tongzhouwang.info/dataset_distillation/">Dataset Distillation [Tongzhou Wang et al., '18]</a>, along with a proposed algorithm that backpropagates through optimization steps. It was later extended to real-world datasets in the paper <a class="body-large on-surface-variant-text" href="https://arxiv.org/abs/2104.02857">Medical Dataset Distillation [Guang Li et al., '19]</a>, which also explored the privacy-preserving possibilities of dataset distillation. The paper <a class="body-large on-surface-variant-text" href="https://arxiv.org/abs/2006.05929">Dataset Condensation [Bo Zhao et al., '20]</a> then introduced gradient matching, which greatly advanced the dataset distillation field.</p>
<br>
<p class="background-vision body-large on-surface-variant-text">In recent years (2022-now), dataset distillation has gained increasing attention in the research community, across many institutes and labs, and more papers are being published each year. These works have steadily improved dataset distillation and explored its many variants and applications.</p>
</div>
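To make the task setup above concrete, here is a minimal, self-contained sketch (not taken from this repository; plain NumPy, using a linear model and a hypothetical two-point distilled set rather than a learned one) of what "a small synthetic dataset whose trained model matches training on the full dataset" means:

```python
import numpy as np

rng = np.random.default_rng(0)

# Large "real" training set: 1000 noisy samples of y = 2x + 1.
X_real = rng.normal(size=(1000, 1))
y_real = 2.0 * X_real[:, 0] + 1.0 + 0.1 * rng.normal(size=1000)

def fit_linear(X, y):
    # Least-squares fit of slope and intercept.
    A = np.hstack([X, np.ones((len(X), 1))])
    w, *_ = np.linalg.lstsq(A, y, rcond=None)
    return w  # [slope, intercept]

w_real = fit_linear(X_real, y_real)

# "Distilled" set: just 2 synthetic points, placed on the fitted line
# so that a model trained on them alone recovers the same parameters.
X_syn = np.array([[-1.0], [1.0]])
y_syn = np.array([w_real[1] - w_real[0], w_real[1] + w_real[0]])

w_syn = fit_linear(X_syn, y_syn)
assert np.allclose(w_real, w_syn)  # 2-point set reproduces the full-data model
```

For a linear model the distilled set can be constructed in closed form, as here; the papers cited above tackle the hard case of deep networks, where the synthetic examples must instead be learned, e.g., by backpropagating through training steps or by matching gradients.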
<!--latest updates-->
<h2 class="section-title display-medium on-background-text">Latest Updates</h2>
<div class="lastupdate-container surface-variant" id="latest">
</div>
<!--main contents-->
<h2 class="section-title display-medium on-background-text">Content</h2>
<div class="main-content" id="contents"></div>
<!--media coverage-->
<h2 class="section-title display-medium on-background-text">Media Coverage</h2>
<div class="media-coverage-container">
<div class="media-coverage-content-container title-large surface-variant on-surface-variant-text">
<a class="on-surface-variant-text" href="https://twitter.com/TongzhouWang/status/1560043815204970497?cxt=HHwWgoCz9bPlsaYrAAAA"><p class="essay-content">Beginning of Awesome Dataset Distillation</p></a>
</div>
<div class="media-coverage-content-container title-large surface-variant on-surface-variant-text">
<a class="on-surface-variant-text" href="https://www.libhunt.com/posts/874974-d-most-popular-ai-research-aug-2022-ranked-based-on-github-stars"><p class="essay-content">Most Popular AI Research Aug 2022</p></a>
</div>
<div class="media-coverage-content-container title-large surface-variant on-surface-variant-text">
<a class="on-surface-variant-text" href="https://www.jiqizhixin.com/articles/2022-10-11-22"><p class="essay-content">一个项目帮你了解数据集蒸馏Dataset Distillation</p></a>
</div>
<div class="media-coverage-content-container title-large surface-variant on-surface-variant-text">
<a class="on-surface-variant-text" href="https://mp.weixin.qq.com/s/__IjS0_FMpu35X9cNhNhPg"><p class="essay-content">浓缩就是精华:用大一统视角看待数据集蒸馏</p></a>
</div>
</div>
</div>
<!--footprint-->
<div class="page-footer-container">
<hr class="outline-variant" style="width:100%;margin:0">
<footer class="page-footer">
<section class="about">
<div>
<h3 class="page-footer-title headline-large on-background-text">Awesome Dataset Distillation</h3>
<div class="cite">
<!--Put cite instruction here-->
<p class="page-footer-instruction body-large on-background-text">If you find this project useful for your research, please use the following BibTeX entry:</p>
<code class="on-background-text" style="display: block; white-space: pre-line">
@misc{li2022awesome,
author={Li, Guang and Zhao, Bo and Wang, Tongzhou},
title={Awesome Dataset Distillation},
howpublished={\url{https://github.com/Guang000/Awesome-Dataset-Distillation}},
year={2022}
}
</code>
</div>
<div class="visitor-map">
<script src="//rf.revolvermaps.com/0/0/7.js?i=53hthxwlwi6&m=0c&c=ff0000&cr1=ffffff&br=20&sx=0&cw=fbfcfe&cb=191c1e" async="async"></script>
</div>
<div>
<p class="page-footer-instruction body-large on-background-text">This page has been viewed:</p>
<script src="//counter.websiteout.com/js/7/6/0/0"></script>
<p class="page-footer-instruction body-large on-background-text">since 2024.03.22</p>
</div>
</div>
<div>
<div class="contribution">
<!--Put contribution instruction here-->
<h4 class="page-footer-subtitle body-medium on-background-text">Contribution Guide</h4>
<p class="page-footer-instruction body-large on-background-text">If you want to contribute to this project, click <a class="primary-text" href="https://github.com/Guang000/Awesome-Dataset-Distillation/blob/main/CONTRIBUTING.md">here</a> for more details.</p>
</div>
<div class="acknowledgment" id="thanks"></div>
</div>
</section>
</footer>
</div>
<script>
window.onload = github_star;
fetch('data/navi_abbr.json')
.then(response => response.json())
.then(data =>{initialize(data)});
</script>
</body>
</html>