Changes
On December 2, 2024 at 5:54:54 PM UTC, admin:
- Changed title to "ImageNet Dataset" (previously "ImageNet dataset")
- Set author of ImageNet Dataset to Myungseo Song (previously Diane Bouchacourt)
- Updated description of ImageNet Dataset from
  "The dataset used in this paper is a collection of images from the ImageNet dataset, preprocessed and used for training and evaluation of the proposed diffusion spectral entropy and diffusion spectral mutual information methods."
  to
  "Object recognition is arguably the most important problem at the heart of computer vision. Recently, Barbu et al. introduced a dataset called ObjectNet which includes objects in daily life situations."
- Removed the following tags from ImageNet Dataset
  - Natural images
  - thousand object categories
  - Deep learning
  - Images
  - classification
  - computer vision
  - Object Classification
  - AlexNet
  - Image Compression
  - ILSVRC
  - Image classification
  - Image Resolution
  - image dataset
  - convolutional neural networks
  - large-scale image dataset
  - Large-scale dataset
  - large-scale
  - Object Classification
  - 1000 Classes
  - large-scale image database
  - VGG-16
  - Generative Adversarial Networks
  - CNNs
  - ImageNet dataset
  - Computer vision
  - images
  - object recognition
  - Large-scale Dataset
  - generative models
  - Variations
  - object labels
  - CNN
  - Object recognition
  - adversarial attacks
  - Object detection
  - Deep Learning
  - large-scale dataset
  - Neural Networks
- Added the following tags to ImageNet Dataset
  - Image Dataset
  - ImageNet Dataset
  - Object Recognition
  - WordNet
  - object detection
- Changed value of field "defined_in" to https://doi.org/10.48550/arXiv.2106.14156 in ImageNet Dataset
- Changed value of field "extra_authors" to [{'extra_author': 'Jinyoung Choi', 'orcid': ''}, {'extra_author': 'Bohyung Han', 'orcid': ''}] in ImageNet Dataset
- Changed value of field "citation" to ['https://doi.org/10.48550/arXiv.1412.3684', 'https://doi.org/10.48550/arXiv.2305.01506', 'https://doi.org/10.48550/arXiv.2206.11488', 'https://doi.org/10.48550/arXiv.1904.06490', 'https://doi.org/10.48550/arXiv.2108.09551', 'https://doi.org/10.48550/arXiv.2006.07161', 'https://doi.org/10.48550/arXiv.2304.02798', 'https://doi.org/10.48550/arXiv.1909.00145'] in ImageNet Dataset
- Deleted resource "Original Metadata" from ImageNet Dataset
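The "Removed/Added tags" entries above are just set differences between the tag lists of the two metadata revisions. A minimal Python sketch of that computation, using a few excerpted tag names rather than the full lists:

```python
# Compute which tags a revision removed and added, as in the changelog above.
# The two sets below are small excerpts of the real revisions' tag lists.
old_tags = {"ImageNet", "ImageNet dataset", "AlexNet", "VGG-16", "Computer Vision"}
new_tags = {"ImageNet", "ImageNet Dataset", "WordNet", "Computer Vision"}

removed = sorted(old_tags - new_tags)  # tags reported as removed
added = sorted(new_tags - old_tags)    # tags reported as added

print(removed)  # ['AlexNet', 'ImageNet dataset', 'VGG-16']
print(added)    # ['ImageNet Dataset', 'WordNet']
```

Note that tag comparison here is exact-string, which is why near-duplicates such as "ImageNet dataset" vs. "ImageNet Dataset" show up as a removal plus an addition.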
Full record diff (previous revision → current revision):

  {
    "access_rights": "",
-   "author": "Diane Bouchacourt",
+   "author": "Myungseo Song",
    "author_email": "",
    "citation": [
-     "https://doi.org/10.48550/arXiv.1810.01018",
-     "https://doi.org/10.48550/arXiv.1508.07148",
-     "https://doi.org/10.48550/arXiv.2105.12781",
-     "https://doi.org/10.1109/TPAMI.2019.2922175",
-     "https://doi.org/10.1109/CVPR.2019.00012",
-     "https://doi.org/10.48550/arXiv.2012.08112",
-     "https://doi.org/10.48550/arXiv.1703.09912",
-     "https://doi.org/10.1109/ACCESS.2023.3242982",
-     "https://doi.org/10.48550/arXiv.1910.12336",
-     "https://doi.org/10.48550/arXiv.2312.04823",
-     "https://doi.org/10.1109/LRA.2020.2967289",
-     "https://doi.org/10.48550/arXiv.1901.08644",
-     "https://doi.org/10.48550/arXiv.2211.01317",
-     "https://doi.org/10.48550/arXiv.1806.08874",
-     "https://doi.org/10.48550/arXiv.2402.04867",
-     "https://doi.org/10.48550/arXiv.2011.08485",
-     "https://doi.org/10.48550/arXiv.2103.02152",
-     "https://doi.org/10.48550/arXiv.2302.08476",
-     "https://doi.org/10.48550/arXiv.1905.04270",
-     "https://doi.org/10.14569/IJACSA.2020.0110102",
-     "https://doi.org/10.48550/arXiv.1904.09021",
-     "https://doi.org/10.48550/arXiv.2406.18580",
-     "https://doi.org/10.48550/arXiv.2104.10544",
-     "https://doi.org/10.48550/arXiv.2209.01404",
-     "https://doi.org/10.48550/arXiv.2208.04693",
-     "https://doi.org/10.48550/arXiv.2204.07610",
-     "https://doi.org/10.48550/arXiv.1911.04636",
-     "https://doi.org/10.48550/arXiv.1808.10696",
-     "https://doi.org/10.48550/arXiv.2405.12175",
-     "https://doi.org/10.48550/arXiv.2309.03774",
-     "https://doi.org/10.48550/arXiv.2201.11113",
-     "https://doi.org/10.48550/arXiv.2406.10737",
-     "https://doi.org/10.48550/arXiv.2211.13778",
-     "https://doi.org/10.48550/arXiv.1805.08624",
-     "https://doi.org/10.48550/arXiv.1611.04994",
-     "https://doi.org/10.48550/arXiv.1503.01428",
-     "https://doi.org/10.1109/ICASSP.2017.7953112",
-     "https://doi.org/10.48550/arXiv.1811.01335",
-     "https://doi.org/10.48550/arXiv.2012.04061",
-     "https://doi.org/10.48550/arXiv.2407.14726",
-     "https://doi.org/10.48550/arXiv.1702.04595",
-     "https://doi.org/10.48550/arXiv.2402.02342",
-     "https://doi.org/10.48550/arXiv.1902.03984",
-     "https://doi.org/10.48550/arXiv.2009.05835",
-     "https://doi.org/10.48550/arXiv.2404.15881",
-     "https://doi.org/10.48550/arXiv.2305.03601",
-     "https://doi.org/10.48550/arXiv.1901.08278",
-     "https://doi.org/10.48550/arXiv.2009.07024",
-     "https://doi.org/10.1007/978-3-031-73010-8_26",
-     "https://doi.org/10.1007/s11063-019-10043-7",
-     "https://doi.org/10.48550/arXiv.2404.02947",
-     "https://doi.org/10.48550/arXiv.1711.09554",
-     "https://doi.org/10.48550/arXiv.1909.09034",
-     "https://doi.org/10.1145/3403572",
-     "https://doi.org/10.48550/arXiv.1708.05357",
-     "https://doi.org/10.48550/arXiv.2111.02249",
-     "https://doi.org/10.48550/arXiv.1908.02729",
-     "https://doi.org/10.48550/arXiv.2003.03233",
-     "https://doi.org/10.48550/arXiv.1606.04838",
-     "https://doi.org/10.48550/arXiv.2401.11471"
+     "https://doi.org/10.48550/arXiv.1412.3684",
+     "https://doi.org/10.48550/arXiv.2305.01506",
+     "https://doi.org/10.48550/arXiv.2206.11488",
+     "https://doi.org/10.48550/arXiv.1904.06490",
+     "https://doi.org/10.48550/arXiv.2108.09551",
+     "https://doi.org/10.48550/arXiv.2006.07161",
+     "https://doi.org/10.48550/arXiv.2304.02798",
+     "https://doi.org/10.48550/arXiv.1909.00145"
    ],
    "creator_user_id": "17755db4-395a-4b3b-ac09-e8e3484ca700",
-   "defined_in": "https://doi.org/10.48550/arXiv.1810.03307",
+   "defined_in": "https://doi.org/10.48550/arXiv.2106.14156",
    "doi": "10.57702/nh8omwqx",
    "doi_date_published": "2024-11-25",
    "doi_publisher": "TIB",
    "doi_status": true,
    "domain": "https://service.tib.eu/ldmservice",
    "extra_authors": [
      {
-       "extra_author": "Marco Baroni",
+       "extra_author": "Jinyoung Choi",
+       "orcid": ""
+     },
+     {
+       "extra_author": "Bohyung Han",
        "orcid": ""
      }
    ],
    "groups": [
      {
        "description": "",
        "display_name": "Computer Vision",
        "id": "d09caf7c-26c7-4e4d-bb8e-49476a90ba25",
        "image_display_url": "",
        "name": "computer-vision",
        "title": "Computer Vision"
      },
      {
        "description": "",
-       "display_name": "Generative Adversarial Networks",
-       "id": "42559fef-c85e-4fb4-8ba0-c1af289545ce",
-       "image_display_url": "",
-       "name": "generative-adversarial-networks",
-       "title": "Generative Adversarial Networks"
-     },
-     {
-       "description": "",
        "display_name": "Image Classification",
        "id": "18b77292-26aa-4caf-89ed-cbd35fa60474",
        "image_display_url": "",
        "name": "image-classification",
        "title": "Image Classification"
      },
      {
        "description": "",
-       "display_name": "Image Compression",
-       "id": "094941e1-ba8a-4742-a18e-4da139345758",
-       "image_display_url": "",
-       "name": "image-compression",
-       "title": "Image Compression"
-     },
-     {
-       "description": "",
        "display_name": "Image Dataset",
        "id": "fc745cca-b21e-4ced-ba81-06a456938edf",
        "image_display_url": "",
        "name": "image-dataset",
        "title": "Image Dataset"
      },
      {
        "description": "",
-       "display_name": "Image Generation",
-       "id": "be25a76c-def1-4e73-8b1c-b81222d63867",
-       "image_display_url": "",
-       "name": "image-generation",
-       "title": "Image Generation"
-     },
-     {
-       "description": "",
        "display_name": "Image Recognition",
        "id": "42c0f83b-b61c-4bbd-a704-133e4e4a0c15",
        "image_display_url": "",
        "name": "image-recognition",
        "title": "Image Recognition"
      },
      {
        "description": "",
-       "display_name": "ImageNet dataset",
-       "id": "4f3bbdf4-4f87-4015-bd92-300e9571bf71",
-       "image_display_url": "",
-       "name": "imagenet-dataset",
-       "title": "ImageNet dataset"
-     },
-     {
-       "description": "",
-       "display_name": "Images",
-       "id": "45883730-52c6-44ac-bea8-d50c9cbc8882",
-       "image_display_url": "",
-       "name": "images",
-       "title": "Images"
-     },
-     {
-       "description": "",
        "display_name": "Object Classification",
        "id": "06361be7-5f3f-4e76-a104-f4f908fa6d91",
        "image_display_url": "",
        "name": "object-classification",
        "title": "Object Classification"
      },
      {
        "description": "",
        "display_name": "Object Detection",
        "id": "ca2cb1af-d31c-49b0-a1dd-62b22f2b9e20",
        "image_display_url": "",
        "name": "object-detection",
        "title": "Object Detection"
-     },
-     {
-       "description": "",
-       "display_name": "Object Recognition",
-       "id": "1b471529-d821-46c3-8ac0-1ec99f0c80bc",
-       "image_display_url": "",
-       "name": "object-recognition",
-       "title": "Object Recognition"
      }
    ],
    "id": "9925760a-2207-4355-b865-d39e04db108a",
    "isopen": false,
    "landing_page": "https://www.image-net.org/",
    "license_title": null,
    "link_orkg": "",
    "metadata_created": "2024-11-25T14:18:05.427265",
-   "metadata_modified": "2024-12-02T17:41:31.200081",
+   "metadata_modified": "2024-12-02T17:54:54.029294",
    "name": "imagenet-dataset",
-   "notes": "The dataset used in this paper is a collection of images from the ImageNet dataset, preprocessed and used for training and evaluation of the proposed diffusion spectral entropy and diffusion spectral mutual information methods.",
+   "notes": "Object recognition is arguably the most important problem at the heart of computer vision. Recently, Barbu et al. introduced a dataset called ObjectNet which includes objects in daily life situations.",
-   "num_resources": 1,
+   "num_resources": 0,
-   "num_tags": 48,
+   "num_tags": 15,
    "organization": {
      "approval_status": "approved",
      "created": "2024-11-25T12:11:38.292601",
      "description": "",
      "id": "079d46db-32df-4b48-91f3-0a8bc8f69559",
      "image_url": "",
      "is_organization": true,
      "name": "no-organization",
      "state": "active",
      "title": "No Organization",
      "type": "organization"
    },
    "owner_org": "079d46db-32df-4b48-91f3-0a8bc8f69559",
    "private": false,
    "relationships_as_object": [],
    "relationships_as_subject": [],
-   "resources": [
-     {
-       "cache_last_updated": null,
-       "cache_url": null,
-       "created": "2024-12-02T18:38:42",
-       "data": [
-         "dcterms:title",
-         "dcterms:accessRights",
-         "dcterms:creator",
-         "dcterms:description",
-         "dcterms:issued",
-         "dcterms:language",
-         "dcterms:identifier",
-         "dcat:theme",
-         "dcterms:type",
-         "dcat:keyword",
-         "dcat:landingPage",
-         "dcterms:hasVersion",
-         "dcterms:format",
-         "mls:task",
-         "datacite:isDescribedBy"
-       ],
-       "description": "The json representation of the dataset with its distributions based on DCAT.",
-       "format": "JSON",
-       "hash": "",
-       "id": "2e880d25-7f48-45ab-a90e-d5f38de354ec",
-       "last_modified": "2024-12-02T17:41:31.186546",
-       "metadata_modified": "2024-12-02T17:41:31.203142",
-       "mimetype": "application/json",
-       "mimetype_inner": null,
-       "name": "Original Metadata",
-       "package_id": "9925760a-2207-4355-b865-d39e04db108a",
-       "position": 0,
-       "resource_type": null,
-       "size": 4003,
-       "state": "active",
-       "url": "resource/2e880d25-7f48-45ab-a90e-d5f38de354ec/download/metadata.json",
-       "url_type": "upload"
-     }
-   ],
+   "resources": [],
    "services_used_list": "",
    "state": "active",
    "tags": [
      {
-       "display_name": "1000 Classes",
-       "id": "95de9dfc-4461-489b-8338-decd10049200",
-       "name": "1000 Classes",
-       "state": "active",
-       "vocabulary_id": null
-     },
-     {
-       "display_name": "AlexNet",
-       "id": "173477a1-4fb2-4b96-8e60-fade73baccb3",
-       "name": "AlexNet",
-       "state": "active",
-       "vocabulary_id": null
-     },
-     {
-       "display_name": "CNN",
-       "id": "66ec0da0-d205-4a39-90c6-bae7e7b1cbd6",
-       "name": "CNN",
-       "state": "active",
-       "vocabulary_id": null
-     },
-     {
-       "display_name": "CNNs",
-       "id": "6ae3de49-c894-4ca3-b91f-0995e7586704",
-       "name": "CNNs",
-       "state": "active",
-       "vocabulary_id": null
-     },
-     {
        "display_name": "Computer Vision",
        "id": "77b96eda-8a43-406f-9c54-d87b14f3f63e",
        "name": "Computer Vision",
        "state": "active",
        "vocabulary_id": null
      },
      {
-       "display_name": "Computer vision",
-       "id": "4c7f3390-b5b7-4476-ba59-f5551656e121",
-       "name": "Computer vision",
-       "state": "active",
-       "vocabulary_id": null
-     },
-     {
-       "display_name": "Deep Learning",
-       "id": "3feb7b21-e049-4dca-9372-0d438c483f6a",
-       "name": "Deep Learning",
-       "state": "active",
-       "vocabulary_id": null
-     },
-     {
-       "display_name": "Deep learning",
-       "id": "a8638dc7-b339-4a0b-8daa-ef73a49e2688",
-       "name": "Deep learning",
-       "state": "active",
-       "vocabulary_id": null
-     },
-     {
-       "display_name": "Generative Adversarial Networks",
-       "id": "b384af43-f86b-489d-a8d4-9595f25d6e95",
-       "name": "Generative Adversarial Networks",
-       "state": "active",
-       "vocabulary_id": null
-     },
-     {
-       "display_name": "ILSVRC",
-       "id": "e2c79b63-9d46-47bf-9c96-a5ffd793f58d",
-       "name": "ILSVRC",
-       "state": "active",
-       "vocabulary_id": null
-     },
-     {
        "display_name": "Image Classification",
        "id": "418e2ddf-a1d3-42ac-ad05-156f79ca8e22",
        "name": "Image Classification",
        "state": "active",
        "vocabulary_id": null
      },
      {
-       "display_name": "Image Compression",
-       "id": "9319a62c-672c-4d44-a6ef-7d18ec5ba2b7",
-       "name": "Image Compression",
+       "display_name": "Image Dataset",
+       "id": "51aed645-6dd9-4e08-894a-10944ecefd8b",
+       "name": "Image Dataset",
        "state": "active",
        "vocabulary_id": null
      },
      {
        "display_name": "Image Recognition",
        "id": "2cc216cd-24af-483e-8e50-6fcd86cf89ac",
        "name": "Image Recognition",
        "state": "active",
        "vocabulary_id": null
      },
      {
-       "display_name": "Image Resolution",
-       "id": "cf4be6df-c6bc-4ce7-8f00-c16d36823e5c",
-       "name": "Image Resolution",
-       "state": "active",
-       "vocabulary_id": null
-     },
-     {
-       "display_name": "Image classification",
-       "id": "786db19f-59bd-48ee-96c8-dfff6df61746",
-       "name": "Image classification",
-       "state": "active",
-       "vocabulary_id": null
-     },
-     {
        "display_name": "ImageNet",
        "id": "114653a3-d688-42fb-8e76-350752af988b",
        "name": "ImageNet",
        "state": "active",
        "vocabulary_id": null
      },
      {
-       "display_name": "ImageNet dataset",
-       "id": "17b2585a-d35d-4520-a5f3-b6d58e73932a",
-       "name": "ImageNet dataset",
-       "state": "active",
-       "vocabulary_id": null
-     },
-     {
-       "display_name": "Images",
-       "id": "b653d5cb-88da-4373-ae34-3b0b41d37fad",
-       "name": "Images",
+       "display_name": "ImageNet Dataset",
+       "id": "c25f25f6-daad-404d-b599-ae4470044e40",
+       "name": "ImageNet Dataset",
        "state": "active",
        "vocabulary_id": null
      },
      {
        "display_name": "Large Scale",
        "id": "9d0a7af8-406e-4d7d-b558-ac2b45093bbf",
        "name": "Large Scale",
        "state": "active",
        "vocabulary_id": null
      },
      {
-       "display_name": "Large-scale Dataset",
-       "id": "a5016f51-6dbd-4a95-be00-213c41da86b5",
-       "name": "Large-scale Dataset",
-       "state": "active",
-       "vocabulary_id": null
-     },
-     {
-       "display_name": "Large-scale dataset",
-       "id": "868f7828-1cf2-4bed-ba67-61c621bf1aa5",
-       "name": "Large-scale dataset",
-       "state": "active",
-       "vocabulary_id": null
-     },
-     {
-       "display_name": "Natural images",
-       "id": "a88c38fb-4e73-4220-80fb-39fd3f89178d",
-       "name": "Natural images",
-       "state": "active",
-       "vocabulary_id": null
-     },
-     {
-       "display_name": "Neural Networks",
-       "id": "b8e60d98-1c66-40d1-b944-74216c2bd378",
-       "name": "Neural Networks",
-       "state": "active",
-       "vocabulary_id": null
-     },
-     {
-       "display_name": "Object Classification",
-       "id": "b04e5bd7-f108-4123-8c16-68c67ae2fe31",
-       "name": "Object Classification",
-       "state": "active",
-       "vocabulary_id": null
-     },
-     {
-       "display_name": "Object Classi\ufb01cation",
-       "id": "fa16e756-2685-4dc4-8d0b-42f1f0d7676b",
-       "name": "Object Classi\ufb01cation",
-       "state": "active",
-       "vocabulary_id": null
-     },
-     {
        "display_name": "Object Detection",
        "id": "44adc011-570b-46cf-9a65-ab72ca690477",
        "name": "Object Detection",
        "state": "active",
        "vocabulary_id": null
      },
      {
-       "display_name": "Object detection",
-       "id": "84a57b7d-e522-4fc2-9f65-9aeb121659f1",
-       "name": "Object detection",
-       "state": "active",
-       "vocabulary_id": null
-     },
-     {
-       "display_name": "Object recognition",
-       "id": "52052816-97ee-4de1-bfb9-2097add55ede",
-       "name": "Object recognition",
+       "display_name": "Object Recognition",
+       "id": "6a4e0b0a-637f-41ee-a647-8af2e035b203",
+       "name": "Object Recognition",
        "state": "active",
        "vocabulary_id": null
      },
      {
-       "display_name": "VGG-16",
-       "id": "4320f109-14d0-4cbb-8f35-de935be8f2b0",
-       "name": "VGG-16",
-       "state": "active",
-       "vocabulary_id": null
-     },
-     {
-       "display_name": "Variations",
-       "id": "d0980308-eeb4-4b2d-aead-f29730b8e2ec",
-       "name": "Variations",
-       "state": "active",
-       "vocabulary_id": null
-     },
-     {
-       "display_name": "adversarial attacks",
-       "id": "7a52265e-3a4a-433a-acd2-466ff2bef434",
-       "name": "adversarial attacks",
-       "state": "active",
-       "vocabulary_id": null
-     },
-     {
-       "display_name": "classification",
-       "id": "30b96a8e-ded8-465e-9a77-e4163624d903",
-       "name": "classification",
-       "state": "active",
-       "vocabulary_id": null
-     },
-     {
-       "display_name": "computer vision",
-       "id": "f650b4e3-9955-49b0-ba7b-2d302a990978",
-       "name": "computer vision",
-       "state": "active",
-       "vocabulary_id": null
-     },
-     {
-       "display_name": "convolutional neural networks",
-       "id": "84ee1255-0c3e-4697-8c35-a177efdf4b6d",
-       "name": "convolutional neural networks",
+       "display_name": "WordNet",
+       "id": "0767e69a-00de-42d9-a760-b01cdf190fdf",
+       "name": "WordNet",
        "state": "active",
        "vocabulary_id": null
      },
      {
        "display_name": "deep learning",
        "id": "19e41883-3799-4184-9e0e-26c95795b119",
        "name": "deep learning",
        "state": "active",
        "vocabulary_id": null
      },
      {
-       "display_name": "generative models",
-       "id": "35aa2ac6-6fb6-496c-80a4-55b6f720d205",
-       "name": "generative models",
-       "state": "active",
-       "vocabulary_id": null
-     },
-     {
        "display_name": "image classification",
        "id": "34936550-ce1a-41b5-8c58-23081a6c673d",
        "name": "image classification",
        "state": "active",
        "vocabulary_id": null
      },
      {
-       "display_name": "image dataset",
-       "id": "d3acafab-ad07-46a1-88d5-540c2fd41466",
-       "name": "image dataset",
-       "state": "active",
-       "vocabulary_id": null
-     },
-     {
        "display_name": "image recognition",
        "id": "5dbee310-ce23-4ee3-b86b-0ad4c203b2a6",
        "name": "image recognition",
        "state": "active",
        "vocabulary_id": null
      },
      {
-       "display_name": "images",
-       "id": "40152090-cbbf-4339-b7d3-f14b68cb7621",
-       "name": "images",
-       "state": "active",
-       "vocabulary_id": null
-     },
-     {
-       "display_name": "large-scale",
-       "id": "206c1169-4356-4a08-aeaf-b83f56de3222",
-       "name": "large-scale",
-       "state": "active",
-       "vocabulary_id": null
-     },
-     {
-       "display_name": "large-scale dataset",
-       "id": "a9c694bf-f591-4625-a20e-d53d3f90d489",
-       "name": "large-scale dataset",
-       "state": "active",
-       "vocabulary_id": null
-     },
-     {
-       "display_name": "large-scale image database",
-       "id": "404f29b3-cd6b-4fa4-9001-c5642034f950",
-       "name": "large-scale image database",
-       "state": "active",
-       "vocabulary_id": null
-     },
-     {
-       "display_name": "large-scale image dataset",
-       "id": "96249c22-932e-482d-869d-342864969486",
-       "name": "large-scale image dataset",
-       "state": "active",
-       "vocabulary_id": null
-     },
-     {
        "display_name": "object classification",
        "id": "25b11f7f-115e-4d5b-b294-abacfd7e53c0",
        "name": "object classification",
        "state": "active",
        "vocabulary_id": null
      },
      {
-       "display_name": "object labels",
-       "id": "29d44177-e84b-44b3-a00a-47da96585319",
-       "name": "object labels",
-       "state": "active",
-       "vocabulary_id": null
-     },
-     {
-       "display_name": "object recognition",
-       "id": "cbac3f3d-b235-44b0-8bfb-30bf00e44b8c",
-       "name": "object recognition",
+       "display_name": "object detection",
+       "id": "607283c7-9e12-4167-9101-7f8078fb6537",
+       "name": "object detection",
        "state": "active",
        "vocabulary_id": null
-     },
-     {
-       "display_name": "thousand object categories",
-       "id": "ef1aa66f-2e13-4c2a-8693-adf225cd77e9",
-       "name": "thousand object categories",
-       "state": "active",
-       "vocabulary_id": null
      }
    ],
-   "title": "ImageNet dataset",
+   "title": "ImageNet Dataset",
    "type": "dataset",
    "version": ""
  }
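As a quick consistency check on the resulting record, the derived counters ("num_tags", "num_resources") must match the embedded lists. A minimal sketch (not part of the service; the record below is a trimmed version of the post-change metadata, with field names following the CKAN package schema shown above):

```python
# Cross-check a CKAN-style dataset record's counters against its lists.
def check_counters(record: dict) -> list[str]:
    """Return a list of consistency problems (empty means consistent)."""
    problems = []
    if record["num_tags"] != len(record["tags"]):
        problems.append(f"num_tags={record['num_tags']} but {len(record['tags'])} tags")
    if record["num_resources"] != len(record["resources"]):
        problems.append(
            f"num_resources={record['num_resources']} but {len(record['resources'])} resources"
        )
    return problems

# Trimmed post-change record: 15 tags, no resources, as in the diff above.
record = {
    "name": "imagenet-dataset",
    "title": "ImageNet Dataset",
    "num_resources": 0,
    "resources": [],
    "num_tags": 15,
    "tags": [{"name": n} for n in [
        "Computer Vision", "Image Classification", "Image Dataset",
        "Image Recognition", "ImageNet", "ImageNet Dataset", "Large Scale",
        "Object Detection", "Object Recognition", "WordNet", "deep learning",
        "image classification", "image recognition", "object classification",
        "object detection",
    ]],
}

print(check_counters(record))  # -> []
```

For this revision the check passes: the new "num_tags" of 15 and "num_resources" of 0 agree with the tag list and the emptied resources list.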