Journal article | Open Access

PURSUhInT: In Search of Informative Hint Points Based on Layer Clustering for Knowledge Distillation

Keser, Reyhan Kevser; Ayanzadeh, Aydin; Aghdam, Omid Abdollahi; Kilcioglu, Caglar; Toreyin, Behcet Ugur; Ure, Nazim Kemal


DataCite XML

<?xml version='1.0' encoding='utf-8'?>
<resource xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns="http://datacite.org/schema/kernel-4" xsi:schemaLocation="http://datacite.org/schema/kernel-4 http://schema.datacite.org/meta/kernel-4.1/metadata.xsd">
  <identifier identifierType="URL">https://aperta.ulakbim.gov.tr/record/253867</identifier>
  <creators>
    <creator>
      <creatorName>Keser, Reyhan Kevser</creatorName>
      <givenName>Reyhan Kevser</givenName>
      <familyName>Keser</familyName>
      <affiliation>Istanbul Tech Univ, Informat Inst, Signal Proc Computat Intelligence Res Grp SP4CING, Istanbul, Turkiye</affiliation>
    </creator>
    <creator>
      <creatorName>Ayanzadeh, Aydin</creatorName>
      <givenName>Aydin</givenName>
      <familyName>Ayanzadeh</familyName>
      <affiliation>Univ Maryland, Dept Comp Sci &amp; Elect Engn, Baltimore, MD USA</affiliation>
    </creator>
    <creator>
      <creatorName>Aghdam, Omid Abdollahi</creatorName>
      <givenName>Omid Abdollahi</givenName>
      <familyName>Aghdam</familyName>
      <affiliation>Arcelik Res &amp; Dev, Istanbul, Turkiye</affiliation>
    </creator>
    <creator>
      <creatorName>Kilcioglu, Caglar</creatorName>
      <givenName>Caglar</givenName>
      <familyName>Kilcioglu</familyName>
      <affiliation>Arcelik Res &amp; Dev, Istanbul, Turkiye</affiliation>
    </creator>
    <creator>
      <creatorName>Toreyin, Behcet Ugur</creatorName>
      <givenName>Behcet Ugur</givenName>
      <familyName>Toreyin</familyName>
      <affiliation>Istanbul Tech Univ, Informat Inst, Signal Proc Computat Intelligence Res Grp SP4CING, Istanbul, Turkiye</affiliation>
    </creator>
    <creator>
      <creatorName>Ure, Nazim Kemal</creatorName>
      <givenName>Nazim Kemal</givenName>
      <familyName>Ure</familyName>
      <affiliation>Istanbul Tech Univ, Artificial Intelligence &amp; Data Sci Applicat &amp; Res, Istanbul, Turkiye</affiliation>
    </creator>
  </creators>
  <titles>
    <title>PURSUhInT: In Search of Informative Hint Points Based on Layer Clustering for Knowledge Distillation</title>
  </titles>
  <publisher>Aperta</publisher>
  <publicationYear>2023</publicationYear>
  <dates>
    <date dateType="Issued">2023-01-01</date>
  </dates>
  <resourceType resourceTypeGeneral="Text">Journal article</resourceType>
  <alternateIdentifiers>
    <alternateIdentifier alternateIdentifierType="url">https://aperta.ulakbim.gov.tr/record/253867</alternateIdentifier>
  </alternateIdentifiers>
  <relatedIdentifiers>
    <relatedIdentifier relatedIdentifierType="DOI" relationType="IsIdenticalTo">10.1016/j.eswa.2022.119040</relatedIdentifier>
  </relatedIdentifiers>
  <rightsList>
    <rights rightsURI="http://www.opendefinition.org/licenses/cc-by">Creative Commons Attribution</rights>
    <rights rightsURI="info:eu-repo/semantics/openAccess">Open Access</rights>
  </rightsList>
  <descriptions>
    <description descriptionType="Abstract">One of the most efficient methods for model compression is hint distillation, where the student model is injected with information (hints) from several different layers of the teacher model. Although the selection of hint points can drastically alter the compression performance, conventional distillation approaches overlook this fact and reuse the same hint points as in early studies. Therefore, we propose a clustering-based hint selection methodology, in which the layers of the teacher model are clustered with respect to several metrics and the cluster centers are used as the hint points. Once applied to a chosen teacher network, our method is applicable to any student network. The proposed approach is validated on the CIFAR-100 and ImageNet datasets, using various teacher-student pairs and numerous hint distillation methods. Our results show that the hint points selected by our algorithm result in superior compression performance compared to state-of-the-art knowledge distillation algorithms on the same student models and datasets.</description>
  </descriptions>
</resource>
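
The abstract above describes choosing hint points by clustering the teacher's layers and taking the cluster centers. The following is a minimal, hypothetical sketch of that idea, not the authors' implementation: it assumes k-means over per-layer descriptors, and the random vectors stand in for the paper's actual layer metrics (e.g. activation statistics collected on a probe batch).

Python sketch

# Clustering-based hint-point selection (illustrative sketch only).
# Each teacher layer is summarized by one descriptor vector; layers
# are clustered with k-means, and the layer nearest each cluster
# center is taken as a hint point.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Assumption: one descriptor per teacher layer, standing in for
# real per-layer metrics from the teacher network.
num_layers, feat_dim, num_hints = 16, 32, 4
layer_descriptors = rng.normal(size=(num_layers, feat_dim))

kmeans = KMeans(n_clusters=num_hints, n_init=10, random_state=0)
labels = kmeans.fit_predict(layer_descriptors)

# For each cluster, pick the member layer whose descriptor lies
# closest to the cluster center; those layer indices are the hints.
hint_points = []
for c in range(num_hints):
    members = np.flatnonzero(labels == c)
    dists = np.linalg.norm(
        layer_descriptors[members] - kmeans.cluster_centers_[c], axis=1
    )
    hint_points.append(int(members[dists.argmin()]))

print(sorted(hint_points))  # teacher layer indices to distill hints from

Because the clustering runs only on the teacher's layers, the selected hint points can be reused for any student network, which matches the abstract's claim that the method is student-agnostic once applied to a chosen teacher.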