Document detail
ID

oai:arXiv.org:2409.05368

Topic
Computer Science - Computation and Language
Author
Rai, Rohit Raj; Borah, Angana; Awekar, Amit
Category

Computer Science

Year

2024

Listing date

9/11/2024

Keywords
target, data, specific, application, compressed, compression, models
Abstract

Large Deep Learning models are compressed and deployed for specific applications.

However, current Deep Learning model compression methods do not utilize information about the target application.

As a result, the compressed models are application-agnostic.

Our goal is to customize the model compression process to create a compressed model that will perform better for the target application.

Our method, Application Specific Compression (ASC), identifies and prunes components of the large Deep Learning model that are redundant specifically for the given target application.

The intuition behind our work is to prune the parts of the network that do not contribute significantly to updating the data representation for the given application.
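The abstract does not spell out the pruning criterion; the sketch below is one way to make that intuition concrete, scoring how much each BERT layer changes the token representations on application-specific data. It is a minimal illustration, not the authors' ASC procedure: the checkpoint name, sample sentences, and cosine-similarity scoring rule are all assumptions.

```python
import torch
from transformers import AutoModel, AutoTokenizer

# Illustrative assumptions: the checkpoint, sample sentences, and the
# cosine-similarity score below are NOT taken from the paper.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased", output_hidden_states=True)
model.eval()

# Stand-in for target-application data (e.g., NLI inputs).
app_samples = [
    "A man inspects the uniform of a figure.",
    "An older and a younger man are smiling.",
]

with torch.no_grad():
    batch = tokenizer(app_samples, padding=True, truncation=True, return_tensors="pt")
    # hidden_states is a tuple: the embedding output plus one tensor per layer.
    hidden = model(**batch).hidden_states

mask = batch["attention_mask"].bool()
for i in range(1, len(hidden)):
    # Per-token similarity between a layer's input and output; values near
    # 1.0 mean the layer barely updates the representation on this data,
    # making it a candidate for application-specific pruning.
    sim = torch.nn.functional.cosine_similarity(hidden[i - 1], hidden[i], dim=-1)
    print(f"layer {i:2d}: mean input/output similarity = {sim[mask].mean().item():.3f}")
```

Under this reading, layers whose mean similarity stays near 1.0 on the target application's data would be pruned first; the paper's actual selection rule may differ.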

We have experimented with the BERT family of models for three applications: Extractive QA, Natural Language Inference, and Paraphrase Identification.

We observe that customized compressed models created using the ASC method outperform models produced by existing compression methods, as well as off-the-shelf compressed models.

Comment: Accepted in the Proceedings of the 8th Joint International Conference on Data Science & Management of Data (12th ACM IKDD CODS and 30th COMAD) for the Short Research Paper track, 5 pages

Rai, Rohit Raj; Borah, Angana; Awekar, Amit, 2024, Application Specific Compression of Deep Learning Models
