Document detail
IDENTIFICATION

oai:arXiv.org:2409.05368

Topic
Computer Science - Computation and...
Author
Rai, Rohit Raj; Borah, Angana; Awekar, Amit
Category

Computer Science

Year

2024

Listing date

11/9/2024

Keywords
target data specific application compressed compression models

Abstract

Large Deep Learning models are compressed and deployed for specific applications. However, current Deep Learning model compression methods do not utilize the information about the target application. As a result, the compressed models are application agnostic. Our goal is to customize the model compression process to create a compressed model that will perform better for the target application. Our method, Application Specific Compression (ASC), identifies and prunes components of the large Deep Learning model that are redundant specifically for the given target application. The intuition of our work is to prune the parts of the network that do not contribute significantly to updating the data representation for the given application. We have experimented with the BERT family of models for three applications: Extractive QA, Natural Language Inference, and Paraphrase Identification. We observe that customized compressed models created using the ASC method perform better than compressed models produced by existing model compression methods and off-the-shelf compressed models.
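
The abstract only states the pruning intuition and this record includes no code. As a rough, non-authoritative illustration of that intuition (assuming the Hugging Face transformers and PyTorch libraries, a bert-base-uncased checkpoint, a toy sample of application data, and a cosine-similarity criterion, none of which are specified here), the sketch below ranks BERT encoder layers by how little they change the hidden representation on application-specific inputs and marks the least-contributing layers as pruning candidates.

# Hypothetical sketch, not the authors' exact ASC procedure: rank BERT
# encoder layers by how much they update the hidden representation on
# application-specific data. Model name, sample data, similarity measure,
# and the number of candidates are illustrative assumptions.
import torch
from transformers import AutoModel, AutoTokenizer

model_name = "bert-base-uncased"            # assumed base model
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name, output_hidden_states=True)
model.eval()

app_sentences = [                            # stand-in for target-application data
    "Who wrote the novel?",
    "The contract was signed in 2019.",
]

with torch.no_grad():
    batch = tokenizer(app_sentences, padding=True, return_tensors="pt")
    hidden = model(**batch).hidden_states    # tuple: embeddings + one tensor per layer

# For each encoder layer, measure how similar its output is to its input:
# a mean cosine similarity close to 1 means the layer barely updates the
# representation for this application.
scores = []
for i in range(1, len(hidden)):
    sim = torch.nn.functional.cosine_similarity(
        hidden[i - 1], hidden[i], dim=-1
    ).mean().item()
    scores.append((i, sim))

# Layers with the highest input/output similarity contribute least and are
# candidates for application-specific pruning.
candidates = sorted(scores, key=lambda x: -x[1])[:3]
print("Pruning candidates (layer, similarity):", candidates)

A layer whose output is nearly identical to its input contributes little to updating the representation for that application, which mirrors the intuition stated above; the actual ASC criterion and the granularity of the pruned components are defined in the paper itself.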

Comment: Accepted in the Proceedings of the 8th Joint International Conference on Data Science & Management of Data (12th ACM IKDD CODS and 30th COMAD) for the Short Research Paper track, 5 pages

Rai, Rohit Raj; Borah, Angana; Awekar, Amit, 2024, Application Specific Compression of Deep Learning Models
