[ACL25] Code for paper "ALPS: Attention Localization and Pruning Strategy for Efficient Alignment of Large Language Models"

VoiceBeer/ALPS

ALPS

Repository Status: Under Construction
We are actively organizing the codebase and reproduction scripts; the code will be available here soon!

About ALPS

ALPS (Attention Localization and Pruning Strategy) is a parameter-efficient fine-tuning (PEFT) method for efficiently aligning large language models: it identifies attention heads that are less relevant to the downstream task and prunes them, reducing computational and memory overhead while maintaining or even improving model performance. The full paper and reference implementation will be linked here upon release.
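Since the reference implementation is not yet released, here is a minimal sketch of the general idea of attention-head localization and pruning. The importance metric below (mean L2 norm of each head's output) and the `keep_ratio` parameter are illustrative assumptions, not the scoring strategy from the paper:

```python
import numpy as np

def head_importance(head_outputs):
    """Proxy importance score per head: mean L2 norm of its output.

    head_outputs: array of shape (num_heads, seq_len, head_dim).
    NOTE: this metric is an assumption for illustration; ALPS defines
    its own localization criterion in the paper.
    """
    return np.linalg.norm(head_outputs, axis=-1).mean(axis=-1)

def prune_heads(head_outputs, keep_ratio=0.5):
    """Keep the top-scoring fraction of heads; zero out the rest."""
    scores = head_importance(head_outputs)
    num_keep = max(1, int(len(scores) * keep_ratio))
    keep = np.argsort(scores)[::-1][:num_keep]  # indices of best heads
    mask = np.zeros(len(scores), dtype=bool)
    mask[keep] = True
    # Pruned heads contribute nothing to the attention output.
    pruned = head_outputs * mask[:, None, None]
    return pruned, mask

# Toy example: 8 heads, sequence length 16, head dimension 64.
rng = np.random.default_rng(0)
outs = rng.normal(size=(8, 16, 64))
pruned, mask = prune_heads(outs, keep_ratio=0.25)
print(mask.sum())  # number of heads retained
```

In a real model, the retained heads would then be the only ones updated (or kept) during alignment fine-tuning; consult the paper for the actual selection criterion.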

Code and detailed instructions are on the way!
