
Commit d39e7f9
horheynm committed Jun 17, 2024
1 parent eb6ad2b
Showing 2 changed files with 3 additions and 0 deletions.
@@ -89,6 +89,7 @@ def fasterprune(
Run pruning and quantization (if applicable) on the layer up to the target
sparsity value.
:param actorder: Flag to apply activation reordering
:param blocksize: Number of columns to compress in one pass
:param percdamp: Amount of dampening to apply to H, as a fraction of the
diagonal norm
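As context for these parameters, here is a minimal, self-contained sketch (not this commit's code, and not sparseml's actual implementation) of how percdamp and actorder typically enter SparseGPT-style pruning: percdamp adds a fraction of the mean Hessian diagonal before inversion for numerical stability, and actorder permutes columns by descending Hessian diagonal. blocksize (not shown) would bound how many columns are pruned per pass. The helper name and the toy Hessian below are assumptions for illustration.

import torch


def sketch_fasterprune_setup(H: torch.Tensor, percdamp: float, actorder: bool):
    # Hypothetical helper, for illustration only.
    ncols = H.shape[0]
    diag = torch.arange(ncols)

    # Dampen H by a fraction of its mean diagonal so the inverse used for
    # error compensation stays numerically stable.
    H[diag, diag] += percdamp * torch.mean(torch.diag(H))

    # Activation reordering: visit columns in order of decreasing Hessian
    # diagonal (roughly, decreasing input-activation energy).
    perm = None
    if actorder:
        perm = torch.argsort(torch.diag(H), descending=True)
        H = H[perm][:, perm]

    return H, perm


# Toy usage with a small symmetric positive semi-definite "Hessian".
X = torch.randn(8, 4)
H_damped, perm = sketch_fasterprune_setup(X.T @ X, percdamp=0.01, actorder=True)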
src/sparseml/modifiers/utils/layer_compressor.py (2 additions, 0 deletions)
@@ -134,6 +134,8 @@ def revert_layer_wrappers(self):
def compress(self, actorder: bool = False):
"""
Apply compression to each wrapped submodule in the layer
:param actorder: Flag to apply activation reordering
"""

@torch.no_grad()
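To show how the new keyword flows through a layer compressor, here is a self-contained mock of the compress() pattern (all class and attribute names below are stand-ins, not sparseml's API): compress iterates the wrapped submodules and forwards actorder to each wrapper's pruning routine.

from typing import Dict


class FakeWrapper:
    """Stand-in for a module wrapper exposing a fasterprune-style method."""

    def __init__(self, name: str):
        self.name = name

    def fasterprune(self, actorder: bool = False) -> None:
        # A real wrapper would prune weights here; we just trace the call.
        print(f"pruning {self.name} (actorder={actorder})")


class FakeLayerCompressor:
    """Stand-in for a layer compressor holding one wrapper per submodule."""

    def __init__(self, wrappers: Dict[str, FakeWrapper]):
        self.wrappers = wrappers

    def compress(self, actorder: bool = False) -> None:
        # Forward the new flag to every wrapped submodule, mirroring the
        # signature added in this commit.
        for wrapper in self.wrappers.values():
            wrapper.fasterprune(actorder=actorder)


FakeLayerCompressor({"q_proj": FakeWrapper("q_proj")}).compress(actorder=True)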
