Algorithm 1 loop #6

Open

AndreydeAguiarSalvi opened this issue Oct 24, 2020 · 0 comments

AndreydeAguiarSalvi commented Oct 24, 2020

Hello, I have some questions about your code.

  • I could not understand where the loop from Algorithm 1 of the paper is in the code: is it the multishot.py loop at line 53, or the prune_loop function? Both loops compute a sparsity exponentiated by a term of the form k/n.

  • My goal is to remove 90% of the parameters from my custom network with SynFlow. What should the values of compression_list, level_list, and prune_epochs be? pre_epochs should be 0, right? I ran a test that just prints the sparsity values from multishot.py, using compression_list = [1] (assuming that 10^1 in your notation equals 90%), level_list = [20], and prune_epochs = 1. Judging by the sparsity values, it seems this would remove more than 90% (by the third iteration it had already passed that mark). My test and its output are below:

```python
schedule = "exponential"  # default
compression_list = [1]    # 10^1, i.e. 90%
level_list = [20]         # default = []
prune_epochs = 1          # default

W = 1000

for compression in compression_list:
    for level in level_list:
        for l in range(level):
            # Prune model
            sparsity = (10**(-float(compression)))**((l + 1) / level)

            # Prune model (summarized prune_loop function)
            for epoch in range(prune_epochs):
                if schedule == "exponential":
                    sparse = sparsity**((epoch + 1) / prune_epochs)
                elif schedule == "linear":
                    sparse = 1.0 - (1.0 - sparsity)*((epoch + 1) / prune_epochs)

                prune = round(W * sparse)
                W_ = W - prune
                print("W: {:.0f}\tSparsity: {:.2f}\tRemoving: {:.0f}\tRemaining: {:.0f}".format(W, sparse, prune, W_))
                W = W_
```

```
W: 1000	Sparsity: 0.89	Removing: 891	Remaining: 109
W: 109	Sparsity: 0.79	Removing: 87	Remaining: 22
W: 22	Sparsity: 0.71	Removing: 16	Remaining: 6
W: 6	Sparsity: 0.63	Removing: 4	Remaining: 2
W: 2	Sparsity: 0.56	Removing: 1	Remaining: 1
W: 1	Sparsity: 0.50	Removing: 1	Remaining: 0
W: 0	Sparsity: 0.45	Removing: 0	Remaining: 0
W: 0	Sparsity: 0.40	Removing: 0	Remaining: 0
W: 0	Sparsity: 0.35	Removing: 0	Remaining: 0
W: 0	Sparsity: 0.32	Removing: 0	Remaining: 0
W: 0	Sparsity: 0.28	Removing: 0	Remaining: 0
W: 0	Sparsity: 0.25	Removing: 0	Remaining: 0
W: 0	Sparsity: 0.22	Removing: 0	Remaining: 0
W: 0	Sparsity: 0.20	Removing: 0	Remaining: 0
W: 0	Sparsity: 0.18	Removing: 0	Remaining: 0
W: 0	Sparsity: 0.16	Removing: 0	Remaining: 0
W: 0	Sparsity: 0.14	Removing: 0	Remaining: 0
W: 0	Sparsity: 0.13	Removing: 0	Remaining: 0
W: 0	Sparsity: 0.11	Removing: 0	Remaining: 0
W: 0	Sparsity: 0.10	Removing: 0	Remaining: 0
```
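For comparison, here is the same schedule under a different reading that I am only assuming (I have not verified this against the code): if sparsity is the fraction of the original parameter count that remains, so that it is applied to the initial W at every level rather than compounding on the already-pruned count, the last level lands exactly on 90% removed:

```python
# Alternative reading (my assumption): sparsity is the fraction of the
# ORIGINAL parameters that remain, so it is applied to W0 at each level
# instead of compounding on the already-pruned count.
compression = 1   # 10^-1 -> 10% of the original weights remain
level = 20
W0 = 1000         # toy original parameter count

for l in range(level):
    sparsity = (10 ** (-float(compression))) ** ((l + 1) / level)
    remaining = round(W0 * sparsity)   # relative to W0, not the current W
    print("Level: {:2d}\tSparsity: {:.2f}\tRemaining: {:.0f}".format(l + 1, sparsity, remaining))
# last level: sparsity = 0.10, remaining = 100, i.e. exactly 90% removed
```

Under this reading the 20 levels are just intermediate checkpoints on the way to the final 10^-compression target, rather than 20 successive prunings.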
