
nn_pruning doesn't seem to work for T5 Models, Roberta-based Models #36

Open
ghost opened this issue Mar 17, 2022 · 3 comments

Comments


ghost commented Mar 17, 2022

Hi @madlag @julien-c @co42 @srush @Narsil

I am trying to use nn_pruning to prune different transformer models.

Code:

```python
model_checkpoint = "t5-small"
t5small_model = AutoModelForSeq2SeqLM.from_pretrained(model_checkpoint).to(device)
mpc.patch_model(t5small_model)

t5small_model.save_pretrained("models/patched")
```

Error:

```
---------------------------------------------------------------------------
AssertionError                            Traceback (most recent call last)
<ipython-input-47-602943fc51a1> in <module>()
      1
      2 t5small_model = AutoModelForSeq2SeqLM.from_pretrained(model_checkpoint).to(device)
----> 3 mpc.patch_model(t5small_model)
      4
      5 t5small_model.save_pretrained("models/patched")

/usr/local/lib/python3.7/dist-packages/nn_pruning/patch_coordinator.py in patch_model(self, model, trial)
    640             patched_count += 2 * layers_count
    641
--> 642         assert (patcher.stats["patched"] == patched_count)
    643
    644         if layer_norm_patch:

AssertionError:
```

[Colab](https://colab.research.google.com/drive/1Gz7rozG8NbeBtsiWXjGNQ5wnVU7SE_Wl?usp=sharing)
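For context, a minimal sketch of why the assertion at `patch_coordinator.py` line 642 can fire. This is NOT nn_pruning's actual matching code; the patterns below are illustrative stand-ins. The patch coordinator counts how many linear layers it patched against an expected total derived from the layer count; it targets BERT-style module names such as `query`/`key`/`value`, while T5 names its attention projections `q`/`k`/`v`, so nothing matches and the patched count falls short of the expected total.

```python
import re

# Illustrative patterns only -- not the library's real configuration.
BERT_PATTERNS = [r"\.query$", r"\.key$", r"\.value$"]

# Module names as they appear in BERT vs. T5 (from transformers' named_modules()).
bert_names = [
    "encoder.layer.0.attention.self.query",
    "encoder.layer.0.attention.self.key",
    "encoder.layer.0.attention.self.value",
]
t5_names = [
    "encoder.block.0.layer.0.SelfAttention.q",
    "encoder.block.0.layer.0.SelfAttention.k",
    "encoder.block.0.layer.0.SelfAttention.v",
]

def patched_count(names, patterns):
    """Count module names matched by any pattern (stand-in for patcher.stats['patched'])."""
    return sum(any(re.search(p, n) for p in patterns) for n in names)

print(patched_count(bert_names, BERT_PATTERNS))  # 3: all attention projections matched
print(patched_count(t5_names, BERT_PATTERNS))    # 0: T5 names don't match, so the assertion fails
```

If this is the cause, the fix would be teaching the coordinator the T5 module names rather than changing the assertion.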

@robotsp

robotsp commented Mar 8, 2023

> (quotes the original issue verbatim)

I have the same problem, did you fix it? @shubham-krishna

@zixuli123

I have the same problem, did you fix it? @robotsp @ghost

@Narsil

Narsil commented Jun 7, 2023

Sorry, I don't know; I can't really help.
