
Conversation

@Vaibhavs10 (Contributor) commented Aug 1, 2023

Hi there,

As observed earlier, the most recent update to transformers has broken the load_state_dict call in CLAP.
This patch fixes that error by removing the position_ids key from the text branch's state dict. The load errors out because of the recent change in transformers huggingface/transformers#24505 :)

This change fixes it. Feel free to let me know if you need any clarification. 🤗

cc: @lukewys 🙏
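
For anyone hitting the same load_state_dict mismatch, here is a minimal sketch of the kind of key filtering described above; the helper name and checkpoint key names are illustrative assumptions, not the exact code from this PR:

import torch

def drop_position_ids(state_dict):
    # Hypothetical helper (not the exact code in this PR): drop position_ids
    # buffers that newer transformers versions no longer expect in load_state_dict.
    return {k: v for k, v in state_dict.items() if "position_ids" not in k}

# Toy state dict with illustrative key names.
fake_sd = {
    "text_branch.embeddings.position_ids": torch.arange(512).unsqueeze(0),
    "text_branch.embeddings.word_embeddings.weight": torch.zeros(4, 8),
}
print(list(drop_position_ids(fake_sd).keys()))
# ['text_branch.embeddings.word_embeddings.weight']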

@lukewys (Contributor) commented Aug 3, 2023

Hi @Vaibhavs10! Many thanks for the PR! I would like to double-check:

Is this fix backward compatible? That is, do we still need position_ids when loading with earlier versions of transformers? If so, could you kindly add an if statement that checks the transformers version? Many thanks!

@ylacombe commented Oct 30, 2023

Hi @lukewys and @Vaibhavs10, any update on this? Many thanks 🤗

import re
from copy import deepcopy
from pathlib import Path
from packaging import version
@Vaibhavs10 (Contributor, Author) commented:

Since packaging is already a dependency of setuptools, we don't need to add it as a separate install requirement.

state_dict = {k[7:]: v for k, v in state_dict.items()}

# removing position_ids to maintain compatibility with latest transformers update
if version.parse(transformers.__version__) >= version.parse("4.31.0"):
@Vaibhavs10 (Contributor, Author) commented:

Explicit version check, since this change only applies to transformers 4.31.0 and later.
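
For reference, a minimal sketch of what such a version-gated filter could look like; the function name is hypothetical and this is not the exact hunk from the diff:

import transformers
from packaging import version

def strip_position_ids_if_needed(state_dict):
    # Hypothetical sketch: only strip position_ids when running against
    # transformers >= 4.31.0, where the buffer was removed upstream; older
    # versions still expect it, so the state dict is left untouched there.
    if version.parse(transformers.__version__) >= version.parse("4.31.0"):
        return {k: v for k, v in state_dict.items() if "position_ids" not in k}
    return state_dict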

@Vaibhavs10 (Contributor, Author) commented:

Hi @lukewys - this should fix it now. Here's how you can verify it (I made a quick Colab for you to test): https://github.com/Vaibhavs10/scratchpad/blob/main/laion_clap_fix_repro.ipynb

@lukewys (Contributor) commented Nov 8, 2023

Sorry for the late reply. That looks great! Many thanks @Vaibhavs10! I will quickly check with Ke and merge this PR ASAP!

@lukewys merged commit 0d57b47 into LAION-AI:main on Nov 8, 2023
@lukewys (Contributor) commented Nov 8, 2023

Done! Thanks again @Vaibhavs10!

@Vaibhavs10 deleted the fix_transformers_upd branch on November 24, 2023
@dakl commented Mar 22, 2024

@lukewys is there a new PyPI release coming with this fix? 1.1.5, for example.

@lukewys (Contributor) commented Mar 31, 2024

Oh, thanks for the reminder! Looping in @RetroCirce here!

@kamalojasv181 commented:

@lukewys @RetroCirce a polite reminder again. I am tired of downloading this package manually for all my repos and importing from the local files, since I cannot afford to downgrade transformers 😢

@RetroCirce (Contributor) commented May 1, 2024 via email

@kamalojasv181 commented:

Thanks so much for the response!

@RetroCirce (Contributor) commented:

New update: the 1.1.5 PyPI release is ready and we will push it in the next week.

@mcherep commented Jul 7, 2024

It'd be great to have the new release :)

@RetroCirce (Contributor) commented:

Sorry for the late update!
I just graduated from my PhD lol.
The new pip package of laion-clap (1.1.6) is ready and released: https://pypi.org/project/laion-clap/
We fixed some bugs, including support for the new transformers library.

@mcherep commented Jul 9, 2024

Congrats and thank you for the great work!

hykilpikonna added a commit to microsoft/fadtk that referenced this pull request Sep 5, 2024