Closed as not planned
Labels: feature request (new feature or request), stale (over 90 days of inactivity)
Description
🚀 The feature, motivation and pitch
Hello,
I was delighted to see the multi-LoRA feature land, and I would like to express my gratitude and appreciation for your efforts. However, the LoRA adapters we have trained use r=128 and r=256, and they currently do not work for me, resulting in the following error:
ValueError: max_lora_rank (128) must be one of (8, 16, 32, 64).
I am curious whether there are any plans to support higher ranks, and whether this is on the priority list. It is quite important for us, and we would greatly appreciate any development in this area.
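For reference, here is a minimal sketch of how the limit surfaces with the Python API. The base model name and adapter setup are placeholders on my side, and `enable_lora` is assumed to be the matching engine argument; only `max_lora_rank` and the quoted error come from the actual failure above.

```python
# Minimal repro sketch: requesting a LoRA rank above the supported set
# is rejected at engine construction time. Model name is a placeholder.
from vllm import LLM

try:
    llm = LLM(
        model="meta-llama/Llama-2-7b-hf",  # placeholder base model
        enable_lora=True,                  # assumed flag to enable multi-LoRA serving
        max_lora_rank=128,                 # our adapters use r=128 and r=256
    )
except ValueError as e:
    # Currently prints:
    #   max_lora_rank (128) must be one of (8, 16, 32, 64).
    print(e)
```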
Thank you.
Alternatives
No response
Additional context
No response