[Feature]: Support for Higher than 64 LoRA Ranks #3934

@MrzEsma

Description

🚀 The feature, motivation and pitch

Hello,

I was delighted to see the implementation of the multi-LoRA feature and would like to express my gratitude and appreciation for your efforts. However, the LoRA adapters we have developed use r=128 and r=256, and they currently do not work, resulting in the following error:

ValueError: max_lora_rank (128) must be one of (8, 16, 32, 64).

I am curious whether there are any plans to support higher ranks. Is this on the priority list? It is quite crucial for us, and we would greatly appreciate any development in this area.

Thank you.
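For context, a minimal sketch of how the limit surfaces with the offline `LLM` entry point, following the documented multi-LoRA usage; the base model name and adapter path below are placeholders:

```python
# Minimal sketch of hitting the rank limit; model and adapter paths are placeholders.
from vllm import LLM
from vllm.lora.request import LoRARequest

# max_lora_rank must cover the largest rank among the adapters to be loaded.
# With an r=128 adapter, constructing the engine currently raises:
#   ValueError: max_lora_rank (128) must be one of (8, 16, 32, 64).
llm = LLM(
    model="meta-llama/Llama-2-7b-hf",  # placeholder base model
    enable_lora=True,
    max_lora_rank=128,
)

# Intended usage once higher ranks are supported: attach the adapter per request.
outputs = llm.generate(
    ["Hello, world!"],
    lora_request=LoRARequest("my-adapter", 1, "/path/to/r128-adapter"),  # placeholder path
)
```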

Alternatives

No response

Additional context

No response

Labels: feature request (New feature or request), stale (Over 90 days of inactivity)
