Conversation

@aobolensk aobolensk commented Sep 25, 2025

Changes

  • Extract the common part from the backend-specific backend_parameters.py files and move it from the Torch FX and OpenVINO backends to the common advanced_parameters.py

Related tickets

Closes Issue #3668

@aobolensk aobolensk requested a review from a team as a code owner September 25, 2025 15:28
@aobolensk aobolensk force-pushed the 3668 branch 2 times, most recently from 48cd605 to d1a1ca6 on September 25, 2025 15:44
@aobolensk aobolensk force-pushed the 3668 branch 2 times, most recently from a390263 to 04cf656 on October 1, 2025 10:49
@github-actions github-actions bot added the API Public API-impacting changes label Oct 1, 2025
@aobolensk aobolensk force-pushed the 3668 branch 4 times, most recently from bea908d to bc9ddb8 on October 1, 2025 12:54
Comment on lines 292 to 294
def is_weight_compression_needed(self) -> bool:
"""
Determine whether weight compression is needed based on advanced quantization parameters.
Collaborator

Please move the function to module scope; currently it is a method of the advanced parameters class.

Author

Moved

Comment on lines 259 to 261
:param COMPRESS_WEIGHTS: A key in the `backend_params` dictionary that indicates whether
weight compression should be applied. If set to False, weight compression is disabled.
By default, weight compression is enabled (True).
Collaborator

Suggested change
:param COMPRESS_WEIGHTS: A key in the `backend_params` dictionary that indicates whether
weight compression should be applied. If set to False, weight compression is disabled.
By default, weight compression is enabled (True).
:param compress_weights: Indicates whether
weight compression should be applied. If set to False, weight compression is disabled.
By default, weight compression is enabled (True).

Author

I see what you mean; applied.

:param COMPRESS_WEIGHTS: A key in the `backend_params` dictionary that indicates whether
weight compression should be applied. If set to False, weight compression is disabled.
By default, weight compression is enabled (True).
:type COMPRESS_WEIGHTS: str
Collaborator

Suggested change
:type COMPRESS_WEIGHTS: str
:type compress_weights: bool

Author

Done

backend_params: dict[str, Any] = field(default_factory=dict)

# Backend parameter names
COMPRESS_WEIGHTS = "compress_weights"
Collaborator

Suggested change
COMPRESS_WEIGHTS = "compress_weights"
compress_weights = True

Please move this line to line 272.

Author

Done
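The refactor discussed in this thread can be sketched as follows. This is a minimal illustration pieced together from the snippets above, not the actual NNCF code: the class name `AdvancedQuantizationParameters` and the helper's exact signature are assumptions. It shows the typed `compress_weights` field replacing the `COMPRESS_WEIGHTS` key in `backend_params`, and `is_weight_compression_needed` living at module scope rather than as a method.

```python
from dataclasses import dataclass, field
from typing import Any, Optional


@dataclass
class AdvancedQuantizationParameters:
    # Class and field layout are assumptions based on the review snippets.
    backend_params: dict[str, Any] = field(default_factory=dict)
    # Typed field replacing the string key COMPRESS_WEIGHTS in backend_params;
    # weight compression is enabled by default.
    compress_weights: bool = True


def is_weight_compression_needed(advanced_parameters: Optional[AdvancedQuantizationParameters]) -> bool:
    """
    Determine whether weight compression is needed based on advanced quantization parameters.
    Module-scope helper, per the review comment above.
    """
    if advanced_parameters is not None:
        return advanced_parameters.compress_weights
    return True
```

A typed dataclass field gives a documented default and static type checking, whereas a string key in `backend_params` is untyped and easy to misspell.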

@andrey-churkin
Contributor

Should be merged after #3662

@github-actions github-actions bot added the NNCF ONNX Pull requests that updates NNCF ONNX label Oct 6, 2025
@aobolensk
Author

Should be merged after #3662

Rebased the changes after #3662 was merged and applied similar changes for the ONNX backend as well.
