Add Graph class #403
Conversation
@FilippoOlivo I would avoid writing a graph solver, …
pina/graph.py
Outdated
if isinstance(pos, torch.Tensor):
    pos = [pos]
    edge_index = [edge_index]
distance = [pos_[edge_index_[0]] - pos_[edge_index_[1]] ** 2 for …
edge_attr is computed as the square of the distance instead of the standard distance. Is this the desired behavior?
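As an illustration only (not the PR's actual code), here is a minimal sketch of computing the per-edge Euclidean distance with torch, reusing the pos_/edge_index_ naming from the snippet above; the helper name is hypothetical:

```python
import torch

def edge_distances(pos_, edge_index_):
    # pos_: (num_nodes, dim) node coordinates
    # edge_index_: (2, num_edges) source/target node indices
    diff = pos_[edge_index_[0]] - pos_[edge_index_[1]]
    # Euclidean distance per edge, taking the norm of the whole
    # difference vector rather than squaring a single term
    return diff.norm(dim=-1)

# applied over a list of graphs, mirroring the list comprehension in the diff:
# distance = [edge_distances(p, e) for p, e in zip(pos, edge_index)]
```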
@FilippoOlivo please remove commits a862dc8 to f0ddee6; they pertain to a different PR, #433. Only graph updates should be here.
@dario-coscia Done! Removed the unrelated commits.
I would say green light on my side!
@FilippoOlivo there is still the …
@FilippoOlivo @gc031298 This refactoring looks very nice!
pina/graph.py
Outdated
x, pos, edge_index = self._check_input_consistency(x, pos, edge_index)
print(len(pos))
if edge_index is None:
I don't know if this is nice to keep here... why radius and not knn? Do we really need a temporal graph? One could build a normal RadiusGraph and then add time as an additional parameter.
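A minimal sketch of that suggestion, assuming torch_geometric's radius_graph is available; the tensors below are placeholder data, and this is illustrative rather than the class that ends up in the PR:

```python
import torch
from torch_geometric.nn import radius_graph  # requires torch-cluster

pos = torch.rand(100, 2)   # (num_nodes, spatial_dim), placeholder coordinates
t = torch.rand(100, 1)     # one time value per node, placeholder

# connectivity built on spatial coordinates only, as a plain radius graph
edge_index = radius_graph(pos, r=0.1)

# time enters as an extra node feature instead of a dedicated TemporalGraph
x = torch.cat([pos, t], dim=-1)
```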
Ok. For now I will remove TemporalGraph.
    internal_func=internal_func,
    external_func=external_func)
self.n_layers = n_layers
self.forward = self.forward_shared
I don't really like this forward separation; is there a way to combine the two?
To store the parameters efficiently (avoiding a torch.nn.ModuleList with the same model repeated n_layers times), another possible solution is an if in the forward. Otherwise I can define two more classes: one for shared weights and one for non-shared weights. Let me know how to proceed.
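A minimal sketch of the "if in the forward" option, with a hypothetical class name and constructor signature (the real block in this PR may differ):

```python
import torch.nn as nn

class IterativeModule(nn.Module):
    def __init__(self, build_layer, n_layers, shared_weights=True):
        super().__init__()
        self.n_layers = n_layers
        self.shared_weights = shared_weights
        if shared_weights:
            # a single set of parameters reused at every iteration
            self.layer = build_layer()
        else:
            # independent parameters for each iteration
            self.layers = nn.ModuleList(build_layer() for _ in range(n_layers))

    def forward(self, x):
        # one forward covers both cases, no rebinding of self.forward
        for i in range(self.n_layers):
            layer = self.layer if self.shared_weights else self.layers[i]
            x = layer(x)
        return x
```

With a single forward, the shared and non-shared cases no longer need separate methods, at the cost of one branch per iteration.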
We can keep it like this for the moment; maybe two classes would be best, but for a single model I would not worry too much.
@FilippoOlivo @gc031298 overall very nice, this looks like a very nice integration into PINA! Just minor changes; I think we will be able to merge it soon :))
…ndling. Remove TemporalGraph class
This PR refers to issue #400.