Big breaking API redesign #264
Conversation
Force-pushed 85b77ab to 1f6f644
fix README.md file name casing
Force-pushed 1f6f644 to 8d585bb
[](https://python.org/downloads)
[](https://pypi.org/project/torch_sim_atomistic)
[](https://github.com/torchsim/torch-sim/actions/workflows/test.yml)
[](https://codecov.io/gh/torchsim/torch-sim)
https://codecov.io/gh/torchsim/torch-sim is currently a broken link.
Tried adding the codecov app to this new org. It seems to have worked, but it's not showing up for me in the web UI (yet)...
  if step % 10 == 0:
-     hybrid_state = swap_step(hybrid_state, kT=torch.tensor(kT), generator=generator)
+     hybrid_state = ts.swap_mc_step(
+         model=model, state=hybrid_state, kT=kT, seed=42 + step
Why not keep passing the same generator? Reseeding on every step is worse for entropy than drawing from the same PRNG sequence.
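To illustrate the concern above, here is a minimal sketch of the two patterns. The stdlib `random.Random` stands in for `torch.Generator`, and both function names are hypothetical: reusing one generator draws from a single PRNG stream, while reseeding with `base_seed + step` restarts a fresh stream on every step.

```python
import random


def reseed_each_step(n_steps: int, base_seed: int = 42) -> list[float]:
    # Hypothetical reseeding pattern, analogous to seed=42 + step:
    # every step constructs (and discards) a brand-new PRNG.
    return [random.Random(base_seed + step).random() for step in range(n_steps)]


def reuse_generator(n_steps: int, base_seed: int = 42) -> list[float]:
    # One generator seeded once; all draws come from a single stream.
    gen = random.Random(base_seed)
    return [gen.random() for _ in range(n_steps)]
```

Both variants are reproducible run-to-run, but they produce different sequences, and the reseeded variant's draws depend on how the per-step seeds are constructed rather than on the generator's own stream.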
  # %% Create individual filenames for each system
- filenames = [f"batch_traj_{i}.h5" for i in range(len(systems))]
+ filenames = [f"tmp/batch_traj_{i}.h5" for i in range(len(systems))]
nit: Do you want to auto-clean up at the end of the tutorial and delete tmp/?
Better not; people (like myself) may have other stuff in their tmp/ folder which they want to keep.
Could write to a subfolder in tmp/ and clean that up instead. The only minor concern then is that people might want to manually inspect the files and be annoyed if they need to comment out the removal code and rerun the tutorial.
I think it's fine letting people clean up their own tutorials
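For reference, the subfolder idea discussed above could look like the following sketch. The subfolder name and the `n_systems` stand-in are hypothetical; the point is that `shutil.rmtree` removes only the tutorial's own subfolder and leaves anything else in tmp/ untouched.

```python
import shutil
from pathlib import Path

# Hypothetical dedicated subfolder for this tutorial's trajectory files.
out_dir = Path("tmp") / "batch_traj_tutorial"
out_dir.mkdir(parents=True, exist_ok=True)

n_systems = 3  # stand-in for len(systems)
filenames = [out_dir / f"batch_traj_{i}.h5" for i in range(n_systems)]
for path in filenames:
    path.touch()  # placeholder for the actual trajectory writing

# Clean up only our subfolder; the rest of tmp/ is untouched.
shutil.rmtree(out_dir)
```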
tests/test_integrators.py
Outdated
  temperatures = []
  for _step in range(n_steps):
-     state = update_fn(state=state)
+     state = ts.npt_langevin_update(
Feels like the refactor likely introduces a bug here: shouldn't we be passing a generator to this update function?
I don't follow. You mean we need to externally seed?
torch-sim/torch_sim/integrators/npt.py, line 152 in 49be107:

noise = torch.randn_like(state.cell_positions, device=state.device, dtype=state.dtype)
I would have thought (not sure, haven't checked) that previously the init function that returned the update function created a generator in the factory, which was then used for the PRNG sequence inside the update. Here this is unseeded, which is probably fine for now, but it would be better to fix alongside something like #51.
That would have been the right way to do it, but it's not what we did. Fixing the seeds doesn't really work on most integrators beyond init. Agreed it can be addressed later.
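The factory pattern described above can be sketched as follows. The stdlib `random.Random` stands in for `torch.Generator`, and the function names and signature are hypothetical (torch-sim's actual init/update API differs): the init function seeds one generator, and the returned update function draws all its noise from that single stream via the closure.

```python
import random


def npt_langevin_init(seed=None):
    # Hypothetical factory: seed one generator here, once.
    gen = random.Random(seed)

    def update(state: float) -> float:
        # Every call draws from the same PRNG stream, analogous to
        # torch.randn_like(..., generator=gen) inside the real update.
        noise = gen.gauss(0.0, 1.0)
        return state + noise

    return update
```

With this shape, two runs constructed from the same seed reproduce the same trajectory without reseeding anywhere inside the step loop.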
For #211 no; for #127, a few minor changes to bring it in line with the new API (was still using
…id_swap_tutorial.py
Lots of bug fixes, type fixes, and moving integrator+optimizer functions into module scope (for easier typing and an easier-to-reason-about API, since there are no more function closures over models). This PR includes the fairchem v2 PR #238 and #127 by @stefanbringuier.