Conversation

@fgerzer (Contributor) commented Jan 15, 2020

Before submitting

  • Was this discussed/approved via a GitHub issue? (no need for typos, doc improvements)
  • Did you read the contributor guideline?
  • Did you make sure to update the docs?
    • No visible changes
  • Did you write any new necessary tests?
    • Old behaviour continues; new behaviour relies on timing outside the code.

What does this PR do?

Fixes #688.
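The linked issue reports that checkpoint saving isn't atomic: a crash mid-write can leave a corrupt checkpoint file. The standard fix is to write to a temporary file in the same directory and then rename it over the target. Below is a minimal sketch of that pattern; the function name and byte-oriented signature are illustrative assumptions, not the PR's actual implementation:

```python
import os
import tempfile

def atomic_save(data: bytes, path: str) -> None:
    """Write data to path atomically.

    Writes to a temporary file in the same directory, fsyncs it, then
    renames it over the target. os.replace is atomic on both POSIX and
    Windows, so readers never observe a partially written checkpoint.
    """
    dirname = os.path.dirname(os.path.abspath(path))
    fd, tmp_path = tempfile.mkstemp(dir=dirname, suffix=".part")
    try:
        with os.fdopen(fd, "wb") as f:
            f.write(data)
            f.flush()
            os.fsync(f.fileno())  # ensure bytes hit disk before the rename
        os.replace(tmp_path, path)  # atomic swap into place
    except BaseException:
        # On any failure before the rename, remove the partial temp file.
        if os.path.exists(tmp_path):
            os.unlink(tmp_path)
        raise
```

The temp file must live in the same directory as the target, because a rename is only atomic within a single filesystem.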

PR review

Anyone in the community is free to review the PR once the tests have passed.
If we didn't discuss your PR in GitHub issues, there's a high chance it will not be merged.

@williamFalcon (Contributor)

@fdiehl can you rebase master?

@fgerzer force-pushed the atomic_checkpoints branch from ea2a94b to b52365a on January 20, 2020 at 14:21
@fgerzer (Contributor, Author) commented Jan 20, 2020

@williamFalcon Done.

(I had originally used an older commit because the tests previously failed on master due to tensorboard issues. They are passing with the current master.)



Development

Successfully merging this pull request may close these issues.

Checkpoint saving isn't atomic

3 participants