
Releases: Charl-AI/stochastic-caching

0.2.1

12 Feb 14:03


Pre-release

Fix int overflow bug on Windows.
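The release notes don't say where the overflow occurred. A plausible cause (an assumption, not confirmed by the changelog) is that a cache size in bytes ended up in a C long, which is 32 bits on Windows even in 64-bit builds, so any cache over 2 GiB would wrap around. A minimal illustration, with hypothetical numbers:

```python
import ctypes

# Hypothetical cache: 100k RGB 224x224 uint8 images (~15 GiB).
n_bytes = 100_000 * 3 * 224 * 224

# On Windows, a C long is 32 bits even in 64-bit Python. Emulate the
# truncation with ctypes.c_int32, which does no overflow checking.
wrapped = ctypes.c_int32(n_bytes).value

print(n_bytes > 2**31 - 1)  # exceeds a signed 32-bit int
print(wrapped)              # negative after wrapping
```

Storing such sizes in a 64-bit type on all platforms avoids the wrap.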

Full Changelog: 0.2.0...0.2.1

0.2.0

29 Oct 19:06


Pre-release

BREAKING: changed the dtype API in the SharedCache constructor. Previously a string was passed; now pass a torch dtype.

BREAKING: renamed the underlying_array property to array.

Improved internals: we now keep an auxiliary array that tracks whether each slot is set, empty, or out-of-bounds. This simplifies the code and avoids fragile checks for whether a slot has been written yet. Also added a clear() method.
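The notes don't show the implementation, but the idea can be sketched roughly as follows (the SLOT_* names and the StatusTrackedCache class are hypothetical, not the library's API): an auxiliary status array records each slot's state, so reads never have to inspect the data itself to decide whether a value was written.

```python
# Hypothetical sketch of the auxiliary-array idea; not the library's code.
SLOT_EMPTY, SLOT_SET, SLOT_OOB = 0, 1, 2

class StatusTrackedCache:
    def __init__(self, dataset_len, cache_len):
        # Main storage: one slot per cacheable sample.
        self._data = [None] * cache_len
        # Auxiliary array: one status entry per *dataset* index;
        # indices beyond the cache are marked out-of-bounds up front.
        self._status = [
            SLOT_EMPTY if i < cache_len else SLOT_OOB
            for i in range(dataset_len)
        ]

    def is_cached(self, idx):
        return self._status[idx] == SLOT_SET

    def set(self, idx, value):
        if self._status[idx] == SLOT_OOB:
            return  # sample doesn't fit in the cache; skip silently
        self._data[idx] = value
        self._status[idx] = SLOT_SET

    def get(self, idx):
        return self._data[idx] if self._status[idx] == SLOT_SET else None

    def clear(self):
        # Reset every written slot; out-of-bounds markers stay as-is.
        for i, s in enumerate(self._status):
            if s == SLOT_SET:
                self._data[i] = None
                self._status[i] = SLOT_EMPTY
```

With this scheme there is no need to reserve a sentinel value inside the data array to detect unwritten slots.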

Added tests.

Full Changelog: 0.1.1...0.2.0

Fix locking bug

27 Oct 14:44


Pre-release

Implementing mutex locks involved having one lock per slot in the cache. Somewhere between 50k and 100k dataset samples, this failed because Python could not create a list containing that many lock objects.

Solved by removing all mutex locking. This is generally fine -- in each epoch, the same datapoint will never be accessed twice anyway.

If reintroducing in future, there are basically three options:

  • one lock per slot (as before), with locking automatically disabled for large datasets;
  • one lock for the whole object, protecting writes only; this would slow down the first epoch but be fine afterwards;
  • implement both of the above, with a flag for advanced users to toggle between them.
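The second option could look roughly like this (a sketch under assumptions, not the library's code; the class name and method shapes are invented, and a real shared cache would keep its data in shared memory rather than a plain list): a single lock serialises writes, while reads stay lock-free, so contention disappears once the cache is fully populated after the first epoch.

```python
import multiprocessing as mp

class WriteLockedCache:
    """Sketch of option two: one global lock, protecting writes only."""

    def __init__(self, cache_len):
        # A real implementation would use shared memory here.
        self._data = [None] * cache_len
        # One lock for the whole cache, taken only on writes.
        self._write_lock = mp.Lock()

    def get(self, idx):
        # Lock-free read: after the first epoch every slot has been
        # written, so reads never touch the lock at all.
        return self._data[idx]

    def set(self, idx, value):
        # Serialise writes; slows the first epoch, cheap afterwards.
        with self._write_lock:
            if self._data[idx] is None:  # first writer wins
                self._data[idx] = value
```

The "first writer wins" check makes concurrent writes to the same slot harmless, which matches the observation above that repeat accesses within an epoch don't occur anyway.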

Initial release

27 Oct 10:32


Pre-release

Now published to PyPI