
Chronicle Map

Replicate your Key-Value Store across your network, with consistency, persistence and performance. (by OpenHFT)

Stats

Basic Chronicle Map repo stats
Mentions: 1
Stars: 2,152
Activity: 9.0
Latest commit: 7 days ago

OpenHFT/Chronicle-Map is an open source project licensed under the Apache License 2.0, which is an OSI-approved license.

Chronicle Map Alternatives

Similar projects and alternatives to Chronicle Map based on common topics and language
  • MapDB

    MapDB provides concurrent Maps, Sets and Queues backed by disk storage or off-heap memory. It is a fast and easy-to-use embedded Java database engine.

  • SmoothieMap

    A gulp of low-latency Java

  • Oak

    A scalable concurrent key-value map for big data analytics (by Yahoo)

  • java-concurrent-hash-trie-map

    Java port of a concurrent trie hash map implementation from the Scala collections library

  • lasher

    Lasher is an embeddable key-value store written in Java.

  • off-heap-tests

  • HikariCP

    光 HikariCP・A solid, high-performance JDBC connection pool at last.
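MapDB's disk- or off-heap-backed collections can be made concrete with a short example. This is a minimal sketch, assuming the MapDB 3.x artifact (`org.mapdb:mapdb`) is on the classpath; the map name and keys are illustrative:

```java
import org.mapdb.DB;
import org.mapdb.DBMaker;
import org.mapdb.Serializer;

import java.util.concurrent.ConcurrentMap;

public class MapDbSketch {
    public static void main(String[] args) {
        // Store entries in direct (off-heap) memory;
        // DBMaker.memoryDB() would use the Java heap instead.
        DB db = DBMaker.memoryDirectDB().make();

        // A plain ConcurrentMap view backed by MapDB's off-heap store.
        ConcurrentMap<String, Long> counts = db
                .hashMap("counts", Serializer.STRING, Serializer.LONG)
                .createOrOpen();

        counts.put("requests", 42L);
        System.out.println(counts.get("requests"));

        db.close();
    }
}
```

Because the entries live outside the heap, the map's size does not add to garbage-collection pressure, which is the main reason these libraries appear alongside Chronicle Map.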

NOTE: The number of mentions on this list indicates mentions on common posts. Hence, a higher number means a better Chronicle Map alternative or higher similarity.

Posts

Posts where Chronicle Map has been mentioned. We have used some of these posts to build our list of alternatives and similar projects - the last one was on 2021-04-08.
  • Off-heap memory in Java
    dev.to | 2021-04-08
    Chronicle-Map: Chronicle Map is an in-memory key-value store designed for low-latency and/or multi-process applications.
  • Solution for hash-map with >100M values
    reddit.com/r/java | 2020-12-21
    https://github.com/OpenHFT/Chronicle-Map - Maybe a better offheap map
    reddit.com/r/java | 2020-12-21
    I've wrangled data sets in the ~600 GB range using nothing but plain old Java and a few beefy boxes. This can all be kept in memory, but you have to go off-heap. You can use Chronicle Map and Chronicle Values to model this data and work with it off-heap in a way that's still very clean and object oriented. 128 GB of RAM is cheap these days, whether you're in the cloud or not.
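The off-heap approach described in these posts can be sketched with Chronicle Map's builder API. This is a minimal sketch, assuming `net.openhft:chronicle-map` is on the classpath; the entry count and sample key are illustrative sizing hints, not requirements of the API:

```java
import net.openhft.chronicle.map.ChronicleMap;

public class ChronicleMapSketch {
    public static void main(String[] args) {
        // The builder needs an expected entry count and, for
        // variable-sized keys like String, a representative sample
        // so it can size its off-heap segments up front.
        try (ChronicleMap<String, Long> populations = ChronicleMap
                .of(String.class, Long.class)
                .name("city-populations")
                .entries(50_000)           // illustrative capacity hint
                .averageKey("Amsterdam")   // illustrative sample key
                .create()) {               // pure in-memory, off-heap

            populations.put("Amsterdam", 872_680L);
            System.out.println(populations.get("Amsterdam"));
        }
    }
}
```

Swapping `create()` for `createPersistedTo(file)` backs the same map with a memory-mapped file, which is what enables the multi-process sharing and persistence mentioned in the project's tagline.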