A user-assistant model prompter with powerful memory and context features that create an incredibly human experience over longer periods of time. Based on the MemoryBank whitepaper by Wanjun Zhong, Lianghong Guo, Qiqi Gao, He Ye, and Yanlin Wang.
MonikaiV2 instances feature a Short-Term Memory (STM) and a Long-Term Memory (LTM), while also mimicking conscious/subconscious retrieval cues, retrieval failure, and selective forgetting.
A Read-Eval-Print Loop (REPL) communication interface for Monikai. The least convoluted method of communication: very straightforward. It's worth noting that conversations can be carried on seamlessly between the online client and the REPL, as demonstrated in the photo below.
Anything that is not a command is forwarded as a prompt to the Monikai; a minimal sketch of this dispatch loop follows the command list.
- wipe: Clears the Monikai's memories and recent conversation, preserves the description.
- save: Writes the Monikai in memory to 'monikai.json'.
- end: Manually marks the current conversation as completed and encodes it to memory.
- log: Prints the Monikai in memory to stdout.
- get: Takes another line as input, and prints the stored memory with the highest cosine similarity.
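For reference, here is a minimal sketch of what such a dispatch loop could look like. The `Monikai` struct and its method names below are hypothetical stand-ins for illustration, not the repo's actual API.

```rust
use std::io::{self, Write};

// Hypothetical stand-in for the real Monikai type; method names are illustrative only.
struct Monikai;

impl Monikai {
    fn wipe(&mut self) { /* clear memories and the recent conversation, keep the description */ }
    fn save(&self) { /* write the Monikai to monikai.json */ }
    fn end_conversation(&mut self) { /* mark the conversation as completed and encode it to memory */ }
    fn log(&self) { /* print the Monikai to stdout */ }
    fn most_similar_memory(&self, _query: &str) -> String { "<closest memory>".into() }
    fn prompt(&mut self, _input: &str) -> String { "<model response>".into() }
}

fn read_line() -> io::Result<String> {
    let mut buf = String::new();
    io::stdin().read_line(&mut buf)?;
    Ok(buf.trim().to_string())
}

fn main() -> io::Result<()> {
    let mut monikai = Monikai;
    loop {
        print!("> ");
        io::stdout().flush()?;
        match read_line()?.as_str() {
            "wipe" => monikai.wipe(),
            "save" => monikai.save(),
            "end" => monikai.end_conversation(),
            "log" => monikai.log(),
            // `get` reads one more line, then prints the memory with the highest cosine similarity.
            "get" => {
                let query = read_line()?;
                println!("{}", monikai.most_similar_memory(&query));
            }
            // Anything that is not a command is forwarded to the Monikai as a prompt.
            other => println!("{}", monikai.prompt(other)),
        }
    }
}
```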
When the program is started, a graphical web client is also hosted on port 3000.
This contains an additional interface layer, allowing the Monikai to express a preset range of emotions! This can create a more human-like interaction, and puts a face to the text.
All assets can be customized by replacing the files in ./public/assets. Ensure that you either modify the import code in ./public/index.html or mimic the original file names.
Note: A given emotion must have two files to be properly rendered: "EMOTION.png" and "EMOTIONSPEAKING.png". If you don't want a speaking version, simply duplicate and rename EMOTION.png.
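If you customize assets, a quick way to confirm the pairing is a standalone helper like the sketch below (not part of the repo; the directory path is the one mentioned above):

```rust
use std::collections::HashSet;
use std::fs;

fn main() -> std::io::Result<()> {
    // Collect the file stems (names without ".png") found in the assets folder.
    let mut stems = HashSet::new();
    for entry in fs::read_dir("./public/assets")? {
        let path = entry?.path();
        if path.extension().and_then(|e| e.to_str()) == Some("png") {
            if let Some(stem) = path.file_stem().and_then(|s| s.to_str()) {
                stems.insert(stem.to_string());
            }
        }
    }

    // Every base emotion "EMOTION" must be paired with "EMOTIONSPEAKING".
    for stem in &stems {
        if !stem.ends_with("SPEAKING") && !stems.contains(&format!("{stem}SPEAKING")) {
            println!("Missing speaking variant for emotion: {stem}");
        }
    }
    Ok(())
}
```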
MonikaiV2's memory pruning system draws insight from the Trace Decay Theory of Forgetting and the Ebbinghaus Forgetting Curve.
During testing, I found that this created the most 'human' experience and led to the least hallucination.
Memories are not pruned until 7 days have passed (considered the average time a human remembers a conversation); after that, they are subject to forgetting.
The following equation was used to model 'forgetting':
Where d is the number of days since the memory was formed and r is the number of times the memory has been retrieved. Memories with a score over 1 are forgotten.
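The exact equation is not reproduced here, but purely as an illustration of the behavior just described (the score grows with d, shrinks with r, applies only after the 7-day grace period, and crossing 1 means the memory is forgotten), a sketch might look like the following. The constants and functional form are assumptions, not the values MonikaiV2 actually uses.

```rust
/// Illustrative forgetting score; the exact formula is defined in the MonikaiV2
/// source and is NOT reproduced here. This sketch only preserves the stated
/// properties: the score grows with the days `d` since the memory was formed,
/// shrinks with the retrieval count `r`, only applies after 7 days, and a
/// score over 1.0 means the memory is forgotten.
fn forgetting_score(d: f64, r: f64) -> f64 {
    if d < 7.0 {
        return 0.0; // memories are not pruned until 7 days have passed
    }
    // Hypothetical shape: linear trace decay past the 7-day grace period,
    // slowed by every retrieval (retrieval strengthens the memory trace).
    (d - 7.0) / (7.0 * (1.0 + r))
}

fn main() {
    println!("{:.2}", forgetting_score(21.0, 0.0)); // 2.00 -> forgotten
    println!("{:.2}", forgetting_score(21.0, 3.0)); // 0.50 -> retained
}
```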
The model retrieves a memory either subconsciously with a 'retrieval cue', or consciously if the Monikai determines that it needs more context.
The user prompts the Monikai: "Hey, what was the baking cookbook you recommended me?"
- Subconsciously:
  - A subconscious memory search runs with the key phrase "baking cooking"
  - It retrieves a memory where the user talked about lunch and mentioned eating a baked pretzel
- Consciously:
  - The Monikai determines that it needs more information
  - A conscious memory search runs with the key phrase "cookbook recommendation"
  - It retrieves a memory where the Monikai recommended The College Cookbook
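Both paths ultimately compare an embedding of the key phrase against the embeddings of stored memories and pick the closest one by cosine similarity. Below is a minimal sketch of that ranking step; the vectors are placeholders, whereas in MonikaiV2 they would presumably come from an embedding model.

```rust
/// Cosine similarity between two embedding vectors of equal length.
fn cosine_similarity(a: &[f32], b: &[f32]) -> f32 {
    let dot: f32 = a.iter().zip(b).map(|(x, y)| x * y).sum();
    let norm_a: f32 = a.iter().map(|x| x * x).sum::<f32>().sqrt();
    let norm_b: f32 = b.iter().map(|x| x * x).sum::<f32>().sqrt();
    if norm_a == 0.0 || norm_b == 0.0 {
        return 0.0;
    }
    dot / (norm_a * norm_b)
}

/// Return the index of the stored memory whose embedding is closest to the cue.
fn closest_memory(cue: &[f32], memories: &[Vec<f32>]) -> Option<usize> {
    memories
        .iter()
        .enumerate()
        .map(|(i, m)| (i, cosine_similarity(cue, m)))
        .max_by(|a, b| a.1.total_cmp(&b.1))
        .map(|(i, _)| i)
}

fn main() {
    // Placeholder embeddings; in MonikaiV2 these would come from an embedding model.
    let memories = vec![
        vec![0.9, 0.1, 0.0], // "the user ate a baked pretzel at lunch"
        vec![0.1, 0.8, 0.3], // "recommended The College Cookbook"
    ];
    let cue = vec![0.2, 0.7, 0.4]; // embedding of "cookbook recommendation"
    println!("closest memory index: {:?}", closest_memory(&cue, &memories));
}
```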
The Monikai saves automatically every 5 seconds, and a conversation is considered 'over' after 5 minutes of inactivity. When a conversation is 'over', the Monikai automatically encodes it into LTM.
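A rough sketch of how that housekeeping loop could be structured is shown below; the timings come from the paragraph above, while the struct and method names are hypothetical.

```rust
use std::thread;
use std::time::{Duration, Instant};

// Hypothetical stand-ins; the real repo's types and methods differ.
struct Monikai {
    last_activity: Instant,
    conversation_open: bool,
}

impl Monikai {
    fn save(&self) { /* write the Monikai to monikai.json */ }
    fn encode_conversation_to_ltm(&mut self) { /* summarize and store the conversation in LTM */ }
}

fn main() {
    // In the real loop, `last_activity` would be refreshed whenever the user sends a prompt.
    let mut monikai = Monikai { last_activity: Instant::now(), conversation_open: true };

    loop {
        // Save automatically every 5 seconds.
        thread::sleep(Duration::from_secs(5));
        monikai.save();

        // After 5 minutes of inactivity the conversation is 'over' and is encoded into LTM.
        if monikai.conversation_open && monikai.last_activity.elapsed() > Duration::from_secs(5 * 60) {
            monikai.encode_conversation_to_ltm();
            monikai.conversation_open = false;
        }
    }
}
```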
- Start by cloning the repo: `git clone https://github.com/hiibolt/monikaiv2.git`
- Set the environment variable OPENAI_API_KEY to your OpenAI API key (the method varies by shell and OS): `export OPENAI_API_KEY="YOUR KEY HERE"`
- (Optional) Customize the character description field in monikai/data/monikai.json (see the sketch below for an illustrative layout).
- Start the Monikai! `cargo run`
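As an illustration of the kind of layout to expect in monikai/data/monikai.json, the sketch below deserializes a hypothetical shape built only from the features described above (a preserved description, stored memories, and the recent conversation). The field names are guesses for illustration, not the real schema, and the sketch assumes the serde and serde_json crates.

```rust
// Assumes the serde (with the "derive" feature) and serde_json crates.
// Field names below are guesses for illustration only; check
// monikai/data/monikai.json in the repo for the real schema.
use serde::{Deserialize, Serialize};

#[derive(Serialize, Deserialize)]
struct Memory {
    summary: String,      // hypothetical: what the encoded conversation was about
    times_retrieved: u32, // hypothetical: the `r` used when pruning
}

#[derive(Serialize, Deserialize)]
struct Monikai {
    description: String,              // the character description you can customize
    memories: Vec<Memory>,            // long-term memory entries
    recent_conversation: Vec<String>, // the current, not-yet-encoded conversation
}

fn main() -> Result<(), Box<dyn std::error::Error>> {
    let raw = std::fs::read_to_string("monikai/data/monikai.json")?;
    let monikai: Monikai = serde_json::from_str(&raw)?;
    println!("description: {}", monikai.description);
    Ok(())
}
```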