About MobileLLM

A locally running Large Language Model (LLM) combined with a vector database, designed to help developers add ChatGPT-like features securely and for free.

This project provides an LLM that runs entirely on-device, giving your application the ability to understand and generate natural language without sending data to a server.

Why use MobileLLM

  • It runs locally and offline
  • It is free
  • User data stays under your control
  • It can augment the LLM's knowledge with your app's data

Early prototype

We currently support the RWKV model, which runs in less than 2 GB of RAM. On the roadmap, we plan to add more capable models such as Llama 2 and Mistral for an even more robust solution.

Upcoming updates will also include smoother integrations: we aim to provide an easier way to connect Core Data and SwiftData with the vector database, bridging the gap between your data entities and the LLM's knowledge.

Demo App

You can try the demo chat. The following recording shows how it works:

demo.mov

Usage

  1. Download the RWKV model from Hugging Face
  2. Select the model from your device's file system
  3. Load the model into memory
  4. Send a prompt

import Facade

do {
    try MobileLLM.shared.load(model: "path/to/model", parameters: .default, type: .rwkv)
    let result = try await MobileLLM.shared.ask(question: "Tell me about the meaning of life")
    print(result)
} catch {
    print(error.localizedDescription)
}
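
Because ask(question:) is asynchronous, you will usually call it from a Task or another async context. The sketch below wraps the calls above in a small helper; the Assistant type, and the assumption that ask(question:) returns a String and that the model only needs to be loaded once per launch, are illustrative rather than part of the library.

import Facade

// Minimal sketch of a reusable wrapper around the calls shown above.
// Assumes ask(question:) returns a String; adjust if the actual API differs.
final class Assistant {
    init(modelPath: String) throws {
        // Load the model a single time and reuse the shared instance afterwards.
        try MobileLLM.shared.load(model: modelPath, parameters: .default, type: .rwkv)
    }

    func answer(_ prompt: String) async throws -> String {
        try await MobileLLM.shared.ask(question: prompt)
    }
}

// Usage from an async context, for example inside a Task { ... }:
// let assistant = try Assistant(modelPath: "path/to/model")
// let reply = try await assistant.answer("Tell me about the meaning of life")
// print(reply)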

Retrieval-Augmented Generation (RAG)

For more information on what RAG is, see the video.

  1. Add a string to the vector database
  2. Send a prompt specifying the similarity threshold
  3. The LLM responds based on the local knowledge

import Facade

do {
    try MobileLLM.shared.load(model: "path/to/model", parameters: .default, type: .rwkv)
    try MobileLLM.shared.add(document: "The dog is named Max")
    let result = try await MobileLLM.shared.ask(question: "What is the dog's name?", similarityThreshold: 0.5)
    print(result)
} catch {
    print(error.localizedDescription)
}
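
You can call add(document:) once per piece of knowledge to build up a small local knowledge base before asking a question. The following sketch is illustrative: the function name and example facts are made up, and it assumes (per the description above) that only documents whose similarity to the question clears the threshold are used as context.

import Facade

// Illustrative sketch: seed the vector database with several strings,
// then ask a question that can only be answered from that local knowledge.
func askAboutMax() async {
    do {
        try MobileLLM.shared.load(model: "path/to/model", parameters: .default, type: .rwkv)

        let facts = [
            "The dog is named Max",
            "Max is a three-year-old golden retriever",
            "Max's favourite toy is a red ball"
        ]
        for fact in facts {
            try MobileLLM.shared.add(document: fact)
        }

        // Documents are matched against the question by similarity; raising the
        // threshold makes retrieval stricter.
        let result = try await MobileLLM.shared.ask(question: "What breed is Max?",
                                                    similarityThreshold: 0.5)
        print(result)
    } catch {
        print(error.localizedDescription)
    }
}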

Installation

Swift Package Manager

dependencies: [
    .package(url: "https://github.com/windwithbirds/mobilellm.git", branch: "main")
]
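
Then add the library product to your target. The product name below is an assumption (it may be Facade, as in the import statements above, or MobileLLM); check the package's Package.swift for the exact name:

targets: [
    .target(
        name: "YourApp",
        dependencies: [
            // Verify the product name against the package manifest.
            .product(name: "MobileLLM", package: "mobilellm")
        ]
    )
]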

License

MobileLLM is available under the MIT license. See the LICENSE.md file for more info.
