fix small nits (#76)
* remove deprecated background commit flag

* move yaml import out of global scope

* rename huggingface secret to default
charlesfrye authored Aug 8, 2024
1 parent 4c2edf3 commit f7bfd4c
Showing 4 changed files with 3 additions and 5 deletions.
2 changes: 1 addition & 1 deletion README.md
@@ -36,7 +36,7 @@ Inference on the fine-tuned model displays conformity to the output structure (`
1. Create a [Modal](https://modal.com/) account.
2. Install `modal` in your current Python virtual environment (`pip install modal`)
3. Set up a Modal token in your environment (`python3 -m modal setup`)
-4. You need to have a [secret](https://modal.com/docs/guide/secrets#secrets) named `huggingface` in your workspace. You can [create a new secret](https://modal.com/secrets) with the HuggingFace template in your Modal dashboard, using the key from HuggingFace (in settings under API tokens) to populate `HF_TOKEN` and changing the name from `my-huggingface-secret` to `huggingface`.
+4. You need to have a [secret](https://modal.com/docs/guide/secrets#secrets) named `my-huggingface-secret` in your workspace. You can [create a new secret](https://modal.com/secrets) with the HuggingFace template in your Modal dashboard, using the key from HuggingFace (in settings under API tokens) to populate `HF_TOKEN`.
5. For some LLaMA models, you need to go to the Hugging Face page (e.g. [this page for LLaMA 3 8B](https://huggingface.co/meta-llama/Meta-Llama-3-8B)) and agree to their Terms and Conditions for access (granted instantly).
6. If you want to use [Weights & Biases](https://wandb.ai) for logging, you need to have a secret named `wandb` in your workspace as well. You can also create it [from a template](https://modal.com/secrets). Training is hard enough without good logs, so we recommend you try it or look into `axolotl`'s integration with [MLFlow](https://mlflow.org/)!
</details>
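Step 4 of the README hunk above can also be sanity-checked from code once the secret exists. The sketch below is illustrative and not part of this commit: the app and function names are made up, and it assumes the renamed `my-huggingface-secret` already holds an `HF_TOKEN` key.

```python
import os

import modal

app = modal.App("hf-secret-check")  # hypothetical app name


@app.function(secrets=[modal.Secret.from_name("my-huggingface-secret")])
def check_token():
    # Modal injects each key of an attached secret as an environment variable
    # inside the container, so the token shows up as HF_TOKEN.
    token = os.environ["HF_TOKEN"]
    print(f"HF_TOKEN is set ({len(token)} characters).")
```

Because secrets are surfaced as environment variables, the rename in `src/common.py` below is the only code change needed.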
2 changes: 1 addition & 1 deletion src/common.py
@@ -45,7 +45,7 @@
app = modal.App(
    APP_NAME,
    secrets=[
-        modal.Secret.from_name("huggingface"),
+        modal.Secret.from_name("my-huggingface-secret"),
        modal.Secret.from_dict({"ALLOW_WANDB": os.environ.get("ALLOW_WANDB", "false")}),
        *([modal.Secret.from_name("wandb")] if ALLOW_WANDB else []),
    ],
2 changes: 1 addition & 1 deletion src/inference.py
@@ -1,6 +1,5 @@
import os
import time
-import yaml
from pathlib import Path

import modal
@@ -21,6 +20,7 @@
from vllm.engine.async_llm_engine import AsyncLLMEngine
from vllm.sampling_params import SamplingParams
from vllm.utils import random_uuid
+import yaml


def get_model_path_from_run(path: Path) -> Path:
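The second commit bullet, moving `import yaml` out of global scope, follows a common Modal pattern: dependencies that only exist in the container image are imported where they are guaranteed to be installed, not at module import time on the local machine. A minimal sketch of that pattern, assuming a hypothetical image and the `modal.Image.imports()` context manager (the commit itself may simply place the import alongside the other deferred imports):

```python
import modal

# Hypothetical image; the repo builds its own inference image with vLLM installed.
inference_image = modal.Image.debian_slim().pip_install("pyyaml", "vllm")

app = modal.App("deferred-imports-example")  # hypothetical app name

with inference_image.imports():
    # Only executed inside the container, so a local environment
    # without vllm or pyyaml can still deploy and trigger the app.
    import yaml
    from vllm.sampling_params import SamplingParams
```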
2 changes: 0 additions & 2 deletions src/train.py
@@ -23,7 +23,6 @@
    gpu=GPU_CONFIG,
    volumes=VOLUME_CONFIG,
    timeout=24 * HOURS,
-    _allow_background_volume_commits=True,
)
def train(run_folder: str, output_dir: str):
    import torch
@@ -48,7 +47,6 @@ def train(run_folder: str, output_dir: str):
    gpu=SINGLE_GPU_CONFIG,
    volumes=VOLUME_CONFIG,
    timeout=24 * HOURS,
-    _allow_background_volume_commits=True,
)
def preproc_data(run_folder: str):
    print("Preprocessing data.")
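The removed `_allow_background_volume_commits` flag corresponds to the first commit bullet: background commits became Modal's default behavior for Volumes, so the private flag was deprecated and can simply be dropped. A minimal sketch of a function writing to a Volume without the flag; the volume, app, and function names are hypothetical, and the explicit `commit()` is only needed if you want a synchronization point:

```python
import modal

# Hypothetical volume; the repo's VOLUME_CONFIG maps its own volumes.
runs_volume = modal.Volume.from_name("example-runs-vol", create_if_missing=True)

app = modal.App("volume-commit-example")  # hypothetical app name


@app.function(volumes={"/runs": runs_volume}, timeout=60 * 60)
def write_checkpoint():
    # Writes under /runs are committed in the background automatically;
    # an explicit commit just forces the changes to be persisted now.
    with open("/runs/checkpoint.txt", "w") as f:
        f.write("step 100")
    runs_volume.commit()
```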
