Commit 5abf1de

Update README with HuggingFace content
1 parent 4febd2e commit 5abf1de

File tree: 2 files changed, +10 −17 lines

llama2_c/README.md (+6, −15)
````diff
@@ -149,6 +149,7 @@ For quick tests, we have included a really small model, with only 260K parameter
 
 The CI/CD using a GitHub actions workflow, and the demo_pytest.sh script are based on this model.
 
+
 # demo_pytest.sh
 
 - The demo_pytest.sh script starts the local network, deploys llama2_260K, uploads the model & tokenizer, and runs the QA with pytest:
@@ -161,25 +162,15 @@ The CI/CD using a GitHub actions workflow, and the demo_pytest.sh script are bas
 - `./demo.sh` , on Linux / Mac
 - `.\demo.ps1` , in Windows PowerShell (Miniconda recommended)
 
-# More models
+# Models
 
-- You can get other model checkpoints, as explained in [karpathy/llama2.c](https://github.com/karpathy/llama2.c):
+## HuggingFace
 
-This command downloads the 15M parameter model that was trained on the TinyStories dataset (~60MB download) and stores it in a `models` folder:
+You can find many models in the llama2_c *.bin format on HuggingFace, for example:
 
-```bash
-# on Linux/Mac
-mkdir -p models
-wget -P models https://huggingface.co/karpathy/tinyllamas/resolve/main/stories15M.bin
-```
+- [onicai/llama2_c_canister_models](https://huggingface.co/onicai/llama2_c_canister_models)
+- [karpathy/tinyllamas](https://huggingface.co/karpathy/tinyllamas)
 
-```powershell
-# in Windows PowerShell (Miniconda recommended)
-if (-not (Test-Path -Path .\models)) {
-    New-Item -Path .\models -ItemType Directory
-}
-Invoke-WebRequest -Uri https://huggingface.co/karpathy/tinyllamas/resolve/main/stories15M.bin -OutFile .\models\stories15M.bin
-```
 
 # Deploying to the IC main net
````
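The new README section points readers at llama2_c `*.bin` checkpoints hosted on HuggingFace. A minimal Python sketch of fetching one into a `models` folder, mirroring the wget/PowerShell instructions the diff removes; `model_url` and `fetch_model` are hypothetical helpers, and HuggingFace's `/resolve/main/` raw-file URL pattern is assumed:

```python
# Sketch: download a llama2_c *.bin checkpoint from HuggingFace.
# The repo/file names below come from the links in the diff above.
from pathlib import Path
from urllib.request import urlretrieve


def model_url(repo_id: str, filename: str) -> str:
    # HuggingFace serves raw files at /{repo}/resolve/{revision}/{file}
    return f"https://huggingface.co/{repo_id}/resolve/main/{filename}"


def fetch_model(repo_id: str, filename: str, dest_dir: str = "models") -> Path:
    dest = Path(dest_dir)
    dest.mkdir(parents=True, exist_ok=True)  # like `mkdir -p models`
    target = dest / filename
    if not target.exists():  # skip the download if the file is already present
        urlretrieve(model_url(repo_id, filename), target)
    return target


# Example (requires network access):
# fetch_model("karpathy/tinyllamas", "stories15M.bin")
```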

llama2_c/scripts/nft_update_story.py (+4, −2)
```diff
@@ -156,12 +156,14 @@ def main() -> int:
         print(response)
         if "Ok" in response[0].keys():
             # Check if the number of generated tokens is less than the requested tokens
-            if response[0]["Ok"]["num_tokens"] < prompt['steps']:
+            if response[0]["Ok"]["num_tokens"] < prompt["steps"]:
                 print(f'The end! - num_tokens = {response[0]["Ok"]["num_tokens"]}')
                 break
             # Check if the response is an empty string. If it is, break out of the loop.
             if response[0]["Ok"]["inference"] == "":
-                print("The end! - we got an empty string. THIS IS AN ERROR ACTUALLY. WE SHOULD NOT GET HERE..")
+                print(
+                    "The end! - we got an empty string. (ERROR: WE SHOULD NOT GET HERE)"
+                )
                 print("Something went wrong:")
                 sys.exit(1)
         else:
```