Commit 025261a

huggingface -> transformers

1 parent 3daa328 commit 025261a

File tree

12 files changed: +22 / -22 lines

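Every hunk in this commit is the same one-line change: the `SwanLabCallback` import moves from `swanlab.integration.huggingface` to `swanlab.integration.transformers`. Code that must run against both old and new swanlab releases can resolve the path at runtime. A minimal sketch, assuming only the standard library; the helper name `import_first` is hypothetical, not part of swanlab:

```python
import importlib


def import_first(*module_paths):
    """Return the first module that imports successfully.

    Tries each dotted path in order; re-raises the last
    ImportError if none of them resolve.
    """
    last_err = None
    for path in module_paths:
        try:
            return importlib.import_module(path)
        except ImportError as err:
            last_err = err
    raise last_err


# Hypothetical usage for this migration (requires swanlab installed):
# mod = import_first("swanlab.integration.transformers",  # new path
#                    "swanlab.integration.huggingface")   # old path
# SwanLabCallback = mod.SwanLabCallback
```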

en/examples/bert.md

Lines changed: 1 addition & 1 deletion

````diff
@@ -45,7 +45,7 @@ The IMDB dataset labels 1 as positive and 0 as negative.
 import torch
 from datasets import load_dataset
 from transformers import AutoTokenizer, AutoModelForSequenceClassification, Trainer, TrainingArguments
-from swanlab.integration.huggingface import SwanLabCallback
+from swanlab.integration.transformers import SwanLabCallback
 import swanlab

 def predict(text, model, tokenizer, CLASS_NAME):
````

en/examples/pretrain_llm.md

Lines changed: 2 additions & 2 deletions

````diff
@@ -145,7 +145,7 @@ print(args)
 Use the built-in training from Transformers and integrate SwanLab for visualization and logging:

 ```python
-from swanlab.integration.huggingface import SwanLabCallback
+from swanlab.integration.transformers import SwanLabCallback
 trainer = transformers.Trainer(
     model=model,
     tokenizer=tokenizer,
@@ -188,7 +188,7 @@ Project directory structure:
 import datasets
 import transformers
 import swanlab
-from swanlab.integration.huggingface import SwanLabCallback
+from swanlab.integration.transformers import SwanLabCallback
 import modelscope

 def main():
````

en/examples/qwen_finetune.md

Lines changed: 1 addition & 1 deletion

````diff
@@ -88,7 +88,7 @@ import pandas as pd
 import torch
 from datasets import Dataset
 from modelscope import snapshot_download, AutoTokenizer
-from swanlab.integration.huggingface import SwanLabCallback
+from swanlab.integration.transformers import SwanLabCallback
 from peft import LoraConfig, TaskType, get_peft_model
 from transformers import AutoModelForCausalLM, TrainingArguments, Trainer, DataCollatorForSeq2Seq
 import os
````

en/guide_cloud/integration/integration-huggingface-transformers.md

Lines changed: 3 additions & 3 deletions

````diff
@@ -11,7 +11,7 @@ You can use Transformers to quickly train models while using SwanLab for experim
 ## 1. Import SwanLabCallback

 ```python
-from swanlab.integration.huggingface import SwanLabCallback
+from swanlab.integration.transformers import SwanLabCallback
 ```

 **SwanLabCallback** is a logging class adapted for Transformers.
@@ -24,7 +24,7 @@ from swanlab.integration.huggingface import SwanLabCallback
 ## 2. Pass to Trainer

 ```python (1,7,12)
-from swanlab.integration.huggingface import SwanLabCallback
+from swanlab.integration.transformers import SwanLabCallback
 from transformers import Trainer, TrainingArguments

 ...
@@ -47,7 +47,7 @@ trainer.train()
 import evaluate
 import numpy as np
 import swanlab
-from swanlab.integration.huggingface import SwanLabCallback
+from swanlab.integration.transformers import SwanLabCallback
 from datasets import load_dataset
 from transformers import AutoModelForSequenceClassification, AutoTokenizer, Trainer, TrainingArguments

````

en/guide_cloud/integration/integration-sentence-transformers.md

Lines changed: 3 additions & 3 deletions

````diff
@@ -9,7 +9,7 @@ You can use Sentence Transformers to quickly train models while using SwanLab fo
 ## 1. Import SwanLabCallback

 ```python
-from swanlab.integration.huggingface import SwanLabCallback
+from swanlab.integration.transformers import SwanLabCallback
 ```

 **SwanLabCallback** is a logging class adapted for HuggingFace series tools (such as Transformers).
@@ -22,7 +22,7 @@ from swanlab.integration.huggingface import SwanLabCallback
 ## 2. Pass to Trainer

 ```python (1,7,12)
-from swanlab.integration.huggingface import SwanLabCallback
+from swanlab.integration.transformers import SwanLabCallback
 from sentence_transformers import SentenceTransformer, SentenceTransformerTrainer

 ...
@@ -45,7 +45,7 @@ trainer.train()
 from datasets import load_dataset
 from sentence_transformers import SentenceTransformer, SentenceTransformerTrainer
 from sentence_transformers.losses import MultipleNegativesRankingLoss
-from swanlab.integration.huggingface import SwanLabCallback
+from swanlab.integration.transformers import SwanLabCallback

 model = SentenceTransformer("bert-base-uncased")
````
en/guide_cloud/integration/integration-unsloth.md

Lines changed: 1 addition & 1 deletion

````diff
@@ -24,7 +24,7 @@ The parameters that can be defined for SwanLabCallback include:
 ## 2. Passing to Trainer

 ```python {1,7,12}
-from swanlab.integration.huggingface import SwanLabCallback
+from swanlab.integration.transformers import SwanLabCallback
 from trl import GRPOTrainer

 ...
````

zh/examples/bert.md

Lines changed: 1 addition & 1 deletion

````diff
@@ -48,7 +48,7 @@ In the IMDB dataset, 1 is positive and 0 is negative.
 import torch
 from datasets import load_dataset
 from transformers import AutoTokenizer, AutoModelForSequenceClassification, Trainer, TrainingArguments
-from swanlab.integration.huggingface import SwanLabCallback
+from swanlab.integration.transformers import SwanLabCallback
 import swanlab

 def predict(text, model, tokenizer, CLASS_NAME):
````

zh/examples/pretrain_llm.md

Lines changed: 2 additions & 2 deletions

````diff
@@ -149,7 +149,7 @@ print(args)
 Start training with the Trainer built into Transformers, and bring in SwanLab for visual logging:

 ```python
-from swanlab.integration.huggingface import SwanLabCallback
+from swanlab.integration.transformers import SwanLabCallback
 trainer = transformers.Trainer(
     model=model,
     tokenizer=tokenizer,
@@ -196,7 +196,7 @@ swanlab login
 import datasets
 import transformers
 import swanlab
-from swanlab.integration.huggingface import SwanLabCallback
+from swanlab.integration.transformers import SwanLabCallback
 import modelscope

 def main():
````

zh/examples/qwen_finetune.md

Lines changed: 1 addition & 1 deletion

````diff
@@ -94,7 +94,7 @@ import pandas as pd
 import torch
 from datasets import Dataset
 from modelscope import snapshot_download, AutoTokenizer
-from swanlab.integration.huggingface import SwanLabCallback
+from swanlab.integration.transformers import SwanLabCallback
 from peft import LoraConfig, TaskType, get_peft_model
 from transformers import AutoModelForCausalLM, TrainingArguments, Trainer, DataCollatorForSeq2Seq
 import os
````

zh/guide_cloud/integration/integration-huggingface-transformers.md

Lines changed: 3 additions & 3 deletions

````diff
@@ -11,7 +11,7 @@ Hugging Face's [Transformers](https://github.com/huggingface/transformers) is
 ## 1. Import SwanLabCallback

 ```python
-from swanlab.integration.huggingface import SwanLabCallback
+from swanlab.integration.transformers import SwanLabCallback
 ```

 **SwanLabCallback** is a logging class adapted for Transformers.
@@ -24,7 +24,7 @@ from swanlab.integration.huggingface import SwanLabCallback
 ## 2. Pass to Trainer

 ```python (1,7,12)
-from swanlab.integration.huggingface import SwanLabCallback
+from swanlab.integration.transformers import SwanLabCallback
 from transformers import Trainer, TrainingArguments

 ...
@@ -47,7 +47,7 @@ trainer.train()
 import evaluate
 import numpy as np
 import swanlab
-from swanlab.integration.huggingface import SwanLabCallback
+from swanlab.integration.transformers import SwanLabCallback
 from datasets import load_dataset
 from transformers import AutoModelForSequenceClassification, AutoTokenizer, Trainer, TrainingArguments
````
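Since all 22 changed lines are the same textual substitution, the edit can be applied mechanically to any remaining files. A minimal sketch of a one-shot migration script, assuming plain-text Markdown files and no other occurrences of the old path:

```python
from pathlib import Path

OLD = "swanlab.integration.huggingface"
NEW = "swanlab.integration.transformers"


def migrate_file(path):
    """Rewrite the old import path to the new one; return True if the file changed."""
    text = path.read_text(encoding="utf-8")
    if OLD not in text:
        return False
    path.write_text(text.replace(OLD, NEW), encoding="utf-8")
    return True


def migrate_tree(root):
    """Apply migrate_file to every Markdown file under root; return changed paths."""
    return [p for p in Path(root).rglob("*.md") if migrate_file(p)]
```

Running `migrate_tree` over the docs root would reproduce this commit's 12-file change in one pass.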
