Commit 2601c45

docs(chapter5): fix image link format in the LLaMA2 Attention structure diagram
1 parent: 2fca30c

File tree

1 file changed: +1 −1 lines changed


docs/chapter5/第五章 动手搭建大模型.md

Lines changed: 1 addition & 1 deletion
@@ -114,7 +114,7 @@ torch.Size([1, 50, 768])
 In the LLaMA2 family, only the LLaMA2-70B model actually uses Grouped-Query Attention (GQA), but we still choose GQA to build our LLaMA Attention module: it improves model efficiency and saves some GPU memory.
 
 <div align='center'>
-<img src="https://raw.githubusercontent.com/datawhalechina/happy-llm/main/docs/images/5-images/llama2-attention" alt="alt text" width="70%">
+<img src="https://raw.githubusercontent.com/datawhalechina/happy-llm/main/docs/images/5-images/llama2-attention.png" alt="alt text" width="50%">
 <p>Figure 5.2 LLaMA2 Attention structure</p>
 </div>
 
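The context line in the diff above notes why GQA is used: several query heads share a single key/value head, so the K/V tensors (and the KV cache) shrink by the grouping factor. A minimal PyTorch sketch of that idea, with illustrative shapes and names that are not the repository's actual attention code:

```python
import torch

# Illustrative configuration (not the chapter's real hyperparameters):
# 8 query heads, 2 KV heads -> each KV head is shared by 4 query heads.
n_heads, n_kv_heads, head_dim, seq_len = 8, 2, 16, 50

q = torch.randn(1, n_heads, seq_len, head_dim)
k = torch.randn(1, n_kv_heads, seq_len, head_dim)  # 4x smaller than q along heads
v = torch.randn(1, n_kv_heads, seq_len, head_dim)

# Expand each KV head so every group of query heads attends to the same K/V.
n_rep = n_heads // n_kv_heads
k = k.repeat_interleave(n_rep, dim=1)  # (1, n_heads, seq_len, head_dim)
v = v.repeat_interleave(n_rep, dim=1)

# Standard scaled dot-product attention on the expanded heads.
scores = (q @ k.transpose(-2, -1)) / head_dim ** 0.5
out = torch.softmax(scores, dim=-1) @ v
print(out.shape)  # torch.Size([1, 8, 50, 16])
```

The memory saving comes from storing and caching only `n_kv_heads` K/V heads; the expansion to `n_heads` is a cheap view-like repetition done at attention time.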