This repository was archived by the owner on Oct 16, 2023. It is now read-only.

Update README.md #66

Merged 1 commit on May 18, 2022
README.md: 6 changes (3 additions, 3 deletions)
```diff
@@ -59,16 +59,16 @@ Here GPT3-12-layers in FP16 is adopted.
 Here a node with 8 A100 80 GB GPUs is adopted. GPUs are fully connected with NvLink.
 Energon adopts the redundant computation elimination method from [EffectiveTransformer](https://github.com/bytedance/effective_transformer) and the sequence length is set the half of the padding length.
 <div align="center">
-<img src="https://user-images.githubusercontent.com/12018307/168971637-ffd1d6ba-44bb-4043-a275-3dc2a008c048.png" width = "500" height = "200" alt="Architecture" align=center />
+<img src="https://user-images.githubusercontent.com/12018307/168971637-ffd1d6ba-44bb-4043-a275-3dc2a008c048.png" width = "600" height = "240" alt="Architecture" align=center />
 </div>

 #### Latency
 Here GPT3 in FP16 is adopted.
 Here a node with 8 A100 80 GB GPUs is adopted. Every two GPUs are connected with NvLink.
 Here the sequence length is set the half of the padding length.
-FasterTransformer does not support the redundant computation elimination method in distributed execution.
+Here FasterTransformer is adopted in comparison and it does not support the redundant computation elimination method in distributed execution.
 <div align="center">
-<img src="https://user-images.githubusercontent.com/12018307/168971637-ffd1d6ba-44bb-4043-a275-3dc2a008c048.png" width = "500" height = "200" alt="Architecture" align=center />
+<img src="https://user-images.githubusercontent.com/12018307/168983141-44faad9f-7f44-4296-8be6-63c7f6d76263.png" width = "600" height = "300" alt="Architecture" align=center />
 </div>

 ### Contributing
```
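For context, the "redundant computation elimination" referenced in the diff is EffectiveTransformer-style padding removal: pad tokens are dropped before the transformer layers and scattered back afterwards, so compute scales with the number of real tokens rather than the padded length. A minimal PyTorch sketch of the idea follows; the function names are illustrative, not Energon's actual API.

```python
import torch

def remove_padding(hidden_states: torch.Tensor, attention_mask: torch.Tensor):
    """Pack a padded batch [B, S, H] into [N, H], keeping only real tokens.

    Illustrative EffectiveTransformer-style sketch, not Energon's real API.
    """
    # Flatten the mask to [B*S] and keep the indices of non-pad positions.
    indices = attention_mask.flatten().nonzero(as_tuple=True)[0]
    flat = hidden_states.reshape(-1, hidden_states.size(-1))  # [B*S, H]
    return flat[indices], indices  # [N, H], N = number of real tokens

def restore_padding(packed: torch.Tensor, indices: torch.Tensor,
                    batch: int, seq_len: int) -> torch.Tensor:
    """Scatter packed tokens [N, H] back into the padded layout [B, S, H]."""
    out = packed.new_zeros(batch * seq_len, packed.size(-1))
    out[indices] = packed
    return out.reshape(batch, seq_len, packed.size(-1))

# Two sequences of lengths 3 and 5, padded to 8: the average real length is
# half the padding length, matching the benchmark setting in the README.
B, S, H = 2, 8, 16
x = torch.randn(B, S, H)
mask = torch.zeros(B, S, dtype=torch.bool)
mask[0, :3] = True
mask[1, :5] = True

packed, idx = remove_padding(x, mask)        # [8, 16] instead of [16, 16]
restored = restore_padding(packed, idx, B, S)
assert torch.equal(restored[mask], x[mask])  # real tokens round-trip exactly
```

With sequences at half the padding length, this roughly halves the work in the token-wise layers; EffectiveTransformer rebuilds the padded layout only where attention needs it.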