[WeeklyReport] fxfxfxfxfxfxfxfx 2024.08.12~2024.08.25 #354 (Merged)

### Name
冯潇

### Internship Project
Supporting the MoE expert parallelism strategy in dynamic-static unified auto parallelism

### This Week's Work

1. Built a model equivalent to the qwen2moe SparseMoEBlock.
2. Converted that model to an auto-parallel version and verified that its results are consistent with the single-node results (see the sketch after this list).

* Related PR: https://github.com/PaddlePaddle/Paddle/pull/67594
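
For context, below is a minimal sketch of the non-intrusive style that Paddle's auto-parallel API enables: distributed attributes are attached to tensors with `paddle.distributed.shard_tensor`, while the forward logic stays identical to the single-card code. The mesh topology, the `ToyBlock` layer, and all shapes are illustrative assumptions rather than the actual qwen2moe code, and the snippet assumes it is launched on two ranks via `python -m paddle.distributed.launch`.

```python
import paddle
import paddle.distributed as dist

# Illustrative 1-D process mesh over two ranks (assumed topology).
mesh = dist.ProcessMesh([0, 1], dim_names=["ep"])

class ToyBlock(paddle.nn.Layer):
    """Hypothetical stand-in for one expert's FFN, not qwen2moe code."""

    def __init__(self):
        super().__init__()
        w = self.create_parameter(shape=[64, 64])
        # Attach the distributed attribute: shard along the output dim.
        # The forward logic below is identical to the single-card code,
        # matching the goal of changing as little original code as possible.
        self.w = dist.shard_tensor(w, mesh, [dist.Shard(1)])

    def forward(self, x):
        return paddle.matmul(x, self.w)

block = ToyBlock()
# Mark the input as replicated across the mesh.
x = dist.shard_tensor(paddle.randn([8, 64]), mesh, [dist.Replicate()])
y = block(x)
# A consistency check would compare y (gathered back) against the
# output of the unmodified single-node model.
```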

### Next Week's Work

1. Convert the qwen2moe SparseMoEBlock to an auto-parallel version.
2. Write unit tests.

### Mentor's Comments

Try to implement the auto-parallel version without changing the logic of the original source code.

---
### Name
冯潇

### Internship Project
Supporting the MoE expert parallelism strategy in dynamic-static unified auto parallelism

### This Week's Work

1. Read the paper *GShard: Scaling Giant Models with Conditional Computation and Automatic Sharding* to gain a deeper understanding of how MoE parallelism is implemented (see the gating sketch after this list).
2. Read DeepSpeed's MoE code.
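
As a rough illustration of the mechanism the GShard paper describes, here is a sketch of top-2 gating with a per-expert capacity limit, written with plain Paddle ops. All names, shapes, and the capacity value are assumptions for illustration, and the capacity check is simplified (each of the two expert choices is checked independently); this is not the paper's or DeepSpeed's actual implementation.

```python
import paddle
import paddle.nn.functional as F

def top2_gate(logits, capacity):
    """Toy GShard-style top-2 gating (illustrative sketch only).

    logits:   [tokens, experts] router scores.
    capacity: max tokens each expert may accept per batch.
    Returns expert ids and combine weights; tokens overflowing an
    expert's capacity get zero weight, i.e. they are dropped.
    """
    gates = F.softmax(logits, axis=-1)                    # [T, E]
    weights, experts = paddle.topk(gates, k=2, axis=-1)   # [T, 2]
    # Renormalize the two selected gate values to sum to 1 per token.
    weights = weights / weights.sum(axis=-1, keepdim=True)

    # Capacity check, simplified: each of the two choices is checked
    # independently (GShard counts the two assignments jointly).
    keep = paddle.ones_like(weights)
    for k in range(2):
        onehot = F.one_hot(experts[:, k], num_classes=gates.shape[-1])
        # 1-based position of each token in its chosen expert's queue.
        slot = (paddle.cumsum(onehot, axis=0) * onehot).sum(axis=-1)
        keep[:, k] = (slot <= capacity).astype(weights.dtype)
    return experts, weights * keep

# Tiny usage example: 8 tokens routed among 4 experts, capacity 3.
experts, weights = top2_gate(paddle.randn([8, 4]), capacity=3)
```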

### Next Week's Work

1. Convert the qwen2moe SparseMoEBlock to an auto-parallel version, changing the original code as little as possible and making the most of the auto-parallel APIs.

### Mentor's Comments

Getting familiar with how expert parallelism is implemented in the code, and with its overall workflow, will help with modifying the code and troubleshooting issues.
