
Conversation

@pkuyym pkuyym commented Jan 3, 2018

Resolves #7173

@pkuyym pkuyym requested review from JiayiFeng and reyoung January 4, 2018 04:54

// should consider multiple levels
size_t height = dst_num_rows;
auto lod_level = rank_table.level();
Collaborator commented:
rank_table.level() is not related to ShrinkMemory.

Also, must lod_level always be zero here? You cannot shrink a fine level of LoD without also shrinking the coarse level.
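
A small illustration of that constraint (hypothetical numbers; assumes Paddle's offset-based LoD, where a coarse level's values index into the next level's entries):

#include <cassert>
#include <cstddef>
#include <vector>

int main() {
  // Two-level LoD in offset form: level 0 (coarse) indexes into
  // level 1's entries; level 1 (fine) holds row offsets.
  std::vector<std::size_t> level0 = {0, 2, 4};        // 2 sequences
  std::vector<std::size_t> level1 = {0, 3, 5, 9, 10}; // 4 sub-sequences

  // Dropping the last coarse sequence forces the fine level to
  // shrink too: only offsets up to index level0.back() survive.
  level0 = {0, 2};
  level1.resize(level0.back() + 1);  // level1 is now {0, 3, 5}
  assert(level1.back() == 5);
  return 0;
}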

Contributor Author (@pkuyym) replied:

Thanks, updated following the comments.

framework::AppendLoD(&remain, lod_offset.first);
for (size_t j = 0; j < remain.size(); ++j) {
out_lod->emplace_back(remain[j]);
}
Collaborator commented:

This logic is too complex. Since we only shrink the first level of LoD, we can simply drop the last N elements of it.
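
A minimal sketch of the suggested simplification (ShrinkFirstLevel is a hypothetical helper; assumes LoD is a vector of cumulative offset vectors, so keeping the first num_seqs sequences means keeping the first num_seqs + 1 offsets):

#include <cstddef>
#include <vector>

using LoD = std::vector<std::vector<std::size_t>>;

// Keep only the first num_seqs sequences of the coarsest (first) level.
LoD ShrinkFirstLevel(const LoD &lod, std::size_t num_seqs) {
  LoD out(1);
  out[0].assign(lod[0].begin(), lod[0].begin() + num_seqs + 1);
  return out;
}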

Contributor Author (@pkuyym) replied:

Updated following the comments.

@pkuyym pkuyym left a comment:

The shrink is now done for the first level of LoD only.



@JiayiFeng JiayiFeng left a comment:

LGTM

@pkuyym pkuyym merged commit a320276 into PaddlePaddle:develop Jan 10, 2018