Fix slow sync #1567
Conversation
Force-pushed from 14a879e to 2d4328c
Force-pushed from 67a43b3 to dab6124
sync/src/block/extension.rs
Outdated
  // FIXME: handle import errors
  Err(err) => {
      cwarn!(SYNC, "Cannot import header({}): {:?}", header.hash(), err);
      break
  }
- _ => {}
+ _ => queued.push(hash),
You should handle only queue-related errors here.
Now I'm pushing hashes to the queue after an import error occurs.
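For context, here is a minimal, self-contained sketch of the control flow discussed in this thread. The type names, the `import_header` stand-in, and the error variants below are invented for illustration and are not the real CodeChain APIs: queue-related outcomes are tolerated, any other import error aborts the batch, and successfully submitted hashes are pushed to the queued list.

```rust
// Illustrative only: `HeaderHash`, `ImportError`, and `import_header` are
// stand-ins for the real types and calls in sync/src/block/extension.rs.
type HeaderHash = u64;

#[derive(Debug)]
enum ImportError {
    AlreadyInChain, // queue/chain-related: not fatal
    AlreadyQueued,  // queue/chain-related: not fatal
    Other(&'static str),
}

fn import_header(hash: HeaderHash) -> Result<(), ImportError> {
    // Pretend the client rejects one specific header.
    if hash == 7 {
        Err(ImportError::Other("bad header"))
    } else {
        Ok(())
    }
}

fn import_batch(hashes: &[HeaderHash]) -> Vec<HeaderHash> {
    let mut queued = Vec::new();
    for &hash in hashes {
        match import_header(hash) {
            // Queue-related results are expected; just skip them.
            Err(ImportError::AlreadyInChain) | Err(ImportError::AlreadyQueued) => {}
            // Any other import error stops the rest of the batch.
            Err(err) => {
                eprintln!("Cannot import header({}): {:?}", hash, err);
                break;
            }
            // Successfully handed over for verification: remember it as queued.
            Ok(()) => queued.push(hash),
        }
    }
    queued
}

fn main() {
    assert_eq!(import_batch(&[1, 2, 3]), vec![1, 2, 3]);
    assert_eq!(import_batch(&[1, 7, 3]), vec![1]); // stops at the failing header
}
```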
sync/src/block/downloader/header.rs
Outdated
    .get(&self.pivot.hash)
    .map(Clone::clone)
    .or_else(|| self.client.block_header(&BlockId::Hash(self.pivot.hash)))
    .unwrap(),
Why did you use function chaining here? Using `match` seems more readable to me.
I changed it to `match` style.
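For comparison, a small self-contained sketch of the two styles; the cache, client, and header types below are placeholders, not the real `HeaderDownloader` fields.

```rust
use std::collections::HashMap;

// Placeholder types standing in for the real hash/header/client types.
type Hash = u64;
#[derive(Clone)]
struct Header;

struct Client;
impl Client {
    fn block_header(&self, _hash: &Hash) -> Option<Header> {
        Some(Header)
    }
}

// Chained style, roughly what the outdated diff showed.
fn pivot_header_chained(cache: &HashMap<Hash, Header>, client: &Client, pivot: Hash) -> Header {
    cache
        .get(&pivot)
        .map(Clone::clone)
        .or_else(|| client.block_header(&pivot))
        .unwrap()
}

// Equivalent `match` style, as suggested in the review.
fn pivot_header_matched(cache: &HashMap<Hash, Header>, client: &Client, pivot: Hash) -> Header {
    match cache.get(&pivot) {
        Some(header) => header.clone(),
        None => client.block_header(&pivot).unwrap(),
    }
}

fn main() {
    let cache = HashMap::new();
    let client = Client;
    let _ = pivot_header_chained(&cache, &client, 1);
    let _ = pivot_header_matched(&cache, &client, 1);
}
```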
Please fix the commit message.
Force-pushed from 6b80947 to 2cd87be
@HoOngEe Please resolve conflicts.
In the current structure, multithreading helps little, because the verification step must be sequential.
I resolved the conflicts.
Now the total time taken for a node to synchronize with the corgi blocks is about 145 minutes.
A queued cache, distinct from the downloaded cache, has been introduced in `HeaderDownloader`. I could not pin down the exact reason, but decreasing `MAX_HEADERS_TO_IMPORT` to 1,000 helps reduce the sync time. Lastly, in the current structure spawning many threads does not help, because the CPU-intensive jobs must run sequentially, so I decreased the thread count to 2.
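As a rough, self-contained sketch of the tuning described above: the constant name `MAX_HEADERS_TO_IMPORT` is taken from this discussion, while the thread-count constant and the two-cache layout are illustrative assumptions, not the actual definitions in the sync crate.

```rust
use std::collections::HashMap;

// Illustrative values; the real definitions live in the sync crate.
const MAX_HEADERS_TO_IMPORT: usize = 1_000; // smaller import batches reduced sync time here
const SYNC_THREADS: usize = 2; // hypothetical name; more threads do not help because
                               // the verification step itself is sequential

// Hypothetical shape of the two caches kept by the header downloader after this change.
struct HeaderCaches<H> {
    downloaded: HashMap<u64, H>, // fetched from peers, not yet handed to the import queue
    queued: HashMap<u64, H>,     // handed to the import queue, awaiting verification
}

fn main() {
    let caches: HeaderCaches<()> = HeaderCaches {
        downloaded: HashMap::new(),
        queued: HashMap::new(),
    };
    println!(
        "{} headers per batch, {} sync threads, {}+{} cached headers",
        MAX_HEADERS_TO_IMPORT,
        SYNC_THREADS,
        caches.downloaded.len(),
        caches.queued.len()
    );
}
```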
The following image shows the CPU time spent during a 50,000-block synchronization.

Now the main CPU-intensive jobs are all related to EC calculation for verification.