This repository was archived by the owner on Apr 16, 2020. It is now read-only.

Do preliminary load-testing with 1-2TB in IPFS test cluster #126

Closed
flyingzumwalt opened this issue Jan 19, 2017 · 4 comments

@flyingzumwalt (Contributor) commented Jan 19, 2017

Do preliminary load-testing on the IPFS test cluster with 1-2TB of data. Use exactly the Docker containers that we are going to deploy on the data.gov collaborators' machines.

We want to get ipfs-pack and filestore working before the collaborators start loading data into IPFS and replicating it (see #117 and #118), because those tools will change the structure of the DAG. In the meantime, we want to load-test IPFS with multi-TB loads, so we will run those tests on a separate cluster of machines with disposable data.

@jbenet (Contributor) commented Jan 25, 2017

Am I needed in this issue? Can I unassign myself until I'm needed?

@flyingzumwalt (Contributor, Author) commented

I think you've got it covered now, unless @whyrusleeping still needs anything from you.

@flyingzumwalt (Contributor, Author) commented

Try doing this with a sample dataset based on Jack's output from running du.

@flyingzumwalt (Contributor, Author) commented

With Jack's dataset, which is sharded into pairtrees, adding the data to IPFS will take about 20% longer because of the pairtrees. We will have to optimize for this case in the future.
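For context on why the pairtree layout adds overhead: a pairtree shards an identifier into nested two-character directory segments, so a flat set of files becomes a deep tree of many tiny directories, and each extra directory level becomes another node in the IPFS DAG. A minimal sketch of the path-mapping idea (the identifiers are hypothetical, and this omits the character-escaping rules of the full pairtree convention):

```python
# Minimal pairtree sketch: split an identifier into two-character
# directory segments. This illustrates the layout only; the full
# pairtree spec also defines character escaping, which is omitted here.

def pairtree_path(identifier: str) -> str:
    """Return the nested directory path a pairtree uses for an identifier."""
    segments = [identifier[i:i + 2] for i in range(0, len(identifier), 2)]
    return "/".join(segments)

if __name__ == "__main__":
    # One flat object name becomes a chain of small directories,
    # each of which is an extra node when the tree is added to IPFS.
    print(pairtree_path("abcd1234"))  # ab/cd/12/34
    print(pairtree_path("abcde"))     # ab/cd/e  (odd-length tail kept as-is)
```

The extra cost observed above plausibly comes from hashing and linking all of these small intermediate directory objects rather than from the file contents themselves.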
