Do preliminary load-testing on an IPFS test cluster with 1-2TB of data. Use exactly the Docker containers that we are going to deploy on the data.gov collaborators' machines.
We want to get ipfs-pack and filestore working before the collaborators start loading data into IPFS and replicating it (see #117 and #118) because those will change the structure of the DAG. In the meantime, we want to load-test IPFS with multi-TB loads, so we will run those tests on a separate cluster of machines with disposable data.
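A minimal sketch of the kind of load-test driver this implies, assuming the go-ipfs CLI is available on PATH (or reachable via `docker exec` into the test container); the file counts, sizes, and paths below are illustrative placeholders, not the actual test parameters:

```python
#!/usr/bin/env python3
"""Generate disposable test data and time `ipfs add` over it."""
import os
import subprocess
import time

DATA_DIR = "/tmp/ipfs-load-test"   # disposable data, wiped after the run (placeholder path)
NUM_FILES = 100                    # scale these two numbers up toward the 1-2TB target
FILE_SIZE = 64 * 1024 * 1024       # 64 MiB of random bytes per file


def generate_test_data():
    """Write NUM_FILES files of FILE_SIZE random bytes each."""
    os.makedirs(DATA_DIR, exist_ok=True)
    for i in range(NUM_FILES):
        path = os.path.join(DATA_DIR, f"file-{i:05d}.bin")
        with open(path, "wb") as f:
            f.write(os.urandom(FILE_SIZE))


def timed_add():
    """Recursively add the test directory and report rough throughput."""
    start = time.monotonic()
    subprocess.run(["ipfs", "add", "-r", "--quiet", DATA_DIR], check=True)
    elapsed = time.monotonic() - start
    total_bytes = NUM_FILES * FILE_SIZE
    print(f"added {total_bytes / 2**30:.1f} GiB in {elapsed:.1f}s "
          f"({total_bytes / 2**20 / elapsed:.1f} MiB/s)")


if __name__ == "__main__":
    generate_test_data()
    timed_add()
```

Since the data is disposable, the same script can be rerun against different container configurations to compare add throughput before the collaborators' real datasets are involved.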
With Jack's dataset, which is sharded into pairtrees, adding the data to IPFS takes about 20% longer because of the pairtree layout. We will have to optimize for this case in the future.
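For context, pairtree sharding maps each identifier to a deeply nested path of two-character directory segments, so the dataset contains many small intermediate directories. A minimal sketch of that mapping (simplified; the full Pairtree spec also defines character escaping, which is omitted here):

```python
import os

def pairtree_path(identifier: str, root: str = "pairtree_root") -> str:
    """Split an identifier into two-character segments and join them
    into a nested directory path, e.g. "abcd1234" -> root/ab/cd/12/34."""
    pairs = [identifier[i:i + 2] for i in range(0, len(identifier), 2)]
    return os.path.join(root, *pairs)

print(pairtree_path("abcd1234"))  # pairtree_root/ab/cd/12/34
```

The ~20% figure is from the observation above; a plausible explanation (an assumption, not measured here) is that `ipfs add -r` has to create an extra DAG node for every one of those tiny intermediate directories, which is overhead a flat layout would not incur.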