Support exporting model weights from browser #13
Comments
+1
No +1's please
+1
+1
Will this also include exporting without the browser? I would like to be able to export a trained model to a file from node (after running model.fit(...)).
This issue specifically doesn't talk about it, but we do plan on supporting checkpointing to TensorFlow SavedModels in node.js.
Any updates on this?
We're working on the design!
Looks like some progress in the last few days... You guys rock, looking forward to this! Is there an ETA? :)
@broggi, thanks for checking. For basic features such as saving …
Usage example:

```sh
tensorflowjs_converter \
  --input_format tensorflowjs --output_format keras \
  /tmp/tfjs-artifacts /tmp/keras-model.h5
```

Also in this change:
* Remove the requirement that build-pip-package.sh must not be run from a virtualenv. Activating another virtualenv from the current virtualenv works fine.

Towards: tensorflow/tfjs#13
@caisq so erm... how is the progress going?
@PheoOneWhoMade Apart from the comments made in this issue, you can also look at the linked pull requests to see the progress as it happens.
* `tf.io.browserDownloads()` causes the browser to download a model's artifacts as files.
* In the case of a `tf.Model` (Keras-style model), two files consistent with the tensorflowjs_converter format will be downloaded:
  * A JSON file that contains the model topology and the weight manifest
  * A binary file that contains the weights of the model
* `tf.io.browserFiles()` supports loading model artifacts from files, such as user-selected files.

Towards: tensorflow/tfjs#13
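The two-file layout described above (a JSON manifest alongside one concatenated binary weight file) can be sketched in plain JavaScript. This is only an illustration of the general idea; the manifest field names below are assumptions, not the exact tensorflowjs weight-manifest schema.

```javascript
// Sketch: pack named Float32Array weights into a single binary buffer,
// recording name, shape, and byte offsets in a JSON-serializable manifest.
// Field names are illustrative assumptions, not the tfjs wire format.
function packWeights(weights) {
  // weights: [{name, shape, values: Float32Array}]
  const totalBytes = weights.reduce((n, w) => n + w.values.byteLength, 0);
  const buffer = new ArrayBuffer(totalBytes);
  const out = new Float32Array(buffer);
  const manifest = [];
  let offset = 0;  // offset in float32 elements
  for (const w of weights) {
    out.set(w.values, offset);
    manifest.push({
      name: w.name, shape: w.shape, dtype: 'float32',
      byteOffset: offset * 4, byteLength: w.values.byteLength,
    });
    offset += w.values.length;
  }
  return {manifest, buffer};
}

// Recover one named weight from the packed buffer using the manifest.
function unpackWeight(manifest, buffer, name) {
  const entry = manifest.find(e => e.name === name);
  return new Float32Array(buffer, entry.byteOffset, entry.byteLength / 4);
}
```

The manifest can then be embedded in the JSON file next to the model topology, while the buffer becomes the binary weights file.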
* Model.save() uses IOHandler.save() to save a model as artifacts.
* Model.load() uses IOHandler.load() to get artifacts and construct a Model object.
* Use dummy implementations of IOHandler in unit tests.

Towards: tensorflow/tfjs#13
@caisq local storage before the user persists to the server over HTTP handles one use case. The HTTP implementation in your fork would almost do it, but I would need to tweak some CORS configuration for the fetch. It might be more flexible to take a fetch configuration object as a parameter and replace the body (then you also don't need to take a method manually). Alternatively, for my specific use case, exposing the credentials property would probably be sufficient.
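The suggestion above can be sketched as a small helper: the caller supplies a standard fetch `RequestInit` object (credentials, mode, extra headers), and the library overrides only the fields it must control. `buildSaveRequest` is a hypothetical name for illustration, not a tfjs API.

```javascript
// Sketch of the idea discussed above: accept a caller-provided fetch
// RequestInit and let the library control only the method and body.
// `buildSaveRequest` is a hypothetical helper, not part of tfjs.
function buildSaveRequest(userInit, formDataBody) {
  return {
    ...userInit,        // caller-controlled: credentials, mode, headers, ...
    method: 'POST',     // library-controlled
    body: formDataBody, // library-controlled: the model artifacts
  };
}
```

Usage would look like `fetch(url, buildSaveRequest({credentials: 'include', mode: 'cors'}, body))`, which covers the CORS and credentials cases without the library enumerating every fetch option.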
@rajsite I can make it possible to configure additional fields of the …
@atanasster I see. Interesting use case. We're going to support saving weights from a subset of a model's layers and loading weights into a subset of a model's layers. But it may not come out in the initial release with the save/export features.
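Selecting a subset of a model's weights can be sketched as filtering a weight manifest by layer name, given the common `layerName/weightName` naming convention. This helper is a hypothetical illustration, not the planned tfjs API.

```javascript
// Sketch: keep only manifest entries belonging to the given layers,
// assuming weight names follow the "layerName/weightName" convention.
// Hypothetical helper for illustration only.
function filterWeightsByLayer(manifest, layerNames) {
  const wanted = new Set(layerNames);
  return manifest.filter(e => wanted.has(e.name.split('/')[0]));
}
```

Saving would pack only the filtered entries; loading would match the filtered entries against the target model's layers by name.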
@caisq those should handle my current usage 👍
@caisq, thanks, looking forward to it. If you don't mind, can I send you my manual serialization link on GitHub for some feedback?
* Add tf.io.browserIndexedDB for saving model artifacts to browser IndexedDB and loading them.

Towards: tensorflow/tfjs#13
* Add tf.io.browserHTTPRequest
* Allows model artifacts to be sent via a multipart/form-data HTTP request

Towards: tensorflow/tfjs#13
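A multipart/form-data request like the one browserHTTPRequest sends can be assembled by hand, which makes the wire shape concrete. The part names used below (`model.json`, `model.weights.bin`) and the boundary handling are assumptions for illustration, not the exact tfjs format.

```javascript
// Sketch: build a multipart/form-data body carrying a JSON topology part
// and a binary weights part. Part names and layout are illustrative
// assumptions, not the exact format browserHTTPRequest emits.
function buildMultipartBody(topologyJson, weightBytes, boundary) {
  const jsonPart =
      `--${boundary}\r\n` +
      `Content-Disposition: form-data; name="model.json"; filename="model.json"\r\n` +
      `Content-Type: application/json\r\n\r\n` +
      topologyJson + `\r\n`;
  const weightsHeader =
      `--${boundary}\r\n` +
      `Content-Disposition: form-data; name="model.weights.bin"; filename="model.weights.bin"\r\n` +
      `Content-Type: application/octet-stream\r\n\r\n`;
  return Buffer.concat([
    Buffer.from(jsonPart),
    Buffer.from(weightsHeader),
    Buffer.from(weightBytes),
    Buffer.from(`\r\n--${boundary}--\r\n`),  // closing boundary
  ]);
}
```

In a browser the same effect comes from appending two Blobs to a `FormData` and letting fetch pick the boundary; the manual version above just shows what the server receives.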
…173)

* Add integration_test/tfjs2keras; fix the bugs uncovered by the test

This is a node.js-Python integration test. It checks that the models exported from tfjs-layers can be loaded properly by Keras in Python.

Fix bugs uncovered by the test:
- In Activation layers
- In RNN layers

The new test can be invoked with the command `yarn test-integ`. The test runs continuously on Travis, under the `stage: integ` tag, in a node.js 8 environment, which comes with Python.

Towards: tensorflow/tfjs#13
@caisq I am trying to load a Keras model through node.js on the backend. I see that an IOHandler interface is present and that it is used by the loadModel function. Is it currently possible to implement such an interface for use with node.js? Should I give this a try, or are other parts missing for this to work? Is this functionality maybe going to be implemented in the tfjs-node repo?
We're going to implement IOHandlers for writing to disk for tfjs-node very soon! |
FEATURE -- Add the capability to save and load tf.Models using the following mediums: browser local storage, IndexedDB, file downloads and uploads, and HTTP requests.

Towards: tensorflow/tfjs#13
Update: Please use the new API and let us know what you think. If you find bugs or have enhancement requests, please file GitHub issues. I will leave this issue open for now because we plan to modify a few examples in https://github.com/tensorflow/tfjs-examples so they can save/load trained or fine-tuned models locally. The saving/loading support for TensorFlow.js in Node.js will be released later. |
You can play with the runnable code snippets on our website to get a feel for the new API:
Now that a concrete example has been added to the Iris example in tfjs-examples, I will close this issue. I filed a separate issue to track the support of model saving and loading in Node.js:
* Drop old binding-demo
* Keep names the same