Support exporting model weights from browser #13


Closed
tafsiri opened this issue Mar 21, 2018 · 27 comments
Labels
comp:layers type:feature New feature or request

Comments

@tafsiri
Contributor

tafsiri commented Mar 21, 2018

No description provided.

@tafsiri tafsiri added the type:feature (New feature or request) and comp:layers labels Mar 21, 2018
@huan
Contributor

huan commented Mar 31, 2018

+1

@quantuminformation

No +1's please

@venkatesh-sakthivel

+1

@thibo73800

+1

@travisstaloch

Will this also include exporting without the browser? I would like to be able to export a trained model to a file from node (after running model.fit(...)).

easadler pushed a commit to easadler/tfjs that referenced this issue Apr 12, 2018
@nsthorat
Contributor

This issue doesn't specifically cover it, but we plan to support checkpointing to TensorFlow SavedModels in Node.js.

@0xDaksh

0xDaksh commented Apr 15, 2018

any updates on this?

@nsthorat
Contributor

We're working on the design!

@broggi

broggi commented Apr 30, 2018

Looks like some progress in the last few days... You guys rock, looking forward to this! Is there an ETA? :)

@caisq
Contributor

caisq commented Apr 30, 2018

@broggi, thanks for checking.

For basic features, such as saving tf.Models (Keras-style models) to browser local storage or as downloaded files, we are looking at the next couple of weeks as the ETA. For more advanced features, such as sending models to HTTP servers, the ETA will be longer. For saving non-Keras-style models, e.g., FrozenModels loaded from converted TensorFlow SavedModels, the ETA will be even longer.
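All of the browser-saving targets listed above (local storage, file downloads, HTTP) boil down to serializing the same two artifacts: a topology JSON and the binary weights. A minimal sketch of the local-storage flavor, runnable outside a browser by standing a `Map` in for `window.localStorage` (all names here are illustrative, not the tfjs API):

```javascript
// Stand-in for window.localStorage so this sketch runs in plain Node.
const storage = new Map();

// Hypothetical helpers: persist the topology JSON and the raw weight
// bytes (base64-encoded, since localStorage only holds strings).
function saveModelArtifacts(name, topology, weightData) {
  storage.set(`model/${name}/topology`, JSON.stringify(topology));
  storage.set(`model/${name}/weights`,
              Buffer.from(weightData).toString('base64'));
}

function loadModelArtifacts(name) {
  return {
    topology: JSON.parse(storage.get(`model/${name}/topology`)),
    weightData: Buffer.from(storage.get(`model/${name}/weights`), 'base64'),
  };
}

saveModelArtifacts('demo', {layers: []}, new Uint8Array([1, 2, 3]));
const {topology, weightData} = loadModelArtifacts('demo');
console.log(weightData.length);  // 3
```

The downloaded-files variant differs only in where the two artifacts end up (a `.json` file and a binary file instead of storage keys).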

caisq added a commit to caisq/tfjs-converter-1 that referenced this issue May 1, 2018
Usage example:
```sh
tensorflowjs_converter \
  --input_format tensorflowjs --output_format keras \
  /tmp/tfjs-artifacts /tmp/keras-model.h5
```

Also in this change:
* Remove the requirement that build-pip-package.sh must not be run
  from a virtualenv. Activating another virtualenv from the current
  virtualenv works fine.

Towards: tensorflow/tfjs#13
@FreezePhoenix

@caisq so erm... how is the progress going?

@caisq
Contributor

caisq commented May 2, 2018

@FreezePhoenix Apart from the comments made in this issue, you can also look at the linked pull requests to see the progress as it happens.

caisq added a commit to tensorflow/tfjs-core that referenced this issue May 3, 2018
* `tf.io.browserDownloads()` causes the browser to download a model's artifact as files.
  * In the case of `tf.Model` (Keras-style model), two files consistent with the tensorflowjs_converter format will be downloaded:
    * A JSON file that contains model topology and weight manifest
    * A binary file that contains the weights of the model
* `tf.io.browserFiles()` supports loading model artifacts from files such as user-selected files.

Towards tensorflow/tfjs#13
caisq added a commit to caisq/tfjs-layers-1 that referenced this issue May 3, 2018
* Model.save() uses IOHandler.save() to save model as artifacts.
* Model.load() uses IOHandler.load() to get artifacts and construct
  a Model object.
* Using dummy implementations of IOHandler in unit tests.

Towards: tensorflow/tfjs#13
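The commit above has `Model.save()` and `Model.load()` delegate to an `IOHandler`, with dummy implementations used in the unit tests. A hedged sketch of that contract (field names follow the artifact format described earlier in the thread; real tfjs handlers expose async `save()`/`load()`, simplified to synchronous here for brevity):

```javascript
// Dummy in-memory IOHandler, in the spirit of the unit-test doubles
// the commit mentions. Illustrative only, not the tfjs interface.
function memoryIOHandler() {
  let stored = null;
  return {
    save(artifacts) {
      stored = artifacts;
      return {modelArtifactsInfo: {dateSaved: new Date()}};
    },
    load() {
      return stored;
    },
  };
}

// Model.save(handler) would hand artifacts like these to handler.save();
// Model.load(handler) would reconstruct a Model from handler.load().
const artifacts = {
  modelTopology: {className: 'Model', config: {layers: []}},
  weightData: new Uint8Array([0, 1, 2, 3]).buffer,  // binary weights
};

const handler = memoryIOHandler();
handler.save(artifacts);
const loaded = handler.load();
console.log(loaded.weightData.byteLength);  // 4
```

The point of the indirection is that local storage, IndexedDB, file download, and HTTP backends can all implement the same two-method contract.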
caisq added a commit to caisq/deeplearnjs that referenced this issue May 3, 2018
for saving model artifacts to browser IndexedDB and loading them.

Towards: tensorflow/tfjs#13
@rajsite

rajsite commented May 9, 2018

@caisq Saving to local storage, before the user persists the model to a server over HTTP, handles one of my use cases.

The HTTP implementation in your fork would almost do it, but I would need to tweak some CORS configuration for the fetch. It might be more flexible to take a fetch RequestInit configuration object as a parameter and replace its body (then you also don't need to take a method manually).

Alternatively, for my specific use case, exposing the credentials property would probably be sufficient.

@caisq
Contributor

caisq commented May 9, 2018

@rajsite I can make it possible to configure additional fields of the RequestInit used with the fetch, including cache, credentials, headers and mode. How does that sound?
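What this proposal amounts to is whitelisting a few `RequestInit` fields and merging them into the request the HTTP-saving handler builds, while keeping control of `method` and `body`. A sketch of that merge (pure JS, no network call; the function name and allowed-field list are illustrative, not the tfjs API):

```javascript
// Merge caller-supplied RequestInit fields into the handler's own
// request settings. The handler keeps ownership of method and body;
// the caller may override only the whitelisted fields.
function buildRequestInit(body, userInit = {}) {
  const allowed = ['cache', 'credentials', 'headers', 'mode'];
  const init = {method: 'POST', body};
  for (const key of allowed) {
    if (userInit[key] !== undefined) {
      init[key] = userInit[key];
    }
  }
  return init;
}

// rajsite's case: send cookies cross-origin.
const init = buildRequestInit('payload', {credentials: 'include', mode: 'cors'});
console.log(init.method, init.credentials);  // POST include
```

The resulting object would then be passed straight to `fetch(url, init)`.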

@caisq
Contributor

caisq commented May 9, 2018

@atanasster I see. Interesting use case. We're going to support saving weights from a subset of a model's layers and loading weights into a subset of a model's layers. But it may not come out in the initial release with the save/export features.

@rajsite

rajsite commented May 9, 2018

@caisq those should handle my current usage 👍
Also, being able to make my own IOHandler sounds like a good backup.

@atanasster
Contributor

@caisq, thanks, looking forward to it. If you don't mind, can I send you a GitHub link to my manual serialization code for some feedback?

caisq added a commit to tensorflow/tfjs-core that referenced this issue May 9, 2018
* Add tf.io.browserIndexedDB

for saving model artifacts to browser IndexedDB and loading them.

Towards: tensorflow/tfjs#13
caisq added a commit to caisq/deeplearnjs that referenced this issue May 9, 2018
* Allows model artifacts to be sent via a multipart/form-data
  HTTP request

Towards: tensorflow/tfjs#13
caisq added a commit to tensorflow/tfjs-core that referenced this issue May 12, 2018
* Add tf.io.browserHTTPRequest

* Allows model artifacts to be sent via a multipart/form-data
  HTTP request

Towards: tensorflow/tfjs#13
caisq added a commit to tensorflow/tfjs-layers that referenced this issue May 12, 2018

* Add integration_test/tfjs2keras; Fix the bugs uncovered by the test

This is a node.js-Python integration test. It checks that the models
exported from tfjs-layers can be loaded properly by Keras in Python.

Fix bugs uncovered by the test
- In Activation layers
- In RNN layers

The new test can be invoked with command `yarn test-integ`.
The test runs continuously on Travis, under the `stage: integ` tag, in a Node.js 8 environment, which comes with Python.

Towards: tensorflow/tfjs#13
@traboukos

@caisq I am trying to load a Keras model through Node.js on the backend. I see that an IOHandler interface is present and is used by the loadModel function. Is it currently possible to implement such an interface for use with Node.js? Should I give this a try, or are other parts missing for this to work? Is this functionality perhaps going to be implemented in the tfjs-node repo?

@nsthorat
Contributor

We're going to implement IOHandlers for writing to disk for tfjs-node very soon!

caisq added a commit to tensorflow/tfjs-layers that referenced this issue May 15, 2018
FEATURE -- Add capability to save and load tf.Models using the following mediums: browser local storage, IndexedDB, file downloads/uploads, and HTTP requests.

Towards: tensorflow/tfjs#13
caisq added a commit to tensorflow/tfjs-website that referenced this issue May 17, 2018
@caisq
Contributor

caisq commented May 18, 2018

Update:
The model importing and exporting feature for the browser environment has launched with @tensorflow/tfjs release 0.11.1. Please see the tutorial at:
https://js.tensorflow.org/tutorials/model-save-load.html

Please use the new API and let us know what you think. If you find bugs or have enhancement requests, please file GitHub issues.

I will leave this issue open for now because we plan to modify a few examples in https://github.com/tensorflow/tfjs-examples so they can save/load trained or fine-tuned models locally.

The saving/loading support for TensorFlow.js in Node.js will be released later.

@caisq
Contributor

caisq commented May 18, 2018

You can play with the runnable code snippets on our website to get a feel for the new API:
https://js.tensorflow.org/api/latest/#loadModel

@caisq
Contributor

caisq commented May 25, 2018

Now that a concrete example has been added to the Iris example in tfjs-examples, I will close this issue. I filed a separate issue to track support for model saving and loading in Node.js:
#343

@caisq caisq closed this as completed May 25, 2018
nsthorat pushed a commit that referenced this issue Aug 19, 2019
* Drop old binding-demo

* Keep names the same.