loadModel from url doesn't work in Node #410


Closed
dsmilkov opened this issue Jun 8, 2018 · 23 comments
Labels
comp:core duplicate This issue or pull request already exists

Comments

@dsmilkov
Contributor

dsmilkov commented Jun 8, 2018

(Reported by another user, which is why I don't have the stack trace.)

loadModel with a URL path doesn't work in Node. This is most likely related to fetch missing in Node. We should detect the environment and either use Node's built-in http module or conditionally import node-fetch when we are not in the browser.

cc @nsthorat, @tafsiri for ideas on node <--> browser interop.

@dsmilkov
Contributor Author

dsmilkov commented Jun 8, 2018

I think the best solution is to import node-fetch conditionally and add an 'ignore' flag to rollup so the import is skipped when making a browser bundle, just like we do for crypto (rollup config), which came from our seedrandom dependency (see seedrandom's conditional import here).
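A minimal sketch of that conditional selection, with the environment injected as a parameter so the choice is easy to test; chooseFetch and the env shape are illustrative names, not tfjs API:

```javascript
// Hypothetical sketch of environment-conditional fetch selection.
function chooseFetch(env) {
  if (env.window && typeof env.window.fetch === 'function') {
    // Browser: use the page's native fetch.
    return env.window.fetch;
  }
  // Node: fall back to a node-fetch style implementation. In a real bundle,
  // rollup would be configured to ignore the node-fetch import for browsers.
  return env.nodeFetch;
}
```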

@nsthorat
Contributor

That's one solution. Another is to extend the backend, or create a "platform" abstraction (Node vs. browser) with a few methods that we override (fromPixels, fetch, etc.). This seems a little cleaner than sprinkling conditional imports.

@caisq
Contributor

caisq commented Jun 10, 2018

@dsmilkov @nsthorat The IOHandlerRegistry is designed exactly to accommodate this kind of environment-dependent handling of URL schemes. In particular, the http:// URL scheme will be "routed" to different IOHandler implementations depending on whether the environment is browser or Node.js. This issue can be regarded as a duplicate of #343, which is underway. The status is that the file:// handler has been implemented for Node.js, and the http:// and https:// handlers will follow soon.
So I will close this issue now.
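A rough sketch of the scheme routing described above, assuming a registry of router functions; HandlerRegistry and the handler objects are illustrative, not the actual IOHandlerRegistry implementation:

```javascript
// Minimal sketch of scheme-routed IO handlers in the spirit of IOHandlerRegistry.
class HandlerRegistry {
  constructor() { this.routers = []; }
  register(router) { this.routers.push(router); }
  resolve(url) {
    for (const router of this.routers) {
      const handler = router(url);
      if (handler != null) return handler;
    }
    throw new Error(`No IO handler registered for URL: ${url}`);
  }
}

// Register only the handlers that make sense for the current environment, so
// an unsupported scheme fails loudly instead of falling back to a
// browser-only implementation.
const isNode = typeof window === 'undefined';
const registry = new HandlerRegistry();
registry.register(url =>
    url.startsWith('file://') && isNode ? { scheme: 'file' } : null);
registry.register(url =>
    url.startsWith('http') && !isNode ? { scheme: 'http-browser' } : null);
```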

@caisq caisq closed this as completed Jun 10, 2018
@caisq caisq added the duplicate This issue or pull request already exists label Jun 10, 2018
@dsmilkov
Contributor Author

Sounds good! It would be great if browser-specific handlers only get registered in the browser. That way, in Node it would say "no IO handler registered for http" as opposed to using the browser-specific one.

@tafsiri
Contributor

tafsiri commented Jun 11, 2018

I think I am in favour of providing some way to pass a function that can do things like override fetch. This issue intersects with ones like #272, where on platforms like Ionic or React Native (hybrid web/native platforms) the developer may want to load from some local store that we can't know a priori how to load from (and where fetch isn't implemented). Allowing callbacks that let a user pull the necessary paths from whatever platform they run tfjs on could be quite useful.

@caisq what do you think of this case, is there another way to handle it?

@tafsiri tafsiri reopened this Jun 11, 2018
@caisq
Contributor

caisq commented Jun 12, 2018

Thanks, @tafsiri for the comment. How about an API like the following:

  1. If the user doesn't need to override fetch (as one might for environments like React Native), the fetch method will be selected under the hood, automatically, based on whether the environment is browser or Node.js. In either case, the following code will work:

     const model = await tf.loadModel('http://foo/path/to/model.json');

  2. If the user needs to override fetch, the following API can be used:

     const model = await tf.loadModel(tf.io.httpRequest('http://foo/path/to-model.json', {fetch: myCustomFetch}));
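The override in option 2 could look roughly like the following; loadJSON is a hypothetical helper standing in for the real loader, shown only to illustrate the injectable-fetch idea:

```javascript
// Sketch of the proposed fetch override: a loader that accepts a custom
// fetch via options, falling back to a global fetch when one exists.
async function loadJSON(url, options = {}) {
  const fetchFn =
      options.fetch || (typeof fetch !== 'undefined' ? fetch : null);
  if (fetchFn == null) {
    throw new Error('No fetch available; pass one via options.fetch');
  }
  const response = await fetchFn(url);  // the custom fetch wins if provided
  return response.json();
}
```

A hybrid app would then pass its platform's transport, e.g. `await loadJSON('http://foo/model.json', {fetch: myCustomFetch})`.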

@tafsiri
Contributor

tafsiri commented Jun 12, 2018

@caisq In the example above, does fetching the initial JSON file also use myCustomFetch? It looks like it will do a regular HTTP request, which may throw a user off if they know they can't do a 'local' HTTP request. If the first param doesn't have 'http' in it, will it work?

Is this the point where a user needs to implement their own IOHandler? Is there a way to implement this minimally, in a way that controls how each type of resource is loaded while delegating as much as possible to our existing code?

Something like

const model = await tf.loadModel('path/to/manifest.json', {
	load: myCustomLoadFunc, // what is the required signature for myCustomLoadFunc?
});

I imagine myCustomLoadFunc being responsible for delivering either a JSON string or a binary blob to our loadModel code.

or

const model = await tf.loadModel(tf.io.customIO('path/to/manifest.json', {load: loadFunc}));

Though IMO this is less concise.

@gabrielfreire

Hi guys, I don't even remember how I got to this issue, but as a tfjs user and Node.js lover I would love to be able to do a simple

const model = await tf.loadModel('path/to/model.json');
// or
const model = await tf.loadModel('https://www.path.com/to?my=model.json');
// or
const model = await tf.loadModel('file://path/to/model.json');

and let the library do the guessing. Sorry for my ignorance; I'm not familiar with the tfjs code for this method, but couldn't you just have something like

async loadModel(url: string): Promise<SomeModelClass> {
   if (typeof document === 'undefined') { // or another check to figure out whether there is a browser
       // Node.js land
       if (!/\.json$/.test(url)) throw new Error('URL must point to a .json file');
       if (/^file:/.test(url)) console.log(`It's a file protocol request, maybe use the node fs module?`);
       else if (/^https?:/.test(url)) console.log(`It's an http request, probably use the Request module or some cool library`);
       else console.log(`It's a local path, fs module?`);
       return someModelClassInstance;
   }
}

?

Sorry again

@caisq
Contributor

caisq commented Jun 15, 2018

Currently (v0.11.6), only the file:// URL scheme works in tfjs-node.

Absolute path example:

const model = await tf.loadModel('file:///tmp/path/to/model.json');
// Notice the three slashes after the file:. The first two belong to the scheme. The last one belongs
// to the absolute file path.

Relative path example:

const model = await tf.loadModel('file://./path/to/model.json');

We are working on the http:// and no-scheme ones in Node.js.
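The three-slash convention above can be illustrated with a small helper; filePathFromURL is hypothetical, not part of tfjs:

```javascript
// Sketch: recovering the filesystem path from a file:// URL, matching the
// three-slash convention (file:// is the scheme, the rest is the path).
function filePathFromURL(url) {
  if (!url.startsWith('file://')) {
    throw new Error(`Not a file:// URL: ${url}`);
  }
  return url.slice('file://'.length);  // 'file:///tmp/x' -> '/tmp/x'
}
```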

@hyun-yang

@caisq @tafsiri
Hi guys,

Actually, I raised the issue tf.loadModel not working in ionic #272.

I tested it using tf.loadModel('file://./path/to/model.json') and tf.loadModel('file:///tmp/path/to/model.json');

Both of them got the same error message "TypeError: Failed to fetch". I already uploaded a sample Ionic project: Tensorflow Pre-Trained Model Import in Ionic Demo

Essentially, hybrid app developers want to load a pretrained model from a local path.

@caisq
Contributor

caisq commented Jun 16, 2018

@hyun-yang I'm pretty sure saving and loading models with file:// is working with the latest versions of @tensorflow/tfjs and @tensorflow/tfjs-node. I wrote a simple example at:
https://github.com/caisq/tfjs-dump/tree/master/tfjs-node-doodle

@hyun-yang

hyun-yang commented Jun 18, 2018

@caisq I'm not sure if we are on the same page.

As I mentioned in "tf.loadModel not working in ionic #272"

"What I really want to do is load this model from a local folder (in this case from assets/model/), not using an HTTP server.

In Ionic (I think lots of hybrid app platforms have the same build process), when a developer builds a native app for Android, iOS, or Windows, they might want to load the model from the local folder that is already packaged inside the output file (apk, ipa), not from an HTTP server.

It'd be great if we had an API like tf.loadModelFromLocal.

Thanks."

So, I don't think I need to install @tensorflow/tfjs-node as well.

I hope this makes sense to you.

P.S.
I tested your example with node main.js and it shows the result below; however, that's not what I'm talking about.
Tensor
[[0.2704779, 0.2301091, 0.21263, 0.2867831],
[0.2704779, 0.2301091, 0.21263, 0.2867831]]

@caisq
Contributor

caisq commented Jun 18, 2018

@hyun-yang Thanks for the clarification. You are right. I overlooked the fact that you are not in Node.js, but in ionic. Loading in ionic is currently not directly supported. We plan to support it through the custom fetch configuration to tf.io.httpRequest as I wrote above. Currently, you may need to write a custom IOHandler implementation. If you need an example, you can look at this code in tfjs-node:
https://github.com/tensorflow/tfjs-node/blob/master/src/io/file_system.ts#L26
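For readers who want the gist without reading file_system.ts, here is a hedged sketch of the IOHandler shape: an object whose async load() returns model artifacts. The artifact field names follow tfjs's ModelArtifacts, but readAssetText/readAssetBinary are hypothetical platform helpers (e.g. wrappers around an Ionic/Cordova file plugin), not real APIs:

```javascript
// Sketch of a custom IOHandler for hybrid apps that bundle the model as
// local assets instead of serving it over HTTP.
function makeAssetIOHandler(readAssetText, readAssetBinary, dir) {
  return {
    async load() {
      // model.json carries the topology and the weights manifest.
      const modelJSON = JSON.parse(await readAssetText(`${dir}/model.json`));
      return {
        modelTopology: modelJSON.modelTopology,
        weightSpecs: modelJSON.weightsManifest[0].weights,
        // The binary weight shard, read from the packaged assets.
        weightData: await readAssetBinary(`${dir}/weights.bin`),
      };
    },
  };
}
```

The handler object would then be passed in place of a URL, along the lines of `tf.loadModel(makeAssetIOHandler(...))`.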

@hyun-yang

hyun-yang commented Jun 18, 2018

@caisq

We plan to support it through the custom fetch configuration to tf.io.httpRequest as I wrote above

Sounds good, and I'll have a look at the tfjs-node code you mentioned.

Thanks.

@limscoder

limscoder commented Sep 26, 2018

I get a fetch error even when using a local file, running tfjs 13.1 and node 8.11.

The model was saved from Keras with the Python package:

tfjs.converters.save_keras_model(model, path)

and loaded in Node with:

model = await tf.loadModel('file:///absolute/path/to/model.json');
(node:71934) UnhandledPromiseRejectionWarning: Error: browserHTTPRequest is not supported outside the web browser without a fetch polyfill.
    at new BrowserHTTPRequest (/Users/thoda22/projects/predictatron/metrics/lstm/node_modules/@tensorflow/tfjs-core/dist/io/browser_http.js:46:19)
    at Object.browserHTTPRequest (/Users/thoda22/projects/predictatron/metrics/lstm/node_modules/@tensorflow/tfjs-core/dist/io/browser_http.js:247:12)
    at Object.<anonymous> (/Users/thoda22/projects/predictatron/metrics/lstm/node_modules/@tensorflow/tfjs-layers/dist/models.js:98:50)
    at step (/Users/thoda22/projects/predictatron/metrics/lstm/node_modules/@tensorflow/tfjs-layers/dist/models.js:42:23)
    at Object.next (/Users/thoda22/projects/predictatron/metrics/lstm/node_modules/@tensorflow/tfjs-layers/dist/models.js:23:53)
    at /Users/thoda22/projects/predictatron/metrics/lstm/node_modules/@tensorflow/tfjs-layers/dist/models.js:17:71
    at new Promise (<anonymous>)
    at __awaiter (/Users/thoda22/projects/predictatron/metrics/lstm/node_modules/@tensorflow/tfjs-layers/dist/models.js:13:12)
    at Object.loadModelInternal (/Users/thoda22/projects/predictatron/metrics/lstm/node_modules/@tensorflow/tfjs-layers/dist/models.js:92:12)
    at Object.loadModel (/Users/thoda22/projects/predictatron/metrics/lstm/node_modules/@tensorflow/tfjs-layers/dist/exports.js:17:21)

Update -- I also get an error when trying to run the example file loader code from @caisq: https://github.com/caisq/tfjs-dump/tree/master/tfjs-node-doodle

@schipiga

schipiga commented Oct 1, 2018

@limscoder looks like @tensorflow/[email protected] isn't OK. For me it works with the previous version, v0.1.16.

├─┬ @tensorflow/[email protected]
│ ├─┬ @tensorflow/[email protected]
│ ├─┬ @tensorflow/[email protected]
│ └── @tensorflow/[email protected]
├─┬ @tensorflow/[email protected]

Looks like the error happens because they started to use @tensorflow/tfjs as a dependency, not as a devDependency: tensorflow/tfjs-node@c3e1e06#diff-b9cfc7f2cdf78a7f4b91a753d10865a2R40

Or, as another option, you have to use the same version of @tensorflow/tfjs as specified in the dependencies of @tensorflow/[email protected]:

➜  tfjs-node-doodle git:(master) ✗ npm list|grep tensorflow
├─┬ @tensorflow/[email protected]
│ ├─┬ @tensorflow/[email protected]
│ ├─┬ @tensorflow/[email protected]
│ └── @tensorflow/[email protected]
├─┬ @tensorflow/[email protected]
│ ├── @tensorflow/[email protected] deduped
➜  tfjs-node-doodle git:(master) ✗ node main.js 
_____________________________________________________________
Layer (type)                 Output shape              Param #   
=================================================================
dense_Dense1 (Dense)         [null,10]                 60        
_________________________________________________________________
dense_Dense2 (Dense)         [null,4]                  44        
=================================================================
Total params: 104
Trainable params: 104
Non-trainable params: 0
_________________________________________________________________
Tensor
    [[0.3059654, 0.2283318, 0.1902294, 0.2754734],
     [0.3059654, 0.2283318, 0.1902294, 0.2754734]]
(node:30346) Warning: N-API is an experimental feature and could change at any time.
{ modelArtifactsInfo: 
   { dateSaved: 2018-10-01T13:21:02.113Z,
     modelTopologyType: 'JSON',
     modelTopologyBytes: 1006,
     weightSpecsBytes: 248,
     weightDataBytes: 416 } }
Tensor
    [[0.3059654, 0.2283318, 0.1902294, 0.2754734],
     [0.3059654, 0.2283318, 0.1902294, 0.2754734]]
➜  tfjs-node-doodle git:(master) ✗

@insensitive

(Quoting @hyun-yang's comment above.)

Hello @hyun-yang
I am facing a similar problem. Did you find any workaround for this?
As of now I have to host the files on some server.

Thanks

@hpssjellis
Contributor

Just thought I should add this as an alternative if the above solutions are not working for you. I found it on Stack Overflow; it concerns loading local files for use with Ionic and may also work for PhoneGap.

https://stackoverflow.com/questions/50224003/tensorflowjs-in-ionic/55306342#55306342

Use a polyfill (https://github.com/github/fetch) and replace the fetch.

window.fetch = fetchPolyfill;

Now, it's possible to load local files (file:///) like:

const modelUrl = './model.json'

const model = await tf.loadGraphModel(modelUrl);

@nsthorat
Contributor

We're working on a fix that should address this across the board, without a fetch polyfill in Node: tensorflow/tfjs-core#1648

@caisq
Contributor

caisq commented Mar 26, 2019

FYI, if you use tfjs-node or tfjs-node-gpu, loading a tf.LayersModel (i.e., a model converted from Keras or constructed from TF.js itself) should be working with the latest release.

Code sample:

package.json looks like:

{
    "devDependencies": {
        "@tensorflow/tfjs-node": "^1.0.2"
    }
}

Node.js code looks like:

const tf = require('@tensorflow/tfjs-node');

(async function() {
    const modelURL = `https://storage.googleapis.com/tfjs-models/tfjs/mobilenet_v1_0.25_224/model.json`;
    const model = await tf.loadLayersModel(modelURL);
    model.summary();
})();

@AlberErre

AlberErre commented Jul 24, 2019

(Quoting @caisq's comment above.)

Same behaviour here ^^

In case you are using tfjs-node, updating from ^0.1.21 to ^1.0.2 solved the issue for me.

Thank you @caisq

@aminBenSlimen

[Workaround, I guess] Hello everyone. I'm facing a problem when trying to load my model in Ionic 5: it works in the browser but won't on Android (I'm using ml5.js, but it's the same thing since it's based on TensorFlow.js). My solution is simply moving all my loading code into index.html. Hope someone finds this useful.

@fernando12170209

(Quoting @hpssjellis's comment above.)

Help!!

When I code the same, this appears in the browser and in the console:
Not allowed to load local resource
