# loadModel from url doesn't work in Node #410
I think the best solution is to conditionally import `node-fetch` when we are not in the browser.
That's one solution. Another solution is to extend the backend, or create a "platform" abstraction (node vs. browser) with a few methods that we override (`fromPixels`, `fetch`, etc.). This seems a little cleaner than sprinkling conditional imports.
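A rough sketch of that platform idea (the class names here are hypothetical, not actual tfjs API; assumes `node-fetch` is available as a dependency on the Node side):

```js
// Hypothetical sketch of a platform abstraction; names are illustrative,
// not actual tfjs API. Each platform supplies environment-specific methods.
class BrowserPlatform {
  fetch(url, init) {
    // Browsers provide fetch natively.
    return window.fetch(url, init);
  }
}

class NodePlatform {
  fetch(url, init) {
    // Node (pre-v18) has no global fetch; delegate to the node-fetch package.
    return require('node-fetch')(url, init);
  }
}

// Pick the platform once at startup instead of sprinkling conditionals.
const platform =
    typeof window === 'undefined' ? new NodePlatform() : new BrowserPlatform();
```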
@dsmilkov @nsthorat The `IOHandlerRegistry` is designed exactly to accommodate this kind of environment-dependent handling of URL schemes. In particular, the …
Sounds good! It would be great if browser-specific handlers only get registered in the browser. This way, in Node it would say "no io handler registered for http" as opposed to using the browser-specific one.
I think I am in favour of providing some way to pass a function that can do things like override `fetch`. This issue intersects with ones like #272, where on platforms like Ionic or React Native (hybrid web/native platforms) the developer may want to load from some local store that we can't know a priori how to load from (and where `fetch` isn't implemented). Allowing for callbacks that let a user pull the necessary paths from whatever platform they run tfjs on could be quite useful. @caisq, what do you think of this case? Is there another way to handle it?
Thanks, @tafsiri, for the comment. How about an API like the following:

```js
const model = await tf.loadModel('http://foo/path/to/model.json');

const model = await tf.loadModel(
    tf.io.httpRequest('http://foo/path/to-model.json', {fetch: myCustomFetch}));
```
@caisq In the example above, does getting the initial JSON file also use `myCustomFetch`? It looks like it will do a regular HTTP request, which may throw a user off if they know they can't do a 'local' HTTP request. If the first param doesn't have 'http' in it, will it work? Is this the point where a user needs to implement their own IOHandler? Is there a way to implement this minimally, in a way that controls how each type of resource is loaded while delegating as much as possible to our existing code? Something like:

```js
const model = await tf.loadModel('path/to/manifest.json', {
  load: myCustomLoadFunc, // what is the required signature for myCustomLoadFunc?
});
```

I imagine `myCustomLoadFunc` being responsible for getting either a JSON string or a binary blob to our loadModel code. Or:

```js
const model = await tf.loadModel(
    tf.io.customIO('path/to/manifest.json', {load: loadFunc}));
```

Though IMO this is less concise.
Hi guys, I don't even remember how I got here in this issue, but as a tfjs user and Node.js lover I would love to be able to do a simple:

```js
const model = await tf.loadModel('path/to/model.json');
// or
const model = await tf.loadModel('https://www.path.com/to?my=model.json');
// or
const model = await tf.loadModel('file://path/to/model.json');
```

and let the library do the guessing. Sorry for my ignorance, I'm not familiar with the tfjs code for this method, but couldn't you just have something like:

```ts
async loadModel(url: string): Promise<SomeModelClass> {
  if (typeof document === 'undefined') { // or something else to figure out if there is a browser
    // Node.js land
    // maybe use some package to extract url metadata?
    if (url.search(/\.json$/) === -1) throw new Error('some error');
    if (url.search(/file(?=:)/) > -1) console.log(`It's a file protocol request, maybe use the node fs module?`);
    else if (url.search(/http(?=:)/) > -1) console.log(`It's an http request, probably use the Request module or some cool library`);
    else console.log(`It's a local request, fs module?`);
    return someModelClassInstance;
  }
}
```

? Sorry again.
Currently (v0.11.6), only the `file://` scheme works in Node.js.

Absolute path example:

```js
const model = await tf.loadModel('file:///tmp/path/to/model.json');
// Notice the three slashes after `file:`. The first two belong to the scheme.
// The last one belongs to the absolute file path.
```

Relative path example:

```js
const model = await tf.loadModel('file://./path/to/model.json');
```

We are working on the http:// and no-scheme ones in Node.js.
Actually, I raised the issue "tf.loadModel not working in ionic #272". I tested it using `tf.loadModel('file://./path/to/model.json')` and `tf.loadModel('file:///tmp/path/to/model.json')`. Both of them got the same error message, "TypeError: Failed to fetch". I already uploaded a sample ionic project: Tensorflow Pre-Trained Model Import in Ionic Demo. Essentially, hybrid app developers want to load a pretrained model from a local path.
@hyun-yang I'm pretty sure saving and loading models with `file://` URLs requires `@tensorflow/tfjs-node`.
@caisq I'm not sure we are on the same page. As I mentioned in "tf.loadModel not working in ionic #272": "What I really want to do is load this model from a local folder (in this case from assets/model/), not using an http server. In ionic (I think lots of hybrid app platforms have the same build process), when a developer builds a native app for Android, iOS, and Windows, they might want to load the model from the local folder that is already packaged inside the output file (apk, ipa), not via an http server. It'd be great if we had an API like tf.loadModelFromLocal. Thanks." So I don't think I need to install @tensorflow/tfjs-node. I hope this makes sense to you. P.S. …
@hyun-yang Thanks for the clarification. You are right; I overlooked the fact that you are not in Node.js but in ionic. Loading in ionic is currently not directly supported. We plan to support it through the custom `IOHandler` mechanism…
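For what it's worth, the custom-IOHandler route discussed above can be sketched roughly as follows: `tf.loadLayersModel` (in newer tfjs releases) accepts any object with a `load()` method that resolves to model artifacts. The `readAssetText` and `readAssetBytes` helpers below are hypothetical placeholders for however a given platform (Ionic, React Native, ...) reads its bundled assets, and the asset paths are illustrative:

```js
// Rough sketch of a custom IOHandler for hybrid platforms where fetch
// can't reach bundled assets. `readAssetText` and `readAssetBytes` are
// hypothetical platform helpers (e.g. Cordova file APIs), not tfjs API.
const localAssetHandler = {
  async load() {
    const modelJson = JSON.parse(await readAssetText('assets/model/model.json'));
    const manifest = modelJson.weightsManifest;
    // Flatten the weight specs from all manifest groups.
    const weightSpecs = [].concat(...manifest.map((group) => group.weights));
    // Read every weight shard and concatenate into one ArrayBuffer.
    const paths = [].concat(...manifest.map((group) => group.paths));
    const shards = await Promise.all(
        paths.map((p) => readAssetBytes(`assets/model/${p}`)));
    const totalBytes = shards.reduce((n, s) => n + s.byteLength, 0);
    const weightData = new Uint8Array(totalBytes);
    let offset = 0;
    for (const shard of shards) {
      weightData.set(new Uint8Array(shard), offset);
      offset += shard.byteLength;
    }
    return {
      modelTopology: modelJson.modelTopology,
      weightSpecs,
      weightData: weightData.buffer,
    };
  },
};

// Usage (newer releases):
// const model = await tf.loadLayersModel(localAssetHandler);
```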
Sounds good, and I'll have a look at the tfjs-node code you mentioned. Thanks.
I get a fetch error even when using a local file, running tfjs 13.1 and node 8.11. The model was saved from Keras with the Python package: `tfjs.converters.save_keras_model(model, path)`.

```js
model = await tf.loadModel('file:///absolute/path/to/model.json');
```

```
(node:71934) UnhandledPromiseRejectionWarning: Error: browserHTTPRequest is not supported outside the web browser without a fetch polyfill.
    at new BrowserHTTPRequest (/Users/thoda22/projects/predictatron/metrics/lstm/node_modules/@tensorflow/tfjs-core/dist/io/browser_http.js:46:19)
    at Object.browserHTTPRequest (/Users/thoda22/projects/predictatron/metrics/lstm/node_modules/@tensorflow/tfjs-core/dist/io/browser_http.js:247:12)
    at Object.<anonymous> (/Users/thoda22/projects/predictatron/metrics/lstm/node_modules/@tensorflow/tfjs-layers/dist/models.js:98:50)
    at step (/Users/thoda22/projects/predictatron/metrics/lstm/node_modules/@tensorflow/tfjs-layers/dist/models.js:42:23)
    at Object.next (/Users/thoda22/projects/predictatron/metrics/lstm/node_modules/@tensorflow/tfjs-layers/dist/models.js:23:53)
    at /Users/thoda22/projects/predictatron/metrics/lstm/node_modules/@tensorflow/tfjs-layers/dist/models.js:17:71
    at new Promise (<anonymous>)
    at __awaiter (/Users/thoda22/projects/predictatron/metrics/lstm/node_modules/@tensorflow/tfjs-layers/dist/models.js:13:12)
    at Object.loadModelInternal (/Users/thoda22/projects/predictatron/metrics/lstm/node_modules/@tensorflow/tfjs-layers/dist/models.js:92:12)
    at Object.loadModel (/Users/thoda22/projects/predictatron/metrics/lstm/node_modules/@tensorflow/tfjs-layers/dist/exports.js:17:21)
```

Update: I also get an error when trying to run the example file loader code from @caisq: https://github.com/caisq/tfjs-dump/tree/master/tfjs-node-doodle
@limscoder Looks like …
Looks like the error happens because they started to use … Or, as another variant, you have to use the same version of …
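If the mismatch is between the tfjs packages themselves, one possible fix along those lines is pinning them to the same release in package.json (the version numbers here are illustrative):

```json
{
  "dependencies": {
    "@tensorflow/tfjs": "1.0.2",
    "@tensorflow/tfjs-node": "1.0.2"
  }
}
```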
Hello @hyun-yang. Thanks!
Referenced commit: …tfjs-node (#182) FEATURE. Fixes tensorflow/tfjs#410, tensorflow/tfjs#343.
Just thought I should add this as an alternative if the above solutions are not working for you. I found it on Stack Overflow; it deals with loading local files for use with Ionic and may also work for PhoneGap: https://stackoverflow.com/questions/50224003/tensorflowjs-in-ionic/55306342#55306342. Use a polyfill (https://github.com/github/fetch) and replace the fetch. Now it's possible to load local files (file:///).
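The comment's code example didn't survive the page scrape; a minimal sketch of the polyfill approach, assuming the `whatwg-fetch` package (which is the library behind the linked github/fetch repo) and an illustrative Android asset path, might look like:

```js
// Replace the webview's fetch (which rejects file:// URLs) with the
// XHR-based whatwg-fetch polyfill before tfjs tries to use it.
import {fetch as fetchPolyfill} from 'whatwg-fetch';
import * as tf from '@tensorflow/tfjs';

window.fetch = fetchPolyfill;

async function loadBundledModel() {
  // file:/// now resolves against the app's bundled assets.
  return tf.loadModel('file:///android_asset/www/assets/model/model.json');
}
```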
We're working on a change that should fix this across the board, without a fetch polyfill in Node: tensorflow/tfjs-core#1648
FYI, if you use tfjs-node or tfjs-node-gpu, loading a `tf.LayersModel` (i.e., a model converted from Keras or constructed from TF.js itself) should be working with the latest release. Code sample:

package.json looks like:

```json
{
  "devDependencies": {
    "@tensorflow/tfjs-node": "^1.0.2"
  }
}
```

Node.js code looks like:

```js
const tf = require('@tensorflow/tfjs-node');

(async function() {
  const modelURL = `https://storage.googleapis.com/tfjs-models/tfjs/mobilenet_v1_0.25_224/model.json`;
  const model = await tf.loadLayersModel(modelURL);
  model.summary();
})();
```
Same behaviour here ^^ In case you are using … Thank you, @caisq.
[Workaround, I guess] Hello everyone. I'm facing a problem when trying to load my model in Ionic 5: it works in the browser but won't in Android (I'm using ml5.js, but it's the same thing since it's based on TensorFlow.js). My solution is simply moving all my loading code into index.html. Hope someone finds this useful.
Help!! When I run the same code, this appears in the browser: …
(Reported by another user, which is why I don't have the stack trace.)

`loadModel` with a URL path doesn't work in Node. This is most likely related to `fetch` missing in Node. We should detect the environment and use Node's built-in HTTP, or conditionally import node-fetch when we are not in the browser.

cc @nsthorat, @tafsiri for ideas on node <--> browser interop.
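A minimal sketch of that environment detection, assuming `node-fetch` is available as a dependency (the helper name `getFetch` is illustrative, not tfjs API):

```js
// Pick a fetch implementation based on the environment.
// `getFetch` is an illustrative helper, not tfjs API.
function getFetch() {
  if (typeof window !== 'undefined' && typeof window.fetch === 'function') {
    // Browser: use the native fetch, bound to window.
    return window.fetch.bind(window);
  }
  // Node: fall back to the node-fetch package.
  return require('node-fetch');
}

// Usage:
// const fetchFn = getFetch();
// const res = await fetchFn('http://foo/path/to/model.json');
// const modelJson = await res.json();
```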