Passing serialized TensorFlow Example to TF Serving SavedModel #5336
Hi @rs22, can you try changing the type of your InputExampleTensor and let me know whether that helps? And can you please share your model?
Thanks for your support with my issue! I've tried setting the type of InputExampleTensor as you suggested.
I've attached a cleaned-up version of my pretrained model that you can use for reproducing the problem. To (temporarily) make the prediction work, you'll need to create a TF Example protobuf message using this generated source file, like so:

```csharp
Func<float, Feature> makeFeature = (float x) =>
{
    var floatList = new FloatList();
    floatList.Value.Add(x);
    return new Feature { FloatList = floatList };
};

var example = new Example { Features = new Features() };
example.Features.Feature.Add("my_feature", makeFeature(0));

var engine = mlContext.Model.CreatePredictionEngine<ModelInput, ModelPrediction>(mlModel);
engine.Predict(new ModelInput
{
    InputExampleTensor = new[]
    {
        new string(example.ToByteArray().Select(x => (char)x).ToArray())
    }
});
```

... and change the encoding-related line in ML.NET to:

```csharp
bytes[i] = ((ReadOnlyMemory<char>)(object)data[i]).ToArray().Select(x => (byte)x).ToArray();
```
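For context, here is a minimal sketch of what the ModelInput and ModelPrediction classes used above might look like. The column names and vector sizes are assumptions based on the saved-model signature discussed in this issue, not the exact classes from the attached project.

```csharp
using Microsoft.ML.Data;

// Assumed input schema: a single serialized tf.Example smuggled through a string column
// whose name matches the SavedModel input tensor.
public class ModelInput
{
    [VectorType(1)]
    [ColumnName("input_example_tensor")]
    public string[] InputExampleTensor { get; set; }
}

// Assumed output schema: the real column name and size depend on the model's serving
// signature (check it with saved_model_cli); "output" here is a placeholder.
public class ModelPrediction
{
    [ColumnName("output")]
    public float[] Scores { get; set; }
}
```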
As far as I understand, this issue is related to the usage of a model trained in TensorFlow and exported to a '.pb' file. A recipe showing a concrete example, from the export in TensorFlow to the usage of that model in ML.NET, would be extremely useful, as the interaction between both sides seems really tricky; up to now, all my attempts have been unsuccessful. I'm sure that the possibility to serve a trained model on other platforms like .NET would be very interesting for a lot of people, but for now I could not find any helpful information on that matter.
@mstfbl Can you please take a look at this?
@jsgonsette I think you're right: the TensorFlow documentation on these topics is really sparse and I've spent quite some time figuring out what the Estimator-SavedModel expects as an input. I've written down my findings here -- maybe this is helpful to you.
Hello Robert,
Many thanks for your investigation work; your page with your findings sounds great. I'm trying that ASAP.
@rs22 Looks like you are trying to load a .pb model from ML.NET. Note that ML.NET only allows loading a frozen TensorFlow model (a loading example is sketched below). Is your model frozen or not? If it is not already frozen, could you please try freezing it first? This post describes how to convert a model into a frozen model file.
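A minimal sketch of loading a frozen .pb graph with ML.NET's LoadTensorFlowModel / ScoreTensorFlowModel APIs; the file name and the input/output node names are placeholders, not the ones from the model attached in this issue.

```csharp
using Microsoft.ML;

var mlContext = new MLContext();

// Load a *frozen* graph (a .pb with the variables folded into constants).
var tfModel = mlContext.Model.LoadTensorFlowModel("frozen_model.pb");

// Map graph nodes to ML.NET columns; the node names below are placeholders and must
// match the actual graph (inspect it with saved_model_cli or TensorBoard).
var pipeline = tfModel.ScoreTensorFlowModel(
    outputColumnNames: new[] { "output_tensor" },
    inputColumnNames: new[] { "input_example_tensor" });
```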
I wasn't really aware of the concept of frozen saved models, so thanks for this pointer! Maybe it would be good to show a warning/throw an exception when users try to load 'unfrozen' saved models (you can't tell them apart from the .pb file extension). In my case, there was in fact a variables folder next to the .pb file, but I think that the graph exported by the
Using the frozen_model.pb, I am still getting the exception. For your reference, I've attached a complete Visual Studio project with the .pb files: tfrepro.zip
@rs22 Thanks for providing the repro project and model file. This is basically an encoding issue: the problem is how to convert [byte array] -> [string] -> [byte array] without losing any information from the byte array. There are several ways to do that, but the most reliable one is to use the same encoding for both conversions. In ML.NET, the "CastDataAndReturnAsTensor" method performs the [string] -> [byte array] conversion, which means you need to use the same encoding when doing [byte array] -> [string]. I tried using the same encoding (UTF8) for the [byte array] -> [string] conversion, but UTF8 does not work properly when the byte string contains a byte whose value is larger than 127 (example.ToByteArray()[24] is 128, which means we can't convert back to the exact same byte array). So one way to fix the issue is to use a reliable mapping for both the [byte array] -> [string] and [string] -> [byte array] conversions, as illustrated in the sketch below. However, we recently upgraded the TensorFlow.NET version in the PR below, and some of the old APIs for creating a Tensor from byte[][] no longer exist, so I need some more time to figure out a workable solution: #5404 I will update this issue if I find a workable solution with the new TensorFlow.NET version.
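A small sketch of the round-trip problem described above, assuming a payload that contains bytes above 127 (the exact byte values here are made up for illustration): UTF-8 mangles such bytes, while a 1:1 char cast, like the workaround used earlier in this thread, preserves them.

```csharp
using System;
using System.Linq;
using System.Text;

byte[] original = { 0x0A, 0x80, 0xFF };  // contains bytes > 127, as a serialized Example can

// Lossy: these bytes are not valid UTF-8, so decoding replaces them with U+FFFD
// and re-encoding does not reproduce the original buffer.
string viaUtf8 = Encoding.UTF8.GetString(original);
byte[] utf8RoundTrip = Encoding.UTF8.GetBytes(viaUtf8);

// Lossless for values 0..255: map every byte to the char with the same code point
// and back again (this is what the char-cast workaround above relies on).
string viaCharCast = new string(original.Select(b => (char)b).ToArray());
byte[] charCastRoundTrip = viaCharCast.Select(c => (byte)c).ToArray();

Console.WriteLine(utf8RoundTrip.SequenceEqual(original));      // False
Console.WriteLine(charCastRoundTrip.SequenceEqual(original));  // True
```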
Opened the PR below to ask for advice from the TensorFlow team: tensorflow/tensorflow#44225
@rs22 Sorry, it looks like the default encoding for TensorFlow is UTF8, so I can't use Unicode (UTF-16) here: https://www.tensorflow.org/tutorials/load_data/unicode So I think the problem now is that your model uses the protobuf-encoded byte array of the Example object as input, which is not compatible with UTF8. Since we can't change the encoding and ML.NET doesn't support variable-length byte arrays directly, what you can do is use the serialized text string of the Example object (using the ToText method) as input, and change your model a little bit to deserialize that input string. cc @harishsk to see if Harish has other suggestions.
Thanks for your investigations! The reason why I'm trying to use a protobuf as the model input is to maintain compatibility with TF Serving -- which I would lose if I changed the model input signature...
@rs22 I see; this looks like a new feature request to me now. With our current design and implementation, we have no way to tell whether the user passed in a plain string that can be handed to TF.NET directly, or an encoded string that needs to be decoded to a byte array before being passed to TF.NET. I will mark this as a feature request.
System information
Issue
What did you do?
I would like to use the PredictionEnginePool (eventually) in combination with a pretrained TensorFlow model that I exported using the Estimator.export_saved_model function together with build_parsing_serving_input_receiver_fn. Specifically, I went through this tutorial: https://www.tensorflow.org/tfx/tutorials/transform/census. Below, you can find the TensorFlow Serving signature definition according to saved_model_cli.

What happened?
The input_example_tensor input expects a serialized Example message (a binary buffer, not a text string). This does not work with the ML.NET library because it re-encodes the data that I'm providing as the model input.

What did you expect?
There should be an option in ML.NET to pass raw binary data as a TFString to the model (maybe as a byte[] or ReadOnlyMemory<byte>?).
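To make the request concrete, here is a hypothetical sketch of what such an input class could look like if ML.NET accepted raw bytes; the class name, attribute values, and byte[][] support are assumptions for illustration, not something the current API provides.

```csharp
using Microsoft.ML.Data;

// Hypothetical: not supported by ML.NET today. The raw serialized tf.Example bytes
// would be passed straight through to the TFString input, with no string re-encoding.
public class RawBytesModelInput
{
    [VectorType(1)]
    [ColumnName("input_example_tensor")]
    public byte[][] InputExampleTensor { get; set; }
}
```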
Source code / logs
Saved model signature:
My code:
Which fails with: