Description
TensorBoard version (from pip package, also printed out when running tensorboard): 1.12.0
TensorFlow version if different from TensorBoard: 1.12.0
OS Platform and version (e.g., Linux Ubuntu 16.04): Arch Linux
Python version (e.g. 2.7, 3.5): 3.6.5
Hi there,
I am trying to plot PR curves using the pr_curve_raw_data_op function with precisions, recalls, and other metrics which I have calculated beforehand, but I'm getting wrong results in the chart. I am using the Estimator API, meaning I create my own dictionary which contains tuples of tf.Tensor and tf.Operation instances that are then forwarded to the estimator for dumping summaries.
I evaluate multiple steps during each checkpoint save, which means that I accumulate precision and recall across multiple steps (but I don't use the pr_curves plugin to do so, I use custom-written metrics); all I want from the plugin is to dump the PR curve into TensorBoard. Here is the part of the code that stores the pr_curves summary tensor in the metrics dictionary:
# assumes: from tensorboard.plugins.pr_curve import summary
metrics_dict['pr_curve/CLASS_{}'.format(cls_index)] = (
    summary.raw_data_op(
        'pr_curve/CLASS_{}'.format(cls_index),
        true_positive_counts=tps,
        false_positive_counts=fps,
        true_negative_counts=tns,
        false_negative_counts=fns,
        precision=precisions,
        recall=recalls,
        num_thresholds=11
    ),
    tf.no_op()
)
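For reference, the per-threshold counts fed into raw_data_op above could be computed along these lines. This is a NumPy sketch with hypothetical names (`labels`, `scores`, `counts_at_thresholds`); the custom metric ops actually used in the report are not shown in the thread.

```python
import numpy as np

def counts_at_thresholds(labels, scores, num_thresholds=11):
    """Sketch: TP/FP/TN/FN counts at evenly spaced score thresholds.

    labels: 1-D array of {0, 1} ground truth.
    scores: 1-D array of predicted probabilities in [0, 1].
    """
    thresholds = np.linspace(0.0, 1.0, num_thresholds)
    tps, fps, tns, fns = [], [], [], []
    for t in thresholds:
        pred = scores >= t
        tps.append(int(np.sum(pred & (labels == 1))))
        fps.append(int(np.sum(pred & (labels == 0))))
        tns.append(int(np.sum(~pred & (labels == 0))))
        fns.append(int(np.sum(~pred & (labels == 1))))
    tps, fps, tns, fns = map(np.array, (tps, fps, tns, fns))
    # Guard against division by zero at extreme thresholds.
    precision = tps / np.maximum(tps + fps, 1)
    recall = tps / np.maximum(tps + fns, 1)
    return tps, fps, tns, fns, precision, recall
```

These arrays would then be wrapped in tf.constant (or produced by equivalent TF ops) before being passed to raw_data_op.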
I also tried to dump the precisions as a text summary, just to be sure that the output printed in the terminal is the same as what is being stored in the summaries, and it was. I dumped those summaries using this code:
metrics_dict['precisions_text/CLASS_{}'.format(cls_index)] = (
    tf.summary.text(name="aa",
                    tensor=tf.as_string(precs)),
    tf.no_op()
)
At the end I will attach a couple of screenshots. The first one shows the precisions dumped to the text summary and their values (I only showed the first 6 values, as the remaining 4 values are 0.0):
The second screenshot contains those same precisions for the same class and step plotted on the PR curve:
You can see that there is only the first point on the plot, with precision 1.0 at recall 0.0. The same thing occurs in several more cases, along with the chart cutting off at the last point (for example, I have precision values for recalls from 0.0 up to 0.6, but the chart plots only up to 0.5).
stephanwlee commented on Nov 16, 2018
Hi @danchyy! I would love to fix the problem. Although I can try to reproduce what you are witnessing, I think it would be easier if you could share the event file you were using. Would you be able to share it? Thanks!
danchyy commented on Nov 16, 2018
Hi @stephanwlee, I made a somewhat smaller example for which I can share the event file (the case from my original post had some custom ops which I can't share), and the results are still the same.
I used code from the Custom Estimators tutorial and just added my metric and the pr_curves plugin.
Here is the compressed event file, which you can open in TensorBoard to see pretty much the same results as in the issue described above.
stephanwlee commented on Nov 16, 2018
Hi @danchyy, thanks a lot for the event file and the extra detail! I was able to identify one problem, but I don't think I am seeing exactly what you are seeing. With the event file you attached, I see the following:

Is this correct?
danchyy commented on Nov 16, 2018
That is correct. Now if you open the Text tab, you will see that the precisions for class 1 in this example are given even for recalls beyond 0.5 (there will be 11 precisions in one column, representing precision at recall 0.0, 0.1, and so on), yet the PR chart doesn't plot them. That is the issue that is bothering me. I am not sure if this is the expected behaviour, because it looks kind of odd to me.
stephanwlee commented on Nov 16, 2018
@danchyy Ah! What you are seeing seems to relate to #444 -- only the elements up to the 6th have non-zero TP + FP, and the rest are truncated as per #444.
Context: https://github.com/tensorflow/tensorboard/blame/master/tensorboard/plugins/pr_curve/pr_curves_plugin.py#L371-L372
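Roughly, the truncation referenced at those lines behaves like the sketch below. This is an illustration with made-up counts, not the plugin's actual code: thresholds after the last one with TP + FP > 0 are dropped, which is why a curve can appear cut off.

```python
import numpy as np

# Made-up per-threshold counts; the last two thresholds predict no positives.
tp = np.array([10, 8, 6, 4, 0, 0])
fp = np.array([5, 3, 1, 0, 0, 0])

# Keep thresholds only up to the last one with any predicted positives;
# trailing entries with TP + FP == 0 are truncated from the chart.
end = int(np.max(np.nonzero(tp + fp)[0])) + 1
tp_kept, fp_kept = tp[:end], fp[:end]
```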
stephanwlee commented on Nov 17, 2018
One thing I noticed while fixing another bug is that TB seems to assume the recall is sorted in descending order. What happens if you do that?
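Sorting by descending recall would mean applying the same permutation to every per-threshold array, for example (a hypothetical NumPy sketch with made-up values):

```python
import numpy as np

recall = np.array([0.0, 0.2, 0.5, 1.0])
precision = np.array([1.0, 0.9, 0.7, 0.4])

# Permutation that puts recall in descending order; apply it to every
# per-threshold array so points stay paired up.
order = np.argsort(-recall)
recall_sorted = recall[order]
precision_sorted = precision[order]
```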
danchyy commented on Nov 19, 2018
Hey @stephanwlee,
So I guess the solution would be to fake the TPs and FPs? Since I calculate precision and recall on my own anyway, I guess I can just fill the tensors with ones so the chart renders smoothly, like this (I filled TPs, FPs, FNs, and TNs with ones):
And for your other question: if I sort the recalls in descending order (from 1.0 to 0.0), I get wrong results, like this:
By the way, there is another issue I am having: when I hover over the chart, I can only see the values at recall 0.0 and recall 1.0.
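A minimal sketch of that fill, assuming plain NumPy arrays (with TF tensors one would use tf.ones instead; shapes and names here are hypothetical):

```python
import numpy as np

num_thresholds = 11
# Fake counts: every threshold has TP + FP > 0, so the plugin's truncation
# never drops trailing points; the plotted curve then comes entirely from
# the precision/recall arrays passed alongside these counts.
tps = np.ones(num_thresholds)
fps = np.ones(num_thresholds)
tns = np.ones(num_thresholds)
fns = np.ones(num_thresholds)
```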