
[Feature request] Tensor attributes #1213


Open
jsubag opened this issue Jul 4, 2018 · 11 comments

jsubag (Contributor) commented Jul 4, 2018

It would be useful to have more attributes on tensor objects.
Specifically: whether a tensor is an input/output of the function, and whether it is constant or mutable during inference.
Alternatively, this could be exposed as a utility function that backends can invoke.

nadavrot (Contributor) commented Jul 4, 2018

@jsubag Jacob, thanks for working on this. Input/output annotation is one solution to some problem; I am not sure it is the only solution, or the best one. We need to make sure that any new attributes match the design of the rest of the system. For example, how do new attributes interact with public/private, mutability, etc.?

Let's talk about the problem. What use case is the current system not handling well?

jsubag (Contributor, Author) commented Jul 5, 2018

@nadavrot Thanks for pointing out WeightVar::MutabilityKind, I missed that in the code :)
In that case, are the following categories correct?

  1. Private+Constant: constants known during compilation e.g. weights/biases/initial hidden states
  2. Public+Constant: constant during inference (not known during compilation) e.g. input tensors
  3. Private+Mutable: internal tensors
  4. Public+Mutable: tensors mutable during inference and exposed to the application e.g. output tensors
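
Concretely, I read that mapping as something like this sketch (the enum and function are mine for illustration, not existing Glow API):

```cpp
// Illustrative only: maps the visibility/mutability pairs above to roles.
enum class TensorRole {
  CompileTimeConstant, // 1. Private + Constant: weights/biases/initial state
  InferenceInput,      // 2. Public + Constant: input tensors
  Internal,            // 3. Private + Mutable: internal tensors
  InferenceOutput,     // 4. Public + Mutable: output tensors
};

TensorRole classify(bool isPublic, bool isMutable) {
  if (!isPublic)
    return isMutable ? TensorRole::Internal : TensorRole::CompileTimeConstant;
  return isMutable ? TensorRole::InferenceOutput : TensorRole::InferenceInput;
}
```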

This, however, doesn't cover the case where a tensor is used as both an input and an output, e.g. weights during training.
Unless that condition is illegal in Glow, and the resulting function creates separate tensors for the input and the output?
Can you clarify this point, please?

Thanks.

jfix71 (Contributor) commented Jul 5, 2018

@jsubag We actually do not have anything that can be Public+Constant currently; Public implies Mutable. So both input and output tensors are considered to be Public+Mutable.

Perhaps it makes more sense to allow input tensors to be Public+Constant, though, as they are still constant during execution even if modified/accessed before or after. If we made such a change, a tensor used as both input and output would still be Public+Mutable, as it currently is.

jsubag (Contributor, Author) commented Jul 8, 2018

@jfix71 I think what I'm missing is a clear way to figure out whether a tensor is considered an input and/or an output.
The motivation is determining what should be copied to/from the backend's memory resources before and after the forward pass.
Even though the common cases are easy to identify by traversing the graph, it probably makes more sense to either have an explicit annotation on them or some shared utility function that all backends can use.
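
As a rough sketch of the kind of shared utility I have in mind (Variable is a stand-in type here, and isInput/isOutput are exactly the queries that are missing today):

```cpp
#include <vector>

class Variable;                  // stand-in for Glow's variable type
bool isInput(const Variable *);  // the queries this issue asks for
bool isOutput(const Variable *);

// Tensors to stage before the forward pass and read back after it.
struct TransferPlan {
  std::vector<Variable *> copyToDeviceBefore;
  std::vector<Variable *> copyFromDeviceAfter;
};

TransferPlan planTransfers(const std::vector<Variable *> &vars) {
  TransferPlan plan;
  for (Variable *V : vars) {
    if (isInput(V))
      plan.copyToDeviceBefore.push_back(V);
    if (isOutput(V)) // not exclusive: a training weight can be both
      plan.copyFromDeviceAfter.push_back(V);
  }
  return plan;
}
```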

nadavrot (Contributor) commented Jul 8, 2018

@jsubag @jfix71 I think that the description of the problem in the last post is great. The next step would be to post a proposal for a design that solves the problem. @qcolombet can help brainstorm ideas for a design. What do you think?

qcolombet (Contributor) commented

Hi @jsubag,

I am not sure I get the problem. Outputs should be easy to identify: they are used by a Save node. Having a backend utility function sounds reasonable, though.
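
In sketch form, the check I have in mind (accessor names follow the graph API loosely and from memory; treat them as approximate):

```cpp
// A variable is an output if one of its users is a Save node.
bool isOutput(const Variable *V) {
  for (const NodeUse &U : V->getUsers()) {
    if (llvm::isa<SaveNode>(U.getUser()))
      return true;
  }
  return false;
}
```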

Do you see cases where we would need to explicitly tag a variable?

jsubag (Contributor, Author) commented Jul 10, 2018

Hi @qcolombet,

  1. Is it fair to say, then, that all public tensors targeted by Save nodes are outputs and all other public tensors are inputs?
    This implies inputs and outputs are mutually exclusive; does that fit with how weights are used during training?

  2. Is there something propagated from Save nodes down to the instruction/variable level?
    I.e., if a backend wants to execute a Function by going over its instructions, how can it determine whether an instruction's destination variable is a Function output?

qcolombet (Contributor) commented

Hi @jsubag,

  1. Yes, but the two are not mutually exclusive. When we build for training, we end up with variables that are used both by Save nodes and elsewhere.
  2. At the instruction level, the variables visible from the outside are represented by the WeightVar class. Those are the inputs/outputs of the program. To know which ones are outputs, you would need to check which ones are written into. That should be doable by walking through the users of the variables and checking whether they are ever used as an output (@out in the textual representation); see the sketch below.
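
Something along these lines (the instruction accessors here are approximations, written from memory):

```cpp
// A WeightVar is a program output if some instruction uses it as an
// @out (or @inout) operand.
bool isWrittenInto(const IRFunction &F, const Value *W) {
  for (const Instruction &I : F.getInstrs()) {
    for (unsigned i = 0, e = I.getNumOperands(); i < e; ++i) {
      auto op = I.getOperand(i); // a (Value *, OperandKind) pair
      if (op.first == W &&
          (op.second == OperandKind::Out || op.second == OperandKind::InOut))
        return true;
    }
  }
  return false;
}
```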

Now, for figuring out what should be copied around and when: we are reworking the graph IR so that variables are accessed exclusively through Save and new Load nodes. We would potentially lower these into new IR instructions that would make it easy to determine when such copies should occur (basically, just wherever you see such instructions).

jsubag (Contributor, Author) commented Jul 11, 2018

Hi @qcolombet,

Thanks for clarifying - that was pretty much what I understood.
I think all of these corner cases will be neatly resolved with Load/Save instructions.
This also makes it easier for backends to optimize data transfers, e.g. to copy output tensors as soon as they are ready instead of at the end of the forward pass.
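
For illustration only (TensorSaveInst and the device hooks here are made-up names, not Glow API):

```cpp
// With explicit save-style IR instructions, a backend executing the
// instruction stream can issue the device-to-host copy for an output
// right where the save occurs, instead of at the end of the pass.
for (const Instruction &I : instrs) {
  execute(I);
  if (const auto *save = llvm::dyn_cast<TensorSaveInst>(&I))
    copyFromDevice(save->getDest()); // output value is final here
}
```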

Btw, will this change be applied to all variables or just public ones? And will constants (weights/biases) be treated differently?

qcolombet (Contributor) commented

> Btw, will this change be applied to all variables or just public ones? And will constants (weights/biases) be treated differently?

All variables, public or not, constant or not.

nadavrot (Contributor) commented

@jsubag This issue is related to #1334
