
Adding Nuclear Norm Constraint into conditional_gradient optimizer #1105


Closed
pkan2 opened this issue Feb 18, 2020 · 0 comments


pkan2 commented Feb 18, 2020

Describe the feature and the current behavior/state.
The current conditional_gradient optimizer supports only a Frobenius norm constraint. We can improve the optimizer by also providing the choice of a nuclear norm constraint, based on the following update formula:

variable -= (1-learning_rate) * (variable
    + lambda_ * top_singular_vector(gradient))

where learning_rate and lambda_ are parameters supplied when the optimizer is constructed.
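
For illustration, here is a minimal sketch of how this update could be computed in TensorFlow. It assumes top_singular_vector(gradient) means the rank-1 outer product of the leading left and right singular vectors of the (matrix-shaped) gradient; the function name and this interpretation are assumptions for the sketch, not the final implementation.

import tensorflow as tf

def nuclear_norm_conditional_gradient_step(variable, gradient, learning_rate, lambda_):
    # tf.linalg.svd returns (s, u, v) with singular values in descending order,
    # so the first columns of u and v are the leading singular vectors.
    s, u, v = tf.linalg.svd(gradient)
    # Assumed interpretation of top_singular_vector(gradient): u_1 * v_1^T.
    top_singular = tf.matmul(u[:, 0:1], v[:, 0:1], transpose_b=True)
    # variable -= (1 - learning_rate) * (variable + lambda_ * top_singular_vector(gradient))
    variable.assign_sub(
        (1.0 - learning_rate) * (variable + lambda_ * top_singular))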

Relevant information

  • Are you willing to contribute it (yes/no): Yes

  • Are you willing to maintain it going forward? (yes/no): Yes

  • Is there a relevant academic paper? (if so, where):
    https://arxiv.org/pdf/1803.06453.pdf

  • Is there already an implementation in another framework? (if so, where):
    No.

  • Was it part of tf.contrib? (if so, where):
    No.

Which API type would this fall under (layer, metric, optimizer, etc.)
Optimizer.

Who will benefit with this feature?
We provide an API for an optimizer that can enforce hard constraints on neural network parameters. It is based on the conditional gradient (Frank-Wolfe) algorithm. The community primarily benefiting from this feature would be machine learning researchers and scientists.

Any other info.
This is an improvement over the current version of the conditional_gradient optimizer, offering more choices of norm constraint, for example as sketched below.
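
A hypothetical usage sketch of how the norm choice could be exposed. The existing optimizer already accepts learning_rate and lambda_; the ord argument name and its values are assumptions for illustration only, not a settled API.

import tensorflow_addons as tfa

# Existing behavior: Frobenius norm constraint.
cg_frobenius = tfa.optimizers.ConditionalGradient(
    learning_rate=0.99, lambda_=0.01)

# Proposed: select the nuclear norm constraint instead
# (the `ord` keyword is hypothetical).
cg_nuclear = tfa.optimizers.ConditionalGradient(
    learning_rate=0.99, lambda_=0.01, ord='nuclear')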
