hook for cache encoding #2899
You should never, ever need that. What's your use case?
For testing our AWS configs we want to cache responses from botocore and use them as fixtures. The responses can include datetimes.
The cache plugin has support for creating directories and using them to store files with richer storage than JSON. For the pytest internals we limit the capabilities of the storage mechanisms to prevent edge cases. The basic idea is: if you need really fancy storage, go for your own file. While it would be easy to, say, support adding an encoder type, we'd suddenly be API-locked. What we could consider is providing a more usable variant of the loader/dumper mechanism, so that people can easily create richer setups without us needing to lock ourselves in.
Perhaps we should just go ahead and change our encoding to support that. This does not replace the general idea of a more general save/load mechanism, but would probably be enough for most users.
JSON does not support it. Also, there is the much richer encoder from execnet, which supports more data types like frozensets, sets, tuples, bytes and so on, all of them native, fundamental Python types that JSON has no native support for.
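To make the limitation concrete, here is a quick check of how the stdlib `json` module behaves on the types mentioned above (a sketch added for illustration, not part of the original discussion):

```python
import json
from datetime import datetime

# datetime objects are outside the JSON data model entirely
try:
    json.dumps(datetime(2018, 1, 1))
except TypeError:
    print("datetime: not serializable")

# sets (and frozensets) are rejected as well
try:
    json.dumps({1, 2, 3})
except TypeError:
    print("set: not serializable")

# tuples survive, but only by being silently coerced to lists,
# so a round-trip changes the type
assert json.loads(json.dumps((1, 2))) == [1, 2]
```

This is why a cache value containing datetimes fails with the default backend even though everything around it is plain dicts and lists.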
I could swear there was a flag for that, but I'm probably thinking of YAML. Yeah, then it probably is a no-go. So I think we are reaching consensus on creating new hooks for encoding/decoding the cache contents?
I believe it's reasonable to add this via a new keyword argument, for example; a whole hook seems the wrong way around.
Rough idea:

```python
def pytest_make_cache_codec(rootdir, config, ...):
    """
    Return an object with get/set signature identical to the internal cache codec.
    Return None to use the default cache codec.
    """
```
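The hook above was never adopted by pytest, but to make its shape concrete, here is a standalone sketch of a codec object with the get/set surface it describes. The `PickleCodec` name and the file layout are invented purely for illustration:

```python
import pickle
from pathlib import Path


class PickleCodec:
    """Hypothetical codec with the get/set shape sketched in the hook idea.

    Uses pickle instead of JSON, so datetimes, sets, tuples and bytes
    round-trip natively.
    """

    def __init__(self, directory):
        self.directory = Path(directory)
        self.directory.mkdir(parents=True, exist_ok=True)

    def _path(self, key):
        # flatten "plugin/key" style names into one file name
        return self.directory / (key.replace("/", "_") + ".pickle")

    def set(self, key, value):
        self._path(key).write_bytes(pickle.dumps(value))

    def get(self, key, default=None):
        try:
            return pickle.loads(self._path(key).read_bytes())
        except FileNotFoundError:
            return default
```

A plugin implementing the proposed hook would return such an object (or `None` to keep the default JSON codec).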
Hmm, keyword argument to what function?
@nicoddemus I believe we should take a look at providing an API via the cache plugin to get cache sections with a dedicated serializer/deserializer pair, the default serializer/deserializer being the existing one. From my POV the pytest hook system is entirely unsuitable for managing them, since it does not support the required flows of control and data.
An example would be `cache.set('someplugin/foo', value, serializer=date_json)` as well as a matching `get` counterpart.
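The `serializer=` argument was never implemented, but a `date_json` pair like the one named in the example could be sketched as a dump/load function pair over stdlib `json`. The ISO-format tagging scheme and all names here are assumptions for illustration:

```python
import json
from datetime import datetime


def _encode(obj):
    # tag datetimes so the decoder can recognize them on the way back
    if isinstance(obj, datetime):
        return {"__datetime__": obj.isoformat()}
    raise TypeError(f"not serializable: {obj!r}")


def _decode(mapping):
    if "__datetime__" in mapping:
        return datetime.fromisoformat(mapping["__datetime__"])
    return mapping


def date_json_dump(value):
    return json.dumps(value, default=_encode)


def date_json_load(text):
    return json.loads(text, object_hook=_decode)
```

A datetime-bearing value then survives a text round-trip, which is exactly what the botocore-fixture use case above needs.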
Hmm, sorry, what would be the advantage over what I proposed?
a) Control: your proposal gives control over which codec to use to a piece of code that has literally no idea, and is first-come, first-served.
I see, thanks for clarifying; it was not clear to me at first sight.
These two points make sense, but IMHO in practice most uses of codecs just want to save some simple Python objects, nothing too fancy. Our limitations on what can currently be saved by the cache plugin exist purely because we chose the JSON backend; if we had chosen a more complete serializer (for example pickle), which can save more objects, we probably wouldn't be having this discussion. Note that I'm not criticizing the choice of JSON in any way, given that it is adequate most of the time and is in the standard library.

Also, this poses all-new complexity which will have to be implemented from scratch, as you rightly commented that the current plugin system is not adequate for it. I wonder how different serializers would be installed and by which plugins, how to register them, etc. IMHO this seems overly complex in two aspects: implementation (which incurs the risk of never happening) and over-engineering (I don't think we need all that flexibility). I think examples of a reasonable implementation of a cache-provider plugin might be one for

Again, I think most usage of the cache plugin is for very simple objects and does not require a new multi-layer of cache encoders/decoders which can be registered multiple times and in different flavors. Also we can consider that

But of course I may just be being simplistic now and have to eat my foot later 😝
The main reason I split the specification of the serializer in such a way is to allow multiple definitions of a compatible serializer in different kinds of plugins. A more ideal way would be to let a plugin get something like a "sub-cache" that is configured with a namespace, an encoder/decoder and a file extension, the default just being JSON with .json as exposed in the global cache object. But I don't see a hook being a sane tool for that, since hooks simply have the wrong arity; for more arities that plugin systems can have, see stevedore.
(Basically the arity I see needed is specifying a "driver" for a key or a namespace; hooks are fundamentally unable to provide that.)
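A minimal sketch of the "sub-cache" shape described above: a namespace bound to an encoder/decoder pair and a file extension, with JSON as the default. The `SubCache` name and its constructor parameters are hypothetical, illustrating the arity being argued for rather than any real pytest API:

```python
import json
from pathlib import Path


class SubCache:
    """Hypothetical namespaced cache with a pluggable codec.

    Each plugin would receive its own instance, configured once with
    its namespace, dump/load functions, and file extension.
    """

    def __init__(self, root, namespace,
                 dumps=json.dumps, loads=json.loads, extension=".json"):
        self.directory = Path(root) / namespace
        self.dumps = dumps
        self.loads = loads
        self.extension = extension

    def set(self, key, value):
        self.directory.mkdir(parents=True, exist_ok=True)
        path = self.directory / (key + self.extension)
        path.write_text(self.dumps(value))

    def get(self, key, default=None):
        path = self.directory / (key + self.extension)
        if not path.exists():
            return default
        return self.loads(path.read_text())
```

The point of this arity is that the codec is chosen once, when the sub-cache is created, rather than negotiated per call or per hook invocation.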
Closing this issue as the proposal has been inactive for over a year. |
Reopening as I still see this as relevant. 👍
@nicoddemus given the lack of initiative, I think it's reasonable to go for "write your own file" until a reasonable proposal gets delivered, and based on that I'd like to close this.
Closing again as per #12465 (comment). |
It'd be handy to have a way to set JSON encoding options for the default cacheprovider. I resorted to patching `cache.set` from `pytest_configure` to encode `datetime.datetime`s. Is there a cleaner way to do this? Would `-p no:_cacheprovider` and a custom cache provider be recommended instead?