Vocabulary normative, context isn't? #1103
Comments
In a way yes, but the situation is different. I presume @msporny was considering changes to the context file that adapt it to new vocabularies popping up, using some extra quirk in the JSON-LD context mechanism, etc. However, none of this affects the vocabulary proper. To put it another way: because the vocabulary lists only the terms and specifications defined in the VCDM spec, any change (except for minor bugs like spelling mistakes, which can be handled easily) means, in fact, a change in the VCDM core spec as well, because the two are strongly linked.
The reason is that the base media type produces term URLs via the `@context` object, but there is no guarantee about what those URLs resolve to unless we make these resources normative. This also helps address confusion over "mappings" that might exploit alternative context values, such as the following:

```json
{ "@context": { "@vocab": "https://www.w3.org/2018/credentials#" } }
```

If we say a "verifier" needs to understand these terms, we probably need to make them normative, and explain how verifiers get them.
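To make concrete what `@vocab` does here: in JSON-LD, any key that has no explicit term definition is expanded by simple concatenation onto the `@vocab` IRI, so *every* key resolves to some URL in the credentials namespace, defined or not. A minimal sketch (a simplification, not a conformant JSON-LD processor):

```python
# Simplified illustration of "@vocab" term expansion; a real JSON-LD
# processor also handles keywords, compact IRIs, scoped contexts, etc.
VOCAB = "https://www.w3.org/2018/credentials#"

def expand_term(term, mappings, vocab=VOCAB):
    """Return the IRI a JSON key maps to under a context using "@vocab"."""
    if term.startswith("@"):   # JSON-LD keywords are not vocabulary terms
        return term
    if term in mappings:       # explicit term definitions take precedence
        return mappings[term]
    return vocab + term        # otherwise: plain concatenation onto @vocab

print(expand_term("issuer", {}))      # https://www.w3.org/2018/credentials#issuer
print(expand_term("notDefined", {}))  # https://www.w3.org/2018/credentials#notDefined
```

This is exactly why the mapping is hard to pin down without making the resource normative: a `@vocab`-only context maps undefined terms just as happily as defined ones.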
We seem to agree on one aspect, namely that the vocabulary should be normative. I am not sure about the context file, though.
I understand what you say, but we have to be pragmatic. What that requirement means is that any change to the context file would also mean a change to the core spec. If we simply say that the context is not normative, we avoid that problem (I am not sure how to put that into the spec editorially). What we are looking at is to declare some part of the context as stable.
I have just hit a different example in #1074: the current context file includes mappings to schema.org terms. And that is fine, we should not reinvent the wheel. If we followed the approach whereby the context should only include references to terms that are defined in the VCDM, these should be taken out, ie, the application should include a separate context reference to schema.org...
Agree with @OR13. While the proposal of making the JSON-LD `@context` file non-normative may look simpler, the file plays a role that goes beyond developer convenience. By making the JSON-LD `@context` file normative, implementations get a stable, well-defined artifact to rely on. Moreover, the JSON-LD context file is essential in providing a mapping from the standard VCDM terms to their corresponding vocabulary URLs, which makes the Linked Data representation of the VCDM consistent with the standard. This mapping is crucial for achieving proper semantic interoperability between different implementations. In conclusion, the JSON-LD `@context` file should be normative.
@iherman I think it is ok to make a context file normative that points to random websites on the internet; that is basically what w3id.org is... and that is already impossible to change in the current context, because the Data Integrity Proof vocabulary is defined by w3id.org, not W3C. The problem remains that if the context is not normative, you cannot assert that any JSON-LD object will actually use any of the URLs we are reserving... For example, I frequently replace:
with
Which is still 100% spec compliant, and saves me the trouble of handling context-related errors. If the object behind the URL is not normative, it does not need to be understood, and then it can be whatever people find useful... I find it useful for it to not cause errors, which is why I use the value above instead of the one hosted by W3C.
I have the impression that a single, binary choice on "normative"/"informative" may not be appropriate for a context file. I try, below, to characterize what I feel we should say in the specification regarding the context; maybe that would help us to move forward.
So far I side-stepped whether the vcdm context is indeed "normative" or not. Per (2) and (3) it is not, because it does not, and should not, define any feature. Per (4) it may be, because that (plus (1)) ensures the type of stability that @melvincarvalho asked for (and that I agree with) in #1103 (comment). My feeling is that the "normative" terminology can be misunderstood, hence my hesitation to declare it as normative. Note that the current spec (in §B.1) uses the SHA-256 digest of the "vcdm context" as a security measure for the stability described in (4). One alternative is to turn §B.1 into a normative appendix (it is informative now), ie, we use the SHA digest to normatively "implement" point (4) above without getting into unnecessary discussions about what it means for a context file to be normative.
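The digest mechanism §B.1 describes can be sketched in a few lines; the bytes below are placeholders, not the actual published context:

```python
import hashlib

def context_digest(context_bytes):
    """SHA-256 hex digest over the exact bytes of a retrieved context file."""
    return hashlib.sha256(context_bytes).hexdigest()

# Purely illustrative bytes; an implementation would fetch (or bundle) the
# real context file and compare its digest against the value published in
# the specification, rejecting the file on any mismatch.
cached_context = b'{"@context": {"@version": 1.1}}'
print(context_digest(cached_context))
```

Because the digest is computed over raw bytes, even a whitespace-only edit to the published file would invalidate it, which is precisely the stability property being discussed.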
I appreciate the nuances you've highlighted, especially regarding what "normative" would mean for the context file.
The issue was discussed in a meeting on 2023-06-13. List of resolutions:

View the transcript

1. Vocabulary normative, context isn't? (issue vc-data-model#1103)

See github issue vc-data-model#1103.

Ivan Herman: introduces topic - vocab vs context, starting with vocab being normative or not.
Ivan Herman: what i think is that the vcdm is obviously normative, and the vocab should also be normative, though in practice that is not always the case.
Ivan Herman: with the small diff that some terms in the security vocab may be defined in another spec, but it should still be normative.
Ivan Herman: from a purely theoretical point of view the context is just a transformation tool and does not define anything other than a mapping between urls and terms.
Ivan Herman: the context contains mapping between definitions of terms defined both in the wg and on the web at large in other well known vocabs.
Ivan Herman: the ld world does not require the context, but it is helpful on the pure json level.
Ivan Herman: the spec might actually point in the informative section.
Manu Sporny: the general question for the group is that if we make either or both normative, what changes on the implementation side.
Manu Sporny: one option is to lock everything in with a normative statement and a hash.
Manu Sporny: as far as vocab being normative we are not sure what will change there, and a lot of tests to validate that.
Manu Sporny: want clarity on what is normative - the static representation, the tests, etc.
Manu Sporny: there is a change that if we need any changes that we will need to note that things will break.
Michael Prorock: I see the multiple sides to this issue. I wanted to highlight something. I opened a PR on how to hash a context, which expanded into something it wasn't intended to. If we are going to define how this is done, we should take this into account.
Orie Steele: want to comment on how impl might change.
Manu Sporny: want to agree with a focus on hashing and statement as to url and hash.
Manu Sporny: think that this makes things easier, and lets us test stuff cleanly, while also preventing DNS poisoning, domain takeover, etc.
Manu Sporny: can do the same with vocab.
Dave Longley: something we may want to consider is jcs prior to hash.
Brent Zundel: normative approach to provide a hash and link to the context.
Michael Prorock: I'm happy to let Ivan go first.
Ivan Herman: fine with context and hash - keep to opinion that vocab should be normative.
Michael Prorock: Appreciate Ivan adding clarification to have vocab point back to core data model spec, helpful in general, good exploitation of LD in general.
Manu Sporny: concerned re certain items in vocab that might become normative statements, like range or domain and similar.
Ivan Herman: i think the way to keep things together is that range etc fall outside scope of group.
Ivan Herman: if there are statements in the vocab that fall outside the vcdm then there is an issue since they are not normatively defined by the vcdm.
Manu Sporny: concern not around discrepancies, concerns around stuff we are not testing today.
Ivan Herman: if they are in vcdm we should test them.
Manu Sporny: notes that we will have to add tests for coverage, especially data types, ranges, etc.
Ivan Herman: the vcdm does state that there are constraints on values normatively - question is do we test or not - nquads are irrelevant - we can test a multitude of ways.
Orie Steele: something we did resolve, which we didn't have last time, is that the base media type is compact json-ld, which means that unless there are additional normative requirements we don't have to test to the level being suggested.
Dmitri Zagidulin: want to push back that vocab is primary normative artifact.
Manu Sporny: +1 to dmitriz.
Michael Prorock: -1 to add stuff from elsewhere vs. just checking hashes. Have the resource and then the hash to the resource. You do want to detect those changes.
Michael Prorock: Regarding testing on vocab, there might be approaches there.
Michael Prorock: If we say something has certain parameters, value of certain shape, we have to test for it. It may mean more work, but it's something we have to do.
Ivan Herman: good to assign a hash < i think, garbled >.
Brent Zundel: concerns around testing - what i have heard is no opposition to a link to vocab / context and hash for each and normative statements that those match.
Orie Steele: think i heard: there are normative statements that should be testable.
Orie Steele: think that i also heard that there should be a normative statement that includes the hash.
Michael Prorock: Possible language for a first proposal would be that we would normatively define URL for both context and vocab and provide hash that must be included.
Manu Sporny: -1 to tying this to 1140.
Michael Prorock: We should follow model set by subresource integrity and use that mechanism if multiple hashes are provided.
Brent Zundel: is there anyone that wants changes or alternates to that proposal.
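Following the Subresource Integrity model mentioned above, each hash would be expressed as an `alg-base64(digest)` token, and a consumer accepts the resource if any listed digest matches. A sketch of producing such tokens (the input bytes are a stand-in, not the real context file):

```python
import base64
import hashlib

def sri_token(data, alg="sha256"):
    """Format a digest the way Subresource Integrity does: '<alg>-<base64>'."""
    digest = hashlib.new(alg, data).digest()
    return alg + "-" + base64.b64encode(digest).decode("ascii")

# Stand-in bytes; in practice these would be the published context file.
data = b'{"@context": {}}'
for alg in ("sha256", "sha384"):
    print(sri_token(data, alg))
```

Publishing several tokens (e.g. sha256 and sha384) lets verifiers keep working if one algorithm is later deprecated, which is the main reason SRI allows multiple hashes.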
Shigeya Suzuki: would versioning change hash - major or minor change?
Brent Zundel: not seeing or hearing objections.
The question arose yesterday (I think from @msporny) on whether there are any W3C Recs that followed the model I proposed. The answer is yes, the Web Annotation specification. This spec consists of three different Recommendation documents:
On a personal level (having been in that WG) I do not remember any major discussions about the normativeness of the vocabulary. I hope this helps.
Another question came up on our call yesterday: what do we gain by making the vocabulary normative? My answer is: clarity (and we should not underestimate the importance of that). The question that needs an answer is: where is it normatively defined what the official URL of the vocabulary is? I have already argued that the context file cannot play that role. (This confusion led to proposals to prune the context file.) If we do not use the vocabulary for this, the VCDM spec itself has to do it. Note that I do not propose to make the vocabulary description in HTML a normative document (which would follow the model of the aforementioned Web Annotation specification). What I propose is that we include the JSON-LD representation of the vocabulary in a normative appendix of the VCDM spec. (We can also "prune" that normative part not to include the "annotation" style entries.)
@iherman why not put it in a script tag?
That would be invisible to the reader. Besides, the W3C publishing rules, more specifically the statement of whether some parts are normative or not, do not apply to the content of a script tag.
This has been addressed, closing.
I was not present at the special call on 2023-04-25 but, by reading the minutes, I have the impression that one of the controversies comes from the status of the JSON-LD `@context` file, namely, whether it is normative or not. That file being normative seems to clash with the desire to provide a context file that developers can use without hassle in cases where the JSON-LD behaviour is necessary, ie, if the URDNA canonicalization is used.

So here is a proposal that, though maybe controversial, may make the situation cleaner: the `@context` file is not normative. Its goal, among other things, is to make the life of developers easy insofar as all the "usual" terms are there, mapping to the VCDM and the various security vocabulary terms.

That approach may cleanly separate the issues once and for all. The `@context` file's role is "just" to provide a mapping from the (standard) VCDM terms in JSON to the (standard) vocabulary URLs, making the Linked Data representation of the VCDM follow the standard. The JSON-LD representation of the vocabulary gives a clean specification for all VCDM terms by providing their URLs and some of their essential characteristics (type, range, domain). And it does so in a machine-readable way, ensuring that the VCDM also plays by the rules established for Linked Data.

(The same approach should be used for the Security vocabulary, with the particularity that the terms appearing in the security vocabulary may be defined by the Data Integrity spec or one of the cryptosuite specs.)
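As an illustration of the kind of mapping such a file provides, a minimal, heavily pruned context fragment might look like the following. The term selection is illustrative only, not the actual published context:

```json
{
  "@context": {
    "@protected": true,
    "VerifiableCredential": "https://www.w3.org/2018/credentials#VerifiableCredential",
    "credentialSubject": {
      "@id": "https://www.w3.org/2018/credentials#credentialSubject",
      "@type": "@id"
    },
    "issuer": {
      "@id": "https://www.w3.org/2018/credentials#issuer",
      "@type": "@id"
    }
  }
}
```

Each entry does nothing more than map a JSON key to a vocabulary URL (plus a value type); all of the actual term definitions live in the vocabulary, which is what makes it possible to treat the two files differently with respect to normativeness.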
@msporny @dlongley @OR13 @decentralgabe