Verifying Verifiable Credentials
To quote the Verifiable Credentials Data Model v1.1:
Credentials are a part of our daily lives; driver’s licenses are used to assert that we are capable of operating a motor vehicle, university degrees can be used to assert our level of education, and government-issued passports enable us to travel between countries. These credentials provide benefits to us when used in the physical world, but their use on the Web continues to be elusive.
This (verifiable credentials) specification provides a standard way to express credentials on the Web in a way that is cryptographically secure, privacy respecting, and machine-verifiable.
This note walks through a verification procedure for one particular format of verifiable credential: verifiable credentials (VCs) expressed in JSON-LD, which builds on the ubiquitous JSON data-interchange format used throughout the web and beyond.
Extra files: Proofs_JSON_LDv2.ipynb
JSON-LD Based Proofs
A number of specifications, some still emerging, describe how VCs can be “secured”. Here we will look at the “digital signing” of VCs, drawing on the following specifications:
- Verifiable Credential Data Integrity 1.0: Securing the Integrity of Verifiable Credential Data, latest as of October 2022.
- JSON-LD Website
- JSON-LD 1.1: A JSON-based Serialization for Linked Data, W3C Recommendation 16 July 2020.
- EdDSA Cryptosuite v2020, Draft Community Group Report 31 October 2022.
To walk through examples of cryptographic verification we will use the following Python packages, in addition to what a typical data science installation (e.g., Anaconda) provides:
- PyLD (GitHub: PyLD, PyPI: PyLD), for JSON-LD processing.
- cryptography: generally comes with Anaconda and other Python distributions; used for basic cryptographic operations.
- py-multibase, for converting between binary and text representations.
- py-multicodec, for adding and removing “codec” prefix information.
from pyld import jsonld
from cryptography.hazmat.primitives import hashes
from multibase import encode, decode # binary <=> text
import multicodec # Add/Remove "codec" information
from cryptography.hazmat.primitives import serialization # For working with key
from cryptography.hazmat.primitives.asymmetric import ed25519 # For sign and verify
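As a quick sanity check of the Ed25519 API we will rely on later, here is a throwaway sign/verify round trip. The key is generated on the fly for this demonstration only and is unrelated to the credential below.

```python
from cryptography.hazmat.primitives.asymmetric import ed25519

sk = ed25519.Ed25519PrivateKey.generate()  # throwaway key, demo only
pk = sk.public_key()
sig = sk.sign(b"hello")   # Ed25519 signatures are always 64 bytes
pk.verify(sig, b"hello")  # raises InvalidSignature if the check fails
print(len(sig))           # prints: 64
```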
Walkthrough: Verification of a Credential
The data integrity specification has a general section on verifying proofs. Some of the essential steps are:
- Let unsecuredDocument be a copy of securedDocument with the proof value removed.
- Let transformedDocument be the result of transforming the unsecuredDocument according to a transformation algorithm associated with the cryptographic suite specified in proof and the options parameters provided as inputs to the algorithm. The type of cryptographic suite is specified by the proof.type value and MAY be further described by cryptographic suite-specific properties expressed in proof.
- Let hashData be the result of hashing the transformedDocument according to a hashing algorithm associated with the cryptographic suite specified in the proof and options parameters provided as inputs to the algorithm.
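The steps above can be sketched as a small helper. Note that `transform_and_hash` is a hypothetical name, the `transform` argument stands in for the cryptographic suite's transformation algorithm (a trivial serializer here; the real suite uses RDF canonicalization, applied later in this note), and SHA-256 is assumed as the hashing algorithm.

```python
import hashlib
import json
from copy import deepcopy

def transform_and_hash(secured_document, transform):
    """Strip the proof, transform the rest, then hash the result."""
    unsecured = deepcopy(secured_document)   # unsecuredDocument
    proof = unsecured.pop("proof", None)
    transformed = transform(unsecured)       # transformedDocument
    hash_data = hashlib.sha256(transformed.encode("utf8")).digest()
    return proof, hash_data

# Usage with a trivial stand-in transform:
doc = {"claim": "example", "proof": {"type": "Ed25519Signature2020"}}
proof, h = transform_and_hash(doc, lambda d: json.dumps(d, sort_keys=True))
print(proof["type"], len(h))  # prints: Ed25519Signature2020 32
```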
To make this concrete we use the EdDSA Cryptosuite v2020, Draft Community Group Report 31 October 2022.
Getting a Sample Credential
At the time of this writing, interoperability testing of VCs was occurring and the CHAPI Playground was available as a source of test VCs. Note: CHAPI = Credential Handler API.
Below I chose a VC whose images are referenced by URL rather than embedded inline, to avoid excessive inline text.
# Got this from [CHAPI Playground](https://playground.chapi.io/issuer)
test_doc = {
  "@context": [
    "https://www.w3.org/2018/credentials/v1",
    "https://purl.imsglobal.org/spec/ob/v3p0/context.json",
    "https://w3id.org/security/suites/ed25519-2020/v1"
  ],
  "id": "urn:uuid:a63a60be-f4af-491c-87fc-2c8fd3007a58",
  "type": [
    "VerifiableCredential",
    "OpenBadgeCredential"
  ],
  "name": "JFF x vc-edu PlugFest 2 Interoperability",
  "issuer": {
    "type": [
      "Profile"
    ],
    "id": "did:key:z6MkrhRudUu39qsdsp3xCFufZ1NS8TkfVW6rBpkfgRKK9Pe8",
    "name": "Jobs for the Future (JFF)",
    "image": {
      "id": "https://w3c-ccg.github.io/vc-ed/plugfest-1-2022/images/JFF_LogoLockup.png",
      "type": "Image"
    }
  },
  "issuanceDate": "2022-11-03T17:21:34.152Z",
  "credentialSubject": {
    "type": [
      "AchievementSubject"
    ],
    "id": "did:key:123",
    "achievement": {
      "id": "urn:uuid:bd6d9316-f7ae-4073-a1e5-2f7f5bd22922",
      "type": [
        "Achievement"
      ],
      "name": "JFF x vc-edu PlugFest 2 Interoperability",
      "description": "This credential solution supports the use of OBv3 and w3c Verifiable Credentials and is interoperable with at least two other solutions. This was demonstrated successfully during JFF x vc-edu PlugFest 2.",
      "criteria": {
        "narrative": "Solutions providers earned this badge by demonstrating interoperability between multiple providers based on the OBv3 candidate final standard, with some additional required fields. Credential issuers earning this badge successfully issued a credential into at least two wallets. Wallet implementers earning this badge successfully displayed credentials issued by at least two different credential issuers."
      },
      "image": {
        "id": "https://w3c-ccg.github.io/vc-ed/plugfest-2-2022/images/JFF-VC-EDU-PLUGFEST2-badge-image.png",
        "type": "Image"
      }
    }
  },
  "proof": {
    "type": "Ed25519Signature2020",
    "created": "2022-11-03T17:21:34Z",
    "verificationMethod": "did:key:z6MkrhRudUu39qsdsp3xCFufZ1NS8TkfVW6rBpkfgRKK9Pe8#z6MkrhRudUu39qsdsp3xCFufZ1NS8TkfVW6rBpkfgRKK9Pe8",
    "proofPurpose": "assertionMethod",
    "proofValue": "z4CeSVwGLeaKmT2CWR8SWjESsPmBRvUBR5BLYCw2Ehe5pzwUfoK5SMttaiHYVH72ZYTfmg9Bdq2VsSg9GUCqG6ZCR"
  }
}
Extract Proof
Extract the proof portion of the credential and look at it as nicely formatted JSON. We need to work with the document without the proof and with the proof portion separately.
import json
# Gets the proof and removes it from the document
test_proof = test_doc.pop("proof")
print(json.dumps(test_proof, indent=2))
{
  "type": "Ed25519Signature2020",
  "created": "2022-11-03T17:21:34Z",
  "verificationMethod": "did:key:z6MkrhRudUu39qsdsp3xCFufZ1NS8TkfVW6rBpkfgRKK9Pe8#z6MkrhRudUu39qsdsp3xCFufZ1NS8TkfVW6rBpkfgRKK9Pe8",
  "proofPurpose": "assertionMethod",
  "proofValue": "z4CeSVwGLeaKmT2CWR8SWjESsPmBRvUBR5BLYCw2Ehe5pzwUfoK5SMttaiHYVH72ZYTfmg9Bdq2VsSg9GUCqG6ZCR"
}
Canonize the Document without Proof
JSON and JSON-LD documents can contain the same information but look very different. For example, the order in which the fields of a JSON object occur can be changed with no effect on the use of the object. This property is problematic when trying to assert data integrity, since such equivalent objects would not generate the same “signatures”.
The solution is to apply a canonicalization scheme that produces a unique representation. Standard schemes are available for JSON (JSON Canonicalization Scheme (JCS): RFC 8785) and for JSON-LD (RDF Dataset Canonicalization: A Standard RDF Dataset Canonicalization Algorithm, Final Community Group Report 15 October 2022). The outputs of such algorithms are typically not pretty to look at, but we will examine the output for informational purposes. Note that JSON-LD canonical output is quite different from JSON canonical output; we only use JSON-LD here.
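A toy illustration of why canonicalization matters: two objects holding the same data in different key orders serialize to different bytes until a canonical form is applied. Here `json.dumps` with sorted keys and fixed separators serves as a rough stand-in for JCS; the real JCS and RDF canonicalization algorithms have many more rules (number formatting, string escaping, blank-node labeling, etc.).

```python
import json

a = {"b": 2, "a": 1}
b = {"a": 1, "b": 2}  # same data, different insertion order

# Naive serialization preserves insertion order, so the bytes differ...
assert json.dumps(a) != json.dumps(b)

# ...while a canonical form is identical for both objects.
def canon(d):
    return json.dumps(d, sort_keys=True, separators=(",", ":"))

assert canon(a) == canon(b) == '{"a":1,"b":2}'
```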
test_cannon = jsonld.normalize(
    test_doc, {'algorithm': 'URDNA2015', 'format': 'application/n-quads'})
# Look at only a limited part of the result
print(test_cannon[0:400])
<did:key:123> <http://www.w3.org/1999/02/22-rdf-syntax-ns#type> <https://purl.imsglobal.org/spec/vc/ob/vocab.html#AchievementSubject> .
<did:key:123> <https://purl.imsglobal.org/spec/vc/ob/vocab.html#Achievement> <urn:uuid:bd6d9316-f7ae-4073-a1e5-2f7f5bd22922> .
<did:key:z6MkrhRudUu39qsdsp3xCFufZ1NS8TkfVW6rBpkfgRKK9Pe8> <http://www.w3.org/1999/02/22-rdf-syntax-ns#type> <https://purl.imsglobal.org/
Hash the Canonical Document
After obtaining a canonical version of the document we can now compute a cryptographic hash of it. The specification indicates that SHA-256 (Wikipedia: SHA-256) should be used.
# Compute the hash
digest = hashes.Hash(hashes.SHA256())
digest.update(test_cannon.encode('utf8'))
doc_hash = digest.finalize()
print(len(doc_hash)) # Should be 32
print(doc_hash.hex())
32
f1b5ac838c30956beff1cdb3f85d2fdc3bbd64595c8d3d25dc68912528bad6cf
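As a cross-check, the standard library's hashlib produces the same SHA-256 digests as the cryptography package used above. Here it is run against the well-known FIPS 180 test vector for the input "abc" rather than the canonized document.

```python
import hashlib

# hashlib and cryptography compute the same SHA-256 function.
print(hashlib.sha256(b"abc").hexdigest())
# ba7816bf8f01cfea414140de5dae2223b00361a396177a9cb410ff61f20015ad
```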
Protecting the Proof “Options”
Not stated in the generic procedure referenced above is that the “options” portion of the proof should also be protected, i.e., the part of the proof that carries information about the algorithm used, who is signing, and so on, but not the actual signature value.
To do this we will create a JSON-LD document from the proof portion of the document without the actual signature value, i.e., remove any (jws, signatureValue, proofValue) fields. We will then add in the JSON-LD @context information from the VC, canonize it, and then take the hash.
# Create "proof options" JSON-LD
# Make a copy of proof and remove signature fields
reduced_proof = test_proof.copy()
del_stuff = ["jws", "signatureValue", "proofValue"]
for del_thing in del_stuff:
    if del_thing in reduced_proof:
        del reduced_proof[del_thing]
# Add in the JSON-LD context from the document
reduced_proof["@context"] = test_doc["@context"]
print(json.dumps(reduced_proof, indent=2)) # Print it nicely
{
  "type": "Ed25519Signature2020",
  "created": "2022-11-03T17:21:34Z",
  "verificationMethod": "did:key:z6MkrhRudUu39qsdsp3xCFufZ1NS8TkfVW6rBpkfgRKK9Pe8#z6MkrhRudUu39qsdsp3xCFufZ1NS8TkfVW6rBpkfgRKK9Pe8",
  "proofPurpose": "assertionMethod",
  "@context": [
    "https://www.w3.org/2018/credentials/v1",
    "https://purl.imsglobal.org/spec/ob/v3p0/context.json",
    "https://w3id.org/security/suites/ed25519-2020/v1"
  ]
}
proof_canon = jsonld.normalize(
    reduced_proof, {'algorithm': 'URDNA2015', 'format': 'application/n-quads'})
print("Proof Options Canonicalized:")
print(proof_canon[0:300])
temp_digest = hashes.Hash(hashes.SHA256())
temp_digest.update(proof_canon.encode('utf8'))
proof_hash = temp_digest.finalize()
print("Proof Options hash in hex:")
print(proof_hash.hex())
Proof Options Canonicalized:
_:c14n0 <http://purl.org/dc/terms/created> "2022-11-03T17:21:34Z"^^<http://www.w3.org/2001/XMLSchema#dateTime> .
_:c14n0 <http://www.w3.org/1999/02/22-rdf-syntax-ns#type> <https://w3id.org/security#Ed25519Signature2020> .
_:c14n0 <https://w3id.org/security#proofPurpose> <https://w3id.org/security#as
Proof Options hash in hex:
73ca75164d8a3a6a01660bc521c77c8df24b3b904e3926aa3f5f8d1d4a034be4
Key and Signature Extraction
For the particular verification method we are dealing with, the public key is included in the verification method string itself; see did:key. We need to convert both the public key and the signature to raw bytes for use in signature verification with the Python cryptography library.
# Extracting the encoded key from the verification method string
veri_string = test_proof["verificationMethod"]
pound_index = veri_string.index("#")
encode_key = veri_string[pound_index+1:]
print("Encoded public key:")
print(encode_key)
Encoded public key:
z6MkrhRudUu39qsdsp3xCFufZ1NS8TkfVW6rBpkfgRKK9Pe8
# Get raw key bytes and check the multicodec prefix
multi_key = decode(encode_key)
print(f"Type of key: {multicodec.get_codec(multi_key)}") # should be ed25519
key_bytes = multicodec.remove_prefix(multi_key)
print(f"Length of key: {len(key_bytes)}") # Should be 32 bytes
print("Key in hex:")
print(key_bytes.hex())
Type of key: ed25519-pub
Length of key: 32
Key in hex:
b5ed86b9bb4d4a53a8a7449d443ba27575fa964710c89194a1dc3194a484faa1
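For the curious, what the multibase/multicodec layers are doing can be reproduced with a hand-rolled base58btc decoder. This is for illustration only; use py-multibase and py-multicodec in practice as above.

```python
# Bitcoin-style base58 alphabet (no 0, O, I, or l).
B58_ALPHABET = "123456789ABCDEFGHJKLMNPQRSTUVWXYZabcdefghijkmnopqrstuvwxyz"

def b58decode(s: str) -> bytes:
    n = 0
    for ch in s:
        n = n * 58 + B58_ALPHABET.index(ch)
    raw = n.to_bytes((n.bit_length() + 7) // 8, "big")
    pad = len(s) - len(s.lstrip("1"))  # leading "1"s encode zero bytes
    return b"\x00" * pad + raw

# The leading "z" of the encoded key is the multibase prefix meaning
# base58btc; strip it before decoding.
multi_key = b58decode("6MkrhRudUu39qsdsp3xCFufZ1NS8TkfVW6rBpkfgRKK9Pe8")
assert multi_key[:2] == b"\xed\x01"  # multicodec varint prefix: ed25519-pub
key_bytes = multi_key[2:]            # the 32 raw public key bytes
print(len(key_bytes), key_bytes.hex())
```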
# Get raw signature value in bytes
sig_string = test_proof["proofValue"]
print(f"Raw signature: {sig_string}")
sig_bytes = decode(sig_string)
print(f"Length of signature: {len(sig_bytes)}") # Should be 64 bytes
print("Signature in hex:")
print(sig_bytes.hex())
Raw signature: z4CeSVwGLeaKmT2CWR8SWjESsPmBRvUBR5BLYCw2Ehe5pzwUfoK5SMttaiHYVH72ZYTfmg9Bdq2VsSg9GUCqG6ZCR
Length of signature: 64
Signature in hex:
a017f8e7fd3ffc6953438ce986837ee4218ead6b128294f318230732621ed06f9c5b471e3753e6d90c5628a40b1e769ccf2e13e1d1ccd9c6e9ff7a16de1f5f0e
# Get the "public key" object from raw bytes for later verification
loaded_public_key = ed25519.Ed25519PublicKey.from_public_bytes(key_bytes)
Combining Hashes and Verifying Signature
We now take the hash of the “proof options” and concatenate the hash of the document (without proof) to it. Then we check this combined hash with the signature and public key.
test_combined_hash = proof_hash + doc_hash
print(test_combined_hash.hex())
print(f"Length of combined hashes: {len(test_combined_hash)}")
73ca75164d8a3a6a01660bc521c77c8df24b3b904e3926aa3f5f8d1d4a034be4f1b5ac838c30956beff1cdb3f85d2fdc3bbd64595c8d3d25dc68912528bad6cf
Length of combined hashes: 64
# Verify it...
try:
    loaded_public_key.verify(sig_bytes, test_combined_hash)
    print("Verified!!!")
except Exception:
    print("Not Verified!!!")
Verified!!!
# Let's check that it will throw an error if the signature is wrong
temp = list(sig_bytes)
temp[0] ^= 1  # Flip one bit of the signature
bad_sig = bytes(temp)  # Now it's bad ;-)
try:
    loaded_public_key.verify(bad_sig, test_combined_hash)
except Exception:
    print("Could not verify!!!")
Could not verify!!!