Arabic Text Problem During Clearance

When we send a Standard Tax Invoice for clearance to the Simulation Portal, clearance fails whenever the invoice contains any Arabic text (e.g. a line-item description). With non-Arabic text only, the same invoice passes, is cleared, and the XML is signed by your Web API.

We are using your SDK (3.2.1 or 3.2.2) for .NET.

We would appreciate any assistance from the group.

Below is the error we received from your Web API.

{"validationResults":{"infoMessages":[{"type":"INFO","code":"XSD_ZATCA_VALID","category":"XSD validation","message":"Complied with UBL 2.1 standards in line with ZATCA specifications","status":"PASS"}],"warningMessages":[],"errorMessages":[{"type":"ERROR","code":"invalid-invoice-hash","category":"INVOICE_HASHING_ERRORS","message":"The invoice hash API body does not match the (calculated) Hash of the XML","status":"ERROR"}],"status":"ERROR"},"clearanceStatus":"NOT_CLEARED","clearedInvoice":null}

As discussed during the Q&A session, kindly check the sample XMLs shipped with the SDK. The samples contain description and address fields in both Arabic and English, and they validate successfully.

Our XML also validates successfully, if you are referring to XSD/tag validation. The problem arises when we submit the XML for clearance and ZATCA returns invoice hashing errors. We followed your instructions, and below is what we sent to our account manager.

  1. Submitting the XML without modification – we deliberately submitted the invoice knowing it would return the "certificate errors" below, simply to follow the webinar host's instructions.

{"validationResults":{"infoMessages":[{"type":"INFO","code":"XSD_ZATCA_VALID","category":"XSD validation","message":"Complied with UBL 2.1 standards in line with ZATCA specifications","status":"PASS"}],"warningMessages":[],"errorMessages":[{"type":"ERROR","code":"invalid-invoice-hash","category":"INVOICE_HASHING_ERRORS","message":"The invoice hash API body does not match the (calculated) Hash of the XML","status":"ERROR"},{"type":"ERROR","code":"certificate-permissions","category":"CERTIFICATE_ERRORS","message":"User only allowed to use the vat number that exists in the authentication certificate","status":"ERROR"}],"status":"ERROR"},"clearanceStatus":"NOT_CLEARED","clearedInvoice":null}

  2. Submitting with the Supplier party changed to the correct VAT number – this gives the same "INVOICE_HASHING_ERRORS" error when we submit our own XML invoice.

{"validationResults":{"infoMessages":[{"type":"INFO","code":"XSD_ZATCA_VALID","category":"XSD validation","message":"Complied with UBL 2.1 standards in line with ZATCA specifications","status":"PASS"}],"warningMessages":[],"errorMessages":[{"type":"ERROR","code":"invalid-invoice-hash","category":"INVOICE_HASHING_ERRORS","message":"The invoice hash API body does not match the (calculated) Hash of the XML","status":"ERROR"}],"status":"ERROR"},"clearanceStatus":"NOT_CLEARED","clearedInvoice":null}

Please make sure to generate a new hash using the command below from the SDK:
fatoora -generateHash -invoice [invoicename.xml]

Then replace the first [DigestValue] tag value in the invoice with the newly generated hash. Also make sure the new hash is the one sent in the JSON request body of the Clearance API.
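
For illustration only: if you compute the hash in your own .NET code, a minimal sketch under the assumption that the invoice hash is a Base64-encoded SHA-256 digest of the invoice bytes would look like the code below. The SDK may additionally canonicalize the XML and strip signature-related elements before hashing, so please keep treating the fatoora -generateHash output as the reference value; the class, method, and file names here are illustrative and are not part of the SDK.

// InvoiceHashSketch.cs - illustrative only, not the SDK implementation.
using System;
using System.IO;
using System.Security.Cryptography;

static class InvoiceHashSketch
{
    // Base64(SHA-256(bytes)) - the value that would go into the first
    // <ds:DigestValue> element and into the invoice hash field of the request body.
    public static string ComputeHash(byte[] invoiceBytes)
    {
        using var sha256 = SHA256.Create();
        return Convert.ToBase64String(sha256.ComputeHash(invoiceBytes));
    }

    public static void Main()
    {
        // Read raw bytes so no re-encoding can silently change Arabic characters.
        byte[] invoiceBytes = File.ReadAllBytes("invoice.xml"); // hypothetical file name
        Console.WriteLine(ComputeHash(invoiceBytes));
    }
}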

I did exactly that today using our application and Postman, but I still get the same hashing error. We only have the problem with XML containing Arabic text; a plain XML invoice without Arabic text passes and is cleared.
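
For reference, this is roughly how we assemble the clearance request body, making sure the Base64 invoice is taken from the exact bytes of the file that was hashed. The field names invoiceHash, uuid, and invoice are our assumption about the request body, and the helper below is an illustrative sketch rather than our production code.

// ClearanceRequestSketch.cs - illustrative only.
using System;
using System.IO;
using System.Text.Json;

static class ClearanceRequestSketch
{
    // invoiceHash: the value produced by fatoora -generateHash for this exact file.
    public static string BuildBody(string invoicePath, string uuid, string invoiceHash)
    {
        // Raw bytes of the signed XML file - no string round-trip, no re-encoding.
        byte[] invoiceBytes = File.ReadAllBytes(invoicePath);

        // Field names below are assumptions about the clearance API request body.
        return JsonSerializer.Serialize(new
        {
            invoiceHash,
            uuid,
            invoice = Convert.ToBase64String(invoiceBytes)
        });
    }
}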

Please share the invoice, the generated hash, and the Postman collection you used through the email channel so we can investigate from our side.

We have already sent the invoice and the error message by email to the person in charge of our account.

This is to update you on what we have done on our part to rule out the possibility that we are providing a wrong hash value in the DigestValue / invoice hash tags of our XML.

  1. Hashed the XML invoice (complete XML tags as per the sample tax invoice) and replaced the DigestValue with the new hash value.
    1.1 Submitted the XML invoice for clearance on your Web API (English only). – We got a web response of cleared, with the newly stamped invoice.
    1.2 Submitted the XML invoice for clearance on your Web API (with Arabic). – We got a web response that the invoice failed to clear, with the JSON message below:
    {"validationResults":{"infoMessages":[{"type":"INFO","code":"XSD_ZATCA_VALID","category":"XSD validation","message":"Complied with UBL 2.1 standards in line with ZATCA specifications","status":"PASS"}],"warningMessages":[],"errorMessages":[{"type":"ERROR","code":"invalid-invoice-hash","category":"INVOICE_HASHING_ERRORS","message":"The invoice hash API body does not match the (calculated) Hash of the XML","status":"ERROR"}],"status":"ERROR"},"clearanceStatus":"NOT_CLEARED","clearedInvoice":null}

  2. Hashed the XML invoice with the UBL extension removed and replaced the DigestValue with the new hash value.
    2.1 Put the UBL extension back, together with the new DigestValue derived from hashing the XML invoice in step 2.
    2.1.1 Submitted the XML invoice for clearance on your Web API (English only). – We got a web response of cleared, with the newly stamped invoice.
    2.1.2 Submitted the XML invoice for clearance on your Web API (with Arabic). – We got a web response that the invoice failed to clear, with the JSON below:
    {"validationResults":{"infoMessages":[{"type":"INFO","code":"XSD_ZATCA_VALID","category":"XSD validation","message":"Complied with UBL 2.1 standards in line with ZATCA specifications","status":"PASS"}],"warningMessages":[],"errorMessages":[{"type":"ERROR","code":"invalid-invoice-hash","category":"INVOICE_HASHING_ERRORS","message":"The invoice hash API body does not match the (calculated) Hash of the XML","status":"ERROR"}],"status":"ERROR"},"clearanceStatus":"NOT_CLEARED","clearedInvoice":null}

As a result of this testing and debugging, we found that hashing the invoice using the ZATCA SDK gives the same result whether or not the UBL extension tag is present, which means we are providing the correct hash value when submitting the XML invoice.
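
One more check that might be worth running on our side, since the failure only appears with Arabic text: if the invoice bytes are re-encoded anywhere between hashing and submission, English-only content gives identical bytes under most encodings while Arabic characters do not, which would match the symptom we see. This is only a hypothesis about a possible cause; the strings and the encoding choice in the sketch below are hypothetical.

// EncodingCheckSketch.cs - illustrative only; shows why a wrong encoding is
// invisible for English text but changes the bytes (and therefore the hash)
// as soon as Arabic characters are present.
using System;
using System.Security.Cryptography;
using System.Text;

static class EncodingCheckSketch
{
    static string HashOf(byte[] bytes)
    {
        using var sha256 = SHA256.Create();
        return Convert.ToBase64String(sha256.ComputeHash(bytes));
    }

    public static void Main()
    {
        string english = "<cbc:Name>Pencil</cbc:Name>";
        string arabic = "<cbc:Name>قلم رصاص</cbc:Name>";
        Encoding latin1 = Encoding.GetEncoding("ISO-8859-1"); // stands in for any non-UTF-8 code page

        foreach (string text in new[] { english, arabic })
        {
            // Hashes match for the English line but differ for the Arabic line,
            // because the two encodings only agree on ASCII characters.
            Console.WriteLine(text);
            Console.WriteLine("  UTF-8 hash:      " + HashOf(Encoding.UTF8.GetBytes(text)));
            Console.WriteLine("  ISO-8859-1 hash: " + HashOf(latin1.GetBytes(text)));
        }
    }
}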

Please update us.

Hi Team,

We are also facing the same issue. Please suggest a solution for this.