ConsumerDataStandardsAustralia / standards

Work space for data standards development in Australia under the Consumer Data Right regime

Decision Proposal 240 - ADR Metrics #240

Closed CDR-API-Stream closed 2 years ago

CDR-API-Stream commented 2 years ago

This decision proposal contains a proposal for the addition to the regime of an API to collect metrics from Accredited Data Recipients.

The decision proposal is embedded below: Decision Proposal 240 - ADR Metrics.pdf

Note that this proposal has arisen as a result of discussion at the February and March meetings of the Data Standards Advisory Committee. The Advisory Committee members have already reviewed the proposal which is now being opened up for public review and comment.

Consultation on this proposal has been extended to the 27th of May 2022 due to the caretaker period arising from the Federal election.
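
For illustration only, a minimal sketch of the shape such an ADR Metrics response payload might take is below. This is purely indicative: the field names (averageResponse, schemaErrors, latencyMetrics, meta) are drawn from the discussion later in this thread, not from the normative definitions in the embedded PDF.

```typescript
// Hypothetical sketch only. Field names are taken from the discussion in this
// thread (averageResponse, schemaErrors, latencyMetrics, meta) and are NOT the
// normative definitions in the attached PDF.

interface AdrMetricsResponse {
  data: {
    // Average response time (ms) observed by the ADR, keyed by Data Holder brand
    averageResponse?: Record<string, number>;
    // Count of Data Holder responses that failed schema validation, per brand
    schemaErrors?: Record<string, number>;
    // Implied data latency observations (hours), per brand
    latencyMetrics?: Record<string, number>;
  };
  links: {
    self: string;
  };
  // Optionality of meta is queried later in this thread; modelled as optional here
  meta?: Record<string, unknown>;
}
```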

CDR-API-Stream commented 2 years ago

We've put together a short video introducing Decision Proposal 240 - ADR Metrics.

Video: https://youtu.be/TiCI1jIIako

perlboy commented 2 years ago

This proposal risks introducing more vanity metrics for the ACCC to misinterpret as non-compliance, and is therefore likely to do more damage to the ecosystem than good.

By way of example, the current incidents raised by the ACCC have involved various misinterpretations of the NFRs, including, but not limited to:

Given that context, specifically on this proposal:

  1. averageResponse is not an NFR and providing this information will only result in another inaccurate data point to be misused and misinterpreted by the regulator (which is already happening).
  2. *schemaErrors is highly subjective, especially given the very large number of conditional values in the ecosystem and the fact that the only official schema validation is dsb-schema-tools, which, by the very definition of JSON Schema, cannot process business rules to assess beyond null or not null ("Hot Dog, Not Hot Dog").
  3. latencyMetrics is very likely to result in a very high number of false positives for investigation because it is based on implied data latency. By way of example, the gap between a bank transaction date/time and when the API call is made will, in all likelihood, be an ever-increasing number because the vast majority of sharing arrangements right now are attached to Production Verification Testing accounts (i.e. "fake" accounts intended for testing without continuous transaction flow); a rough sketch of this effect follows this list.
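
As a rough sketch of the third point, the snippet below shows how an implied data latency figure would typically be derived and why a dormant test account inflates it indefinitely. Field and function names here are illustrative assumptions, not taken from the DP.

```typescript
// Hypothetical illustration of "implied data latency": the gap between the newest
// transaction a Data Holder returns and the moment the ADR makes the call.
// Field and function names are illustrative, not taken from the DP.

interface Transaction {
  postingDateTime: string; // ISO 8601 timestamp, as in a Get Transactions payload
}

// Latency in hours between the newest returned transaction and the API call time.
function impliedDataLatencyHours(transactions: Transaction[], calledAt: Date): number | null {
  if (transactions.length === 0) return null;
  const newest = Math.max(...transactions.map(t => Date.parse(t.postingDateTime)));
  return (calledAt.getTime() - newest) / (1000 * 60 * 60);
}

// A Production Verification Testing account with no ongoing transaction flow makes
// this figure grow without bound even though the Data Holder is behaving correctly.
const dormantTestAccount: Transaction[] = [{ postingDateTime: "2021-07-01T00:00:00Z" }];
console.log(impliedDataLatencyHours(dormantTestAccount, new Date()));
```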

Finally, I'm confused why the DSB is expending resources on this sort of proposal when adoption is vanishingly small - it appears to be public capital expenditure on contracting resources purely for the benefit of the government and not the taxpayer. Consumers don't care about the government's vanity metrics (they, in all likelihood, don't even know they exist); they do care about whether they can achieve what they want to achieve. Perhaps if the DSB and the Government focused on increasing adoption and utilisation of the ecosystem (along with innovation), or set up their own data recipient to collect these metrics, there would actually be some real data points worth discussing with regulated entities.

P.S. meta isn't marked as optional or mandatory; given the DSB seems predisposed to publishing it as Mandatory in the energy specs, it should probably be clearly specified in this DP as Optional.

commbankoss commented 2 years ago

CBA welcomes the opportunity to provide feedback on DP-240.

At a high level, CBA recommends that the following principles are adopted:

  1. Metrics must be measurable;
  2. The formula to derive the metric from the data is clearly defined, transparent and aligned to objectives; and
  3. There is a clearly defined framework for using metrics to meet stated objectives.

To provide substantive feedback on this DP before it closes, CBA would appreciate the DSB confirming (or adding to) our understanding of the objectives and issues that the DP seeks to address.

After reviewing both DP-240 and DP-145, CBA understands that the objectives are limited to the following:

• Remove/reduce the requirement for manual reporting on the ADR side (per context of DP-145)
• Provide a mechanism to measure ADR performance (per detail of DP-145)
• Provide insights to allow CX optimisation (per detail of DP-145)
• Provide a mechanism to measure DH performance from the perspective of an invoking client system (per context of DP-240)

Kind regards CBA Team

CDR-API-Stream commented 2 years ago

Thanks for the feedback to date. To provide clarity, the purpose of the proposal is to address issues raised in the Advisory Committee related to perceived Data Holder data quality and performance problems being observed by ADRs. Data is needed for the ACCC to be able to identify and address these issues with individual Data Holders.

In the context of the objectives raised by CBA this is specifically:

At the same time we are canvassing the opportunity to look at CX performance i.e.:

We have tried to propose metrics that are objective and measurable. Feedback on specific metrics where measurability concerns exist would be welcome.

Also, constructive feedback on how to actually improve the metrics proposed would be welcome.

spikejump commented 2 years ago

@CDR-API-Stream Thanks for the above context for this DP. This DP seems a bit different from the driver in DP-145, where the "Data Recipient Metrics" mentioned in DP-145 appear to be more about intermediary performance?

If we understand the driver behind this DP correctly, based on the above comment, it is that the ACCC needs formal data from ADRs on the potential data quality and performance issues with specific DHs that the ADRs are raising with the ACCC. Without such data the ACCC is not in a position to investigate and discuss the claims with the DHs.

If this is correct, this DP seems to impose a very heavy burden on ADRs needing to support it if it turns into a mandatory requirement. There are costs in standing up API services; security and maintenance costs are a huge part of that. Of the potential hundreds and thousands of ADRs, a large portion will never have an issue with any DH, while other ADRs will have specific use-case related issues with some DHs. One ADR's claim may be different to another ADR's. For example, ADR1 may claim DH1's Get Transactions response is too slow while ADR2 is perfectly happy with DH1's Get Transactions performance but is having a transaction data quality issue with DH1 that is not an issue for ADR1. When an issue needs to be investigated, the data required, by all parties, will need to be deeper and wider than the reporting data proposed in the DP.

As an ADR, it is great to see the ACCC wanting to resolve issues raised by ADRs to improve the usability of the ecosystem. The good old "user pays" concept comes to mind. By this we mean that the ADRs that need to raise issues with the ACCC should be prepared to "pay" by preparing data in the required format and submitting it to the ACCC via the ADR's own CDR portal account (just a suggestion). The ACCC, via the submission of the required data, can systematically process the data and also trigger an investigation event. This way, only ADRs that need to raise issues have to "pay" the cost of raising them. The ACCC, on the other hand, also reduces its cost by not having to stand up a service to continuously poll for performance data from hundreds and thousands of ADRs.
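
As a rough illustration of this suggestion, the sketch below shows what a push-style submission might look like. The endpoint URL, payload fields and function are all hypothetical placeholders, not an existing CDR portal API.

```typescript
// Hypothetical sketch of a "user pays" push model: an ADR that wants an issue
// investigated submits a structured report instead of the ACCC polling every ADR.
// The URL, payload fields and auth handling are placeholders, not an existing API.

interface DhIssueReport {
  dataHolderBrandId: string;
  endpoint: string; // e.g. "/banking/accounts/{accountId}/transactions"
  issueType: "performance" | "dataQuality" | "availability";
  observationWindow: { start: string; end: string }; // ISO 8601
  evidence: string; // description plus sample correlation IDs
}

async function submitIssueReport(report: DhIssueReport, accessToken: string): Promise<void> {
  // Placeholder endpoint; a real submission channel would be defined by the ACCC.
  const res = await fetch("https://example.invalid/cdr-portal/adr-issue-reports", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${accessToken}`,
    },
    body: JSON.stringify(report),
  });
  if (!res.ok) {
    throw new Error(`Issue report submission failed with status ${res.status}`);
  }
}
```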

In addition, it should be pointed out that data quality is not something that schema validation can pick up, especially given the large amount of optional data in CDR. Data quality issues can range from, for example, a Product Name not matching the actual product, to optional data not being supplied when DHs hold such data. In fact, our experience so far tells us that DHs are quite willing to address identified schema-related errors. It is the non-measurable part of data quality that is harder for DHs to respond to in a timely manner.
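
To illustrate the gap between schema validation and business-rule checks, the sketch below uses a simplified, hypothetical product shape (not the actual CDR product schema) to flag optional fields a DH is believed to hold but did not supply.

```typescript
// Hypothetical illustration of a business-rule check that schema validation
// cannot perform: the shape below is a simplified stand-in, not the CDR product schema.

interface ProductSummary {
  productId: string;
  name: string;
  description?: string;    // optional in the schema
  applicationUri?: string; // optional in the schema
}

// A payload can be schema-valid yet still omit optional fields the DH is believed
// to hold; this check flags those omissions for follow-up rather than as "errors".
function checkOptionalFieldCoverage(products: ProductSummary[]): string[] {
  const findings: string[] = [];
  for (const p of products) {
    if (!p.description) findings.push(`${p.productId}: description not supplied`);
    if (!p.applicationUri) findings.push(`${p.productId}: applicationUri not supplied`);
  }
  return findings;
}
```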

We are also assuming that no feedback on CX performance is required for this DP, as mentioned in the above comment, since this DP does not address CX.

AusBanking commented 2 years ago

Please see below ABA's response to this consultation:

DP 240 - ABA response.pdf

commbankoss commented 2 years ago

Commonwealth Bank supports the ABA’s view on Decision Proposal 240.

RobHale-Truelayer commented 2 years ago

Some good dialogue and responses have already been provided...

Additionally...

Why would ADRs want to do this?

What would ADR-sourced numbers tell us?

Would ADR metrics be representative?

What would a valid assessment look like?

The "Who has the right numbers" problem

There is a very real danger that we could end up debating who has the right numbers, rather than doing something about the numbers.

The Centralised, Independent Assessment option

Whatever approach is used, what data are we collecting?

Perhaps there is a solution to both?

Right now ADRs and their equivalents in other jurisdictions are creating “test accounts” in banks across the world. They do this so they can test their services and solutions. That’s a lot of bank accounts, and here in Australia it will soon extend to energy and telco accounts. These accounts are created for no purpose other than to test a specific service for a specific ADR. It’s a global problem that needs solving and a global burden for banks and other data holders.

Maybe Australia and the CDR could show the way and fix three things at once:

  1. Consistent independent and reliable assessment of DH performance
  2. Realistic test data to help ADRs develop market propositions
  3. Access to test accounts to help ADRs and intermediaries confirm platform operational access

jimbasiq commented 2 years ago

Thank you for the opportunity to provide feedback on Decision Proposal 240 - ADR Metrics. In principle, Basiq is supportive of the proposal, with the following comments/observations:

If the above approach is taken and the ADR metrics obligations are kept to a minimum, Basiq is supportive of a mandatory ADR Metrics API.

ACCC-CDR commented 2 years ago

The ACCC values ongoing feedback and guidance on how we can best apply our efforts for the benefit of the CDR. As the provider of the CDR RAAP, we are an active part of the CDR technology ecosystem and expect that changes with cost and time implications should be evaluated against the benefit they provide.

For clarity, this decision proposal was not raised at the request of the ACCC. We currently have a process whereby participants can raise concerns with our Technical Operations team, which we will investigate and seek to resolve using the various options at our disposal. Details can be submitted via CDRTechnicalOperations@accc.gov.au or via our CDR Service Management Portal. The low volume of issues raised via the current process calls into question whether the CDR would currently benefit from the effort required to support the introduction of the approach suggested by DP240.

We continue to look for other perspectives on how best to monitor the CDR; the feedback and alternative view offered by True Layer provided a useful perspective.

CDR-API-Stream commented 2 years ago

Thank you for all of the responses on this proposal. As this proposal was initiated by a discussion at the Data Standards Advisory Committee, the DSB will report a summary of the feedback to the next meeting and seek advice on next steps.

The summary of feedback is understood to be along the following lines:

CDR-API-Stream commented 2 years ago

This consultation will now be closed with no decision taken. If additional action items arise from the DSAC meeting, the consultation will either be reopened or a new consultation will be raised.