Closed JamesMBligh closed 5 years ago
Just a quick note to request that the RateString data type should allow for negative interest rates, as these are not only possible but increasingly common within capital markets. The specification for RateString explicitly defines "A positive number (or zero)". This is not an assumption that can be made, and there are demonstrable instances within European institutions where negative interest rates are being offered.
Suggest the range of RateString be -1 to 1 (exclusive), i.e. -1 < x < 1.
Hi @csirostu, at the moment the fields that RateString applies to are positive only, as we split deposit and lending rates, as well as bonus and penalty rates, in all structures. As such, expanding the definition of the type could be detrimental to those structures rather than advantageous.
When we encounter a field where positive and negative are needed I would suggest we would rename the existing type to PositiveRateString (consistent with PositiveInteger) and create a new RateString with expanded constraints.
-JB-
Hi @JamesMBligh, 10 years ago I would have thought the suggestion that deposit rates could be negative was absurd, however there are already demonstrable cases (in Germany in particular) of precisely this happening: negative interest on business deposit accounts. With Q1 inflation having just come back at 0% and official interest rates much closer to zero than not, it is not outside the realm of possibility.
For this reason I would suggest the CDS should avoid predicting monetary policy. I'd also suggest that an emergency change to facilitate this possibility would be problematic, given that Data Holders are likely to have some much larger external challenges should it eventuate.
Looking to the UK Open Banking standard, in particular the SME Unsecured Loan definition (although I believe this is consistent), they specify ^(-?\d{1,3}){1}(\.\d{1,4}){0,1}$ for interest rate validation, which in essence represents a positive or negative number with between 1 and 3 digits in the integral part and between 1 and 4 digits in the fractional component.
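As a quick illustration (not part of either standard), the UK pattern can be exercised directly; the sample values below are hypothetical:

```python
import re

# UK Open Banking interest-rate pattern quoted above: an optional sign,
# 1-3 integral digits, and an optional fractional part of 1-4 digits.
UK_RATE_PATTERN = re.compile(r"^(-?\d{1,3}){1}(\.\d{1,4}){0,1}$")

def is_valid_rate(value: str) -> bool:
    """True if the string matches the UK Open Banking rate pattern."""
    return UK_RATE_PATTERN.match(value) is not None

# Unlike the current CDS RateString definition, negative deposit
# rates such as "-0.5" are accepted by this pattern.
```

Note that the pattern caps the integral part at three digits and the fractional part at four, so values like "1234" or "4.25000" are rejected.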
I’d be interested in other opinions from the community on this topic since it hasn’t been raised as a concern by the Financial Institutions (which could be an oversight). Some alternatives for addressing this are:
imho, rate range is a regulatory concern and should be mostly market driven. CDS only needs to set an agreed data type and data format for rates so that different parties can interpret the data in the same way.
Changing data type constraints post-go-live would obviously be significantly more costly. Data types should reflect the market reality and not be artificially constrained. Unnecessarily tight constraints do not give the right guidance to developers and testers of providers or recipients, and an ‘edge case’ with a negative interest rate should not fail. We therefore recommend that PositiveRateString be created and retro-fitted, and that RateString be used for lending and deposit rates (also possible even though most Australian banks would have a zero floor). Note: The existing usage of positive rates for both deposit and lending rates in the standards (in ‘normal’ market conditions) follows market conventions, so their meaning is guided by the context.
Fair enough. I'll make the change unless there are objections.
Hi James, We were wondering if you could advise on timelines for the V1 APIs (with feedback addressed) being reflected on the standards page (i.e. https://consumerdatastandardsaustralia.github.io/standards/#introduction). Thanks
Hi @anzbankau, I believe that the V1 API end point and payload decisions are currently reflected on the standards site if you're looking at v0.8.0 or higher according to the change log.
That said, there were many changes and my reviews may have missed something. If you see any discrepancies on the standards site please raise them here and Brian or I will fix them.
We are planning to publish a final V1 position at the end of May so correcting things between now and then would be the intent.
-JB-
Hi James, The updates for minor corrections to Product Reference (earlier post) in v0.8.0 are a bit hit and miss, so I've relisted the outstanding ones below. When I checked on Friday, point 1 was fixed, but it seems to have since reverted to the description that was to be corrected.
Thanks.
Thanks @anzbankau, I’ll check these. I thought they were already done.
Access Token
Commonwealth Bank requires greater flexibility in the Standards regarding the length of time an Access Token is valid. The Bank recommends that Access Tokens are as short lived as possible. Best practice depends on how long it takes for the relevant operation to complete but can be as short as two minutes. We recommend that the specific time be adjusted by the data holder in line with real-world experience post go-live, but not exceed 10 minutes.
Refresh Token
Commonwealth Bank supports the alignment of the lifetime of Consent with the lifetime of Refresh Tokens. Either the Standards or the Rules should ensure this is technically enforced. We recommend consent re-authentication at least every three months.
@anzbankau, you were correct, we had accidentally reverted. Should be fixed now.
In response to @commbankoss:
Access Tokens
Allowing shorter expiry timeframes has a direct impact on the unattended and "Single Use" scenarios. For unattended access we have decided to use the lifespan of the Access Token as a measure for NFRs and for limiting unattended access by recipients to protect Bank infrastructure. Having inconsistent expiries for Access Tokens would make NFR enforcement inconsistent. For "Single Use" we have defined this as the provision of an Access Token with no Refresh Token, so a shorter expiry would make a single use authorisation potentially unworkable and drive ADRs to seek longer authorisation than they actually need or want. The selection of ten minutes was made through consultation (the actual limit was recommended by a Bank, from memory) and is generally in alignment with the length of session tokens for online banking channels. If more flexibility is desired then solutions to the above concerns would need to be proposed in tandem.
Refresh Tokens
Other participants have requested the ability to cycle refresh tokens, which has currently been accommodated. If any specific holder wishes to set the expiry of the refresh token to match the overall consent and eschew refresh token cycling, they are free to do so. Consent reauthorisation is explicitly called out in the ACCC rules (section 4.1 (4)) as being limited to the length of time the customer has consented, up to a maximum of 12 months. It is not in alignment with the rules for the standard to arbitrarily limit the Customer's ability to consent for up to twelve months.
-JB-
This change was agreed to but not actioned or has since been reverted. See proposed post 21/12/18 and accepted post 29/1/19.
Mandatory arrays with Description = 'May be empty' are still present (e.g. Person.emailAddresses) after proposed post 12/3/19 and accepted post 13/3/19.
Hi James, Suggest BankingLoanAccount.minInstalmentAmount is made optional, as we will not know the min instalment amount for the Interest Only loans until the Interest is due. Thanks
This comment is largely a continuation of the discussion at the now closed Decision Proposal 064 - Scopes, Claims & Tokens and includes responses to questions posed there.
In response to this comment:
@assem-ali raises an interesting point. If a Data Recipient is attempting to both conform to the ACCC Rules' data minimisation principle and give good CX, for example by stating that only three months of transactions will be downloaded, the standards as they stand do not facilitate this message being reinforced at the Data Holder.
In response to this comment:
Language may be causing a bit of confusion here. A separate consent API is not an extension to the OIDC standard; it augments OIDC where OIDC does not cater for these requirements. It is treated more like a data API than Identity Provider functionality. Please see this draft extension to the FAPI standard for a fuller explanation.
We disagree that extending OAuth2 and OIDC minimises the cost of compliance for both recipients and holders versus a consent API. The extension approach is expensive for all parties because consumers and data holders can no longer rely on existing certified OpenID libraries, servers and services without modification, testing and recertification. If the regime is later moved to a consent API solution, then the change would again create complexity for data recipients and data holders as they move back towards standard security solutions.
In response to this comment:
- There is no mechanism of exchanging the consent purpose, consent start date, consent period and consent type as in Data61's CX Phase 2 prototypes nor extensibility for future rule or standard requirements (e.g a historic transaction limit as described by @assem-ali).
It is expected that consent purpose and type (where needed by the holder) would be static language provided from the Register. Consent start is defined by the completion of the consent process. Consent period is the new Sharing Duration component of the request object.
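For concreteness, a sketch of how the Sharing Duration component might ride in the authorisation request object. The claim names, placement and values here are illustrative assumptions for this example, not confirmed wire-format details of the standard:

```python
# Illustrative only: a sketch of an authorisation request object carrying
# a sharing duration. All identifiers and values below are assumptions.
SECONDS_PER_DAY = 86400

request_object = {
    "iss": "adr-software-product-id",       # hypothetical client identifier
    "aud": "https://holder.example.com",    # hypothetical holder issuer
    "scope": "openid bank:transactions:read",
    "claims": {
        "sharing_duration": 90 * SECONDS_PER_DAY,  # consent period: ~3 months
    },
}
```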
Static registration of purpose and type will lead to multiple client_ids being registered for the same physical application in a number of cases.
Consider the example of a credit card origination application which uses transaction data to make a credit decision and to also allow a customer to optionally import payees as part of the set up process for internet banking.
In this case the Data Recipient will need to register two client_ids for the same physical application:
- client_id 1 will have a purpose similar to “credit card application approval”
- client_id 2 will have a purpose similar to “import payees”
- It is not possible for a Data Recipient to enquire on the status of an in-progress consent authorisation. For example, they will not know the difference between when a customer has not yet completed the authorisation and when a redirect error has occurred and the access token has not been returned to the third party.
The consent process has been limited to 5 minutes. It is not a long running process so the need for introspection during consent is minimal. It is also not a standard feature of OAuth 2 or OIDC. If a customer chooses to abandon or reject the consent during the process with the holder then there is no obligation for their reasons to be made transparent to the recipient.
We agree that this is not part of OIDC. However OIDC does not have a profile for Known Channel Authentication. In Known Channel Authentication there are more steps, each with their own risk of error. Hence, there is a greater need for error information in order to provide a quality customer experience.
A notable consequence of limiting the request to 5 mins is that Data Holders will be unable to exercise their option to allow joint account holders to authorise a consent inside a consent authorisation, rather than through the pre-authorisation process. It is possible to solve this problem (and introduce new ones) by allowing consent authorisations to be modified. In this way a joint account holder can complete their authorisation an hour/day/week later and the newly authorised account can be added to the consent at that time. This solution, though, is complex for Data Recipients to implement – a new account may appear at any time without notification. Further, the disclosure of exactly which accounts were included in the consent, and when, may be confusing to customers.
- For long-lived consent, it is not possible for the Data Recipient to obtain rich status information, such as the reason for revocation, or an error status. The only data returned is whether the consent is active or not and its expiry.
There is no obligation to communicate the Customer's reasons for revocation and the Customer may not wish this information to be requested or communicated. With regard to error status, what sort of errors are being referred to?
Statuses that may be useful to the Data Recipient:
- If an invalid set of scopes is requested by the Data Recipient then it becomes a user-facing error due to the consents being validated in the front channel, rather than back channel.
Yes, but this is a systemic issue for the recipient rather than a transactional issue. There are only seven scopes in the regime so I would not expect this to be a serious problem. If it does become one we can apply a treatment such as a requirement to silently ignore invalid scopes. Do you think this case should be addressed now?

Our understanding of OIDC is that scopes should not be silently ignored; an error should be returned. As such, Westpac does not recommend silently ignoring invalid scopes.
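For reference, OAuth 2 (RFC 6749 §4.1.2.1) defines an invalid_scope error for this situation, returned to the client via a front-channel redirect. A sketch of what such a redirect looks like; the callback URL and state value are hypothetical:

```python
from urllib.parse import urlencode

# Sketch of the front-channel error redirect an authorisation server
# issues for an unrecognised scope under RFC 6749. The redirect URI and
# state value below are hypothetical.
redirect_uri = "https://adr.example.com/callback"
params = {
    "error": "invalid_scope",
    "error_description": "Unknown scope requested",
    "state": "af0ifjsldkj",
}
error_redirect = redirect_uri + "?" + urlencode(params)
```

Because this travels via the user agent, the error is inherently user-facing, which is the concern being raised.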
- Ties the lifecycle of the refresh token to the lifecycle of the consent authorisation thereby reducing the ability to adapt to future needs.
- For example, suppose the rules are updated to be more aligned to a typical bank's policy on token lifespan. That is, the new rule is that authorisations are valid for 12 months, however if a Customer has not used a Data Recipient's service for three months then the authorisation is deemed to have lapsed. This would imply that refresh tokens expire in three months and that refresh tokens are reissued every three months, as long as the Customer has logged into the Data Recipient's service during that period, up to a maximum of twelve months. While this can be implemented with the described flow, there is no way to communicate to the Data Recipient when the original 12 month authorisation expires.
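The hypothetical rule in this example could be expressed as a simple calculation; the 3- and 12-month windows below are the figures from the example, not standard values:

```python
from datetime import datetime, timedelta

INACTIVITY_WINDOW = timedelta(days=90)    # ~3 months, from the example
MAX_AUTHORISATION = timedelta(days=365)   # ~12 months, from the example

def next_refresh_token_expiry(consent_start: datetime,
                              issued_at: datetime) -> datetime:
    """Expiry for a newly cycled refresh token under the hypothetical rule:
    three months from issue, capped at the original 12-month authorisation."""
    return min(issued_at + INACTIVITY_WINDOW, consent_start + MAX_AUTHORISATION)
```

Even under this sketch, the Data Recipient cannot see the 12-month cap unless it is communicated separately, which is the gap being raised.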
The introduction of the Sharing ID and associated expiry time specifically addresses this issue. It disconnects the lifespan of the consent from the lifespan of the refresh token. I note that CBA requested that refresh tokens be specifically tied to the lifespan of the consent, so this is something where a middle position between multiple viewpoints is required.
Westpac supports disconnecting the expiry of the authorisation from the expiry of the refresh token.
Hi James, Just noting I have posted https://github.com/ConsumerDataStandardsAustralia/standards/issues/69#issuecomment-492611158 that is related.
This is a clarification in response to a question received from a member of this community. This is not a decision related to the standards; it is an informative clarification regarding the use or interpretation of the standards. A paraphrase of the question is:
In the standards under the definition of optionality the following statement is made:
Note that optional fields indicate that data may sometimes not be held by a Data Holder and this is an expected scenario. It does not indicate that the field may be optionally implemented by a Data Holder. If a Data Holder holds the data represented by an optional field in digital form then the Data Holder MUST supply that data in the response.
Does this also apply to the Product Reference data as well considering this is not customer data?
To clarify, the intent of the statements around optionality aligns with the intent of the regime to give Customers control of the sharing of their data. Product Reference data, however, is not Customer data. It is data that represents the Bank's products. In this context the technical standard for Product Reference has been designed to allow the Bank the flexibility to represent its products as accurately as it would in other channels. The ability to leave out certain parts of the schema, where doing so best represents the products, is a part of the schema design. This clarification only applies to compliance with the technical standards, however; other regulatory requirements that apply to the accurate representation of financial products would also need to be considered.
Hi @JamesMBligh, We are currently working on mapping the products payload. We came across a scenario where the last tier for a product has no upper limit, i.e. maximumValue is unlimited; for example, an amount of $50K and above gets a deposit rate of 4%. Currently the field maximumValue is mandatory, therefore we can't use NULL to represent an unlimited upper bound for the last tier. What would be your expectation in this scenario?
With regards to the POST /banking/accounts/balances, what is the purpose (or what will it be used for in future) of the meta: {} in the accountIds body parameter of the request?
{ "data": { "accountIds": [ "string" ] }, "meta": {} }
James,
Could you please make the BankingProductRateTier.maximumValue property Optional and add to the Description: 'If absent the tier's range has no upper bound.'? This was the original intention of the Rate Tier type and is necessary for representing ANZ products. Thanks to @assem-ali for spotting this.
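A minimal sketch of how a recipient might evaluate a rate tier once maximumValue can be absent; the function and field handling are illustrative, not part of the standard:

```python
def tier_applies(balance: float, minimum_value: float,
                 maximum_value=None) -> bool:
    """True if a balance falls within a rate tier. An absent (None)
    maximumValue means the tier has no upper bound, per the proposed change."""
    if balance < minimum_value:
        return False
    return maximum_value is None or balance <= maximum_value

# Example from the thread: balances of $50K and above earn the top rate,
# expressed as a tier with minimumValue=50000 and no maximumValue.
```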
Thanks
A number of minor changes have occurred in the [FAPI-R] and [FAPI-RW] normative reference drafts which need to be accommodated in the documentation. These include:
- References to the x-fapi-financial-id client provision have been removed.
- The x-fapi-customer-last-logged-time client provision has been renamed to x-fapi-auth-date.

We’ve noticed the following issues with the product data specification:
In the definition of BankingAccountDetail, I believe there is a redundant definition.
It nominates that addressUType is required. addressUType is not a property of BankingAccountDetail. It is a property of CommonPhysicalAddress where it is already defined as required.
For TransactionDetail, if there is no extra transaction detail, should the call have to return an extendedData section with just a "service": "X2P1.01" record?
eg
{
  "data": {
    "accountId": "26476",
    "transactionId": "1535297",
    "isDetailAvailable": false,
    "type": "OTHER",
    "status": "POSTED",
    "description": "Test Description",
    "postedDateTime": "2018-03-02T00:00:00",
    "valueDateTime": "2018-03-02T00:00:00",
    "executionDateTime": "2018-03-02T00:00:00",
    "amount": "236",
    "currency": "AUD",
    "reference": "",
    "merchantName": "",
    "merchantCategoryCode": "",
    "billerCode": "",
    "billerName": "",
    "crn": "",
    "apcaNumber": "",
    "extendedData": {
      "service": "X2P1.01"
    }
  },
  "links": {
    "self": "https://api.example.com/cdr-au/v1/banking/accounts/26476/transactions/1535297"
  },
  "meta": {}
}
Hi James,
There is currently inconsistency in the documentation for pagination on the standards site.
https://consumerdatastandardsaustralia.github.io/standards/#pagination states:
https://consumerdatastandardsaustralia.github.io/standards/#tocSlinkspaginated states:
Thanks.
There appear to be mixed parameters representing date boundaries in the transaction calls.
Should they all be newest-time and oldest-time, or should they be start-time and end-time?
There is a typo in the license declaration of the published swagger spec:
"license": {
"name": "MIT Licence",
"url": "https://opensource.org/licenses/MIT"
}
The MIT License is issued by an American entity and therefore uses the American spelling. Even though the word "license" is being used as a noun here, the spelling should not be changed; it should be referenced as "MIT License".
Changes to FAPI drafts
A number of minor changes have occurred in the [FAPI-R] and [FAPI-RW] normative reference drafts which need to be accommodated in the documentation. These include:
- References to the x-fapi-financial-id client provision have been removed.
- The x-fapi-customer-last-logged-time client provision has been renamed to x-fapi-auth-date.
These changes have been incorporated and will be pushed live shortly
Minor issues with product data specification
We’ve noticed the following issues with the product data specification:
- We have encountered the same issue mentioned by @assem-ali, namely that there are scenarios where the advertised maximumValue may be unlimited. We suggest that this field is made optional with omission indicating no upper bound.
This is now fixed in the published standards.
- The amount field for product fees is mandatory. Either this should be made optional with the description 'One of amount, balanceRate, transactionRate and accruedRate is mandatory', or the descriptions of balanceRate, transactionRate and accruedRate should be updated.
This has been changed and will be pushed live shortly.
-JB-
In response to @speedyps:
In the definition of BankingAccountDetail, I believe there is a redundant definition.
It nominates that addressUType is required. addressUType is not a property of BankingAccountDetail. It is a property of CommonPhysicalAddress where it is already defined as required.
@kirky61 is doing some housekeeping and removing some of the redundant objects and files in the standards repo.
-JB-
In response to @speedyps:
For TransactionDetail, if there is no extra transaction detail, should the call have to return an extendedData section with just a "service": "X2P1.01" record
It would be reasonable to interpret this transaction as not having any additional data so a call to the transaction detail end point should not occur as the isDetailAvailable field in the transaction list API should be set to false.
This raises the question as to the expected handling of a call to transaction detail when isDetailAvailable is false. This is currently indeterminate in the standard but the implication is that it should result in an error (probably 400 Bad Request). If clarification on this is required feedback would be welcome as part of consultation on the imminent May draft.
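A sketch of the implied holder-side behaviour; the 400 status and the error body shape are assumptions pending that clarification, not confirmed standard behaviour:

```python
def get_transaction_detail(transaction: dict):
    """Hypothetical holder-side handler: reject detail calls for transactions
    that advertise no extra detail via isDetailAvailable. The 400 status and
    error body shape below are assumptions, not confirmed standard behaviour."""
    if not transaction.get("isDetailAvailable", False):
        return 400, {"errors": [{"detail": "No detail available for this transaction"}]}
    return 200, {"data": {"extendedData": transaction["extendedData"]}}
```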
-JB-
In response to @anzbankau:
There is currently inconsistency in the documentation for pagination on the standards site.
https://consumerdatastandardsaustralia.github.io/standards/#pagination states:
- first - A URI to request the first page. This field MUST be present.
- last - A URI to request the last page. This field MUST be present unless there is only one page in the set.
https://consumerdatastandardsaustralia.github.io/standards/#tocSlinkspaginated states:
- first URIString conditional none URI to the first page of this set. Mandatory if this response is not the first page
- last URIString conditional none URI to the last page of this set. Mandatory if this response is not the last page
Thanks. This has been fixed and will be pushed live shortly.
-JB-
In response to @speedyps:
There appear to be mixed parameters representing date boundaries in the transaction calls.
- Get Transactions For Account has start-time and end-time
- Get Transactions For Multiple Accounts has newest-time and oldest-time
- Get Transactions For Specific Accounts has newest-time and oldest-time
Should they all be newest-time and oldest-time, or should they be start-time and end-time?
Thanks. This was an oversight. It is fixed and will be pushed shortly.
-JB-
In response to @csirostu:
There is a typo in the license declaration of the published swagger spec:
"license": { "name": "MIT Licence", "url": "https://opensource.org/licenses/MIT" }
The MIT License is issued by an American entity and therefore uses the American spelling. Even though the word "license" is being used as a noun here, the spelling should not be changed; it should be referenced as "MIT License".
Licence is now replaced with License.
-JB-
The items flagged above have now been pushed live
-JB-
Hi @JamesMBligh in current draft, all of the endpoints have two tags, e.g.
"tags": [
"Banking APIs",
"Accounts"
]
May I suggest removing "APIs", so that they look like
"tags": [
"Banking",
"Accounts"
]
The main reason is that these tags are fed into standard swagger tooling to group endpoints into APIs, and some codegens generate an interface name out of the tags. For example, JavaInflectorServerCodegen will generate an interface called BankingApIsApi if we tag the endpoints as "Banking APIs". If they are tagged as "Banking", the generated interface would be the more sensible BankingApi.
Secondly, we will not lose information by removing "APIs", since swagger itself implies we are defining an API.
Another minor issue I found in the current draft swagger https://consumerdatastandardsaustralia.github.io/standards/includes/swagger/cds_full.json is that the indentation is inconsistent. We can use online tools such as https://jsonformatter.curiousconcept.com/ or a JavaScript IDE to automatically format it. Since JSON stands for JavaScript Object Notation, we can probably follow the most popular JavaScript indent convention, which is 2-space indentation.
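For example, a two-line pass with any JSON library normalises the indentation; the snippet below uses Python rather than JavaScript purely for brevity, and the input document is a toy example:

```python
import json

# Parse and re-emit a JSON document with a consistent 2-space indent.
raw = '{"a":1,"b":{"c":2}}'
formatted = json.dumps(json.loads(raw), indent=2)
print(formatted)
```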
Can we please get some clarification on the below scenarios? It may be more implementation related but would prefer there's a position in the standard to provide consistent behaviour across the board.
Assume a customer has 5 accounts: A1, A2, A3, A4, A5.
On day 1, customer gives consent to A1, A2. Data Recipient receives:
Accts Avail | Refresh Token | Customer ID (sub in IDToken) | Consent ID |
---|---|---|---|
A1, A2 | T1 | S1 | C1 |
On day 10, customer adds account A3. Data Recipient receives:
Accts Avail | Refresh Token | Customer ID (sub in IDToken) | Consent ID |
---|---|---|---|
A3 | T2 | S1 | C2 |
NOTE: Customer ID (sub in ID Token) does not change between different consent/authorisation based on our understanding.
Questions:
On day 20, customer adds accounts A1 & A4. Assuming here Data Holder presents all possible accounts to customer during consent. Data Recipient receives:
Accts Avail | Refresh Token | Customer ID (sub in IDToken) | Consent ID |
---|---|---|---|
A1, A4 | T3 | S1 | C3 |
NOTE: A1 in this case is duplicated in C1 and C3. Data Recipient has now 3 different consents with 3 different validity periods.
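The overlap in the example can be made concrete with a small sketch; the data structures below are an illustrative recipient-side view, not a standard format:

```python
from collections import Counter

# Illustrative recipient-side view of the three consents in the example.
consents = {
    "C1": {"accounts": {"A1", "A2"}, "refresh_token": "T1"},
    "C2": {"accounts": {"A3"}, "refresh_token": "T2"},
    "C3": {"accounts": {"A1", "A4"}, "refresh_token": "T3"},
}

# Count how many consents cover each account. A1 is reachable via both C1
# and C3, each with its own refresh token and validity period, so the
# recipient must decide which consent to use when fetching A1's data.
coverage = Counter(acct for c in consents.values() for acct in c["accounts"])
duplicated = {acct for acct, n in coverage.items() if n > 1}
```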
Questions:
Thank you.
With respect to Decision #61, there's a comment in the decision that we'd like to get some clarification on.
It is mentioned that:
the identifier SHOULD also be unique relative to the scenario in which the end-user has authenticated. For example, the identifier generated for the same person when they are using a business account SHOULD be different to the identifier that is generated when that same individual is authorising as an individual.
Our interpretation of this statement is that:
1. When a banking portal provides access to both personal accounts as well as business accounts, then the same PPID will be generated for the customer.
2. When a customer is required to sign in to a personal banking portal that is different from the business banking portal, then the bank SHOULD supply two different PPIDs for the same customer.
Is this correct?
Should case 1 be a MUST, i.e. must the same PPID be used for personal as well as business accounts from the same portal?
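The second case resembles OIDC's pairwise subject identifiers, where the identifier is derived from the customer identifier plus a per-portal (sector) value. A hedged sketch, with all inputs and the derivation itself hypothetical rather than a mandated algorithm:

```python
import hashlib

def pairwise_ppid(customer_id: str, portal: str,
                  salt: str = "holder-secret") -> str:
    """Sketch of a pairwise identifier in the spirit of OIDC PPIDs: the same
    customer yields a different, but stable, identifier per portal context.
    The inputs and hash derivation here are illustrative only."""
    return hashlib.sha256(f"{portal}|{customer_id}|{salt}".encode()).hexdigest()
```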
Thanks.
Hi James,
Product API header values
Could it please be clarified that the following header fields would not be expected for product API calls:
Request
Response
Unless the intention is otherwise?
Additionally we notice that x-Correlation-Id is specified in the response header, but not in the request header documentation.
Thanks.
Hi James,
BankingProductDiscount.amount like BankingProductFee.amount
Thanks for making BankingProductFee.amount Conditional to reflect that fees can be expressed in terms of amount, balanceRate, transactionRate or accruedRate, as per the 3rd item in your response. Could you please do the same for BankingProductDiscount? Currently amount is Mandatory and has the Description 'Value of the discount', while the following properties include 'One of amount, balanceRate, transactionRate, accruedRate and feeRate is mandatory.'.
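Under the requested rule, a minimal validity check might look like the following; the field set is taken from the quoted description, and the function itself is illustrative:

```python
DISCOUNT_VALUE_FIELDS = ("amount", "balanceRate", "transactionRate",
                         "accruedRate", "feeRate")

def discount_is_valid(discount: dict) -> bool:
    """True if at least one value field is present, per the proposed
    'One of amount, balanceRate, transactionRate, accruedRate and feeRate
    is mandatory' rule for BankingProductDiscount."""
    return any(discount.get(field) is not None
               for field in DISCOUNT_VALUE_FIELDS)
```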
Thanks.
I am locking this thread. Please provide feedback at #70
-JB-
This issue has been opened to capture feedback on the standards as a whole. The standards site will be incrementally updated to accommodate this feedback. This is the sixth cycle of holistic feedback for the standards, and it is open to capture feedback following the closure of version 1 of the API end points and payloads.
-JB-