@vickywane This is a static document. We need a CI workflow that automatically re-generates the API docs and publishes them to GitHub Pages based on the code changes included in any PR.
@vickywane upon further research, I noticed that the latest best practice is as follows:
Notice that there are some details that need to be carefully worked out. Netlify deployments guarantee that all production deployments remain at their permanent versioned URLs. This is critically important to avoid maintenance windows with service downtime. If a serverless API has a breaking change, the currently deployed PWA will keep running against the previous API version URL. Deploying a new API version and consequently a new PWA version should lead to a smooth user transition to the new versions without any downtime. This cycle has to be carefully thought out and tested.
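As an illustration of that transition model (the version names and deploy URLs below are hypothetical, not real deployments), each shipped PWA release could pin its API base URL to the permanent versioned Netlify deploy it was tested against, rather than the mutable production alias:

```javascript
// Hypothetical sketch: pin each PWA release to the permanent Netlify
// deploy URL of the API version it was tested against, so a breaking
// API change never affects an already-shipped PWA version.
const API_BASE_URLS = {
  'pwa-1.3.0': 'https://605c0a1b2c3d4e0007aabbcc--ambianic-serverless.netlify.app',
  'pwa-1.4.0': 'https://60a1b2c3d4e5f60008ddeeff--ambianic-serverless.netlify.app',
};

function apiBaseUrl(pwaVersion) {
  const url = API_BASE_URLS[pwaVersion];
  if (!url) {
    throw new Error(`No pinned API deploy for PWA version ${pwaVersion}`);
  }
  return url;
}
```

Deploying a new API version then simply adds a new entry; older PWA versions keep calling the deploy they shipped with.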
@ivelin I have the OpenAPI generation ( client, stubs & docs ) set up.
However, I need more clarification on how they would be integrated into this repo. After each generation, it creates an entirely new project with several unneeded files.
I feel we should clean up some of the generated files ( using a script ) and store the rest in the docs directory.
We need a CI workflow that automatically re-generates the API docs and publishes them to GitHub Pages based on the code changes included in any PR.
Do we still need to publish the API to GitHub Pages? It should be possible by generating a markdown file from the Swagger YAML and publishing it to GitHub Pages.
The end result should be freshly updated public API docs, pushed in sync with each code release.
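A minimal sketch of such a workflow using the actions-gh-pages action (the npm script name `docs:generate` is an assumption; the real repo may wire this step differently):

```yaml
# Hypothetical GitHub Actions workflow: regenerate the OpenAPI docs and
# publish them to GitHub Pages on every push to main.
name: publish-api-docs
on:
  push:
    branches: [main]
jobs:
  docs:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - run: npm ci
      - run: npm run docs:generate   # assumed script that writes docs to ./docs
      - uses: peaceiris/actions-gh-pages@v3
        with:
          github_token: ${{ secrets.GITHUB_TOKEN }}
          publish_dir: ./docs
```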
Makes sense. We only need to commit to GitHub the source files required to reproduce a binary release of the client SDK, server implementation and API docs. Interim local build artifacts can be ignored as usual.
:exclamation: No coverage uploaded for pull request base ( main@6098713 ). Click here to learn what that means. The diff coverage is n/a.
:exclamation: Current head fdf975a differs from pull request most recent head 50037de. Consider uploading reports for the commit 50037de to get more accurate results.
```diff
@@           Coverage Diff            @@
##             main       #9   +/-   ##
=======================================
  Coverage        ?   22.01%
=======================================
  Files           ?        8
  Lines           ?      536
  Branches        ?      157
=======================================
  Hits            ?      118
  Misses          ?      418
  Partials        ?        0
```
Yes, agreed. Only keep in GitHub the files that are required for the build to succeed. Interim files can be cleaned up. Deployment artifacts should be automatically tested and published. Docs should be updated and published with each new release.
See how we do that for the Edge Python API.
Once we have a URL for the premium subscription API, we will include it in the official docs.
@ivelin for some strange reason, changes within the docs/package.json file are not being applied in the CI. It has been changed to generate a code coverage report using Istanbul, which is further used by CodeCov.
I am not sure what you are asking me here. Please clarify what the issue is and what you want me to help with.
@vickywane which of these TODOs are now done?
- [ ] semantic release
  - [ ] tag and label a new github release
  - [ ] publish client sdk to npm
  - [ ] follow up: PWA import of client sdk for user subscription management
The following items have been implemented:
- Tests against Netlify functions pre-deploy.
- Tests validating that the latest implementation of the API functions matches the latest OpenAPI definitions. These are auto-generated by the OpenAPI generator and run together with the test script in the CI steps.
- Semantic versioning, release and deployment of the Netlify functions implementing the latest OpenAPI spec. This has been implemented with the semantic-release package in the package.json file.
- Semantic release of versioned docs, generated and published on GitHub Pages. Static HTML/CSS pages from the OpenAPI specs are currently being generated using bootprint as part of the steps run in the CI. The static pages are also published on each run using the actions-gh-pages action.
Also, previously created serverless function endpoints have been renamed to follow standard naming conventions.
@ivelin To use actions-gh-pages with the Ambianic repository, we need to create and supply a GITHUB_TOKEN using secrets. The publishing step has been commented out of the CI steps, as it currently breaks the CI.
Within the TODO above, the following items have been done:
- semantic-release ( auto tagFormat )

I am deferring the item below to a follow-up PR;
@ivelin I have the actions-gh-pages action working on my own fork. It won't work here until the access token is provided using secrets.
Note: To access the published site, this repo has to be renamed to [username].github.io or [organization].github.io.
See docs.
Can I see what they look like on your fork?
@vickywane incorrect assessment. See again the [gh pages docs](https://docs.github.com/en/pages/getting-started-with-github-pages/configuring-a-publishing-source-for-your-github-pages-site). The reason why the CI fails is that you are trying to push GH Pages to a PR branch. A GH Pages push usually goes to the master branch, unless a different branch is specifically configured. See how it's supposed to be done.
Here is an example of the GitHub page from my fork:
https://vickywane.github.io/ambianic-subscriptions.github.io/
Alright. I guess the actions-gh-pages action could improve its documentation on this. I will make a patch for it.
The root README file for the repository has been updated to reflect the project make-up, the CI/CD process, and all sensitive credentials used within this project.
The netlify-wait action has been working fine for a long time on the ambianic-ui repo. Are you sure it's a bug? Did you discuss it with the project maintainer?
The netlify-wait-action is getting a wrong PR number, which mismatches Netlify's deploy number.
E.g. the wait action generates:
https://deploy-preview-9--ambianic-serverless.netlify.app/.netlify/functions/customers
pointing to 1 as the PR number, while the correct PR number is 9.
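For reference, the preview URL the wait action should produce is derived from the PR number roughly like this (a sketch, assuming the `ambianic-serverless` Netlify site name taken from the URL above):

```javascript
// Build the Netlify deploy-preview URL for a given PR number.
// The path is the serverless function endpoint being tested.
function previewFunctionUrl(prNumber, fn = 'customers') {
  return `https://deploy-preview-${prNumber}--ambianic-serverless.netlify.app/.netlify/functions/${fn}`;
}
```

With the correct PR number 9 this yields the URL above; the reported bug is that the action substitutes 1 instead.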
The error could probably be coming from the git history. If it works normally after the merge, I would revert to it.
If it's not used, why is it still part of the PR commit?
I would include it in the .gitignore
@ivelin With regards to the naming convention used by the openapi-generator, I think I finally found out how the names are getting generated.
The name given to the function within an endpoint is a concatenation of the endpoint URL + endpoint operation, e.g.:
- .netlify/functions/customers ( GET method ) = netlifyFunctionsCustomersGet
- /subscriber ( POST method ) = subscriberPost
After manually altering these names, they were renamed back by the generator. This leads to the question: would we manually edit the names and stop auto-generating the API specs with openapi-generator?
I can rename the endpoints to something clearer ( e.g. .netlify/functions/customers to something shorter ), but we would still end up with something similar ( endpoint URL + operation method ).
With regards to the security risk from fetching all premium subscribers, this is still an issue.
It's actually a problem originating from Stripe's fetch-customer endpoint. Ideally it should provide a means of getting a single user by email or any other recognizable field. Instead it requires a customer-id, which becomes a problem for us, since a database is not being used to store users' details.
I am proposing a solution: the subscribers list can be fetched and filtered within the serverless function, which then returns a single user as a response to the PWA.
In theory, a request is made to fetch a user's details passing in the user's email address, e.g. john@mail.com; the serverless function receives it, pulls the entire customer list, filters for john@mail.com, and returns John's details if available.
This mitigates the security risk in the following ways:
- It returns only a single requested object.
- A STRIPE_KEY is supplied from the environment variables to pull the premium customers, hence even when cloned by other users, they don't have access to Ambianic's Stripe credentials.
- If a request is made to the serverless function without specifying a registered user's email, it returns no data.
Why do we need a .netlify directory before ./functions?
Also, if you define operationId in the API spec, the generator will use it to create function names.
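For example (a spec fragment sketch, not the repo's actual openapi-spec.yaml), setting operationId makes the generator emit getCustomer instead of the path-derived netlifyFunctionsCustomersGet:

```yaml
# Hypothetical OpenAPI fragment: operationId overrides the generated
# function name derived from path + HTTP method.
paths:
  /.netlify/functions/customers:
    get:
      operationId: getCustomer
      summary: Fetch a single premium customer by email
      responses:
        '200':
          description: Matching customer record
```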
@vickywane
Here is what the premium subscriber function could look like:
name: isAmbianicPremiumSubscriptionActive(user-id)
parameter: user-id - unique user identifier for the Auth0 service.
implementation:
- Take the user-id function parameter and read the corresponding Stripe customer-id attribute.
- Use the STRIPE_KEY to check if the customer-id has an active premium subscription.
- Return True if Stripe confirms, False otherwise.
Flow:
- The PWA obtains the user-id from Auth0 after the user logs in.
- The PWA calls isAmbianicPremiumSubscriptionActive. A future implementation may optimize this by caching the expiration time of the subscription service and invalidating the cache if the subscription is cancelled.

Why do we need a .netlify directory before ./functions?
That endpoint was making direct requests to the Netlify function retrieving customers from Stripe's API. This has been corrected to fix the security issue.
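The premium-check spec above could be sketched roughly as follows (the Auth0 and Stripe lookups are injected as functions here so the core logic stays testable; names and data shapes are assumptions):

```javascript
// Sketch of isAmbianicPremiumSubscriptionActive. The two lookups are
// injected so the core logic is independent of the Auth0/Stripe clients:
// - getCustomerId(userId): resolve the Auth0 user-id to a Stripe customer-id
// - listSubscriptions(customerId): list the customer's Stripe subscriptions
async function isAmbianicPremiumSubscriptionActive(userId, { getCustomerId, listSubscriptions }) {
  const customerId = await getCustomerId(userId);
  if (!customerId) return false; // unknown user => not a premium subscriber
  const subscriptions = await listSubscriptions(customerId);
  return subscriptions.some((s) => s.status === 'active');
}
```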
This has been implemented partly in the subscription-data endpoint. It returns a user_metadata object containing a duration field with a start_date and an end_date for the subscription. From the PWA, a check is done to see if the current day is within the subscription duration.
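The PWA-side check could look like this (a sketch, assuming the field names described above and ISO date strings):

```javascript
// Sketch: decide whether today falls inside the subscription duration
// returned in user_metadata by the subscription-data endpoint.
function isSubscriptionActive(userMetadata, today = new Date()) {
  const duration = userMetadata && userMetadata.duration;
  if (!duration || !duration.start_date || !duration.end_date) return false;
  const t = today.getTime();
  return t >= new Date(duration.start_date).getTime() &&
         t <= new Date(duration.end_date).getTime();
}
```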
@vickywane Please add a code coverage report to the CI, and test runs against a local install and a cloud deployment preview.
@ivelin I had to make adjustments here for code coverage.
Previously we decided to stick with using tests from Postman ( run with Newman ), but this doesn't fulfill the code coverage requirement ( see this discussion ).
So I had to bring back the Mocha tests. Using Istanbul, we can generate code coverage.
I think we will later have to decide which to stick with, or evaluate whether we really need code coverage here.
@ivelin I have fixed the naming comments on this pull request.
I am however being held back by an issue with the wait-for-netlify-action being used in the CI pipeline. I have reported the issue in the wait-for-netlify-action repository here and also in the Netlify community forum here.
In the interim, can I substitute the wait-for-netlify-action with my custom script written in the scripts.sh file ( the wait-for-url function )? It takes the preview number and waits for the deployed URL to return a 200 status before proceeding.
This is the reason why the current CI check fails, and it should also address your two comments on the CI file above here.
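The wait-for-url function mentioned above might look roughly like this (a sketch of the idea; the actual scripts.sh implementation may differ):

```shell
#!/usr/bin/env sh
# Poll a deploy preview URL until it returns HTTP 200, or fail after a
# timeout. Arguments: url [timeout_seconds] [poll_interval_seconds].
wait_for_url() {
  url="$1"; timeout="${2:-300}"; interval="${3:-5}"; elapsed=0
  while [ "$elapsed" -lt "$timeout" ]; do
    status=$(curl -s -o /dev/null -w '%{http_code}' "$url" 2>/dev/null || echo 000)
    if [ "$status" = "200" ]; then
      return 0
    fi
    sleep "$interval"
    elapsed=$((elapsed + interval))
  done
  echo "Timed out waiting for $url" >&2
  return 1
}

# Example:
# wait_for_url "https://deploy-preview-9--ambianic-serverless.netlify.app/" 300 5
```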
How are we going to protect these public APIs from malicious attacks that result in random notifications sent to users?
I have created this new issue within this repository to track the work done to address and provide a solution for this potential problem.
It would involve the PWA and the edge device authenticating with this Cloud API each time a request is made.
@ivelin As requested, below are the relevant links generated from my own fork of this pull request here;
Below is a list of follow-up issues generated from this feature, to be worked on after this pull request is merged;
@ivelin As requested, below is a list of objectives for this pull request.
Each respective item below contains a link to where it was implemented in the continuous integration workflow and executed on my fork;
Correct. Which is why we need to be thoughtful before we make it official.
In openapi-docs/docs/DefaultApi.md https://github.com/ambianic/ambianic-subscriptions/pull/9#discussion_r639259736 :

# AmbianicFunctionsCollection.DefaultApi

All URIs are relative to https://33743be6-3197-4dd4-8471-50b3640320c9.mock.pstmn.io

Method | HTTP request | Description
------------- | ------------- | -------------
deleteSubscription | DELETE /subscription | Delete an Ambianic's user subscription
getSubscriptionData | GET /subscription | Get a user's subscription data
sendNotification | POST /notification | Send an event detection notification
subscribeUser | POST /subscribe | Subscribe Ambianic user to premium notification
Okay.
Something to note is that each time a change is made in the route, it also has to be updated in the Ambianic PWA.
@ivelin I just implemented a change to use a local mock server, created using @stoplight/prism-cli, for mocking the responses made in each of the API tests. Previously we used mock servers managed by Postman.
With this new change, you will find some inlineResponse files autogenerated from the openapi-spec.yaml file. These files contain the data for the mock responses served by Prism.
That's great progress.
@ivelin
If adding all the generated docs for the Cloud API is what is left from this pull request, can we proceed with a merge, so I can add the live links and push them to the docs pull request?
This resolves this issue.
This PR contains a YAML-exported API definition for the Ambianic-functions-API, using the API collection created on Postman.