Azure / azure-sdk-for-js

This repository is for active development of the Azure SDK for JavaScript (NodeJS & Browser). For consumers of the SDK we recommend visiting our public developer docs at https://docs.microsoft.com/javascript/azure/ or our versioned developer docs at https://azure.github.io/azure-sdk-for-js.

[Samples] Provide a container to run samples in #13855

Closed deyaaeldeen closed 8 months ago

deyaaeldeen commented 3 years ago

I was talking to @jeremymeng privately and we think it can be more convenient for customers if they download the standalone zipped samples they want and try them out in a container, so having a Dockerfile living beside the samples can be helpful in this case.

cc @witemple-msft @richardpark-msft @ramya-rao-a

deyaaeldeen commented 3 years ago

Jeremy's rough proposal:

# supported Node versions, e.g. 15/14/12/10
ARG NODE_VERSION
ARG FILE_TO_RUN

FROM node:$NODE_VERSION-alpine

# args declared before FROM have to be re-declared to be visible in this stage
ARG NODE_VERSION
ENV NODE_VERSION=$NODE_VERSION
ARG FILE_TO_RUN
ENV FILE_TO_RUN=$FILE_TO_RUN
ARG ACCOUNT_NAME
ARG ACCOUNT_KEY
ENV ACCOUNT_NAME=$ACCOUNT_NAME
ENV ACCOUNT_KEY=$ACCOUNT_KEY

RUN mkdir -p /samples
WORKDIR /samples
ENV HOME=/samples

# copy the sample code into the image and install its dependencies
COPY . ./
RUN npm install

# FILE_TO_RUN (and the account settings) can also be overridden at run time with -e
CMD node $FILE_TO_RUN
 docker build -t storage-sample:12 --build-arg NODE_VERSION=12 .
 docker run -e FILE_TO_RUN=basic.js -e ACCOUNT_NAME=<account name> -e ACCOUNT_KEY=<key> -it --rm storage-sample:12
jeremymeng commented 3 years ago

For customers, we could probably just use the latest Node version.

richardpark-msft commented 3 years ago

This is interesting, but packaging them in the container means they don't have easy access to the samples themselves (i.e., the running of samples is interesting, but only in the context of having the code and being able to change it).

Or are you saying we shouldn't package up the samples, and instead just -v them into the container? (That's not in your sample command line.)
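
For reference, a minimal sketch of the mount approach, assuming the proposed Dockerfile and the storage-sample:12 image above; note that mounting the host directory over /samples hides the node_modules installed at build time, so dependencies get installed again inside the container (this command is illustrative only, not part of the proposal):

 # hypothetical: mount the samples from the host instead of baking them into the image
 docker run -v "$(pwd)":/samples -e FILE_TO_RUN=basic.js -e ACCOUNT_NAME=<account name> -e ACCOUNT_KEY=<key> -it --rm storage-sample:12 sh -c 'npm install && node $FILE_TO_RUN'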

deyaaeldeen commented 3 years ago

I see your point. It depends on whether we want to just show them off to our customers or to enable them to play with the source code. The latter is the more general scenario, but maybe someone could get spooked about mounting stuff from their host filesystem into a random container?

jeremymeng commented 3 years ago

The idea is to provide an additional Dockerfile alongside the sample code so people can build a container image which contains the samples and run the container. They can change the sample code, re-build the container, and run it again. They could also potentially use this as a starting point to containerize their applications.
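
To illustrate that loop with the commands already in the proposal above (editing requires a rebuild because the sample code is copied into the image):

 # edit the sample code on the host, then rebuild and rerun
 docker build -t storage-sample:12 --build-arg NODE_VERSION=12 .
 docker run -e FILE_TO_RUN=basic.js -e ACCOUNT_NAME=<account name> -e ACCOUNT_KEY=<key> -it --rm storage-sample:12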

richardpark-msft commented 3 years ago

Yeah, that makes sense. If we walk through it a bit from the customer's perspective, they want to containerize it because they probably have somewhere to run that container (i.e., we're saying this isn't really something they're doing just to do development on their box).

If that's the case then we could consider going a little further and showing them how to deploy the container, or maybe give them enough to build the container and then point them to one of the several Azure offerings/tutorials that show them how to deploy it.
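
For example, one such path could be pushing the image to a registry and running it on Azure Container Instances; a rough sketch, assuming an existing Azure Container Registry and resource group (the names here are placeholders, not something the proposal defines):

 az acr login --name <registry>
 docker tag storage-sample:12 <registry>.azurecr.io/storage-sample:12
 docker push <registry>.azurecr.io/storage-sample:12
 az container create \
   --resource-group <resource group> \
   --name storage-sample \
   --image <registry>.azurecr.io/storage-sample:12 \
   --environment-variables FILE_TO_RUN=basic.js ACCOUNT_NAME=<account name> \
   --secure-environment-variables ACCOUNT_KEY=<key>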

Another thing I'm wondering is what that sample would be - having a sample that runs and quits isn't super interesting to run in, say, ACS or AKS - you're typically trying to run something with some lifetime to it. In Service Bus or Event Hubs that's somewhat easy because you're basically starting up a little server and you expect it to stay up for a while. Oftentimes the orchestrators will also have a "restart this thing on quit" behavior, which would be detrimental if the sample is short-lived (like upload a single blob and quit).

What kind of samples do you currently have in mind that would fit this model? Would we sometimes include some Express code or something so they have a persistent process hanging around that can do some work? Like processing new blobs in a container or something of that nature?

deyaaeldeen commented 3 years ago

Yeah, these are all great points.

you're typically trying to run something with some lifetime to it

Perhaps we can write real-world but still tiny apps as samples that have some lifetime? It does not have to be a server per se. Also, we can make the output interesting so the customer is willing to deploy the container to see the results?

Often times the orchestrators will also have a "restart this thing on quit" which would be detrimental if the sample is short-lived

I think this can be configured, e.g. see https://docs.microsoft.com/en-us/azure/container-instances/container-instances-restart-policy#container-restart-policy

What kind of samples do you currently have in mind that would fit in this model?

For text analytics, perhaps we can write a tiny app that fetches documents from public sources and keeps producing analytics for them for a predetermined amount of time? Perhaps analyze sentiment for public news across major categories for a given period of time. Given the rate/input limits TA has, this could take a long time.

richardpark-msft commented 3 years ago

Perhaps analyze sentiment for public news across major categories for a given period of time.

Now that's an interesting sample - very creative! And I could definitely see why we'd make it possible to run that for an extended time!

richardpark-msft commented 3 years ago

Often times the orchestrators will also have a "restart this thing on quit" which would be detrimental if the sample is short-lived

I think this can be configured, e.g. see https://docs.microsoft.com/en-us/azure/container-instances/container-instances-restart-policy#container-restart-policy

Just to note - you're 100% right that it's configurable, but the default is typically to Always restart on quit (which would be bad for samples that launch and die).
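
For reference, on Azure Container Instances that default can be overridden when the container group is created; a minimal sketch along the lines of the az container create command sketched earlier in the thread:

 # don't restart the container once the sample finishes
 az container create <other options as above> --restart-policy Never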

github-actions[bot] commented 8 months ago

Hi @deyaaeldeen, we deeply appreciate your input into this project. Regrettably, this issue has remained inactive for over 2 years, leading us to the decision to close it. We've implemented this policy to maintain the relevance of our issue queue and facilitate easier navigation for new contributors. If you still believe this topic requires attention, please feel free to create a new issue, referencing this one. Thank you for your understanding and ongoing support.