Closed furlanrapha closed 7 years ago
According to Docker's run document, you could use -e
flags to set any environment variable in the container.
For example:
docker run \
-d \
-e "NODE_ENV=production" \
-e "REACT_APP_APIKEY=foObArBAz" \
your-image-name
Then, you could get the value from process.env
in your JS code:
console.log(process.env.REACT_APP_APIKEY)
// foObArBAz
Sorry @jihchi, I will give more context here:
I'm trying to run npm run build
and use this build version for the Staging and Production environments.
Since I have 2 environments, I was trying to use AWS ECS env vars (defined inside the Task Definition) to set the environment variables. The catch is that when I run npm run build, it copies the local env vars into the build output. For development I use the .env file, but I don't want to use it when creating the Docker image, so I created a .dockerignore file to ignore the .env file. That's all working fine here. The build version is generated and I can see env vars such as REACT_APP_APIKEY (following your example) in the minified JS. But now, when I set the env vars, it looks like my configuration isn't picked up.
My final question is: when I generate the build version (with npm run build), can't I have a custom configuration to use in this build version? Or am I missing something here? If I can't have a build version with a custom configuration, do I have to run the "development" version in order to set custom configuration?
I will attach here my Dockerfile
to give a better view of the situation.
Dockerfile
FROM node:6.3.1
# Create app directory
RUN mkdir -p /src/app
WORKDIR /src/app
# Install app dependencies
COPY package.json /src/app/
RUN npm install
# Bundle app source
COPY . /src/app
# Build and optimize react app
RUN npm run build
EXPOSE 3000
# defined in package.json
CMD [ "npm", "run", "start:server" ]
package.json (the rest omitted for brevity)
...
"start:server": "http-server -p 3000 ./build",
...
Maybe you could use the ARG directive in your Dockerfile.
For example, add the following to your Dockerfile:
ARG NODE_ENV=staging
ENV NODE_ENV=$NODE_ENV
ARG REACT_APP_APIKEY=foObArBAz
ENV REACT_APP_APIKEY=$REACT_APP_APIKEY
Then, execute the docker build command with additional --build-arg flags (assuming the host machine has environment variables called HOST_NODE_ENV and HOST_REACT_APP_APIKEY):
docker build \
--build-arg NODE_ENV=$HOST_NODE_ENV \
--build-arg REACT_APP_APIKEY=$HOST_REACT_APP_APIKEY \
.
docker build
will pass in host environment variables to your Dockerfile.
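Put together, a minimal sketch of such a Dockerfile (based on the snippets above and the earlier Dockerfile in this thread; the default values are illustrative) might look like this:

```dockerfile
FROM node:6.9.4

# Build-time arguments with defaults, overridable via --build-arg
ARG NODE_ENV=staging
ARG REACT_APP_APIKEY=foObArBAz

# Promote them to environment variables so `npm run build` can see them
ENV NODE_ENV=$NODE_ENV
ENV REACT_APP_APIKEY=$REACT_APP_APIKEY

WORKDIR /src/app
COPY package.json .
RUN npm install
COPY . .

# The REACT_APP_* values are baked into the bundle at this point
RUN npm run build

EXPOSE 3000
CMD [ "npm", "run", "start:server" ]
```

Note that with this approach the values are still fixed per image; building for Staging and Production requires two docker build invocations with different --build-arg values.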
@jihchi my intention is to use the build version with custom env vars. Looking at the build output, everything is minified, so I can't have a single build version whose env vars I can customize for Staging and Production.
I think what I want is not supported. I will probably have to run the development server in order to accomplish this.
I don't know anything about Docker. Is there anything I could help you with? Do we need to fix something in Create React App, or is it just a usage question?
So @gaearon, in Docker you generate an image and push it to a container registry (in my case AWS ECR). This image is like a build version, but I can use the same image for all my environments. Here we have Staging and Production, so I can have one image version of my application (0.1.0, 0.2.0, etc.) and use it across environments.
The point for create-react-app is: if I generate the Docker image using npm run build, it will try to read my env vars, which I don't actually have at that time (because the env vars live in the AWS ECS Task Definition). What I need is the ability to generate the build version with the minified files but still be able to set the env vars this build version will use.
I don't know if my explanation is clear, and I don't know if what I want is achievable, since the build version is just static files.
So you build the image on the local machine, then deploy it? Once it is built, you cannot add any env var to it, because CRA builds static files. I think the best way is to serve the HTML with a PHP or Node backend that reads the env vars and prints a global variable.
@gaearon No, for the most part docker works fine with create-react-app, however if anyone has a similar issue to this, they just need to make sure they add env_file: .env
to their docker-compose.yml
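For anyone unsure where that goes, here is a minimal docker-compose.yml sketch (the service name and port are illustrative; note that env_file sits at the service level, not under environment):

```yaml
version: "2"
services:
  web:
    build: .
    # loads KEY=value pairs from .env into the container environment
    env_file: .env
    ports:
      - "3000:3000"
```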
@gaearon Docker is amazing: I can run create-react-app without installing Node locally, avoiding Node/npm version managers such as nvm. Plus, it just works! I would even go so far as to say you should at the very least add a "how to integrate with Docker" section to the README.
If anyone is interested, I got it to work with the following (for all steps below, replace the string boilerplate with your app's name). If there are any improvements I can make, please let me know.
Step 1) After creating your React app with create-react-app, cd into your newly created app's directory and run this command:
docker network create boilerplate
Step 2) Add a dockerfile to the root of your app directory
Dockerfile
FROM node:6.9.4
# Prepare app directory
RUN mkdir -p /usr/src/app
WORKDIR /usr/src/app/
# Install dependencies
COPY package.json /usr/src/app/
RUN npm install --silent
ADD . /usr/src/app/
EXPOSE 3000
CMD [ "npm", "start" ]
Step 3) create a docker-compose.yml in your root directory
docker-compose.yml
version: "2"
services:
  frontend:
    container_name: "boilerplate"
    build: .
    env_file: .env
    environment:
      NODE_ENV: development
    ports:
      - "3000:3000"
    volumes:
      - .:/usr/src/app
networks:
  default:
    external:
      name: boilerplate
You can now access your app as normal at http://localhost:3000
and you can even interact with your new docker container via the following command
docker-compose run --rm boilerplate /bin/bash
Yes, this doesn't follow some of the Node.js Docker recommended best practices, but I use the above only for the dev environment configuration, so I'm not sure those recommendations apply. I'm open to improving this, though.
Obviously you'd need to configure the production build differently than the above.
@haluvibe How can I run create-react-app without having Node.js installed on my local machine?
@cschroeter
By using Docker as he describes. Docker makes a "container" that contains system level dependencies, and executes the Node runtime inside that.
If you have a dockerized app, you could completely uninstall Node from your system, and still run the app.
@LucasKA But don't I have to have Node.js installed to create the boilerplate? That's a slight disadvantage. Also, if I'm using TypeScript, it looks up the typings in the corresponding node_modules folder. So I'm not 100% convinced I can get rid of Node.js during development time :/
@cschroeter I can't speak for your use cases, as I don't know what boilerplate you are running and I don't use typescript.
However, a container is just that, a system level container. You can SSH into a Docker container and run commands, or set up scripts for container build or what have you.
I don't use Docker to "remove Node completely"; I still keep the current version of Node on my laptop (with all the global CLI tools).
I use Docker to run my apps in an environment as similar as possible to the one I'm deploying to, mostly Node 6.9.x LTS with npm 3.10.
Thanks for all the tips on this page; I've been able to get CRA running in Docker. I found a few other handy things:
Here's my setup: Dockerfile
FROM node:6.3.1
RUN apt-get update && \
apt-get install -y nginx
WORKDIR /src
COPY . /src
RUN npm install
CMD /bin/bash ./run.sh
run.sh (executed by the docker container)
#!/usr/bin/env bash
set -e
set -x
export NODE_ENV="${NODE_ENV:-development}"
if [ $NODE_ENV == "development" ]; then
# this runs webpack-dev-server with hot reloading
npm start
else
# build the app and serve it via nginx
npm run build
mkdir -p $ROOT/logs/nginx
nginx -g 'daemon off;' -c $ROOT/src/nginx.conf
fi
NOTE: if NODE_ENV=development, this will run webpack-dev-server; otherwise it will build the app and serve it via nginx. (I got this from a great blog post which I can't for the life of me find anymore.)
nginx.conf
worker_processes 1;

events {
  worker_connections 1024;
}

http {
  access_log /var/log/nginx/access.log;
  error_log /var/log/nginx/error.log;

  server {
    gzip on;
    listen 8000;
    server_name localhost;
    root /src/build;
    include /etc/nginx/mime.types;

    location /nginx_status {
      stub_status on;
      access_log off;
    }

    location / {
      try_files $uri $uri/ /index.html;
    }
  }
}
docker-compose.yml
version: '2'
services:
  dev:
    build:
      context: .
      dockerfile: Dockerfile
    image: ui-dev
    container_name: webpack-container
    environment:
      - NODE_ENV=development
    ports:
      - "8080:3000"
      - "35729:35729"
    volumes:
      - .:/src
      - /src/node_modules
  test:
    build:
      context: .
      dockerfile: Dockerfile
    image: ui-test
    container_name: webpack-test-container
    environment:
      - NODE_ENV=test
    volumes:
      - .:/src
      - /src/node_modules
    command: npm test
  prod:
    build:
      context: .
      dockerfile: Dockerfile
    image: guest-ui-prod
    container_name: prod-container
    environment:
      - NODE_ENV=production
    ports:
      - "8000:8000"
    volumes:
      - /src/node_modules
For my development with hot reloading, I run one terminal window and execute the following:
docker-compose up -d dev
To watch and run tests, I then open another terminal window and execute:
docker-compose up test
(I don't use the -d flag as I want to see the output)
And for a production sanity check, I can run
docker-compose up prod
NOTE: this will serve the React app as it was when the Docker image was created, so if you've made changes since the docker build, they won't appear. However, if this is used as part of a CI system which checks out the code and builds it, the image built will be the latest.
On an unrelated note, I also found it useful to set NODE_PATH in my package.json to allow absolute imports from src:
"scripts": {
"start": "NODE_PATH=./src/ react-scripts start",
"build": "NODE_PATH=./src/ react-scripts build",
...
}
@furlanrapha Did you get your problem solved? Thanks for the advice.
No @jayhuang75, I'm still doing npm install
and running the development server in the Docker image so I can have the env vars set in AWS Task Definition.
@haluvibe your Docker configuration looks good. Only creating the additional network seems unnecessary, because by default Docker Compose sets up a network for you:
By default Compose sets up a single network for your app. Each container for a service joins the default network and is both reachable by other containers on that network, and discoverable by them at a hostname identical to the container name.
Also, in order to enable the live-reload feature on a Windows host, you have to enable chokidar polling, because inotify does not work on Docker for Windows. Here I shared my experience and how I made it work on Windows.
@furlanrapha I have had a very similar issue, and I was able to successfully fix it. I am using webpack to do the "build", and I am able to access all environment variables inside my React app.
Can you please tell me how you do your builds? I might be able to help.
@daveamit did you eject your app from react-scripts?
Here is the Dockerfile that I use today (we don't use docker-compose
):
FROM node:6.9.4
EXPOSE 3000
CMD [ "npm", "run", "start" ]
WORKDIR /src/app
# Install app dependencies
COPY npm-shrinkwrap.json .
COPY package.json .
RUN npm install
COPY public ./public
COPY src ./src
@furlanrapha This is what I do.
Step 1:
Added DefinePlugin to the webpack config. This tells webpack to inline these Node environment variables (process.env.*) into the bundle:
new webpack.DefinePlugin({
'process.env': {
NODE_ENV: JSON.stringify(process.env.NODE_ENV),
API_ENDPOINT: JSON.stringify(process.env.API_ENDPOINT),
DEFAULT_TOKEN: JSON.stringify(process.env.DEFAULT_TOKEN),
SHOW_VERSION_INFO: JSON.stringify(process.env.SHOW_VERSION_INFO),
},
}),
Step 2:
This is my build command (in package.json)
"build": "cross-env NODE_ENV=production webpack --config internals/webpack/webpack.prod.babel.js --color -p --progress",
This will produce minified, optimized, chunked output.
Step 3:
"start:production": "npm run test && npm run build && npm run start:prod",
This command runs tests, then builds, and then starts the production server.
Step 4:
"start:prod": "cross-env NODE_ENV=production node server",
Runs the production server (serving the static content generated in step 2 via the npm run build command).
Step 5: My Dockerfile
FROM node
ADD /package.json /tmp/package.json
WORKDIR /tmp
RUN npm install
ADD . /tmp/
EXPOSE 3000
ENTRYPOINT npm run start:production
The magic happens at the "build" command. The webpack configuration from step 1 causes the step 2 build to actually compile the app with the given environment variables.
The trick is that I put "npm run start:production" in ENTRYPOINT; this enables access to the environment variables passed when spinning off a container from the image.
For testing, I build the image with:
docker build -t test .
and then run it using:
docker run -p 3000:3000 \
  -e "SHOW_VERSION_INFO=true" \
  -e "API_ENDPOINT=http://localhost:6010" \
  -e "DEFAULT_TOKEN=sometoken" \
  test
We take the build version from package.json, but decide whether to show or hide it depending on an environment variable. The component looks like this:
import React from 'react';
import styled from 'styled-components';
import { version } from '../../../package.json';
const VersionWrapper = styled.span`
  position: fixed;
  top: 5px;
  right: 20px;
  font-size: 9px;
  color: black;
  line-height: 1;
  display: ${process.env.SHOW_VERSION_INFO ? 'block' : 'none'};
`;
// Find a suitable way to show build numbers only during dev/qa/int.
// UPDATE: Solution incorporated at VersionWrapper (observe display prop)
const BuildInfo = () => <VersionWrapper> {version} </VersionWrapper>;
BuildInfo.propTypes = {
};
export default BuildInfo;
Hope this helps.
@baxford I liked your solution very much. The problem was that my files weren't being watched.
This guy gave the solution, by setting:
CHOKIDAR_USEPOLLING=true
It is also mentioned in the official documentation:
If the project runs inside a virtual machine such as (a Vagrant provisioned) VirtualBox, create an .env file in your project directory if it doesn't exist, and add CHOKIDAR_USEPOLLING=true to it. This ensures that the next time you run npm start, the watcher uses the polling mode, as necessary inside a VM.
@daveamit in order to do this, you ejected your CRA, right? Because normally we have no access to webpack.config.js. Or has this changed?
@furlanrapha in order to make your container's environment variables available at run time to your already-built-for-production React app, do any of the approaches outlined in #578 work for you? Basically this comes down to generating an env.js file (either during container startup within the entrypoint, or dynamically through an addition to whatever Node.js or other server you might have) and referencing it within the index.html of CRA.
@furlanrapha I'm having the exact same need and am wondering what your final approach was, without ejecting CRA.
@rmoorman I also tried the approaches you suggested but they're not working properly; maybe I've done something wrong.
@james-tiqk would it be possible for you to share what you have got right now?
@rmoorman Hi mate, thanks for the reply. I have the env variables set in Beanstalk, and in the project Dockerfile I use npm run build to create the prod bundle. In the src code I'm referencing process.env.REACT_APP_*, but it seems it's not working properly.
I'm not using docker-compose, and I'd prefer not to eject react-scripts.
@james-tiqk do you run npm run build as part of your Docker ENTRYPOINT/CMD script, or within a RUN directive? The environment variables need to be present when the build command runs, and if you build the React app within RUN, the build actually happens when building the image, not while running the container. That could be why the environment variables set for your container in Beanstalk are not being picked up: they only affect the environment that the ENTRYPOINT or CMD script receives (not knowing your precise setup, though).
That's also what the discussion in #578 is mainly about. Usually you actually don't want to run npm run build within the container entrypoint, but rather within a RUN directive (so it can be cached and does not have to be executed every time a container is launched).
So how do you get the variables in there when npm run build was already executed? That's where the public/env.js file comes in. When you add <script src="%PUBLIC_URL%/env.js"></script> to your public/index.html and add a public/env.js file (with some defaults for development), you can then: a) within the Docker entrypoint script, write the environment variables (in your case, from Beanstalk) to the env.js file, or b) in case you have a customizable server in your container (Node.js/Python/whatever) serving your files, add a route for /env.js and render the JavaScript with the correct environment variables on the fly.
(By the way, an env.js file is chosen instead of an env.json within the linked thread because you can reference it within a script tag and it will be loaded along with the app.js right from the start, so you won't have to fetch it manually and don't have to account for the additional loading time within the UX; the variables are then just there on the window global.)
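To illustrate option (a), here is a minimal entrypoint sketch. The file location, the ENV_JS_DIR override, and the REACT_APP_API_ENDPOINT variable are assumed names for illustration, not anything CRA defines:

```shell
#!/bin/sh
# docker-entrypoint.sh: render env.js from the container's environment
# at startup, then hand off to the container's main command.
# ENV_JS_DIR would normally point at the served build directory.
ENV_JS_DIR="${ENV_JS_DIR:-.}"

cat > "$ENV_JS_DIR/env.js" <<EOF
window.env = {
  API_ENDPOINT: "${REACT_APP_API_ENDPOINT:-http://localhost:8080}"
};
EOF

# run whatever command was passed to the container (e.g. nginx, serve)
exec "$@"
```

The app then reads window.env.API_ENDPOINT at runtime instead of a value baked in at build time.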
@rmoorman thanks mate for the explanation. Now that I know the cause, I got it working by using a shell command executed after the Docker image is built.
@james-tiqk, @rmoorman is right. My environment is AWS, using Jenkins to build and deploy: npm run build, then copy .env and the Dockerfile to the build dir, then run docker build -t. In the Dockerfile I do the usual updates, then install serve, expose the port, and set a CMD that runs a shell script which replaces (using sed commands) the env vars in the .env file from a docker-compose file.
It's long-winded, but it fits an existing deployment structure I'm used to.
Ultimately, it's the CMD ["bash"] that runs the command to insert the environment variables.
Because we are using Docker, we should not need to install node, npm, or create-react-app on our development machine, not even for generating the create-react-app scaffold.
For this purpose I am using a 2-step Docker configuration:
1. A container that has create-react-app installed, which we can use to generate the create-react-app scaffold.
2. A container for nodemon, npm, development servers, etc.
Because we are not installing create-react-app locally on our development machine, let's build a new Docker container for it.
Dockerfile
FROM node:8.2.1-alpine
RUN npm install -g create-react-app \
create-react-native-app \
react-native-cli
RUN mkdir /app
WORKDIR /app
ADD . /app
Build the image:
docker build . -t react-cli
Then generate the create-react-app scaffold using this image:
docker run react-cli create-react-app myApp
NOTE: we need to fix the problem below; please help me if anyone knows the solution.
EXPECTED: myApp generated in my current directory.
RESULT: Got nothing. It looks like the app is being generated inside the Docker container.
Because I was unable to solve it using just the docker command, I am solving it using docker-compose.
docker-compose.yml
version: '3'
services:
  web:
    build: .
    image: react-cli
    container_name: react-cli
    volumes:
      - .:/app
Now, using docker-compose, we can generate our React application:
docker-compose run web create-react-app myApp
Everything should work this time, and you should get the generated React application inside the myApp directory.
If anyone knows how to use the react-cli image to generate the create-react-app scaffold without using a docker-compose.yml file, please let me know.
Thanks
@przbadu Did you start the container with a volume as you did in the docker compose config?
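In other words, mounting the current directory into the container should make the generated files appear on the host. A sketch using the react-cli image built above (this mirrors what the compose file's volumes entry does):

```shell
# mount the host's current directory at /app, where the scaffold is generated
docker run -v "$PWD":/app react-cli create-react-app myApp
```

Without the -v bind mount, the files are written to the container's own filesystem and discarded when it exits.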
After reading this whole thread, for me, the missing env variables were solved by the fifth comment... I simply needed to add the ARG and ENV parameters to my Dockerfile so that my build process had access to certain env variables.
I noticed a lot of people struggle with this, so I provided a GitHub repo tutorial, with background in a public LinkedIn article if you care to read it.
A lot of people in this thread didn't seem to read the actual description of the problem. It wasn't about how to use Docker, but how to have a single Docker image and have the browser use an API_URL that is set by environment variables on the server.
I wrote up the question on stackoverflow.com at https://stackoverflow.com/q/49975735/329496
A solution is provided over two comments at https://github.com/facebook/create-react-app/issues/578 that I have written up in the answer to the question at https://stackoverflow.com/a/49989975/329496
There is working sample code over at https://github.com/simbo1905/react-redux-realworld-example-app/tree/openshift
Update: If you are using a commercial container orchestrator then you might be charged differently for build memory and runtime memory. This answer ensures that you can keep the runtime image to a minimum especially if you run "npm prune" to strip your devDependencies at the end of your build. A big react build can be slow. This approach builds once and lets you move that work between environments which is a big time saver compared with building in each environment.
I created a tutorial on LinkedIN articles but also a companion repo: https://github.com/mikesparr/tutorial-react-docker
What you need to do is build your app during the run (CMD) phase of the Dockerfile and not before, otherwise the env params will be missing. I add a run bash script and call it from the Dockerfile, and it works, but note you still need to name your env vars with the REACT_APP_ prefix.
I have the same issue, and I agree with @simbo1905 that half the commenters in this thread didn't really 'get' the issue that was being discussed (I also don't know why it's closed, as there's no clean/official way to do this).
Basically:
- I want to npm build the production artifact for my React app.
- I want to put that artifact into a container image tagged my/react-app:1.0.0 (e.g. inside an Nginx container image).
- I want to deploy that container image (my/react-app:1.0.0) to the following environments:
  - Stage
  - Prod
- I have a variable like backendUrl, and it needs to be different per environment:
  - In stage: http://stage.mybackend.com/
  - In prod: https://www.mybackend.com/
And if I'm using a container orchestration tool like Kubernetes, ECS, Fargate, Mesos, Rancher, etc., then for all my other apps, I do something like:
- Run my/react-app:1.0.0 on stage with environment variable REACT_APP_BACKEND=http://stage.mybackend.com/
- Run my/react-app:1.0.0 on production with environment variable REACT_APP_BACKEND=https://www.mybackend.com/
The problem is that when I run npm build, the value for backendUrl is baked into the compiled JS file for my app, and I can't override it using configuration on my server.
Right now I'm considering doing something like @simbo1905's Stack Exchange answer, namely rendering an env.js script from an entrypoint script when starting the container, and then modifying my React app's index.html to read that JS file via <script src="%PUBLIC_URL%/env.js"></script>.
It would just be nice if there were a cleaner way to do this, or an officially-documented pattern.
When running the Node.js development server it's super easy, because I can just refer to process.env.REACT_APP_BACKEND and know that it will be picked up. (It looks like that's what @furlanrapha had resorted to as of this comment.) But the development server itself carries a "don't use this in production" warning... so I want to build the app and use that, and I'm finding it a bit annoying to deploy one Docker image in multiple environments with per-environment settings.
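For what it's worth, the env.js pattern can be consumed from app code with a small helper that prefers runtime-injected values. This is only a sketch; window.env and REACT_APP_BACKEND are assumed names here, not a CRA API:

```javascript
// config.js (sketch): prefer values injected at container start via env.js,
// fall back to the build-time process.env value, then to a dev default.
function getBackendUrl() {
  // window.env is populated by the env.js file the entrypoint writes
  const runtimeEnv = (typeof window !== 'undefined' && window.env) || {};
  return (
    runtimeEnv.REACT_APP_BACKEND ||
    process.env.REACT_APP_BACKEND ||
    'http://localhost:8080'
  );
}

module.exports = { getBackendUrl };
```

App code then calls getBackendUrl() instead of reading process.env directly, so one built image can serve every environment.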
To use ENV vars in CRA, you must prefix them as REACT_APP_YOUR_VAR_NAME
Furthermore, I learned that you need to pass them in at runtime and not at build time, so it's best to create a script that builds and runs your application from the image.
Check out my example (it's older but still works fine) for ideas: https://github.com/mikesparr/tutorial-react-docker
Specifically, see the "run" bash script that is called from the Dockerfile; that is how ENV vars get injected into your app. https://github.com/mikesparr/tutorial-react-docker
The problem with using one React build in multiple environments is that the environment variables are baked in at build time. Docker or not, those built files are static. Done. Fully-baked. Unless your server can change them somehow.
During development, you get the advantage of a development server, which makes it recompile with every change, and can make it feel like a "live" environment. In production, React has no way of modifying itself based on environment variables since it's just a bunch of static HTML and JS files.
One thing you can key off is the hostname. So, what I've done in the past is to have one file called api-config.js
which knows about all my different environments, and sets the API endpoint based on the window.location.hostname
at runtime. Then, anything that needs to make an API call can import this file and will know which URL to hit. I wrote up an article with an example of how to set this up and configure API endpoints dynamically. This is the relevant part:
// api-config.js
let backendHost;
const apiVersion = 'v1';

const hostname = window && window.location && window.location.hostname;

if (hostname === 'realsite.com') {
  backendHost = 'https://api.realsite.com';
} else if (hostname === 'staging.realsite.com') {
  backendHost = 'https://staging.api.realsite.com';
} else if (/^qa/.test(hostname)) { // starts with "qa"
  backendHost = `https://api.${hostname}`;
} else {
  backendHost = process.env.REACT_APP_BACKEND_HOST || 'http://localhost:8080';
}

export const API_ROOT = `${backendHost}/api/${apiVersion}`;
Then anywhere you need that endpoint, you can import it and make your calls:
import { API_ROOT } from './api-config';
function getUsers() {
fetch(API_ROOT + "/users").then(...);
}
The hostname check is a hack and doesn't comply with 12factor.net.
What you have to do is move your build step into a script executed by the Docker CMD, i.e. call a shell script that performs the build and serve steps at runtime.
SEE: https://github.com/mikesparr/tutorial-react-docker/blob/master/Dockerfile
SEE: https://github.com/mikesparr/tutorial-react-docker/blob/master/run
If you do NOT run the build step in your Dockerfile, the build will pick up ENV vars at runtime, making your image portable and avoiding the need for the hacks mentioned above.
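Applied to the Dockerfile posted earlier in this thread, the idea looks roughly like this (a sketch, not tested; the only change is moving `npm run build` out of a RUN layer and into the CMD, so it runs at container start, when `docker run -e` variables are visible):

```dockerfile
FROM node:6.3.1

RUN mkdir -p /src/app
WORKDIR /src/app

COPY package.json /src/app/
RUN npm install

COPY . /src/app

EXPOSE 3000

# Build at *container start*, not image build time, so that
# REACT_APP_* variables passed via `docker run -e` get baked
# into the bundle for this particular container.
CMD ["sh", "-c", "npm run build && npm run start:server"]
```

The trade-off, as discussed below, is that every container start pays the full build cost in time and memory.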
@mikesparr - That's just the problem: you're saying that I need to have a Docker image that has Node.js and all the associated baggage.
If you're supposed to run React apps compiled using `npm build`, with just the static compiled files, using a webserver like Nginx, then it's crazy to also have to require a Node.js runtime be present on the Docker image so I can do an `npm build` when I start my Docker container. Not only that, the container startup would take at least many seconds, if not minutes... and this is completely anathema to 12fa apps and pretty much every other type of application I build and deploy in Kubernetes and ECS clusters.
I just noticed you're using `serve` (which is a Node.js-based http server, it seems; I haven't personally used it), but you still have to wait at container startup for the entire app to be built. That option seems like it could work, but it would not be ideal, and it limits you to either having a Docker image with Node.js and a decent webserver (meaning at least double or triple the size of an image like `nginx-alpine`), or a Docker image with Node.js and `serve` or some other Node-based http server.
@mikesparr The problem with building the app in `run` is that our app needs 1.2G of memory to `npm run build --production` but only 0.25G to run. We are charged per gig by our k8s provider. And as it's a large app, it takes a long, long time to build. If that happens in `run`, our rolling deployments and crash recovery are slowed down by minutes. Our CI builds can happily supply the 1.2G and pay the time cost once to build a release of our app, and out of that we want a Docker image that is super fast to start up and only takes the space needed to serve the static app, since our backend APIs are written in other languages. 12factor.net says to separate "build, release, run", and the example you give of running `npm run build` on startup cuts that corner. When the build takes minutes for a large app and you get charged for memory you don't use after the compile phase, it's a real problem.
@mikesparr I don't know much about 12factor - which part is violated by configuring the endpoint based on hostname? For the most part, all it does is effectively prepend `api.` to the existing hostname, so I'm interested to learn where that breaks down in practice.
Good points/questions. I'll respond to both threads in one.
I'm sharing how you can access ENV vars within a Dockerized CRA app. As mentioned, if you add the build layer in Docker, you won't be able to read them at runtime, so the way around it is to build and run from a script called from CMD.
I understand the concern about cold starts given the lengthy build. If you have an HA requirement, you likely run an odd number of instances at any given time (3, 5, 7, 9, ...), and if a k8s health check fails and it has to spin up a new one, the load balancer/ingress will still route to the other instances. Even when your CI builds/tests/deploys new versions, k8s will replace one node after another until health checks pass and then route to the new instances. I'm unsure whether cloud hosts meter memory with a "high water mark", but if only the build step takes extra memory and running uses less, perhaps it's only a concern during cold starts and not normal running.
12factor.net advises against conditional configs in favor of a single set of config params whose values are passed in via the ENV, making the app more portable. The prior way was to have a conditional config file, typically keyed off the NODE_ENV param, but with containers that pattern breaks down. The "build, release, run" separation refers to gathering dependencies during the build phase; the Dockerfile I shared does that with the "npm install" layer. Then the release occurs with CI deploying to the host, and the run occurs when the k8s deployment/pod starts up the container runtime.
To make Dockerized apps completely portable, you're better off with a single set of config values derived from the ENV. This is ideal, but there are many ways to skin a cat, as the saying goes. If you accept the example/logic above, then I hope that helps. I haven't tried a multi-stage Docker build for CRA apps, although they are useful for producing small images for Golang apps. Perhaps ping Dan Abramov on Twitter and see if he has suggestions on smaller image size. As far as I found for my apps, this was the only reasonable way to inject ENV vars without too much hacking.
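Since a multi-stage build comes up here but hasn't been shown, a minimal sketch of what one could look like for a CRA app (image names and paths are assumptions; note it still bakes REACT_APP_* values in at image build time, so it addresses image size, not runtime configuration):

```dockerfile
# --- Stage 1: build with the full Node toolchain ---
FROM node:6.3.1 AS builder
WORKDIR /src/app
COPY package.json ./
RUN npm install
COPY . .
# REACT_APP_* values are embedded here, at image build time
RUN npm run build

# --- Stage 2: serve only the static output from a tiny image ---
FROM nginx:alpine
COPY --from=builder /src/app/build /usr/share/nginx/html
EXPOSE 80
```

The final image contains only nginx and the static files, so it starts in seconds and stays small.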
Cheers!
For what it's worth, I would still suggest building your React app using the regular docker build functionality (as already suggested in this thread, a multi-stage build could be a great optimization). If you need environment-specific settings, you will have to make them available at runtime through dynamic means, as that is the nature of the beast. In practice this means you will likely have to set some global variables in some additional JS code, or load some JSON from somewhere.

If you'd like to have the configuration available right when your app is loading, you can use an additional script tag inside your index.html (there are other options too, such as turning index.html into a runtime template, but let's keep it simple here).

Now for generating the environment-specific configuration: you can opt to have the container entrypoint build a file that is "static" for the lifetime of the container, or, if that's easier for your use case, serve the configuration dynamically through some mini app-server endpoint (Node, Go, Python, or whatever you like) along with your React app. IMO the appeal of having the entrypoint generate the file is that you then only need something that serves static files (which may be faster and more secure). The same goes for a solution where you put the configuration somewhere external (a CDN, as also already suggested) and have your app load it. But still, IMO you'd want to build your React app once and run it many times, without additional overhead and delay every time it has to be started somewhere (some of my React app builds take ages and a lot of memory).
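A minimal sketch of the entrypoint-generated config file described above. Everything here is illustrative: the `env-config.js` filename, the `APP_API_URL`/`APP_ENVIRONMENT` variables, and the `window._env_` global are assumptions, not a CRA convention:

```shell
#!/bin/sh
# entrypoint.sh -- run before the static file server starts.
# Snapshots selected environment variables into a small JS file
# that index.html can load with a plain <script> tag.

# Where the built static files live; "." here so the sketch runs
# anywhere, but in a real image this would be CRA's build/ output.
BUILD_DIR="${BUILD_DIR:-.}"

cat > "${BUILD_DIR}/env-config.js" <<EOF
// Generated at container startup -- do not edit by hand.
window._env_ = {
  API_URL: "${APP_API_URL:-http://localhost:8080}",
  ENVIRONMENT: "${APP_ENVIRONMENT:-development}"
};
EOF

# Hand off to the real command (e.g. nginx or http-server), if given.
if [ "$#" -gt 0 ]; then
  exec "$@"
fi
```

index.html would then include `<script src="%PUBLIC_URL%/env-config.js"></script>` before the app bundle, and application code reads `window._env_.API_URL` instead of `process.env.REACT_APP_*`, so the same build works in every environment.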
I already set the env vars in my Docker setup on AWS, but the app is not using them when running. Does anyone know how I can do it?

> For development, I'm using the `.env` file. For the build, I'm putting the `.env` file in `.dockerignore` so this development file doesn't go into the build version. My intention is to use the default Env Vars configuration in AWS ECS. I'm not using `docker-compose`. I'm just using the `Dockerfile`.