Services | Description
---|---
Register | When a user registers with an email, username, and password, the system sends a verification email. The user's "Email Verified" field remains false until they click the link in that email, at which point it is set to true. The Register response contains a "Session Token" for subsequent authentication.
Login | When the user logs in with their username and password, the system responds with a "Session Token" (a client-side sketch of this token flow follows the table).
Log Out | When the user logs out, the Session Token is blacklisted using Redis. Every entry point to the server that requires authentication first checks whether the Session Token has already been revoked by checking its presence in Redis. If present, the request is denied.
Reset Password | Once the user provides an email address, they receive an email with a link that takes them to a form to submit a new password.
Profile | Once a user is logged in, they can view their profile.
Profile Update | A user can modify their username and/or email. If the email address is modified, their "Email Verified" value is set back to false and a new verification email is sent to the new address.
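To make that token flow concrete, here is a minimal client-side sketch of logging in and then calling an authenticated endpoint. Only the `/account/login` path and the `sessionToken` response field come from this README; the `/account/profile` path and the `Authorization` header format are assumptions for illustration.

```javascript
// Hypothetical client-side flow: log in, then reuse the Session Token.
// Assumes a fetch-capable environment (browser or Node 18+); the
// /account/profile path and Authorization header scheme are illustrative.
const BASE_URL = 'http://127.0.0.1:5000';

async function loginAndFetchProfile(username, password) {
  // POST /account/login is the endpoint defined later in this README.
  const loginRes = await fetch(`${BASE_URL}/account/login`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ username, password })
  });
  const { sessionToken } = await loginRes.json();

  // Subsequent requests present the Session Token for authentication.
  const profileRes = await fetch(`${BASE_URL}/account/profile`, {
    headers: { Authorization: sessionToken }
  });
  return profileRes.json();
}

loginAndFetchProfile('barton', 's3cret!Pass').then(console.log);
```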
OpenShift supports a free NodeJS setup that will scale with web traffic. This Snowflake Server setup will use MongoDB and Redis.
Some commands that you'll want to know about, once you've installed the `rhc` client:
You can check the performance of your application using the link
your-app-domain/haproxy-status/
The HAProxy Status Page
The Node.js server uses Hapi. Hapi was developed and open-sourced by Walmart Labs and has been battle-tested by Walmart, the largest retailer on earth. I chose it over Express because Hapi is more targeted at API support, and it looked interesting.
This server is documented here in its entirety.
Here's some flavor of what Hapi offers. Below is the declarative definition of the `/account/login` endpoint. The `payload` is validated here and shows how the `username` has a regex expression and is required; the same goes for the `password`. The `config` option has the `tags`, `description`, and `notes` that document how the API is used. The `handler` is defined elsewhere. Separating the endpoint validation and declaration from the implementation cleans up the code.
{
    method: 'POST',
    path: '/account/login',
    handler: AccountHandlers.loginUser,
    config: {
        // Include this API in swagger documentation
        tags: ['api'],
        description: 'A user can login',
        notes: 'The user login will return a sessionToken',
        validate: {
            payload: {
                // username required with same regex as client
                username: Joi.string().regex(CONFIG.validation.username).required(),
                // password required with same regex as client
                password: Joi.string().regex(CONFIG.validation.password).required()
            }
        }
    }
},
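The route above delegates to `AccountHandlers.loginUser`, which lives elsewhere in the codebase. Below is a minimal sketch of what a handler with that shape might look like, assuming the callback-style `(request, reply)` interface of Hapi from that era, a Mongoose `User` model with a hypothetical `comparePassword` helper, and a `jsonwebtoken`-based session token; none of these details are taken verbatim from the repo.

```javascript
// Hypothetical handler sketch (not the repo's actual implementation).
// Assumes Hapi's classic (request, reply) handler signature, a Mongoose
// User model with a comparePassword helper, and a JWT secret in the config.
const Boom = require('boom');
const JWT = require('jsonwebtoken');
const User = require('../models/User');   // assumed location
const CONFIG = require('../config');      // assumed to hold the JWT secret

const AccountHandlers = {
  loginUser: function (request, reply) {
    const { username, password } = request.payload;

    User.findOne({ username: username }, function (err, user) {
      if (err) {
        return reply(Boom.badImplementation(err));
      }
      if (!user || !user.comparePassword(password)) {
        return reply(Boom.unauthorized('Invalid username or password'));
      }
      // Sign a Session Token carrying just the user id.
      const sessionToken = JWT.sign({ id: user._id }, CONFIG.crypto.privateKey);
      reply({ sessionToken: sessionToken });
    });
  }
};

module.exports = AccountHandlers;
```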
MongoDB will host our documents, namely User information, at this time. We'll be using Mongoose to interact with Mongo within our code.
Once you're ssh'd into OpenShift via `rhc ssh -a mysnowflake`, you can use the `mongo` shell.
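For flavor, here is a minimal sketch of how a `User` document might be modeled with Mongoose. The field names (`username`, `email`, `password`, `emailVerified`) are inferred from the services table above, not copied from the repo's actual schema, and a callback-friendly Mongoose version is assumed.

```javascript
// Hypothetical Mongoose model sketch; the repo's real schema may differ.
const mongoose = require('mongoose');

const UserSchema = new mongoose.Schema({
  username:      { type: String, required: true, unique: true },
  email:         { type: String, required: true, unique: true },
  password:      { type: String, required: true },  // stored hashed in practice
  emailVerified: { type: Boolean, default: false }   // set to true once the email link is clicked
});

const User = mongoose.model('User', UserSchema);

// Typical usage: connect, then query by username.
mongoose.connect('mongodb://127.0.0.1/snowflake');  // assumed local connection string
User.findOne({ username: 'barton' }, function (err, user) {
  if (err) { return console.error(err); }
  console.log(user);
});
```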
Redis is fantastic for key/value pair access. We're using it here for "Black Listing JSON Web Tokens". You can read about this concept here: https://auth0.com/blog/2015/03/10/blacklist-json-web-token-api-keys/
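Here is a minimal sketch of that blacklist idea, assuming the callback-style `redis` npm client: on logout the token is stored in Redis, and every authenticated request first checks whether the token is present before proceeding. The key prefix and expiry are illustrative, not the repo's actual values.

```javascript
// Hypothetical Redis blacklist sketch (key prefix and TTL are illustrative).
const redis = require('redis');
const client = redis.createClient();   // defaults to 127.0.0.1:6379

// Called on logout: remember the revoked Session Token.
// A TTL keeps the blacklist from growing forever; it should be at least
// as long as the token's own lifetime.
function blacklistToken(sessionToken, callback) {
  client.setex('blacklist:' + sessionToken, 60 * 60 * 24, 'revoked', callback);
}

// Called at every authenticated entry point before the handler runs.
function isTokenRevoked(sessionToken, callback) {
  client.exists('blacklist:' + sessionToken, function (err, found) {
    if (err) { return callback(err); }
    callback(null, found === 1);   // 1 means the token was blacklisted
  });
}
```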
Swagger provides the API documentation - simply augmenting the endpoints generates a page showing all the API access points.
Shown below is the generated API documentation -
NOTE: you can test the APIs directly from the browser with the forms that Swagger provides!
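For context, enabling this kind of generated documentation in Hapi typically comes down to registering the `hapi-swagger` plugin and tagging routes with `['api']`, as the login route above does. The plugin versions and options below are assumptions, not lifted from the repo.

```javascript
// Hypothetical hapi-swagger registration sketch; plugin versions and
// options vary, so treat this as illustrative rather than the repo's setup.
const Hapi = require('hapi');
const server = new Hapi.Server();
server.connection({ port: 5000 });

server.register([
  require('inert'),    // static file serving, needed by newer hapi-swagger
  require('vision'),   // template rendering, needed by newer hapi-swagger
  {
    register: require('hapi-swagger'),
    options: {
      info: { title: 'Snowflake API', version: '1.0.0' }
    }
  }
], function (err) {
  if (err) { throw err; }
  // Any route whose config.tags include 'api' shows up in the generated docs.
  server.start(function () {
    console.log('Swagger UI available at', server.info.uri + '/documentation');
  });
});
```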
JWT is used in the Authentication as a Session Token. You can read the docs here showing how it's set up.
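At its core that means signing a token at login and verifying it on each request. Below is a minimal sketch with the `jsonwebtoken` package, assuming an HS256 secret held in the config; the secret's name and the token lifetime are hypothetical.

```javascript
// Hypothetical JWT session-token sketch using the jsonwebtoken package.
const JWT = require('jsonwebtoken');
const SECRET = 'replace-with-CONFIG-secret';   // assumed to live in src/config.js

// Issued at login / registration and returned to the client.
function createSessionToken(userId) {
  return JWT.sign({ id: userId }, SECRET, { algorithm: 'HS256', expiresIn: '7d' });
}

// Verified on every authenticated request (after the Redis blacklist check).
function verifySessionToken(sessionToken) {
  try {
    return JWT.verify(sessionToken, SECRET);   // returns the decoded payload
  } catch (err) {
    return null;                               // expired or tampered token
  }
}
```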
Using JMeter allowed me to performance test the API. I created a test suite with JMeter as shown below and debugged the script by running it locally. Once I was satisfied, I changed the `HTTP Request Defaults` and uploaded the script to BlazeMeter for testing.
Shown below is the script defined in JMeter
BlazeMeter was used to perform the tests as it is much better equipped to host the threads than my personal Mac.
<img src="https://i.ytimg.com/s_vi/HKDw5po4TYM/1.jpg?sqp=CKyjprQF&rs=AOn4CLDM4rBr05-tstNYVcwhO09V1WXdNA&time=1451856598959" alt="Running BlazeMeter" width="240" height="180" border="10" />
The following screens show the results of running 50 concurrent users performing the following actions with a 1 second delay between each action:
Original Test Configuration
Overview
Timeline Report
Load Report
Aggregate Report
Monitoring Report
Below are the instructions for setting up the server on your local machine. These instructions work fine on the Mac - no promise is made for other OSs.
You may need to "Allow Less Secure Apps" in your Gmail account (it's all the way at the bottom). You also may need to "Allow access to your Google account".
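That Gmail setting matters because the verification and password-reset emails are sent through a Gmail account. As an illustration only, here is how sending such a mail might look with `nodemailer` and a Gmail transport; the repo may use a different mail library or transport, and the credentials and link format are placeholders.

```javascript
// Hypothetical verification-email sketch using nodemailer with Gmail.
// Credentials and the link format are placeholders, not the repo's values.
const nodemailer = require('nodemailer');

const transporter = nodemailer.createTransport({
  service: 'Gmail',
  auth: {
    user: 'your.account@gmail.com',   // the account with "Less Secure Apps" allowed
    pass: 'your-gmail-password'
  }
});

function sendVerificationEmail(toAddress, token, callback) {
  transporter.sendMail({
    from: 'your.account@gmail.com',
    to: toAddress,
    subject: 'Please verify your email',
    text: 'Click to verify your email: http://127.0.0.1:5000/account/verifyEmail/' + token
  }, callback);
}
```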
Install MongoDB: https://www.mongodb.org/downloads#production
sudo mongod
Install Redis
cd redis-2.8.24
make
cd src/
./redis-server
Update the ip in the config file with the IP address from `ifconfig`
Example:
hapi: {
    port: 5000,
    ip: '127.0.0.1'
}
Update Snowflake `src/lib/config.js` with the same IP from the step above
Example:
HAPI: {
    local: {
        url: 'http://127.0.0.1:5000'
    },
    remote: {
        url: 'enter your remote url here'
    }
}
npm start
Watch the following video to see all the steps to install the Hapi Node server to OpenShift in action:
<img src="https://i.ytimg.com/s_vi/Js4Kvd9gG6E/2.jpg?sqp=CMzbprQF&rs=AOn4CLDZzm3zHrS-YdAJCi4a-yHXv73NyQ" alt="Snowflake Hot Loading" width="240" height="180" border="10" />
Create an account at http://openshift.redhat.com/
Install the command line tool rhc
Create a namespace, if you haven't already done so via the web interface
rhc domain create <yournamespace>
Create the application. The `-s` option is for "Scaling":
rhc app-create mysnowflake nodejs-0.10 mongodb-2.4 -s
Note that if you get an error during this step, it most likely has to do with copying the OpenShift Git repository to your local system. What you can do is go to your OpenShift account and use the link it provides for the Source Code. Just git clone xxx, where xxx is the link you copied from OpenShift.
This next command will load the Redis cartridge
rhc add-cartridge \
http://cartreflect-claytondev.rhcloud.com/reflect?github=transformatordesign/openshift-redis-cart \
-a mysnowflake
cd mysnowflake
git remote add upstream -m master git://github.com/bartonhammond/snowflake-hapi-openshift.git
git pull -s recursive -X theirs upstream master
cp src/config.sample.js src/config.js
git push origin master
That's it!!!
You can now check out your application at: http://mysnowflake-$yournamespace.rhcloud.com
If you want to have this project in your personal GitHub account as well, follow these steps, but be WARNED: unless you use a "Private" repository, your `config.js` will be visible.