You can access the site at http://ca-adpq-prototype.eastus2.cloudapp.azure.com/.
FEi Systems (FEi), a leading Information Technology (IT) services and analysis organization headquartered in Columbia, MD, is pleased to respond to the California Department of Technology (CDT) Request for Interest (RFI) Number CDT-ADPQ-0117: “Pre-Qualified Vendor Pool for Digital Services-Agile Development” for Prototype A.
FEi has successfully implemented Agile software solution development for clients, including federal agencies, as well as over 32 state and county projects. In California, five counties have adopted FEi solutions for Electronic Health Record (EHR) systems and standardized assessment tools. Most recently, FEi joined the qualified vendor pool for Agile Development Services for the Mississippi Department of Child Protection Services (CPS). Additionally, as a testament to organizational maturity, FEi has sustained Capability Maturity Model Integration (CMMI) Level 3 certification for the last five years and is ISO/IEC 20000-1:2011 certified.
FEi’s completed Prototype A can be accessed at http://ca-adpq-prototype.eastus2.cloudapp.azure.com/. The user will be prompted to log in and can use the following usernames and passwords:
Authorized Users:
Administrative Users:
At its core, Prototype A is an application that allows state employees to shop for end-user computer products (e.g., hardware, software, and related services) from pre-established contracts with the State.
There are two types of users for the site:
User Role 1: Authorized User (User)
User Role 2: Administrative User (Admin)
The Home Page was designed so that users can easily browse for products across three categories: Hardware, Software, and Services. We grouped items so that each menu displays products from the corresponding category in the dataset provided in the Prototype A resources file.
After logging in, users can browse each category to select products. Alternatively, users can perform a general keyword search to find the desired product. The Home Page also features an Advanced Search feature, where users can enter more granular search parameters, such as:
The Home Page also includes space to highlight “Featured Products.” We anticipate this portion of the site could contribute to improved efficiency in the acquisition process by streamlining the products that are displayed for the users and expediting the shopping process.
Note: You cannot add items to your cart or compare list unless you are logged in.
Users can add up to four items to a Compare List by selecting the “Compare” box on each item.
A counter next to the Compare List link prominently displays the number of items selected. After the user clicks the Compare List link in the upper right corner of the top navigation bar, the Compare List displays the selected items side by side for comparison.
Users can view the following information about the products:
Users can add items to the shopping cart by clicking the red “Add to Cart” button for each item. Once an item has been added to the cart, the button grays out and its label changes to “Added to Cart.” As with the compare feature, a counter on the Shopping Cart link prominently displays the number of items added to the cart.
Users can view their cart at any time and adjust the quantities of the items they would like to purchase. To remove an item, users can enter a quantity of “0” and click “Update Cart” or click the “Remove” link next to the desired item. The cart provides a subtotal of the order and also allows the user to continue shopping via the “Continue Shopping” button, which returns the user to the Home Page.
From the cart, the user can check out by clicking “Proceed to Checkout.” On the Checkout Information page, the user enters their contact information (Name, Phone Number, and Email Address), enters their Shipping Address, and selects a Payment Account. For the purposes of the prototype, we created payment accounts for four state agencies to which users could attribute their order.
Clicking “Back to Cart” allows the user to make any additional revisions to their cart if necessary. The user clicks “Continue” when they are ready to proceed with the order. From the Review Order page, the user can click “Place Order,” and an Order Confirmation page displays the order number.
Users can view, track, and cancel placed orders from the “My Order History” page. For the purposes of the prototype, an order’s status will change from “Placed” to “Shipped” 15 minutes after the order was placed. The user can cancel the order while it is still in the “Placed” status by first clicking “View Order” from the Order History page and then “Cancel Order” from the Order Details page.
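The status behavior described above can be illustrated with a small sketch. This is a hypothetical simplification for readers of this ReadMe, not code from the prototype; the class and property names are assumed.

```csharp
using System;

// Hypothetical sketch of the prototype's time-based status rule: an order
// still in "Placed" status is treated as "Shipped" 15 minutes after it was
// placed, and cancellation is only allowed while it remains "Placed".
public enum OrderStatus { Placed, Shipped, UserCancelled }

public class Order
{
    public DateTime PlacedUtc { get; set; }
    public OrderStatus Status { get; set; } = OrderStatus.Placed;

    public OrderStatus EffectiveStatus(DateTime nowUtc) =>
        Status == OrderStatus.Placed && nowUtc - PlacedUtc >= TimeSpan.FromMinutes(15)
            ? OrderStatus.Shipped
            : Status;

    public bool TryCancel(DateTime nowUtc)
    {
        if (EffectiveStatus(nowUtc) != OrderStatus.Placed)
            return false;               // too late: already shipped or cancelled
        Status = OrderStatus.UserCancelled;
        return true;
    }
}
```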
By selecting the “Admin” link, admins can perform key functions, including:
To add, edit, or delete any items from the catalog, click “Catalog” from the Admin Home page.
Click “Add Product” from the left menu to add items individually and enter the following information:
Multiple items can be added by clicking “Import Products” from the left menu, clicking “Choose File” to select a .CSV file to upload, and finally clicking “Import CSV Inventory File.” We anticipate this feature could contribute to improved efficiency in maintaining the catalog by reducing the time it takes to upload items. Note that the columns in the file correspond to the columns in the Prototype A resource data set.
Admins can upload images by clicking “Import Images” from the left menu, selecting “Choose Files,” and then finally “Import Images.”
To edit or delete a catalog item, select “Catalog” from the Admin Product Management page. Then, find the row containing the item to be edited or deleted and click either “Edit” or “Delete” from the Options column.
To delete multiple items from the catalog, select the checkbox next to each item and click “Delete selected.”
To easily locate catalog items, click the “Search” button from the Admin Product Management page and enter any of the following criteria:
After clicking “Search,” results are displayed in a table format. The user can then edit or delete items from the search results page.
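For readers interested in how this kind of catalog search can be expressed, the following is a minimal sketch only; the property names and matching rules are assumptions, and the prototype’s actual search lives in its service and repository layers on GitHub.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Sketch only: a case-insensitive keyword match across a few catalog fields,
// similar in spirit to the prototype's product search. Property names are assumed.
public class Product
{
    public string Name { get; set; }
    public string Category { get; set; }
    public string Manufacturer { get; set; }
    public decimal Price { get; set; }
}

public static class ProductSearch
{
    public static IEnumerable<Product> ByKeyword(IEnumerable<Product> catalog, string keyword)
    {
        if (string.IsNullOrWhiteSpace(keyword))
            return catalog;

        // Case-insensitive "contains" test, applied to each searchable field.
        Func<string, bool> matches = field =>
            field != null && field.IndexOf(keyword, StringComparison.OrdinalIgnoreCase) >= 0;

        return catalog.Where(p => matches(p.Name) || matches(p.Category) || matches(p.Manufacturer));
    }
}
```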
To view reports on expenditures and purchases, click “Reports” from the Admin Home page.
The reports “Dashboard” is driven by the start and end dates of the desired report period. After entering the start and end dates, click “Run Report” and the following reports display:
The admin can also optionally filter data by selecting the order status (Placed, UserCancelled, and/or Shipped) from the top right of the Dashboard. Be aware that, for the purposes of the prototype, orders in the Placed status transition to the Shipped status 15 minutes after they are placed.
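As a rough, assumed illustration of the date-range and status filtering that drives the dashboard (this reuses the hypothetical Order and OrderStatus types from the earlier sketch and is not the prototype’s reporting code):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Sketch only: select the orders that fall inside the report period and,
// optionally, match the selected statuses (Placed, UserCancelled, Shipped).
public static class ReportFilter
{
    public static IEnumerable<Order> ForPeriod(
        IEnumerable<Order> orders,
        DateTime startUtc,
        DateTime endUtc,
        ISet<OrderStatus> selectedStatuses = null)
    {
        return orders
            .Where(o => o.PlacedUtc >= startUtc && o.PlacedUtc <= endUtc)
            .Where(o => selectedStatuses == null
                        || selectedStatuses.Contains(o.EffectiveStatus(DateTime.UtcNow)));
    }
}
```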
Click on any of the reports in the dashboard for a larger view. Alternatively, the user can click any of the reports from the left menu or “Expand All Charts.” To print a report, right-click any of the reports and select print.
Additionally, the left menu contains the following report functions:
Note: for the purposes of the prototype, the FEi Team developed tools to create test orders so that report data would be visible.
When developing the prototype, FEi used an Agile Scrum process that is detailed in this [diagram](./artifacts/Agile Scrum Process Diagram/Agile_Development_Scrum_Process_Diagram.png).
During the planning stage, we analyzed the RFI requirements and selected Prototype A. Then, we identified the roles and skillsets needed to complete the prototype and assembled a multi-disciplinary team based on the requirements of the project and Agile methodology. We developed a project budget and a schedule with three one-week sprints. Throughout the sprints, the team worked as a single unit with daily stand-up meetings, frequent touchpoints, and demonstrations. More information is provided in the [Sprint Schedule](./artifacts/Sprint Schedule/Sprint_Schedule.pdf).
The overall project team was divided into the following functional teams, each grouping like functions and responsibilities:
Leadership, Product Definition, and Requirements Team – This team consisted of the Product Owner (PO) (Labor Category: Product Manager), Scrum Master (Labor Category: Delivery Manager), and Business Analysts. As the leader of the team, the PO was responsible for defining the scope, prioritizing the work items, and ensuring completeness of the work items. The Scrum Master fostered an Agile team environment by demonstrating Scrum tools and techniques. Additionally, the Scrum Master facilitated the development process by removing any impediments identified by the team members and asking each member what they completed the previous day and what they planned to work on that day. Business Analysts created user stories and acceptance criteria.
Visual and Frontend Development Team – This team consisted of the Technical Architect, Visual Designer, Interaction Designers/User Researchers/Usability Testers, and Frontend Web Developers, and was responsible for designing the entire user experience, information architecture, journey maps, wireframes, and style guides. This team worked closely with the PO and the rest of the Requirements Team and the Backend Development Team to ensure working end-to-end functionality that is robust and satisfies the product requirements.
Backend Development Team – This team consisted of the Backend Web Developer, who performed database design (including augmenting the sample data provided by the State) and created the search functions and other ancillary functions. The Backend Development Team also worked closely with the Requirements Team and the Visual and Frontend Team to ensure working end-to-end functionality that is robust and satisfies the product requirements.
DevOps Team – This team consisted of the DevOps Engineer who was responsible for writing the automation of infrastructure and maintaining it.
The FEi Team collaborated daily, using the Agile Scrum board to assess progress and assign work items. The Continuous Integration (CI) process we established also helped with team interaction and with team (and work) integration, so that issues were resolved in a timely manner.
When bugs were found, the team immediately created issues and assigned them to the right team member, or the assignment was made during the stand-up meeting the next day.
Our team relied on the following artifacts to create the prototype:
Sprint Schedule: After an initial planning sprint, we conducted three one-week sprints. A summary of the sprint schedule and activities is included in the [Sprint Schedule](./artifacts/Sprint Schedule/Sprint_Schedule.pdf).
Data Set: The catalog product [data set](./artifacts/Data Set/Data Set_ADPQ_v5.csv) was based on the Prototype A data set provided by the State.
Images: Images linked to this ReadMe file are located in the [Images folder](./artifacts/Images/).
Section 508 Compliance Scorecard: The prototype was tested using the WAVE web accessibility tool and the JAWS 16 screen reader. During initial testing, WAVE reported some errors, as did keyboard navigation and selection testing. No issues were reported when testing with the JAWS 16 screen reader. Results were captured in the Section 508 Compliance Scorecard.
Digital Services Playbook: We followed U.S. Digital Service playbook guidelines. The process is described in greater detail here.
Design Process: Design notes, user testing notes, and wireframes can be found [here](./artifacts/Design Process/).
User Stories and Acceptance Criteria: We expanded the vendor challenge user story to include additional scenarios, and we developed acceptance criteria for each of the [User Stories](./artifacts/User Stories).
Definition of Done: We used a [checklist](./artifacts/Definition of Done Checklist/Definition_of_Done_Checklist.pdf) for the definition of done.
Meeting Pictures: We captured images from one of our [daily standup meetings](./artifacts/Meeting Pictures/).
GitHub: The prototype framework and libraries are included in GitHub: https://github.com/FEISystems/ca-adpq-prototype
Survey Results: In addition to the interviews, we fielded a brief online survey to gather usability feedback. The survey can be found at https://www.surveymonkey.com/r/7DXQ58L. Survey results are shown here.
Test Scripts and Test Cases: Quality Assurance (QA) activities included creating test cases (actors, preconditions, and test steps) and test scripts (actions, expected results, and test results in a Pass/Fail format), which are shown [here](./artifacts/Test Cases/Test_Cases.xlsx).
The code uses a multi-tier approach. The User Interface (UI) Layer consists of AngularJS components that communicate with AngularJS services. These services use Ajax calls to access the Web.API controllers in the controller layer. The controller layer accesses the service layer, which contains the business logic. When needed, the service layer accesses the MySQL database via the repository layer. Inversion of Control (IoC) dependency injection is used throughout the layers, enabling unit testing. An interactive code flow diagram presents more details.
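The layering can be summarized with a simplified, illustrative sketch. The interface and class names below are assumed for this ReadMe and do not correspond exactly to the classes in the repository; it also reuses the hypothetical Product type and ProductSearch helper from the earlier search sketch.

```csharp
using System.Collections.Generic;
using Microsoft.AspNetCore.Mvc;

// Illustrative layering only: a Web.API controller delegates to a service,
// which holds the business logic and delegates data access to a repository.
public interface IProductRepository
{
    IEnumerable<Product> GetAll();          // backed by MySQL in the prototype
}

public interface IProductService
{
    IEnumerable<Product> Search(string keyword);
}

public class ProductService : IProductService
{
    private readonly IProductRepository _repository;
    public ProductService(IProductRepository repository) { _repository = repository; }

    // Business logic lives in the service layer.
    public IEnumerable<Product> Search(string keyword) =>
        ProductSearch.ByKeyword(_repository.GetAll(), keyword);
}

[Route("api/[controller]")]
public class ProductsController : Controller
{
    private readonly IProductService _service;   // supplied via IoC injection
    public ProductsController(IProductService service) { _service = service; }

    // The AngularJS services call endpoints like this one via Ajax.
    [HttpGet("search")]
    public IActionResult Search(string keyword) => Ok(_service.Search(keyword));
}
```

Because each layer depends only on an interface supplied through constructor injection, the service and controller can be unit tested against mocked dependencies, which is what enables the xUnit testing described later in this document.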
In the following sections, we address each of the technical requirements described in the RFI and reference the corresponding GitHub issue number.
GitHub Issue #63
When we assembled the team, the Product Manager, Mr. Terry Boswell, was given the authority and responsibility for the quality of the prototype submitted.
GitHub Issue #64
We assembled a multidisciplinary and collaborative team that included:
GitHub Issue #46
Axure, a rapid prototyping tool, was used to design the application, including the required feature sets. The rapid prototype contained the necessary screens and features with sufficient detail to be tested by users. As a result, we were able to gain a better understanding of how people used the application, including their needs and any pain points encountered. The information gathered from the user testing allowed our team to improve the application in subsequent iterations.
GitHub Issue #61
Journey Maps: Early in the requirements gathering phase, our team of Business Analysts and Interaction Designers/User Researchers/Usability Testers held a series of ideation sessions to gain a better understanding of the users’ needs and requirements. The journey map illustrates the users’ workflow and the tasks they must complete in order to reach their goal. Our team of developers and designers used the [journey map](./artifacts/Design Process/Journey Maps/) as a reference for designing for the users’ needs.
Rapid Prototyping: In order to build an application designed to meet the user needs, our team created fully functional [rapid prototypes](./artifacts/Design Process/Wireframes and Prototyping/) that were tested with users. The feedback collected from the users allowed our team to make improvements to the application early in the design phase, resulting in a better user experience and usability for the user.
User Testing: To validate our designs and ensure that our application design was consistent with user expectations and needs, we tested with users early and often. As a result of our testing, we were able to identify usability issues early and make corrections to the UI where necessary. The user testing allowed our team to better understand the users’ needs and improve the application. We documented the user testing feedback.
GitHub Issue #68
GitHub was used as our source control system. Tickets were associated with commits as they were made: https://github.com/FEISystems/ca-adpq-prototype
GitHub Issue #47
Swagger was used to document the RESTful API. The links to the User Interface and JSON data are included below.
User Interface:
http://ca-adpq-prototype.eastus2.cloudapp.azure.com/swagger/index.html
JSON Data:
http://ca-adpq-prototype.eastus2.cloudapp.azure.com/swagger/v1/swagger.json
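As an assumed illustration of how the Swagger endpoints above are typically produced in an ASP.NET Core project (the prototype’s exact packages and Startup code may differ; see the GitHub repository), Swashbuckle wiring looks roughly like this:

```csharp
using Microsoft.AspNetCore.Builder;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.OpenApi.Models;

// Illustrative only: typical Swashbuckle.AspNetCore wiring that serves the
// Swagger UI at /swagger and the JSON document at /swagger/v1/swagger.json.
public class Startup
{
    public void ConfigureServices(IServiceCollection services)
    {
        services.AddMvc();
        services.AddSwaggerGen(c =>
            c.SwaggerDoc("v1", new OpenApiInfo { Title = "CA ADPQ Prototype API", Version = "v1" }));
    }

    public void Configure(IApplicationBuilder app)
    {
        app.UseSwagger();                  // JSON document
        app.UseSwaggerUI(c =>              // interactive User Interface
            c.SwaggerEndpoint("/swagger/v1/swagger.json", "CA ADPQ Prototype API v1"));
        app.UseMvc();
    }
}
```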
GitHub Issues #48, #120, #121, #122, #123, and #185
Section 508 of the Rehabilitation Act of 1973, as amended (29 U.S.C. § 794 (d)), provides accessibility guidelines for the development, procurement, maintenance, or use of Electronic and Information Technology (EIT). The amendment mandates that federal agencies give employees with disabilities and members of the public access to information that is comparable to the access available to others (Section 508 Law and Related Laws and Policies. (n.d.). Retrieved from https://www.section508.gov/content/learn/laws-and-policies). Based on Section 508 accessibility requirements, the U.S. Access Board established standards and guidelines for how all federal agencies can ensure Section 508 compliance for web-based applications and information, software applications, operating systems, computers, telecommunications, multimedia products, documentation, and more. At FEi, Section 508 compliance is not an afterthought, and our team advocates for users with disabilities at every stage. When validating applications and documentation for Section 508 compliance, we address the concerns of individuals with disabilities by using manual and automated testing techniques to confirm full accessibility and usability.
To ensure compliance without incurring refactoring costs, our team weaves Section 508 requirements into design, development, and testing at the beginning of every project. FEi also carefully considered the design of the prototype to ensure 508 compliance up to WCAG AA level through utilizing the following guidelines and technology:
The prototype was tested using the WAVE web accessibility tool and the JAWS 16 screen reader. During initial testing, WAVE reported some errors, as did keyboard navigation and selection testing. No issues were reported when testing with the JAWS 16 screen reader. Please refer to the 508 Compliance Scorecard for details.
GitHub Issue #49
Our team used the U.S. Web Design Standards UI framework as a starting point for our style guide and pattern library. Our team of designers applied the styles creatively, adding our own variations, and where applicable we updated the style guide to reflect those variations on the U.S. Web Design Standards UI framework.
GitHub Issue #50
We held several rounds of user testing to ensure that we were achieving a high degree of usability in our application. In this effort, we identified and recruited a group of people to participate in our testing. Utilizing the rapid prototype created, we asked our participants to complete a series of tasks. During the testing sessions, our observers collected notes on the users’ interaction with the application, including user feedback, thoughts, recommendations, and pain points. As a result of our testing, our team was able to use the user feedback as a reference for improving the application in subsequent iterations of development.
We documented the user testing feedback [here](./artifacts/Design Process/User Testing/User_Testing_Notes.pdf).
GitHub Issue #51
Our team used Scrum, an Agile approach to iterative application development. Scrum provides a series of best practices and activities for planning and managing iterative development. At the end of each Scrum iteration (known as a Sprint), Scrum provides a review and feedback activity called a sprint demo. During the sprint demo, the work in progress is demonstrated to our stakeholders and PO. The sprint demo allowed our stakeholders and PO to provide feedback and comments and provided a good checkpoint to ensure that the project kept moving in the right direction. The feedback and comments collected were documented as feature/bug backlog items and prioritized for implementation in subsequent sprints.
GitHub Issue #52
Our Frontend Web Developers initially created [wireframes and screen mockups](./artifacts/Design Process/Wireframes and Prototyping/) and discussed the view of web components and layout on different types of devices. Once the wireframes and Axure prototyping were completed, the team developed the interactive code with the selected technologies and frameworks (U.S. Web Design Standards UI and AngularJS) and tested it on different types of devices, such as PCs, phones, and tablets; we also used browser built-in tools to simulate devices. The use of the U.S. Web Design Standards UI framework allowed our frontend code to be fully responsive and tested across multiple browsers and device sizes, as well as to support accessibility needs. The prototype has been tested using the following browsers: Internet Explorer 11, Google Chrome, and Firefox. In addition, the use of responsive web design allows web pages to adapt to the size of the device. FEi tested the prototype using the following devices and operating systems: PC (Windows 7 OS), Surface Pro (Windows 10), iPad (iOS 9), Samsung Galaxy S5, and iPhone 6 (iOS 9).
GitHub Issue #52
FEi’s prototype uses modern, open technologies, and we used Agile (Scrum) processes to manage, design, develop, test, and deploy the prototype. The following standards and guidance are used or referred to in this prototype.
GitHub Issue #54
Azure provides simple CI that can deploy a project from GitHub to Azure App Service. To facilitate quick builds, we deploy the prototype in two locations: internally and in Azure. Two instances of the prototype are hosted internally for QC purposes; these run on our own virtual infrastructure and are both tied into an automated CI/CD pipeline. The first instance always has the latest changes from the development branch, while the second has the latest merges into the stable production branch: the latest and greatest versus the release candidate.
GitHub Issue #55
The FEi Team used xUnit to write unit tests for the ASP.NET Core business logic. Our unit tests can be found at https://github.com/FEISystems/ca-adpq-prototype/tree/master/ca_proto/ca_proto_tests.
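A representative, hypothetical example of the xUnit style used for the business-logic tests is shown below (it reuses the assumed Order type from the earlier sketch; the real tests are in the ca_proto_tests project on GitHub).

```csharp
using System;
using Xunit;

// Hypothetical xUnit Facts illustrating the testing style; not copied from the repository.
public class OrderTests
{
    [Fact]
    public void Order_Can_Be_Cancelled_While_Still_Placed()
    {
        var order = new Order { PlacedUtc = DateTime.UtcNow };

        var cancelled = order.TryCancel(DateTime.UtcNow);

        Assert.True(cancelled);
        Assert.Equal(OrderStatus.UserCancelled, order.Status);
    }

    [Fact]
    public void Order_Cannot_Be_Cancelled_After_It_Ships()
    {
        var order = new Order { PlacedUtc = DateTime.UtcNow.AddMinutes(-20) };

        Assert.False(order.TryCancel(DateTime.UtcNow));
    }
}
```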
Unit Test Runner results are shown below.
GitHub Issue #56
The CI pipeline was set up to be simple and reliable. It uses two free tools that integrate with GitHub: AppVeyor and Docker Hub Automated Builds.
GitHub is configured to use a webhook to notify AppVeyor of a ‘push.’ This event triggers AppVeyor to check out the related branch from GitHub. AppVeyor then builds the ca_proto and ca_proto_service projects and runs the tests located in the ca_proto_tests project. The build is customized and controlled by a file called appveyor.yml, which contains the build instructions for AppVeyor. After the build and tests, AppVeyor pushes the published release back to GitHub.
This in turn triggers another webhook integration to Docker Hub Automated Builds. Docker Hub uses its cloud build service to check out the related branch from GitHub and build a Docker image based on the Dockerfile contained in the branch. Once the Docker image is built, it is placed into its Docker Hub repository.
The final step, deployment, utilizes a Docker container called Watchtower. This tool monitors Docker Hub repositories linked to Docker images in the local Docker Engine repository and periodically checks for newer versions of each image per tag. When it detects a new image, it pulls it, stops the related containers, updates them, and then restarts them.
Project and repository links are listed below:
AppVeyor Project: https://ci.appveyor.com/project/ryan-chadwick-fei/ca-adpq-prototype
Docker Hub Repository: https://hub.docker.com/u/feidevops/
Watchtower Docker Hub Repository: https://hub.docker.com/r/centurylink/watchtower/
Watchtower GitHub Repository: https://github.com/v2tec/watchtower
GitHub Issue #57
Configuration management has been applied to the CI/CD pipeline, host provisioning, and a few other application points. There are several configuration items to note, all of which are stored in GitHub, giving them version control and history.
Build Process configuration management can be found in AppVeyor and Docker build files. These items are used directly by AppVeyor and Docker Hub when they check out the GitHub repository. Configuration items for this process are:
appveyor.yml – this configuration file controls the details of each build. It is used by AppVeyor to define build environment and execution scripts.
build_aspdotnet_core_latest.ps1 – this script is defined for use in the appveyor.yml configuration for the development branch
build_aspdotnet_core_stable.ps1 – this script is defined for use in the appveyor.yml configuration for the QC branch
build_aspdotnet_core_production.ps1 – this script is defined for use in the appveyor.yml configuration for the release branch
devops/Dockerfiles/Dockerfile-latest – this Dockerfile denotes how Docker Hub Automated Builds should process the development branch output
devops/Dockerfiles/Dockerfile-stable – this Dockerfile denotes how Docker Hub Automated Builds should process the QC branch output
devops/Dockerfiles/Dockerfile-production – this Dockerfile denotes how Docker Hub Automated Builds should process the release branch output
Since the prototype is hosted both internally and in Azure, separate scripts had to be created for each. SCVMM 2016 PowerShell scripts include the ability to recreate hosting internally on FEi Systems infrastructure. The Azure Resource Group template is used to provision the production host in the cloud. Configuration items for this process can be found in:
Some portions of the monitoring configuration, specifically the Prometheus exporter scrape jobs, can be found in:
GitHub Issue #58
Continuous monitoring has been implemented using Prometheus, a metrics aggregation and analytics server. Prometheus also supports alerting, but this was not implemented due to time constraints.
The Prometheus server gathers metrics from multiple locations. Each integration point is called an Exporter. We deployed four of these Exporters into our Production environment to monitor the application at different layers, including:
In order to create meaningful dashboards, we coupled Prometheus with Grafana. This tool allows for customized dashboards based on the metrics gathered by Prometheus, as shown below.
Host Monitoring Dashboard
![System Monitoring Host](./artifacts/Images/Grafana Host Dashboard.png?raw=true "System Monitoring Host")
Docker Monitoring Dashboard
![System Monitoring Docker](./artifacts/Images/Grafana Docker Dashboard.png?raw=true "System Monitoring Docker")
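In addition to the off-the-shelf Exporters, application-level metrics can be exposed for Prometheus to scrape. The sketch below uses the prometheus-net library purely as an assumed example; it is not part of the deployed prototype.

```csharp
using Prometheus;

// Assumed illustration only (the prototype relies on off-the-shelf Exporters):
// expose a custom counter on an HTTP /metrics endpoint that the Prometheus
// server could scrape alongside the other Exporters.
public static class MetricsBootstrap
{
    private static readonly Counter OrdersPlaced =
        Metrics.CreateCounter("ca_proto_orders_placed_total", "Number of orders placed.");

    // Starts a standalone metrics endpoint on the assumed port 9091.
    public static void Start() => new MetricServer(port: 9091).Start();

    public static void RecordOrderPlaced() => OrdersPlaced.Inc();
}
```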
Links for the tools in our Production environment and login information are below:
DockerDash
http://ca-adpq-prototype.eastus2.cloudapp.azure.com:5050
admin/Letmein1!
Prometheus
http://ca-adpq-prototype.eastus2.cloudapp.azure.com:9090/status
Grafana
http://ca-adpq-prototype.eastus2.cloudapp.azure.com:3000/
monitor/Letmein1! (dashboard viewing only)
GitHub Issue #59
FEi deployed our containers to the Docker Hub repository below.
https://hub.docker.com/u/feidevops/
The environment build guides for QC and Production can be found in Docker Environment Builds.md.
GitHub Issue #167
The following steps were used to set up a development environment and to install and run this prototype on another machine.
Docker Version
Or, the other way: set up the tools and development environment manually as described below.
Setup Tools
Note: Depending on your OS, the tools above may have different downloads and installation instructions.
Configure DEV Environment
Download or clone the source code from GitHub, https://github.com/FEISystems/ca-adpq-prototype:
git clone https://github.com/FEISystems/ca-adpq-prototype.git
Configure Visual Studio Code
On the left side of VS Code, from the menu options, select the bottom icon, which represents Extensions. To run the prototype in VS Code, the C# extension must be installed.
Once that extension is installed, go to the next menu icon up (Debug), and click on the Gear at the top (the tooltip will say “Configure or Fix launch.json”).
Select “.NET Core”, and VS Code will create a launch.json file. Once that is done, click the green “play” button. VS Code will show the following:
Select “Configure Task Runner,” then select “.NET Core” from the option list that follows. VS Code will create a “tasks.json” file. The tasks.json file needs to be modified to pass the location of the “project.json” file; modify the “args” property underneath the “tasks” property as follows:
Once this change is made, switch back to the “launch.json” file. Under the entry for “.NET Core Launch (web)”, change the “program” and “cwd” attributes as follows:
The dependencies will then need to be restored. Go to the View | Integrated Terminal menu option, which should open a command prompt at the root of the repository. Change into the ca_proto directory and run “dotnet restore”:
Once this is done, go back to the Debug tab on the left, make sure that “.NET Core Launch (web)” is selected, and hit the play button.
GitHub Issue #60
The following platforms were used to create and run the prototype (these platforms are openly licensed and free of charge):