hygieia / ExecDashboard

Exec Dashboard Documentation
https://hygieia.github.io/ExecDashboard/
Apache License 2.0

Not able to get the Portfolio in executive dashboard #19

Open thedevopsguru opened 6 years ago

thedevopsguru commented 6 years ago

I am not able to get the Portfolio list in the executive dashboard; the list is empty. Please suggest what to check. The documentation for it is not very good.

adityai commented 5 years ago

@thedevopsguru - Same issue here.


adityai commented 5 years ago

@thedevopsguru - I think this folder has stuff that might populate the MongoDB collections, but I am not sure yet because all I see is a db user creation step in the js file.

https://github.com/Hygieia/ExecDashboard/tree/master/exec-db

If I figure it out, I will update this post.

rajkumaradass commented 5 years ago

@adityai, were you able to find out how to configure/get the portfolios?

adityai commented 5 years ago

@rajkumaradass - No, I haven't been able to work out the portfolios. I was planning to go through the code to figure it out, but my team decided not to use Hygieia because it lacks the features we want, as well as documentation and support.

rajkumaradass commented 5 years ago

@thedevopsguru, By chance were you able to figure this out?

Any input on this from the Hygieia dashboard team is much appreciated, as we are stuck due to the lack of documentation on this.

Thanks!

adityai commented 5 years ago

Hello, there seems to be a lot of interest in getting ExecDashboard working. I will put some time into it this week and post a comment.

My team played with Hygieia (not the ExecDashboard) for over a month. The features available are not that great. We were interested in it because we were planning to build our own features on top of Hygieia. We compared our requirements with the features already available in Hygieia; except for the data gathering/storing and displaying part, it did not meet any other requirements. We decided to switch to Splunk, which is excellent for data collection. I know there is a $ value that goes along with it, but it is totally worth anyone's time to at least evaluate it.

I have previously looked at Elasticsearch, Logstash and Kibana (ELK Stack). It is free, but there is a steep learning curve. I have some instructions here with some very old versions of Jenkins and ELK: http://www.iaditya.com/search?q=jenkins+analytics (2 posts - the second one is a simpler docker container based setup - it may not work now because the Logstash plugin has changed a little bit.)

It worked beautifully, but using the tools to display my dashboards was not intuitive. If I had the time or bandwidth, I'd put a lot of effort into the ELK stack and use that instead of Splunk. Additionally, please look into the following (these options do need a lot of coding, but most of it is like a template, and it is easy to work with and build new dashboards and widgets):

dashing.io - Love it!

metricio - Node.js version of dashing.io - still trying to get it to work, but it looks very promising and is quite fast.

I hope this helps. 

I am a dashboard fanatic. If you have other questions or comments about dashboarding, please tweet at me or look me up on LinkedIn and send me a message. It will be my pleasure to discuss anything about dashboarding.

My twitter handle: @adityainapurapu  https://www.linkedin.com/in/adityai/


rajeshak1971 commented 5 years ago

@adityai, @rajkumaradass @thedevopsguru - Could you please contact me at rajesh.dhanaraj@capitalone.com. I am from the Hygieia team and would like to understand your requirements and see how we can help.

adityai commented 5 years ago

@rajeshak1971 - my latest comments were about Hygieia (not ExecDashboard). So far, the primary issue we have with ExecDashboard is: how do we get the dashboard to display anything? How do we get data into the ExecDashboard db? The main page shows an 'Executive' level dashboard with a 'Select a portfolio' control with nothing in it. We cannot figure out exactly what data it needs. We can try to review the code and work out the data model, but that's not ideal. I'd rather have some documentation about how to get that data into the db, or sample data shipped with the code that can help us determine something.

I'll email you later tonight or tomorrow after I re-create my Hygieia and ExecDashboard docker containers.

rajeshak1971 commented 5 years ago

@adityai - The Hygieia Executive dashboard aggregates information from across Hygieia team instance(s) and needs Hygieia Team/Product instance(s) running. Are your Hygieia team instance(s) stood up with key widgets configured, such as feature, code repository, build, and deploy?

rajkumaradass commented 5 years ago

@rajeshak1971, in the Hygieia dashboard I have created a team dashboard with a build widget, a code repo widget and a custom widget, all configured properly and showing data. Then there is a product dashboard comprising the previously created team dashboard.

Note: Before creating a product dashboard, a team dashboard with at least the code repo and build widgets must be configured, per the error screenshot. Could you please confirm whether any other widgets are required for it to be synced to the executive dashboard? It's also unclear how the portfolio/portfolio names are created in the executive dashboard from this data.


Thanks!

rajkumaradass commented 5 years ago

Also, while running the exec-analysis jar, I see the exception below (cannot resolve 'validConfigItem' given input columns: []; line 1 pos 176). How do we resolve this error?

2018-11-27 11:50:09.916 INFO 30849 --- [taskScheduler-1] org.apache.spark.SparkContext : Created broadcast 7 from broadcast at MongoSpark.scala:536
2018-11-27 11:50:10.012 INFO 30849 --- [taskScheduler-1] o.a.spark.sql.execution.SparkSqlParser : Parsing command: dashboards
2018-11-27 11:50:10.016 INFO 30849 --- [taskScheduler-1] o.a.spark.sql.execution.SparkSqlParser : Parsing command: SELECT _id as productId, configurationItem as productName, environments, components, businessOwner, ownerDept, appServiceOwner, supportOwner, developmentOwner FROM cmdb where (validConfigItem = 1) and (businessOwner is not null) and (itemType = 'app')
2018-11-27 11:50:10.148 ERROR 30849 --- [taskScheduler-1] o.s.s.s.TaskUtils$LoggingErrorHandler : Unexpected error occurred in scheduled task.

org.apache.spark.sql.AnalysisException: cannot resolve 'validConfigItem' given input columns: []; line 1 pos 176;
'Project ['_id AS productId#32, 'configurationItem AS productName#33, 'environments, 'components, 'businessOwner, 'ownerDept, 'appServiceOwner, 'supportOwner, 'developmentOwner]
+- 'Filter ((('validConfigItem = 1) && isnotnull('businessOwner)) && ('itemType = app))
   +- SubqueryAlias cmdb
      +- Relation[] MongoRelation(MongoRDD[7] at RDD at MongoRDD.scala:52,Some(StructType()))

    at org.apache.spark.sql.catalyst.analysis.package$AnalysisErrorAt.failAnalysis(package.scala:42) ~[spark-catalyst_2.11-2.2.0.jar!/:2.2.0]
    at org.apache.spark.sql.catalyst.analysis.CheckAnalysis$$anonfun$checkAnalysis$1$$anonfun$apply$2.applyOrElse(CheckAnalysis.scala:88) ~[spark-catalyst_2.11-2.2.0.jar!/:2.2.0]
    at org.apache.spark.sql.catalyst.analysis.CheckAnalysis$$anonfun$checkAnalysis$1$$anonfun$apply$2.applyOrElse(CheckAnalysis.scala:85) ~[spark-catalyst_2.11-2.2.0.jar!/:2.2.0]
    at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$transformUp$1.apply(TreeNode.scala:289) ~[spark-catalyst_2.11-2.2.0.jar!/:2.2.0]
    at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$transformUp$1.apply(TreeNode.scala:289) ~[spark-catalyst_2.11-2.2.0.jar!/:2.2.0]
    at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:70) ~[spark-catalyst_2.11-2.2.0.jar!/:2.2.0]
    at org.apache.spark.sql.catalyst.trees.TreeNode.transformUp(TreeNode.scala:288) ~[spark-catalyst_2.11-2.2.0.jar!/:2.2.0]
    at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$3.apply(TreeNode.scala:286) ~[spark-catalyst_2.11-2.2.0.jar!/:2.2.0]
    at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$3.apply(TreeNode.scala:286) ~[spark-catalyst_2.11-2.2.0.jar!/:2.2.0]
    at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$4.apply(TreeNode.scala:306) ~[spark-catalyst_2.11-2.2.0.jar!/:2.2.0]
    at org.apache.spark.sql.catalyst.trees.TreeNode.mapProductIterator(TreeNode.scala:187) ~[spark-catalyst_2.11-2.2.0.jar!/:2.2.0]
    at org.apache.spark.sql.catalyst.trees.TreeNode.mapChildren(TreeNode.scala:304) ~[spark-catalyst_2.11-2.2.0.jar!/:2.2.0]
    at org.apache.spark.sql.catalyst.trees.TreeNode.transformUp(TreeNode.scala:286) ~[spark-catalyst_2.11-2.2.0.jar!/:2.2.0]
    at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$3.apply(TreeNode.scala:286) ~[spark-catalyst_2.11-2.2.0.jar!/:2.2.0]
    at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$3.apply(TreeNode.scala:286) ~[spark-catalyst_2.11-2.2.0.jar!/:2.2.0]
    at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$4.apply(TreeNode.scala:306) ~[spark-catalyst_2.11-2.2.0.jar!/:2.2.0]
    at org.apache.spark.sql.catalyst.trees.TreeNode.mapProductIterator(TreeNode.scala:187) ~[spark-catalyst_2.11-2.2.0.jar!/:2.2.0]
    at org.apache.spark.sql.catalyst.trees.TreeNode.mapChildren(TreeNode.scala:304) ~[spark-catalyst_2.11-2.2.0.jar!/:2.2.0]
    at org.apache.spark.sql.catalyst.trees.TreeNode.transformUp(TreeNode.scala:286) ~[spark-catalyst_2.11-2.2.0.jar!/:2.2.0]
    at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$3.apply(TreeNode.scala:286) ~[spark-catalyst_2.11-2.2.0.jar!/:2.2.0]
    at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$3.apply(TreeNode.scala:286) ~[spark-catalyst_2.11-2.2.0.jar!/:2.2.0]
    at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$4.apply(TreeNode.scala:306) ~[spark-catalyst_2.11-2.2.0.jar!/:2.2.0]
    at org.apache.spark.sql.catalyst.trees.TreeNode.mapProductIterator(TreeNode.scala:187) ~[spark-catalyst_2.11-2.2.0.jar!/:2.2.0]
    at org.apache.spark.sql.catalyst.trees.TreeNode.mapChildren(TreeNode.scala:304) ~[spark-catalyst_2.11-2.2.0.jar!/:2.2.0]
    at org.apache.spark.sql.catalyst.trees.TreeNode.transformUp(TreeNode.scala:286) ~[spark-catalyst_2.11-2.2.0.jar!/:2.2.0]
    at org.apache.spark.sql.catalyst.plans.QueryPlan$$anonfun$transformExpressionsUp$1.apply(QueryPlan.scala:268) ~[spark-catalyst_2.11-2.2.0.jar!/:2.2.0]
    at org.apache.spark.sql.catalyst.plans.QueryPlan$$anonfun$transformExpressionsUp$1.apply(QueryPlan.scala:268) ~[spark-catalyst_2.11-2.2.0.jar!/:2.2.0]
    at org.apache.spark.sql.catalyst.plans.QueryPlan.transformExpression$1(QueryPlan.scala:279) ~[spark-catalyst_2.11-2.2.0.jar!/:2.2.0]
    at org.apache.spark.sql.catalyst.plans.QueryPlan.org$apache$spark$sql$catalyst$plans$QueryPlan$$recursiveTransform$1(QueryPlan.scala:289) ~[spark-catalyst_2.11-2.2.0.jar!/:2.2.0]
    at org.apache.spark.sql.catalyst.plans.QueryPlan$$anonfun$6.apply(QueryPlan.scala:298) ~[spark-catalyst_2.11-2.2.0.jar!/:2.2.0]
    at org.apache.spark.sql.catalyst.trees.TreeNode.mapProductIterator(TreeNode.scala:187) ~[spark-catalyst_2.11-2.2.0.jar!/:2.2.0]
    at org.apache.spark.sql.catalyst.plans.QueryPlan.mapExpressions(QueryPlan.scala:298) ~[spark-catalyst_2.11-2.2.0.jar!/:2.2.0]
    at org.apache.spark.sql.catalyst.plans.QueryPlan.transformExpressionsUp(QueryPlan.scala:268) ~[spark-catalyst_2.11-2.2.0.jar!/:2.2.0]
    at org.apache.spark.sql.catalyst.analysis.CheckAnalysis$$anonfun$checkAnalysis$1.apply(CheckAnalysis.scala:85) ~[spark-catalyst_2.11-2.2.0.jar!/:2.2.0]
    at org.apache.spark.sql.catalyst.analysis.CheckAnalysis$$anonfun$checkAnalysis$1.apply(CheckAnalysis.scala:78) ~[spark-catalyst_2.11-2.2.0.jar!/:2.2.0]
    at org.apache.spark.sql.catalyst.trees.TreeNode.foreachUp(TreeNode.scala:127) ~[spark-catalyst_2.11-2.2.0.jar!/:2.2.0]
    at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$foreachUp$1.apply(TreeNode.scala:126) ~[spark-catalyst_2.11-2.2.0.jar!/:2.2.0]
    at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$foreachUp$1.apply(TreeNode.scala:126) ~[spark-catalyst_2.11-2.2.0.jar!/:2.2.0]
    at scala.collection.immutable.List.foreach(List.scala:381) ~[scala-library-2.11.8.jar!/:na]
    at org.apache.spark.sql.catalyst.trees.TreeNode.foreachUp(TreeNode.scala:126) ~[spark-catalyst_2.11-2.2.0.jar!/:2.2.0]
    at org.apache.spark.sql.catalyst.analysis.CheckAnalysis$class.checkAnalysis(CheckAnalysis.scala:78) ~[spark-catalyst_2.11-2.2.0.jar!/:2.2.0]
    at org.apache.spark.sql.catalyst.analysis.Analyzer.checkAnalysis(Analyzer.scala:91) ~[spark-catalyst_2.11-2.2.0.jar!/:2.2.0]
    at org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:52) ~[spark-sql_2.11-2.2.0.jar!/:2.2.0]
    at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:66) ~[spark-sql_2.11-2.2.0.jar!/:2.2.0]
    at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:623) ~[spark-sql_2.11-2.2.0.jar!/:2.2.0]
    at com.capitalone.dashboard.exec.collector.PortfolioCollector.collectCMDB(PortfolioCollector.java:120) ~[classes!/:1.0.0-SNAPSHOT]
    at com.capitalone.dashboard.exec.collector.PortfolioCollector.collect(PortfolioCollector.java:93) ~[classes!/:1.0.0-SNAPSHOT]
    at com.capitalone.dashboard.exec.collector.PortfolioCollector.run(PortfolioCollector.java:301) ~[classes!/:1.0.0-SNAPSHOT]
    at org.springframework.scheduling.support.DelegatingErrorHandlingRunnable.run(DelegatingErrorHandlingRunnable.java:54) ~[spring-context-5.0.2.RELEASE.jar!/:5.0.2.RELEASE]
    at org.springframework.scheduling.concurrent.ReschedulingRunnable.run(ReschedulingRunnable.java:93) [spring-context-5.0.2.RELEASE.jar!/:5.0.2.RELEASE]
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [na:1.8.0_191]
    at java.util.concurrent.FutureTask.run(FutureTask.java:266) [na:1.8.0_191]
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180) [na:1.8.0_191]
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293) [na:1.8.0_191]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [na:1.8.0_191]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [na:1.8.0_191]
    at java.lang.Thread.run(Thread.java:748) [na:1.8.0_191]
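For what it's worth, the AnalysisException above is the usual Spark symptom of an empty source: with no documents in the cmdb collection, the Mongo connector infers an empty schema (note the Relation[] and StructType() in the plan), so every column the query references is unresolvable. A plain-Python illustration of the check that fails (a hypothetical helper, not Spark's actual code):

```python
def missing_columns(referenced, available):
    """Return the referenced columns that are absent from the inferred schema."""
    available = set(available)
    return [col for col in referenced if col not in available]

# Columns referenced by the PortfolioCollector query in the log above.
query_columns = ["validConfigItem", "businessOwner", "itemType"]

# An empty cmdb collection yields an empty inferred schema, so every
# referenced column is "missing" and analysis fails on the first one.
print(missing_columns(query_columns, []))
# With at least one document present, the schema contains the fields
# and nothing is missing.
print(missing_columns(query_columns, query_columns))  # []
```

So the first thing to verify when this error appears is that the cmdb collection in the Hygieia database actually contains documents.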
PnutBUTTrWolf commented 5 years ago

Has this been resolved? I am facing the same issue. Hygieia is functioning correctly, but I am not able to load any portfolios in the executive dashboard.

Sbrenthughes commented 5 years ago

For the exec dashboard to work, you need an integration with a tool such as HP Service Manager or ServiceNow. We offer a collector for HP Service Manager, but not for ServiceNow as of yet.

For testing you can create a collection called cmdb in your Hygieia 2.0 DB. From there you will need to manually insert two records. I will post a copy of example data in just a moment.

Sbrenthughes commented 5 years ago

The Business Service

{
    "timestamp" : 0,
    "configurationItem" : "HygieiaApplication",
    "configurationItemSubType" : "ciSubType",
    "configurationItemType" : "ciType",
    "assignmentGroup" : "",
    "appServiceOwner" : "John Doe",
    "businessOwner" : "John Doe",
    "supportOwner" : "Jane Doe",
    "developmentOwner" : "Jane Doe",
    "ownerDept" : "Department",
    "commonName" : "hygieia",
    "itemType" : "app",
    "validConfigItem" : true,
    "environments" : null,
    "components" : [ 
        "HygieiaComponent"
    ]
}

and the Business Application

{
    "timestamp" : 0,
    "configurationItem" : "HygieiaComponent",
    "configurationItemSubType" : "ciSubType",
    "configurationItemType" : "ciType",
    "assignmentGroup" : "",
    "appServiceOwner" : "",
    "businessOwner" : "",
    "developmentOwner" : "",
    "ownerDept" : "Department",
    "ownerSubDept" : "",
    "commonName" : "componentHygieia",
    "itemType" : "component",
    "validConfigItem" : true
}

Once the above two are added to the cmdb collection in the Hygieia 2.0 db, you can do the following:

1) Navigate to the Hygieia 2.0 main page.
2) For a new dashboard (or an existing one), search for the above two items.
3) For this dashboard, make sure the SCM widget is configured.
4) Once this is done, you can run the collector for the Exec dashboard to collect the data.
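As a quick sanity check for the two records, the conditions the portfolio query filters on (per the collector SQL quoted elsewhere in this thread) can be sketched in Python. This is an illustration with a hypothetical helper, not ExecDashboard code; only the relevant fields from the example documents above are kept:

```python
# Trimmed versions of the two example cmdb documents above.
business_service = {
    "configurationItem": "HygieiaApplication",
    "itemType": "app",
    "validConfigItem": True,
    "businessOwner": "John Doe",
    "components": ["HygieiaComponent"],
}
business_application = {
    "configurationItem": "HygieiaComponent",
    "itemType": "component",
    "validConfigItem": True,
    "businessOwner": "",
}

def is_portfolio_candidate(doc):
    """True when a cmdb record can surface as a product in a portfolio:
    a valid config item of type 'app' with a non-empty businessOwner."""
    return (bool(doc.get("validConfigItem"))
            and doc.get("itemType") == "app"
            and bool(doc.get("businessOwner")))

print(is_portfolio_candidate(business_service))      # True
print(is_portfolio_candidate(business_application))  # False
```

In other words, if the Business Service record is missing a businessOwner or has itemType other than 'app', the exec collector has nothing to build a portfolio from.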

Sbrenthughes commented 5 years ago

We will be creating some APIs for inserting the above data.

rajeshak1971 commented 5 years ago

Hygieia has three views: 1. team, 2. product and 3. portfolio. The team view provides component-level details, the product view is an aggregation of all the components, and the portfolio view is an aggregation of all products that are assigned to a specific business owner. Examples of components are front end, back end, business services, etc., and these could be different for various organizations.

The reason why HPSM, ServiceNow or similar products are used in this context is to provide a central product catalog that identifies the product name, the service owner, business owner, support owner, development owner and related information.

The executive dashboard automatically associates multiple products to an executive based on the relationship between a product (ASV) and a business owner (this info ideally comes from the CMDB). When we refer to an executive portfolio, we mean one or more products that are assigned to an executive (business owner) who has accountability for those product(s). Therefore, when you create the dummy data in the CMDB (as outlined by Sbrenthughes), you may want to massage the data in such a way that you have one or many products assigned to a specific business owner. We are not recommending this approach, but it may help you with standing up the system for evaluation or proof-of-concept purposes sans the CMDB collector.
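The association just described can be sketched in a few lines of Python (an illustration of the grouping rule only, not the actual ExecDashboard code; field names follow the cmdb examples in this thread):

```python
from collections import defaultdict

def build_portfolios(cmdb_docs):
    """Group valid products (itemType 'app') by their business owner.

    One portfolio per executive (business owner), containing every
    product assigned to that owner -- the rule described above.
    """
    portfolios = defaultdict(list)
    for doc in cmdb_docs:
        if (doc.get("itemType") == "app"
                and doc.get("validConfigItem")
                and doc.get("businessOwner")):
            portfolios[doc["businessOwner"]].append(doc["configurationItem"])
    return dict(portfolios)

# Dummy CMDB data: two products for one owner, plus a component
# record that should not form a portfolio of its own.
docs = [
    {"configurationItem": "ProductA", "itemType": "app",
     "validConfigItem": True, "businessOwner": "John Doe"},
    {"configurationItem": "ProductB", "itemType": "app",
     "validConfigItem": True, "businessOwner": "John Doe"},
    {"configurationItem": "ComponentX", "itemType": "component",
     "validConfigItem": True, "businessOwner": ""},
]

print(build_portfolios(docs))  # {'John Doe': ['ProductA', 'ProductB']}
```

Massaging the dummy data so that several 'app' records share one businessOwner should therefore yield one portfolio containing several products.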

Also make sure that the following fields in the dashboards collection have values. Hygieia Exec needs this to map team/product dashboards to an ASV/product.

configurationItemBusAppName ---- Example: BAPFORASV
configurationItemBusServName ---- Example: ASVMYPRODUCT
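Put together, a document in the dashboards collection might carry the mapping like this (a sketch: only the two mapping fields are shown, other dashboard fields are omitted and the title value is illustrative):

```python
# Hypothetical fragment of a `dashboards` collection document.
dashboard_doc = {
    "title": "my-team-dashboard",                     # illustrative
    "configurationItemBusServName": "ASVMYPRODUCT",   # Business Service / ASV
    "configurationItemBusAppName": "BAPFORASV",       # Business Application
}

# Both mapping fields must be non-empty for Hygieia Exec to tie this
# team/product dashboard to an ASV/product.
assert all(dashboard_doc.get(f) for f in
           ("configurationItemBusServName", "configurationItemBusAppName"))
print("mapping fields present")
```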

If you are using Service Now reach out to me or anyone from the Hygieia team.

Also example field values in the CMDB collection are as follows.

If someone from the community can write an Excel import (CMDB info) for non-HPSM/ServiceNow Hygieia customers and contribute it back to the community, that will be awesome.

rikameajay1 commented 5 years ago

Did anyone get the solution? I'm facing a similar issue.

rikameajay1 commented 5 years ago

Resolved. Make sure to add server.contextPath=/api in the api.properties file

Sbrenthughes commented 5 years ago

@adityai are you needing any further assistance?

adityai commented 5 years ago

Yes. I need further assistance to get this thing running and showing data. Please publish some documentation, a quick start, or a script that will start the whole thing with some data displayed on the board.


Sbrenthughes commented 5 years ago

Where are you stuck specifically?

adityai commented 5 years ago

I start the docker container and am able to view the web UI, but I don't know what to do next. How do I get data into it so it displays something?

Thank you,

Aditya Inapurapu.


Sbrenthughes commented 5 years ago

https://hygieia.github.io/ExecDashboard/EXECAPI_Setup.html - API setup https://hygieia.github.io/ExecDashboard/EXECCollectors_Setup.html - collector setup

kevin-estupinan commented 3 years ago

Following the steps in the Hygieia Executive documentation (https://hygieia.github.io/ExecDashboard/Introduction.html), it wasn't possible to get data from a Hygieia instance set up per its documentation (https://hygieia.github.io/Hygieia/getting_started.html). When I display the Hygieia Executive view, I get the UI without any portfolio or member, even though I created a "cmdb" collection with both records suggested by @Sbrenthughes.

My Hygieia Engineer view is up (UI, API, and the different collectors are ON). My Hygieia Exec view is up (UI, API, and the different collectors are ON). My MongoDB is up with both DBs (HygEng and HygExec).

Here is the config for every component; by the way, I'm replicating it locally.

1) MongoDB:

HygieiaEng DB (dashboarddb)

cmdb => Business Service and Business Application

{
    "_id" : ObjectId("604657e8995806a836c5aec5"),
    "timestamp" : 0,
    "configurationItem" : "HygieiaApplication",
    "configurationItemSubType" : "ciSubType",
    "configurationItemType" : "ciType",
    "configurationItemBusAppName" : "BAPFORASV",
    "configurationItemBusServName" : "ASVMYPRODUCT",
    "assignmentGroup" : "",
    "appServiceOwner" : "John Doe",
    "businessOwner" : "John Doe",
    "supportOwner" : "Jane Doe",
    "developmentOwner" : "Jane Doe",
    "ownerDept" : "Department",
    "commonName" : "hygieia",
    "itemType" : "app",
    "validConfigItem" : true,
    "environments" : null,
    "components" : [
        "HygieiaComponent"
    ]
}
{
    "_id" : ObjectId("604657f2995806a836c5aec6"),
    "timestamp" : 0,
    "configurationItem" : "HygieiaComponent",
    "configurationItemSubType" : "ciSubType",
    "configurationItemBusAppName" : "BAPFORASV",
    "configurationItemBusServName" : "ASVMYPRODUCT",
    "configurationItemType" : "ciType",
    "assignmentGroup" : "",
    "appServiceOwner" : "",
    "businessOwner" : "",
    "developmentOwner" : "",
    "ownerDept" : "Department",
    "ownerSubDept" : "",
    "commonName" : "componentHygieia",
    "itemType" : "component",
    "validConfigItem" : true
}

2) Hygieia Engineer View - application.properties

API:

dbname=dashboarddb
dbusername=dashboarduser
dbpassword=dbpassword
dbhost=localhost
dbport=27017
dbreplicaset=false
server.port=8080
server.contextPath=/api
logRequest=false
logSplunkRequest=false
corsEnabled=false
version.number=0.0.1
pageSize=10

hygieia-build-jenkins-collector:

auth.expirationTime=1200000
dbname=dashboarddb
dbhost=localhost
dbport=27017
dbreplicaset=false
dbusername=dashboarduser
dbpassword=dbpassword
jenkins.cron=0/5 * * * * *
jenkins.pageSize=1000
jenkins.folderDepth=10
jenkins.servers[0]=http://localhost:9595
jenkins.servers[1]=http://jenkins.usernames[0]:jenkins.apiKeys[0]@localhost:9595
jenkins.usernames[0]=**********
jenkins.apiKeys[0]=***********
jenkins.saveLog=true
jenkins.searchFields[0]= uaty
jenkins.searchFields[1]= dev
jenkins.connectTimeout=20000
jenkins.readTimeout=20000
server.port=8085

hygieia-scm-gitlab-collector:

dbname=dashboarddb
dbhost=localhost
dbport=27017
dbreplicaset=false
dbusername=dashboarduser
dbpassword=dbpassword
logging.file=./logs/gitlab.log
gitlab.cron=0 */1 * * * *
gitlab.host=gitlab.com
gitlab.protocol=http
gitlab.selfSignedCertificate=false
gitlab.apiToken=****************
gitlab.commitThresholdDays=15
gitlab.key=******************

3) Hygieia Exec View - application.properties

exec-api:

dbname=analyticsdb
dbusername=dashboarduser
dbpassword=dbpassword
dbhost=localhost
dbport=27017
dbreplicaset=false
server.port=8090
server.contextPath=/api
logRequest=false
logSplunkRequest=false
corsEnabled=false
version.number=0.0.1
pageSize=10

exec-analysis:

# MongoDB Details

dbname=analyticsdb
dbusername=dashboarduser
dbpassword=dbpassword
dbhost=localhost
dbport=27017

logging.file=./logs/gitlab.log

portfolio.cron=0 */5 * * * *
portfolio.readUriUserName=dashboarduser
portfolio.readUriPassword=dbpassword
portfolio.readUriDatabase=localhost:27017
portfolio.readUriPrefix=mongodb
portfolio.readDatabase=dashboarddb

portfolio.codeAnalysisCollectorFlag=true
portfolio.scmCollectorFlag=true
portfolio.incidentsCollectorFlag=true
portfolio.libraryPolicyCollectorFlag=true
portfolio.staticCodeAnalysisCollectorFlag=true
portfolio.unitTestCoverageCollectorFlag=true
portfolio.auditResultCollectorFlag=true
portfolio.securityCollectorFlag=true
portfolio.performanceCollectorFlag=true
server.port=8081

exec-ui: a) proxy.config.json

{
  "/api" : {
    "target": "http://localhost:8090/api",
    "changeOrigin": true,
    "secure": false,
    "logLevel": "debug",
    "pathRewrite": {"^/api": "http://localhost:8090/api"}
  }
}

b) environment.local.ts

export const environment = {
  production: false,
  apiUrl: 'http://localhost:8090/api'
};

c) environment.ts

export const environment = {
  production: false,
  apiUrl: 'http://localhost:8090/api'
};


If any other information is needed in order to solve the issue, please ask me.

prasad-clouduser commented 3 years ago

I have installed the Hygieia Exec Dashboard and see no portfolios in the dashboard; it is completely empty. I have also configured the cmdb collection with some dummy data in the Hygieia db (not the exec dashboard db), but I am still not able to see the portfolio information in the dashboard. I followed the exec dashboard documentation to configure the collector, API and DB, and am not able to figure out the issue.

Do you have any documents/proper steps to configure the exec dashboard to pull the data from the Hygieia db? Please also provide compatibility information for Hygieia and the Hygieia Exec Dashboard. I appreciate your help on this.

Below is the issue I am facing.

2021-08-18 14:30:00.000 INFO 24952 --- [taskScheduler-1] c.c.d.exec.collector.PortfolioCollector : Running Hygieia EXEC Collector
2021-08-18 14:30:00.000 WARN 24952 --- [taskScheduler-1] o.apache.spark.sql.SparkSession$Builder : Using an existing SparkSession; some configuration may not take effect.
2021-08-18 14:30:00.001 INFO 24952 --- [taskScheduler-1] c.c.d.exec.collector.PortfolioCollector : ##### Begin: collectCMDB #####
2021-08-18 14:30:00.002 WARN 24952 --- [taskScheduler-1] o.apache.spark.sql.SparkSession$Builder : Using an existing SparkSession; some configuration may not take effect.
2021-08-18 14:30:00.005 WARN 24952 --- [taskScheduler-1] org.apache.spark.storage.BlockManager : Block broadcast_1 could not be removed as it was not found on disk or in memory
2021-08-18 14:30:00.006 ERROR 24952 --- [taskScheduler-1] o.s.s.s.TaskUtils$LoggingErrorHandler : Unexpected error occurred in scheduled task.
java.lang.NoClassDefFoundError: Lcom/mongodb/MongoDriverInformation;
    at java.lang.Class.getDeclaredFields0(Native Method) ~[na:1.8.0_292]
    at java.lang.Class.privateGetDeclaredFields(Class.java:2583) ~[na:1.8.0_292]
    at java.lang.Class.getDeclaredFields(Class.java:1916) ~[na:1.8.0_292]
    at org.apache.spark.util.SizeEstimator$.getClassInfo(SizeEstimator.scala:330) ~[spark-core_2.11-2.3.3.jar!/:2.3.3]
    at org.apache.spark.util.SizeEstimator$.visitSingleObject(SizeEstimator.scala:222) ~[spark-core_2.11-2.3.3.jar!/:2.3.3]

ducdq1 commented 2 years ago

(Quoting @prasad-clouduser's java.lang.NoClassDefFoundError: com/mongodb/MongoDriverInformation report above.)

Add this dependency:

    <dependency>
        <groupId>org.mongodb</groupId>
        <artifactId>mongo-java-driver</artifactId>
        <version>3.9.0</version>
    </dependency>