CSIRT-MU / Stream4Flow

A framework for the real-time network traffic analysis based on world-leading technologies for distributed stream processing, network traffic monitoring, and visualization.
https://csirt.muni.cz/?lang=en
MIT License

Application detail and the database #26

Closed: severinsimko closed this issue 7 years ago

severinsimko commented 7 years ago

Just to make sure that we share the same idea of how the detail page of an application will look. First of all, there will be a clickable link for each running application:

(screenshot: application list with clickable links)

This link redirects the user to the detail page of the particular application. The detail page will contain a table of details about the application and a graph. To begin with, I'd implement the "Average Records Timeline", and then we can add other graphs.

(screenshot: application details mock-up)

Note: I store the details in the following format: [application_id, application_name, application_timestamp, application_running_time, application_records, application_average_records, application_cores, application_memory]

A couple of questions about it:

1. There will be a JavaScript script that periodically stores the application details in the DB. What should this period be? Every 1 minute, every 5 minutes? @cermmik @tomjirsa

2. What should be the timeline for that graph? From the beginning of the application, or for, let's say, an hour? And what should the granularity of the timeline be? @cermmik @tomjirsa

3. What framework do you use to create graphs? @cermmik @tomjirsa

tomjirsa commented 7 years ago

There will be a JavaScript script that periodically stores the application details in the DB. What should this period be? Every 1 minute, every 5 minutes?

I would vote for higher granularity, e.g. 20 s. However, I do not know how such frequent requests would influence the web application's performance. Any comments, @cermmik?

What should be the timeline for that graph? From the beginning of the application, or for, let's say, an hour? And what should the granularity of the timeline be?

Graphs can have a scrolling timeline (see the example). In this case, I would vote for displaying data from the application start, with a maximum of e.g. 7 days (but that depends on the granularity of the data, see question 1). What do you think, @cermmik?

What framework do you use to create graphs?

Please use ZingChart - you can find an example in our current protocol statistics application.

tomjirsa commented 7 years ago

More comments:

I store the details in the following format: [application_id, application_name, application_timestamp, application_running_time, application_records, application_average_records, application_cores, application_memory]

Please split the table in two: the static application information (application_name, application_cores, application_memory) should go in one table and the periodically updated information in another.
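A minimal sketch of that split, using plain sqlite3 rather than the web2py DAL; the table and column names are illustrative, not the project's actual schema:

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Static information, written once when the application starts.
conn.execute("""
    CREATE TABLE applications (
        id INTEGER PRIMARY KEY,
        application_name TEXT,
        application_cores INTEGER,
        application_memory TEXT,
        application_port INTEGER UNIQUE
    )
""")

# Periodically updated information, one row per sampling interval,
# linked to the static table by the application's row id.
conn.execute("""
    CREATE TABLE application_metrics (
        id INTEGER PRIMARY KEY,
        application_id INTEGER REFERENCES applications(id),
        application_timestamp TEXT,
        application_running_time INTEGER,
        application_records INTEGER,
        application_average_records REAL
    )
""")
```

The join key (`application_id`) keeps the static row from being rewritten every minute; only `application_metrics` grows.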

There will be a JavaScript script that periodically stores the application details in the DB,

The portal backend (Python), not the frontend, should store the data in the DB.
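A sketch of what that backend write could look like: a single-shot function the Python side calls on the agreed schedule (e.g. from the web2py scheduler or a cron job). The `stats` argument is a hypothetical list of dicts produced by whatever reads the running applications; the table matches the illustrative two-table split above.

```python
import sqlite3

def store_snapshot(conn, stats):
    """Insert one sampling of application metrics into the DB.

    Intended to run in the Python backend on the agreed interval,
    never from browser-side JavaScript. `stats` is a placeholder for
    the output of whatever call reads the running applications.
    """
    conn.executemany(
        "INSERT INTO application_metrics "
        "(application_id, application_timestamp, application_records) "
        "VALUES (:id, :timestamp, :records)",
        stats,
    )
    conn.commit()
```

Keeping the fetch and the insert in one backend function also makes the 1-minute interval a pure scheduling concern, independent of the web frontend.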

tomjirsa commented 7 years ago

There will be a JavaScript script that periodically stores the application details in the DB. What should this period be? Every 1 minute, every 5 minutes?

I would vote for higher granularity, e.g. 20 s. However, I do not know how such frequent requests would influence the web application's performance.

We finally agreed on a 1-minute interval.

What should be the timeline for that graph? From the beginning of the application, or for, let's say, an hour? And what should the granularity of the timeline be?

Graphs can have a scrolling timeline (see the example). In this case, I would vote for displaying data from the application start, with a maximum of e.g. 7 days (but that depends on the granularity of the data, see question 1).

Set the maximum to 1 day.

severinsimko commented 7 years ago

Progress report:

1. Created two database tables: one is a summary of the running applications and the other contains the performance information.


When an application is started, its info is inserted into the DB, and when the application is killed, all information related to it is deleted. As a unique identifier, I use the application port.

I realized that in web2py, a SQLite table must have an "id" attribute, because without it it's not possible to delete/select rows from the DB; I think SQLite uses this "id" for some internal purpose. Did you notice the same thing? @cermmik

2. Added a "Check" option to the application summary; this button redirects to the particular application's page.


3. Application detail page added. It contains the table and a chart.


I am stuck on passing values from the Python controller to the JavaScript; as soon as I fix this issue, I will push the code to git.

4. Getting the information about an application is much faster now that the database is used.
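For the controller-to-JavaScript handoff in point 3, one common approach is to serialize the rows to JSON on the Python side and embed the string in the view. A sketch, assuming a hypothetical helper name and row keys matching the stored details:

```python
import json

def chart_payload(rows):
    """Hypothetical controller helper: turn DB rows into a JSON string
    that the view can embed in a <script> block and hand straight to
    the chart configuration."""
    return json.dumps([
        {"timestamp": r["application_timestamp"],
         "records": r["application_records"]}
        for r in rows
    ])
```

In a web2py view the string can then be emitted unescaped, e.g. `var data = {{=XML(payload)}};`, so the JavaScript receives a ready-to-parse array.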

cermmik commented 7 years ago

I realized that in web2py, a SQLite table must have an "id" attribute, because without it it's not possible to delete/select rows from the DB; I think SQLite uses this "id" for some internal purpose. Did you notice the same thing? @cermmik

Yes :) If you look at the users_auth and user_logins tables, you will see that there are two IDs (the table id and user_id).

severinsimko commented 7 years ago

Graph successfully implemented. This is how it looks (screenshot attached).

severinsimko commented 7 years ago

The data from the DB are filtered by the given port. The graph shows only "difference values", which are calculated as follows: diff_val = current_num_of_events_from_app - last_total_num_of_events_from_db

The screenshot shows testing data only; normally the data is fetched every minute.
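The difference calculation above can be sketched as a small helper (the function name is illustrative): given the cumulative event totals stored in the DB, each plotted point is the current total minus the previous one.

```python
def difference_values(totals):
    """Turn cumulative event totals into the per-interval "difference
    values" plotted in the graph: diff_val = current_total - previous_total.
    The first sample has no predecessor, so it yields no point."""
    return [curr - prev for prev, curr in zip(totals, totals[1:])]
```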