apache / dolphinscheduler

Apache DolphinScheduler is the modern data orchestration platform. Agile to create high performance workflow with low-code
https://dolphinscheduler.apache.org/
Apache License 2.0

[Feature][e2e] project management test cases #7328

Open janeHe13 opened 2 years ago

janeHe13 commented 2 years ago

Search before asking

Description

Columns for each test case: task status | number | function module | test point | priority | service | test steps | expected results | actual results | remarks
  • [ ] finish
001 project list create a project with blank fields P2 start API service 1. Click the "create project" button to open the "create project" pop-up box
2. Leave the project name and description blank
3. Click the "submit" button
Submission fails with the prompt: Please enter the project name
  • [ ] finish
002 project list create a project, description not required P2 start API service 1. Click the "create project" button to open the "create project" pop-up box
2. Enter the project name and leave the description blank
3. Click the "submit" button
The new project is created successfully, the pop-up box closes, and a new record is added to t_ds_project
  • [ ] finish
003 project list create a project with both name and description P1 start API service 1. Click the "create project" button to open the "create project" pop-up box
2. Enter the project name and description
3. Click the "submit" button
The new project is created successfully, the pop-up box closes, and a new record is added to t_ds_project (see the query sketch below)
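A minimal SQL sketch to verify the expected result of cases 002/003; the columns name, description and create_time are assumptions based on typical DolphinScheduler schemas:
select id, name, description, create_time
from t_ds_project
order by create_time desc
limit 1;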
  • [ ] finish
004 project list cancel project creation P3 start API service 1. Click the "create project" button to open the "create project" pop-up box
2. Enter the project name and description
3. Click the "Cancel" button
The pop-up box closes and project creation is cancelled
  • [ ] finish
005 project list project list data correctness P1 start API service view the project list data 1. Admin users can view all projects
2. Ordinary users can only see the projects they created themselves, not the projects created by others
  • [ ] finish
006 project list click the project name link to enter the project home page P1 start API service click the project name the project home page opens
  • [ ] finish
007 project list verify project editing P1 start API service 1. Click "Edit" to open the edit project pop-up box
2. The project name is required and the description is optional, the same as when creating a project
The project is edited successfully
  • [ ] finish
008 project list a project with workflow definitions cannot be deleted P2 start API service 1. The project has workflow definitions
2. Click the "delete" button
The project cannot be deleted and a prompt is shown
  • [ ] finish
009 project list an authorized project cannot be deleted P2 start API service 1. Admin authorizes user A's projects to user B
2. User B selects the authorized project and clicks the "delete" button
The authorized project cannot be deleted
  • [ ] finish
010 project list a project without workflow definitions can be deleted P1 start the API service 1. The project has no workflow definitions
2. Click the "delete" button
The project is deleted successfully
  • [ ] finish
011 project list verify project query P1 start API service enter a project name and click the "query" button the project name supports fuzzy query:
1. If the query returns no data, the project list shows no data
2. If the query returns data, the project list correctly displays the matching projects
  • [ ] finish
012 project list verify project sorting P3 start API service view the project list sorting projects are sorted in reverse order by creation time
  • [ ] finish
013 project list verify the pagination control with no more than one page of data P2 start API service 1. Select 10 items/page with no more than 10 records and view the pagination display
2. Select 30 items/page with no more than 30 records and view the pagination display
3. Select 50 items/page with no more than 50 records and view the pagination display
1. With 10 items/page and no more than 10 records, everything is displayed on one page
2. With 30 items/page and no more than 30 records, everything is displayed on one page
3. With 50 items/page and no more than 50 records, everything is displayed on one page
  • [ ] finish
014 project list verify the pagination control with more than one page of data P2 start API service 1. Select 10 items/page with more than 10 records and view the pagination display
2. Select 30 items/page with more than 30 records and view the pagination display
3. Select 50 items/page with more than 50 records and view the pagination display
1. With 10 items/page and more than 10 records, the data spreads across multiple pages
2. With 30 items/page and more than 30 records, the data spreads across multiple pages
3. With 50 items/page and more than 50 records, the data spreads across multiple pages
  • [ ] finish
015 project list verify the > button of the pagination control P2 start API service click the > button and view the pagination display the list jumps to the next page
  • [ ] finish
016 project list verify the < button of the pagination control P2 start API service click the < button and view the pagination display the list jumps to the previous page
  • [ ] finish
017 project list verify the input box of the pagination control P2 start API service enter a page number and view the pagination display the list jumps to the entered page number
  • [ ] finish
018 project homepage verify task status statistics P1 start API service 1. Enter project management, click the project name and enter the project home page
2. View the task status statistics on the project homepage
The task status statistics are consistent with the count value corresponding to t.state (a verification query sketch follows the state list); the states map as follows:
0: submitted successfully
1: running
2: ready to pause
3: pause
4: ready to stop
5: stop
6: failure
7: success
8: fault tolerance required
9: kill
10: waiting for threads
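A verification query for this case can mirror the process-instance SQL given in case 019 below; joining through t_ds_task_instance.process_instance_id is an assumption about the schema:
select t.state, count(0) as count
from t_ds_task_instance t
        join t_ds_process_instance i on i.id = t.process_instance_id
        join t_ds_process_definition d on d.code = i.process_definition_code
        join t_ds_project p on p.code = d.project_code
where p.id = 8
group by t.state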
  • [ ] finish
019 project homepage verify process status statistics P1 start API service 1. Enter project management, click the project name and enter the project home page
2. View the process status statistics on the project homepage; the query SQL is as follows:
select t.state, count(0) as count
from t_ds_process_instance t
        join t_ds_process_definition d on d.code = t.process_definition_code
        join t_ds_project p on p.code = d.project_code
where 1 = 1
        and t.is_sub_process = 0
        and t.start_time >= '2019-10-10 00:00:00' and t.start_time <= '2022-10-31 11:16:00'
        and p.id = 8
group by t.state
The process status statistics are consistent with the count value corresponding to t.state; the states map as follows:
0: submitted successfully
1: running
2: ready to pause
3: pause
4: ready to stop
5: stop
6: failure
7: success
8: fault tolerance required
9: kill
10: waiting for threads
  • [ ] finish
020 project homepage verify process definition statistics P1 start API service 1. Enter project management, click the project name and enter the project home page
2. View the process definition statistics on the project homepage; the query SQL is as follows:
select u.user_name, count(0) as count
from t_ds_process_definition d
        join t_ds_project p on p.id = d.project_id
        join t_ds_user u on u.code = d.user_code
where 1 = 1
        and p.id in (1)
group by u.user_name
The process definition statistics are consistent with the count value corresponding to u.user_name
  • [ ] finish
021 project home page verify the default value of the time control P1 start API service 1. View the default value of the time control
2. View the task status statistics and process status statistics
1. The default value of the time control is from 0:00 of the current day to the current time
2. The task status statistics and process status statistics within the default time period are correct
  • [ ] finish
022 project homepage verify time control query P1 start API service 1. Select a time range
2. View the task status statistics and process status statistics
The task status statistics and process status statistics in the selected time period are correct
  • [ ] finish
023 workflow - workflow definition verify the "create workflow" button P1 start the API service click the "create workflow" button the DAG workflow editing page opens
  • [ ] finish
024 workflow - workflow definition create a new shell workflow P1 start API service
(if the task needs to select a resource file, Hadoop or S3 storage is required)
1. Click "create workflow" to enter the workflow editing page
2. Drag the shell component onto the canvas
3. Fill in the node name and run flag, select "normal", and fill in the description
4. Select the priority (from high to low: highest / high / medium / low / lowest) and pick one
5. Select the worker group, environment name, failed retry times, failed retry interval and delayed execution time
6. Select timeout alarm; check "timeout alarm" and "timeout failure" as the timeout strategy, and set the timeout duration to 1 minute
7. Edit the shell script:
echo "test shell start"; echo $time; echo $today; echo ${today_global}; sleep 70; echo "test shell end"
8. Select a resource (a shell file must be created in file management first); this is optional
9. Set the custom parameters time=$[yyyyMMddHHmmss] and today=${today_global}
10. Click "confirm to add" to close the task editing pop-up window
11. Click "save" to open the "set DAG name" pop-up box
12. Enter the name and description of the workflow, select the tenant, enable the timeout alarm, and set the timeout to 1 minute
13. Set the global parameter today_global=$[yyyy-MM-dd]; click + to add a global parameter, and click delete to remove a newly added one
14. "Online the process definition" is checked by default
15. Click the "add" button
1. The shell task is created successfully and its status is online
2. A new record is added to the t_ds_process_definition table with release_state=1 and process_definition_json.tasks.type=SHELL (see the query sketch below)
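A quick sketch to confirm the expected record, also applicable to the later cases that check release_state and task type; ordering by create_time to surface the most recently saved definition is an assumption:
select name, release_state
from t_ds_process_definition
order by create_time desc
limit 1;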
  • [ ] finish
025 workflow - workflow definition edit a shell workflow P1 start API service 1. Click the "Edit" button or the workflow name to enter the DAG page
2. Click the task to open the task editing box; task editing works the same as adding a shell task
3. Save the workflow
The shell task is edited successfully
  • [ ] finish
025-1 workflow - workflow definition create / edit sub_process P1 start API service 1. Click "create workflow" to enter the workflow editing page
2. Drag the sub_process component onto the canvas
3. Fill in the node name and run flag, select "normal", and fill in the description
4. Select the priority (from high to low: highest / high / medium / low / lowest) and pick one
5. Select the worker group and environment name
6. Select timeout alarm; check "timeout alarm" and "timeout failure" as the timeout strategy, and set the timeout duration to 1 minute
7. Select an online child node
8. Click "confirm to add" to close the task editing pop-up window
9. Click "save" to open the "set DAG name" pop-up box
10. Enter the name and description of the workflow, select the tenant, enable the timeout alarm, and set the timeout to 1 minute
11. "Online the process definition" is checked by default
12. Click the "add" button
1. The SUB_PROCESS task is created successfully and its status is online
2. A new record is added to the t_ds_process_definition table with release_state=1 and process_definition_json.tasks.type=SUB_PROCESS
  • [ ] finish
026 workflow - workflow definition sub node selects an offline workflow P1 start API service 1. Click the "create workflow" button to enter the workflow editing page
2. Select a workflow that is already offline as the child node
The sub_process task can be created successfully, but the workflow cannot run; clicking the "run" button pops up a prompt
  • [ ] finish
027 workflow - workflow definition create / edit procedure
P1 start API service 1. Click "create workflow" to enter the workflow editing page
2. Drag the procedure component onto the canvas; the common fields are edited the same way as for the shell task
3. Select the data source type and data source name
4. Enter the SQL statement
5. Test the IN and OUT parameter types of user-defined parameters for stored procedures:
1) for the IN type, both the parameter name and parameter value must be entered
2) for the OUT type, only the parameter name needs to be entered
1. The procedure task is created successfully and can run normally
2. A new record is added to the t_ds_process_definition table
  • [ ] finish
028 workflow - workflow definition create / edit SQL, send mail for query results P1 start API service 1. Click "create workflow" to enter the workflow editing page
2. Drag the SQL component onto the canvas; the common fields are edited the same way as for the shell task
3. Select different data source types and addresses
4. The SQL type is query; check "send email", enter the email subject and alarm group, and select the number of result rows to log
5. Edit the SQL statement (only one is allowed)
6. Test pre-SQL and post-SQL (select statements are not supported)
7. Test user-defined parameters (including local parameters and global parameters)
1. The SQL task is created successfully; after it runs normally, the results are sent to the mailbox and the query results can be viewed in the log on the task instance page
2. A new record is added to the t_ds_process_definition table with release_state=1 and process_definition_json.tasks.type=SQL
  • [ ] finish
029 workflow - workflow definition create / edit SQL, do not send mail for query results P1 start API service 1. Click "create workflow" to enter the workflow editing page
2. Drag the SQL component onto the canvas; the common fields are edited the same way as for the shell task
3. Select different data source types and addresses
4. The SQL type is query; do not check "send email"; select the number of result rows to log
5. Edit the SQL statement (only one is allowed)
6. Test pre-SQL and post-SQL (select statements are not supported)
7. Test user-defined parameters (including local parameters and global parameters)
1. The SQL task is created successfully; after it runs normally, the results are not sent to the mailbox and the query results can be viewed in the log on the task instance page
2. A new record is added to the t_ds_process_definition table with release_state=1 and process_definition_json.tasks.type=SQL
  • [ ] finish
030 workflow - workflow definition create / edit SQL with a non-query type P1 start API service 1. Click "create workflow" to enter the workflow editing page
2. Drag the SQL component onto the canvas; the common fields are edited the same way as for the shell task
3. Select different data source types and addresses
4. The SQL type is non-query
5. Edit the SQL statement (only one is allowed)
6. Test pre-SQL and post-SQL (select statements are not supported)
7. Test user-defined parameters (including local parameters and global parameters)
1. The SQL task is created successfully and can run normally
2. A new record is added to the t_ds_process_definition table with release_state=1 and process_definition_json.tasks.type=SQL
  • [ ] finish
031 workflow - workflow definition create / edit spark P1 start API service 1. Click "create workflow" to enter the workflow editing page
2. Drag the spark component onto the canvas; the common fields are edited the same way as for the shell task
3. Program types: Java, Scala, Python
4. Spark version: select SPARK1 or SPARK2
5. Fill in the class of the main function, e.g. com.journey.spark.WordCount
6. Select the main package (when the program type is Java or Scala, only jar files can be selected; for Python, only py files can be selected)
7. Select the "cluster", "client" or "local" deploy mode of spark
8. Fill in the number of driver cores, driver memory, number of executors, executor memory and executor cores
9. Fill in the main program parameters, e.g. /jane1/words.txt /jane1/out
10. Fill in the option parameters
11. Select a resource (optional)
12. Fill in user-defined parameters (optional)
1. The spark task is created successfully and can run normally
2. A new record is added to the t_ds_process_definition table with release_state=1 and process_definition_json.tasks.type=SPARK
  • [ ] finish
032 workflow - workflow definition create / edit flink P1 start API service 1. Click "create workflow" to enter the workflow editing page
2. Drag the flink component onto the canvas; the common fields are edited the same way as for the shell task
3. Program types: Java, Scala, Python
4. Fill in the class of the main function, e.g. org.apache.flink.streaming.examples.wordcount.WordCount
5. Select the main package (when the program type is Java or Scala, only jar files can be selected; for Python, only py files can be selected)
6. Select the cluster or local deploy mode
7. Select the flink version: <1.10 or >=1.10
8. Fill in the task name
9. Fill in the jobmanager memory, taskmanager memory, number of slots, number of taskmanagers and parallelism
10. Fill in the main program parameters, e.g. -ytm flink
11. Fill in the option parameters
12. Select a resource (optional)
13. Fill in user-defined parameters (optional)
1. The flink task is created successfully and can run normally
2. A new record is added to the t_ds_process_definition table with release_state=1 and process_definition_json.tasks.type=FLINK
  • [ ] finish
033 workflow - workflow definition create / edit MR P1 start API service 1. Click "create workflow" to enter the workflow editing page
2. Drag the MR component onto the canvas; the common fields are edited the same way as for the shell task
3. Program types: Java, Scala, Python
4. Fill in the class of the main function, e.g. com.journey.hadoop.WordCount
5. Select the main package (when the program type is Java or Scala, only jar files can be selected; for Python, only py files can be selected)
6. Fill in the task name
7. Fill in the main program parameters, e.g. /jane1/words.txt /jane1/MRout1
8. Fill in the option parameters
9. Select a resource (optional)
10. Fill in user-defined parameters (optional)
1. The MR task is created successfully and can run normally
2. A new record is added to the t_ds_process_definition table with release_state=1 and process_definition_json.tasks.type=MR
  • [ ] finish
034 workflow - workflow definition create / edit Python P1 start API service 1. Click "create workflow" to enter the workflow editing page
2. Drag the python component onto the canvas; the common fields are edited the same way as for the shell task
3. Write the Python script
4. If the script needs to reference resources, create the file in the file management module first
5. Fill in user-defined parameters (optional)
1. The Python task is created successfully and can run normally
2. A new record is added to the t_ds_process_definition table with release_state=1 and process_definition_json.tasks.type=PYTHON
  • [ ] finish
035 workflow - workflow definition create / edit dependent task P1 start API service 1. Click "create workflow" to enter the workflow editing page
2. Drag the dependent component onto the canvas; the common fields are edited the same way as for the shell task
3. Add a dependency: select project -> workflow -> task
4. Test the offset (last XXX) of the interval (month, week, day and hour)
5. Test the "and" and "or" dependent conditions
1. The dependent task is created successfully and can run normally
2. A new record is added to the t_ds_process_definition table with release_state=1 and process_definition_json.tasks.type=DEPENDENT
  • [ ] finish
036 workflow - workflow definition create / edit http P1 start API service 1. Click "create workflow" to enter the workflow editing page
2. Drag the HTTP component onto the canvas; the common fields are edited the same way as for the shell task
3. Test the request address
4. Test the request types: GET, POST, HEAD, PUT, DELETE
5. Test the request parameters: parameter, body, headers
6. Test the check conditions: default response code 200, user-defined response code, content check
7. Timeout settings: fill in the connection timeout and socket timeout
8. Custom parameters
1. The HTTP task is created successfully and can run normally; if it times out, the run fails
2. A new record is added to the t_ds_process_definition table with release_state=1 and process_definition_json.tasks.type=HTTP
  • [ ] finish
037 workflow - workflow definition create / edit dataX in non-custom mode P1 start API service 1. Click "create workflow" to enter the workflow editing page
2. Drag the dataX component onto the canvas; the common fields are edited the same way as for the shell task
3. Keep the custom template off (closed by default) and select different data source types and data sources
4. Write the SQL statement
5. Select the target database and data source
6. Fill in the target table
7. Write pre-SQL and post-SQL for the target database (optional)
8. Fill in the rate limit (bytes), rate limit (records) and running memory
The dataX task is created successfully and runs successfully
  • [ ] finish
038 workflow - workflow definition create / edit dataX in custom mode P1 start API service 1. Click "create workflow" to enter the workflow editing page
2. Drag the dataX component onto the canvas; the common fields are edited the same way as for the shell task
3. Turn on the custom template
4. Write the JSON; a reference template:
{"job":{"setting":{"speed":{"channel":3},"errorLimit":{"record":0,"percentage":0.02}},"content":[{"reader":{"name":"mysqlreader","parameter":{"username":"root","password":"root","column":["id","name"],"splitPk":"db_id","connection":[{"table":["table"],"jdbcUrl":["jdbc:mysql://127.0.0.1:3306/database"]}]}},"writer":{"name":"mysqlwriter","parameter":{"writeMode":"insert","username":"root","password":"root","column":["id","name"],"session":["set session sql_mode='ANSI'"],"preSql":["delete from test"],"connection":[{"jdbcUrl":"jdbc:mysql://127.0.0.1:3306/datax?useUnicode=true&characterEncoding=gbk","table":["test"]}]}}]}}
5. Fill in user-defined parameters
6. Fill in the running memory
The dataX task is created successfully and runs successfully
  • [ ] finish
039 workflow - workflow definition create / edit sqoop, select import P1 start API service 1. Click "create workflow" to enter the workflow editing page
2. Drag the sqoop component onto the canvas; the common fields are edited the same way as for the shell task
3. Fill in the task name
4. Select "import" as the flow direction
5. Fill in Hadoop parameters, e.g. mapreduce.map.memory.mb=2048
6. Fill in sqoop parameters, e.g. mapreduce.reduce.memory.mb=2048
7. Select MySQL as the data source type and choose "form" or "SQL" mode
8. Add hive type mapping and Java type mapping
9. Select HDFS or hive as the data destination type, and fill in the corresponding fields for each type
10. Fill in custom parameters
The sqoop task is created successfully and runs successfully
  • [ ] finish
040 workflow - workflow definition create / edit sqoop, select export P1 start API service 1. Click "create workflow" to enter the workflow editing page
2. Drag the sqoop component onto the canvas; the common fields are edited the same way as for the shell task
3. Fill in the task name
4. Select "export" as the flow direction
5. Fill in Hadoop parameters, e.g. mapreduce.map.memory.mb=2048
6. Fill in sqoop parameters, e.g. mapreduce.reduce.memory.mb=2048
7. Select HDFS or hive as the data source type, and fill in the corresponding fields for each type
8. Add hive type mapping and Java type mapping
9. Select MySQL as the data destination type and choose "form" or "SQL" mode
10. Fill in user-defined parameters
The sqoop task is created successfully and runs successfully
  • [ ] finish
041 workflow - workflow definition create / edit a conditions task P1 start API service 1. Create three tasks of other types: A, B and C
2. Create the conditions task; connect task A before the conditions task, and connect tasks B and C after it
3. Double-click the conditions task and select branch flow B for the success state and branch flow C for the failure state
4. Click the user-defined parameter and select success or failure as the state of task A
1. If success is selected as the state of task A, the flow goes to task B after task A runs, and task C is not executed
2. If failure is selected as the state of task A, the flow goes to task C after task A runs, and task B is not executed
  • [ ] finish
042 workflow - workflow definition create a multi-node task DAG diagram P1 start API service 1. Click "create workflow" to enter the workflow editing page
2. Create task1, task2, task3 and task4
3. task1 connects to task2 and task3 (task2 and task3 are parallel), and task4 depends on task2 and task3
1. The workflow is created successfully
2. task2 runs in parallel with task3; only after both task2 and task3 run successfully can task4 run
  • [ ] finish
043 workflow - workflow definition prohibit execution P1 start API service set a task's run flag to "prohibit execution" when the process starts, the prohibited task is skipped and not executed
  • [ ] finish
044 workflow - workflow definition task priority P1 start API service for parallel tasks of the same process instance, set task priorities when worker threads are insufficient, tasks run from high to low according to task priority
  • [ ] finish
045 workflow - workflow definition task worker group P1 start API service select a worker group for the task 1. If a worker service is started in the worker group, the task can run successfully
2. If no worker service is started in the worker group, the task status stays "submitted successfully"
3. After version 2.1, the worker group is configured via worker.group=workerGroupName in worker.properties; the default worker group is default
  • [ ] finish
046 workflow - workflow definition failed retry times and retry interval P1 start API service test the task rescheduling interval after a task fails after the task fails, it reruns automatically after the retry interval, and retries stop once the number of failed retries is reached
  • [ ] finish
047 workflow - workflow definition task timeout alarm P1 start API service 1. Turn on the "timeout alarm" switch
2. Select "timeout alarm" as the timeout strategy
3. Set the "timeout duration"
4. Select an alarm group
1. When the workflow runs and times out, the alarm is sent according to the alarm instances associated with the alarm group; alarm instance types include Feishu, WeChat, email, HTTP, DingTalk, script and Slack
2. A new record is added to the t_ds_alert table with alert_status=1 (see the query sketch below)
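A sketch of the corresponding database check; the title and create_time columns are assumptions based on typical t_ds_alert schemas:
select id, title, alert_status
from t_ds_alert
order by create_time desc
limit 1;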
  • [ ] finish
048 workflow - workflow definition task timeout failure P1 start API service 1. Turn on the "timeout alarm" switch
2. Select "timeout failure" as the timeout strategy
3. Set the "timeout duration"
After the timeout, the task status is set to failed: t_ds_task_instance.state=6 (see the query sketch below)
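A minimal query to confirm the expected state (state code 6 = failure, per the mapping in case 018):
select name, state
from t_ds_task_instance
where state = 6
order by start_time desc;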
  • [ ] finish
049 workflow - workflow definition workflow instance timeout alarm P1 start API service 1. Create a workflow
2. Save the workflow and turn on the "timeout alarm" switch
3. Set the timeout duration
4. Run the workflow and select an alarm group
1. When the run times out, the alarm is sent according to the alarm instances associated with the alarm group; alarm instance types include Feishu, WeChat, email, HTTP, DingTalk, script and Slack
2. A new record is added to the t_ds_alert table with alert_status=1
  • [ ] finish
050 workflow - workflow definition custom parameters P1 start API service 1. Create a workflow
2. Set user-defined parameters in the task (user-defined parameters can reference global variables)
The custom parameters can be referenced in the task scripts
  • [ ] finish
051 workflow - workflow definition parameter passing P1 start API, master, worker and logger services 1. Create shell tasks task1 -> task2, where task2 depends on task1
2. Enter the shell script in task1:
echo ${setValue(trans=Hello trans)};
3. Set a custom parameter trans of type OUT on task1
4. Enter the shell script echo ${trans} in task2
5. Run the workflow
The value of the parameter trans set in task1 is passed to task2
  • [ ] finish
052 workflow - workflow definition workflow goes online on save P1 start the API service when saving the workflow, check "online the process definition" 1. The workflow is saved successfully and its status is online
2. A new record is added to the t_ds_process_definition table with release_state=1 and process_definition_json.tasks.type=SHELL
  • [ ] finish
053 workflow - workflow definition workflow stays offline on save P1 start API service when saving the workflow, uncheck "online the process definition" 1. The workflow is saved successfully and its status is offline
2. A new record is added to the t_ds_process_definition table with release_state=0 and process_definition_json.tasks.type=SHELL
  • [ ] finish
054 workflow - workflow definition set global parameters when saving the workflow P1 start API service 1. Create a workflow
2. Set global parameters when saving the workflow
1. Global parameters can be referenced in task scripts and user-defined parameters
2. Global parameters can be constants or variables
  • [ ] finish
055 workflow - workflow definition export workflow P1 start API service 1. Click the export button in the workflow list
2. Check one or more workflows and click the "export" button at the bottom of the page
The workflow is exported successfully
  • [ ] finish
056 workflow - workflow definition import workflow P1 start API service 1. Import a new workflow
2. Import an existing workflow
3. Import a workflow across projects
The workflow is imported successfully
  • [ ] finish
057 workflow - workflow definition copy workflow P1 start API service 1. Click the copy button in the workflow list
2. Check one or more workflows, click the "batch copy" button, and select the project name
The workflow is copied successfully
  • [ ] finish
058 workflow - workflow definition move workflow P1 start API service 1. Check one or more workflows and click "batch move"
2. Select the project name
The workflow is moved successfully
  • [ ] finish
059 workflow - workflow definition workflow definition list data correctness P2 start API service view the list header and data of the workflow definition page 1. List header: number, workflow name, status, creation time, update time, description, modified by user, timing status, operation (edit, run, timing, timing management, delete, tree diagram, download)
2. The list data is displayed correctly
  • [ ] finish
060 workflow - workflow definition workflow definition name link P2 start API service click the workflow definition name link the workflow definition DAG page opens; the workflow can only be edited when it is offline and cannot be edited when it is online
  • [ ] finish
061 workflow - workflow definition query workflow definition name P1 start API service 1. Enter the workflow definition page and enter a workflow name
2. Click the query button
The workflow name supports fuzzy query:
1. If the query returns no data, the list shows no data
2. If the query returns data, the list is displayed correctly
  • [ ] finish
062 workflow - workflow definition verify the "Edit" button P1 start the API service enter the workflow definition page and click the "Edit" button 1. Clicking the "Edit" button opens the workflow editing page and the workflow DAG diagram is displayed correctly
2. A workflow in online status cannot be edited; a workflow in offline status can be edited
  • [ ] finish
063 workflow - workflow definition cancel button verification P2 start API service 1. While creating or editing a workflow, click cancel in task editing
2. Click "close" on the DAG page
3. While saving the workflow, click the Cancel button in the "set DAG name" pop-up box
Clicking the cancel or close button closes the pop-up box without saving the workflow or task information
  • [ ] finish
064 workflow - workflow definition verify the "run" button P1 start the API service enter the workflow definition page and click the "run" button clicking the "run" button opens the "start parameter setting" pop-up box
  • [ ] finish
065 workflow - workflow definition start parameters - failure strategy "continue" P1 start API service 1. Create a workflow definition with a DAG of parallel tasks
2. Enter the workflow definition page and click "run" to open the pop-up box
3. Select "continue" as the failure strategy in the pop-up box
1. One of the parallel tasks fails; after the other task succeeds, the next node task continues to run
2. The workflow instance status is failed
  • [ ] finish
066 workflow - workflow definition start parameters - failure strategy "end" P1 start API service 1. Create a workflow definition with a DAG of parallel tasks
2. Enter the workflow definition page and click "run" to open the pop-up box
3. Select "end" as the failure strategy in the pop-up box
1. One of the parallel tasks fails; the other parallel task is killed
2. The workflow instance status is "failed" and the killed task's status is "kill"
  • [ ] finish
067 workflow - workflow definition start parameters - notification strategy P1 start API service 1. Enter the workflow definition page and click "run" to open the pop-up box
2. Select the notification strategy in the pop-up box: none, success, failure, or success or failure
1. With "none" selected, no notification is sent whether the workflow instance succeeds or fails
2. With "success" selected, a notification is sent after the workflow instance runs successfully, and none is sent on failure
3. With "failure" selected, a notification is sent after the workflow instance fails, and none is sent on success
4. With "success or failure" selected, a notification is sent whether the workflow instance succeeds or fails
  • [ ] finish
068 workflow - workflow definition start parameters - process priority P1 start API service 1. Enter the workflow definition page and click "run" to open the pop-up box
2. Select the process priority in the pop-up box
When master threads are insufficient, processes run from high to low according to process priority
  • [ ] finish
069 workflow - workflow definition start parameters - worker group with the default worker configuration P1 start API service 1. Enter the workflow definition page and click "run" to open the pop-up box
2. Select the "default" worker group in the pop-up box
1. If only one worker service is started, the workflow task runs on that worker
2. If multiple worker services are started, the workflow task randomly selects one worker to run on
  • [ ] finish
070 workflow - workflow definition start parameters - worker group, change the worker group configuration P1 start API service 1. Change worker.group=worker_group_188 in the conf/worker.properties configuration (if unchanged, the default is default) and restart the worker service
2. Enter the workflow definition page and click "run" to open the pop-up box
3. Select the worker group worker_group_188 in the pop-up box
The workflow tasks run on the worker_group_188 machines
  • [ ] finish
071 workflow - workflow definition start parameters - environment name P1 start API service 1. Enter the workflow definition page and click "run" to open the pop-up box
2. Select the environment name in the pop-up box
1. When the workflow runs, the environment variables of the selected environment are loaded preferentially
2. If no environment is selected, the environment variables configured in DS under conf/env/dolphinscheduler_env.sh are loaded by default
  • [ ] finish
072 workflow - workflow definition start parameters - alarm group P1 start API service 1. Enter the workflow definition page and click "run" to open the pop-up box
2. Select the alarm group in the pop-up box
After the workflow runs, the results are sent to the recipients of the alarm group
  • [ ] finish
073 workflow - workflow definition start parameters - complement: serial execution P1 start API service 1. Enter the workflow definition page and click "run" to open the pop-up box
2. Check "complement" in the pop-up box and select "serial execution"
3. Complement date range: 2019-11-05 00:00:00 - 2019-11-07 00:00:00
1. Run the workflow: the data for 2019-11-05 is complemented first, then 2019-11-06, then 2019-11-07
2. The workflow instance page shows only 1 record with run type "complement"; the scheduling time advances through 2019-11-05 00:00:00 > 2019-11-06 00:00:00 > 2019-11-07 00:00:00, and after the complement finishes the scheduling time shown is 2019-11-07 00:00:00
  • [ ] finish
074 workflow - workflow definition start parameters - complement: parallel execution P1 start API service 1. Enter the workflow definition page and click "run" to open the pop-up box
2. Check "complement" in the pop-up box and select "parallel execution"
3. Complement date range: 2019-11-05 00:00:00 - 2019-11-07 00:00:00
1. Run the workflow: the data for 2019-11-05, 2019-11-06 and 2019-11-07 is complemented in parallel
2. The workflow instance page shows three records with run type "complement"; the scheduling times are 2019-11-05 00:00:00, 2019-11-06 00:00:00 and 2019-11-07 00:00:00 (see the query sketch below)
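A sketch for checking the complement runs of cases 073/074 from the database side; the schedule_time and command_type columns are assumptions based on typical t_ds_process_instance schemas:
select name, command_type, schedule_time, state
from t_ds_process_instance
order by schedule_time;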
  • [ ] finish
075 workflow - workflow definition ordinary users run workflows P1 start API, master, worker, logger and alert services 1. An ordinary user enters the workflow definition page and clicks "run" to open the pop-up box
2. Set the startup parameters in the pop-up box and click the "run" button
1. When the workflow runs, a record to be executed is written to t_ds_command; the master service scans the t_ds_command table and writes the data to zookeeper's tasks_queue, and the worker service then executes the tasks
2. If the master service fails to execute a command, the record is written to the t_ds_error_command table
3. The workflow instance is written to the t_ds_process_instance table
4. The tasks are written to the t_ds_task_instance table (see the query sketch below)
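A quick way to watch this flow from the database side; the selected columns are assumptions based on typical schemas:
select id, command_type from t_ds_command;
select id, command_type, message from t_ds_error_command;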
  • [ ] finish
076 workflow - workflow definition admin user not associated with a tenant runs a workflow P2 start API, master, worker, logger and alert services 1. The admin user (not associated with a tenant) enters the workflow definition page, selects a workflow definition, and clicks the "run" button
2. Set the startup parameters in the pop-up box and click the "run" button
The administrator admin is not associated with a tenant and cannot run the workflow
  • [ ] finish
077 workflow - workflow definition admin user associated with a tenant runs a workflow P1 start API, master, worker, logger and alert services 1. The admin user (associated with a tenant) enters the workflow definition page, selects a workflow definition, and clicks the "run" button
2. Set the startup parameters in the pop-up box and click the "run" button
The administrator admin is associated with a tenant and the workflow can be run
  • [ ] finish
078 workflow - workflow definition verify the "timing" button P1 start the API service enter the workflow definition page and click the "timing" button the "timing setting parameters" pop-up box opens
  • [ ] finish
079 workflow - workflow definition add timing - parameter setting P1 start API service 1. Enter the workflow definition page and click "timing" to open the "timing setting parameters" pop-up box
2. In the pop-up box set the "start and end time", the "timing" cron expression, the failure strategy, notification strategy, process priority, worker group, notification group, recipients and CC
3. Click the "execution time" button
1. Clicking the "execution time" button displays the next 5 execution times below
2. The timing does not take effect until it is brought online
3. When the current system time reaches the scheduled execution time, the workflow runs automatically; a new record is added to the workflow instance list with run type "scheduled execution" and the scheduling time displayed correctly
  • [ ] finish
080 workflow - workflow definition verify the "timing management" button P1 start the API service enter the workflow definition page and click the "timing management" button the "timing management" page opens and the timing list data is displayed correctly
  • [ ] finish
081 workflow - workflow definition timing editing P1 start API service 1. Enter the workflow definition page and click "timing management" to enter the timing management page
2. Click the "Edit" button
Timing editing works the same as add timing - parameter setting
  • [ ] finish
082 workflow - workflow definition timing online P1 start API, master, worker, logger and alert services 1. Enter the workflow definition page and click "timing management" to enter the timing management page
2. Click the "go online" button
1. After clicking the "go online" button, the timing takes effect
2. The edit and delete buttons cannot be clicked, and the "go online" button changes to "go offline"
  • [ ] finish
083 workflow - workflow definition timing offline P1 start API service 1. Enter the workflow definition page and click "timing management" to enter the timing management page
2. Click the "offline" button
1. After clicking the "offline" button, the timing becomes invalid
2. The edit and delete buttons can be clicked, and the "offline" button changes to "online"
  • [ ] finish
084 workflow - workflow definition delete timing P1 start API service 1. Enter the workflow definition page and click "timing management" to enter the timing management page
2. Click the "delete" button
The timing is deleted
  • [ ] finish
085 workflow - workflow definition workflow definition online P1 start API service enter the workflow definition page and click the "online" button 1. The "go online" button changes to "go offline"
2. The "Edit" and "delete" buttons cannot be clicked
  • [ ] finish
086 workflow - workflow definition workflow definition offline P1 start API service enter the workflow definition page and click the "offline" button 1. The "offline" button changes to "online"
2. The "Edit" and "delete" buttons can be clicked, while "run", "timing" and "timing management" cannot be clicked
  • [ ] finish
087 workflow - workflow definition delete workflow definition P1 start API service enter the workflow definition page and click the "delete" button the workflow definition is deleted successfully
  • [ ] finish
088 workflow - workflow definition "select all" option box P1 start API service 1. Enter the workflow definition page and check "select all"
2. Click the "delete" button
1. Checking "select all" selects only the workflows on the current page
2. Clicking delete removes only the selected workflows on the current page
  • [ ] finish
089 workflow - workflow definition uncheck the "select all" option box P2 start API service 1. Enter the workflow definition page and uncheck "select all" the delete, export, batch copy and batch move buttons below the list cannot be clicked
  • [ ] finish
090 workflow - workflow definition single-select option box P1 start API service 1. Enter the workflow definition page and check a workflow's single-select option box
2. Click the "delete" button
1. Only the option box of an offline workflow can be checked; the option box of an online workflow cannot be checked
2. Clicking delete removes the selected workflow
  • [ ] finish
091 workflow - workflow definition uncheck the single-select option box P2 start API service 1. Enter the workflow definition page and uncheck the single-select option box the delete, export, batch copy and batch move buttons below the list cannot be clicked
  • [ ] finish
092 workflow - workflow definition verify the tree button P2 start API service enter the workflow definition page and click the "tree" button the tree page opens and the task-related information is displayed correctly
  • [ ] finish
093 workflow - workflow definition verify the tree information P2 start API service switch the display quantity at the top right of the tree the number of tree tasks and related information are displayed correctly
  • [ ] finish
094 workflow - workflow definition workflow definition version switching: switch version P1 start API service 1. Enter the workflow definition page, switch versions, and click OK the version is switched successfully
  • [ ] finish
095 workflow - workflow definition DAG page version switching: switch version P1 start API service 1. Enter the workflow DAG page, switch the version, and click the OK button the version is switched successfully
  • [ ] finish
096 workflow - workflow definition workflow definition version switching: cancel version switching P2 start API service 1. Enter the workflow definition page, switch versions, and click the Cancel button the version is not switched
  • [ ] finish
097 workflow - workflow definition DAG page version switching: cancel version switching P2 start API service 1. Enter the workflow definition DAG page, switch the version, and click the Cancel button the version is not switched
  • [ ] finish
098 workflow - workflow definition select a task to run P1 start API, master, worker, logger and alert services 1. Enter the workflow definition page, click edit or the workflow definition name link to enter the workflow DAG page
2. Select a task, right-click to open the pop-up box, and click the "run" button
1. Clicking run opens the pop-up box for startup parameter settings:
1) selecting "execute backward" for node execution runs backward from the current task
2) selecting "execute forward" for node execution runs from the first task
3) selecting "execute only the current node" runs only the current node
2. A workflow that is not online cannot be run
3. A workflow that is online can be run
  • [ ] finish
099 workflow - workflow definition edit task P1 start API service 1. Enter the workflow definition page, click edit or the workflow definition name link to enter the workflow DAG page
2. Select a task, right-click to open the pop-up box, and click Edit
The current node settings of the task pop up
  • [ ] finish
100 workflow - workflow definition copy task P1 start API service 1. Enter the workflow definition page, click edit or the workflow definition name link to enter the workflow DAG page
2. Select a task, right-click to open the pop-up box, and click Copy
The current task node is copied
  • [ ] finish
101 workflow - workflow definition delete task P1 start API service 1. Enter the workflow definition page, click edit or the workflow definition name link to enter the workflow DAG page
2. Select a task, right-click to open the pop-up box, and click Delete
The current task node is deleted
  • [ ] finish
102 workflow - workflow definition verify the pagination control with no more than one page of data P2 start API service 1. Select 10 items/page with no more than 10 records and view the pagination display
2. Select 30 items/page with no more than 30 records and view the pagination display
3. Select 50 items/page with no more than 50 records and view the pagination display
1. With 10 items/page and no more than 10 records, everything is displayed on one page
2. With 30 items/page and no more than 30 records, everything is displayed on one page
3. With 50 items/page and no more than 50 records, everything is displayed on one page
  • [ ] finish
103 workflow - workflow definition verify the pagination control with more than one page of data P2 start API service 1. Select 10 items/page with more than 10 records and view the pagination display
2. Select 30 items/page with more than 30 records and view the pagination display
3. Select 50 items/page with more than 50 records and view the pagination display
1. With 10 items/page and more than 10 records, the data spreads across multiple pages
2. With 30 items/page and more than 30 records, the data spreads across multiple pages
3. With 50 items/page and more than 50 records, the data spreads across multiple pages
  • [ ] finish
104 workflow - workflow definition verify the > button of the pagination control P2 start API service click the > button and view the pagination display the page jumps to the next page
  • [ ] finish
105 workflow - workflow definition verify the < button of the pagination control P2 start API service click the < button and view the pagination display the page jumps to the previous page
  • [ ] finish
106 workflow - workflow definition verify the input box of the pagination control P2 start API service enter a page number and view the pagination display the page jumps to the entered page number
  • [ ] finish
107 workflow - workflow instance verify the correctness of workflow instance list data P2 start API service enter the workflow instance page and view the workflow instance list header and data 1. Workflow instance list header: number, workflow name, run type, scheduling time, start time, end time, running duration (s), run count, host, fault-tolerance ID, status, operation (edit, rerun, recover failure, stop, pause, delete, Gantt chart)
2. The list data is displayed correctly
  • [ ] finish
108 workflow - workflow instance verify workflow instance query criteria P1 start API service 1. Select the query criteria: workflow name, executing user, host, status and time
2. Click the query button
The workflow instance name and host support fuzzy query, and the correct workflow data is returned according to the query criteria
  • [ ] finish
109 workflow - workflow instance workflow instance status is "executing": check whether the buttons can be clicked P2 start API service the workflow instance status is "executing"; check whether the buttons can be clicked 1. The edit, rerun, recover failure and delete buttons cannot be clicked
2. The stop, pause and Gantt chart buttons can be clicked
  • [ ] finish
110 workflow - workflow instance workflow instance status is "successful": check whether the buttons can be clicked P2 start API service the workflow instance status is "successful"; check whether the buttons can be clicked 1. The recover failure, stop and pause buttons cannot be clicked
2. The edit, rerun, delete and Gantt chart buttons can be clicked
  • [ ] finish
111 workflow - workflow instance workflow instance status is "failed": check whether the buttons can be clicked P2 start API service the workflow instance status is "failed"; check whether the buttons can be clicked 1. The stop and pause buttons cannot be clicked
2. The edit, rerun, recover failure, delete and Gantt chart buttons can be clicked
  • [ ] finish
112 workflow - workflow instance workflow instance status is "waiting thread": check whether the buttons can be clicked P2 start API service the workflow instance status is "waiting thread"; check whether the buttons can be clicked 1. The edit, rerun, delete, recover failure, stop and pause buttons cannot be clicked
2. The Gantt chart button can be clicked
  • [ ] finish
113 workflow - workflow instance workflow instance status is "ready to pause": check whether the buttons can be clicked P2 start API service the workflow instance status is "ready to pause"; check whether the buttons can be clicked 1. The edit, rerun, delete, recover failure, stop and pause buttons cannot be clicked
2. The Gantt chart button can be clicked
  • [ ] finish
114 workflow - workflow instance workflow instance status is "suspended": check whether the buttons can be clicked P2 start API service the workflow instance status is "suspended"; check whether the buttons can be clicked 1. The recover failure and stop buttons cannot be clicked
2. The edit, rerun, resume running, delete and Gantt chart buttons can be clicked
  • [ ] finish
115 workflow - workflow instance workflow instance status is "ready to stop": check whether the buttons can be clicked P2 start API service the workflow instance status is "ready to stop"; check whether the buttons can be clicked 1. The edit, rerun, delete, recover failure, stop and pause buttons cannot be clicked
2. The Gantt chart button can be clicked
  • [ ] finish
116 workflow - workflow instance workflow instance status is "stopped": check whether the buttons can be clicked P2 start API service the workflow instance status is "stopped"; check whether the buttons can be clicked 1. The recover failure and pause buttons cannot be clicked
2. The edit, rerun, resume running, delete and Gantt chart buttons can be clicked
  • [ ] finish
117 workflow - workflow instance edit workflow, update workflow definition and workflow instance P1 start API service 1. Enter the workflow instance page, click the "Edit" button or the workflow name to enter the DAG workflow editing page
2. Edit the workflow and click Save to open the pop-up box
3. Check "update process definition" in the pop-up box
The current workflow instance and the workflow definition are both updated
  • [ ] finish
118 workflow - workflow instance edit workflow, update workflow instance without updating the workflow definition P1 start API service 1. Enter the workflow instance page, click the "Edit" button or the workflow name to enter the DAG workflow editing page
2. Edit the workflow and click Save to open the pop-up box
3. Do not check "update process definition" in the pop-up box
Only the current workflow instance is updated; the workflow definition remains unchanged
  • [ ] finish
119 workflow - workflow instance view workflow instance parameters P2 start API service 1. Enter the workflow instance page, click the "Edit" button or the workflow instance name link to enter the DAG workflow editing page
2. Click the "start parameters" and "view variables" buttons in the upper left corner of the page
1. The startup parameters, global variables and local variables expand
2. Clicking the "start parameters" and "view variables" buttons again collapses the parameter display
  • [ ] finish
120 workflow - workflow instance view task log P1 start API service 1. Enter the workflow instance page, click the "Edit" button or the workflow instance name link to enter the DAG workflow editing page
2. Double-click the task, expand the node settings, and click the "view log" button
The running log is shown; larger logs can be viewed in chunks, dragged up and down, and requested in chunks
  • [ ] finish
121 workflow - workflow instance view history P1 start API service 1. Enter the workflow instance page, click the "Edit" button or the workflow instance name link to enter the DAG workflow editing page
2. Double-click the task, expand the node settings, and click the "view history" button
The task instance page opens and the task instance list displays the tasks associated with the workflow instance
  • [ ] finish
122 workflow - workflow instance rerun P1 start API, master, worker, logger and alert services enter the workflow instance page and click the "rerun" button the run starts from the first task node and the run type changes to "rerun"
  • [ ] finish
123 workflow - workflow instance recover failure P1 start API, master, worker, logger and alert services enter the workflow instance page and click the "recover failure" button the run starts from the failed task node and the run type is "execute from the failed node"
  • [ ] finish
124 workflow - workflow instance stop P1 start API, master, worker, logger and alert services enter the workflow instance page and click the "stop" button 1. After clicking stop, the process status changes to "ready to stop" and the run type changes to "stop"
2. The tasks being executed are written into the t_ds_command table; the master scans the table, writes the data into zookeeper's tasks_kill queue, and the worker then kills the tasks
3. After the tasks are killed, the process status changes to "stop" and the task status changes to "kill"
  • [ ] finish
125 workflow - workflow instance resume running a stopped process P1 start API, master, worker, logger and alert services 1. Enter the workflow instance page and click "resume running" execution starts from the task that was killed when the process was stopped
  • [ ] finish
126 workflow - workflow instance pause P1 start API, master, worker, logger and alert services enter the workflow instance page and click the "pause" button 1. After clicking pause, the process status changes to "ready to pause" and the run type changes to "pause"
2. The submitted tasks finish, the unsubmitted tasks are suspended, and the process status changes to "suspended"
  • [ ] finish
127 workflow - workflow instance resume running a suspended process P1 start API, master, worker, logger and alert services 1. Enter the workflow instance page and click the "resume running" button execution continues from the suspended task
  • [ ] finish
128 workflow - workflow instance waiting thread P1 start API, master, worker, logger and alert services reduce the number of executable processes via master.exec.threads in master.properties and execute multiple processes the processes exceeding the number of executable threads enter the waiting-thread state
  • [ ] finish
129 workflow - workflow instance resume processes waiting for threads P1 start API, master, worker, logger and alert services when the currently executing processes finish, the workflows waiting for threads execute in turn the workflows in waiting-thread status execute normally and successfully
  • [ ] finish
130 workflow - workflow instance delete workflow instance P1 start API service 1. Enter the workflow instance page and click the "delete" button the workflow instance and its associated tasks are deleted; at the same time, if ZK holds task instances associated with this process instance, they are deleted as well
  • [ ] finish
131 workflow - workflow instance view Gantt chart P2 start API service 1. Enter the workflow instance page and click the "Gantt chart" button The Gantt chart page opens and the task information is displayed correctly
  • [ ] finish
132 workflow - workflow instance select all and delete workflow instances P1 start API service 1. Check the "select all" option box; the "delete" button is displayed below the list
2. Click the "delete" button
All workflow instances on the current page and their associated tasks are deleted
  • [ ] finish
133 workflow - workflow instance deselect all P2 start API service deselect the "select all" option box The "delete" button below the list cannot be clicked
  • [ ] finish
134 workflow - workflow instance single selection P1 start API service 1. Check a single row's option box; the "delete" button is displayed at the bottom of the list
2. Click the "delete" button
The selected workflow instance and its associated tasks are deleted
  • [ ] finish
135 workflow - workflow instance cancel single selection P2 start API service uncheck the single-selection option box The "delete" button below the list cannot be clicked
  • [ ] finish
136 workflow - workflow instance verify pagination control, no more than 10 P2 start API service 1. Select 10 / page with no more than 10 records and view the pagination display
2. Select 30 / page with no more than 30 records and view the pagination display
3. Select 50 / page with no more than 50 records and view the pagination display
1. At 10 / page with no more than 10 records, everything fits on 1 page
2. At 30 / page with no more than 30 records, everything fits on 1 page
3. At 50 / page with no more than 50 records, everything fits on 1 page
  • [ ] finish
137 workflow - workflow instance verify pagination control, more than 10 P2 start API service 1 Select 10 / page with more than 10 records and view the pagination display
2 Select 30 / page with more than 30 records and view the pagination display
3 Select 50 / page with more than 50 records and view the pagination display
1 At 10 / page with more than 10 records, the data is paged across multiple pages
2 At 30 / page with more than 30 records, the data is paged across multiple pages
3 At 50 / page with more than 50 records, the data is paged across multiple pages
  • [ ] finish
138 workflow - workflow instance verify the ">" button of the pagination control P2 start API service click the ">" button and view the pagination display The page jumps to the next page
  • [ ] finish
139 workflow - workflow instance verify the "<" button of the pagination control P2 start API service click the "<" button and view the pagination display The page jumps to the previous page
  • [ ] finish
140 workflow - workflow instance verify the input box of the pagination control P2 start the API service enter a page number and view the pagination display The page jumps to the entered page number
  • [ ] finish
141 workflow - task instance verify task instance query criteria P1 start API service 1 Select query criteria: task name, workflow instance, execution user, host, status and time
2 Click the "query" button
Task name and host support fuzzy query, and the correct task instance data is returned according to the query criteria
  • [ ] finish
142 workflow - task instance correctness of task instance list data P2 start API service enter the task instance page and view the task list header and data The task list header and data are displayed correctly
  • [ ] finish
143 workflow - task instance view log P1 start API service enter the task instance page and click the "view log" button View the running log. Large logs are loaded in chunks: they can be scrolled up and down, with each chunk requested on demand
  • [ ] finish
144 workflow - task instance workflow instance name link P1 start API service enter the task instance page, click the workflow instance name link The workflow instance DAG page opens
  • [ ] finish
145 workflow - task instance verify pagination control, no more than 10 P2 start API service 1 Select 10 / page with no more than 10 records and view the pagination display
2 Select 30 / page with no more than 30 records and view the pagination display
3 Select 50 / page with no more than 50 records and view the pagination display
1 At 10 / page with no more than 10 records, everything fits on 1 page
2 At 30 / page with no more than 30 records, everything fits on 1 page
3 At 50 / page with no more than 50 records, everything fits on 1 page
  • [ ] finish
146 workflow - task instance verify pagination control, more than 10 P2 start API service 1 Select 10 / page with more than 10 records and view the pagination display
2 Select 30 / page with more than 30 records and view the pagination display
3 Select 50 / page with more than 50 records and view the pagination display
1 At 10 / page with more than 10 records, the data is paged across multiple pages
2 At 30 / page with more than 30 records, the data is paged across multiple pages
3 At 50 / page with more than 50 records, the data is paged across multiple pages
  • [ ] finish
147 workflow - task instance verify the ">" button of the pagination control P2 start API service click the ">" button and view the pagination display The page jumps to the next page
  • [ ] finish
148 workflow - task instance verify the "<" button of the pagination control P2 start API service click the "<" button and view the pagination display The page jumps to the previous page
  • [ ] finish
149 workflow - task instance verify the input box of the pagination control P2 start the API service enter a page number and view the pagination display The page jumps to the entered page number (a hedged pagination-check sketch follows this table)
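
Cases 136-140 and 145-149 all exercise the same pagination widget, so a single helper could cover the at-most-N-rows check. Below is a rough Selenium sketch of that check; the CSS selectors are assumptions for illustration, since the actual e2e suite drives page objects under dolphinscheduler-e2e rather than raw locators.

```java
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;

// Hedged sketch of a pagination check in the spirit of cases 136/145.
// All selectors below are hypothetical, not the real UI's class names.
public final class PaginationCheckSketch {

    static void assertAtMostPerPage(WebDriver driver, int pageSize) {
        // Pick "pageSize / page" in the page-size dropdown (selector assumed).
        driver.findElement(By.cssSelector(".pagination .page-size-" + pageSize)).click();

        // Count the rows currently rendered in the list (selector assumed).
        int rows = driver.findElements(By.cssSelector(".instance-list tr.row")).size();
        if (rows > pageSize) {
            throw new AssertionError(
                "expected at most " + pageSize + " rows on one page, got " + rows);
        }
    }
}
```

The same helper would be called with 10, 30 and 50 to cover the three steps of each pagination case.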

Use case

No response

Related issues

No response

Are you willing to submit a PR?

Code of Conduct

github-actions[bot] commented 2 years ago

Hi:

caishunfeng commented 2 years ago

I will add the switch task e2e case.

yangyunxi commented 2 years ago

@caishunfeng Could you help me look at the E2E problem? It's on Slack.

yangyunxi commented 2 years ago

I will add the E2E case of the third project list

zhongjiajie commented 2 years ago

Which e2e case do you want to do? @yangyunxi workflow create or modify should be in the latter

yangyunxi commented 2 years ago

Number is 3

zhongjiajie commented 2 years ago

@yangyunxi It seems we already have an E2E test for project creation in https://github.com/apache/dolphinscheduler/blob/67cc260d52c1f2e5aa7db76aa8621cdd0f8c4ee0/dolphinscheduler-e2e/dolphinscheduler-e2e-case/src/test/java/org/apache/dolphinscheduler/e2e/cases/ProjectE2ETest.java#L49

Do you mind changing to another one?

yangyunxi commented 2 years ago

Number is 5, view item list data

zhongjiajie commented 2 years ago

@yangyunxi yeah, you're right. Maybe you could add a case for editing the project description, and also the delete-project case (a rough sketch follows).
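
For the cases being picked up here (project description edit, project delete), a rough JUnit 5 + Selenium sketch in the spirit of the linked ProjectE2ETest; the URL, selectors and project name are all assumptions, since the real suite goes through its own page objects and a containerized environment:

```java
import org.junit.jupiter.api.Test;
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;

// Rough sketch only: the real e2e cases in dolphinscheduler-e2e-case use
// shared page objects and a containerized environment, not raw locators.
class ProjectDeleteE2ESketch {

    @Test
    void deleteProjectWithoutWorkflow() {
        WebDriver driver = new ChromeDriver();
        try {
            // Assumed local UI address; adjust to the environment under test.
            driver.get("http://localhost:12345/dolphinscheduler/ui/projects");

            // Click "delete" on the target project's row and confirm (selectors assumed).
            driver.findElement(By.cssSelector("tr[data-project='e2e-test'] .btn-delete")).click();
            driver.findElement(By.cssSelector(".popconfirm .btn-confirm")).click();

            // Expectation from the matrix above: a project without workflow
            // definitions can be deleted and disappears from the list.
            if (!driver.findElements(By.cssSelector("tr[data-project='e2e-test']")).isEmpty()) {
                throw new AssertionError("project row should be gone after deletion");
            }
        } finally {
            driver.quit();
        }
    }
}
```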