Ouranosinc / PAVICS-e2e-workflow-tests

Test user-level workflow.
Apache License 2.0

Increase the number of requests in the stress test, using multiple threads #90

Closed ahandan closed 2 years ago

ahandan commented 2 years ago


Overview

Enhancement for the stress_test. Increase the number of requests by using multiple threads, to maximize the chance of making requests in parallel. The increased number of requests also raises the chance of forcing the cache to run a refresh cycle: by filling it up and making requests that overlap the cache refresh cycle, we can reproduce a known issue (NoTransaction, DetachedInstance).

Changes

Related Issue / Discussion

This enhancement of the stress test references the Magpie issue: More caching/session re-connect and logging #473 (https://github.com/Ouranosinc/Magpie/pull/473)

review-notebook-app[bot] commented 2 years ago

Check out this pull request on ReviewNB

See visual diffs & provide feedback on Jupyter Notebooks.


Powered by ReviewNB

tlvu commented 2 years ago

@ahandan if this PR is ready for a review from me, please explicitly set me as the reviewer. Otherwise I won't be touching it.

That said, I was curious and opened it up in ReviewNB (https://app.reviewnb.com/Ouranosinc/PAVICS-e2e-workflow-tests/pull/90/) and I ended up with this error: "Invalid notebook file. Please make sure both before and after version of the file contains valid notebook JSON."

ahandan commented 2 years ago

Changes:

ahandan commented 2 years ago

@fmigneault, I agree about the Timestamps. I don't know exactly how to make them look good. I tried to align them on the left, but then it's hard to keep track of the line. For me, it's better that the columns are close to each other but separated enough to distinguish them. So for now, I've set w = 16 and added 5 spaces before the timestamps so that they are separated from Times.
Let me know what you think about it: is it clear, or should we just align everything on the right?

fmigneault commented 2 years ago

@ahandan

You have to look at the fields this way: since you set w=16, and the format uses :>, fields are right-aligned to that width.

Because Codes, Delta and Times all contain values shorter than 16, the full field width is 16. The spacing between those "headers" is 11 spaces in each case, because 16 - len("Codes") = 11. Since Timestamps is 10 chars (5 more than "Codes", "Delta", "Times"), you need to add 5 (in the header variable, for the last field) to keep the same distance.

The timestamp values themselves are 20 chars. This is >16, so the field overflows to the left. You could simply use a pseudo left-align for the timestamp values by right-aligning them with 26 (or 20 + whatever extra spacing you add to the Timestamps header).
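
For illustration, a minimal sketch of that alignment (the Run column width and the sample values are my assumptions, not the notebook's actual code):

# Right-align every field to w = 16 with the ':>' specifier; give the
# Timestamps column w + 5 so it stays separated from Times, as described above.
w = 16
header = f"{'Run':>8}{'Codes':>{w}}{'Delta':>{w}}{'Times':>{w}}{'Timestamps':>{w + 5}}"
row = f"{1:>8}{'(!) 500':>{w}}{'0.000s':>{w}}{'0.137s':>{w}}{'2021-11-22 17:32:38':>{w + 5}}"
print(header)
print(row)
# The timestamp overflows a width-16 field; right-aligning it within w + 5
# produces the pseudo left-align mentioned above.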

image

tlvu commented 2 years ago

ensure all the diagnostic printout is available for the failed run.

Below is what I meant, taken from https://daccs-jenkins.crim.ca/job/PAVICS-e2e-workflow-tests/job/master/570/console

All the outputs for the failed run are printed, but only for the failed run, so we are not drowned under the outputs of the good runs.

10:34:59  _____________________ notebooks/stress-tests.ipynb::Cell 2 _____________________
10:34:59  Notebook cell execution failed
10:34:59  Cell 2: Cell execution caused an exception
10:34:59  
10:34:59  Input:
10:34:59  # NBVAL_IGNORE_OUTPUT
10:34:59  
10:34:59  
10:34:59  failed_count = 0
10:34:59  failed_results = ""
10:34:59  for bird in TEST_WPS_BIRDS:
10:34:59      bird_url = f"{TWITCHER_URL}/{bird}/wps?service=wps&request=getcapabilities"
10:34:59      expect_status_code = 200
10:34:59      results = stress_test_requests(bird_url, runs=TEST_RUNS, code=expect_status_code,
10:34:59                                     max_err_code=TEST_MAX_ERR_CODE, max_avg_time=TEST_MAX_AVG_TIME,
10:34:59                                     abort_retries=TEST_TIMEOUT_RETRY, abort_timeout=TEST_TIMEOUT_ABORT)
10:34:59      print(results)
10:34:59      if results.status != 0:
10:34:59          failed_count += 1
10:34:59          failed_results = f"{failed_results}\n{results}"
10:34:59  assert failed_count == 0, f"Failed {failed_count} tests.  Failed results: {failed_results}"
10:34:59  
10:34:59  print("\nAll tests passed!")
10:34:59  
10:34:59  Traceback:
10:34:59  
10:34:59  ---------------------------------------------------------------------------
10:34:59  AssertionError                            Traceback (most recent call last)
10:34:59  <ipython-input-3-7c395b9a85cd> in <module>
10:34:59       14         failed_count += 1
10:34:59       15         failed_results = f"{failed_results}\n{results}"
10:34:59  ---> 16 assert failed_count == 0, f"Failed {failed_count} tests.  Failed results: {failed_results}"
10:34:59       17 
10:34:59       18 print("\nAll tests passed!")
10:34:59  
10:34:59  AssertionError: Failed 1 tests.  Failed results: 
10:34:59  Stress Test:
10:34:59      Test:
10:34:59          code: 200
10:34:59          runs: 100
10:34:59          max-avg-time:  1s
10:34:59          max-err-code:  0
10:34:59          sum-err-code:  1
10:34:59          timeout-abort: 0s
10:34:59          timeout-retry: 0
10:34:59          timeout-count: 1
10:34:59      Request:
10:34:59          method: GET
10:34:59          url:    https://medus.ouranos.ca/twitcher/ows/proxy/flyingpigeon/wps?service=wps&request=getcapabilities
10:34:59          args:   {}
10:34:59      Times:
10:34:59          min: 0.142s
10:34:59          avg: 0.254s
10:34:59          max: 5.000s
10:34:59      Results:
10:34:59          Run     Codes     Delta     Times
10:34:59            1       200    0.000s    0.197s
10:34:59            2       200    0.031s    0.162s
10:34:59            3       200    0.044s    0.153s
10:34:59            4       200    0.058s    0.159s
10:34:59            5       200    0.007s    0.163s
10:34:59            6       200    0.042s    0.156s
10:34:59            7       200    0.032s    0.174s
10:34:59            8       200    0.098s    0.159s
10:34:59            9       200    0.051s    0.171s
10:34:59           10       200    0.095s    0.152s
10:34:59           11       200    0.061s    0.163s
10:34:59           12       200    0.030s    0.171s
10:34:59           13       200    0.035s    0.161s
10:34:59           14       200    0.087s    0.157s
10:34:59           15       200    0.014s    0.158s
10:34:59           16       200    0.086s    0.166s
10:34:59           17       200    0.029s    0.159s
10:34:59           18       200    0.045s    0.163s
10:34:59           19       200    0.082s    0.176s
10:34:59           20       200    0.029s    0.163s
10:34:59           21       200    0.084s    0.161s
10:34:59           22       200    0.065s    0.182s
10:34:59           23       200    0.007s    0.165s
10:34:59           24       200    0.041s    0.154s
10:34:59           25       200    0.022s    0.165s
10:34:59           26       200    0.070s    0.172s
10:34:59           27       200    0.080s    0.163s
10:34:59           28       200    0.080s    0.158s
10:34:59           29       200    0.040s    0.160s
10:34:59           30       200    0.049s    0.157s
10:34:59           31       200    0.090s    0.171s
10:34:59           32       200    0.018s    0.156s
10:34:59           33       200    0.007s    0.156s
10:34:59           34       200    0.078s    0.166s
10:34:59           35       200    0.032s    0.162s
10:34:59           36       200    0.033s    0.164s
10:34:59           37       200    0.073s    0.159s
10:34:59           38       200    0.082s    0.148s
10:34:59           39       200    0.075s    0.174s
10:34:59           40       200    0.025s    0.166s
10:34:59           41       200    0.001s    0.159s
10:34:59           42       200    0.037s    0.161s
10:34:59           43       200    0.015s    0.162s
10:34:59           44       200    0.023s    0.163s
10:34:59           45       200    0.029s    0.162s
10:34:59           46       200    0.092s    0.156s
10:34:59           47       200    0.068s    0.167s
10:34:59           48       200    0.004s    0.169s
10:34:59           49       200    0.037s    0.146s
10:34:59           50       200    0.092s    0.154s
10:34:59           51       200    0.068s    0.159s
10:34:59           52       200    0.012s    0.147s
10:34:59           53       200    0.035s    0.156s
10:34:59           54       200    0.042s    0.155s
10:34:59           55       200    0.058s    0.162s
10:34:59           56       200    0.028s    0.163s
10:34:59           57       200    0.026s    0.175s
10:34:59           58       200    0.027s    0.159s
10:34:59           59       200    0.092s    0.165s
10:34:59           60       200    0.023s    0.166s
10:34:59           61       200    0.085s    0.168s
10:34:59           62       200    0.090s    0.182s
10:34:59           63       200    0.054s    0.164s
10:34:59           64       200    0.025s    0.179s
10:34:59           65       200    0.072s    0.162s
10:34:59           66       200    0.021s    0.171s
10:34:59           67       200    0.029s    0.145s
10:34:59           68       200    0.050s    0.152s
10:34:59           69       200    0.059s    0.154s
10:34:59           70       200    0.070s    0.173s
10:34:59           71       200    0.091s    0.150s
10:34:59           72       200    0.035s    0.142s
10:34:59           73       200    0.098s    0.158s
10:34:59           74   (!) 408    0.097s    5.000s
10:34:59           75       200    0.049s    4.547s
10:34:59           76       200    0.028s    0.165s
10:34:59           77       200    0.035s    0.163s
10:34:59           78       200    0.024s    0.145s
10:34:59           79       200    0.063s    0.162s
10:34:59           80       200    0.089s    0.176s
10:34:59           81       200    0.003s    0.155s
10:34:59           82       200    0.085s    0.159s
10:34:59           83       200    0.071s    0.183s
10:34:59           84       200    0.067s    0.163s
10:34:59           85       200    0.063s    0.154s
10:34:59           86       200    0.018s    0.146s
10:34:59           87       200    0.099s    0.168s
10:34:59           88       200    0.076s    0.145s
10:34:59           89       200    0.071s    0.145s
10:34:59           90       200    0.098s    0.156s
10:34:59           91       200    0.076s    0.159s
10:34:59           92       200    0.001s    0.164s
10:34:59           93       200    0.026s    0.154s
10:34:59           94       200    0.064s    0.160s
10:34:59           95       200    0.088s    0.157s
10:34:59           96       200    0.076s    0.152s
10:34:59           97       200    0.016s    0.163s
10:34:59           98       200    0.076s    0.157s
10:34:59           99       200    0.068s    0.159s
10:34:59          100       200    0.099s    0.152s
10:34:59      Summary:
10:34:59          Detected 1 erroneous HTTP codes not equal to expected 200.
10:34:59          Test failed (status=-1).
ahandan commented 2 years ago

The test is working and raising AssertionErrors based on the success/failure of a thread, which in turn depends on the success/failure of the stress_test.

I added a fail_report method to the StressTestProgression that is called when a thread/test fails.
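
Roughly, the runner looks like this (a reconstruction from the tracebacks quoted in this thread; the StressTestProgression internals are my assumptions, only the names come from the tracebacks):

import threading

class StressTestProgression:
    # Assumed internals: a thread-safe failure counter plus the fail_report()
    # used in the assert below.
    def __init__(self):
        self._lock = threading.Lock()
        self.failed_threads_count = 0
        self._reports = []

    def record_failure(self, report):
        with self._lock:
            self.failed_threads_count += 1
            self._reports.append(report)

    def fail_report(self):
        return "\n".join(str(r) for r in self._reports)

def run_threads(target_test, url, execution_id, n_threads, runs_per_threads):
    # execution_id appears in the real signature (see tracebacks); unused here.
    progression = StressTestProgression()
    threads = [
        threading.Thread(target=target_test,
                         args=(url, runs_per_threads, progression))
        for _ in range(n_threads)
    ]
    for thread in threads:
        thread.start()
    for thread in threads:
        thread.join()
    # One assert over all threads, so the whole report lands in the Jenkins log.
    assert progression.failed_threads_count == 0, progression.fail_report()
    print(f"All threads passed {target_test.__name__} function.")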

For now the test is set to:

WPS: Threads: 3, Runs: 100
THREDDS: Threads: 5, Runs: 500

Also added these env vars that can be changed or set for a specific test build/run:

TEST_RUNS_THREDDS  # number of requests for thredds test
TEST_N_THREADS_WPS # number of threads testing in parallel each bird
TEST_N_THREADS_THREDDS   # number of threads testing in parallel thredds
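
A minimal sketch of how these could be consumed (defaults mirror the counts above; the notebook's actual parsing may differ):

import os

# Fall back to the thread/run counts mentioned above when the env vars are unset.
TEST_RUNS_THREDDS = int(os.getenv("TEST_RUNS_THREDDS", "500"))
TEST_N_THREADS_WPS = int(os.getenv("TEST_N_THREADS_WPS", "3"))
TEST_N_THREADS_THREDDS = int(os.getenv("TEST_N_THREADS_THREDDS", "5"))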

Here is an example of a failed test result: Failed test output

fmigneault commented 2 years ago

I'm fine with the way results are reported. Maybe just check the output format. I'm not sure if it's on my side only or in the real output. It is somewhat hard to read:

19:14:24  ---------------------------------------------------------------------------
19:14:24  AssertionError                            Traceback (most recent call last)
19:14:24  /tmp/ipykernel_1032/1504455324.py in <module>
19:14:24        7             url = thredds_url,
19:14:24        8             n_threads = TEST_N_THREADS_THREDDS,
19:14:24  ----> 9             runs_per_threads = TEST_RUNS_THREDDS)
19:14:24  
19:14:24  /tmp/ipykernel_1032/4243812779.py in run_threads(target_test, url, execution_id, n_threads, runs_per_threads)
19:14:24       29         thread.join()
19:14:24       30 
19:14:24  ---> 31     assert stressTestProgression.failed_count == 0, stressTestProgression.fail_report()
19:14:24       32     print(f"All threads passed {target_test.__name__} function.")
19:14:24       33 
19:14:24  
19:14:24  AssertionError: This test failed due to bad code received from the requests.
19:14:24   Number of failed thread(s) : 3/ 5
19:14:24   Number of failed requests : 5/2500
19:14:24   First failed request timestamp : 2021-11-17 00:13:29
ahandan commented 2 years ago

@fmigneault as for the weird outputs, these (�[0;31m) are ANSI color escape codes in the output.
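
For reference, a small sketch of stripping those escape sequences (not part of this PR):

import re

# "\x1b[0;31m" is the red color escape that shows up mangled as "�[0;31m".
ansi_escape = re.compile(r"\x1b\[[0-9;]*m")
print(ansi_escape.sub("", "\x1b[0;31mAssertionError\x1b[0m"))  # AssertionError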

Console output

vs

Plain text

Looks like it's a Jenkins problem. Issue

tlvu commented 2 years ago

In case it's not clear, stress-test.ipynb has 2 purposes:

For the Jenkins purpose: the result displayed when it fails in Jenkins is what matters. All the nice-looking cell outputs do not make it to Jenkins, and there is no way to retrieve them after the fact.

In the previous implementation, the failed outputs were saved and passed to the assert error message. That was how we displayed only the failed cell outputs when a failure occurred during a Jenkins run.

fmigneault commented 2 years ago

@ahandan

I agree with @tlvu that, for convenience, it would be great to have the failing subset of threads/URLs reported directly in the raised message. Sorry, this changes the output summary we previously discussed.

Do you have an ETA to implement this? This PR is blocking #197 and another one coming soon by @ChaamC. If this adjustment takes too long, I would rather have the current iteration integrated and a subsequent update for the output formatting.

tlvu commented 2 years ago

I agree with @tlvu that, for convenience, it would be great to have the failing subset of threads/URLs reported directly in the raised message.

It's not just a "convenience". A Jenkins build failure is useless if we are unable to get the failing info to diagnose it!

This PR is blocking https://github.com/bird-house/birdhouse-deploy/pull/197 and another one coming soon by @ChaamC. If this adjustment takes too long, I would rather have the current iteration integrated and a subsequent update for the output formatting.

If it helps remove pressure and unblock things, I can accept merging this PR as-is, as long as a follow-up PR with proper error reporting on Jenkins failures is done promptly right after.

fmigneault commented 2 years ago

It's not just a "convenience". A Jenkins build failure is useless if we are unable to get the failing info to diagnose it!

Everything is available in the notebook outputs. It's just more trouble to load them in Jupyter than to have them already in the logs.

ahandan commented 2 years ago

OK, no problem with changing the outputs. I have already saved the exceptions and failed results in case we change our minds.

I changed the assert printout to print only the failed runs, using the same format as the results output. Also added an Exceptions section printing the reason, e.g. (500 : Internal Server Error).

@tlvu Let me know what you would rather have in the Exceptions area:

  1. The entire error message, like:
    <ExceptionReport version="1.0.0"
    xmlns="http://www.opengis.net/ows/1.1"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://www.opengis.net/ows/1.1 http://schemas.opengis.net/ows/1.1.0/owsExceptionReport.xsd">
    <Exception exceptionCode="NoApplicableCode" locator="NoApplicableCode">
        <ExceptionText>Unknown Error</ExceptionText>
    </Exception>
    </ExceptionReport>
  2. A simple: 500 : Internal Server Error

Here is an example of the output from the test: Test 1 Test 2

12:32:50  ---------------------------------------------------------------------------
12:32:50  AssertionError                            Traceback (most recent call last)
12:32:50  /tmp/ipykernel_1067/1504455324.py in <module>
12:32:50        7             url = thredds_url,
12:32:50        8             n_threads = TEST_N_THREADS_THREDDS,
12:32:50  ----> 9             runs_per_threads = TEST_RUNS_THREDDS)
12:32:50  
12:32:50  /tmp/ipykernel_1067/1081959450.py in run_threads(target_test, url, execution_id, n_threads, runs_per_threads)
12:32:50       37     assert (
12:32:50       38         stress_test_progression.failed_threads_count == 0
12:32:50  ---> 39     ), stress_test_progression.fail_report()
12:32:50       40 
12:32:50       41     print(f"All threads passed {target_test.__name__} function.")
12:32:50  
12:32:50  AssertionError: This test failed due to bad code received from the requests.
12:32:50   Number of failed thread(s) : 1/ 5
12:32:50   Number of failed requests : 1/500
12:32:50  
12:32:50  
12:32:50   Stress Test:
12:32:50      Test:
12:32:50          code: 200
12:32:50          runs: 100
12:32:50          max-avg-time:  1s
12:32:50          max-err-code:  0
12:32:50          sum-err-code:  1
12:32:50          timeout-abort: 0s
12:32:50          timeout-retry: 0
12:32:50          timeout-count: 0
12:32:50      Request:
12:32:50          method: GET
12:32:50          url:    https://hirondelle.crim.ca/twitcher/ows/proxy/thredds/catalog/birdhouse/testdata/catalog.html?dataset=birdhouse/testdata/ta_Amon_MRI-CGCM3_decadal1980_r1i1p1_199101-200012.nc
12:32:50          args:   {}
12:32:50      Times:
12:32:50          min: 0.137s
12:32:50          avg: 0.001s
12:32:50          max: 0.137s
12:32:50      Results:
12:32:50          Run                 Codes                 Delta                 Times            Timestamps
12:32:50            1               (!) 500                0.000s                0.137s   2021-11-22 17:32:38
12:32:50      Summary:
12:32:50          Detected 1 erroneous HTTP codes not equal to expected 200.
12:32:50          Test failed (status=-1).
12:32:50   === Exceptions ===
12:32:50   1. Internal Server Error
12:32:50  
12:32:50  =========================== short test summary info ============================
tlvu commented 2 years ago

It's not just a "convenience". A Jenkins build failure is useless if we are unable to get the failing info to diagnose it!

Everything is available in the notebook outputs. It's just more trouble to load them in Jupyter than to have them already in the logs.

The actual notebook outputs are discarded!

The outputs that are being saved are the result of another run, and in the case of an intermittent problem like this caching issue, there is no guarantee we will get the same result. Also, another run means the timestamps do not match.

See https://github.com/Ouranosinc/PAVICS-e2e-workflow-tests/blob/9ad91bd0c26986abc7f517d2d39b811ede3c3cc4/Jenkinsfile#L83-L85

tlvu commented 2 years ago

I changed the assert printout to print only the failed runs, using the same format as the results output. Also added an Exceptions section printing the reason, e.g. (500 : Internal Server Error).

@tlvu Let me know what you would rather have in the Exceptions area:

1. The entire error message, like:
<ExceptionReport version="1.0.0"
    xmlns="http://www.opengis.net/ows/1.1"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://www.opengis.net/ows/1.1 http://schemas.opengis.net/ows/1.1.0/owsExceptionReport.xsd">
    <Exception exceptionCode="NoApplicableCode" locator="NoApplicableCode">
        <ExceptionText>Unknown Error</ExceptionText>
    </Exception>
</ExceptionReport>
2. A simple: 500 : Internal Server Error

Can we combine both, so we have the bad code and the actual error matching that code? This will be particularly useful when we have multiple failing codes. For example, we have seen in the past that both 500 and 408 can happen in the same run. Having the actual error matching the code will speed up diagnosis.

Here is an example of the output from the test: Test 1 Test 2


Oh, this is great!

So I see that combining both the status code and the actual error for the exception would look like:

 === Exceptions ===
1. 500 : Internal Server Error
<ExceptionReport version="1.0.0"
     xmlns="http://www.opengis.net/ows/1.1"
     xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
     xsi:schemaLocation="http://www.opengis.net/ows/1.1 http://schemas.opengis.net/ows/1.1.0/owsExceptionReport.xsd">
     <Exception exceptionCode="NoApplicableCode" locator="NoApplicableCode">
         <ExceptionText>Unknown Error</ExceptionText>
     </Exception>
 </ExceptionReport>

3. 408 : Request Timeout
 <ExceptionReport>
    blah blah blah ...
 </ExceptionReport>
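
Something along these lines could assemble that section (a rough sketch; the names are illustrative, not the PR's actual code):

def format_exceptions(failures):
    # failures: list of (status_code, reason, body) tuples, e.g.
    # (500, "Internal Server Error", "<ExceptionReport>...</ExceptionReport>")
    lines = ["=== Exceptions ==="]
    for i, (code, reason, body) in enumerate(failures, start=1):
        lines.append(f"{i}. {code} : {reason}")
        lines.append(body)
    return "\n".join(lines)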
ahandan commented 2 years ago

@tlvu @fmigneault Here is the latest test output: The test Jenkins_Test

10:12:59  ---------------------------------------------------------------------------
10:12:59  AssertionError                            Traceback (most recent call last)
10:12:59  /tmp/ipykernel_1135/1504455324.py in <module>
10:12:59        7             url = thredds_url,
10:12:59        8             n_threads = TEST_N_THREADS_THREDDS,
10:12:59  ----> 9             runs_per_threads = TEST_RUNS_THREDDS)
10:12:59  
10:12:59  /tmp/ipykernel_1135/1081959450.py in run_threads(target_test, url, execution_id, n_threads, runs_per_threads)
10:12:59       37     assert (
10:12:59       38         stress_test_progression.failed_threads_count == 0
10:12:59  ---> 39     ), stress_test_progression.fail_report()
10:12:59       40 
10:12:59       41     print(f"All threads passed {target_test.__name__} function.")
10:12:59  
10:12:59  AssertionError: This test failed due to bad code received from the requests.
10:12:59   Number of failed thread(s) : 3/ 5
10:12:59   Number of failed requests : 7/2500
10:12:59  
10:12:59  
10:12:59   Stress Test:
10:12:59      Test:
10:12:59          code: 200
10:12:59          runs: 500
10:12:59          max-avg-time:  1s
10:12:59          max-err-code:  0
10:12:59          sum-err-code:  2
10:12:59          timeout-abort: 0s
10:12:59          timeout-retry: 0
10:12:59          timeout-count: 0
10:12:59      Request:
10:12:59          method: GET
10:12:59          url:    https://hirondelle.crim.ca/twitcher/ows/proxy/thredds/catalog/birdhouse/testdata/catalog.html?dataset=birdhouse/testdata/ta_Amon_MRI-CGCM3_decadal1980_r1i1p1_199101-200012.nc
10:12:59          args:   {}
10:12:59      Times:
10:12:59          min: 0.064s
10:12:59          avg: 0.000s
10:12:59          max: 0.087s
10:12:59      Results:
10:12:59          Run                 Codes                 Delta                 Times            Timestamps
10:12:59            1               (!) 500                0.052s                0.087s   2021-11-23 15:11:52
10:12:59            2               (!) 500                0.075s                0.064s   2021-11-23 15:12:13
10:12:59      Summary:
10:12:59          Detected 2 erroneous HTTP codes not equal to expected 200.
10:12:59          Test failed (status=-1).
10:12:59  
10:12:59   === Exceptions ===
10:12:59  
10:12:59   1. 500 Internal Server Error
10:12:59   <?xml version="1.0" encoding="utf-8"?>
10:12:59  <ExceptionReport version="1.0.0"
10:12:59      xmlns="http://www.opengis.net/ows/1.1"
10:12:59      xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
10:12:59      xsi:schemaLocation="http://www.opengis.net/ows/1.1 http://schemas.opengis.net/ows/1.1.0/owsExceptionReport.xsd">
10:12:59      <Exception exceptionCode="NoApplicableCode" locator="NoApplicableCode">
10:12:59          <ExceptionText>Unknown Error</ExceptionText>
10:12:59      </Exception>
10:12:59  </ExceptionReport>
10:12:59  
10:12:59   2. 500 Internal Server Error
10:12:59   <?xml version="1.0" encoding="utf-8"?>
10:12:59  <ExceptionReport version="1.0.0"
10:12:59      xmlns="http://www.opengis.net/ows/1.1"
10:12:59      xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
10:12:59      xsi:schemaLocation="http://www.opengis.net/ows/1.1 http://schemas.opengis.net/ows/1.1.0/owsExceptionReport.xsd">
10:12:59      <Exception exceptionCode="NoApplicableCode" locator="NoApplicableCode">
10:12:59          <ExceptionText>Unknown Error</ExceptionText>
10:12:59      </Exception>
10:12:59  </ExceptionReport>
10:12:59  
10:12:59