deephaven / deephaven-core

deephaven.learn v2 SIGSEGV crashes with PyTorch and TensorFlow #2176

Open jjbrosnan opened 2 years ago

jjbrosnan commented 2 years ago

Description

deephaven.learn, when used in conjunction with PyTorch or TensorFlow, causes SIGSEGV crashes. According to the logs, these crashes are SEGV_MAPERR. This occurs with our supported version of PyTorch (1.10.2) used in the server-pytorch image. It also occurs with the latest TensorFlow (2.8.0), and I am going to test with the supported version of TensorFlow from the server-tensorflow image shortly.

Steps to reproduce

From a Python session:

import os
os.system("pip install torch==1.10.2")

That takes a couple of minutes. Then run this:

# Deephaven imports
from deephaven import DynamicTableWriter
from deephaven import dtypes as dht
from deephaven.learn import gather
from deephaven.csv import read
from deephaven import learn

# Machine learning imports
import torch
import torch.nn as nn
import torch.nn.functional as F

# Python imports
import numpy as np, random, threading, time

# Read and quantize the Iris dataset
iris_raw = read("https://media.githubusercontent.com/media/deephaven/examples/main/Iris/csv/iris.csv")

classes = {}
num_classes = 0
def get_class_number(c):
    global classes, num_classes
    if c not in classes:
        classes[c] = num_classes
        num_classes += 1
    return classes[c]

iris = iris_raw.update(formulas = ["Class = (int)(byte)get_class_number(Class)"])

# Our neural network class
class IrisANN(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(in_features = 4, out_features = 16)
        self.fc2 = nn.Linear(in_features = 16, out_features = 12)
        self.output = nn.Linear(in_features = 12, out_features = 3)

    def forward(self, x):
        x = x.float()
        x = F.relu(self.fc1(x))
        x = F.relu(self.fc2(x))
        x = self.output(x)
        return x

# Create the neural network
model = IrisANN()

# A function that trains the model
def train_model(X_train, Y_train):
    global model
    # Set training parameters
    criterion = nn.CrossEntropyLoss()
    optimizer = torch.optim.Adam(model.parameters(), lr = 0.01)
    epochs = 100

    loss_arr = []

    for i in range(epochs):
        Y_hat = model.forward(X_train)
        loss = criterion(Y_hat, Y_train.long())
        loss_arr.append(loss)

        if i % 10 == 0:
            print(f'Epoch: {i} Loss: {loss}')

        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

# A function to gather data from table columns into a torch tensor of doubles
def table_to_tensor_double(rows, cols):
    return torch.from_numpy(gather.table_to_numpy_2d(rows, cols, np_type = np.double))

# A function to gather data from table columns into a torch tensor of integers
def table_to_tensor_int(rows, cols):
    return torch.from_numpy(np.squeeze(gather.table_to_numpy_2d(rows, cols, np_type = np.intc)))

# A function to extract a prediction and cast the value to an integer
def get_predicted_class(data, idx):
    return int(data[idx])

That code should run without issue. This next code block will result in the error:

learn.learn(
    table = iris,
    model_func = train_model,
    inputs = [learn.Input(["SepalLengthCM", "SepalWidthCM", "PetalLengthCM", "PetalWidthCM"], table_to_tensor_double), learn.Input("Class", table_to_tensor_int)],
    outputs = None,
    batch_size = 150
)

Expected results

The Torch model should train. Epoch and loss printouts from train_model (it prints every 10 epochs) should appear in the console window as well.

Actual results

The table viewer window in the console complains for a second, then the screen just goes blank. TRANSIENT_FAILURE messages appear in the terminal window, and a file is generated: hs_err_pid1.log (among other files).

Additional details and attachments

This file is from one of the TensorFlow crashes.

hs_err_pid1.log

I will include the TensorFlow code blocks in a comment.

Versions

jjbrosnan commented 2 years ago

TensorFlow code (note that I don't enforce a version in the pip install, so this could be from a version mismatch; I will be performing more rigorous testing of that now):

import os
os.system("pip install tensorflow")

# Deephaven imports
from deephaven import DynamicTableWriter
from deephaven import dtypes as dht
from deephaven.learn import gather
from deephaven.csv import read
from deephaven import learn

# Machine learning imports
import tensorflow as tf
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense

# Python imports
import numpy as np, random, threading, time

iris_raw = read("https://media.githubusercontent.com/media/deephaven/examples/main/Iris/csv/iris.csv")

classes = {}
num_classes = 0
def get_class_number(c):
    global classes, num_classes
    if c not in classes:
        classes[c] = num_classes
        num_classes += 1
    return classes[c]

iris = iris_raw.update(formulas = ["Class = (int)(byte)get_class_number(Class)"])

# Our neural network
model = Sequential()
model.add(Dense(16, activation = tf.nn.relu))
model.add(Dense(12, activation = tf.nn.relu))
model.add(Dense(3, activation = tf.nn.softmax))

# A function that trains the model
def train_model(X_train, Y_train):
    model.compile(optimizer = tf.keras.optimizers.Adam(learning_rate = 0.01), loss = tf.keras.losses.SparseCategoricalCrossentropy(from_logits = True), metrics = ["accuracy"])
    model.fit(x = X_train, y = Y_train, epochs = 100)

# A function that gets the model's predictions on input data
def predict_with_model(features):
    if features.ndim == 1:
        features = np.expand_dims(features, 0)
    predictions = model.predict(features)
    return np.array([np.argmax(item) for item in predictions], dtype = np.intc)

# A function to gather data from table columns into a NumPy array of doubles
def table_to_array_double(rows, cols):
    return gather.table_to_numpy_2d(rows, cols, np_type = np.double)

# A function to gather data from table columns into a NumPy array of integers
def table_to_array_int(rows, cols):
    return gather.table_to_numpy_2d(rows, cols, np_type = np.intc)

# A function to extract a list element and cast to an integer
def get_predicted_class(data, idx):
    return int(data[idx])

# Use the learn function to train our neural network
learn.learn(
    table = iris,
    model_func = train_model,
    inputs = [learn.Input(["SepalLengthCM", "SepalWidthCM", "PetalLengthCM", "PetalWidthCM"], table_to_array_double), learn.Input(["Class"], table_to_array_int)],
    outputs = None,
    batch_size = 150
)

niloc132 commented 2 years ago

I'm not able to reproduce this in docker (though on a Linux host OS); going to need some more info, or to debug on your machine when you are available.

jjbrosnan commented 2 years ago

Worth noting: I get the same SIGSEGV core dump if I just convert the table data to NumPy directly and attempt to train the neural network model.
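
Roughly, that direct path looks like the sketch below. This is only a sketch: it assumes the iris table and the Keras model from the TensorFlow block above are already defined, and that deephaven.pandas.to_pandas is available in this wheel.

import numpy as np
import tensorflow as tf
from deephaven.pandas import to_pandas  # assumed available in this version

# Pull the table into pandas, then into plain NumPy arrays -- no deephaven.learn involved
iris_df = to_pandas(iris)
X = iris_df[["SepalLengthCM", "SepalWidthCM", "PetalLengthCM", "PetalWidthCM"]].values.astype(np.double)
Y = iris_df["Class"].values.astype(np.intc)

# Training directly on the NumPy arrays still produces the same SIGSEGV core dump
model.compile(optimizer = tf.keras.optimizers.Adam(learning_rate = 0.01), loss = tf.keras.losses.SparseCategoricalCrossentropy(from_logits = True), metrics = ["accuracy"])
model.fit(x = X, y = Y, epochs = 100)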

niloc132 commented 2 years ago

OK, so we can't rule out DH breaking this yet (though on a call we did test removing the JVM's signal handlers, just in case these libraries use SIGSEGV like the JVM does and they are fighting in some way), but at least we know the problem isn't the fact that the engine is providing the data.

Next steps: use the modified repro steps (i.e., just NumPy data) and try to train on older DH releases. If we can identify the last version where it works and the first version where it fails, collect pip list from both and compare to see what might have changed.
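
Something like the sketch below would cover the "collect pip list from both and compare" step; the file names are placeholders, and it expects the raw pip list output from each version saved to a text file.

# Compare two saved `pip list` outputs (file names are hypothetical)
def parse_pip_list(path):
    pkgs = {}
    with open(path) as f:
        for line in f:
            parts = line.split()
            # Skip the header row, the separator row, and pip's warning lines
            if len(parts) == 2 and not parts[0].startswith(("-", "Package", "WARNING")):
                pkgs[parts[0]] = parts[1]
    return pkgs

old = parse_pip_list("pip-list-v0.9.txt")
new = parse_pip_list("pip-list-v0.10.txt")

for name in sorted(set(old) | set(new)):
    if old.get(name) != new.get(name):
        print(f"{name}: {old.get(name, 'missing')} -> {new.get(name, 'missing')}")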

Assigned to @jjbrosnan for triage.

jjbrosnan commented 2 years ago

This is the Python code that works outside of a Deephaven console. The module versions I have installed outside of Deephaven are numpy 1.21.5 and tensorflow 2.8.0. The code does not use anything Deephaven, but results in a SIGSEGV core dump when run from within DH.

import pandas as pd
import random
import tensorflow as tf
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Dropout

# Read and quantize the dataset
iris = pd.read_csv("https://media.githubusercontent.com/media/deephaven/examples/main/Iris/csv/iris.csv")
iris_mappings = {
    "Iris-setosa" : 0,
    "Iris-virginica" : 1,
    "Iris-versicolor" : 2
}
iris["Class"] = iris["Class"].apply(lambda x: iris_mappings[x])

# Split the DataFrame into training and testing sets
iris_shuffled = iris.sample(frac = 1)
train_size = int(0.75 * len(iris_shuffled))
train_set = iris_shuffled[:train_size]
test_set = iris_shuffled[train_size:]

# Separate our data into features and targets (X and Y)
X_train = train_set.drop("Class", axis = 1).values
Y_train = train_set["Class"].values
X_test = test_set.drop("Class", axis = 1).values
Y_test = test_set["Class"].values

# Create the ANN
model = Sequential()
model.add(Dense(16, input_shape = (4,), activation = tf.nn.relu))
model.add(Dense(12, activation = tf.nn.relu))
model.add(Dense(3, activation = tf.nn.softmax))

# Compile, fit, and evaluate the predictions of the ANN
model.compile(optimizer = tf.keras.optimizers.Adam(learning_rate = 0.01), loss = tf.keras.losses.SparseCategoricalCrossentropy(from_logits = True),  metrics = ["accuracy"])
model.fit(x = X_train, y = Y_train, epochs = 100)
model.evaluate(X_test, Y_test)

jjbrosnan commented 2 years ago

The code in the comment just above this one works just fine in deephaven v0.9. Here are the results from pip list:

deephaven-deployment-server-1      | 2022-04-01T20:03:03.616Z | r-Scheduler-Serial-1 |  INFO | .u.PythonDeephavenSession | Evaluating command: os.system("pip list")
deephaven-deployment-server-1      | Package                 Version
deephaven-deployment-server-1      | ----------------------- ---------
deephaven-deployment-server-1      | absl-py                 0.15.0
deephaven-deployment-server-1      | astunparse              1.6.3
deephaven-deployment-server-1      | cached-property         1.5.2
deephaven-deployment-server-1      | cachetools              4.2.4
deephaven-deployment-server-1      | certifi                 2021.10.8
deephaven-deployment-server-1      | charset-normalizer      2.0.12
deephaven-deployment-server-1      | clang                   5.0
deephaven-deployment-server-1      | deephaven               0.9.0
deephaven-deployment-server-1      | deephaven-jpy           0.9.0
deephaven-deployment-server-1      | deephaven2              0.9.0
deephaven-deployment-server-1      | dill                    0.3.4
deephaven-deployment-server-1      | flatbuffers             1.12
deephaven-deployment-server-1      | gast                    0.4.0
deephaven-deployment-server-1      | google-auth             1.35.0
deephaven-deployment-server-1      | google-auth-oauthlib    0.4.6
deephaven-deployment-server-1      | google-pasta            0.2.0
deephaven-deployment-server-1      | grpcio                  1.44.0
deephaven-deployment-server-1      | h5py                    3.1.0
deephaven-deployment-server-1      | idna                    3.3
deephaven-deployment-server-1      | importlib-metadata      4.11.3
deephaven-deployment-server-1      | keras                   2.6.0
deephaven-deployment-server-1      | Keras-Preprocessing     1.1.2
deephaven-deployment-server-1      | llvmlite                0.38.0
deephaven-deployment-server-1      | Markdown                3.3.6
deephaven-deployment-server-1      | numba                   0.55.0
deephaven-deployment-server-1      | numpy                   1.19.5
deephaven-deployment-server-1      | oauthlib                3.2.0
deephaven-deployment-server-1      | opt-einsum              3.3.0
deephaven-deployment-server-1      | pandas                  1.3.5
deephaven-deployment-server-1      | pip                     21.3.1
deephaven-deployment-server-1      | pkg_resources           0.0.0
deephaven-deployment-server-1      | protobuf                3.20.0
deephaven-deployment-server-1      | pyasn1                  0.4.8
deephaven-deployment-server-1      | pyasn1-modules          0.2.8
deephaven-deployment-server-1      | python-dateutil         2.8.2
deephaven-deployment-server-1      | pytz                    2021.3
deephaven-deployment-server-1      | requests                2.27.1
deephaven-deployment-server-1      | requests-oauthlib       1.3.1
deephaven-deployment-server-1      | rsa                     4.8
deephaven-deployment-server-1      | setuptools              60.5.0
deephaven-deployment-server-1      | six                     1.15.0
deephaven-deployment-server-1      | tensorboard             2.6.0
deephaven-deployment-server-1      | tensorboard-data-server 0.6.1
deephaven-deployment-server-1      | tensorboard-plugin-wit  1.8.1
deephaven-deployment-server-1      | tensorflow              2.6.3
deephaven-deployment-server-1      | tensorflow-estimator    2.6.0
deephaven-deployment-server-1      | termcolor               1.1.0
deephaven-deployment-server-1      | typing-extensions       3.10.0.2
deephaven-deployment-server-1      | urllib3                 1.26.9
deephaven-deployment-server-1      | Werkzeug                2.1.1
deephaven-deployment-server-1      | wheel                   0.37.1
deephaven-deployment-server-1      | wrapt                   1.12.1
deephaven-deployment-server-1      | zipp                    3.7.0
deephaven-deployment-server-1      | WARNING: You are using pip version 21.3.1; however, version 22.0.4 is available.
deephaven-deployment-server-1      | You should consider upgrading via the '/opt/deephaven-venv/bin/python -m pip install --upgrade pip' command.

jjbrosnan commented 2 years ago

The code works in deephaven v0.10 when building from pre-built images. Here's a diff of the pip list outputs between v0.10 and v0.9:

11,13c11,13
< deephaven-deployment-server-1      | deephaven               0.10.0
< deephaven-deployment-server-1      | deephaven-jpy           0.10.0
< deephaven-deployment-server-1      | deephaven2              0.10.0
---
> deephaven-deployment-server-1      | deephaven               0.9.0
> deephaven-deployment-server-1      | deephaven-jpy           0.9.0
> deephaven-deployment-server-1      | deephaven2              0.9.0
28c28
< deephaven-deployment-server-1      | numba                   0.55.1
---
> deephaven-deployment-server-1      | numba                   0.55.0
33c33
< deephaven-deployment-server-1      | pip                     22.0.3
---
> deephaven-deployment-server-1      | pip                     21.3.1
43c43
< deephaven-deployment-server-1      | setuptools              60.9.1
---
> deephaven-deployment-server-1      | setuptools              60.5.0
57c57
< deephaven-deployment-server-1      | WARNING: You are using pip version 22.0.3; however, version 22.0.4 is available.
---
> deephaven-deployment-server-1      | WARNING: You are using pip version 21.3.1; however, version 22.0.4 is available.

jjbrosnan commented 2 years ago

I re-tested tensorflow version 2.7.1 when building from source and when building from images alone, and am getting segfaults with both. The segfaults also occur with version 2.6.3.

jjbrosnan commented 2 years ago

Yesterday, I tested locally with the server-tensorflow image using a little stress tester. It ran the TensorFlow code that has been shown to crash with SIGSEGV, 100 times in total. Each run used a fresh set of docker containers. The output was tracked, and if the string Epoch 100/100 appeared in a log file, the run was considered a success (the crash would always happen after Epoch 1/100 was printed).
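
A rough sketch of that kind of stress loop is below; the compose file, the driver script, and the log file name are placeholders, not the actual setup.

import subprocess

successes = 0
for i in range(100):
    # Bring up a fresh set of containers for every iteration
    subprocess.run(["docker", "compose", "up", "-d"], check=True)
    try:
        # run_repro.py (hypothetical) submits the TensorFlow code to the server
        # and captures the server log into run.log
        subprocess.run(["python", "run_repro.py"], check=False)
        with open("run.log") as f:
            if "Epoch 100/100" in f.read():
                successes += 1
    finally:
        # Tear everything down so the next run starts clean
        subprocess.run(["docker", "compose", "down", "-v"], check=True)

print(f"{successes} / 100 runs reached Epoch 100/100")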

The tests were successful 100 out of 100 times. I was unable to replicate the SIGSEGV crashes yesterday using the v0.11 release with the server-tensorflow image. That image, as of v0.11, has TensorFlow version 2.6.3.

Jeremiahcheng1206 commented 2 years ago

Hello, I was trying to run my Python code using DHaaL on my local machine; the script is below:

from deephaven_server import Server
s = Server(port=8080, jvm_args=["-Xmx4g"])
s.start()
from deephaven import ugp
ugp.auto_locking = True

import tensorflow as tf
gpus = tf.config.experimental.list_physical_devices('GPU')
for gpu in gpus:
    print("Name:", gpu.name, "  Type:", gpu.device_type)
tf.config.list_physical_devices('GPU')
### Deephaven imports
from deephaven import DynamicTableWriter
from deephaven import dtypes as dht
from deephaven.learn import gather
from deephaven import read_csv
from deephaven import learn
from deephaven import new_table
from deephaven.column import string_col, double_col,int_col

### Python imports
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Dropout
import numpy as np
import threading
import time
# Read the iris csv file and change its format to deephaven table, along with some data pre-processing
iris_data = read_csv("https://media.githubusercontent.com/media/deephaven/examples/main/Iris/csv/iris.csv")
table2= new_table([
    string_col("Class", ["Iris-setosa", "Iris-versicolor", "Iris-virginica"]),
    int_col("class_1", [0, 1, 2])
])
iris=iris_data.exact_join(table=table2, on=["Class"]).drop_columns(cols=["Class"]).rename_columns(cols=["Class=class_1"])

# Create our neural network
model = Sequential()
model.add(Dense(16, input_shape=(4,), activation=tf.nn.relu))
model.add(Dense(12, activation=tf.nn.relu))
model.add(Dense(3, activation=tf.nn.softmax))

# A function to train the model
def train_model(features, targets):
    model.compile(optimizer=\
        tf.keras.optimizers.Adam(learning_rate=0.001), \
        loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True), \
        metrics=["accuracy"])
    model.fit(x=features, y=targets, epochs=5)

# Make predictions with the trained model
def predict_with_model(features):
    predictions = model.predict(features)
    return [np.argmax(item) for item in predictions]

# A function to gather data from table columns into a NumPy array of doubles
def table_to_array_double(rows, cols):
    return gather.table_to_numpy_2d(rows, cols, np_type=np.double)

# A function to gather data from table columns into a NumPy array of integers
def table_to_array_int(rows, cols):
    return gather.table_to_numpy_2d(rows, cols, np_type=np.intc)

# A function to extract a list element at a given index
def get_predicted_class(data, idx):
    return data[idx]

# The four features of the data set
inps=["SepalLengthCM", "SepalWidthCM", "PetalLengthCM", "PetalWidthCM"]
# Train the model
learn.learn(
    table=iris,
    model_func=train_model,
    inputs=[learn.Input(inps, table_to_array_double), learn.Input(["Class"], table_to_array_int)],
    outputs=None,
    batch_size=iris.size
)
# Apply the trained model to the data set
iris_predicted_static = learn.learn(
    table=iris,
    model_func=predict_with_model,
    inputs=[learn.Input(inps, table_to_array_double)],
    outputs=[learn.Output("PredictedClass", get_predicted_class, "int")],
    batch_size=iris.size
)
print(iris_predicted_static)

The error is as follows:

# A fatal error has been detected by the Java Runtime Environment:
#
#  SIGSEGV (0xb) at pc=0x00007f9c7e8b5909, pid=1076, tid=1098
#
# JRE version: Java(TM) SE Runtime Environment (18.0.1.1+2) (build 18.0.1.1+2-6)
# Java VM: Java HotSpot(TM) 64-Bit Server VM (18.0.1.1+2-6, mixed mode, sharing, tiered, compressed oops, compressed class ptrs, g1 gc, linux-amd64)
# Problematic frame:
# C  [libpthread.so.0+0x10909]  sem_wait+0x9
#
# Core dump will be written. Default location: /mnt/wslg/dumps/core.%e
#
# An error report file with more information is saved as:
# /mnt/c/Users/Yu/Desktop/test_deephaven/hs_err_pid1076.log
#
# If you would like to submit a bug report, please visit:
#   https://bugreport.java.com/bugreport/crash.jsp
# The crash happened outside the Java Virtual Machine in native code.
# See problematic frame for where to report the bug.
#
Aborted (core dumped)

Even though it gives an error, the code finishes as expected. The error seems to appear sometimes, but not always.

Jeremiahcheng1206 commented 2 years ago

This is the hs_err_pid log:

#
# A fatal error has been detected by the Java Runtime Environment:
#
#  SIGSEGV (0xb) at pc=0x00007f013ba17909, pid=1057, tid=1079
#
# JRE version: Java(TM) SE Runtime Environment (18.0.1.1+2) (build 18.0.1.1+2-6)
# Java VM: Java HotSpot(TM) 64-Bit Server VM (18.0.1.1+2-6, mixed mode, sharing, tiered, compressed oops, compressed class ptrs, g1 gc, linux-amd64)
# Problematic frame:
# C  [libpthread.so.0+0x10909]  sem_wait+0x9
#
# No core dump will be written. Core dumps have been disabled. To enable core dumping, try "ulimit -c unlimited" before starting Java again
#
# If you would like to submit a bug report, please visit:
#   https://bugreport.java.com/bugreport/crash.jsp
# The crash happened outside the Java Virtual Machine in native code.
# See problematic frame for where to report the bug.
#

---------------  S U M M A R Y ------------

Command Line: -Djpy.jpyLib=/home/haha/.local/lib/python3.7/site-packages/jpy.cpython-37m-x86_64-linux-gnu.so -Djpy.jdlLib=/home/haha/.local/lib/python3.7/site-packages/jdl.cpython-37m-x86_64-linux-gnu.so -Djpy.pythonLib=/usr/lib/python3.7/config-3.7m-x86_64-linux-gnu/libpython3.7.so -Djpy.pythonPrefix=/usr -Djpy.pythonExecutable=/usr/bin/python -Ddevroot=/mnt/c/Users/Yu/Desktop/test_deephaven -Dworkspace=. -DConfiguration.rootFile=dh-defaults.prop -DPythonDeephavenSession.initScripts= -Xmx4g --add-opens=java.base/java.nio=ALL-UNNAMED --add-opens=java.management/sun.management=ALL-UNNAMED 

Host: 11th Gen Intel(R) Core(TM) i9-11900H @ 2.50GHz, 16 cores, 15G, Ubuntu 18.04.6 LTS
Time: Wed Jul 13 17:38:31 2022 EDT elapsed time: 16.414214 seconds (0d 0h 0m 16s)

---------------  T H R E A D  ---------------

Current thread (0x00000000024a4230):  JavaThread "PyObject-cleanup" daemon [_thread_in_native, id=1079, stack(0x00007f0105afa000,0x00007f0105bfb000)]

Stack: [0x00007f0105afa000,0x00007f0105bfb000],  sp=0x00007f0105bf9670,  free space=1021k
Native frames: (J=compiled Java code, j=interpreted, Vv=VM code, C=native code)
C  [libpthread.so.0+0x10909]  sem_wait+0x9

Java frames: (J=compiled Java code, j=interpreted, Vv=VM code)
j  org.jpy.PyLib.callAndReturnValue(JZLjava/lang/String;I[Ljava/lang/Object;[Ljava/lang/Class;Ljava/lang/Class;)Ljava/lang/Object;+0
j  org.jpy.PyProxyHandler.invoke(Ljava/lang/Object;Ljava/lang/reflect/Method;[Ljava/lang/Object;)Ljava/lang/Object;+291
j  org.jpy.$Proxy1.cleanupOnlyUseFromGIL()I+9
j  org.jpy.PyObjectReferences.cleanupThreadLogic()V+15
j  org.jpy.PyObjectReferences$$Lambda$4+0x0000000800c031e0.run()V+4
j  java.lang.Thread.run()V+11 java.base@18.0.1.1
v  ~StubRoutines::call_stub

siginfo: si_signo: 11 (SIGSEGV), si_code: 1 (SEGV_MAPERR), si_addr: 0x0000000000000000

Register to memory mapping:

RAX=0x0 is NULL
RBX=0x0 is NULL
RCX=0x00007f00c00008d0 points into unknown readable memory: 0x0100000100000100 | 00 01 00 00 01 00 00 01
RDX=0x0 is NULL
RSP=0x00007f0105bf9670 is pointing into the stack for thread: 0x00000000024a4230
RBP=0x0 is NULL
RSI=0xffffffffffffffff is an unknown value
RDI=0x0 is NULL
R8 =0x00007f0105bf9838 is pointing into the stack for thread: 0x00000000024a4230
R9 =0x0 is NULL
R10=0x00007f012069150b is at code_begin+843 in an Interpreter codelet
method entry point (kind = native)  [0x00007f01206911c0, 0x00007f0120691a20]  2144 bytes
R11=0x000000070064a5c8 is an oop: java.lang.Class 
{0x000000070064a5c8} - klass: 'java/lang/Class'
 - ---- fields (total size 16 words):
 - private volatile transient 'classRedefinedCount' 'I' @12  0
 - abstract internal 'klass' 'J' @16  34372328336 (c01b90 8)
 - abstract internal 'array_klass' 'J' @24  0 (0 0)
 - abstract internal 'oop_size' 'I' @32  16 (10)
 - abstract internal 'static_oop_field_count' 'I' @36  2
 - private volatile transient 'cachedConstructor' 'Ljava/lang/reflect/Constructor;' @40  NULL (0)
 - private transient 'name' 'Ljava/lang/String;' @44  "org.jpy.PyLib"{0x00000007005826f8} (e00b04df)
 - private transient 'module' 'Ljava/lang/Module;' @48  a 'java/lang/Module'{0x0000000700633978} (e00c672f)
 - private final 'classLoader' 'Ljava/lang/ClassLoader;' @52  a 'jdk/internal/loader/ClassLoaders$AppClassLoader'{0x000000070062fc18} (e00c5f83)
 - private transient 'classData' 'Ljava/lang/Object;' @56  NULL (0)
 - private transient 'packageName' 'Ljava/lang/String;' @60  "org.jpy"{0x0000000700658d28} (e00cb1a5)
 - private final 'componentType' 'Ljava/lang/Class;' @64  NULL (0)
 - private volatile transient 'reflectionData' 'Ljava/lang/ref/SoftReference;' @68  NULL (0)
 - private volatile transient 'genericInfo' 'Lsun/reflect/generics/repository/ClassRepository;' @72  NULL (0)
 - private volatile transient 'enumConstants' '[Ljava/lang/Object;' @76  NULL (0)
 - private volatile transient 'enumConstantDirectory' 'Ljava/util/Map;' @80  NULL (0)
 - private volatile transient 'annotationData' 'Ljava/lang/Class$AnnotationData;' @84  NULL (0)
 - private volatile transient 'annotationType' 'Lsun/reflect/annotation/AnnotationType;' @88  NULL (0)
 - transient 'classValueMap' 'Ljava/lang/ClassValue$ClassValueMap;' @92  NULL (0)
 - abstract internal 'protection_domain' 'Ljava/lang/Object;' @96  a 'java/security/ProtectionDomain'{0x0000000700658a30} (e00cb146)
 - abstract internal 'signers_name' 'Ljava/lang/Object;' @100  NULL (0)
 - abstract internal 'source_file' 'Ljava/lang/Object;' @104  "PyLib.java"{0x0000000727d4dd28} (e4fa9ba5)
 - signature: Lorg/jpy/PyLib;
 - ---- static fields (2):
 - private static final 'DEBUG' 'Z' @120  false
 - private static final 'ON_WINDOWS' 'Z' @121  false
 - private static final 'STOP_IS_NO_OP' 'Z' @122  false
 - private static 'dllFilePath' 'Ljava/lang/String;' @112  "/home/haha/.local/lib/python3.7/site-packages/jpy.cpython-37m-x86_64-linux-gnu.so"{0x00000007006b1fd0} (e00d63fa)
 - private static 'dllProblem' 'Ljava/lang/Throwable;' @116  NULL (0)
 - private static 'dllLoaded' 'Z' @123  true
R12=0x0 is NULL
R13=0x00007f00fc40f7b0 is pointing into metadata
R14=0x0 is NULL
R15=0x00007f0105bf9818 is pointing into the stack for thread: 0x00000000024a4230

Registers:
RAX=0x0000000000000000, RBX=0x0000000000000000, RCX=0x00007f00c00008d0, RDX=0x0000000000000000
RSP=0x00007f0105bf9670, RBP=0x0000000000000000, RSI=0xffffffffffffffff, RDI=0x0000000000000000
R8 =0x00007f0105bf9838, R9 =0x0000000000000000, R10=0x00007f012069150b, R11=0x000000070064a5c8
R12=0x0000000000000000, R13=0x00007f00fc40f7b0, R14=0x0000000000000000, R15=0x00007f0105bf9818
RIP=0x00007f013ba17909, EFLAGS=0x0000000000010293, CSGSFS=0x002b000000000033, ERR=0x0000000000000004
  TRAPNO=0x000000000000000e

Top of Stack: (sp=0x00007f0105bf9670)
0x00007f0105bf9670:   ffffffffffffffff 00000000004fe628
0x00007f0105bf9680:   00000000024a4d78 00007f0105bf97a0
0x00007f0105bf9690:   00000000024a4d74 00007f013ba14e7d
0x00007f0105bf96a0:   00007f01394f09d0 0000000000000009 

Instructions: (pc=0x00007f013ba17909)
0x00007f013ba17809:   48 8d 3d 70 3a 00 00 e8 1b 4f ff ff 0f 1f 00 f7
0x00007f013ba17819:   d8 eb bc 0f 1f 40 00 41 55 41 54 55 53 48 bd 00
0x00007f013ba17829:   00 00 00 01 00 00 00 48 89 fb 48 83 ec 38 64 48
0x00007f013ba17839:   8b 04 25 28 00 00 00 48 89 44 24 28 31 c0 f0 48
0x00007f013ba17849:   0f c1 2f 48 8d 35 2d ff ff ff 49 bd ff ff ff ff
0x00007f013ba17859:   fe ff ff ff 49 89 e4 48 89 fa 4c 89 e7 e8 b5 04
0x00007f013ba17869:   00 00 85 ed 74 41 4a 8d 54 2d 00 48 89 e8 f0 48
0x00007f013ba17879:   0f b1 13 48 89 c5 75 ea 31 db 31 f6 4c 89 e7 e8
0x00007f013ba17889:   b3 04 00 00 48 8b 4c 24 28 64 48 33 0c 25 28 00
0x00007f013ba17899:   00 00 89 d8 75 50 48 83 c4 38 5b 5d 41 5c 41 5d
0x00007f013ba178a9:   c3 66 0f 1f 44 00 00 48 89 df e8 d8 fe ff ff 83
0x00007f013ba178b9:   f8 6e 74 13 83 f8 04 74 0e 48 8b 2b eb a4 66 0f
0x00007f013ba178c9:   1f 84 00 00 00 00 00 48 8b 15 c9 96 20 00 64 89
0x00007f013ba178d9:   02 48 b8 00 00 00 00 ff ff ff ff f0 48 01 03 bb
0x00007f013ba178e9:   ff ff ff ff eb 94 e8 dc 4e ff ff 66 90 66 2e 0f
0x00007f013ba178f9:   1f 84 00 00 00 00 00 53 48 89 fb e8 77 ec ff ff
0x00007f013ba17909:   48 8b 03 85 c0 74 10 48 8d 50 ff f0 48 0f b1 13
0x00007f013ba17919:   75 05 31 c0 5b c3 90 48 89 df 5b e9 f7 fe ff ff
0x00007f013ba17929:   0f 1f 80 00 00 00 00 48 8b 07 85 c0 74 11 48 8d
0x00007f013ba17939:   50 ff f0 48 0f b1 17 75 f1 31 c0 c3 0f 1f 00 48
0x00007f013ba17949:   8b 05 51 96 20 00 64 c7 00 0b 00 00 00 b8 ff ff
0x00007f013ba17959:   ff ff c3 0f 1f 40 00 48 b8 00 00 00 00 ff ff ff
0x00007f013ba17969:   ff f0 48 01 07 c3 90 41 55 41 54 55 53 48 83 ec
0x00007f013ba17979:   18 48 85 f6 44 8b 67 08 75 4d 48 89 f3 48 89 fd
0x00007f013ba17989:   e8 62 07 00 00 44 89 e6 41 89 c5 41 b9 ff ff ff
0x00007f013ba17999:   ff 81 f6 89 01 00 00 45 31 c0 49 89 da 31 d2 48
0x00007f013ba179a9:   89 ef b8 ca 00 00 00 0f 05 48 3d 00 f0 ff ff 77
0x00007f013ba179b9:   26 44 89 ef e8 8e 07 00 00 31 c0 48 83 c4 18 5b
0x00007f013ba179c9:   5d 41 5c 41 5d c3 90 48 83 3e 00 79 ad b8 6e 00
0x00007f013ba179d9:   00 00 eb e7 0f 1f 00 44 89 ef 48 89 44 24 08 e8
0x00007f013ba179e9:   63 07 00 00 48 8b 44 24 08 83 f8 f5 74 19 83 f8
0x00007f013ba179f9:   fc 74 14 83 f8 92 74 0f 48 8d 3d 78 38 00 00 e8 

Stack slot to memory mapping:
stack at sp + 0 slots: 0xffffffffffffffff is an unknown value
stack at sp + 1 slots: 0x00000000004fe628: PyThread_acquire_lock_timed+0x00000000000000a8 in python at 0x0000000000400000
stack at sp + 2 slots: 0x00000000024a4d78 points into unknown readable memory: 0x0000000000000000 | 00 00 00 00 00 00 00 00
stack at sp + 3 slots: 0x00007f0105bf97a0 is pointing into the stack for thread: 0x00000000024a4230
stack at sp + 4 slots: 0x00000000024a4d74 points into unknown readable memory: 02 00 00 00
stack at sp + 5 slots: 0x00007f013ba14e7d: pthread_cond_timedwait+0x000000000000014d in /lib/x86_64-linux-gnu/libpthread.so.0 at 0x00007f013ba07000
stack at sp + 6 slots: 0x00007f01394f09d0 points into unknown readable memory: 0x0000000000000001 | 01 00 00 00 00 00 00 00
stack at sp + 7 slots: 0x0000000000000009 is an unknown value

---------------  P R O C E S S  ---------------

Threads class SMR info:
_java_thread_list=0x00007efeb830b610, length=39, elements={
0x0000000001ab4660, 0x0000000001bc5010, 0x0000000001bc64b0, 0x0000000001bcb2c0,
0x0000000001bcc660, 0x0000000001bcda60, 0x0000000001bcf3d0, 0x0000000001bd08f0,
0x0000000001bd1d50, 0x0000000001c0eb10, 0x0000000001c10650, 0x00000000024a4230,
0x000000000280e100, 0x00000000029c8740, 0x0000000003a19950, 0x0000000003a4d490,
0x00007f001c046310, 0x00007f001c047030, 0x00007f001c048070, 0x00007f001c0490c0,
0x0000000003aaa500, 0x0000000003b0d430, 0x0000000003b2aa70, 0x0000000003b2b520,
0x0000000003b2c300, 0x0000000003b2d190, 0x0000000003b2e1d0, 0x0000000003b2f340,
0x0000000003b304c0, 0x0000000003b31640, 0x0000000003c01c70, 0x0000000003c38680,
0x0000000003c391e0, 0x0000000003c3e810, 0x00007effa4001450, 0x00007effa4002710,
0x00007eff6c001380, 0x00007f00e03dd9b0, 0x00007f00e03ddf50
}

Java Threads: ( => current thread )
  0x0000000001ab4660 JavaThread "main" [_thread_in_native, id=1057, stack(0x00007ffedb8f4000,0x00007ffedb9f4000)]
  0x0000000001bc5010 JavaThread "Reference Handler" daemon [_thread_blocked, id=1066, stack(0x00007f01148d9000,0x00007f01149da000)]
  0x0000000001bc64b0 JavaThread "Finalizer" daemon [_thread_blocked, id=1067, stack(0x00007f01147d8000,0x00007f01148d9000)]
  0x0000000001bcb2c0 JavaThread "Signal Dispatcher" daemon [_thread_blocked, id=1068, stack(0x00007f01146d7000,0x00007f01147d8000)]
  0x0000000001bcc660 JavaThread "Service Thread" daemon [_thread_blocked, id=1069, stack(0x00007f01145d6000,0x00007f01146d7000)]
  0x0000000001bcda60 JavaThread "Monitor Deflation Thread" daemon [_thread_blocked, id=1070, stack(0x00007f01144d5000,0x00007f01145d6000)]
  0x0000000001bcf3d0 JavaThread "C2 CompilerThread0" daemon [_thread_blocked, id=1071, stack(0x00007f01143d4000,0x00007f01144d5000)]
  0x0000000001bd08f0 JavaThread "C1 CompilerThread0" daemon [_thread_blocked, id=1072, stack(0x00007f01142d3000,0x00007f01143d4000)]
  0x0000000001bd1d50 JavaThread "Sweeper thread" daemon [_thread_blocked, id=1073, stack(0x00007f01141d2000,0x00007f01142d3000)]
  0x0000000001c0eb10 JavaThread "Common-Cleaner" daemon [_thread_blocked, id=1075, stack(0x00007f0105eff000,0x00007f0106000000)]
  0x0000000001c10650 JavaThread "Notification Thread" daemon [_thread_blocked, id=1076, stack(0x00007f0105dfe000,0x00007f0105eff000)]
=>0x00000000024a4230 JavaThread "PyObject-cleanup" daemon [_thread_in_native, id=1079, stack(0x00007f0105afa000,0x00007f0105bfb000)]
  0x000000000280e100 JavaThread "grpc-timer-0" daemon [_thread_blocked, id=1088, stack(0x00007f0105173000,0x00007f0105274000)]
  0x00000000029c8740 JavaThread "CleanupReferenceProcessor-liveness-drainingThread" daemon [_thread_blocked, id=1089, stack(0x00007f0105072000,0x00007f0105173000)]
  0x0000000003a19950 JavaThread "UpdateGraphProcessor.DEFAULT.refreshThread" daemon [_thread_blocked, id=1122, stack(0x00007f004b02f000,0x00007f004b130000)]
  0x0000000003a4d490 JavaThread "process reaper" daemon [_thread_blocked, id=1124, stack(0x00007f004b8be000,0x00007f004b8e0000)]
  0x00007f001c046310 JavaThread "Static-FastNIODriver-0" daemon [_thread_blocked, id=1125, stack(0x00007f004af2e000,0x00007f004b02f000)]
  0x00007f001c047030 JavaThread "Static-FastNIODriver-1" daemon [_thread_blocked, id=1126, stack(0x00007f004ae2d000,0x00007f004af2e000)]
  0x00007f001c048070 JavaThread "Static-FastNIODriver-2" daemon [_thread_in_native, id=1127, stack(0x00007f004ad2c000,0x00007f004ae2d000)]
  0x00007f001c0490c0 JavaThread "Static-FastNIODriver-3" daemon [_thread_blocked, id=1128, stack(0x00007f004ac2b000,0x00007f004ad2c000)]
  0x0000000003aaa500 JavaThread "UpdatePerformanceTracker.Driver" daemon [_thread_blocked, id=1146, stack(0x00007f004b1b0000,0x00007f004b2b1000)]
  0x0000000003b0d430 JavaThread "ServerStateTracker.Driver" daemon [_thread_blocked, id=1147, stack(0x00007f004a910000,0x00007f004aa11000)]
  0x0000000003b2aa70 JavaThread "qtp347091104-36" [_thread_in_native, id=1148, stack(0x00007f004a80f000,0x00007f004a910000)]
  0x0000000003b2b520 JavaThread "qtp347091104-37" [_thread_in_native, id=1149, stack(0x00007f004a70e000,0x00007f004a80f000)]
  0x0000000003b2c300 JavaThread "qtp347091104-38" [_thread_in_native, id=1150, stack(0x00007f004a60d000,0x00007f004a70e000)]
  0x0000000003b2d190 JavaThread "qtp347091104-39" [_thread_in_native, id=1151, stack(0x00007f004a50c000,0x00007f004a60d000)]
  0x0000000003b2e1d0 JavaThread "qtp347091104-40" [_thread_in_native, id=1152, stack(0x00007f004a40b000,0x00007f004a50c000)]
  0x0000000003b2f340 JavaThread "qtp347091104-41" [_thread_in_native, id=1153, stack(0x00007f004a30a000,0x00007f004a40b000)]
  0x0000000003b304c0 JavaThread "qtp347091104-42" [_thread_in_native, id=1154, stack(0x00007f004a209000,0x00007f004a30a000)]
  0x0000000003b31640 JavaThread "qtp347091104-43" [_thread_in_native, id=1155, stack(0x00007f004a108000,0x00007f004a209000)]
  0x0000000003c01c70 JavaThread "Session-HouseKeeper-f171912-1" [_thread_blocked, id=1168, stack(0x00007f00493ef000,0x00007f00494f0000)]
  0x0000000003c38680 JavaThread "qtp347091104-45-acceptor-0@21af7cc4-ServerConnector@3ce3db41{HTTP/1.1, (http/1.1, h2c)}{0.0.0.0:8080}" [_thread_in_native, id=1169, stack(0x00007f00492ee000,0x00007f00493ef000)]
  0x0000000003c391e0 JavaThread "qtp347091104-46-acceptor-1@396d791d-ServerConnector@3ce3db41{HTTP/1.1, (http/1.1, h2c)}{0.0.0.0:8080}" [_thread_blocked, id=1170, stack(0x00007f00491ed000,0x00007f00492ee000)]
  0x0000000003c3e810 JavaThread "DeephavenApiServer-Scheduler-Concurrent-1" daemon [_thread_in_native, id=1171, stack(0x00007f00490ec000,0x00007f00491ed000)]
  0x00007effa4001450 JavaThread "DeephavenApiServer-Scheduler-Concurrent-2" daemon [_thread_blocked, id=1172, stack(0x00007f00489f4000,0x00007f0048af5000)]
  0x00007effa4002710 JavaThread "DeephavenApiServer-Scheduler-Concurrent-3" daemon [_thread_blocked, id=1173, stack(0x00007f0105bfb000,0x00007f0105cfc000)]
  0x00007eff6c001380 JavaThread "DeephavenApiServer-Scheduler-Concurrent-4" daemon [_thread_blocked, id=1174, stack(0x00007f004b7b9000,0x00007f004b8ba000)]
  0x00007f00e03dd9b0 JavaThread "C2 CompilerThread1" daemon [_thread_blocked, id=1261, stack(0x00007eff52fb9000,0x00007eff530ba000)]
  0x00007f00e03ddf50 JavaThread "C2 CompilerThread2" daemon [_thread_blocked, id=1262, stack(0x00007eff53ac4000,0x00007eff53bc5000)]

Other Threads:
  0x0000000001bc1e90 VMThread "VM Thread" [stack: 0x00007f01149dc000,0x00007f0114adc000] [id=1065]
  0x0000000001c12100 WatcherThread "VM Periodic Task Thread" [stack: 0x00007f0105cfe000,0x00007f0105dfe000] [id=1077]
  0x000000000199af50 WorkerThread "GC Thread#0" [stack: 0x00007f0115de0000,0x00007f0115ee0000] [id=1060]
  0x00007f00f4008570 WorkerThread "GC Thread#1" [stack: 0x00007f010567e000,0x00007f010577e000] [id=1083]
  0x00007f00f4008f90 WorkerThread "GC Thread#2" [stack: 0x00007f010557c000,0x00007f010567c000] [id=1084]
  0x00007f00f40099b0 WorkerThread "GC Thread#3" [stack: 0x00007f010547a000,0x00007f010557a000] [id=1085]
  0x00007f00f400a3d0 WorkerThread "GC Thread#4" [stack: 0x00007f0105378000,0x00007f0105478000] [id=1086]
  0x00007f00f400adf0 WorkerThread "GC Thread#5" [stack: 0x00007f0105276000,0x00007f0105376000] [id=1087]
  0x00007f00f4014170 WorkerThread "GC Thread#6" [stack: 0x00007f0061610000,0x00007f0061710000] [id=1105]
  0x00007f00f4014690 WorkerThread "GC Thread#7" [stack: 0x00007f006150e000,0x00007f006160e000] [id=1106]
  0x00007f00f4014de0 WorkerThread "GC Thread#8" [stack: 0x00007f006140c000,0x00007f006150c000] [id=1107]
  0x00007f00f40155b0 WorkerThread "GC Thread#9" [stack: 0x00007f006130a000,0x00007f006140a000] [id=1108]
  0x00007f00f400e0d0 WorkerThread "GC Thread#10" [stack: 0x00007f004b4b7000,0x00007f004b5b7000] [id=1118]
  0x00007f00f4018100 WorkerThread "GC Thread#11" [stack: 0x00007f004b3b5000,0x00007f004b4b5000] [id=1119]
  0x00007f00f40189a0 WorkerThread "GC Thread#12" [stack: 0x00007f004b2b3000,0x00007f004b3b3000] [id=1120]
  0x0000000001b35380 ConcurrentGCThread "G1 Main Marker" [stack: 0x00007f0115cde000,0x00007f0115dde000] [id=1061]
  0x0000000001b36180 WorkerThread "G1 Conc#0" [stack: 0x00007f0115bdc000,0x00007f0115cdc000] [id=1062]
  0x00007f0108000bb0 WorkerThread "G1 Conc#1" [stack: 0x00007f0061208000,0x00007f0061308000] [id=1109]
  0x00007f0108001480 WorkerThread "G1 Conc#2" [stack: 0x00007f0061106000,0x00007f0061206000] [id=1110]
  0x0000000001a471b0 ConcurrentGCThread "G1 Refine#0" [stack: 0x00007f0114dcd000,0x00007f0114ecd000] [id=1063]
  0x00007f0100000bb0 ConcurrentGCThread "G1 Refine#1" [stack: 0x00007f004a008000,0x00007f004a108000] [id=1156]
  0x00007effe8000bb0 ConcurrentGCThread "G1 Refine#2" [stack: 0x00007f0049f06000,0x00007f004a006000] [id=1157]
  0x00007effdc000bb0 ConcurrentGCThread "G1 Refine#3" [stack: 0x00007f0049e04000,0x00007f0049f04000] [id=1158]
  0x00007effe0000bb0 ConcurrentGCThread "G1 Refine#4" [stack: 0x00007f0049d02000,0x00007f0049e02000] [id=1159]
  0x00007effd4000bb0 ConcurrentGCThread "G1 Refine#5" [stack: 0x00007f0049c00000,0x00007f0049d00000] [id=1160]
  0x00007effd8000bb0 ConcurrentGCThread "G1 Refine#6" [stack: 0x00007f0049afe000,0x00007f0049bfe000] [id=1161]
  0x00007effcc000bb0 ConcurrentGCThread "G1 Refine#7" [stack: 0x00007f00499fc000,0x00007f0049afc000] [id=1162]
  0x00007effd0000bb0 ConcurrentGCThread "G1 Refine#8" [stack: 0x00007f00498fa000,0x00007f00499fa000] [id=1163]
  0x00007effc4000bb0 ConcurrentGCThread "G1 Refine#9" [stack: 0x00007f00497f8000,0x00007f00498f8000] [id=1164]
  0x00007effc8000bb0 ConcurrentGCThread "G1 Refine#10" [stack: 0x00007f00496f6000,0x00007f00497f6000] [id=1165]
  0x00007effbc000bb0 ConcurrentGCThread "G1 Refine#11" [stack: 0x00007f00495f4000,0x00007f00496f4000] [id=1166]
  0x00007effc0000bb0 ConcurrentGCThread "G1 Refine#12" [stack: 0x00007f00494f2000,0x00007f00495f2000] [id=1167]
  0x0000000001a4b4c0 ConcurrentGCThread "G1 Service" [stack: 0x00007f0114ccb000,0x00007f0114dcb000] [id=1064]

Threads with active compile tasks:

VM state: not at safepoint (normal execution)

VM Mutex/Monitor currently owned by a thread: None

Heap address: 0x0000000700000000, size: 4096 MB, Compressed Oops mode: Zero based, Oop shift amount: 3

CDS archive(s) mapped at: [0x0000000800000000-0x0000000800bb1000-0x0000000800bb1000), size 12259328, SharedBaseAddress: 0x0000000800000000, ArchiveRelocationMode: 0.
Compressed class space mapped at: 0x0000000800c00000-0x0000000840c00000, reserved size: 1073741824
Narrow klass base: 0x0000000800000000, Narrow klass shift: 0, Narrow klass range: 0x100000000

GC Precious Log:
 CardTable entry size: 512
 Card Set container configuration: InlinePtr #cards 4 size 8 Array Of Cards #cards 16 size 48 Howl #buckets 8 coarsen threshold 3686 Howl Bitmap #cards 512 size 80 coarsen threshold 460 Card regions per heap region 1 cards per card region 4096
 CPUs: 16 total, 16 available
 Memory: 15859M
 Large Page Support: Disabled
 NUMA Support: Disabled
 Compressed Oops: Enabled (Zero based)
 Heap Region Size: 2M
 Heap Min Capacity: 8M
 Heap Initial Capacity: 248M
 Heap Max Capacity: 4G
 Pre-touch: Disabled
 Parallel Workers: 13
 Concurrent Workers: 3
 Concurrent Refinement Workers: 13
 Periodic GC: Disabled

Heap:
 garbage-first heap   total 1097728K, used 747213K [0x0000000700000000, 0x0000000800000000)
  region size 2048K, 162 young (331776K), 4 survivors (8192K)
 Metaspace       used 53065K, committed 53504K, reserved 1114112K
  class space    used 6360K, committed 6592K, reserved 1048576K

Heap Regions: E=young(eden), S=young(survivor), O=old, HS=humongous(starts), HC=humongous(continues), CS=collection set, F=free, OA=open archive, CA=closed archive, TAMS=top-at-mark-start (previous, next)
|   0|0x0000000700000000, 0x0000000700200000, 0x0000000700200000|100%| O|  |TAMS 0x0000000700200000, 0x0000000700000000| Untracked 
|   1|0x0000000700200000, 0x0000000700400000, 0x0000000700400000|100%| O|  |TAMS 0x0000000700400000, 0x0000000700200000| Untracked 
|   2|0x0000000700400000, 0x0000000700600000, 0x0000000700600000|100%| O|  |TAMS 0x0000000700600000, 0x0000000700400000| Untracked 
|   3|0x0000000700600000, 0x0000000700800000, 0x0000000700800000|100%| O|  |TAMS 0x0000000700800000, 0x0000000700600000| Untracked 
|   4|0x0000000700800000, 0x0000000700a00000, 0x0000000700a00000|100%| O|  |TAMS 0x0000000700a00000, 0x0000000700800000| Untracked 
|   5|0x0000000700a00000, 0x0000000700c00000, 0x0000000700c00000|100%| O|  |TAMS 0x0000000700c00000, 0x0000000700a00000| Untracked 
|   6|0x0000000700c00000, 0x0000000700e00000, 0x0000000700e00000|100%| O|  |TAMS 0x0000000700e00000, 0x0000000700c00000| Untracked 
|   7|0x0000000700e00000, 0x0000000701000000, 0x0000000701000000|100%| O|  |TAMS 0x0000000701000000, 0x0000000700e00000| Untracked 
|   8|0x0000000701000000, 0x0000000701200000, 0x0000000701200000|100%| O|  |TAMS 0x0000000701200000, 0x0000000701000000| Untracked 
|   9|0x0000000701200000, 0x0000000701400000, 0x0000000701400000|100%| O|  |TAMS 0x0000000701400000, 0x0000000701200000| Untracked 
|  10|0x0000000701400000, 0x0000000701600000, 0x0000000701600000|100%| O|  |TAMS 0x0000000701600000, 0x0000000701400000| Untracked 
|  11|0x0000000701600000, 0x0000000701800000, 0x0000000701800000|100%| O|  |TAMS 0x0000000701800000, 0x0000000701600000| Untracked 
|  12|0x0000000701800000, 0x0000000701a00000, 0x0000000701a00000|100%| O|  |TAMS 0x0000000701a00000, 0x0000000701800000| Untracked 
|  13|0x0000000701a00000, 0x0000000701c00000, 0x0000000701c00000|100%| O|  |TAMS 0x0000000701c00000, 0x0000000701a00000| Untracked 
|  14|0x0000000701c00000, 0x0000000701e00000, 0x0000000701e00000|100%| O|  |TAMS 0x0000000701e00000, 0x0000000701c00000| Untracked 
|  15|0x0000000701e00000, 0x0000000702000000, 0x0000000702000000|100%| O|  |TAMS 0x0000000702000000, 0x0000000701e00000| Untracked 
|  16|0x0000000702000000, 0x0000000702200000, 0x0000000702200000|100%| O|  |TAMS 0x0000000702200000, 0x0000000702000000| Untracked 
|  17|0x0000000702200000, 0x0000000702400000, 0x0000000702400000|100%| O|  |TAMS 0x0000000702400000, 0x0000000702200000| Untracked 
|  18|0x0000000702400000, 0x0000000702600000, 0x0000000702600000|100%| O|  |TAMS 0x0000000702600000, 0x0000000702400000| Untracked 
|  19|0x0000000702600000, 0x0000000702800000, 0x0000000702800000|100%| O|  |TAMS 0x0000000702800000, 0x0000000702600000| Untracked 
|  20|0x0000000702800000, 0x0000000702a00000, 0x0000000702a00000|100%| O|  |TAMS 0x0000000702a00000, 0x0000000702800000| Untracked 
|  21|0x0000000702a00000, 0x0000000702c00000, 0x0000000702c00000|100%| O|  |TAMS 0x0000000702c00000, 0x0000000702a00000| Untracked 
|  22|0x0000000702c00000, 0x0000000702e00000, 0x0000000702e00000|100%| O|  |TAMS 0x0000000702e00000, 0x0000000702c00000| Untracked 
|  23|0x0000000702e00000, 0x0000000703000000, 0x0000000703000000|100%| O|  |TAMS 0x0000000703000000, 0x0000000702e00000| Untracked 
|  24|0x0000000703000000, 0x0000000703200000, 0x0000000703200000|100%| O|  |TAMS 0x0000000703200000, 0x0000000703000000| Untracked 
|  25|0x0000000703200000, 0x0000000703400000, 0x0000000703400000|100%| O|  |TAMS 0x0000000703400000, 0x0000000703200000| Untracked 
|  26|0x0000000703400000, 0x0000000703600000, 0x0000000703600000|100%| O|  |TAMS 0x0000000703600000, 0x0000000703400000| Untracked 
|  27|0x0000000703600000, 0x0000000703800000, 0x0000000703800000|100%| O|  |TAMS 0x0000000703800000, 0x0000000703600000| Untracked 
|  28|0x0000000703800000, 0x0000000703a00000, 0x0000000703a00000|100%| O|  |TAMS 0x0000000703a00000, 0x0000000703800000| Untracked 
|  29|0x0000000703a00000, 0x0000000703c00000, 0x0000000703c00000|100%| O|  |TAMS 0x0000000703c00000, 0x0000000703a00000| Untracked 
|  30|0x0000000703c00000, 0x0000000703e00000, 0x0000000703e00000|100%| O|  |TAMS 0x0000000703e00000, 0x0000000703c00000| Untracked 
|  31|0x0000000703e00000, 0x0000000704000000, 0x0000000704000000|100%| O|  |TAMS 0x0000000704000000, 0x0000000703e00000| Untracked 
|  32|0x0000000704000000, 0x0000000704200000, 0x0000000704200000|100%| O|  |TAMS 0x0000000704200000, 0x0000000704000000| Untracked 
|  33|0x0000000704200000, 0x0000000704400000, 0x0000000704400000|100%| O|  |TAMS 0x0000000704400000, 0x0000000704200000| Untracked 
|  34|0x0000000704400000, 0x0000000704600000, 0x0000000704600000|100%| O|  |TAMS 0x0000000704400000, 0x0000000704400000| Untracked 
|  35|0x0000000704600000, 0x0000000704800000, 0x0000000704800000|100%| O|  |TAMS 0x0000000704600000, 0x0000000704600000| Untracked 
|  36|0x0000000704800000, 0x0000000704a00000, 0x0000000704a00000|100%| O|  |TAMS 0x0000000704800000, 0x0000000704800000| Untracked 
|  37|0x0000000704a00000, 0x0000000704c00000, 0x0000000704c00000|100%| O|  |TAMS 0x0000000704a00000, 0x0000000704a00000| Untracked 
|  38|0x0000000704c00000, 0x0000000704e00000, 0x0000000704e00000|100%| O|  |TAMS 0x0000000704c00000, 0x0000000704c00000| Untracked 
|  39|0x0000000704e00000, 0x0000000705000000, 0x0000000705000000|100%| O|  |TAMS 0x0000000704e00000, 0x0000000704e00000| Untracked 
|  40|0x0000000705000000, 0x0000000705200000, 0x0000000705200000|100%| O|  |TAMS 0x0000000705000000, 0x0000000705000000| Untracked 
|  41|0x0000000705200000, 0x0000000705400000, 0x0000000705400000|100%| O|  |TAMS 0x0000000705200000, 0x0000000705200000| Untracked 
|  42|0x0000000705400000, 0x0000000705600000, 0x0000000705600000|100%| O|  |TAMS 0x0000000705600000, 0x0000000705400000| Untracked 
|  43|0x0000000705600000, 0x0000000705800000, 0x0000000705800000|100%| O|  |TAMS 0x0000000705800000, 0x0000000705600000| Untracked 
|  44|0x0000000705800000, 0x0000000705a00000, 0x0000000705a00000|100%| O|  |TAMS 0x0000000705a00000, 0x0000000705800000| Untracked 
|  45|0x0000000705a00000, 0x0000000705c00000, 0x0000000705c00000|100%| O|  |TAMS 0x0000000705c00000, 0x0000000705a00000| Untracked 
|  46|0x0000000705c00000, 0x0000000705e00000, 0x0000000705e00000|100%| O|  |TAMS 0x0000000705c00000, 0x0000000705c00000| Untracked 
|  47|0x0000000705e00000, 0x0000000706000000, 0x0000000706000000|100%| O|  |TAMS 0x0000000705e00000, 0x0000000705e00000| Untracked 
|  48|0x0000000706000000, 0x0000000706200000, 0x0000000706200000|100%| O|  |TAMS 0x0000000706000000, 0x0000000706000000| Untracked 
|  49|0x0000000706200000, 0x0000000706400000, 0x0000000706400000|100%| O|  |TAMS 0x0000000706200000, 0x0000000706200000| Untracked 
|  50|0x0000000706400000, 0x0000000706600000, 0x0000000706600000|100%| O|  |TAMS 0x0000000706400000, 0x0000000706400000| Untracked 
|  51|0x0000000706600000, 0x00000007067c8600, 0x0000000706800000| 89%| O|  |TAMS 0x0000000706600000, 0x0000000706600000| Untracked 
|  52|0x0000000706800000, 0x0000000706a00000, 0x0000000706a00000|100%|HS|  |TAMS 0x0000000706800000, 0x0000000706800000| Complete 
|  53|0x0000000706a00000, 0x0000000706a00000, 0x0000000706c00000|  0%| F|  |TAMS 0x0000000706a00000, 0x0000000706a00000| Untracked 
|  54|0x0000000706c00000, 0x0000000706c00000, 0x0000000706e00000|  0%| F|  |TAMS 0x0000000706c00000, 0x0000000706c00000| Untracked 
|  55|0x0000000706e00000, 0x0000000706e00000, 0x0000000707000000|  0%| F|  |TAMS 0x0000000706e00000, 0x0000000706e00000| Untracked 
|  56|0x0000000707000000, 0x0000000707000000, 0x0000000707200000|  0%| F|  |TAMS 0x0000000707000000, 0x0000000707000000| Untracked 
|  57|0x0000000707200000, 0x0000000707200000, 0x0000000707400000|  0%| F|  |TAMS 0x0000000707200000, 0x0000000707200000| Untracked 
|  58|0x0000000707400000, 0x0000000707400000, 0x0000000707600000|  0%| F|  |TAMS 0x0000000707400000, 0x0000000707400000| Untracked 
|  59|0x0000000707600000, 0x0000000707600000, 0x0000000707800000|  0%| F|  |TAMS 0x0000000707600000, 0x0000000707600000| Untracked 
|  60|0x0000000707800000, 0x0000000707800000, 0x0000000707a00000|  0%| F|  |TAMS 0x0000000707800000, 0x0000000707800000| Untracked 
|  61|0x0000000707a00000, 0x0000000707a00000, 0x0000000707c00000|  0%| F|  |TAMS 0x0000000707a00000, 0x0000000707a00000| Untracked 
|  62|0x0000000707c00000, 0x0000000707c00000, 0x0000000707e00000|  0%| F|  |TAMS 0x0000000707c00000, 0x0000000707c00000| Untracked 
|  63|0x0000000707e00000, 0x0000000707e00000, 0x0000000708000000|  0%| F|  |TAMS 0x0000000707e00000, 0x0000000707e00000| Untracked 
|  64|0x0000000708000000, 0x0000000708000000, 0x0000000708200000|  0%| F|  |TAMS 0x0000000708000000, 0x0000000708000000| Untracked 
|  65|0x0000000708200000, 0x0000000708200000, 0x0000000708400000|  0%| F|  |TAMS 0x0000000708200000, 0x0000000708200000| Untracked 
|  66|0x0000000708400000, 0x0000000708400000, 0x0000000708600000|  0%| F|  |TAMS 0x0000000708400000, 0x0000000708400000| Untracked 
|  67|0x0000000708600000, 0x0000000708600000, 0x0000000708800000|  0%| F|  |TAMS 0x0000000708600000, 0x0000000708600000| Untracked 
|  68|0x0000000708800000, 0x0000000708800000, 0x0000000708a00000|  0%| F|  |TAMS 0x0000000708800000, 0x0000000708800000| Untracked 
|  69|0x0000000708a00000, 0x0000000708a00000, 0x0000000708c00000|  0%| F|  |TAMS 0x0000000708a00000, 0x0000000708a00000| Untracked 
|  70|0x0000000708c00000, 0x0000000708e00000, 0x0000000708e00000|100%| O|  |TAMS 0x0000000708e00000, 0x0000000708c00000| Untracked 
|  71|0x0000000708e00000, 0x0000000709000000, 0x0000000709000000|100%| O|  |TAMS 0x0000000709000000, 0x0000000708e00000| Untracked 
|  72|0x0000000709000000, 0x0000000709000000, 0x0000000709200000|  0%| F|  |TAMS 0x0000000709000000, 0x0000000709000000| Untracked 
|  73|0x0000000709200000, 0x0000000709200000, 0x0000000709400000|  0%| F|  |TAMS 0x0000000709200000, 0x0000000709200000| Untracked 
|  74|0x0000000709400000, 0x0000000709400000, 0x0000000709600000|  0%| F|  |TAMS 0x0000000709400000, 0x0000000709400000| Untracked 
|  75|0x0000000709600000, 0x0000000709600000, 0x0000000709800000|  0%| F|  |TAMS 0x0000000709600000, 0x0000000709600000| Untracked 
|  76|0x0000000709800000, 0x0000000709800000, 0x0000000709a00000|  0%| F|  |TAMS 0x0000000709800000, 0x0000000709800000| Untracked 
|  77|0x0000000709a00000, 0x0000000709a00000, 0x0000000709c00000|  0%| F|  |TAMS 0x0000000709a00000, 0x0000000709a00000| Untracked 
|  78|0x0000000709c00000, 0x0000000709e00000, 0x0000000709e00000|100%| O|  |TAMS 0x0000000709e00000, 0x0000000709c00000| Untracked 
|  79|0x0000000709e00000, 0x0000000709e00000, 0x000000070a000000|  0%| F|  |TAMS 0x0000000709e00000, 0x0000000709e00000| Untracked 
|  80|0x000000070a000000, 0x000000070a000000, 0x000000070a200000|  0%| F|  |TAMS 0x000000070a000000, 0x000000070a000000| Untracked 
|  81|0x000000070a200000, 0x000000070a200000, 0x000000070a400000|  0%| F|  |TAMS 0x000000070a200000, 0x000000070a200000| Untracked 
|  82|0x000000070a400000, 0x000000070a400000, 0x000000070a600000|  0%| F|  |TAMS 0x000000070a400000, 0x000000070a400000| Untracked 
|  83|0x000000070a600000, 0x000000070a600000, 0x000000070a800000|  0%| F|  |TAMS 0x000000070a600000, 0x000000070a600000| Untracked 
|  84|0x000000070a800000, 0x000000070a800000, 0x000000070aa00000|  0%| F|  |TAMS 0x000000070a800000, 0x000000070a800000| Untracked 
|  85|0x000000070aa00000, 0x000000070aa00000, 0x000000070ac00000|  0%| F|  |TAMS 0x000000070aa00000, 0x000000070aa00000| Untracked 
|  86|0x000000070ac00000, 0x000000070ac00000, 0x000000070ae00000|  0%| F|  |TAMS 0x000000070ac00000, 0x000000070ac00000| Untracked 
|  87|0x000000070ae00000, 0x000000070ae00000, 0x000000070b000000|  0%| F|  |TAMS 0x000000070ae00000, 0x000000070ae00000| Untracked 
|  88|0x000000070b000000, 0x000000070b000000, 0x000000070b200000|  0%| F|  |TAMS 0x000000070b000000, 0x000000070b000000| Untracked 
|  89|0x000000070b200000, 0x000000070b200000, 0x000000070b400000|  0%| F|  |TAMS 0x000000070b200000, 0x000000070b200000| Untracked 
|  90|0x000000070b400000, 0x000000070b400000, 0x000000070b600000|  0%| F|  |TAMS 0x000000070b400000, 0x000000070b400000| Untracked 
|  91|0x000000070b600000, 0x000000070b600000, 0x000000070b800000|  0%| F|  |TAMS 0x000000070b600000, 0x000000070b600000| Untracked 
|  92|0x000000070b800000, 0x000000070b800000, 0x000000070ba00000|  0%| F|  |TAMS 0x000000070b800000, 0x000000070b800000| Untracked 
|  93|0x000000070ba00000, 0x000000070ba00000, 0x000000070bc00000|  0%| F|  |TAMS 0x000000070ba00000, 0x000000070ba00000| Untracked 
|  94|0x000000070bc00000, 0x000000070bc00000, 0x000000070be00000|  0%| F|  |TAMS 0x000000070bc00000, 0x000000070bc00000| Untracked 
|  95|0x000000070be00000, 0x000000070be00000, 0x000000070c000000|  0%| F|  |TAMS 0x000000070be00000, 0x000000070be00000| Untracked 
|  96|0x000000070c000000, 0x000000070c000000, 0x000000070c200000|  0%| F|  |TAMS 0x000000070c000000, 0x000000070c000000| Untracked 
|  97|0x000000070c200000, 0x000000070c200000, 0x000000070c400000|  0%| F|  |TAMS 0x000000070c200000, 0x000000070c200000| Untracked 
|  98|0x000000070c400000, 0x000000070c400000, 0x000000070c600000|  0%| F|  |TAMS 0x000000070c400000, 0x000000070c400000| Untracked 
|  99|0x000000070c600000, 0x000000070c600000, 0x000000070c800000|  0%| F|  |TAMS 0x000000070c600000, 0x000000070c600000| Untracked 
| 100|0x000000070c800000, 0x000000070c800000, 0x000000070ca00000|  0%| F|  |TAMS 0x000000070c800000, 0x000000070c800000| Untracked 
| 101|0x000000070ca00000, 0x000000070ca00000, 0x000000070cc00000|  0%| F|  |TAMS 0x000000070ca00000, 0x000000070ca00000| Untracked 
| 102|0x000000070cc00000, 0x000000070cc00000, 0x000000070ce00000|  0%| F|  |TAMS 0x000000070cc00000, 0x000000070cc00000| Untracked 
| 103|0x000000070ce00000, 0x000000070ce00000, 0x000000070d000000|  0%| F|  |TAMS 0x000000070ce00000, 0x000000070ce00000| Untracked 
| 104|0x000000070d000000, 0x000000070d000000, 0x000000070d200000|  0%| F|  |TAMS 0x000000070d000000, 0x000000070d000000| Untracked 
| 105|0x000000070d200000, 0x000000070d200000, 0x000000070d400000|  0%| F|  |TAMS 0x000000070d200000, 0x000000070d200000| Untracked 
| 106|0x000000070d400000, 0x000000070d400000, 0x000000070d600000|  0%| F|  |TAMS 0x000000070d400000, 0x000000070d400000| Untracked 
| 107|0x000000070d600000, 0x000000070d800000, 0x000000070d800000|100%| O|  |TAMS 0x000000070d800000, 0x000000070d600000| Untracked 
| 108|0x000000070d800000, 0x000000070d800000, 0x000000070da00000|  0%| F|  |TAMS 0x000000070d800000, 0x000000070d800000| Untracked 
| 109|0x000000070da00000, 0x000000070dc00000, 0x000000070dc00000|100%| O|  |TAMS 0x000000070dc00000, 0x000000070da00000| Untracked 
| 110|0x000000070dc00000, 0x000000070de00000, 0x000000070de00000|100%| O|  |TAMS 0x000000070de00000, 0x000000070dc00000| Untracked 
| 111|0x000000070de00000, 0x000000070de00000, 0x000000070e000000|  0%| F|  |TAMS 0x000000070de00000, 0x000000070de00000| Untracked 
| 112|0x000000070e000000, 0x000000070e000000, 0x000000070e200000|  0%| F|  |TAMS 0x000000070e000000, 0x000000070e000000| Untracked 
| 113|0x000000070e200000, 0x000000070e200000, 0x000000070e400000|  0%| F|  |TAMS 0x000000070e200000, 0x000000070e200000| Untracked 
| 114|0x000000070e400000, 0x000000070e400000, 0x000000070e600000|  0%| F|  |TAMS 0x000000070e400000, 0x000000070e400000| Untracked 
| 115|0x000000070e600000, 0x000000070e800000, 0x000000070e800000|100%| O|  |TAMS 0x000000070e800000, 0x000000070e600000| Untracked 
| 116|0x000000070e800000, 0x000000070ea00000, 0x000000070ea00000|100%|HS|  |TAMS 0x000000070e800000, 0x000000070e800000| Complete 
| 117|0x000000070ea00000, 0x000000070ec00000, 0x000000070ec00000|100%|HC|  |TAMS 0x000000070ea00000, 0x000000070ea00000| Complete 
| 118|0x000000070ec00000, 0x000000070ee00000, 0x000000070ee00000|100%|HC|  |TAMS 0x000000070ec00000, 0x000000070ec00000| Complete 
| 119|0x000000070ee00000, 0x000000070f000000, 0x000000070f000000|100%|HC|  |TAMS 0x000000070ee00000, 0x000000070ee00000| Complete 
| 120|0x000000070f000000, 0x000000070f200000, 0x000000070f200000|100%|HC|  |TAMS 0x000000070f000000, 0x000000070f000000| Complete 
| 121|0x000000070f200000, 0x000000070f400000, 0x000000070f400000|100%|HC|  |TAMS 0x000000070f200000, 0x000000070f200000| Complete 
| 122|0x000000070f400000, 0x000000070f600000, 0x000000070f600000|100%|HC|  |TAMS 0x000000070f400000, 0x000000070f400000| Complete 
| 123|0x000000070f600000, 0x000000070f800000, 0x000000070f800000|100%|HC|  |TAMS 0x000000070f600000, 0x000000070f600000| Complete 
| 124|0x000000070f800000, 0x000000070fa00000, 0x000000070fa00000|100%|HC|  |TAMS 0x000000070f800000, 0x000000070f800000| Complete 
| 125|0x000000070fa00000, 0x000000070fc00000, 0x000000070fc00000|100%|HC|  |TAMS 0x000000070fa00000, 0x000000070fa00000| Complete 
| 126|0x000000070fc00000, 0x000000070fe00000, 0x000000070fe00000|100%|HC|  |TAMS 0x000000070fc00000, 0x000000070fc00000| Complete 
| 127|0x000000070fe00000, 0x0000000710000000, 0x0000000710000000|100%|HC|  |TAMS 0x000000070fe00000, 0x000000070fe00000| Complete 
| 128|0x0000000710000000, 0x0000000710200000, 0x0000000710200000|100%|HC|  |TAMS 0x0000000710000000, 0x0000000710000000| Complete 
| 129|0x0000000710200000, 0x0000000710400000, 0x0000000710400000|100%|HC|  |TAMS 0x0000000710200000, 0x0000000710200000| Complete 
| 130|0x0000000710400000, 0x0000000710600000, 0x0000000710600000|100%|HC|  |TAMS 0x0000000710400000, 0x0000000710400000| Complete 
| 131|0x0000000710600000, 0x0000000710800000, 0x0000000710800000|100%|HC|  |TAMS 0x0000000710600000, 0x0000000710600000| Complete 
| 132|0x0000000710800000, 0x0000000710a00000, 0x0000000710a00000|100%|HC|  |TAMS 0x0000000710800000, 0x0000000710800000| Complete 
| 133|0x0000000710a00000, 0x0000000710c00000, 0x0000000710c00000|100%|HC|  |TAMS 0x0000000710a00000, 0x0000000710a00000| Complete 
| 134|0x0000000710c00000, 0x0000000710e00000, 0x0000000710e00000|100%|HC|  |TAMS 0x0000000710c00000, 0x0000000710c00000| Complete 
| 135|0x0000000710e00000, 0x0000000711000000, 0x0000000711000000|100%|HC|  |TAMS 0x0000000710e00000, 0x0000000710e00000| Complete 
| 136|0x0000000711000000, 0x0000000711200000, 0x0000000711200000|100%|HC|  |TAMS 0x0000000711000000, 0x0000000711000000| Complete 
| 137|0x0000000711200000, 0x0000000711400000, 0x0000000711400000|100%|HC|  |TAMS 0x0000000711200000, 0x0000000711200000| Complete 
| 138|0x0000000711400000, 0x0000000711600000, 0x0000000711600000|100%|HC|  |TAMS 0x0000000711400000, 0x0000000711400000| Complete 
| 139|0x0000000711600000, 0x0000000711800000, 0x0000000711800000|100%|HC|  |TAMS 0x0000000711600000, 0x0000000711600000| Complete 
| 140|0x0000000711800000, 0x0000000711a00000, 0x0000000711a00000|100%|HC|  |TAMS 0x0000000711800000, 0x0000000711800000| Complete 
| 141|0x0000000711a00000, 0x0000000711c00000, 0x0000000711c00000|100%|HC|  |TAMS 0x0000000711a00000, 0x0000000711a00000| Complete 
| 142|0x0000000711c00000, 0x0000000711e00000, 0x0000000711e00000|100%|HC|  |TAMS 0x0000000711c00000, 0x0000000711c00000| Complete