-
Models converted from `saved_model` to `tfjs_graph_model` lose their output signature information.
This is not specific to any single model; it is a generic converter issue.
a) Saved model from
``…
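One way to see the symptom (not part of the issue report itself) is to inspect the converted model's `model.json`: the SavedModel's named outputs come back as generic `Identity:N` tensors. A minimal stdlib sketch using a synthetic excerpt — the exact `model.json` layout here is an assumption for illustration:

```python
# Synthetic excerpt of a converted tfjs_graph_model's model.json
# (layout is an assumption). After conversion, the SavedModel's named
# outputs are often replaced by generic "Identity:N" tensors.
model_json = {
    "format": "graph-model",
    "signature": {
        "outputs": {
            "Identity:0": {"name": "Identity:0"},
            "Identity:1": {"name": "Identity:1"},
        }
    },
}

def has_named_outputs(model):
    """Return True if any output kept a meaningful (non-Identity) name."""
    outputs = model.get("signature", {}).get("outputs", {})
    return any(not name.startswith("Identity") for name in outputs)

print(has_named_outputs(model_json))  # False: the output names were lost
```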
-
Hello, I am an avid user of TensorRT for model serving.
I want to serve a QAT (int8) model, trained with PyTorch 1.7.1, using TensorRT.
What is the best way to serve this model?
…
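A commonly suggested pipeline for this (an assumption on my part, not stated in the issue) is: export the QAT model to ONNX with its Q/DQ nodes via `torch.onnx.export`, then build an int8 engine with `trtexec`. A stdlib sketch that only assembles the commands — the file paths are placeholders:

```python
# Hedged sketch: assemble (not run) the typical commands for serving a
# PyTorch QAT model with TensorRT. Paths are placeholders.
onnx_path = "model_qat.onnx"
engine_path = "model_qat.plan"

# Step 1 (in PyTorch): torch.onnx.export(model, example_input, onnx_path)
# exports the fake-quantized graph, keeping the Q/DQ nodes.

# Step 2: build a serialized int8 engine with trtexec.
trtexec_cmd = [
    "trtexec",
    f"--onnx={onnx_path}",
    "--int8",                       # enable int8 kernels; Q/DQ nodes supply scales
    f"--saveEngine={engine_path}",  # serialized engine for the serving process
]
print(" ".join(trtexec_cmd))
```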
-
Please make sure that this is a bug. As per our [GitHub Policy](https://github.com/tensorflow/tensorflow/blob/master/ISSUES.md), we only address code/doc bugs, performance issues, feature requests and…
-
None of the new EfficientDet saved models on TFHub can be used in TFJS.
The conversion itself completes OK and creates a `tfjs_graph_model`:
`tensorflowjs_converter --input_format=tf_hub --output_forma…
-
Please make sure that this is a bug. As per our [GitHub Policy](https://github.com/tensorflow/tensorflow/blob/master/ISSUES.md), we only address code/doc bugs, performance issues, feature requests a…
-
Here is my model:
```
class xbert_gru(tf.keras.models.Model):
    def __init__(self):
        super(xbert_gru, self).__init__()
        self.xbert_model = build_transformer_model(config_path,
…
-
## Context
Trying to load test a TorchServe model to gauge performance with a custom handler.
* torchserve version: 0.2.0
* torch version: 1.6.0
* java version: openjdk 11.0.8
* Operating System…
-
## Environment info
- `transformers` version: 4.5.0
- Platform: Windows-10-10.0.19041-SP0
- Python version: 3.7.4
- PyTorch version (GPU?): 1.8.1+cu111 (True)
- Tensorflow version (GPU?): 2.4…
-
Code:
```
import tensorflow as tf
from tensorflow.keras.layers import Input
model = tf.keras.models.load_model("Model")
print(model) # First line of output.
inputs = Input(shape=(1, 12, 12))…
-
### System information
- **Have I written custom code (as opposed to using a stock example script
provided in TensorFlow Model Analysis)**: stock configuration with custom model
- **OS Pl…