h2oai / h2ogpt

Private chat with local GPT with documents, images, video, and more. 100% private, Apache 2.0. Supports oLLaMa, Mixtral, llama.cpp, and more. Demo: https://gpt.h2o.ai/ https://gpt-docs.h2o.ai/
http://h2o.ai
Apache License 2.0
11.24k stars 1.23k forks

Increase in GPU memory usage as generation continues, imbalanced across GPUs #66

Closed pseudotensor closed 7 months ago

pseudotensor commented 1 year ago
>>> import torch
>>> from transformers import pipeline
>>> generate_text = pipeline(model="h2oai/h2ogpt-oasst1-512-20b", torch_dtype=torch.bfloat16, trust_remote_code=True, device_map="auto")
>>> res = generate_text("Why is drinking water so healthy?", max_new_tokens=3000)
Setting `pad_token_id` to `eos_token_id`:0 for open-end generation.

During this long generation, GPU memory usage starts out balanced, then becomes increasingly imbalanced.

Thu Apr 20 16:37:04 2023       
+---------------------------------------------------------------------------------------+
| NVIDIA-SMI 530.30.02              Driver Version: 530.30.02    CUDA Version: 12.1     |
|-----------------------------------------+----------------------+----------------------+
| GPU  Name                  Persistence-M| Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp  Perf            Pwr:Usage/Cap|         Memory-Usage | GPU-Util  Compute M. |
|                                         |                      |               MIG M. |
|=========================================+======================+======================|
|   0  NVIDIA RTX A6000                On | 00000000:3B:00.0 Off |                  Off |
|  0%   45C    P2              105W / 250W|  12220MiB / 49140MiB |     33%      Default |
|                                         |                      |                  N/A |
+-----------------------------------------+----------------------+----------------------+
|   1  NVIDIA RTX A6000                On | 00000000:5E:00.0 Off |                  Off |
|  0%   45C    P2               72W / 250W|  11744MiB / 49140MiB |     17%      Default |
|                                         |                      |                  N/A |
+-----------------------------------------+----------------------+----------------------+
|   2  NVIDIA RTX A6000                On | 00000000:86:00.0 Off |                  Off |
|  0%   45C    P2               98W / 250W|  11744MiB / 49140MiB |     19%      Default |
|                                         |                      |                  N/A |
+-----------------------------------------+----------------------+----------------------+
|   3  NVIDIA RTX A6000                On | 00000000:AF:00.0 Off |                  Off |
|  0%   45C    P2              103W / 250W|  11125MiB / 49140MiB |     23%      Default |
|                                         |                      |                  N/A |
+-----------------------------------------+----------------------+----------------------+
+---------------------------------------------------------------------------------------+
| NVIDIA-SMI 530.30.02              Driver Version: 530.30.02    CUDA Version: 12.1     |
|-----------------------------------------+----------------------+----------------------+
| GPU  Name                  Persistence-M| Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp  Perf            Pwr:Usage/Cap|         Memory-Usage | GPU-Util  Compute M. |
|                                         |                      |               MIG M. |
|=========================================+======================+======================|
|   0  NVIDIA RTX A6000                On | 00000000:3B:00.0 Off |                  Off |
|  0%   50C    P2               95W / 250W|  40566MiB / 49140MiB |     73%      Default |
|                                         |                      |                  N/A |
+-----------------------------------------+----------------------+----------------------+
|   1  NVIDIA RTX A6000                On | 00000000:5E:00.0 Off |                  Off |
|  0%   48C    P2               76W / 250W|  15926MiB / 49140MiB |     36%      Default |
|                                         |                      |                  N/A |
+-----------------------------------------+----------------------+----------------------+
|   2  NVIDIA RTX A6000                On | 00000000:86:00.0 Off |                  Off |
|  0%   48C    P2               87W / 250W|  15926MiB / 49140MiB |     10%      Default |
|                                         |                      |                  N/A |
+-----------------------------------------+----------------------+----------------------+
|   3  NVIDIA RTX A6000                On | 00000000:AF:00.0 Off |                  Off |
|  0%   49C    P2              130W / 250W|  14682MiB / 49140MiB |     21%      Default |
|                                         |                      |                  N/A |
+-----------------------------------------+----------------------+----------------------+
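The skew can be quantified directly from the nvidia-smi readings above. A small helper (hypothetical, not part of h2ogpt) that takes the per-GPU used-memory values in MiB and reports the max/min spread:

```python
def memory_imbalance(used_mib):
    """Return (max, min, ratio) for a list of per-GPU used-memory readings in MiB."""
    hi, lo = max(used_mib), min(used_mib)
    return hi, lo, hi / lo

# First snapshot: roughly balanced (ratio ~1.1)
print(memory_imbalance([12220, 11744, 11744, 11125]))
# Second snapshot: GPU 0 now holds ~2.8x the memory of GPU 3
print(memory_imbalance([40566, 15926, 15926, 14682]))
```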

but then it can still drop back down by a lot during generation:

Thu Apr 20 16:47:17 2023       
+---------------------------------------------------------------------------------------+
| NVIDIA-SMI 530.30.02              Driver Version: 530.30.02    CUDA Version: 12.1     |
|-----------------------------------------+----------------------+----------------------+
| GPU  Name                  Persistence-M| Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp  Perf            Pwr:Usage/Cap|         Memory-Usage | GPU-Util  Compute M. |
|                                         |                      |               MIG M. |
|=========================================+======================+======================|
|   0  NVIDIA RTX A6000                On | 00000000:3B:00.0 Off |                  Off |
|  0%   50C    P2               95W / 250W|  18334MiB / 49140MiB |     75%      Default |
|                                         |                      |                  N/A |
+-----------------------------------------+----------------------+----------------------+
|   1  NVIDIA RTX A6000                On | 00000000:5E:00.0 Off |                  Off |
|  0%   49C    P2               74W / 250W|  17642MiB / 49140MiB |      8%      Default |
|                                         |                      |                  N/A |
+-----------------------------------------+----------------------+----------------------+
|   2  NVIDIA RTX A6000                On | 00000000:86:00.0 Off |                  Off |
|  0%   50C    P2              117W / 250W|  17642MiB / 49140MiB |      6%      Default |
|                                         |                      |                  N/A |
+-----------------------------------------+----------------------+----------------------+
|   3  NVIDIA RTX A6000                On | 00000000:AF:00.0 Off |                  Off |
|  0%   49C    P2              115W / 250W|  16139MiB / 49140MiB |     16%      Default |
|                                         |                      |                  N/A |
+-----------------------------------------+----------------------+----------------------+

Generation also eventually fails:

Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/home/jon/miniconda3/envs/alpaca/lib/python3.10/site-packages/transformers/pipelines/text_generation.py", line 209, in __call__
    return super().__call__(text_inputs, **kwargs)
  File "/home/jon/miniconda3/envs/alpaca/lib/python3.10/site-packages/transformers/pipelines/base.py", line 1109, in __call__
    return self.run_single(inputs, preprocess_params, forward_params, postproces
  File "/home/jon/miniconda3/envs/alpaca/lib/python3.10/site-packages/transformers/pipelines/base.py", line 1116, in run_single
    model_outputs = self.forward(model_inputs, **forward_params)
  File "/home/jon/miniconda3/envs/alpaca/lib/python3.10/site-packages/transformers/pipelines/base.py", line 1015, in forward
    model_outputs = self._forward(model_inputs, **forward_params)
  File "/home/jon/miniconda3/envs/alpaca/lib/python3.10/site-packages/transformers/pipelines/text_generation.py", line 251, in _forward
    generated_sequence = self.model.generate(input_ids=input_ids, attention_mask=att
  File "/home/jon/miniconda3/envs/alpaca/lib/python3.10/site-packages/torch/utils/_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
  File "/home/jon/miniconda3/envs/alpaca/lib/python3.10/site-packages/transformers/generation/utils.py", line 1437, in generate
    return self.greedy_search(
  File "/home/jon/miniconda3/envs/alpaca/lib/python3.10/site-packages/transformers/generation/utils.py", line 2248, in greedy_search
    outputs = self(
  File "/home/jon/miniconda3/envs/alpaca/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1501, in _call_impl
    return forward_call(*args, **kwargs)
  File "/home/jon/miniconda3/envs/alpaca/lib/python3.10/site-packages/accelerate/hooks.py", line 165, in new_forward
    output = old_forward(*args, **kwargs)
  File "/home/jon/miniconda3/envs/alpaca/lib/python3.10/site-packages/transformers/models/gpt_neox/modeling_gpt_neox.py", line 662, in forward
    outputs = self.gpt_neox(
  File "/home/jon/miniconda3/envs/alpaca/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1501, in _call_impl
    return forward_call(*args, **kwargs)
  File "/home/jon/miniconda3/envs/alpaca/lib/python3.10/site-packages/transformers/models/gpt_neox/modeling_gpt_neox.py", line 553, in forward
    outputs = layer(
  File "/home/jon/miniconda3/envs/alpaca/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1501, in _call_impl
    return forward_call(*args, **kwargs)
  File "/home/jon/miniconda3/envs/alpaca/lib/python3.10/site-packages/accelerate/hooks.py", line 165, in new_forward
    output = old_forward(*args, **kwargs)
  File "/home/jon/miniconda3/envs/alpaca/lib/python3.10/site-packages/transformers/models/gpt_neox/modeling_gpt_neox.py", line 320, in forward
    attention_layer_outputs = self.attention(
  File "/home/jon/miniconda3/envs/alpaca/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1501, in _call_impl
    return forward_call(*args, **kwargs)
  File "/home/jon/miniconda3/envs/alpaca/lib/python3.10/site-packages/accelerate/hooks.py", line 165, in new_forward
    output = old_forward(*args, **kwargs)
  File "/home/jon/miniconda3/envs/alpaca/lib/python3.10/site-packages/transformers/models/gpt_neox/modeling_gpt_neox.py", line 152, in forward
    attn_output, attn_weights = self._attn(query, key, value, attention_mask, head_m
  File "/home/jon/miniconda3/envs/alpaca/lib/python3.10/site-packages/transformers/models/gpt_neox/modeling_gpt_neox.py", line 219, in _attn
    attn_scores = torch.where(causal_mask, attn_scores, mask_value)
RuntimeError: The size of tensor a (2048) must match the size of tensor b (2049) at non-singleton dimension 3
>>> 
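The 2048-vs-2049 mismatch is consistent with generation running past a 2,048-token context window (the causal-mask size in the error), which max_new_tokens=3000 plus the prompt would exceed. A defensive sketch, assuming a 2,048-token limit (the helper name and default are illustrative, not part of transformers or h2ogpt):

```python
def clamp_new_tokens(requested, prompt_tokens, context_len=2048):
    """Cap max_new_tokens so prompt + generated tokens fit inside the context window."""
    return max(0, min(requested, context_len - prompt_tokens))

# e.g. a 12-token prompt leaves room for at most 2036 new tokens
print(clamp_new_tokens(3000, 12))  # 2036
```

The result could then be passed as max_new_tokens to the pipeline call, with prompt_tokens taken from the tokenized prompt length.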
arnocandel commented 1 year ago

(env) arno@rippa:/nfs4/llm/h2ogpt(main)$ CUDA_VISIBLE_DEVICES=2 python generate.py --infer-devices=False --base_model=h2oai/h2ogpt-oasst1-512-12b --temperature=0.1 --load_8bit=True
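One way to keep device_map="auto" from concentrating weights and KV-cache on GPU 0 is an explicit max_memory map, which accelerate/transformers accept at model load time. A sketch (the helper and the cap values are illustrative, not h2ogpt options):

```python
def build_max_memory(n_gpus, per_gpu_gib, cpu_gib=64):
    """Build a max_memory dict in the format device_map='auto' expects:
    integer keys for GPU indices, 'cpu' for host RAM, values as size strings."""
    mm = {i: f"{per_gpu_gib}GiB" for i in range(n_gpus)}
    mm["cpu"] = f"{cpu_gib}GiB"
    return mm

print(build_max_memory(4, 40))
```

The result could then be passed through, e.g., model_kwargs={"max_memory": build_max_memory(4, 40)} when constructing the pipeline, so no single GPU is allowed to grow far beyond the others.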


Instruction","show_label":true,"name":"textbox","visible":true,"style":{}},"serializer":"Serializable","api_info":{"raw_input":["str","string value"],"raw_output":["str","string value"],"serialized_input":["str","string value"],"serialized_output":["str","string value"]},"example_inputs":{"raw":"Howdy!","serialized":"Howdy!"}},{"id":14,"type":"button","props":{"value":"Submit","variant":"secondary","interactive":true,"name":"button","visible":true,"style":{}},"serializer":"Serializable","api_info":{"raw_input":["str","string value"],"raw_output":["str","string value"],"serialized_input":["str","string value"],"serialized_output":["str","string value"]},"example_inputs":{"raw":"Howdy!","serialized":"Howdy!"}},{"id":15,"type":"button","props":{"value":"Flag","variant":"secondary","interactive":true,"name":"button","visible":true,"style":{}},"serializer":"Serializable","api_info":{"raw_input":["str","string value"],"raw_output":["str","string value"],"serialized_input":["str","string value"],"serialized_output":["str","string value"]},"example_inputs":{"raw":"Howdy!","serialized":"Howdy!"}},{"id":16,"type":"column","props":{"type":"column","variant":"default","scale":1,"min_width":320,"visible":"OpenAssistant/reward-model-deberta-v3-large-v2","style":{}}},{"id":17,"type":"textbox","props":{"lines":1,"max_lines":20,"value":"Response Score: NA","type":"text","show_label":false,"name":"textbox","visible":true,"style":{}},"serializer":"Serializable","api_info":{"raw_input":["str","string value"],"raw_output":["str","string value"],"serialized_input":["str","string value"],"serialized_output":["str","string 
value"]},"example_inputs":{"raw":"Howdy!","serialized":"Howdy!"}},{"id":18,"type":"form","props":{"type":"form","visible":true,"style":{}}},{"id":19,"type":"form","props":{"type":"form","visible":true,"style":{}}},{"id":20,"type":"column","props":{"type":"column","variant":"default","scale":1,"min_width":320,"visible":true,"style":{}}},{"id":21,"type":"row","props":{"type":"row","variant":"default","visible":true,"style":{}}},{"id":22,"type":"chatbot","props":{"value":[],"selectable":false,"label":"h2oGPT [Model: h2oai/h2ogpt-oasst1-512-12b]","show_label":true,"name":"chatbot","visible":true,"style":{"height":400}},"serializer":"JSONSerializable","api_info":{"raw_input":["str | Dict | List","JSON-serializable object or a string"],"raw_output":["Dict | List","dictionary- or list-like object"],"serialized_input":["str","filepath to JSON file"],"serialized_output":["str","filepath to JSON file"]},"example_inputs":{"raw":{"a":1,"b":2},"serialized":null}},{"id":23,"type":"chatbot","props":{"value":[],"selectable":false,"label":"h2oGPT [   !!! Please Load Model in Models Tab !!!   
]","show_label":true,"name":"chatbot","visible":false,"style":{"height":400}},"serializer":"JSONSerializable","api_info":{"raw_input":["str | Dict | List","JSON-serializable object or a string"],"raw_output":["Dict | List","dictionary- or list-like object"],"serialized_input":["str","filepath to JSON file"],"serialized_output":["str","filepath to JSON file"]},"example_inputs":{"raw":{"a":1,"b":2},"serialized":null}},{"id":24,"type":"row","props":{"type":"row","variant":"default","visible":true,"style":{}}},{"id":25,"type":"column","props":{"type":"column","variant":"default","scale":50,"min_width":320,"visible":true,"style":{}}},{"id":26,"type":"textbox","props":{"lines":4,"max_lines":20,"placeholder":"Enter a question or imperative.","value":"","type":"text","label":"You (Shift-Enter or push Submit to send message)","show_label":true,"name":"textbox","visible":true,"style":{}},"serializer":"Serializable","api_info":{"raw_input":["str","string value"],"raw_output":["str","string value"],"serialized_input":["str","string value"],"serialized_output":["str","string value"]},"example_inputs":{"raw":"Howdy!","serialized":"Howdy!"}},{"id":27,"type":"form","props":{"type":"form","visible":true,"style":{}}},{"id":28,"type":"row","props":{"type":"row","variant":"default","visible":true,"style":{}}},{"id":29,"type":"button","props":{"value":"Submit","variant":"secondary","interactive":true,"name":"button","visible":true,"style":{"full_width":false,"size":"sm"}},"serializer":"Serializable","api_info":{"raw_input":["str","string value"],"raw_output":["str","string value"],"serialized_input":["str","string value"],"serialized_output":["str","string value"]},"example_inputs":{"raw":"Howdy!","serialized":"Howdy!"}},{"id":30,"type":"button","props":{"value":"Stop","variant":"secondary","interactive":true,"name":"button","visible":true,"style":{"full_width":false,"size":"sm"}},"serializer":"Serializable","api_info":{"raw_input":["str","string value"],"raw_output":["str","string 
value"],"serialized_input":["str","string value"],"serialized_output":["str","string value"]},"example_inputs":{"raw":"Howdy!","serialized":"Howdy!"}},{"id":31,"type":"row","props":{"type":"row","variant":"default","visible":true,"style":{}}},{"id":32,"type":"button","props":{"value":"New Conversation","variant":"secondary","interactive":true,"name":"button","visible":true,"style":{}},"serializer":"Serializable","api_info":{"raw_input":["str","string value"],"raw_output":["str","string value"],"serialized_input":["str","string value"],"serialized_output":["str","string value"]},"example_inputs":{"raw":"Howdy!","serialized":"Howdy!"}},{"id":33,"type":"button","props":{"value":"Flag","variant":"secondary","interactive":true,"name":"button","visible":true,"style":{}},"serializer":"Serializable","api_info":{"raw_input":["str","string value"],"raw_output":["str","string value"],"serialized_input":["str","string value"],"serialized_output":["str","string value"]},"example_inputs":{"raw":"Howdy!","serialized":"Howdy!"}},{"id":34,"type":"column","props":{"type":"column","variant":"default","scale":1,"min_width":320,"visible":"OpenAssistant/reward-model-deberta-v3-large-v2","style":{}}},{"id":35,"type":"textbox","props":{"lines":1,"max_lines":20,"value":"Response Score: NA","type":"text","show_label":false,"name":"textbox","visible":true,"style":{}},"serializer":"Serializable","api_info":{"raw_input":["str","string value"],"raw_output":["str","string value"],"serialized_input":["str","string value"],"serialized_output":["str","string value"]},"example_inputs":{"raw":"Howdy!","serialized":"Howdy!"}},{"id":36,"type":"textbox","props":{"lines":1,"max_lines":20,"value":"Response Score2: NA","type":"text","show_label":false,"name":"textbox","visible":false,"style":{}},"serializer":"Serializable","api_info":{"raw_input":["str","string value"],"raw_output":["str","string value"],"serialized_input":["str","string value"],"serialized_output":["str","string 
value"]},"example_inputs":{"raw":"Howdy!","serialized":"Howdy!"}},{"id":37,"type":"form","props":{"type":"form","visible":true,"style":{}}},{"id":38,"type":"button","props":{"value":"Regenerate","variant":"secondary","interactive":true,"name":"button","visible":true,"style":{}},"serializer":"Serializable","api_info":{"raw_input":["str","string value"],"raw_output":["str","string value"],"serialized_input":["str","string value"],"serialized_output":["str","string value"]},"example_inputs":{"raw":"Howdy!","serialized":"Howdy!"}},{"id":39,"type":"button","props":{"value":"Undo","variant":"secondary","interactive":true,"name":"button","visible":true,"style":{}},"serializer":"Serializable","api_info":{"raw_input":["str","string value"],"raw_output":["str","string value"],"serialized_input":["str","string value"],"serialized_output":["str","string value"]},"example_inputs":{"raw":"Howdy!","serialized":"Howdy!"}},{"id":40,"type":"tabitem","props":{"label":"Input/Output","visible":true,"style":{}}},{"id":41,"type":"row","props":{"type":"row","variant":"default","visible":true,"style":{}}},{"id":42,"type":"tabitem","props":{"label":"Expert","visible":true,"style":{}}},{"id":43,"type":"row","props":{"type":"row","variant":"default","visible":true,"style":{}}},{"id":44,"type":"column","props":{"type":"column","variant":"default","scale":1,"min_width":320,"visible":true,"style":{}}},{"id":45,"type":"checkbox","props":{"value":true,"label":"Stream output","show_label":true,"name":"checkbox","visible":true,"style":{}},"serializer":"Serializable","api_info":{"raw_input":["bool","boolean value"],"raw_output":["bool","boolean value"],"serialized_input":["bool","boolean value"],"serialized_output":["bool","boolean 
value"]},"example_inputs":{"raw":true,"serialized":true}},{"id":46,"type":"dropdown","props":{"choices":["plain","instruct","quality","human_bot","dai_faq","summarize","simple_instruct","instruct_vicuna","instruct_with_end","human_bot_orig"],"value":"human_bot","allow_custom_value":false,"label":"Prompt Type","show_label":true,"name":"dropdown","visible":true,"style":{}},"serializer":"Serializable","api_info":{"raw_input":["str","Option from: [\u0027plain\u0027, \u0027instruct\u0027, \u0027quality\u0027, \u0027human_bot\u0027, \u0027dai_faq\u0027, \u0027summarize\u0027, \u0027simple_instruct\u0027, \u0027instruct_vicuna\u0027, \u0027instruct_with_end\u0027, \u0027human_bot_orig\u0027]"],"raw_output":["str","Option from: [\u0027plain\u0027, \u0027instruct\u0027, \u0027quality\u0027, \u0027human_bot\u0027, \u0027dai_faq\u0027, \u0027summarize\u0027, \u0027simple_instruct\u0027, \u0027instruct_vicuna\u0027, \u0027instruct_with_end\u0027, \u0027human_bot_orig\u0027]"],"serialized_input":["str","Option from: [\u0027plain\u0027, \u0027instruct\u0027, \u0027quality\u0027, \u0027human_bot\u0027, \u0027dai_faq\u0027, \u0027summarize\u0027, \u0027simple_instruct\u0027, \u0027instruct_vicuna\u0027, \u0027instruct_with_end\u0027, \u0027human_bot_orig\u0027]"],"serialized_output":["str","Option from: [\u0027plain\u0027, \u0027instruct\u0027, \u0027quality\u0027, \u0027human_bot\u0027, \u0027dai_faq\u0027, \u0027summarize\u0027, \u0027simple_instruct\u0027, \u0027instruct_vicuna\u0027, \u0027instruct_with_end\u0027, \u0027human_bot_orig\u0027]"]},"example_inputs":{"raw":"plain","serialized":"plain"}},{"id":47,"type":"dropdown","props":{"choices":["plain","instruct","quality","human_bot","dai_faq","summarize","simple_instruct","instruct_vicuna","instruct_with_end","human_bot_orig"],"value":"human_bot","allow_custom_value":false,"label":"Prompt Type Model 
2","show_label":true,"name":"dropdown","visible":false,"style":{}},"serializer":"Serializable","api_info":{"raw_input":["str","Option from: [\u0027plain\u0027, \u0027instruct\u0027, \u0027quality\u0027, \u0027human_bot\u0027, \u0027dai_faq\u0027, \u0027summarize\u0027, \u0027simple_instruct\u0027, \u0027instruct_vicuna\u0027, \u0027instruct_with_end\u0027, \u0027human_bot_orig\u0027]"],"raw_output":["str","Option from: [\u0027plain\u0027, \u0027instruct\u0027, \u0027quality\u0027, \u0027human_bot\u0027, \u0027dai_faq\u0027, \u0027summarize\u0027, \u0027simple_instruct\u0027, \u0027instruct_vicuna\u0027, \u0027instruct_with_end\u0027, \u0027human_bot_orig\u0027]"],"serialized_input":["str","Option from: [\u0027plain\u0027, \u0027instruct\u0027, \u0027quality\u0027, \u0027human_bot\u0027, \u0027dai_faq\u0027, \u0027summarize\u0027, \u0027simple_instruct\u0027, \u0027instruct_vicuna\u0027, \u0027instruct_with_end\u0027, \u0027human_bot_orig\u0027]"],"serialized_output":["str","Option from: [\u0027plain\u0027, \u0027instruct\u0027, \u0027quality\u0027, \u0027human_bot\u0027, \u0027dai_faq\u0027, \u0027summarize\u0027, \u0027simple_instruct\u0027, \u0027instruct_vicuna\u0027, \u0027instruct_with_end\u0027, \u0027human_bot_orig\u0027]"]},"example_inputs":{"raw":"plain","serialized":"plain"}},{"id":48,"type":"checkbox","props":{"value":false,"label":"Sample","show_label":true,"name":"checkbox","visible":true,"style":{},"info":"Enable sampler, required for use of temperature, top_p, top_k"},"serializer":"Serializable","api_info":{"raw_input":["bool","boolean value"],"raw_output":["bool","boolean value"],"serialized_input":["bool","boolean value"],"serialized_output":["bool","boolean value"]},"example_inputs":{"raw":true,"serialized":true}},{"id":49,"type":"slider","props":{"minimum":0.01,"maximum":3,"step":0.01,"value":0.1,"label":"Temperature","show_label":true,"name":"slider","visible":true,"style":{},"info":"Lower is deterministic (but may lead to repeats), Higher more 
creative (but may lead to hallucinations)"},"serializer":"Serializable","api_info":{"raw_input":["int | float","numeric value between 0.01 and 3"],"raw_output":["int | float","numeric value between 0.01 and 3"],"serialized_input":["int | float","numeric value between 0.01 and 3"],"serialized_output":["int | float","numeric value between 0.01 and 3"]},"example_inputs":{"raw":0.01,"serialized":0.01}},{"id":50,"type":"slider","props":{"minimum":0,"maximum":1,"step":0.01,"value":0.75,"label":"Top p","show_label":true,"name":"slider","visible":true,"style":{},"info":"Cumulative probability of tokens to sample from"},"serializer":"Serializable","api_info":{"raw_input":["int | float","numeric value between 0 and 1"],"raw_output":["int | float","numeric value between 0 and 1"],"serialized_input":["int | float","numeric value between 0 and 1"],"serialized_output":["int | float","numeric value between 0 and 1"]},"example_inputs":{"raw":0,"serialized":0}},{"id":51,"type":"slider","props":{"minimum":0,"maximum":100,"step":1,"value":40,"label":"Top k","show_label":true,"name":"slider","visible":true,"style":{},"info":"Num. tokens to sample from"},"serializer":"Serializable","api_info":{"raw_input":["int | float","numeric value between 0 and 100"],"raw_output":["int | float","numeric value between 0 and 100"],"serialized_input":["int | float","numeric value between 0 and 100"],"serialized_output":["int | float","numeric value between 0 and 100"]},"example_inputs":{"raw":0,"serialized":0}},{"id":52,"type":"slider","props":{"minimum":1,"maximum":8,"step":1,"value":1,"label":"Beams","show_label":true,"name":"slider","visible":true,"style":{},"info":"Number of searches for optimal overall probability.  
Uses more GPU memory/compute"},"serializer":"Serializable","api_info":{"raw_input":["int | float","numeric value between 1 and 8"],"raw_output":["int | float","numeric value between 1 and 8"],"serialized_input":["int | float","numeric value between 1 and 8"],"serialized_output":["int | float","numeric value between 1 and 8"]},"example_inputs":{"raw":1,"serialized":1}},{"id":53,"type":"slider","props":{"minimum":1,"maximum":2048,"step":1,"value":256,"label":"Max output length","show_label":true,"name":"slider","visible":true,"style":{}},"serializer":"Serializable","api_info":{"raw_input":["int | float","numeric value between 1 and 2048"],"raw_output":["int | float","numeric value between 1 and 2048"],"serialized_input":["int | float","numeric value between 1 and 2048"],"serialized_output":["int | float","numeric value between 1 and 2048"]},"example_inputs":{"raw":1,"serialized":1}},{"id":54,"type":"slider","props":{"minimum":0,"maximum":2048,"step":1,"value":0,"label":"Min output length","show_label":true,"name":"slider","visible":true,"style":{}},"serializer":"Serializable","api_info":{"raw_input":["int | float","numeric value between 0 and 2048"],"raw_output":["int | float","numeric value between 0 and 2048"],"serialized_input":["int | float","numeric value between 0 and 2048"],"serialized_output":["int | float","numeric value between 0 and 2048"]},"example_inputs":{"raw":0,"serialized":0}},{"id":55,"type":"checkbox","props":{"value":false,"label":"EarlyStopping","show_label":true,"name":"checkbox","visible":true,"style":{},"info":"Stop early in beam search"},"serializer":"Serializable","api_info":{"raw_input":["bool","boolean value"],"raw_output":["bool","boolean value"],"serialized_input":["bool","boolean value"],"serialized_output":["bool","boolean value"]},"example_inputs":{"raw":true,"serialized":true}},{"id":56,"type":"slider","props":{"minimum":0,"maximum":300,"step":1,"value":180,"label":"Max. 
time","show_label":true,"name":"slider","visible":true,"style":{},"info":"Max. time to search optimal output."},"serializer":"Serializable","api_info":{"raw_input":["int | float","numeric value between 0 and 300"],"raw_output":["int | float","numeric value between 0 and 300"],"serialized_input":["int | float","numeric value between 0 and 300"],"serialized_output":["int | float","numeric value between 0 and 300"]},"example_inputs":{"raw":0,"serialized":0}},{"id":57,"type":"slider","props":{"minimum":0.01,"maximum":3.0,"step":0.01,"value":1.07,"label":"Repetition Penalty","show_label":true,"name":"slider","visible":true,"style":{}},"serializer":"Serializable","api_info":{"raw_input":["int | float","numeric value between 0.01 and 3.0"],"raw_output":["int | float","numeric value between 0.01 and 3.0"],"serialized_input":["int | float","numeric value between 0.01 and 3.0"],"serialized_output":["int | float","numeric value between 0.01 and 3.0"]},"example_inputs":{"raw":0.01,"serialized":0.01}},{"id":58,"type":"slider","props":{"minimum":1,"maximum":10,"step":1,"value":1,"label":"Number Returns","show_label":true,"name":"slider","visible":true,"style":{},"info":"Must be \u003c= num_beams"},"serializer":"Serializable","api_info":{"raw_input":["int | float","numeric value between 1 and 10"],"raw_output":["int | float","numeric value between 1 and 10"],"serialized_input":["int | float","numeric value between 1 and 10"],"serialized_output":["int | float","numeric value between 1 and 10"]},"example_inputs":{"raw":1,"serialized":1}},{"id":59,"type":"textbox","props":{"lines":4,"max_lines":20,"placeholder":"","value":"","type":"text","label":"Input","show_label":true,"name":"textbox","visible":true,"style":{}},"serializer":"Serializable","api_info":{"raw_input":["str","string value"],"raw_output":["str","string value"],"serialized_input":["str","string value"],"serialized_output":["str","string 
value"]},"example_inputs":{"raw":"Howdy!","serialized":"Howdy!"}},{"id":60,"type":"textbox","props":{"lines":3,"max_lines":20,"value":"","type":"text","label":"System Pre-Context","show_label":true,"name":"textbox","visible":false,"style":{},"info":"Directly pre-appended without prompt processing"},"serializer":"Serializable","api_info":{"raw_input":["str","string value"],"raw_output":["str","string value"],"serialized_input":["str","string value"],"serialized_output":["str","string value"]},"example_inputs":{"raw":"Howdy!","serialized":"Howdy!"}},{"id":61,"type":"checkbox","props":{"value":true,"label":"Chat mode","show_label":true,"name":"checkbox","visible":true,"style":{}},"serializer":"Serializable","api_info":{"raw_input":["bool","boolean value"],"raw_output":["bool","boolean value"],"serialized_input":["bool","boolean value"],"serialized_output":["bool","boolean value"]},"example_inputs":{"raw":true,"serialized":true}},{"id":62,"type":"form","props":{"type":"form","visible":true,"style":{}}},{"id":63,"type":"form","props":{"type":"form","visible":true,"style":{}}},{"id":64,"type":"tabitem","props":{"label":"Models","visible":true,"style":{}}},{"id":65,"type":"checkbox","props":{"value":false,"label":"Compare Mode","show_label":true,"name":"checkbox","visible":true,"style":{}},"serializer":"Serializable","api_info":{"raw_input":["bool","boolean value"],"raw_output":["bool","boolean value"],"serialized_input":["bool","boolean value"],"serialized_output":["bool","boolean 
value"]},"example_inputs":{"raw":true,"serialized":true}},{"id":66,"type":"row","props":{"type":"row","variant":"default","visible":true,"style":{}}},{"id":67,"type":"column","props":{"type":"column","variant":"default","scale":1,"min_width":320,"visible":true,"style":{}}},{"id":68,"type":"row","props":{"type":"row","variant":"default","visible":true,"style":{}}},{"id":69,"type":"column","props":{"type":"column","variant":"default","scale":50,"min_width":320,"visible":true,"style":{}}},{"id":70,"type":"dropdown","props":{"choices":["[None/Remove]","EleutherAI/gpt-j-6B","EleutherAI/pythia-6.9b","EleutherAI/pythia-12b","EleutherAI/pythia-12b-deduped","EleutherAI/gpt-neox-20b","decapoda-research/llama-7b-hf","decapoda-research/llama-13b-hf","decapoda-research/llama-30b-hf","decapoda-research/llama-65b-hf","facebook/mbart-large-50-many-to-many-mmt","philschmid/bart-large-cnn-samsum","philschmid/flan-t5-base-samsum","gpt2","distilgpt2","databricks/dolly-v2-12b","h2oai/h2ogpt-oasst1-512-12b","h2oai/h2ogpt-oasst1-512-20b","h2oai/h2ogpt-oig-oasst1-512-6.9b","t5-small","t5-large","google/flan-t5","google/flan-t5-xxl","google/flan-ul2","AlekseyKorshuk/vicuna-7b","togethercomputer/GPT-NeoXT-Chat-Base-20B"],"value":"h2oai/h2ogpt-oasst1-512-12b","allow_custom_value":false,"label":"Choose Model","show_label":true,"name":"dropdown","visible":true,"style":{}},"serializer":"Serializable","api_info":{"raw_input":["str","Option from: [\u0027[None/Remove]\u0027, \u0027EleutherAI/gpt-j-6B\u0027, \u0027EleutherAI/pythia-6.9b\u0027, \u0027EleutherAI/pythia-12b\u0027, \u0027EleutherAI/pythia-12b-deduped\u0027, \u0027EleutherAI/gpt-neox-20b\u0027, \u0027decapoda-research/llama-7b-hf\u0027, \u0027decapoda-research/llama-13b-hf\u0027, \u0027decapoda-research/llama-30b-hf\u0027, \u0027decapoda-research/llama-65b-hf\u0027, \u0027facebook/mbart-large-50-many-to-many-mmt\u0027, \u0027philschmid/bart-large-cnn-samsum\u0027, \u0027philschmid/flan-t5-base-samsum\u0027, \u0027gpt2\u0027, 
\u0027distilgpt2\u0027, \u0027databricks/dolly-v2-12b\u0027, \u0027h2oai/h2ogpt-oasst1-512-12b\u0027, \u0027h2oai/h2ogpt-oasst1-512-20b\u0027, \u0027h2oai/h2ogpt-oig-oasst1-512-6.9b\u0027, \u0027t5-small\u0027, \u0027t5-large\u0027, \u0027google/flan-t5\u0027, \u0027google/flan-t5-xxl\u0027, \u0027google/flan-ul2\u0027, \u0027AlekseyKorshuk/vicuna-7b\u0027, \u0027togethercomputer/GPT-NeoXT-Chat-Base-20B\u0027]"],"raw_output":["str","Option from: [\u0027[None/Remove]\u0027, \u0027EleutherAI/gpt-j-6B\u0027, \u0027EleutherAI/pythia-6.9b\u0027, \u0027EleutherAI/pythia-12b\u0027, \u0027EleutherAI/pythia-12b-deduped\u0027, \u0027EleutherAI/gpt-neox-20b\u0027, \u0027decapoda-research/llama-7b-hf\u0027, \u0027decapoda-research/llama-13b-hf\u0027, \u0027decapoda-research/llama-30b-hf\u0027, \u0027decapoda-research/llama-65b-hf\u0027, \u0027facebook/mbart-large-50-many-to-many-mmt\u0027, \u0027philschmid/bart-large-cnn-samsum\u0027, \u0027philschmid/flan-t5-base-samsum\u0027, \u0027gpt2\u0027, \u0027distilgpt2\u0027, \u0027databricks/dolly-v2-12b\u0027, \u0027h2oai/h2ogpt-oasst1-512-12b\u0027, \u0027h2oai/h2ogpt-oasst1-512-20b\u0027, \u0027h2oai/h2ogpt-oig-oasst1-512-6.9b\u0027, \u0027t5-small\u0027, \u0027t5-large\u0027, \u0027google/flan-t5\u0027, \u0027google/flan-t5-xxl\u0027, \u0027google/flan-ul2\u0027, \u0027AlekseyKorshuk/vicuna-7b\u0027, \u0027togethercomputer/GPT-NeoXT-Chat-Base-20B\u0027]"],"serialized_input":["str","Option from: [\u0027[None/Remove]\u0027, \u0027EleutherAI/gpt-j-6B\u0027, \u0027EleutherAI/pythia-6.9b\u0027, \u0027EleutherAI/pythia-12b\u0027, \u0027EleutherAI/pythia-12b-deduped\u0027, \u0027EleutherAI/gpt-neox-20b\u0027, \u0027decapoda-research/llama-7b-hf\u0027, \u0027decapoda-research/llama-13b-hf\u0027, \u0027decapoda-research/llama-30b-hf\u0027, \u0027decapoda-research/llama-65b-hf\u0027, \u0027facebook/mbart-large-50-many-to-many-mmt\u0027, \u0027philschmid/bart-large-cnn-samsum\u0027, \u0027philschmid/flan-t5-base-samsum\u0027, 
\u0027gpt2\u0027, \u0027distilgpt2\u0027, \u0027databricks/dolly-v2-12b\u0027, \u0027h2oai/h2ogpt-oasst1-512-12b\u0027, \u0027h2oai/h2ogpt-oasst1-512-20b\u0027, \u0027h2oai/h2ogpt-oig-oasst1-512-6.9b\u0027, \u0027t5-small\u0027, \u0027t5-large\u0027, \u0027google/flan-t5\u0027, \u0027google/flan-t5-xxl\u0027, \u0027google/flan-ul2\u0027, \u0027AlekseyKorshuk/vicuna-7b\u0027, \u0027togethercomputer/GPT-NeoXT-Chat-Base-20B\u0027]"],"serialized_output":["str","Option from: [\u0027[None/Remove]\u0027, \u0027EleutherAI/gpt-j-6B\u0027, \u0027EleutherAI/pythia-6.9b\u0027, \u0027EleutherAI/pythia-12b\u0027, \u0027EleutherAI/pythia-12b-deduped\u0027, \u0027EleutherAI/gpt-neox-20b\u0027, \u0027decapoda-research/llama-7b-hf\u0027, \u0027decapoda-research/llama-13b-hf\u0027, \u0027decapoda-research/llama-30b-hf\u0027, \u0027decapoda-research/llama-65b-hf\u0027, \u0027facebook/mbart-large-50-many-to-many-mmt\u0027, \u0027philschmid/bart-large-cnn-samsum\u0027, \u0027philschmid/flan-t5-base-samsum\u0027, \u0027gpt2\u0027, \u0027distilgpt2\u0027, \u0027databricks/dolly-v2-12b\u0027, \u0027h2oai/h2ogpt-oasst1-512-12b\u0027, \u0027h2oai/h2ogpt-oasst1-512-20b\u0027, \u0027h2oai/h2ogpt-oig-oasst1-512-6.9b\u0027, \u0027t5-small\u0027, \u0027t5-large\u0027, \u0027google/flan-t5\u0027, \u0027google/flan-t5-xxl\u0027, \u0027google/flan-ul2\u0027, \u0027AlekseyKorshuk/vicuna-7b\u0027, \u0027togethercomputer/GPT-NeoXT-Chat-Base-20B\u0027]"]},"example_inputs":{"raw":"[None/Remove]","serialized":"[None/Remove]"}},{"id":71,"type":"dropdown","props":{"choices":["[None/Remove]"],"value":"[None/Remove]","allow_custom_value":false,"label":"Choose LORA","show_label":true,"name":"dropdown","visible":true,"style":{}},"serializer":"Serializable","api_info":{"raw_input":["str","Option from: [\u0027[None/Remove]\u0027]"],"raw_output":["str","Option from: [\u0027[None/Remove]\u0027]"],"serialized_input":["str","Option from: [\u0027[None/Remove]\u0027]"],"serialized_output":["str","Option from: 
[\u0027[None/Remove]\u0027]"]},"example_inputs":{"raw":"[None/Remove]","serialized":"[None/Remove]"}},{"id":72,"type":"column","props":{"type":"column","variant":"default","scale":1,"min_width":320,"visible":true,"style":{}}},{"id":73,"type":"button","props":{"value":"Load-Unload Model/LORA","variant":"secondary","interactive":true,"name":"button","visible":true,"style":{}},"serializer":"Serializable","api_info":{"raw_input":["str","string value"],"raw_output":["str","string value"],"serialized_input":["str","string value"],"serialized_output":["str","string value"]},"example_inputs":{"raw":"Howdy!","serialized":"Howdy!"}},{"id":74,"type":"checkbox","props":{"value":true,"label":"Load 8-bit [Not all models support]","show_label":true,"name":"checkbox","visible":true,"style":{}},"serializer":"Serializable","api_info":{"raw_input":["bool","boolean value"],"raw_output":["bool","boolean value"],"serialized_input":["bool","boolean value"],"serialized_output":["bool","boolean value"]},"example_inputs":{"raw":true,"serialized":true}},{"id":75,"type":"checkbox","props":{"value":false,"label":"Infer Devices [If GPU ID=-1 or not Checked, then will spread model over GPUs]","show_label":true,"name":"checkbox","visible":true,"style":{}},"serializer":"Serializable","api_info":{"raw_input":["bool","boolean value"],"raw_output":["bool","boolean value"],"serialized_input":["bool","boolean value"],"serialized_output":["bool","boolean value"]},"example_inputs":{"raw":true,"serialized":true}},{"id":76,"type":"dropdown","props":{"choices":["-1","0"],"value":"0","allow_custom_value":false,"label":"GPU ID [-1 = all GPUs]","show_label":true,"name":"dropdown","visible":true,"style":{}},"serializer":"Serializable","api_info":{"raw_input":["str","Option from: [\u0027-1\u0027, \u00270\u0027]"],"raw_output":["str","Option from: [\u0027-1\u0027, \u00270\u0027]"],"serialized_input":["str","Option from: [\u0027-1\u0027, \u00270\u0027]"],"serialized_output":["str","Option from: [\u0027-1\u0027, 
\u00270\u0027]"]},"example_inputs":{"raw":"-1","serialized":"-1"}},{"id":77,"type":"textbox","props":{"lines":1,"max_lines":20,"value":"h2oai/h2ogpt-oasst1-512-12b","type":"text","label":"Current Model","show_label":true,"name":"textbox","visible":true,"style":{}},"serializer":"Serializable","api_info":{"raw_input":["str","string value"],"raw_output":["str","string value"],"serialized_input":["str","string value"],"serialized_output":["str","string value"]},"example_inputs":{"raw":"Howdy!","serialized":"Howdy!"}},{"id":78,"type":"textbox","props":{"lines":1,"max_lines":20,"value":"[None/Remove]","type":"text","label":"Current LORA","show_label":true,"name":"textbox","visible":true,"style":{}},"serializer":"Serializable","api_info":{"raw_input":["str","string value"],"raw_output":["str","string value"],"serialized_input":["str","string value"],"serialized_output":["str","string value"]},"example_inputs":{"raw":"Howdy!","serialized":"Howdy!"}},{"id":79,"type":"form","props":{"type":"form","visible":true,"style":{}}},{"id":80,"type":"form","props":{"type":"form","visible":true,"style":{}}},{"id":81,"type":"row","props":{"type":"row","variant":"default","visible":true,"style":{}}},{"id":82,"type":"column","props":{"type":"column","variant":"default","scale":50,"min_width":320,"visible":true,"style":{}}},{"id":83,"type":"textbox","props":{"lines":1,"max_lines":20,"value":"","type":"text","label":"New Model HF name/path","show_label":true,"name":"textbox","visible":true,"style":{}},"serializer":"Serializable","api_info":{"raw_input":["str","string value"],"raw_output":["str","string value"],"serialized_input":["str","string value"],"serialized_output":["str","string value"]},"example_inputs":{"raw":"Howdy!","serialized":"Howdy!"}},{"id":84,"type":"textbox","props":{"lines":1,"max_lines":20,"value":"","type":"text","label":"New LORA HF 
name/path","show_label":true,"name":"textbox","visible":true,"style":{}},"serializer":"Serializable","api_info":{"raw_input":["str","string value"],"raw_output":["str","string value"],"serialized_input":["str","string value"],"serialized_output":["str","string value"]},"example_inputs":{"raw":"Howdy!","serialized":"Howdy!"}},{"id":85,"type":"form","props":{"type":"form","visible":true,"style":{}}},{"id":86,"type":"column","props":{"type":"column","variant":"default","scale":1,"min_width":320,"visible":true,"style":{}}},{"id":87,"type":"button","props":{"value":"Add new model name","variant":"secondary","interactive":true,"name":"button","visible":true,"style":{}},"serializer":"Serializable","api_info":{"raw_input":["str","string value"],"raw_output":["str","string value"],"serialized_input":["str","string value"],"serialized_output":["str","string value"]},"example_inputs":{"raw":"Howdy!","serialized":"Howdy!"}},{"id":88,"type":"button","props":{"value":"Add new LORA name","variant":"secondary","interactive":true,"name":"button","visible":true,"style":{}},"serializer":"Serializable","api_info":{"raw_input":["str","string value"],"raw_output":["str","string value"],"serialized_input":["str","string value"],"serialized_output":["str","string 
value"]},"example_inputs":{"raw":"Howdy!","serialized":"Howdy!"}},{"id":89,"type":"column","props":{"type":"column","variant":"default","scale":1,"min_width":320,"visible":false,"style":{}}},{"id":90,"type":"row","props":{"type":"row","variant":"default","visible":true,"style":{}}},{"id":91,"type":"column","props":{"type":"column","variant":"default","scale":50,"min_width":320,"visible":true,"style":{}}},{"id":92,"type":"dropdown","props":{"choices":["[None/Remove]","EleutherAI/gpt-j-6B","EleutherAI/pythia-6.9b","EleutherAI/pythia-12b","EleutherAI/pythia-12b-deduped","EleutherAI/gpt-neox-20b","decapoda-research/llama-7b-hf","decapoda-research/llama-13b-hf","decapoda-research/llama-30b-hf","decapoda-research/llama-65b-hf","facebook/mbart-large-50-many-to-many-mmt","philschmid/bart-large-cnn-samsum","philschmid/flan-t5-base-samsum","gpt2","distilgpt2","databricks/dolly-v2-12b","h2oai/h2ogpt-oasst1-512-12b","h2oai/h2ogpt-oasst1-512-20b","h2oai/h2ogpt-oig-oasst1-512-6.9b","t5-small","t5-large","google/flan-t5","google/flan-t5-xxl","google/flan-ul2","AlekseyKorshuk/vicuna-7b","togethercomputer/GPT-NeoXT-Chat-Base-20B"],"value":"[None/Remove]","allow_custom_value":false,"label":"Choose Model 2","show_label":true,"name":"dropdown","visible":true,"style":{}},"serializer":"Serializable","api_info":{"raw_input":["str","Option from: [\u0027[None/Remove]\u0027, \u0027EleutherAI/gpt-j-6B\u0027, \u0027EleutherAI/pythia-6.9b\u0027, \u0027EleutherAI/pythia-12b\u0027, \u0027EleutherAI/pythia-12b-deduped\u0027, \u0027EleutherAI/gpt-neox-20b\u0027, \u0027decapoda-research/llama-7b-hf\u0027, \u0027decapoda-research/llama-13b-hf\u0027, \u0027decapoda-research/llama-30b-hf\u0027, \u0027decapoda-research/llama-65b-hf\u0027, \u0027facebook/mbart-large-50-many-to-many-mmt\u0027, \u0027philschmid/bart-large-cnn-samsum\u0027, \u0027philschmid/flan-t5-base-samsum\u0027, \u0027gpt2\u0027, \u0027distilgpt2\u0027, \u0027databricks/dolly-v2-12b\u0027, \u0027h2oai/h2ogpt-oasst1-512-12b\u0027, 
\u0027h2oai/h2ogpt-oasst1-512-20b\u0027, \u0027h2oai/h2ogpt-oig-oasst1-512-6.9b\u0027, \u0027t5-small\u0027, \u0027t5-large\u0027, \u0027google/flan-t5\u0027, \u0027google/flan-t5-xxl\u0027, \u0027google/flan-ul2\u0027, \u0027AlekseyKorshuk/vicuna-7b\u0027, \u0027togethercomputer/GPT-NeoXT-Chat-Base-20B\u0027]"],"raw_output":["str","Option from: [\u0027[None/Remove]\u0027, \u0027EleutherAI/gpt-j-6B\u0027, \u0027EleutherAI/pythia-6.9b\u0027, \u0027EleutherAI/pythia-12b\u0027, \u0027EleutherAI/pythia-12b-deduped\u0027, \u0027EleutherAI/gpt-neox-20b\u0027, \u0027decapoda-research/llama-7b-hf\u0027, \u0027decapoda-research/llama-13b-hf\u0027, \u0027decapoda-research/llama-30b-hf\u0027, \u0027decapoda-research/llama-65b-hf\u0027, \u0027facebook/mbart-large-50-many-to-many-mmt\u0027, \u0027philschmid/bart-large-cnn-samsum\u0027, \u0027philschmid/flan-t5-base-samsum\u0027, \u0027gpt2\u0027, \u0027distilgpt2\u0027, \u0027databricks/dolly-v2-12b\u0027, \u0027h2oai/h2ogpt-oasst1-512-12b\u0027, \u0027h2oai/h2ogpt-oasst1-512-20b\u0027, \u0027h2oai/h2ogpt-oig-oasst1-512-6.9b\u0027, \u0027t5-small\u0027, \u0027t5-large\u0027, \u0027google/flan-t5\u0027, \u0027google/flan-t5-xxl\u0027, \u0027google/flan-ul2\u0027, \u0027AlekseyKorshuk/vicuna-7b\u0027, \u0027togethercomputer/GPT-NeoXT-Chat-Base-20B\u0027]"],"serialized_input":["str","Option from: [\u0027[None/Remove]\u0027, \u0027EleutherAI/gpt-j-6B\u0027, \u0027EleutherAI/pythia-6.9b\u0027, \u0027EleutherAI/pythia-12b\u0027, \u0027EleutherAI/pythia-12b-deduped\u0027, \u0027EleutherAI/gpt-neox-20b\u0027, \u0027decapoda-research/llama-7b-hf\u0027, \u0027decapoda-research/llama-13b-hf\u0027, \u0027decapoda-research/llama-30b-hf\u0027, \u0027decapoda-research/llama-65b-hf\u0027, \u0027facebook/mbart-large-50-many-to-many-mmt\u0027, \u0027philschmid/bart-large-cnn-samsum\u0027, \u0027philschmid/flan-t5-base-samsum\u0027, \u0027gpt2\u0027, \u0027distilgpt2\u0027, \u0027databricks/dolly-v2-12b\u0027, 
\u0027h2oai/h2ogpt-oasst1-512-12b\u0027, \u0027h2oai/h2ogpt-oasst1-512-20b\u0027, \u0027h2oai/h2ogpt-oig-oasst1-512-6.9b\u0027, \u0027t5-small\u0027, \u0027t5-large\u0027, \u0027google/flan-t5\u0027, \u0027google/flan-t5-xxl\u0027, \u0027google/flan-ul2\u0027, \u0027AlekseyKorshuk/vicuna-7b\u0027, \u0027togethercomputer/GPT-NeoXT-Chat-Base-20B\u0027]"],"serialized_output":["str","Option from: [\u0027[None/Remove]\u0027, \u0027EleutherAI/gpt-j-6B\u0027, \u0027EleutherAI/pythia-6.9b\u0027, \u0027EleutherAI/pythia-12b\u0027, \u0027EleutherAI/pythia-12b-deduped\u0027, \u0027EleutherAI/gpt-neox-20b\u0027, \u0027decapoda-research/llama-7b-hf\u0027, \u0027decapoda-research/llama-13b-hf\u0027, \u0027decapoda-research/llama-30b-hf\u0027, \u0027decapoda-research/llama-65b-hf\u0027, \u0027facebook/mbart-large-50-many-to-many-mmt\u0027, \u0027philschmid/bart-large-cnn-samsum\u0027, \u0027philschmid/flan-t5-base-samsum\u0027, \u0027gpt2\u0027, \u0027distilgpt2\u0027, \u0027databricks/dolly-v2-12b\u0027, \u0027h2oai/h2ogpt-oasst1-512-12b\u0027, \u0027h2oai/h2ogpt-oasst1-512-20b\u0027, \u0027h2oai/h2ogpt-oig-oasst1-512-6.9b\u0027, \u0027t5-small\u0027, \u0027t5-large\u0027, \u0027google/flan-t5\u0027, \u0027google/flan-t5-xxl\u0027, \u0027google/flan-ul2\u0027, \u0027AlekseyKorshuk/vicuna-7b\u0027, \u0027togethercomputer/GPT-NeoXT-Chat-Base-20B\u0027]"]},"example_inputs":{"raw":"[None/Remove]","serialized":"[None/Remove]"}},{"id":93,"type":"dropdown","props":{"choices":["[None/Remove]"],"value":"[None/Remove]","allow_custom_value":false,"label":"Choose LORA 2","show_label":true,"name":"dropdown","visible":true,"style":{}},"serializer":"Serializable","api_info":{"raw_input":["str","Option from: [\u0027[None/Remove]\u0027]"],"raw_output":["str","Option from: [\u0027[None/Remove]\u0027]"],"serialized_input":["str","Option from: [\u0027[None/Remove]\u0027]"],"serialized_output":["str","Option from: 
[\u0027[None/Remove]\u0027]"]},"example_inputs":{"raw":"[None/Remove]","serialized":"[None/Remove]"}},{"id":94,"type":"column","props":{"type":"column","variant":"default","scale":1,"min_width":320,"visible":true,"style":{}}},{"id":95,"type":"button","props":{"value":"Load-Unload Model/LORA 2","variant":"secondary","interactive":true,"name":"button","visible":true,"style":{}},"serializer":"Serializable","api_info":{"raw_input":["str","string value"],"raw_output":["str","string value"],"serialized_input":["str","string value"],"serialized_output":["str","string value"]},"example_inputs":{"raw":"Howdy!","serialized":"Howdy!"}},{"id":96,"type":"checkbox","props":{"value":true,"label":"Load 8-bit 2 [Not all models support]","show_label":true,"name":"checkbox","visible":true,"style":{}},"serializer":"Serializable","api_info":{"raw_input":["bool","boolean value"],"raw_output":["bool","boolean value"],"serialized_input":["bool","boolean value"],"serialized_output":["bool","boolean value"]},"example_inputs":{"raw":true,"serialized":true}},{"id":97,"type":"checkbox","props":{"value":false,"label":"Infer Devices 2 [If GPU ID=-1 or not Checked, then will spread model over GPUs]","show_label":true,"name":"checkbox","visible":true,"style":{}},"serializer":"Serializable","api_info":{"raw_input":["bool","boolean value"],"raw_output":["bool","boolean value"],"serialized_input":["bool","boolean value"],"serialized_output":["bool","boolean value"]},"example_inputs":{"raw":true,"serialized":true}},{"id":98,"type":"dropdown","props":{"choices":["-1","0"],"value":"0","allow_custom_value":false,"label":"GPU ID [-1 = all GPUs]","show_label":true,"name":"dropdown","visible":true,"style":{}},"serializer":"Serializable","api_info":{"raw_input":["str","Option from: [\u0027-1\u0027, \u00270\u0027]"],"raw_output":["str","Option from: [\u0027-1\u0027, \u00270\u0027]"],"serialized_input":["str","Option from: [\u0027-1\u0027, \u00270\u0027]"],"serialized_output":["str","Option from: 
[\u0027-1\u0027, \u00270\u0027]"]},"example_inputs":{"raw":"-1","serialized":"-1"}},{"id":99,"type":"textbox","props":{"lines":1,"max_lines":20,"value":"[None/Remove]","type":"text","label":"Current Model 2","show_label":true,"name":"textbox","visible":true,"style":{}},"serializer":"Serializable","api_info":{"raw_input":["str","string value"],"raw_output":["str","string value"],"serialized_input":["str","string value"],"serialized_output":["str","string value"]},"example_inputs":{"raw":"Howdy!","serialized":"Howdy!"}},{"id":100,"type":"textbox","props":{"lines":1,"max_lines":20,"value":"[None/Remove]","type":"text","label":"Current LORA 2","show_label":true,"name":"textbox","visible":true,"style":{}},"serializer":"Serializable","api_info":{"raw_input":["str","string value"],"raw_output":["str","string value"],"serialized_input":["str","string value"],"serialized_output":["str","string value"]},"example_inputs":{"raw":"Howdy!","serialized":"Howdy!"}},{"id":101,"type":"form","props":{"type":"form","visible":true,"style":{}}},{"id":102,"type":"form","props":{"type":"form","visible":true,"style":{}}},{"id":103,"type":"form","props":{"type":"form","visible":true,"style":{}}},{"id":104,"type":"tabitem","props":{"label":"System","visible":true,"style":{}}},{"id":105,"type":"row","props":{"type":"row","variant":"default","visible":true,"style":{}}},{"id":106,"type":"textbox","props":{"lines":1,"max_lines":1,"value":"","type":"password","label":"Admin Password","show_label":true,"name":"textbox","visible":false,"style":{}},"serializer":"Serializable","api_info":{"raw_input":["str","string value"],"raw_output":["str","string value"],"serialized_input":["str","string value"],"serialized_output":["str","string value"]},"example_inputs":{"raw":"Howdy!","serialized":"Howdy!"}},{"id":107,"type":"button","props":{"value":"Admin 
Access","variant":"secondary","interactive":true,"name":"button","visible":false,"style":{}},"serializer":"Serializable","api_info":{"raw_input":["str","string value"],"raw_output":["str","string value"],"serialized_input":["str","string value"],"serialized_output":["str","string value"]},"example_inputs":{"raw":"Howdy!","serialized":"Howdy!"}},{"id":108,"type":"form","props":{"type":"form","visible":true,"style":{}}},{"id":109,"type":"row","props":{"type":"row","variant":"default","visible":true,"style":{}}},{"id":110,"type":"column","props":{"type":"column","variant":"default","scale":1,"min_width":320,"visible":true,"style":{}}},{"id":111,"type":"row","props":{"type":"row","variant":"default","visible":true,"style":{}}},{"id":112,"type":"button","props":{"value":"Get System Info","variant":"secondary","interactive":true,"name":"button","visible":true,"style":{}},"serializer":"Serializable","api_info":{"raw_input":["str","string value"],"raw_output":["str","string value"],"serialized_input":["str","string value"],"serialized_output":["str","string value"]},"example_inputs":{"raw":"Howdy!","serialized":"Howdy!"}},{"id":113,"type":"textbox","props":{"lines":1,"max_lines":20,"value":"","type":"text","label":"System Info","show_label":true,"name":"textbox","visible":true,"style":{}},"serializer":"Serializable","api_info":{"raw_input":["str","string value"],"raw_output":["str","string value"],"serialized_input":["str","string value"],"serialized_output":["str","string value"]},"example_inputs":{"raw":"Howdy!","serialized":"Howdy!"}},{"id":114,"type":"form","props":{"type":"form","visible":true,"style":{}}},{"id":115,"type":"row","props":{"type":"row","variant":"default","visible":true,"style":{}}},{"id":116,"type":"button","props":{"value":"Zip","variant":"secondary","interactive":true,"name":"button","visible":true,"style":{}},"serializer":"Serializable","api_info":{"raw_input":["str","string value"],"raw_output":["str","string 
value"],"serialized_input":["str","string value"],"serialized_output":["str","string value"]},"example_inputs":{"raw":"Howdy!","serialized":"Howdy!"}},{"id":117,"type":"textbox","props":{"lines":1,"max_lines":20,"value":"","type":"text","label":"Zip file name","show_label":true,"name":"textbox","visible":true,"style":{}},"serializer":"Serializable","api_info":{"raw_input":["str","string value"],"raw_output":["str","string value"],"serialized_input":["str","string value"],"serialized_output":["str","string value"]},"example_inputs":{"raw":"Howdy!","serialized":"Howdy!"}},{"id":118,"type":"file","props":{"file_count":"single","selectable":false,"show_label":true,"name":"file","visible":true,"style":{}},"serializer":"FileSerializable","api_info":{"raw_input":["str | Dict","base64 string representation of file; or a dictionary-like object, the keys should be either: is_file (False), data (base64 representation of file) or is_file (True), name (str filename)"],"raw_output":["Dict","dictionary-like object with keys: name (str filename), data (base64 representation of file), is_file (bool, set to False)"],"serialized_input":["str","filepath or URL to file"],"serialized_output":["str","filepath or URL to 
file"]},"example_inputs":{"raw":{"is_file":false,"data":{"name":"test/test_files/sample_file.pdf","data":"data:@file/pdf;base64,..."}},"serialized":"https://github.com/gradio-app/gradio/raw/main/test/test_files/sample_file.pdf"}},{"id":119,"type":"form","props":{"type":"form","visible":true,"style":{}}},{"id":120,"type":"row","props":{"type":"row","variant":"default","visible":true,"style":{}}},{"id":121,"type":"button","props":{"value":"S3UP","variant":"secondary","interactive":true,"name":"button","visible":true,"style":{}},"serializer":"Serializable","api_info":{"raw_input":["str","string value"],"raw_output":["str","string value"],"serialized_input":["str","string value"],"serialized_output":["str","string value"]},"example_inputs":{"raw":"Howdy!","serialized":"Howdy!"}},{"id":122,"type":"textbox","props":{"lines":1,"max_lines":20,"value":"","type":"text","label":"S3UP result","show_label":true,"name":"textbox","visible":true,"style":{}},"serializer":"Serializable","api_info":{"raw_input":["str","string value"],"raw_output":["str","string value"],"serialized_input":["str","string value"],"serialized_output":["str","string value"]},"example_inputs":{"raw":"Howdy!","serialized":"Howdy!"}},{"id":123,"type":"form","props":{"type":"form","visible":true,"style":{}}},{"id":124,"type":"button","props":{"value":"Dark Mode","variant":"primary","interactive":true,"name":"button","visible":true,"style":{"size":"sm"}},"serializer":"Serializable","api_info":{"raw_input":["str","string value"],"raw_output":["str","string value"],"serialized_input":["str","string value"],"serialized_output":["str","string value"]},"example_inputs":{"raw":"Howdy!","serialized":"Howdy!"}}],"css":"footer {visibility: hidden;}\n    body{background:linear-gradient(#f5f5f5,#e5e5e5);}\n    body.dark{background:linear-gradient(#000000,#0d0d0d);}\n    
","title":"h2oGPT","is_space":false,"enable_queue":true,"show_error":true,"show_api":false,"is_colab":false,"stylesheets":["https://fonts.googleapis.com/css2?family=Montserrat:wght@400;600\u0026display=swap","https://fonts.googleapis.com/css2?family=IBM+Plex+Mono:wght@400;600\u0026display=swap"],"root":"","theme":"soft","layout":{"id":0,"children":[{"id":1},{"id":2},{"id":3},{"id":4},{"id":5},{"id":6},{"id":7,"children":[{"id":8,"children":[{"id":9,"children":[{"id":10,"children":[{"id":19,"children":[{"id":11},{"id":12},{"id":13}]},{"id":14},{"id":15},{"id":16,"children":[{"id":18,"children":[{"id":17}]}]}]},{"id":20,"children":[{"id":21,"children":[{"id":22},{"id":23}]},{"id":24,"children":[{"id":25,"children":[{"id":27,"children":[{"id":26}]}]},{"id":28,"children":[{"id":29},{"id":30}]}]},{"id":31,"children":[{"id":32},{"id":33},{"id":34,"children":[{"id":37,"children":[{"id":35},{"id":36}]}]},{"id":38},{"id":39}]}]}]},{"id":40,"children":[{"id":41,"children":[]}]},{"id":42,"children":[{"id":43,"children":[{"id":44,"children":[{"id":62,"children":[{"id":45}]},{"id":46},{"id":47},{"id":63,"children":[{"id":48},{"id":49},{"id":50},{"id":51},{"id":52},{"id":53},{"id":54},{"id":55},{"id":56},{"id":57},{"id":58},{"id":59},{"id":60},{"id":61}]}]}]}]},{"id":64,"children":[{"id":103,"children":[{"id":65}]},{"id":66,"children":[{"id":67,"children":[{"id":68,"children":[{"id":69,"children":[{"id":70},{"id":71}]},{"id":72,"children":[{"id":73},{"id":79,"children":[{"id":74},{"id":75}]},{"id":76},{"id":80,"children":[{"id":77},{"id":78}]}]}]},{"id":81,"children":[{"id":82,"children":[{"id":85,"children":[{"id":83},{"id":84}]}]},{"id":86,"children":[{"id":87},{"id":88}]}]}]},{"id":89,"children":[{"id":90,"children":[{"id":91,"children":[{"id":92},{"id":93}]},{"id":94,"children":[{"id":95},{"id":101,"children":[{"id":96},{"id":97}]},{"id":98},{"id":102,"children":[{"id":99},{"id":100}]}]}]}]}]}]},{"id":104,"children":[{"id":105,"children":[{"id":108,"children":[{"id":106}]},{"
id":107}]},{"id":109,"children":[{"id":110,"children":[{"id":111,"children":[{"id":112},{"id":114,"children":[{"id":113}]}]},{"id":115,"children":[{"id":116},{"id":119,"children":[{"id":117}]},{"id":118}]},{"id":120,"children":[{"id":121},{"id":123,"children":[{"id":122}]}]}]}]}]}]}]},{"id":124}]},"dependencies":[{"targets":[116],"trigger":"click","inputs":[],"outputs":[118,117],"backend_fn":true,"js":null,"queue":null,"api_name":null,"scroll_to_output":false,"show_progress":true,"every":null,"batch":false,"max_batch_size":4,"cancels":[],"types":{"continuous":false,"generator":false},"collects_event_data":false,"trigger_after":null,"trigger_only_on_success":false},{"targets":[121],"trigger":"click","inputs":[117],"outputs":[122],"backend_fn":true,"js":null,"queue":null,"api_name":null,"scroll_to_output":false,"show_progress":true,"every":null,"batch":false,"max_batch_size":4,"cancels":[],"types":{"continuous":false,"generator":false},"collects_event_data":false,"trigger_after":null,"trigger_only_on_success":false},{"targets":[107],"trigger":"click","inputs":[106],"outputs":[109],"backend_fn":true,"js":null,"queue":null,"api_name":null,"scroll_to_output":false,"show_progress":true,"every":null,"batch":false,"max_batch_size":4,"cancels":[],"types":{"continuous":false,"generator":false},"collects_event_data":false,"trigger_after":null,"trigger_only_on_success":false},{"targets":[107],"trigger":"then","inputs":[106],"outputs":[105],"backend_fn":true,"js":null,"queue":null,"api_name":null,"scroll_to_output":false,"show_progress":true,"every":null,"batch":false,"max_batch_size":4,"cancels":[],"types":{"continuous":false,"generator":false},"collects_event_data":false,"trigger_after":2,"trigger_only_on_success":false},{"targets":[124],"trigger":"click","inputs":[],"outputs":[],"backend_fn":false,"js":"() =\u003e {\n        if (document.querySelectorAll(\u0027.dark\u0027).length) {\n            document.querySelectorAll(\u0027.dark\u0027).forEach(el =\u003e 
el.classList.remove(\u0027dark\u0027));\n        } else {\n            document.querySelector(\u0027body\u0027).classList.add(\u0027dark\u0027);\n        }\n    }","queue":false,"api_name":"dark","scroll_to_output":false,"show_progress":true,"every":null,"batch":false,"max_batch_size":4,"cancels":[],"types":{"continuous":false,"generator":false},"collects_event_data":false,"trigger_after":null,"trigger_only_on_success":false},{"targets":[61],"trigger":"select","inputs":[61],"outputs":[10],"backend_fn":true,"js":null,"queue":null,"api_name":"chat_checkbox","scroll_to_output":false,"show_progress":true,"every":null,"batch":false,"max_batch_size":4,"cancels":[],"types":{"continuous":false,"generator":false},"collects_event_data":false,"trigger_after":null,"trigger_only_on_success":false},{"targets":[61],"trigger":"then","inputs":[61],"outputs":[20],"backend_fn":true,"js":null,"queue":null,"api_name":null,"scroll_to_output":false,"show_progress":true,"every":null,"batch":false,"max_batch_size":4,"cancels":[],"types":{"continuous":false,"generator":false},"collects_event_data":false,"trigger_after":5,"trigger_only_on_success":false},{"targets":[61],"trigger":"then","inputs":[61],"outputs":[60],"backend_fn":true,"js":null,"queue":null,"api_name":null,"scroll_to_output":false,"show_progress":true,"every":null,"batch":false,"max_batch_size":4,"cancels":[],"types":{"continuous":false,"generator":false},"collects_event_data":false,"trigger_after":6,"trigger_only_on_success":false},{"targets":[26],"trigger":"submit","inputs":[26,59,60,45,46,49,50,51,52,53,54,55,56,57,58,48,61,12,13,22],"outputs":[22],"backend_fn":true,"js":null,"queue":"checkbox","api_name":"instruction","scroll_to_output":false,"show_progress":true,"every":null,"batch":false,"max_batch_size":4,"cancels":[],"types":{"continuous":false,"generator":false},"collects_event_data":false,"trigger_after":null,"trigger_only_on_success":false},{"targets":[26],"trigger":"then","inputs":[26,59,60,45,46,49,50,51,52,53,54,5
5,56,57,58,48,61,12,13,23],"outputs":[23],"backend_fn":true,"js":null,"queue":"checkbox","api_name":"instruction2","scroll_to_output":false,"show_progress":true,"every":null,"batch":false,"max_batch_size":4,"cancels":[],"types":{"continuous":false,"generator":false},"collects_event_data":false,"trigger_after":8,"trigger_only_on_success":false},{"targets":[26],"trigger":"then","inputs":[],"outputs":[26],"backend_fn":true,"js":null,"queue":null,"api_name":null,"scroll_to_output":false,"show_progress":true,"every":null,"batch":false,"max_batch_size":4,"cancels":[],"types":{"continuous":false,"generator":false},"collects_event_data":false,"trigger_after":9,"trigger_only_on_success":false},{"targets":[26],"trigger":"then","inputs":[26,59,60,45,46,49,50,51,52,53,54,55,56,57,58,48,61,12,13,1,22],"outputs":[22],"backend_fn":true,"js":null,"queue":null,"api_name":"instruction_bot","scroll_to_output":false,"show_progress":true,"every":null,"batch":false,"max_batch_size":4,"cancels":[],"types":{"continuous":false,"generator":true},"collects_event_data":false,"trigger_after":10,"trigger_only_on_success":false},{"targets":[26],"trigger":"then","inputs":[26,59,60,45,46,49,50,51,52,53,54,55,56,57,58,48,61,12,13,22],"outputs":[35],"backend_fn":true,"js":null,"queue":null,"api_name":"instruction_bot_score","scroll_to_output":false,"show_progress":true,"every":null,"batch":false,"max_batch_size":4,"cancels":[],"types":{"continuous":false,"generator":false},"collects_event_data":false,"trigger_after":11,"trigger_only_on_success":false},{"targets":[26],"trigger":"then","inputs":[26,59,60,45,46,49,50,51,52,53,54,55,56,57,58,48,61,12,13,2,23],"outputs":[23],"backend_fn":true,"js":null,"queue":null,"api_name":"instruction_bot2","scroll_to_output":false,"show_progress":true,"every":null,"batch":false,"max_batch_size":4,"cancels":[],"types":{"continuous":false,"generator":true},"collects_event_data":false,"trigger_after":12,"trigger_only_on_success":false},{"targets":[26],"trigger":"then","
inputs":[26,59,60,45,46,49,50,51,52,53,54,55,56,57,58,48,61,12,13,23],"outputs":[36],"backend_fn":true,"js":null,"queue":null,"api_name":"instruction_bot_score2","scroll_to_output":false,"show_progress":true,"every":null,"batch":false,"max_batch_size":4,"cancels":[],"types":{"continuous":false,"generator":false},"collects_event_data":false,"trigger_after":13,"trigger_only_on_success":false},{"targets":[26],"trigger":"then","inputs":[],"outputs":[],"backend_fn":true,"js":null,"queue":null,"api_name":null,"scroll_to_output":false,"show_progress":true,"every":null,"batch":false,"max_batch_size":4,"cancels":[],"types":{"continuous":false,"generator":false},"collects_event_data":false,"trigger_after":14,"trigger_only_on_success":false},{"targets":[29],"trigger":"click","inputs":[26,59,60,45,46,49,50,51,52,53,54,55,56,57,58,48,61,12,13,22],"outputs":[22],"backend_fn":true,"js":null,"queue":"checkbox","api_name":"submit","scroll_to_output":false,"show_progress":true,"every":null,"batch":false,"max_batch_size":4,"cancels":[],"types":{"continuous":false,"generator":false},"collects_event_data":false,"trigger_after":null,"trigger_only_on_success":false},{"targets":[29],"trigger":"then","inputs":[26,59,60,45,46,49,50,51,52,53,54,55,56,57,58,48,61,12,13,23],"outputs":[23],"backend_fn":true,"js":null,"queue":"checkbox","api_name":"submit2","scroll_to_output":false,"show_progress":true,"every":null,"batch":false,"max_batch_size":4,"cancels":[],"types":{"continuous":false,"generator":false},"collects_event_data":false,"trigger_after":16,"trigger_only_on_success":false},{"targets":[29],"trigger":"then","inputs":[26,59,60,45,46,49,50,51,52,53,54,55,56,57,58,48,61,12,13,1,22],"outputs":[22],"backend_fn":true,"js":null,"queue":null,"api_name":"submit_bot","scroll_to_output":false,"show_progress":true,"every":null,"batch":false,"max_batch_size":4,"cancels":[],"types":{"continuous":false,"generator":true},"collects_event_data":false,"trigger_after":17,"trigger_only_on_success":false},{"
targets":[29],"trigger":"then","inputs":[],"outputs":[26],"backend_fn":true,"js":null,"queue":null,"api_name":null,"scroll_to_output":false,"show_progress":true,"every":null,"batch":false,"max_batch_size":4,"cancels":[],"types":{"continuous":false,"generator":false},"collects_event_data":false,"trigger_after":18,"trigger_only_on_success":false},{"targets":[29],"trigger":"then","inputs":[26,59,60,45,46,49,50,51,52,53,54,55,56,57,58,48,61,12,13,22],"outputs":[35],"backend_fn":true,"js":null,"queue":null,"api_name":"submit_bot_score","scroll_to_output":false,"show_progress":true,"every":null,"batch":false,"max_batch_size":4,"cancels":[],"types":{"continuous":false,"generator":false},"collects_event_data":false,"trigger_after":19,"trigger_only_on_success":false},{"targets":[29],"trigger":"then","inputs":[26,59,60,45,46,49,50,51,52,53,54,55,56,57,58,48,61,12,13,2,23],"outputs":[23],"backend_fn":true,"js":null,"queue":null,"api_name":"submit_bot2","scroll_to_output":false,"show_progress":true,"every":null,"batch":false,"max_batch_size":4,"cancels":[],"types":{"continuous":false,"generator":true},"collects_event_data":false,"trigger_after":20,"trigger_only_on_success":false},{"targets":[29],"trigger":"then","inputs":[26,59,60,45,46,49,50,51,52,53,54,55,56,57,58,48,61,12,13,23],"outputs":[36],"backend_fn":true,"js":null,"queue":null,"api_name":"submit_bot_score2","scroll_to_output":false,"show_progress":true,"every":null,"batch":false,"max_batch_size":4,"cancels":[],"types":{"continuous":false,"generator":false},"collects_event_data":false,"trigger_after":21,"trigger_only_on_success":false},{"targets":[29],"trigger":"then","inputs":[],"outputs":[],"backend_fn":true,"js":null,"queue":null,"api_name":null,"scroll_to_output":false,"show_progress":true,"every":null,"batch":false,"max_batch_size":4,"cancels":[],"types":{"continuous":false,"generator":false},"collects_event_data":false,"trigger_after":22,"trigger_only_on_success":false},{"targets":[38],"trigger":"click","inputs":
[26,59,60,45,46,49,50,51,52,53,54,55,56,57,58,48,61,12,13,22],"outputs":[22],"backend_fn":true,"js":null,"queue":"checkbox","api_name":"retry","scroll_to_output":false,"show_progress":true,"every":null,"batch":false,"max_batch_size":4,"cancels":[],"types":{"continuous":false,"generator":false},"collects_event_data":false,"trigger_after":null,"trigger_only_on_success":false},{"targets":[38],"trigger":"then","inputs":[26,59,60,45,46,49,50,51,52,53,54,55,56,57,58,48,61,12,13,23],"outputs":[23],"backend_fn":true,"js":null,"queue":"checkbox","api_name":"retry2","scroll_to_output":false,"show_progress":true,"every":null,"batch":false,"max_batch_size":4,"cancels":[],"types":{"continuous":false,"generator":false},"collects_event_data":false,"trigger_after":24,"trigger_only_on_success":false},{"targets":[38],"trigger":"then","inputs":[],"outputs":[26],"backend_fn":true,"js":null,"queue":null,"api_name":null,"scroll_to_output":false,"show_progress":true,"every":null,"batch":false,"max_batch_size":4,"cancels":[],"types":{"continuous":false,"generator":false},"collects_event_data":false,"trigger_after":25,"trigger_only_on_success":false},{"targets":[38],"trigger":"then","inputs":[26,59,60,45,46,49,50,51,52,53,54,55,56,57,58,48,61,12,13,1,22],"outputs":[22],"backend_fn":true,"js":null,"queue":null,"api_name":"retry_bot","scroll_to_output":false,"show_progress":true,"every":null,"batch":false,"max_batch_size":4,"cancels":[],"types":{"continuous":false,"generator":true},"collects_event_data":false,"trigger_after":26,"trigger_only_on_success":false},{"targets":[38],"trigger":"then","inputs":[26,59,60,45,46,49,50,51,52,53,54,55,56,57,58,48,61,12,13,22],"outputs":[35],"backend_fn":true,"js":null,"queue":null,"api_name":"retry_bot_score","scroll_to_output":false,"show_progress":true,"every":null,"batch":false,"max_batch_size":4,"cancels":[],"types":{"continuous":false,"generator":false},"collects_event_data":false,"trigger_after":27,"trigger_only_on_success":false},{"targets":[38],"tr
igger":"then","inputs":[26,59,60,45,46,49,50,51,52,53,54,55,56,57,58,48,61,12,13,2,23],"outputs":[23],"backend_fn":true,"js":null,"queue":null,"api_name":"retry_bot2","scroll_to_output":false,"show_progress":true,"every":null,"batch":false,"max_batch_size":4,"cancels":[],"types":{"continuous":false,"generator":true},"collects_event_data":false,"trigger_after":28,"trigger_only_on_success":false},{"targets":[38],"trigger":"then","inputs":[26,59,60,45,46,49,50,51,52,53,54,55,56,57,58,48,61,12,13,23],"outputs":[36],"backend_fn":true,"js":null,"queue":null,"api_name":"retry_bot_score2","scroll_to_output":false,"show_progress":true,"every":null,"batch":false,"max_batch_size":4,"cancels":[],"types":{"continuous":false,"generator":false},"collects_event_data":false,"trigger_after":29,"trigger_only_on_success":false},{"targets":[38],"trigger":"then","inputs":[],"outputs":[],"backend_fn":true,"js":null,"queue":null,"api_name":null,"scroll_to_output":false,"show_progress":true,"every":null,"batch":false,"max_batch_size":4,"cancels":[],"types":{"continuous":false,"generator":false},"collects_event_data":false,"trigger_after":30,"trigger_only_on_success":false},{"targets":[39],"trigger":"click","inputs":[26,59,60,45,46,49,50,51,52,53,54,55,56,57,58,48,61,12,13,22],"outputs":[22],"backend_fn":true,"js":null,"queue":"checkbox","api_name":"undo","scroll_to_output":false,"show_progress":true,"every":null,"batch":false,"max_batch_size":4,"cancels":[],"types":{"continuous":false,"generator":false},"collects_event_data":false,"trigger_after":null,"trigger_only_on_success":false},{"targets":[39],"trigger":"then","inputs":[26,59,60,45,46,49,50,51,52,53,54,55,56,57,58,48,61,12,13,22],"outputs":[35],"backend_fn":true,"js":null,"queue":null,"api_name":"undo_score","scroll_to_output":false,"show_progress":true,"every":null,"batch":false,"max_batch_size":4,"cancels":[],"types":{"continuous":false,"generator":false},"collects_event_data":false,"trigger_after":32,"trigger_only_on_success":false
},{"targets":[39],"trigger":"then","inputs":[26,59,60,45,46,49,50,51,52,53,54,55,56,57,58,48,61,12,13,23],"outputs":[23],"backend_fn":true,"js":null,"queue":"checkbox","api_name":"undo2","scroll_to_output":false,"show_progress":true,"every":null,"batch":false,"max_batch_size":4,"cancels":[],"types":{"continuous":false,"generator":false},"collects_event_data":false,"trigger_after":33,"trigger_only_on_success":false},{"targets":[39],"trigger":"then","inputs":[26,59,60,45,46,49,50,51,52,53,54,55,56,57,58,48,61,12,13,23],"outputs":[36],"backend_fn":true,"js":null,"queue":null,"api_name":"undo_score2","scroll_to_output":false,"show_progress":true,"every":null,"batch":false,"max_batch_size":4,"cancels":[],"types":{"continuous":false,"generator":false},"collects_event_data":false,"trigger_after":34,"trigger_only_on_success":false},{"targets":[39],"trigger":"then","inputs":[],"outputs":[26],"backend_fn":true,"js":null,"queue":null,"api_name":null,"scroll_to_output":false,"show_progress":true,"every":null,"batch":false,"max_batch_size":4,"cancels":[],"types":{"continuous":false,"generator":false},"collects_event_data":false,"trigger_after":35,"trigger_only_on_success":false},{"targets":[32],"trigger":"click","inputs":[],"outputs":[22],"backend_fn":true,"js":null,"queue":false,"api_name":"clear","scroll_to_output":false,"show_progress":true,"every":null,"batch":false,"max_batch_size":4,"cancels":[],"types":{"continuous":false,"generator":false},"collects_event_data":false,"trigger_after":null,"trigger_only_on_success":false},{"targets":[32],"trigger":"then","inputs":[],"outputs":[23],"backend_fn":true,"js":null,"queue":false,"api_name":"clear2","scroll_to_output":false,"show_progress":true,"every":null,"batch":false,"max_batch_size":4,"cancels":[],"types":{"continuous":false,"generator":false},"collects_event_data":false,"trigger_after":37,"trigger_only_on_success":false},{"targets":[14],"trigger":"click","inputs":[1,26,59,60,45,46,49,50,51,52,53,54,55,56,57,58,48,61,12,13],"
outputs":[11],"backend_fn":true,"js":null,"queue":null,"api_name":"submit_nochat","scroll_to_output":false,"show_progress":true,"every":null,"batch":false,"max_batch_size":4,"cancels":[],"types":{"continuous":false,"generator":true},"collects_event_data":false,"trigger_after":null,"trigger_only_on_success":false},{"targets":[14],"trigger":"then","inputs":[26,59,60,45,46,49,50,51,52,53,54,55,56,57,58,48,61,12,13,11],"outputs":[17],"backend_fn":true,"js":null,"queue":null,"api_name":"instruction_bot_score_nochat","scroll_to_output":false,"show_progress":true,"every":null,"batch":false,"max_batch_size":4,"cancels":[],"types":{"continuous":false,"generator":false},"collects_event_data":false,"trigger_after":39,"trigger_only_on_success":false},{"targets":[14],"trigger":"then","inputs":[],"outputs":[],"backend_fn":true,"js":null,"queue":null,"api_name":null,"scroll_to_output":false,"show_progress":true,"every":null,"batch":false,"max_batch_size":4,"cancels":[],"types":{"continuous":false,"generator":false},"collects_event_data":false,"trigger_after":40,"trigger_only_on_success":false},{"targets":[73],"trigger":"click","inputs":[70,71,1,46,74,75,76],"outputs":[1,77,78,46],"backend_fn":true,"js":null,"queue":null,"api_name":null,"scroll_to_output":false,"show_progress":true,"every":null,"batch":false,"max_batch_size":4,"cancels":[],"types":{"continuous":false,"generator":false},"collects_event_data":false,"trigger_after":null,"trigger_only_on_success":false},{"targets":[73],"trigger":"then","inputs":[46],"outputs":[46],"backend_fn":true,"js":null,"queue":null,"api_name":null,"scroll_to_output":false,"show_progress":true,"every":null,"batch":false,"max_batch_size":4,"cancels":[],"types":{"continuous":false,"generator":false},"collects_event_data":false,"trigger_after":42,"trigger_only_on_success":false},{"targets":[73],"trigger":"then","inputs":[22,77],"outputs":[22],"backend_fn":true,"js":null,"queue":null,"api_name":null,"scroll_to_output":false,"show_progress":true,"every
":null,"batch":false,"max_batch_size":4,"cancels":[],"types":{"continuous":false,"generator":false},"collects_event_data":false,"trigger_after":43,"trigger_only_on_success":false},{"targets":[73],"trigger":"then","inputs":[22,77],"outputs":[11],"backend_fn":true,"js":null,"queue":null,"api_name":null,"scroll_to_output":false,"show_progress":true,"every":null,"batch":false,"max_batch_size":4,"cancels":[],"types":{"continuous":false,"generator":false},"collects_event_data":false,"trigger_after":44,"trigger_only_on_success":false},{"targets":[73],"trigger":"then","inputs":[],"outputs":[],"backend_fn":true,"js":null,"queue":null,"api_name":null,"scroll_to_output":false,"show_progress":true,"every":null,"batch":false,"max_batch_size":4,"cancels":[],"types":{"continuous":false,"generator":false},"collects_event_data":false,"trigger_after":45,"trigger_only_on_success":false},{"targets":[95],"trigger":"click","inputs":[92,93,2,47,96,97,98],"outputs":[2,99,100,47],"backend_fn":true,"js":null,"queue":null,"api_name":null,"scroll_to_output":false,"show_progress":true,"every":null,"batch":false,"max_batch_size":4,"cancels":[],"types":{"continuous":false,"generator":false},"collects_event_data":false,"trigger_after":null,"trigger_only_on_success":false},{"targets":[95],"trigger":"then","inputs":[47],"outputs":[47],"backend_fn":true,"js":null,"queue":null,"api_name":null,"scroll_to_output":false,"show_progress":true,"every":null,"batch":false,"max_batch_size":4,"cancels":[],"types":{"continuous":false,"generator":false},"collects_event_data":false,"trigger_after":47,"trigger_only_on_success":false},{"targets":[95],"trigger":"then","inputs":[23,99],"outputs":[23],"backend_fn":true,"js":null,"queue":null,"api_name":null,"scroll_to_output":false,"show_progress":true,"every":null,"batch":false,"max_batch_size":4,"cancels":[],"types":{"continuous":false,"generator":false},"collects_event_data":false,"trigger_after":48,"trigger_only_on_success":false},{"targets":[95],"trigger":"then","
inputs":[],"outputs":[],"backend_fn":true,"js":null,"queue":null,"api_name":null,"scroll_to_output":false,"show_progress":true,"every":null,"batch":false,"max_batch_size":4,"cancels":[],"types":{"continuous":false,"generator":false},"collects_event_data":false,"trigger_after":49,"trigger_only_on_success":false},{"targets":[87],"trigger":"click","inputs":[3,83],"outputs":[70,92,83,3],"backend_fn":true,"js":null,"queue":null,"api_name":null,"scroll_to_output":false,"show_progress":true,"every":null,"batch":false,"max_batch_size":4,"cancels":[],"types":{"continuous":false,"generator":false},"collects_event_data":false,"trigger_after":null,"trigger_only_on_success":false},{"targets":[88],"trigger":"click","inputs":[4,84,77,78,99,100],"outputs":[71,93,84,4],"backend_fn":true,"js":null,"queue":null,"api_name":null,"scroll_to_output":false,"show_progress":true,"every":null,"batch":false,"max_batch_size":4,"cancels":[],"types":{"continuous":false,"generator":false},"collects_event_data":false,"trigger_after":null,"trigger_only_on_success":false},{"targets":[6],"trigger":"click","inputs":[],"outputs":[6],"backend_fn":true,"js":null,"queue":null,"api_name":"go","scroll_to_output":false,"show_progress":true,"every":null,"batch":false,"max_batch_size":4,"cancels":[],"types":{"continuous":false,"generator":false},"collects_event_data":false,"trigger_after":null,"trigger_only_on_success":false},{"targets":[6],"trigger":"then","inputs":[],"outputs":[7],"backend_fn":true,"js":null,"queue":null,"api_name":null,"scroll_to_output":false,"show_progress":true,"every":null,"batch":false,"max_batch_size":4,"cancels":[],"types":{"continuous":false,"generator":false},"collects_event_data":false,"trigger_after":53,"trigger_only_on_success":false},{"targets":[6],"trigger":"then","inputs":[70,71,1,46,74,75,76],"outputs":[1,77,78,46],"backend_fn":true,"js":null,"queue":null,"api_name":null,"scroll_to_output":false,"show_progress":true,"every":null,"batch":false,"max_batch_size":4,"cancels":[],"
types":{"continuous":false,"generator":false},"collects_event_data":false,"trigger_after":54,"trigger_only_on_success":false},{"targets":[6],"trigger":"then","inputs":[46],"outputs":[46],"backend_fn":true,"js":null,"queue":null,"api_name":null,"scroll_to_output":false,"show_progress":true,"every":null,"batch":false,"max_batch_size":4,"cancels":[],"types":{"continuous":false,"generator":false},"collects_event_data":false,"trigger_after":55,"trigger_only_on_success":false},{"targets":[65],"trigger":"select","inputs":[65],"outputs":[23],"backend_fn":true,"js":null,"queue":null,"api_name":"compare_checkbox","scroll_to_output":false,"show_progress":true,"every":null,"batch":false,"max_batch_size":4,"cancels":[],"types":{"continuous":false,"generator":false},"collects_event_data":false,"trigger_after":null,"trigger_only_on_success":false},{"targets":[65],"trigger":"then","inputs":[65],"outputs":[89],"backend_fn":true,"js":null,"queue":null,"api_name":null,"scroll_to_output":false,"show_progress":true,"every":null,"batch":false,"max_batch_size":4,"cancels":[],"types":{"continuous":false,"generator":false},"collects_event_data":false,"trigger_after":57,"trigger_only_on_success":false},{"targets":[65],"trigger":"then","inputs":[65],"outputs":[47],"backend_fn":true,"js":null,"queue":null,"api_name":null,"scroll_to_output":false,"show_progress":true,"every":null,"batch":false,"max_batch_size":4,"cancels":[],"types":{"continuous":false,"generator":false},"collects_event_data":false,"trigger_after":58,"trigger_only_on_success":false},{"targets":[65],"trigger":"then","inputs":[65],"outputs":[36],"backend_fn":true,"js":null,"queue":null,"api_name":null,"scroll_to_output":false,"show_progress":true,"every":null,"batch":false,"max_batch_size":4,"cancels":[],"types":{"continuous":false,"generator":false},"collects_event_data":false,"trigger_after":59,"trigger_only_on_success":false},{"targets":[33],"trigger":"click","inputs":[26,59,60,45,46,49,50,51,52,53,54,55,56,57,58,48,61,12,13,2
2],"outputs":[],"backend_fn":true,"js":null,"queue":null,"api_name":"flag","scroll_to_output":false,"show_progress":true,"every":null,"batch":false,"max_batch_size":4,"cancels":[],"types":{"continuous":false,"generator":false},"collects_event_data":false,"trigger_after":null,"trigger_only_on_success":false},{"targets":[15],"trigger":"click","inputs":[26,59,60,45,46,49,50,51,52,53,54,55,56,57,58,48,61,12,13,22],"outputs":[],"backend_fn":true,"js":null,"queue":null,"api_name":"flag_nochat","scroll_to_output":false,"show_progress":true,"every":null,"batch":false,"max_batch_size":4,"cancels":[],"types":{"continuous":false,"generator":false},"collects_event_data":false,"trigger_after":null,"trigger_only_on_success":false},{"targets":[112],"trigger":"click","inputs":[],"outputs":[113],"backend_fn":true,"js":null,"queue":null,"api_name":"system_info","scroll_to_output":false,"show_progress":true,"every":null,"batch":false,"max_batch_size":4,"cancels":[],"types":{"continuous":false,"generator":false},"collects_event_data":false,"trigger_after":null,"trigger_only_on_success":false},{"targets":[30],"trigger":"click","inputs":[],"outputs":[],"backend_fn":true,"js":null,"queue":false,"api_name":"stop","scroll_to_output":false,"show_progress":true,"every":null,"batch":false,"max_batch_size":4,"cancels":[],"types":{"continuous":false,"generator":false},"collects_event_data":false,"trigger_after":null,"trigger_only_on_success":false},{"targets":[30],"trigger":"click","inputs":[],"outputs":[],"backend_fn":true,"js":null,"queue":false,"api_name":null,"scroll_to_output":false,"show_progress":true,"every":null,"batch":false,"max_batch_size":4,"cancels":[41,15,23,31],"types":{"continuous":false,"generator":false},"collects_event_data":false,"trigger_after":null,"trigger_only_on_success":false},{"targets":[30],"trigger":"then","inputs":[],"outputs":[],"backend_fn":true,"js":null,"queue":null,"api_name":null,"scroll_to_output":false,"show_progress":true,"every":null,"batch":false,"max_ba
tch_size":4,"cancels":[],"types":{"continuous":false,"generator":false},"collects_event_data":false,"trigger_after":64,"trigger_only_on_success":false},{"targets":[],"trigger":"load","inputs":[],"outputs":[],"backend_fn":false,"js":"() =\u003e {\n        if (document.querySelectorAll(\u0027.dark\u0027).length) {\n            document.querySelectorAll(\u0027.dark\u0027).forEach(el =\u003e el.classList.remove(\u0027dark\u0027));\n        } else {\n            document.querySelector(\u0027body\u0027).classList.add(\u0027dark\u0027);\n        }\n    }","queue":false,"api_name":null,"scroll_to_output":false,"show_progress":true,"every":null,"batch":false,"max_batch_size":4,"cancels":[],"types":{"continuous":false,"generator":false},"collects_event_data":false,"trigger_after":null,"trigger_only_on_success":false}]};</script>

        <link rel="preconnect" href="https://fonts.googleapis.com" />
        <link
            rel="preconnect"
            href="https://fonts.gstatic.com"
            crossorigin="anonymous"
        />
        <script src="https://cdnjs.cloudflare.com/ajax/libs/iframe-resizer/4.3.1/iframeResizer.contentWindow.min.js"></script>
        <script type="module" crossorigin src="https://gradio.s3-us-west-2.amazonaws.com/3.27.0/assets/index-9405f928.js"></script>

    </head>

    <body
        style="
            width: 100%;
            margin: 0;
            padding: 0;
            display: flex;
            flex-direction: column;
            flex-grow: 1;
        "
    >
        <gradio-app
            control_page_title="true"
            embed="false"
            eager="true"
            style="display: flex; flex-direction: column; flex-grow: 1"
        >
        </gradio-app>
        <script>
            const ce = document.getElementsByTagName("gradio-app");
            if (ce[0]) {
                ce[0].addEventListener("domchange", () => {
                    document.body.style.padding = "0";
                });
                document.body.style.padding = "0";
            }
        </script>
    </body>
</html>

When this HTML is posted to the chat interface, gradio invokes matplotlib (via its LaTeX rendering path), which chokes; GPU memory usage stays high and does not recover.

WARNING: Special characters in prompt
Using pad_token, but it is not set yet.
Using pad_token, but it is not set yet.
Traceback (most recent call last):
  File "/nfs4/llm/h2o-llm/env/lib/python3.10/site-packages/gradio/routes.py", line 401, in run_predict
    output = await app.get_blocks().process_api(
  File "/nfs4/llm/h2o-llm/env/lib/python3.10/site-packages/gradio/blocks.py", line 1305, in process_api
    data = self.postprocess_data(fn_index, result["prediction"], state)
  File "/nfs4/llm/h2o-llm/env/lib/python3.10/site-packages/gradio/blocks.py", line 1239, in postprocess_data
    prediction_value = block.postprocess(prediction_value)
  File "/nfs4/llm/h2o-llm/env/lib/python3.10/site-packages/gradio/components.py", line 4626, in postprocess
    self._postprocess_chat_messages(message_pair[1]),
  File "/nfs4/llm/h2o-llm/env/lib/python3.10/site-packages/gradio/components.py", line 4599, in _postprocess_chat_messages
    return self.md.renderInline(chat_message)
  File "/nfs4/llm/h2o-llm/env/lib/python3.10/site-packages/markdown_it/main.py", line 299, in renderInline
    return self.renderer.render(self.parseInline(src, env), self.options, env)
  File "/nfs4/llm/h2o-llm/env/lib/python3.10/site-packages/markdown_it/renderer.py", line 87, in render
    result += self.renderInline(token.children, options, env)
  File "/nfs4/llm/h2o-llm/env/lib/python3.10/site-packages/markdown_it/renderer.py", line 108, in renderInline
    result += self.rules[token.type](tokens, i, options, env)
  File "/nfs4/llm/h2o-llm/env/lib/python3.10/site-packages/mdit_py_plugins/dollarmath/index.py", line 70, in render_math_inline
    content = _renderer(str(tokens[idx].content).strip(), {"display_mode": False})
  File "/nfs4/llm/h2o-llm/env/lib/python3.10/site-packages/gradio/utils.py", line 904, in tex2svg
    fig.savefig(
  File "/nfs4/llm/h2o-llm/env/lib/python3.10/site-packages/matplotlib/figure.py", line 3343, in savefig
    self.canvas.print_figure(fname, **kwargs)
  File "/nfs4/llm/h2o-llm/env/lib/python3.10/site-packages/matplotlib/backend_bases.py", line 2342, in print_figure
    self.figure.draw(renderer)
  File "/nfs4/llm/h2o-llm/env/lib/python3.10/site-packages/matplotlib/artist.py", line 95, in draw_wrapper
    result = draw(artist, renderer, *args, **kwargs)
  File "/nfs4/llm/h2o-llm/env/lib/python3.10/site-packages/matplotlib/artist.py", line 72, in draw_wrapper
    return draw(artist, renderer)
  File "/nfs4/llm/h2o-llm/env/lib/python3.10/site-packages/matplotlib/figure.py", line 3140, in draw
    mimage._draw_list_compositing_images(
  File "/nfs4/llm/h2o-llm/env/lib/python3.10/site-packages/matplotlib/image.py", line 131, in _draw_list_compositing_images
    a.draw(renderer)
  File "/nfs4/llm/h2o-llm/env/lib/python3.10/site-packages/matplotlib/artist.py", line 72, in draw_wrapper
    return draw(artist, renderer)
  File "/nfs4/llm/h2o-llm/env/lib/python3.10/site-packages/matplotlib/text.py", line 752, in draw
    bbox, info, descent = self._get_layout(renderer)
  File "/nfs4/llm/h2o-llm/env/lib/python3.10/site-packages/matplotlib/text.py", line 386, in _get_layout
    w, h, d = _get_text_metrics_with_cache(
  File "/nfs4/llm/h2o-llm/env/lib/python3.10/site-packages/matplotlib/text.py", line 97, in _get_text_metrics_with_cache
    return _get_text_metrics_with_cache_impl(
  File "/nfs4/llm/h2o-llm/env/lib/python3.10/site-packages/matplotlib/text.py", line 105, in _get_text_metrics_with_cache_impl
    return renderer_ref().get_text_width_height_descent(text, fontprop, ismath)
  File "/nfs4/llm/h2o-llm/env/lib/python3.10/site-packages/matplotlib/backends/backend_svg.py", line 1317, in get_text_width_height_descent
    return self._text2path.get_text_width_height_descent(s, prop, ismath)
  File "/nfs4/llm/h2o-llm/env/lib/python3.10/site-packages/matplotlib/textpath.py", line 60, in get_text_width_height_descent
    self.mathtext_parser.parse(s, 72, prop)
  File "/nfs4/llm/h2o-llm/env/lib/python3.10/site-packages/matplotlib/mathtext.py", line 226, in parse
    return self._parse_cached(s, dpi, prop)
  File "/nfs4/llm/h2o-llm/env/lib/python3.10/site-packages/matplotlib/mathtext.py", line 247, in _parse_cached
    box = self._parser.parse(s, fontset, fontsize, dpi)
  File "/nfs4/llm/h2o-llm/env/lib/python3.10/site-packages/matplotlib/_mathtext.py", line 1995, in parse
    raise ValueError("\n" + ParseException.explain(err, 0)) from None
ValueError: 
$%<br>Pres#$
^
ParseException: Expected end of text, found '$'  (at char 0), (line:1, col:1)
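The `ParseException` above comes from matplotlib's mathtext parser: an unescaped `$` in user-provided text toggles math mode, so rendering fails on strings like `$%<br>Pres#$`. A minimal sketch of a workaround, assuming the text is escaped before being handed to matplotlib (the helper name is hypothetical):

```python
def escape_mathtext(s: str) -> str:
    """Escape '$' so matplotlib does not interpret the text as mathtext."""
    return s.replace("$", r"\$")

# The offending string from the traceback above now renders as literal text:
print(escape_mathtext("$%<br>Pres#$"))  # -> \$%<br>Pres#\$
```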

Now even a page refresh doesn't release memory; each refresh adds about 4 GB.


arnocandel commented 1 year ago

7772ccd27fe8ed84cccc46685e0a6ba5f26f4c60 GPU OOM 3: the question then also fails in the eval (scoring cascade) step with:

Traceback (most recent call last):
  File "/nfs4/llm/h2ogpt/generate.py", line 1066, in score_qa
    score = torch.sigmoid(smodel(**inputs).logits[0]).cpu().detach().numpy()[0]
  File "/nfs4/llm/h2o-llm/env/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1501, in _call_impl
    return forward_call(*args, **kwargs)
  File "/nfs4/llm/h2o-llm/env/lib/python3.10/site-packages/accelerate/hooks.py", line 165, in new_forward
    output = old_forward(*args, **kwargs)
  File "/nfs4/llm/h2o-llm/env/lib/python3.10/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1311, in forward
    outputs = self.deberta(
  File "/nfs4/llm/h2o-llm/env/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1501, in _call_impl
    return forward_call(*args, **kwargs)
  File "/nfs4/llm/h2o-llm/env/lib/python3.10/site-packages/accelerate/hooks.py", line 165, in new_forward
    output = old_forward(*args, **kwargs)
  File "/nfs4/llm/h2o-llm/env/lib/python3.10/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1083, in forward
    encoder_outputs = self.encoder(
  File "/nfs4/llm/h2o-llm/env/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1501, in _call_impl
    return forward_call(*args, **kwargs)
  File "/nfs4/llm/h2o-llm/env/lib/python3.10/site-packages/accelerate/hooks.py", line 165, in new_forward
    output = old_forward(*args, **kwargs)
  File "/nfs4/llm/h2o-llm/env/lib/python3.10/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 521, in forward
    output_states = layer_module(
  File "/nfs4/llm/h2o-llm/env/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1501, in _call_impl
    return forward_call(*args, **kwargs)
  File "/nfs4/llm/h2o-llm/env/lib/python3.10/site-packages/accelerate/hooks.py", line 165, in new_forward
    output = old_forward(*args, **kwargs)
  File "/nfs4/llm/h2o-llm/env/lib/python3.10/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 362, in forward
    attention_output = self.attention(
  File "/nfs4/llm/h2o-llm/env/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1501, in _call_impl
    return forward_call(*args, **kwargs)
  File "/nfs4/llm/h2o-llm/env/lib/python3.10/site-packages/accelerate/hooks.py", line 165, in new_forward
    output = old_forward(*args, **kwargs)
  File "/nfs4/llm/h2o-llm/env/lib/python3.10/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 293, in forward
    self_output = self.self(
  File "/nfs4/llm/h2o-llm/env/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1501, in _call_impl
    return forward_call(*args, **kwargs)
  File "/nfs4/llm/h2o-llm/env/lib/python3.10/site-packages/accelerate/hooks.py", line 165, in new_forward
    output = old_forward(*args, **kwargs)
  File "/nfs4/llm/h2o-llm/env/lib/python3.10/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 728, in forward
    rel_att = self.disentangled_attention_bias(
  File "/nfs4/llm/h2o-llm/env/lib/python3.10/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 811, in disentangled_attention_bias
    score += c2p_att / scale.to(dtype=c2p_att.dtype)
torch.cuda.OutOfMemoryError: CUDA out of memory. Tried to allocate 256.00 MiB (GPU 0; 23.65 GiB total capacity; 21.65 GiB already allocated; 90.69 MiB free; 22.41 GiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation.  See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF
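The OOM message itself suggests `max_split_size_mb` when reserved memory far exceeds allocated memory. A hedged sketch of that workaround; the 128 MB value is an arbitrary example, not a tested setting, and the variable must be set before the first CUDA allocation:

```python
import os

# Cap the allocator's split size to reduce fragmentation, per the hint in the
# torch.cuda.OutOfMemoryError message. Set this before torch touches the GPU
# (e.g. at the top of generate.py, or export it in the shell instead).
os.environ["PYTORCH_CUDA_ALLOC_CONF"] = "max_split_size_mb:128"
```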
Traceback (most recent call last):
  File "/nfs4/llm/h2o-llm/env/lib/python3.10/site-packages/gradio/routes.py", line 401, in run_predict
    output = await app.get_blocks().process_api(
  File "/nfs4/llm/h2o-llm/env/lib/python3.10/site-packages/gradio/blocks.py", line 1302, in process_api
    result = await self.call_function(
  File "/nfs4/llm/h2o-llm/env/lib/python3.10/site-packages/gradio/blocks.py", line 1025, in call_function
    prediction = await anyio.to_thread.run_sync(
  File "/nfs4/llm/h2o-llm/env/lib/python3.10/site-packages/anyio/to_thread.py", line 31, in run_sync
    return await get_asynclib().run_sync_in_worker_thread(
  File "/nfs4/llm/h2o-llm/env/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 937, in run_sync_in_worker_thread
    return await future
  File "/nfs4/llm/h2o-llm/env/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 867, in run
    result = context.run(func, *args)
  File "/nfs4/llm/h2ogpt/gradio_runner.py", line 448, in score_last_response
    return 'Response Score: {:.1%}'.format(score)
ValueError: Unknown format code '%' for object of type 'str'
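This second `ValueError` is a cascade from the OOM: `score_qa` returned an error string instead of a float, and `'{:.1%}'.format(score)` in `score_last_response` cannot percent-format a string. A minimal defensive sketch (the `format_score` helper name is hypothetical, not from the repo):

```python
def format_score(score) -> str:
    # score is a float on the normal path, but may be an error string when
    # the scoring model hit OOM; only apply percent formatting to numbers.
    if isinstance(score, (int, float)):
        return 'Response Score: {:.1%}'.format(score)
    return 'Response Score: NA'

print(format_score(0.873))     # -> Response Score: 87.3%
print(format_score("GPU OOM"))  # -> Response Score: NA
```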


Afterwards, memory is held on to forever; it started at 15 GB (see screenshot).

arnocandel commented 1 year ago

Once the user clicks "New Conversation", the memory is freed. OK, good, so the leak only affects active sessions (see screenshot).

arnocandel commented 1 year ago

aa6b5e4a7dba73e504420ca89ae480b774d6d714 fixed the cascade issue.

arnocandel commented 1 year ago

Easiest way to repro: dump a massive website's source into the prompt, wait 5 s, abandon the page, come back, refresh, wait, etc. Memory will still be stuck at the higher level forever.
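Part of why long prompts and long generations inflate GPU memory is the KV cache, which grows linearly with sequence length. A rough back-of-envelope sketch, assuming the GPT-NeoX-20B shape underlying h2ogpt-oasst1-512-20b (44 layers, hidden size 6144, bf16 at 2 bytes per element; these numbers are assumptions for illustration):

```python
def kv_cache_bytes(n_tokens: int, n_layers: int = 44,
                   hidden: int = 6144, bytes_per_elem: int = 2) -> int:
    # Per token, the cache stores one key and one value vector per layer.
    return 2 * n_layers * hidden * bytes_per_elem * n_tokens

# A 3000-token generation holds roughly 3 GiB of cache spread across the GPUs,
# consistent with the growing, imbalanced usage seen in nvidia-smi above.
print(kv_cache_bytes(3000) / 2**30)
```

This is only the cache; activations and fragmentation add more on top, which is why usage keeps climbing and can stay pinned after a session is abandoned.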

pseudotensor commented 7 months ago

https://github.com/h2oai/h2ogpt/pull/1407