-
Hi all,
I have fine-tuned a 9-class text classifier using [Flair](https://github.com/flairNLP/flair) together with these multilingual [sentence-embeddings](https://huggingface.co/sent…
-
### Chapter 1: Introduction
---
On page 2:
> "Because of this complex interaction of many interested parties and
> forces and the constant evolution of change, it seems to me that the
> t…
-
Hello, sorry to bother you again, but I ran into some more issues while verifying assumptions.
_ASV_RCM_2_lin
-
Hi, thanks for sharing the great work!
I would like to use this metric to evaluate results on the image-to-image translation task. However, in I2I datasets, the number of real samples is always …
-
In section '5.9.3.3 Estimating the Shapley Value':
https://github.com/christophM/interpretable-ml-book/blob/4759479f5d4287eaf353ab9db6cbeee506e54046/manuscript/05.9-agnostic-shapley.Rmd#L320-L321
…
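For context, section 5.9.3.3 describes a Monte Carlo approximation of the Shapley value. A minimal sketch of that idea (not the book's own code; the model `f`, the data, and all numbers below are illustrative): average the marginal contribution of feature j over random permutations, filling "absent" features from a random background instance z.

```python
import random

def shapley_estimate(f, x, data, j, n_iter=2000, rng=None):
    """Monte Carlo estimate of the Shapley value of feature j for instance x."""
    rng = rng or random.Random(0)
    n_features = len(x)
    total = 0.0
    for _ in range(n_iter):
        z = rng.choice(data)              # random background instance
        order = list(range(n_features))
        rng.shuffle(order)                # random feature permutation
        pos = order.index(j)
        # x_plus: features preceding j (in the permutation) taken from x,
        # the rest from z, and feature j from x; x_minus: same but j from z.
        x_plus = list(z)
        x_minus = list(z)
        for k in order[:pos]:
            x_plus[k] = x[k]
            x_minus[k] = x[k]
        x_plus[j] = x[j]
        total += f(x_plus) - f(x_minus)
    return total / n_iter

# Sanity check: for a linear model f(v) = sum(w_k * v_k), the Shapley value
# of feature j is exactly w_j * (x_j - mean of feature j over the data).
w = [2.0, -1.0, 0.5]
f = lambda v: sum(wk * vk for wk, vk in zip(w, v))
data = [[0.0, 0.0, 0.0], [2.0, 2.0, 2.0]]   # each feature has mean 1.0
x = [3.0, 1.0, 2.0]
phi0 = shapley_estimate(f, x, data, j=0)    # close to 2.0 * (3.0 - 1.0) = 4.0
```

For a linear model this converges to the exact value, which makes it a convenient correctness check before applying the estimator to a black-box model.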
-
I am running 'captum' on OS X 10.11.6 (also Ubuntu 16.04LTS).
The example 'python -m captum.insights.example' gets an Internal Server Error when I try
to connect to http://localhost:51283/ with Saf…
-
## Feature interaction
Monotonicity constraints are one way to make a black-box model more intuitive and interpretable. For tree-based models, using [Interaction constraints](https://xgboost.readthedocs…
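A sketch of how the two kinds of constraints could be combined in an XGBoost parameter dict (the feature indices and groupings are made up for illustration; `params` would be passed to `xgboost.train`):

```python
# Illustrative XGBoost parameters combining both constraint types.
params = {
    # Per-feature: +1 = non-decreasing, -1 = non-increasing, 0 = unconstrained.
    "monotone_constraints": "(1,0,-1)",
    # Splits may only combine features within the same listed group.
    "interaction_constraints": "[[0, 1], [2]]",
    "max_depth": 4,
    "tree_method": "hist",
}
# xgboost.train(params, dtrain, num_boost_round=100)
```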
-
## Description of your problem
The Interpolated docs are missing a sample plot; one should be added.
https://docs.pymc.io/api/distributions/continuous.html#pymc3.distributions.continuous.Interpolated
![…
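A dependency-free sketch of the kind of example such a plot could be built from: derive `x_points`/`pdf_points` from an empirical sample and hand them to `pm.Interpolated` (the pymc3 call is left as a comment so the sketch stays runnable without pymc3; the distribution and numbers are illustrative).

```python
import numpy as np

# Build an empirical density from samples (here a stand-in normal sample).
samples = np.random.default_rng(0).normal(loc=1.0, scale=0.5, size=10_000)
pdf_points, edges = np.histogram(samples, bins=50, density=True)
x_points = 0.5 * (edges[:-1] + edges[1:])   # bin midpoints

# with pm.Model():
#     x = pm.Interpolated("x", x_points=x_points, pdf_points=pdf_points)

# The linearly interpolated density can be evaluated with np.interp,
# e.g. near the mode (true N(1, 0.5) density at 1.0 is ~0.80):
density_at_mode = np.interp(1.0, x_points, pdf_points)
```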
-
Some case status values in OECI are not "Open" or "Closed" but are still interpretable as such:
https://github.com/codeforpdx/recordexpungPDX/blob/8121e8a170a56e16fdf2363eda7d6edb9d91f8f8/src/back…
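A hypothetical sketch of how such statuses could be normalized into the two values the code reasons about. The extra status strings here ("Inactive", "Purgable") are examples of the kind of values meant, not a verified list from OECI:

```python
# Hypothetical normalization of raw OECI case-status strings.
OPEN_LIKE = {"open"}
CLOSED_LIKE = {"closed", "inactive", "purgable"}   # illustrative values

def normalize_status(raw: str) -> str:
    status = raw.strip().lower()
    if status in OPEN_LIKE:
        return "Open"
    if status in CLOSED_LIKE:
        return "Closed"
    raise ValueError(f"Unrecognized case status: {raw!r}")
```

Raising on unrecognized values (rather than defaulting) keeps new status strings from being silently misclassified.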
-
I have been trying to use captum to interpret my Low-Resource Neural Machine Translation model (specifically, XLM).
I am getting the following error when trying to run ```IntegratedGradients.attri…