-
Demystifying Post-hoc Explainability for ML models
These are just some examples:
Section 3 First sentence of Rule-based Explanations is copied from: https://www.aaai.org/ocs/index.php/AAAI/AAAI18…
-
# One-sentence summary
For the cross-sentence relation extraction (RE) problem, proposes a combined LSTM-CNN architecture. A fairly intuitive approach.
Resources:
- [pdf](https://openreview.net/pdf?id=Sye0lZqp6Q)
Keywords:
- A combined Long Short Term Memory and Convolutional Neural Networks (LSTM …
-
**In the paper, section 3.1 BERT, it says:**
`we extract a fixed sized vector via max pooling of the second to last layer.` then `A sentence of N words will hence result in an N ∗ H embeddi…
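For reference, the two quoted statements are easy to reconcile in code: the second-to-last layer produces an N × H matrix (one H-dim vector per token), and max pooling over the token axis collapses it to a single fixed-size H-dim vector. A minimal sketch with synthetic data (the shapes here are illustrative, not taken from the paper):

```python
import numpy as np

# Hypothetical sentence of N tokens with hidden size H.
N, H = 12, 768
rng = np.random.default_rng(0)

# Second-to-last layer output: one H-dim vector per token (N x H).
token_embeddings = rng.normal(size=(N, H))

# Max pooling over the token axis collapses N x H into a single
# fixed-size H-dim sentence vector, independent of N.
sentence_vector = token_embeddings.max(axis=0)
print(sentence_vector.shape)  # -> (768,)
```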
-
The main string class documentation needs some work. The main intro sentence, "A string array of Unicode strings.", is really clumsy; the whole thing focuses too much on Unicode, and it's not clear who the aud…
-
Hello, I have tried to run a NetPyNE-based project (https://github.com/DepartmentofNeurophysiology/Cortical-representation-of-touch-in-silico-NetPyne) but it failed when it reached this line (in ./multitr…
-
## 0. Paper
@inproceedings{chen-etal-2014-unified,
title = "A Unified Model for Word Sense Representation and Disambiguation",
author = "Chen, Xinxiong and
Liu, Zhiyuan and
…
-
## Problem
While doing some analysis on the pre-trained SciBERT transformer networks, we found an anomaly in the context embedding at index 422. Doing some more tests we found that this…
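One way such an anomalous dimension can be surfaced is by z-scoring the per-dimension means of the embeddings and flagging outliers. A toy sketch with synthetic data (the planted dimension 422 mirrors the report; nothing here is computed from actual SciBERT weights):

```python
import numpy as np

# Synthetic stand-in for a batch of context embeddings.
rng = np.random.default_rng(1)
H = 768
embeddings = rng.normal(size=(1000, H))
embeddings[:, 422] += 50.0  # plant an anomalous dimension for illustration

# z-score the per-dimension means and flag extreme dimensions.
dim_means = embeddings.mean(axis=0)
z = np.abs(dim_means - dim_means.mean()) / dim_means.std()
outliers = np.where(z > 10)[0]
print(outliers)  # -> [422]
```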
-
## Adding a Dataset
- **Name:** *SPECTER*
- **Description:** *SPECTER: Document-level Representation Learning using Citation-informed Transformers*
- **Paper:** *https://doi.org/10.18653/v1/2020.ac…
-
In the original game, there are two spaces at the start of each sentence after the dot "." of the previous sentence.
This makes it easier to locate the beginning of each sentence and makes it easier …
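The double-space convention described above can indeed be exploited directly; a minimal sketch using a lookbehind split on "period followed by two spaces" (the sample text is invented, not from the game):

```python
import re

text = "First sentence.  Second sentence.  Third."

# Two spaces after the period mark each sentence boundary, so splitting
# on a double space preceded by "." recovers the sentences while
# keeping the terminal period attached.
sentences = re.split(r"(?<=\.)  ", text)
print(sentences)  # -> ['First sentence.', 'Second sentence.', 'Third.']
```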
-
The parser needs a rework
* the narsese parser must build an AST
* the AST must be transformed to the internal representation of terms/statements/products/etc
This provides some way to introduce…
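The proposed two-stage pipeline can be sketched as follows. All names (`ASTNode`, `Term`, `Statement`, the toy `<a --> b>` inheritance grammar) are hypothetical placeholders, not the project's actual types:

```python
from dataclasses import dataclass


@dataclass
class ASTNode:
    kind: str        # e.g. "inheritance"
    children: tuple  # raw parsed pieces


@dataclass
class Term:
    name: str


@dataclass
class Statement:
    copula: str
    subject: Term
    predicate: Term


def parse(src: str) -> ASTNode:
    # Stage 1: Narsese text -> AST. Toy grammar: "<a --> b>".
    inner = src.strip().strip("<>")
    subj, pred = (s.strip() for s in inner.split("-->"))
    return ASTNode("inheritance", (subj, pred))


def to_internal(node: ASTNode) -> Statement:
    # Stage 2: AST -> internal representation of terms/statements.
    subj, pred = node.children
    return Statement("-->", Term(subj), Term(pred))


stmt = to_internal(parse("<bird --> animal>"))
print(stmt.copula, stmt.subject.name, stmt.predicate.name)  # --> bird animal
```

Keeping the AST as a dumb intermediate structure means new surface syntax only touches stage 1, while changes to the internal term representation only touch stage 2.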