-
### Describe the bug
In the function documentation of `ctc_greedy_decode`, the probabilities shape is implied to be `[batch, probabilities, time]`. The sequence is therefore narrowed depending on …
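For reference, a minimal sketch of greedy decoding over a `[batch, probabilities, time]` input as I read the documentation (my own illustration, not the library's implementation; the function name and default `blank` index are assumptions):

```python
import torch

def greedy_ctc_decode(probs: torch.Tensor, blank: int = 0):
    """Greedy CTC decode over a [batch, probabilities, time] tensor (illustrative only)."""
    best = probs.argmax(dim=1)          # argmax over the class axis -> [batch, time]
    decoded = []
    for seq in best:
        labels, prev = [], None
        for label in seq.tolist():
            if label != prev and label != blank:
                labels.append(label)    # collapse repeats, then drop blanks
            prev = label
        decoded.append(labels)
    return decoded
```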
-
Standalone tools were removed in 92bf51bc8f201a2d5b1e8b90b8dc033606dbcfb0. They are useful for running a batch process that decrypts the whole library.
-
Hey guys, I really enjoyed reading the paper, and thanks for publishing the source code. I am working on Informer for a multivariate problem with 94 features and one output target. I have a …
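For context, this is roughly how I map the 94 input features and the single target onto the model's dimension settings (a sketch only; the argument names `features`, `enc_in`, `dec_in`, and `c_out` are my assumptions about the training script):

```python
from argparse import Namespace

# Hypothetical mapping of a 94-feature, single-target dataset onto
# Informer's dimension arguments (argument names assumed, not verified):
args = Namespace(
    features='MS',  # multivariate inputs, single (univariate) target
    enc_in=94,      # encoder input size = number of input features
    dec_in=94,      # decoder input size
    c_out=1,        # output size = one prediction target
)
```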
-
Currently the QA process is mostly done offline.
The photographer does not get any direct feedback in terms of duplicates, rejected photos, etc.
The person with the QA-responsible role has to go through all photos,…
-
### 🐛 Describe the bug
```python
import torch
import torch.nn as nn

device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')

class CRNN(nn.Module):
    def __init__(self, img_channel…
```
-
### Descriptive summary
When doing a batch edit, and as a counterpart to #2827, the "Currently Shared With" list does not show editors, even when the same editor is an editor on all works in the batch. So o…
-
### Which app?
Intel Manager 2.0 (Experience Builder)
### Describe requirements
IM1.0 allows users to select a group of points and edit the group's attributes all at once. IM2.0 needs this abi…
-
Creating an issue to track the TODOs for this task:
We need to update `torch2trt` to support dynamic batch sizes, up to the size given during conversion.
During the first compilation of the model wh…
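A minimal sketch of the target behaviour (assuming the conversion call keeps its current `torch2trt(model, inputs, max_batch_size=...)` keyword; the smaller-batch calls at the end are what this task should enable):

```python
import torch
from torchvision.models import resnet18
from torch2trt import torch2trt

model = resnet18().eval().cuda()
x = torch.randn(8, 3, 224, 224).cuda()   # example input at the maximum batch size

# Convert once, declaring the largest batch size expected at runtime.
model_trt = torch2trt(model, [x], max_batch_size=8)

# Goal: any batch size up to the one given during conversion should work,
# e.g. batches of 1 and 4 here.
y1 = model_trt(torch.randn(1, 3, 224, 224).cuda())
y4 = model_trt(torch.randn(4, 3, 224, 224).cuda())
```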
-
That way we can stop (ab)using iterators (and maybe even deprecate them in ursadb - they're a bit problematic in case of failed jobs).
And with postgres it won't be a problem.
-
Currently, our binary distribution packs ~100MB of raw data for Pinot quick-start scripts. Removing this would greatly reduce the size of our official binary distribution, which is currently over 500MB.
…