-
### 🚀 Task description
We need to add the function `etna.datasets.load_dataset` to the public API, so that users can see the `load_dataset` docs in the API Reference.
### Plan
Firstly, we should add cross-referen…
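Beyond the cross-references, exposing a function in a package's public API usually comes down to a re-export plus an `__all__` entry. A generic sketch of that pattern — `demo_pkg` below is a stand-in for `etna.datasets`, and the `load_dataset` body is a placeholder, not etna's actual implementation:

```python
import sys
import types

# Build a tiny stand-in package in memory to show the re-export pattern.
inner = types.ModuleType("demo_pkg._internal")

def load_dataset(name: str) -> str:
    # Placeholder body; the real function would fetch and return a dataset.
    return f"loaded {name}"

inner.load_dataset = load_dataset

pkg = types.ModuleType("demo_pkg")
pkg.load_dataset = inner.load_dataset  # the re-export users will import
pkg.__all__ = ["load_dataset"]         # marks it as public API for docs tools

sys.modules["demo_pkg"] = pkg
sys.modules["demo_pkg._internal"] = inner

import demo_pkg
print(demo_pkg.load_dataset("example"))
```

In a real package the same effect is achieved with a `from ... import load_dataset` line and an `__all__` list in the package's `__init__.py`; API-doc generators then pick the name up from there.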
-
# Hakai Dataset Submission
The steps involved in the initial submission of a dataset are listed below.
A more detailed written and visual description of every step is available resp…
-
Can you please post the details of the test dataset? I see there are no instructions specifically for the DIV2K valid dataset. I used the weights you posted to test on the DIV2K valid dataset posted b…
-
Related: #55
We should add an interface for users to run a specific model on a specific dataset locally. This will help drive adoption of TabRepo for method papers that are introducing a new model…
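One possible shape for such an interface, as a hedged sketch — the names `RunConfig` and `run_local` are invented for illustration and are not TabRepo's actual API:

```python
from dataclasses import dataclass

@dataclass
class RunConfig:
    """What a user would specify to run one model on one dataset locally."""
    model: str       # e.g. the name of the model being introduced in a paper
    dataset: str     # e.g. a benchmark dataset identifier
    fold: int = 0    # which cross-validation fold to run

def run_local(config: RunConfig) -> dict:
    # A real implementation would load the dataset, fit the model on the
    # training fold, and score on the test fold; this stub only echoes the
    # request so the interface shape is visible.
    return {
        "model": config.model,
        "dataset": config.dataset,
        "fold": config.fold,
        "metric": None,  # would hold the evaluation score
    }

result = run_local(RunConfig(model="MyNewModel", dataset="adult"))
print(result)
```

Keeping the entry point to a single config object like this makes it easy to later add a thin CLI wrapper on top.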
-
-
**Describe the Issue**
On the ESS-DIVE project, we found a couple of use cases where a dataset submission's validation would pass in MetacatUI but fail in Metacat. One example is entering special symb…
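A common source of this kind of client/server mismatch is special characters that a browser form accepts but a stricter server-side XML/metadata parser rejects. A minimal illustration of the escaping involved (not ESS-DIVE's actual validation code; the title string is made up):

```python
from xml.sax.saxutils import escape

# A title containing XML-special symbols that may pass a UI check
# but break server-side metadata parsing if left unescaped.
title = "Soil CO2 flux <2cm depth> & pH"
print(escape(title))  # → Soil CO2 flux &lt;2cm depth&gt; &amp; pH
```

Running the same escaping (or the same schema validation) on both client and server is the usual way to keep the two in agreement.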
-
When I run prediction with the model:

```python
from setfit import SetFitModel
from datasets import load_dataset

model = SetFitModel.from_pretrained(model_name)
test_dataset = load_dataset("csv", data_files="dataset…
```
-
Could you provide the dataset via Baidu Netdisk or another mirror? That would make the download faster, thank you!
-
Many historical datasets from the ocean acidification group are not available via the ERDDAP server, or only the last 60 days are available for some of them. For now, there is no active plan regarding the future of th…
-
Hello, could you share the dataset? That would also make it easier for others to use and cite your code. The dataset I downloaded from "https://drive.google.com/drive/folders/1IXa3IJS9zJS4vggpyU7yda8f7jZjz4gB" does not have the packet_length field — what went wrong? Which source did you download the dataset used in this project from?