Kaggle / kaggle-api

Official Kaggle API
Apache License 2.0

How to add my own data in a kernel push? #143

Closed · esvhd closed this 1 year ago

esvhd commented 5 years ago

Hi guys,

I've been trying to add outputs from my own kernels as inputs when pushing a new kernel with kaggle k push.

So far I've tried using the ref values from kaggle k list -m, but that doesn't seem to work.

I know that I can do this in a notebook kernel online, but I've had no luck with the Kaggle API.

Does anyone have any insight into this?

Thanks.

bothmena commented 5 years ago

Hi @esvhd,

Sorry for the late answer; I just started digging into the Kaggle API and I believe I have the solution for you.

To add any kind of data source (datasets, competition data, or another kernel's output) to your kernel, all you need to do is make a few modifications to your kernel metadata file.

First, if you don't have the metadata file yet, you can generate it with the CLI:

$ kaggle kernels init -p /path/to/your/kernel/dir

This command creates a kernel-metadata.json file similar to this:

{
  "id": "timoboz/my-awesome-kernel",
  "id_no": 12345,
  "title": "My Awesome Kernel",
  "code_file": "my-awesome-kernel.ipynb",
  "language": "python",
  "kernel_type": "notebook",
  "is_private": "false",
  "enable_gpu": "false",
  "enable_internet": "false",
  "dataset_sources": ["timoboz/my-awesome-dataset"],
  "competition_sources": [],
  "kernel_sources": []
}

You can read more about the contents of this file in the Kaggle API Wiki.

Now, to add the output of another kernel, all you need to do is modify this line:

  "kernel_sources": ["username/kernel-id"]

You can even use the output of the same kernel. For example, if you saved a model's weights as a kernel output and you want to train the model for more epochs, you can list that kernel's own output as an input for the next commit/version.
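As a minimal sketch of that self-referencing setup, reusing the placeholder names from the example metadata above (not a real kernel), only the kernel_sources entry changes:

{
  "id": "timoboz/my-awesome-kernel",
  "id_no": 12345,
  "title": "My Awesome Kernel",
  "code_file": "my-awesome-kernel.ipynb",
  "language": "python",
  "kernel_type": "notebook",
  "is_private": "false",
  "enable_gpu": "false",
  "enable_internet": "false",
  "dataset_sources": ["timoboz/my-awesome-dataset"],
  "competition_sources": [],
  "kernel_sources": ["timoboz/my-awesome-kernel"]
}

On the next push, the previous version's output files will be attached to the kernel as an input.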

Now you can push your code to Kaggle ;)
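For reference, the push uses the same -p pattern as the init command, with the placeholder path from above:

$ kaggle kernels push -p /path/to/your/kernel/dir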

esvhd commented 5 years ago

That’s pretty neat. Let me check it out. Thanks for the detailed reply.