SoundScrape!

SoundScrape makes it super easy to download music from artists on SoundCloud (and Bandcamp and Mixcloud) - even tracks that don't have download links! It also automatically creates ID3 tags (including album art), which is handy.

Usage

First, install it:

pip install soundscrape

Note that if you are having problems, please first try updating to the latest version:

pip install soundscrape --upgrade
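
To check that the install worked, you can ask soundscrape for its built-in help (this assumes the standard -h help flag, which isn't documented here):

soundscrape -h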

Then, just call soundscrape with the name of the artist you want to scrape:

soundscrape rabbit-i-am

And you're done! Hooray! Files are stored as mp3s in the format Artist name - Track title.mp3.
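
For instance, scraping an artist might leave files like these in the current directory (artist and track names invented purely for illustration):

Some Artist - First Track.mp3
Some Artist - Second Track.mp3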

You can also use the -n argument to only download a certain number of songs.

soundscrape rabbit-i-am -n 3

Sets

SoundScrape can also download sets, but you have to include the full URL of the set you want to download:

soundscrape https://soundcloud.com/vsauce-awesome/sets/awesome

Groups

SoundScrape can also download tracks from SoundCloud groups with the -g argument.

soundscrape chopped-and-screwed -gn 2
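
The -gn 2 above just combines -g with -n 2 to cap the number of tracks; if you want everything in the group, -g should also work on its own:

soundscrape chopped-and-screwed -g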

Tracks

SoundScrape can also download specific tracks with -t:

soundscrape foolsgoldrecs -t danny-brown-dip

or with just the straight URL:

soundscrape https://soundcloud.com/foolsgoldrecs/danny-brown-dip

Likes

SoundScrape can also download all of an artist's liked items with -l:

soundscrape troyboi -l

or with just the straight URL:

soundscrape https://soundcloud.com/troyboi/likes

High-Quality Downloads Only

By default, SoundScrape will try to rip everything it can. However, if you only want to download tracks that have an official download available (which are typically at a higher-quality 320kbps bitrate), you can use the -d argument.

soundscrape sly-dogg -d

Keep Preview Tracks

By default, SoundScrape will skip the 30-second preview tracks that SoundCloud now provides. You can choose to keep these preview snippets with the -k argument.

soundscrape chromeo -k

Folders

By default, SoundScrape aims to act like wget, downloading in place in the current directory. With the -f argument, however, SoundScrape acts more like a download manager and sorts songs into the following format:

./ARTIST_NAME - ALBUM_NAME/SONG_NUMBER - SONG_TITLE.mp3

It will also skip previously downloaded tracks.

soundscrape murdercitydevils -f
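
A run like the one above might then produce a layout along these lines (album and track names invented for illustration; the actual names come from the artist's releases):

./Some Artist - Some Album/01 - First Track.mp3
./Some Artist - Some Album/02 - Second Track.mp3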

Bandcamp

SoundScrape can also pull down albums from Bandcamp. For Bandcamp pages, use the -b argument along with an artist's username or a specific URL. It only downloads one album at a time. This works with all of the other arguments, except -d, as Bandcamp streams only come in one bitrate, as far as I can tell.

Note: Currently, when using the -n argument, the limit is evaluated for each album separately.

soundscrape warsaw -b -f
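
Building on the note above, a run like the following (the track limit here is chosen just for illustration) would grab at most two tracks from each album it processes:

soundscrape warsaw -b -n 2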

This also works for pages hosted on Bandcamp under custom (non-Bandcamp) domains:

soundscrape -b http://music.monstercat.com/

Note that the full URL must be included.

Mixcloud

SoundScrape can also grab mixes from Mixcloud. This feature is extremely experimental and is in no way guaranteed to work!

SoundScrape finds the original MP3 of a mix and grabs that (with tags and album art) if it can; otherwise, it just gets the raw M4A stream.

Mixcloud support currently only handles individual mixes; support for whole artist profiles is coming shortly.

soundscrape https://www.mixcloud.com/corenewsuploads/flume-essential-mix-2015-10-03/ -of

Audiomack

Just for fun, SoundScrape can also download individual songs from Audiomack. Not that you'd ever want to.

soundscrape -a http://www.audiomack.com/song/bottomfeedermusic/top-shottas

MusicBed

For some strange reason, it also works for MusicBed.com. Thanks @brachna for this feature.

soundscrape https://www.musicbed.com/albums/be-still/2828

Opening Files

As a convenience, SoundScrape can automatically 'open' files that it downloads with the -o argument (combined with -f below). This uses your system's 'open' command and file associations.

soundscrape lorn -of

Issues

There's probably a lot more that can be done to improve this. Please file issues if you find them!