Azure Data Service Notebook is a set of extensions for working with Azure data services (e.g. Azure Data Lake, HDInsight, Cosmos DB, Azure SQL, and Azure SQL Data Warehouse) using Jupyter Notebook.
WARNING: This SDK/CLI is currently in a very early stage of development. It can and will change in backward-incompatible ways.
Latest Version: 0.0.1a2
Azure Data Service Notebook currently provides a set of Jupyter magic functions for accessing Azure Data Lake. The available magics are listed in the table below; click a command name to jump to its syntax reference.
Command | Function |
---|---|
%adl login | Line magic* to log in to Azure Data Lake. |
%adl listaccounts | Line magic to list the Azure Data Lake Analytics accounts for the current user. |
%adl listjobs | Line magic to list the Azure Data Lake jobs for a given account. |
%%adl submitjob | Cell magic* to submit a U-SQL job to an Azure Data Lake cluster. |
%adl viewjob | Line magic to view detailed job info. |
%adl liststoreaccounts | Line magic to list the Azure Data Lake Store accounts. |
%adl liststorefolders | Line magic to list the folders under a given directory. |
%adl liststorefiles | Line magic to list the files under a given directory. |
%adl sample | Line magic to sample a given file, returning the results as a Pandas DataFrame. |
%adl logout | Line magic to log out. |
*Please check Magic Functions for the detailed definitions of line magics and cell magics.
pip install jupyter
pip install --no-cache-dir adlmagics

adlmagics provides magic functions for Azure Data Lake job control and data exploration.

Line magic to log in to the Azure Data Lake service.
%adl login --tenant <tenant>

Name | Type | Example | Description |
---|---|---|---|
tenant (required) | string | microsoft.onmicrosoft.com | Either an .onmicrosoft.com domain or the Azure object ID for the tenant. |
Line magic to enumerate the Azure Data Lake Analytics accounts for the current user. The account list is returned as a Pandas DataFrame, so you can call Pandas functions on it directly.
%adl listaccounts --page_index
--page_account_number

Name | Type | Example | Description |
---|---|---|---|
page_index (required) | int | 0 | The result page number. This must be 0 or greater; the default is 0. |
page_account_number (required) | int | 10 | The number of results per page. |
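The paging semantics of `page_index` and `page_account_number` can be illustrated with a small stdlib-only sketch (the account names and the `page` helper below are hypothetical; the real magic queries Azure and returns a Pandas DataFrame):

```python
# Hypothetical illustration of paging: page_index selects a zero-based page,
# page_account_number sets the page size.
def page(items, page_index=0, page_size=10):
    """Return the slice of items belonging to the given zero-based page."""
    start = page_index * page_size
    return items[start:start + page_size]

# Stand-in account names for illustration.
accounts = ["adla%02d" % i for i in range(25)]

print(page(accounts, page_index=0, page_size=10))  # first 10 accounts
print(page(accounts, page_index=2, page_size=10))  # last partial page of 5
```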
Line magic to enumerate the Azure Data Lake jobs for a given account. The job list is returned as a Pandas DataFrame, so you can call Pandas functions on it directly.
%adl listjobs --account <azure data lake analytics account>
--page_index
--page_account_number

Name | Type | Example | Description |
---|---|---|---|
account (required) | string | telemetryadla | The Azure Data Lake Analytics account to list jobs from. |
page_index (required) | int | 0 | The result page number. This must be 0 or greater; the default is 0. |
page_account_number (required) | int | 10 | The number of results per page. |
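Because the job list comes back as a DataFrame, it can be filtered directly with Pandas. The equivalent logic on plain records looks like the sketch below (the job fields and values are hypothetical, not the actual column names returned by the magic):

```python
# Hypothetical job records mimicking rows of the returned job list.
jobs = [
    {"name": "daily_etl", "state": "Ended", "priority": 1000},
    {"name": "ad_hoc_query", "state": "Running", "priority": 500},
    {"name": "backfill", "state": "Ended", "priority": 1000},
]

# With the real DataFrame this would be written df[df["state"] == "Ended"].
ended = [j for j in jobs if j["state"] == "Ended"]
print([j["name"] for j in ended])  # ['daily_etl', 'backfill']
```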
Cell magic to submit a U-SQL job to an Azure Data Lake cluster.
%%adl submitjob --account <azure data lake analytics account>
--name <job name>
--parallelism
--priority
--runtime

Name | Type | Example | Description |
---|---|---|---|
account (required) | string | telemetryadla | The Azure Data Lake Analytics account to execute job operations on. |
name (required) | string | myscript | The friendly name of the job to submit. |
parallelism | int | 5 | The degree of parallelism for this job. This must be greater than 0; if set to less than 1, it defaults to 1. |
priority | int | 1000 | The priority of the job. Lower numbers indicate higher priority. This must be greater than 0; the default is 1000. |
runtime | string | default | The runtime version of the Data Lake Analytics engine to use for the specific type of job being run. |
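A typical submission cell pairs the magic line with the U-SQL script body in the rest of the cell. The account name, job name, and script below are illustrative only, not output from the tool:

```
%%adl submitjob --account telemetryadla --name myscript --parallelism 5 --priority 1000
@rows =
    EXTRACT username string, amount int
    FROM "/data/sample.tsv"
    USING Extractors.Tsv();

OUTPUT @rows
TO "/output/rows.csv"
USING Outputters.Csv();
```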
Line magic to view detailed job info.
%adl viewjob --account <azure data lake analytics account>
--job_id <job GUID to be viewed>

Name | Type | Example | Description |
---|---|---|---|
account (required) | string | telemetryadla | The Azure Data Lake Analytics account to execute job operations on. |
job_id (required) | GUID | 36a62f78-1881-1935-8a6a-9e37b497582d | The job identifier, which uniquely identifies the job across all jobs submitted to the service. |
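The `job_id` must be a well-formed GUID. As a sketch, Python's standard `uuid` module can validate one before it is passed to the magic (the helper name is hypothetical; the GUID is the example from the table above):

```python
import uuid

def is_valid_job_id(job_id):
    """Return True if job_id parses as a GUID/UUID, False otherwise."""
    try:
        uuid.UUID(job_id)
        return True
    except ValueError:
        return False

print(is_valid_job_id("36a62f78-1881-1935-8a6a-9e37b497582d"))  # True
print(is_valid_job_id("not-a-guid"))                            # False
```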
Line magic to list the Azure Data Lake Store accounts.
%adl liststoreaccounts
Line magic to list the folders under a given directory.
%adl liststorefolders --account <azure data lake store account>
--folder_path

Name | Type | Example | Description |
---|---|---|---|
account (required) | string | telemetryadls | The name of the Data Lake Store account. |
folder_path (required) | string | root/data | The directory path under the Data Lake Store account. |
Line magic to list the files under a given directory.
%adl liststorefiles --account <azure data lake store account>
--folder_path

Name | Type | Example | Description |
---|---|---|---|
account (required) | string | telemetryadls | The name of the Data Lake Store account. |
folder_path (required) | string | root/data | The directory path under the Data Lake Store account. |
Line magic to sample a given file, returning the results as a Pandas DataFrame.
%adl sample --account <azure data lake store account>
--file_path
--file_type
--encoding
--row_number

Name | Type | Example | Description |
---|---|---|---|
account (required) | string | telemetryadls | The name of the Data Lake Store account. |
file_path (required) | string | root/data/sample.tsv | The path of the file to sample data from. |
file_type | string | tsv | The type of the file to sample. |
encoding | string | UTF-8 | The encoding of the file. |
row_number | int | 10 | The number of rows to read from the file. |
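The sampling semantics (file type, encoding, row count) can be sketched with the stdlib `csv` module. The `sample_tsv` helper and the data below are illustrative; the real magic reads from the Data Lake Store account and returns a Pandas DataFrame:

```python
import csv
import io

def sample_tsv(text, row_number=10):
    """Read at most row_number data rows from TSV text, using the header row."""
    reader = csv.DictReader(io.StringIO(text), delimiter="\t")
    rows = []
    for i, row in enumerate(reader):
        if i >= row_number:
            break
        rows.append(row)
    return rows

# Stand-in file contents, mimicking a small TSV file.
data = "user\tamount\nalice\t3\nbob\t5\ncarol\t7\n"
print(sample_tsv(data, row_number=2))  # first two rows as dicts
```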
Line magic to log out.
%adl logout
This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit https://cla.microsoft.com.
When you submit a pull request, a CLA-bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., label, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA.
This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact opencode@microsoft.com with any additional questions or comments.