
pyretailscience - A data analysis and science toolkit for retail data

PyRetailScience Logo

PyRetailScience

⚡ Rapid bespoke and deep dive retail analytics ⚡

PyRetailScience equips you with a wide array of retail analytical capabilities, from segmentations to gain-loss analysis. Leave the mundane to us and elevate your role from data janitor to insights virtuoso.

Installation

To get the latest release:

pip install pyretailscience

Alternatively, if you want the very latest version of the package you can install it from GitHub:

pip install git+https://github.com/Data-Simply/pyretailscience.git
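
Either way, you can sanity-check the installation by printing the installed version with the standard library (a quick check, not part of the official instructions):

python -c "from importlib.metadata import version; print(version('pyretailscience'))"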

Features

New Store Cannibalization Analysis

Cross Shop Analysis Chart

Examples

Gain Loss Analysis

Here is an excerpt from the gain loss analysis example notebook

import matplotlib.pyplot as plt

from pyretailscience.gain_loss import GainLoss

gl = GainLoss(
    df,
    # Flag the rows of period 1
    p1_index=time_period_1,
    # Flag the rows of period 2
    p2_index=time_period_2,
    # Flag which rows are part of the focus group.
    # Namely, which rows are Calvin Klein sales
    focus_group_index=df["brand_name"] == "Calvin Klein",
    focus_group_name="Calvin Klein",
    # Flag which rows are part of the comparison group.
    # Namely, which rows are Diesel sales
    comparison_group_index=df["brand_name"] == "Diesel",
    comparison_group_name="Diesel",
    # Finally, we specify that we want to calculate
    # the gain/loss in total revenue
    value_col="total_price",
)
# Ok now let's plot the result
gl.plot(
    x_label="Revenue Change",
    source_text="Transactions 2023-01-01 to 2023-12-31",
    move_legend_outside=True,
)
plt.show()

Gain Loss Chart
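
The excerpt above assumes df is a transaction-level DataFrame and that time_period_1 and time_period_2 are boolean Series flagging which rows belong to each comparison period; they are defined earlier in the notebook. A minimal sketch of how such flags could be built, assuming a transaction date column (the column name "transaction_date" and the date ranges are illustrative, not taken from the notebook):

import pandas as pd

# Illustrative only: flag first-half vs second-half 2023 transactions.
# Adjust the column name and date ranges to match your data.
df["transaction_date"] = pd.to_datetime(df["transaction_date"])
time_period_1 = df["transaction_date"].between("2023-01-01", "2023-06-30")
time_period_2 = df["transaction_date"].between("2023-07-01", "2023-12-31")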

Cross Shop Analysis

Here is an excerpt from the cross shop analysis example notebook

import matplotlib.pyplot as plt
from IPython.display import display

from pyretailscience import cross_shop

cs = cross_shop.CrossShop(
    df,
    group_1_idx=df["category_1_name"] == "Jeans",
    group_2_idx=df["category_1_name"] == "Shoes",
    group_3_idx=df["category_1_name"] == "Dresses",
    labels=["Jeans", "Shoes", "Dresses"],
)
cs.plot(
    title="Jeans are a popular cross-shopping category with dresses",
    source_text="Source: Transactions 2023-01-01 to 2023-12-31",
    figsize=(6, 6),
)
plt.show()
# Let's see which customers were in which groups
display(cs.cross_shop_df.head())
# And the totals for all groups
display(cs.cross_shop_table_df)

Cross Shop Analysis Chart
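
Since the .head() call above treats cs.cross_shop_df like a DataFrame, both outputs can be handled with the usual pandas tooling if you want to keep them; for example (assuming they are standard pandas DataFrames, with arbitrary file names):

# Persist the per-customer group membership and the group totals.
# File names are examples; adjust the paths as needed.
cs.cross_shop_df.to_csv("cross_shop_membership.csv")
cs.cross_shop_table_df.to_csv("cross_shop_totals.csv", index=False)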

Customer Retention Analysis

Here is an excerpt from the customer retention analysis example notebook

import matplotlib.pyplot as plt

# dbp is the days-between-purchases analysis object created earlier in the notebook
ax = dbp.plot(
    figsize=(10, 5),
    bins=20,
    cumulative=True,
    draw_percentile_line=True,
    percentile_line=0.8,
    source_text="Source: Transactions in 2023",
    title="When Do Customers Make Their Next Purchase?",
)

# Let's dress up the chart with a bit of text and get rid of the legend
churn_period = dbp.purchases_percentile(0.8)
ax.annotate(
    f"80% of customers made\nanother purchase within\n{round(churn_period)} days",
    xy=(churn_period, 0.81),
    xytext=(dbp.purchase_dist_s.min(), 0.8),
    fontsize=15,
    ha="left",
    va="center",
    arrowprops=dict(facecolor="black", arrowstyle="-|>", connectionstyle="arc3,rad=-0.25", mutation_scale=25),
)
ax.legend().set_visible(False)
plt.show()

Cumulative Next Purchase Chart
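
In the full notebook, dbp is the days-between-purchases analysis object built from the same transaction DataFrame before this excerpt runs. The construction below is a guess based on the variable name rather than something shown in this README, so check the example notebook for the exact import and signature:

# Assumption: a DaysBetweenPurchases class in pyretailscience.customer that
# wraps the distribution of days between each customer's consecutive purchases.
from pyretailscience.customer import DaysBetweenPurchases

dbp = DaysBetweenPurchases(df)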

Documentation

Please see here for the full documentation.

Contributing

We welcome contributions from the community to enhance and improve PyRetailScience. To contribute, please follow these steps:

  1. Fork the repository.
  2. Create a new branch for your feature or bug fix.
  3. Make your changes and commit them with clear messages.
  4. Push your changes to your fork.
  5. Open a pull request to the main repository's main branch.

Please make sure to follow the existing coding style and provide unit tests for new features.

Contact / Support

This repository is supported by Data Simply.

If you are interested in seeing what Data Simply can do for you, then please email us. We work with companies at a variety of scales and with varying levels of data and retail analytics sophistication to help them build, scale, or streamline their analysis capabilities.

Contributors

Made with contrib.rocks.

Acknowledgements

Built with expertise doing analytics and data science for companies ranging from scale-ups to multi-nationals.

License

This project is licensed under the Elastic License 2.0 - see the LICENSE file for details.