
PyPartPicker

PyPartPicker is a package that allows you to obtain information from PCPartPicker quickly and easily, returning data as objects with numerous attributes.


Features:

- Search PCPartPicker for parts, with an optional result limit and region
- Fetch full product details, including specs, pricing and reviews
- Fetch PCPartPicker part lists
- Extract PCPartPicker product and list links from arbitrary text
- Async (aio_) equivalents of the scraping methods

Installation


Installation via pip:

>>> pip install pypartpicker

Or clone the repo directly:

>>> git clone https://github.com/thefakequake/pypartpicker.git

Example programs


Here is a program that searches for i7s, prints the name of every result, then fetches the first result and prints its specs:

from pypartpicker import Scraper

# creates the scraper object
pcpp = Scraper()
# returns a list of Part objects we can iterate through
parts = pcpp.part_search("i7")

# iterates through every part object
for part in parts:
    # prints the name of the part
    print(part.name)

# gets the URL of the first result
first_product_url = parts[0].url
# gets the Product object for the item
product = pcpp.fetch_product(first_product_url)
# prints the product's specs using the specs attribute
print(product.specs)

Here is another program that finds i3s priced at £110 or less, prints their specs, and then prints the first review of each:

from pypartpicker import Scraper
from time import sleep

pcpp = Scraper()
# returns a list of Part objects we can iterate through
# the region is set to "uk" so that we get prices in GBP
parts = pcpp.part_search("i3", region="uk")

# iterates through the parts
for part in parts:
    # skips parts with no listed price, then checks if the price is £110 or lower
    if part.price is not None and float(part.price.strip("£")) <= 110:
        print(f"I found a valid product: {part.name}")
        print(f"Here is the link: {part.url}")
        # gets the Product object for the part
        product = pcpp.fetch_product(part.url)
        print(product.specs)
        # makes sure the product has reviews
        if product.reviews is not None:
            # gets the first review
            review = product.reviews[0]
            print(f"Posted by {review.author}: {review.content}")
            print(f"They rated this product {review.rating}/5.")
        else:
            print("There are no reviews on this product!")

    # slows down the program so as not to spam PCPartPicker and potentially get IP banned
    sleep(3)

Creating the Scraper object


Scraper(headers={...}, response_retriever=...)

Parameters

- headers (dict, optional) - custom HTTP headers to send with each request
- response_retriever (optional) - a hook for customising how responses are retrieved
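
As an illustration, here is how a scraper with a custom User-Agent header might be created (a minimal sketch; the header value is made up, and response_retriever is left at its default):

from pypartpicker import Scraper

# creates a scraper with a custom User-Agent
# (assumes the headers dict is passed through to the underlying HTTP requests)
pcpp = Scraper(headers={"User-Agent": "my-pypartpicker-bot/1.0"})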

Scraper Methods


Scraper.part_search(search_term, limit=20, region=None)

Returns Part objects using PCPartPicker's search function.

Parameters

- search_term (str) - the term to search PCPartPicker for
- limit (int, optional) - the maximum number of results to return; defaults to 20
- region (str, optional) - the region to use for pricing and currency, e.g. "uk"; defaults to None

Returns

A list of Part objects corresponding to the results on PCPartPicker.
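
For example, the following sketch (reusing the search term from the examples above) limits the search to 5 results and requests UK pricing:

from pypartpicker import Scraper

pcpp = Scraper()
# fetches at most 5 results, with prices in GBP
parts = pcpp.part_search("i7", limit=5, region="uk")

for part in parts:
    print(part.name, part.price)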


Scraper.fetch_product(product_url)

Returns a Product object from a PCPartPicker product URL.

Parameters

- product_url (str) - the URL of the PCPartPicker product page

Returns

A Product object for the part.


Scraper.fetch_list(list_url)

Returns a PCPPList object from a PCPartPicker list URL.

Parameters

- list_url (str) - the URL of the PCPartPicker list

Returns

A PCPPList object for the list.
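
A minimal sketch (the list URL below is a placeholder, not a real list):

from pypartpicker import Scraper

pcpp = Scraper()
# replace the placeholder with a real PCPartPicker list URL
pcpp_list = pcpp.fetch_list("https://pcpartpicker.com/list/yourlistid")
# the returned PCPPList object is described under "Objects" below
print(pcpp_list)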

Other methods


get_list_links(string)

Returns a list of PCPartPicker list links from the given string.

Parameters

- string (str) - the string to extract PCPartPicker list links from

Returns

A list of URLs.


get_product_links(string)

Returns a list of PCPartPicker product links from the given string.

Parameters

- string (str) - the string to extract PCPartPicker product links from

Returns

A list of URLs.
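
Both helpers are useful for pulling PCPartPicker links out of arbitrary text, such as a chat message. A short sketch (assuming both functions are importable from the top level of the package; the URLs are placeholders):

from pypartpicker import get_list_links, get_product_links

message = (
    "My build: https://pcpartpicker.com/list/yourlistid "
    "and the CPU I chose: https://pcpartpicker.com/product/yourproductid"
)
# extracts every PCPartPicker list URL found in the string
print(get_list_links(message))
# extracts every PCPartPicker product URL found in the string
print(get_product_links(message))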

Async Methods


The async methods have the same syntax as their sync counterparts, except that you add aio_ to the beginning of the method name and await before the call.

For example:

pcpp = Scraper()
results = pcpp.part_search("i5")

becomes

pcpp = Scraper()
results = await pcpp.aio_part_search("i5")

Remember: you can only call async functions within other async functions. If you are not writing async code, do not use these methods. Use the sync methods, which don't have aio_ before their name.

Only the blocking functions (the ones that involve active scraping) have async equivalents.
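
For example, an async version of the first search example might look like this (a minimal sketch using asyncio; only the aio_ method names differ from the sync code):

import asyncio
from pypartpicker import Scraper

async def main():
    pcpp = Scraper()
    # async equivalent of part_search
    parts = await pcpp.aio_part_search("i7")
    for part in parts:
        print(part.name)
    # async equivalent of fetch_product
    product = await pcpp.aio_fetch_product(parts[0].url)
    print(product.specs)

asyncio.run(main())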

Objects


Part

Attributes


Product

Attributes


Price

Attributes


Review

Attributes


PCPPList

Attributes