luongz / iptv-jp

A collection of Japanese TV channel links.
The Unlicense

HD Stream stuck #35

Closed Azrael-001 closed 1 year ago

Azrael-001 commented 1 year ago

Sir,

After you transferred v1 from the jptv3 server to jptv2, I found that the HD stream started getting stuck today; it is not as smooth as before. I am using the same method to play the stream, and it was smooth on the jptv3 server. May I ask whether this is a problem with the source or a problem on my end? Thanks a lot.

Azrael-001 commented 1 year ago

Additionally, it is not only the HD stream; all of the streams have the same problem. The stream buffer falls back instead of increasing linearly, so the live stream gets stuck or jumps back.
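
To help separate a source-side problem from a client-side one, here is a minimal diagnostic sketch. It is not part of this repo: the playlist URL is a placeholder, and it assumes the stream is a plain HLS media playlist reachable with the requests library. It downloads a few segments and compares the download time with the playback time they cover; if segments consistently arrive slower than real time, the buffer cannot grow and the stutter is most likely on the server side.

import time
from urllib.parse import urljoin

import requests

# Placeholder URL -- replace with the actual jptv2 media playlist you are testing.
PLAYLIST_URL = "https://example.com/jptv2/channel/index.m3u8"

def check_segment_throughput(playlist_url, max_segments=5):
    # Fetch the media playlist and collect segment durations and URLs.
    text = requests.get(playlist_url, timeout=10).text
    durations, segments = [], []
    for line in text.splitlines():
        line = line.strip()
        if line.startswith("#EXTINF:"):
            # "#EXTINF:6.000," -> 6.0 seconds of playback time
            durations.append(float(line.split(":", 1)[1].split(",", 1)[0]))
        elif line and not line.startswith("#"):
            segments.append(urljoin(playlist_url, line))

    # Time a handful of segment downloads against their advertised durations.
    for seg_url, duration in list(zip(segments, durations))[:max_segments]:
        start = time.monotonic()
        data = requests.get(seg_url, timeout=30).content
        elapsed = time.monotonic() - start
        speed = duration / elapsed if elapsed > 0 else float("inf")
        print(f"{len(data) / 1024:.0f} KiB in {elapsed:.2f}s "
              f"covering {duration:.1f}s of video ({speed:.2f}x real time)")

if __name__ == "__main__":
    check_segment_throughput(PLAYLIST_URL)

Readings well below 1x real time point at the source or the network path to it; readings comfortably above 1x suggest looking at the player or the local setup instead.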

torytyler commented 1 year ago

It's a little more cluttered, but it seems this GitHub repo just relists the streams from this distributor. A temporary fix is to use that link as your m3u8 until the repo is updated, since it has working, up-to-date links. It's just cluttered with extra channels that aren't Japanese.

Honestly, it would be pretty simple to create a bot that scrapes that URL once or twice a day, drops the irrelevant channels, keeps only the Japanese ones, and self-updates a GitHub repo, but I don't really feel like building that right now. I just favorite the channels I like and ignore the rest.

Hope this helps!
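
For reference, a rough sketch of the "self-update on a GitHub repo" part could look like the following. This is an illustration, not something from this repo: it assumes it runs inside a clone of the target repository with push access and a configured git identity, build_playlist() is a placeholder for whatever filtering is used, and a cron job or scheduled CI run would trigger it once or twice a day.

import subprocess
from pathlib import Path

import requests

SOURCE_URL = "https://playlist.vthanhtivi.pw/"   # distributor playlist mentioned in this thread
OUTPUT = Path("JapanTVPlaylist.m3u")

def build_playlist():
    # Placeholder: fetch the upstream playlist; the real channel filtering would go here.
    return requests.get(SOURCE_URL, timeout=30).text

def update_repo():
    new_content = build_playlist()
    # Skip the commit when nothing changed, so the history stays clean.
    if OUTPUT.exists() and OUTPUT.read_text(encoding="utf-8") == new_content:
        print("Playlist unchanged, nothing to commit.")
        return
    OUTPUT.write_text(new_content, encoding="utf-8")
    subprocess.run(["git", "add", str(OUTPUT)], check=True)
    subprocess.run(["git", "commit", "-m", "Update Japanese channel playlist"], check=True)
    subprocess.run(["git", "push"], check=True)

if __name__ == "__main__":
    update_repo()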

luongz commented 1 year ago

First.

> It's a little more cluttered, but it seems this GitHub repo just relists the streams from this distributor.

There was a little misunderstanding.

I'm working on this JP list together with the owner of the distributor. I have full permission to access the server.

> Honestly, it would be pretty simple to create a bot that scrapes that URL once or twice a day, drops the irrelevant channels, keeps only the Japanese ones, and self-updates a GitHub repo, but I don't really feel like building that right now. I just favorite the channels I like and ignore the rest.

The method you mentioned has been patched. We are still trying to find a way.

> edit: hope nobody actually paid this guy for rehosting someone else's streams! lol

fine.

luongz commented 1 year ago

The relisting is true, but the owner of the list has accepted it. However, I have to renew the license every month. That's why I'm putting up the "donated" announcement. Screenshot_20230709-110107

torytyler commented 1 year ago

It's patched... I see. Well, here's a baseline script I just wrote that will help you in the future! Tested and working as of today, 7/9/23. Theoretically, if the streams stop working again, all you have to do is run the script to scrape updated URLs.

Run it with Python: the script scrapes that URL, filters out the non-Japanese channels, and saves the result as a .m3u file that can be used with various IPTV applications.

import requests

def save_url_as_text(url, filename):
    print(f"Connecting to Vthanhtivi! Please wait, extracting the Japanese Channel URLs...")
    response = requests.get(url)
    # Check if the request was successful
    if response.status_code == 200:
        # Get the content of the response
        content = response.text

        lines = content.split("\n")
        # Hard-coded line ranges that held the Japanese channel entries in the
        # source playlist as of 7/9/23; they will break if the distributor
        # reorders the playlist or adds channels.
        filtered_lines = lines[:1] + lines[1043:1333] + lines[3796:]
        filtered_content = "\n".join(filtered_lines)

        with open(filename, "w", encoding='utf-8') as file:
            file.write(filtered_content)

        print(f"Successfully saved the filtered URL as {filename}. お楽しみに!")
    else:
        # If the request was not successful, print an error message
        print("Error: Failed to retrieve the web page, perhaps the host is down? :c")

# Provide the URL you want to save as a text document
url = "https://playlist.vthanhtivi.pw/"

# Provide the filename for the text document
filename = "JapanTVPlaylist.m3u"

# Call the function to save the URL as a text document and filter out unwanted lines
save_url_as_text(url, filename)

# Created with love by phak! <3
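
As a side note, the hard-coded line slices above will silently break whenever the upstream playlist grows or is reordered. A more robust variant, sketched below under the assumption that the distributor's playlist tags each entry with a group-title attribute and that the Japanese channels share a recognizable group name (the "Japan" value here is a guess; check the real playlist), filters on that attribute instead of on line numbers.

import re

import requests

SOURCE_URL = "https://playlist.vthanhtivi.pw/"
# Guessed group label -- inspect the playlist and adjust to whatever
# name the distributor actually uses for the Japanese channels.
WANTED_GROUPS = {"Japan"}

def filter_by_group(url, groups, filename):
    lines = requests.get(url, timeout=30).text.splitlines()
    kept = ["#EXTM3U"]
    i = 0
    while i < len(lines):
        line = lines[i]
        if line.startswith("#EXTINF:"):
            match = re.search(r'group-title="([^"]*)"', line)
            # Assumes the stream URL sits on the line right after each #EXTINF entry.
            if match and match.group(1) in groups and i + 1 < len(lines):
                kept.append(line)
                kept.append(lines[i + 1])
            i += 2
        else:
            i += 1
    with open(filename, "w", encoding="utf-8") as file:
        file.write("\n".join(kept) + "\n")

filter_by_group(SOURCE_URL, WANTED_GROUPS, "JapanTVPlaylist.m3u")
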
luongz commented 1 year ago

Thanks for the script, but if something in the list changes, the owner of the links will tell me. And I have a non-JP filter too. The owner makes all of v1 and v2 public.

luongz commented 1 year ago

We are still looking for a method. The v1 streams will be moved to a new host. For the v2 streams, we will find a new method. Thanks.