Open blazgocompany opened 2 weeks ago
it's not a new project, it's a new... umm... task, maybe?
Ok what is it
Wait... I'll live-share something
https://prod.liveshare.vsengsaas.visualstudio.com/join?FAD2EE93B4CDF916A58DCECFC9D2FD4F395A
k and pls talk to me here instead https://adapt.chat/invite/NgSI9xGG it is easier
I don't need that. We have a live chat in liveshare itself
It's at the bottom bar
ik but instead of GitHub talk to me here https://adapt.chat/invite/NgSI9xGG it is way easier
btw I got my AI running:
https://rainai.leroy-i-fernandes.workers.dev/?data=WwogIHsgInJvbGUiOiAic3lzdGVtIiwgImNvbnRlbnQiOiAiWW91IGFyZSBhIGhlbHBmdWwgYXNzaXN0YW50LiIgfSwKICB7ICJyb2xlIjogInVzZXIiLCAiY29udGVudCI6ICJXaG8gd29uIHRoZSB3b3JsZCBzZXJpZXMgaW4gMjAyMD8iIH0KXQ==
The string after `data=` is the base64 encoding of:

```json
[
  { "role": "system", "content": "You are a helpful assistant." },
  { "role": "user", "content": "Who won the world series in 2020?" }
]
```
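For reference, a minimal sketch of how a request URL like that could be built, assuming the worker simply expects the base64-encoded JSON message list in the `data` query parameter (the endpoint is the one from this thread, with the later `rainapi` domain):

```python
import base64
import json

# Message list in the OpenAI-style chat format the worker appears to accept
messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Who won the world series in 2020?"},
]

# Base64-encode the JSON so it can ride inside a ?data= query parameter
encoded = base64.b64encode(json.dumps(messages, indent=2).encode()).decode()
url = f"https://rainapi.leroy-i-fernandes.workers.dev/?data={encoded}"
print(url)
```

Decoding the `data` value with any base64 tool gives the original JSON back.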
k how to use it look
Hmm...
Oh sorry, I changed the domain after I sent you... just change the "rainai.leroy" at the beginning to "rainapi.leroy"
Ok cool I saw it
Did you want to talk to me?
Yes, could you help with a Scratch online detector in Python?
How would I do that?.....
nvm I did it, look:

```python
import scratchattach as scratch3
import requests
from datetime import datetime, timedelta
from bs4 import BeautifulSoup
import re

conn = scratch3.TwCloudConnection(
    project_id="1068841039",        # Replace with your project id
    purpose="Check user activity",  # Optional: you can specify the use case
    contact="cat-girl-12345",       # Replace with your Scratch account username
)
client = scratch3.TwCloudRequests(conn)

Scratch_URL = "https://scratch.mit.edu"

def get_activity_type(raw_activity_type):
    activities = {
        "added": "studio-add",
        "became a curator of": "studio-curator",
        "loved": "project-love",
        "favorited": "project-favorite",
        "is now following": "user-follow",
        "is now following the studio": "studio-follow",
        "shared the project": "project-share",
        "was promoted to manager of": "studio-manager",
        "remixed": "project-remix",
        "joined Scratch": "scratch-join",
    }
    return activities.get(raw_activity_type, "unknown")

def get_activity(username):
    response = requests.get(
        f"https://scratch.mit.edu/messages/ajax/user-activity/?user={username}&max=1000000",
        headers={"User-Agent": "Python Requests"},
    ).content
    soup = BeautifulSoup(response, "html.parser")
    activities = soup.find_all("li")
    result = []
    for activity in activities:
        div_contents = activity.find("div").contents
        activity_type = get_activity_type(div_contents[2].strip())
        activity_time = str(activity.find("span", "time").contents[0]).replace("\xa0", " ")
        a = {"Type": activity_type, "Time": activity_time, "Action": {}}
        a_act = a["Action"]
        if activity_type == "studio-add":
            a_act["ProjectURL"] = f"{Scratch_URL}{div_contents[3]['href']}"
            a_act["ProjectName"] = str(div_contents[3].contents[0])
            a_act["StudioURL"] = f"{Scratch_URL}{div_contents[5]['href']}"
            a_act["StudioName"] = str(div_contents[5].contents[0])
        elif activity_type in ("studio-curator", "studio-follow", "studio-manager"):
            a_act["StudioURL"] = f"{Scratch_URL}{div_contents[3]['href']}"
            a_act["StudioName"] = str(div_contents[3].contents[0])
        elif activity_type in ("project-love", "project-favorite", "project-share"):
            a_act["ProjectURL"] = f"{Scratch_URL}{div_contents[3]['href']}"
            a_act["ProjectName"] = str(div_contents[3].contents[0])
        elif activity_type == "user-follow":
            a_act["UsernameURL"] = f"{Scratch_URL}{div_contents[3]['href']}"
            a_act["Username"] = str(div_contents[3].contents[0])
        elif activity_type == "project-remix":
            a_act["ParentProjectURL"] = f"{Scratch_URL}{div_contents[3]['href']}"
            a_act["ParentProjectName"] = str(div_contents[3].contents[0])
            a_act["NewProjectURL"] = f"{Scratch_URL}{div_contents[5]['href']}"
            a_act["NewProjectName"] = str(div_contents[5].contents[0])
        result.append(a)
    return result

def parse_relative_time(relative_time_str):
    """Parse relative time strings (e.g., '7 minutes ago') into a datetime object."""
    now = datetime.now()
    patterns = [
        (r"(\d+)\s+minutes?\s+ago", lambda x: timedelta(minutes=int(x))),
        (r"(\d+)\s+hours?\s+ago", lambda x: timedelta(hours=int(x))),
        (r"(\d+)\s+days?\s+ago", lambda x: timedelta(days=int(x))),
        (r"(\d+)\s+weeks?\s+ago", lambda x: timedelta(weeks=int(x))),
    ]
    for pattern, delta_func in patterns:
        match = re.match(pattern, relative_time_str)
        if match:
            return now - delta_func(match.group(1))
    return now

def is_recent_activity(activity_time_str, minutes=2):
    """Check if the activity time is within the last 'minutes' minutes."""
    try:
        activity_time = datetime.strptime(activity_time_str, "%Y-%m-%dT%H:%M:%S.%fZ")
    except ValueError:
        try:
            activity_time = datetime.strptime(activity_time_str, "%Y-%m-%dT%H:%M:%S%z")
            activity_time = activity_time.replace(tzinfo=None)  # compare naive times
        except ValueError:
            # Handle relative time strings
            activity_time = parse_relative_time(activity_time_str)
    recent_threshold = datetime.now() - timedelta(minutes=minutes)
    print(f"Activity Time: {activity_time}, Recent Threshold: {recent_threshold}")  # Debugging line
    return activity_time > recent_threshold

def fetch_user_activity(user_username):
    """Fetch the user's recent activities and check whether any is within 2 minutes."""
    activities = get_activity(user_username)
    for activity in activities:
        activity_time_str = activity.get("Time", "")
        print(f"Checking activity time: {activity_time_str}")  # Debugging line
        if is_recent_activity(activity_time_str, minutes=2):
            return True
    print(f"No recent activity found for {user_username}.")
    return False

@client.request
def check_user_activity(user_username):
    # Called when the client receives a request with 'user_username'
    print(f"Request received to check if {user_username} is active.")
    is_active = fetch_user_activity(user_username)
    status = "Active" if is_active else "Inactive"
    print(f"Activity status for {user_username}: {status}")
    return f"{user_username} is {status}"

@client.event
def on_ready():
    print("Request handler is running")

client.run()  # Make sure this is ALWAYS at the bottom of your Python file
```
oh... That works... i guess.
And also please wrap code like this in triple back-ticks: ```
And what does GammaTube actually do?
Also, are you better at CSS and HTML, or better at Flask?
k also GammaTube is YT in Scratch using scratchattach
Flask and HTML
hmm.... that's an odd combination.... are you willing to learn CSS?
Yes but I already know some css
ok.... well... I can help you.
Thx :D
Wait.... What do you want me to help you with???? I was thinking that you help me with the HTML and Flask part of my site and I'll help you with the CSS...
Yes ok
Wait im trying to get the server to work...
For what????
rAIn
Oh
Hmm... the database isn't working.... once we get that we're all set. do you know how to connect to a database in Python
I think so..... it should be easy
It should be... lol... sadly it isn't:
Error: 2003: Can't connect to MySQL server on '%-.100s:%u' (%s) (Warning: %u format: a real number is required, not str)
I have no idea what this even means...
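(For reference: that garbled message is MySQL client error 2003 with its `%`-style placeholders left unfilled. The trailing warning hints that the port was passed as a string, so `%u` could not format it and the raw template got printed. A minimal sketch of just the placeholder behavior, using the error template from above:)

```python
# MySQL client error 2003's message template uses printf-style placeholders;
# "%u" only accepts numbers, so formatting fails when the port is a str and
# the template is shown unfilled, with a warning much like the one above.
template = "Can't connect to MySQL server on '%-.100s:%u' (%s)"

try:
    template % ("localhost", "3306", 111)  # port as str: formatting blows up
except TypeError as exc:
    print(f"(Warning: {exc})")

print(template % ("localhost", 3306, 111))  # port as int: formats fine
```

So the likely fix is passing the port to the connector as an int (e.g. `port=3306`, not `port="3306"`).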
XD wdym??? also why can't I delete my Yahoo mail from my Yahoo inbox? I delete it but it just comes back, do you know why
no...
ok nvm I did it
could you create a Scratch project similar to your pfp one, but a sound one instead, with a Python backend pls? I will give credits @blazgocompany
Maybe... but not anytime soon...
Finally it's working.... But I won't be there until Monday...
Ok np
Huh???? what is working?? the API????
It's Monday and I'm back (from the beach)!
No... i told you not anytime soon.
Huh? you're confusing me. btw do you have Discord???
Yay Hi :)
no. I meant rAIn was done. not your api
I did not ask for an API
by the way, the sound thing isn't possible:
Replicating audio from just frequency and loudness data is challenging because this information alone doesn't capture all the nuances of the original sound. Specifically, while frequency analysis reveals which tones are present, it lacks essential phase information and harmonic details that influence timbre and texture. Additionally, loudness gives a sense of volume but misses the dynamic variations and subtle characteristics of the original audio. As a result, any reconstruction would likely sound like a rough approximation rather than an exact replica.
It won't work for talking either:
Voices, like those in speech, contain complex harmonics and subtle variations in pitch, tone, and timing that contribute to their clarity and distinctiveness. If you only capture frequency and loudness, you miss critical aspects like formants (resonant frequencies that shape vowel sounds) and the specific timing of consonants and syllables. This can lead to a distorted or unnatural reproduction of the voice, making it difficult to understand and less recognizable. Additionally, the emotional nuances conveyed through inflection and intonation can be lost, further distorting the original intent of the speech.
And it won't work for music:
When multiple instruments play together, their unique timbres create a rich, complex sound. If you only capture frequency and loudness, the result can sound muddled, as each instrument has its own harmonic structure and overtones that interact in specific ways. Without this detailed timbral information, the nuances that allow instruments to complement or contrast with each other are lost, leading to a lack of clarity and harmony in the overall sound.
But it will work for pure melodies:
If you capture multiple sine waves one after another—meaning you analyze each one separately—you can accurately replicate each wave using just its frequency and amplitude. Since each sine wave is distinct and lacks the complexities of harmonics or timbre, you can recreate them perfectly by using the frequency and loudness data for each individual wave. This approach would work well for reconstructing the sound as long as the analysis is done accurately for each wave.
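A minimal sketch of that last point, assuming the analysis gives one (frequency, loudness) pair per time slice: each slice can be resynthesized as a plain sine wave (pure Python, no audio library; the melody values are made up for illustration):

```python
import math

def synthesize(slices, sample_rate=8000):
    """Rebuild a pure melody from (frequency_hz, amplitude, duration_s) slices.

    Each slice becomes a plain sine wave; phase and timbre are discarded,
    which is exactly why this only works for pure tones, not voices or
    multi-instrument music.
    """
    samples = []
    for freq, amp, duration in slices:
        n = int(sample_rate * duration)
        for i in range(n):
            samples.append(amp * math.sin(2 * math.pi * freq * i / sample_rate))
    return samples

# Hypothetical analysis output: A4 then E5, half a second each
melody = [(440.0, 0.8, 0.5), (659.25, 0.6, 0.5)]
audio = synthesize(melody)
print(len(audio), max(audio))
```

Writing `audio` out as a WAV file (e.g. with the stdlib `wave` module after scaling to 16-bit ints) would make it audible.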
Ok btw do you have Discord