bramses / obsidian-stack-overflow

Fetch Stack Overflow answers and copy paste them directly into Obsidian
MIT License

Support other sites on the StackExchange network (not just StackOverflow) #7

Open 0xdevalias opened 2 years ago

0xdevalias commented 2 years ago

Currently the code seems to be written to only work for StackOverflow, rather than any site in the StackExchange network:

https://github.com/bramses/obsidian-stack-overflow/blob/4f2a556dcb7d4a33fc062ecb2234001ccdf15581/main.ts#L113-L116

https://github.com/bramses/obsidian-stack-overflow/blob/4f2a556dcb7d4a33fc062ecb2234001ccdf15581/main.ts#L85
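
For illustration, a rough sketch of what a more general check might look like (not the plugin's actual code; the names and the hardcoded domain list here are placeholders, and the list would ideally come from the API, per below):

```typescript
// Sketch only -- not the plugin's actual code. The idea is to replace a
// hardcoded 'stackoverflow.com' check with a host test that also accepts
// *.stackexchange.com plus the sites that use their own custom domains.
// The custom-domain list here is just a few known examples; ideally it
// would come from the API's site list (see below).
const CUSTOM_DOMAIN_SITES = new Set([
  "stackoverflow.com",
  "serverfault.com",
  "superuser.com",
  "askubuntu.com",
  "mathoverflow.net",
]);

function isStackExchangeUrl(rawUrl: string): boolean {
  const host = new URL(rawUrl).hostname.replace(/^www\./, "");
  return host.endsWith(".stackexchange.com") || CUSTOM_DOMAIN_SITES.has(host);
}
```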

It would be cool if it worked for any/all of the StackExchange sites:

A number of these seem to be in the format: https://FOO.stackexchange.com/, though it seems that there are also a few that use their own custom domains

This answer seems to talk a bit more about it:

As well as suggesting that the API can also provide this information:

You can get a complete (and always up to date) list of Stack Exchange site URLs from the API.

By tweaking the default filter used, we can reduce the results to just the name, site_url, api_site_parameter, etc. that would be required to handle each site in an easy way:

A live example of the output from this can be seen at:

{
  items: [
    {
      aliases: [
        "https://www.stackoverflow.com/",
        "https://facebook.stackoverflow.com/"
      ],
      site_url: "https://stackoverflow.com/",
      api_site_parameter: "stackoverflow",
      name: "Stack Overflow"
    },
    {
      site_url: "https://serverfault.com/",
      api_site_parameter: "serverfault",
      name: "Server Fault"
    },
    {
      site_url: "https://superuser.com/",
      api_site_parameter: "superuser",
      name: "Super User"
    },
    {
      site_url: "https://meta.stackexchange.com/",
      api_site_parameter: "meta",
      name: "Meta Stack Exchange"
    },
    // ...etc...
  ]
}
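
For reference, a minimal sketch of fetching that list in TypeScript (the filter value below is a placeholder; a real one would be generated via the API's filter tooling):

```typescript
// Sketch only: pull the full site list from the Stack Exchange API.
// 'FILTER_PLACEHOLDER' is not a real filter -- a real one that keeps just
// name, site_url, api_site_parameter (and aliases) would be generated via
// the API's filter builder.
interface StackExchangeSite {
  name: string;
  site_url: string;
  api_site_parameter: string;
  aliases?: string[];
}

async function fetchStackExchangeSites(
  filter = "FILTER_PLACEHOLDER"
): Promise<StackExchangeSite[]> {
  const sites: StackExchangeSite[] = [];
  let page = 1;
  let hasMore = true;
  while (hasMore) {
    const res = await fetch(
      `https://api.stackexchange.com/2.3/sites?pagesize=100&page=${page}&filter=${filter}`
    );
    const data = await res.json();
    sites.push(...(data.items ?? []));
    hasMore = Boolean(data.has_more);
    page += 1;
  }
  return sites;
}
```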
bramses commented 2 years ago

This is a great idea! It seems they all follow a similar structure as well, so it might be straightforward to apply similar HTML parsing techniques.

0xdevalias commented 2 years ago

Yeah, those were my thoughts too. I hadn't actually looked deeper into the parsing side of things, but I assumed they're all deployed on basically the same core 'product', just with different URLs/styling/etc.

As an MVP hack it might be enough to just swap out the couple of hardcoded references for one of the other sites and see if it still works.

As for supporting all the sites, I'm not sure of the best approach, but I would probably hit the API to get the data and then cache it somewhere (I assume Obsidian plugins can support something like that?) so that it wouldn't need to be looked up each time. Alternatively it could be 'baked in' to the code, but then it would need to be updated if/when any of the sites change, which might be more maintenance effort than it's worth.
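
If it helps, here's a rough sketch of the caching side, assuming the plugin extends Obsidian's `Plugin` class and reusing the hypothetical `fetchStackExchangeSites` helper and `StackExchangeSite` type from the snippet above (`loadData`/`saveData` persist to the plugin's data.json):

```typescript
import { Plugin } from "obsidian";

// Sketch only: cache the Stack Exchange site list in the plugin's data.json
// and refresh it once a week, rather than hitting the API on every lookup.
// 'StackExchangeSite' / 'fetchStackExchangeSites' are the hypothetical
// type/helper from the earlier API snippet; the refresh interval is arbitrary.
const ONE_WEEK_MS = 7 * 24 * 60 * 60 * 1000;

interface SiteCache {
  fetchedAt: number;
  sites: StackExchangeSite[];
}

export default class StackExchangePlugin extends Plugin {
  async getSites(): Promise<StackExchangeSite[]> {
    const cached = (await this.loadData()) as SiteCache | null;
    if (cached?.sites && Date.now() - cached.fetchedAt < ONE_WEEK_MS) {
      return cached.sites;
    }
    const sites = await fetchStackExchangeSites();
    // NOTE: saveData replaces the whole data.json, so a real implementation
    // would merge this with any existing settings the plugin stores there.
    await this.saveData({ fetchedAt: Date.now(), sites });
    return sites;
  }
}
```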