code4recovery / 12-step-meeting-list

WordPress plugin for managing a list of weekly recovery meetings
https://wordpress.org/plugins/12-step-meeting-list/
GNU General Public License v2.0

MAJOR PROBLEM when refreshing Data Sources under Import & Settings #255

Closed Area48Webmaster closed 3 years ago

Area48Webmaster commented 3 years ago

When I hit the refresh under Import & Settings, which I do every night, tonight I got this response to all 1,160 meetings and we currently have nothing listed on the website. Please help!

_Google gave an unexpected response for address 362 State St., Albany, NY, 12210, United States. Response was (object) array( 'error_message' => 'This IP, site or mobile application is not authorized to use this API key. Request received from IP address 35.208.14.103, with empty referer', 'results' => array ( ), 'status' => 'REQUEST_DENIED', )_
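The error above follows the Google Geocoding API response envelope: a `status` field, a `results` array, and an optional `error_message`. A minimal sketch (in Python; the helper name is hypothetical, not plugin code) of checking the status before trusting the results:

```python
# Check a Google Geocoding API response envelope before using it.
# Field names ("status", "results", "error_message") follow the
# response shown in the error above; the helper name is hypothetical.

def parse_geocode_response(payload: dict):
    """Return (ok, results_or_error) for a geocoding response dict."""
    status = payload.get("status")
    if status == "OK":
        return True, payload.get("results", [])
    # REQUEST_DENIED, OVER_QUERY_LIMIT, etc. carry no usable results;
    # surface the error message (or the bare status) instead.
    return False, payload.get("error_message", status or "UNKNOWN")

denied = {
    "status": "REQUEST_DENIED",
    "results": [],
    "error_message": "This IP, site or mobile application is not "
                     "authorized to use this API key.",
}
ok, detail = parse_geocode_response(denied)
```

A `REQUEST_DENIED` status like the one reported here points at the key itself (disabled, unauthorized, or over quota), not at the addresses being submitted.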

tech2serve commented 3 years ago

Hey Tim, Josh and everyone else contributing to this project,

First off... thank you from the bottom of my heart for all the time and effort you folks have put into this plug-in. It is truly a wonderful resource for so many in the recovery community. You guys deserve lots of props and no grief.

I ran into this issue on Friday night, and had to restore our website from an old backup. Note to self... backup website more often!!

I did try using our own Google Maps API Key in line 585, but ran into a different set of error messages. Since I don't want the website to be down while I figure this out, I'll have to wait until after hours to work out whether our own API key is working.

What sort of errors? Had you enabled the geocoding API for the key?

When you paid the bill (thank you for that), I was able to re-import our meeting list into the plugin, which I was doing fairly often, ignorant to the fact that this was somehow using up some Google API credit somewhere. Sorry about that.

I'm not ready to call it a "fact" yet. We're still researching. But I am curious, when you do a full scale import, are you deleting all meetings as part of the process? Or, do you only update meetings that don't exist or have changed?

For now, I will try to avoid any full scale re-imports until I have a good solution to this situation. Ideally, I'll be able to figure out how to use our own API key for the geo-coding as well as the mapping.

With all that said, I want to understand if and why I should be avoiding re-importing our JSON feed into the plug-in. We keep our meeting list as a separate DB which we then build a JSON feed for consumption by the plug-in and the meeting guide app. Previous to this issue, I was perfectly comfortable re-importing the JSON feed ANY time there was an update to our external meeting list DB.

I'm not sure I can answer this accurately right now. I can offer my experience and opinion, which is that I favor a single database, and I have found that many of the support requests I've answered relate to importing CSV files.

To that end, we're getting close to announcing a new front end that will be able to take a feed from a database and display the meeting list in TSML or any other HTML-based website. Technically, this is available today for experimentation, and we're working on documentation to help people get started.

Was this causing every single address to go through the geo-coding on re-import?

Not sure. If the local cache is available, it shouldn't.

Or only the addresses that hadn't gone through the geo-coding already?

That is how it should be working.

Because from the error messages that I saw when the credit had been used up, it seemed like every single entry being imported was causing an error message. As a result, the re-import resulted in NO entries in the plug-in's meeting list DB. The key was turned off. If you were deleting all data as part of the import, then what you described is as I would expect.

Could we possibly be the culprit who is sending malformed addresses through?? I don't believe so, but how would I find out?

Very unlikely.

But I am exploring the possibility of a single website (inadvertently, I hope) causing all of the errors we see in our Google console. Tomorrow, when Google support is open, I'll be contacting them to try and determine this.

Here is our JSON feed: https://suffolkny-aa.org/meeting-guide.php
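One way to rule out malformed addresses in a feed like this is a quick sanity check before import. A hedged sketch (the required-field list below is illustrative only, not the authoritative feed specification):

```python
# Illustrative pre-import sanity check for a meeting feed: flag
# entries with empty or missing fields that geocoding would need.
# The REQUIRED tuple is an assumption, not the official spec.

REQUIRED = ("name", "day", "time", "address")

def find_bad_entries(meetings):
    bad = []
    for i, m in enumerate(meetings):
        missing = [f for f in REQUIRED if not m.get(f)]
        if missing:
            bad.append((i, missing))
    return bad

feed = [
    {"name": "Serenity Group", "day": 1, "time": "19:00",
     "address": "362 State St., Albany, NY"},
    {"name": "No Address Group", "day": 2, "time": "18:00",
     "address": ""},
]
```

Running such a check against the feed would show whether any entries could be sending empty or garbled addresses to the geocoder.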

I can't imagine that we are the only intergroup that would prefer to keep our DB separate from the plug-in and importing the list when necessary.

Any info would be greatly appreciated.

OSAwebsite commented 3 years ago

Good luck with Google tomorrow!!

arbliss commented 3 years ago

Hey Tim,

Is there a way to not delete the whole database when refreshing from a data source (json feed)?



HowardL-md commented 3 years ago

@tech2serve re: But I am curious, when you do a full scale import, are you deleting all meetings as part of the process? Or, do you only update meetings that don't exist or have changed?

I can say that for me, I would do around 4 meeting updates to our .json and then refresh at the plugin, and it must have been all the 109 meetings failing the geolocate because it took many page-downs to get to the bottom of the box.

tech2serve commented 3 years ago

> @tech2serve re: But I am curious, when you do a full scale import, are you deleting all meetings as part of the process? Or, do you only update meetings that don't exist or have changed?
>
> I can say that for me, I would do around 4 meeting updates to our .json and then refresh at the plugin, and it must have been all the 109 meetings failing the geolocate because it took many page-downs to get to the bottom of the box.

I should take this moment to clarify something. There are two different types of imports. One is from a CSV file, and the other is using a JSON feed. Both approaches should use cached addresses (think of the location already being in the database) or overrides first. The JSON feed import shouldn't be removing all the data first, but I'll need to confirm that. In any case, the cache should still be present.

If someone is using the CSV import option, they choose between three modes: deleting all records first (really only needed when restoring a site), adding to the existing set (most likely when the new data set covers a different region and doesn't duplicate the existing meetings), or replacing only the entries that appear in the CSV (useful for importing just the updated entries). Used correctly, that last option should be the quickest and possibly the safest. In any case, unless you've reset your site, addresses in the existing data set should still be present, and a new geocoding call shouldn't be required.
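The three CSV import modes can be sketched like this (hedged Python sketch; the mode names and keying scheme are illustrative, not the plugin's actual option labels):

```python
# Illustrative model of the three CSV import modes, with meetings
# keyed by a unique slug. Names here are assumptions for the sketch.

def import_meetings(existing: dict, incoming: dict, mode: str) -> dict:
    if mode == "delete_all":        # wipe and reload (rarely needed)
        return dict(incoming)
    if mode == "add":               # append a non-overlapping region
        merged = dict(existing)
        merged.update({k: v for k, v in incoming.items()
                       if k not in existing})
        return merged
    if mode == "replace_matching":  # update only entries in the CSV
        merged = dict(existing)
        merged.update(incoming)
        return merged
    raise ValueError(f"unknown mode: {mode}")

existing = {"a": 1, "b": 2}
incoming = {"b": 3, "c": 4}
```

Under this model, `replace_matching` keeps untouched meetings (and their cached coordinates) in place, which is why it avoids fresh geocoding calls.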

Unless someone here is seeing a lot of errors (currently) on their site when they're importing, I would suggest continuing to do business as you have been. If you're able to use your own key on line 585 as a temporary solution, that will help.

cc: @arbliss

SIA-WEBSUP commented 3 years ago

Tim,

I don't delete any entries when I re-import from the JSON feed. I had always hoped that only the deltas need to be run through a lot of processing, but I can't be sure.

I might also mention that many months ago, in the April - May time frame, I ran into some bugs when updating existing meetings. I'm trying to remember the exact problem, but it basically involved the meeting notes not being updated correctly. At the time, I had to delete all meetings, being sure to empty the trash bin, and do a full re-import.

Sometime later, these bugs seemed to be fixed and I no longer had to delete all meetings before re-importing them.

When I have some time I would like to get our API key working instead of the hardcoded one.

When you do have time to review the code, please let me know if the geocoding code is being hit for every entry on re-import regardless if it is a new/changed entry.

mikepoconnor4 commented 3 years ago

Just wondering if you have a rough ETA on when you'll be announcing your future plans for handling this issue.

tech2serve commented 3 years ago

Hi @mikepoconnor4,

Before I take a crack at answering your question, is your site being affected today by the issue? If so, please explain the impact so I can see if we can address it in short order.

Members of our team met over the weekend and we decided on two (or three) paths to pursue.

  1. Write a service (called an API gateway) to process geocode requests from TSML before sending them to Google for action. This accomplishes several things for our TSML community. First, it allows us to validate the format of requests before sending them on (and getting charged). Second, it will let us audit usage better and work with sites from which we're seeing a high number of requests (or errors). The current setup does not let us do that today. And third, it will let us explore, in a controlled manner, the use of other less expensive geocoding services (such as Mapbox and the Census Bureau) without impacting users.
  2. The second path we're working on is to identify the other sites generating errors. We found the big one at the end of last week, and that has dropped off the Google request rate significantly, but likely not enough to prevent us from getting charged for the month of January. Nor do I think finding the other error generators will significantly change that.
  3. The third path is my least favorite because the potential pitfalls are significant. However, if the two paths above aren't sufficient (or cannot be implemented quickly enough) we may be forced down this path, even if only temporarily. This approach requires users to set up their own key for geocoding. This could be Google (the most tested service) or Mapbox (least tested code; currently being written). Unfortunately, setting up a Google API key for geocoding is daunting for many users. Heck, even us volunteers experienced difficulties getting it to work.
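The gateway idea in the first path can be sketched as follows (hedged Python sketch; the real service did not exist at the time of this thread, the forwarder is stubbed, and all names are hypothetical):

```python
# Illustrative API-gateway behaviour: validate the request and record
# per-site usage before forwarding to a paid geocoding service.

from collections import Counter

usage = Counter()

def gateway_geocode(site: str, address: str, forward):
    usage[site] += 1                 # audit usage per requesting site
    if not address or not address.strip():
        # Reject malformed requests locally: no charge incurred.
        return {"status": "INVALID_REQUEST", "results": []}
    return forward(address)          # only well-formed requests cost money

forwarded = []
def fake_forward(addr):
    forwarded.append(addr)
    return {"status": "OK", "results": [{"formatted_address": addr}]}

gateway_geocode("example.org", "", fake_forward)   # rejected locally
gateway_geocode("example.org", "362 State St., Albany, NY", fake_forward)
```

The per-site counter is what enables the auditing described above: a site generating an unusual volume of requests or errors becomes visible before the bill does.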

That is an update on where we stand. Today, if larger, more active sites are able to use their own Google key for the short term, that will undoubtedly help us prevent charges. To do this, they would need to set up their own Google API key for their site, turn on the geocoding API for that key, replace our hard-coded key (line 585 in functions.php), and test to make sure it is working. If this needs to remain an option after our next update (which currently would overwrite the functions.php file), Code for Recovery will strive to provide a way for users to configure this more permanently for their sites.

As a final note, we are also discussing re-establishment of a 501c3 so we can accept donations (7th Tradition and all). Because of the overhead, we are reluctantly pursuing this. But we also realize it is wiser to have it in place should we need it rather than to scramble if it becomes urgent.

I hope that answers the question without overwhelming you. I tend to favor more transparency when it is safe and prudent to do so. If you have any questions specific to our paths, I will strive to answer them.

On behalf of Code for Recovery, Tim

OSAwebsite commented 3 years ago

Hi Tim, and others working on the plugin,

That sounds like a thorough and well planned list of strategic items.

I have no issues anymore right now so this is not an active ticket for this issue. I would like to give some background about the batch imports, from my own experience.

When I started to use the TSML plugin for the first time (early 2020), I was irritated by the time it took to update a meeting. This had nothing to do with the plugin itself, but stemmed from standard Wordpress behaviour. When I had a batch of meetings to update, it just took so much time to first look them up in the meeting list, then go to detail, edit and save and rinse and repeat.... Patience is not my best virtue.

So I decided that a batched re-import was 'much quicker' for me. Then I had new problems.

  1. The two delete options in the Import section of the plugin did not work reliably for me. I got many duplicate entries.
  2. When I tried to delete the entire content of the database myself before doing the import, using WordPress batched delete, WordPress did not allow me to do that quickly and reliably either. Sometimes my whole website would stall and GoDaddy would tell me that I had gone beyond my plan's maximum number of page hits or something like that.

The import, once I got to that stage, always worked very well and amazingly quickly. A re-import of the complete list of well over 300 meetings was done within 5 minutes. I could not possibly be that fast using the standard Wordpress edit options.

But...

As I continued to work with the plugin, refreshing the whole set did not feel like a good solution anymore. I questioned myself to find out what the main source of my irritation was. And it was not the small edits I had to make online. It was the time it took to pull up the relevant pages on the website.

By that time, I had my entire list of meetings stored in an off-website database solution that is now in use for much more than just the website. That database has the option of storing a dynamic value that represents the URL to access the edit mode of a particular meeting. What I needed for that was the unique ID for each meeting. There is a plugin that reveals that ID on each post and page in Wordpress. Once I had added that ID to all my listings, I was able to use the 'open URL' functionality to pull up the page for editing.

And it is fast, just the way I like it. It literally takes no more than one or two seconds to load the page, no search in Wordpress required. I go right there, I make my edit, save it and I am done within the minute most of the time. Doing this manually with the Wordpress interface takes me much much longer.
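The trick described above relies on WordPress exposing a direct admin edit link for any post ID, so storing the ID in an external database gives a one-click jump to a meeting's edit page. A minimal sketch (the domain is a placeholder):

```python
# Build the WordPress admin edit URL for a post, given its unique ID.
# This is the standard wp-admin edit link pattern; the site domain
# below is a placeholder, not the commenter's actual site.

def wp_edit_url(site: str, post_id: int) -> str:
    return f"{site}/wp-admin/post.php?post={post_id}&action=edit"

url = wp_edit_url("https://example.org", 1234)
```

Storing this URL (or just the ID) per meeting in the off-site database is what lets the edit page load in a second or two, with no search in WordPress required.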

From my experience most of my edits merely have to do with the electronic meeting details. The rest is pretty stable. Not many meeting locations change, neither do the meeting type and focus and the accessibility of a meeting. Once that is on file, it tends to be stable information. So why do a complete database refresh when I can copy and paste changes from my database to the website?

I can imagine, as I have stated before, that a batch refresh might become necessary when all of a sudden a large number of groups goes back to physical meetings. Then it might be quicker to select those meetings in the back-end meeting list, delete them and do an import.

But for now, my editing is quick and neat and I will stick to it.

One question about your findings about the API issue: were you able to find out at which point the geocode checks are done? It would be nice to know that they are just done at the first time listing or when fields are adjusted that actually have something to do with the address.

All else might be overkill.

Good luck with the new coding and just take your time with it. Hasty code is usually fickle...

Best, Nicky

tech2serve commented 3 years ago

@OSAwebsite/Nicky--

Your situation may be a good fit for our new front end (looks like we're going to call it TSMLui, short for 12 Step Meeting List User Interface) if you're not wed to having a WordPress site for other reasons (eventually, this will replace the current WordPress front end).

If you have the right export (PHP) set up from your back office database, you can have a meeting finder on any website. We're working on documentation, but feel free to explore it if you want. We don't plan to announce it more widely until we get that together, and our support bandwidth might be a bit limited right now :-)

mikepoconnor4 commented 3 years ago

Our site is fine and we made no changes. I was just asking from a planning perspective.

Btw - you guys are great and put in a lot of hard work and it's much appreciated.

sieis commented 3 years ago

Hey y'all, If we've already got our own Mapbox API key plugged into the Import & Settings for the Meeting Guide wordpress plugin TSML, are we good, or is there another piece specific to a Google Maps key that you're requesting we get too? Probably a noob question; thanks for the help as I'm just familiarizing myself with both our current Intergroup's site and redesign underway as well as the various meeting guide options. Looking forward to trying out the new interface @tech2serve mentioned too for my own home group's site. Thanks again!

tech2serve commented 3 years ago

Hi @sieis,

Prior to 2018, the plugin used a single Google API key for both mapping and geocoding (two separate services). When Google raised their prices and changed the billing structure to require an account with a credit card, we split the services to two keys and added the option for users to enter a Mapbox key for the mapping service; geocoding is still handled through Google.

The Mapbox API key you have plugged into your site handles mapping only. We have continued to provide the API key for geocoding. The issue in this thread only involves the geocoding key.

At this stage, if your site is working as you'd expect, please don't bother trying to set up a new key unless you really want to do so.

I also need to keep asking (sometimes I feel like a broken record :-) ) for people to not refer to 12-step-meeting-list (TSML) as "Meeting Guide". The latter is GSO's (their trademark) mobile app for receiving, processing and displaying meeting information to end users. Code for Recovery and TSML are separate organizations and tools. Thanks for your understanding.

sieis commented 3 years ago

Hi @tech2serve, Thanks for the clarification--makes sense now! And noted for the wording--original comment edit. Thanks!

tech2serve commented 3 years ago

If you are using our geocoding API key (i.e., you have not put your own Google key into the functions.php file), please update to v3.9.6 asap. We're going to have a hard time not getting charged again, and I may have to shut down the original key.

Thank you for your understanding, and help to us in getting through this challenge.