hashplan opened this issue 10 years ago
Currently the crawler matches by event name, venueId, and event datetime. We could try matching by the stubhub_url field instead; in that case we would be able to change all the other fields of an event on our end.
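For illustration, a minimal sketch of what matching by stubhub_url could look like, assuming CodeIgniter's query builder and an `events` table with a `stubhub_url` column (both names are guesses at the real schema):

```php
<?php
// Hypothetical sketch: match an incoming crawled event by stubhub_url
// instead of name + venueId + datetime.
$stubhub_url    = 'http://www.stubhub.com/some-event/'; // from the crawled page
$crawled_fields = array('name' => '...', 'venueId' => 1, 'event_datetime' => '...');

$existing = $this->db
    ->get_where('events', array('stubhub_url' => $stubhub_url))
    ->row();

if ($existing) {
    // the event is already known: update it in place, id stays the same
    $this->db->update('events', $crawled_fields, array('id' => $existing->id));
} else {
    // genuinely new event
    $this->db->insert('events', $crawled_fields);
}
```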
1) I have not found a way to connect a metroarea to a city. Was this done manually before? 2) How will the admin user set the venue: by selecting from a list pre-filtered by metroarea, or by filling out a form?
1) The metroarea-to-city mapping was done "manually", but it is a one-time setup (new cities do not pop up frequently).
2) I would say probably by filling out the form, but it would be helpful to also provide a dropdown of what is already available, to make sure we do not end up with cases like "Yankees Stadium" vs "yankeesstadium", etc.
Regarding your earlier comment about editing all fields of an event based on the stubhub_url: could this cause cases where we change the venue as admin, and the next time the crawler runs it writes the event to the database again because the venue no longer matches (even though the event name and datetime do)? We would then have the same event on the same date in two different venues showing up to users on the front end.
Also, to prioritize: letting the admin enter events into the main list of events is more important than editing existing StubHub events. The admin would add events they think would be of interest to all users. These events might not necessarily (ever) show up on StubHub; they might be more local events in smaller venues (like a local bar or restaurant holding a special event).
1) There are 610 cities in the `cities` table that are not related to a metroarea.
2) I propose to implement it this way:
The main event fields (name, etc.) remain plain form text fields.
To choose a venue:
Choose metroarea: {selectbox}
Choose venue: {selectbox}
(each selectbox with autocomplete along the lines of http://jqueryui.com/autocomplete/#combobox)
The list of venues is filtered as the metroarea is chosen; this can be enhanced further in the future. A rough sketch of the server side of that filtering follows.
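A sketch of the endpoint behind the filtering, assuming CodeIgniter and a `venues` table with a `metroarea_id` column (both names are guesses). The autocomplete combobox would call this and render the returned JSON as its options:

```php
<?php
// Return the venues for one metroarea as JSON for the autocomplete widget.
public function venues_by_metroarea($metroarea_id)
{
    $venues = $this->db
        ->select('id, name')
        ->where('metroarea_id', (int) $metroarea_id)
        ->order_by('name')
        ->get('venues')
        ->result();

    $this->output
        ->set_content_type('application/json')
        ->set_output(json_encode($venues));
}
```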
I propose to create an interface for managing the cities, metroarea, and venue entities. As an enhancement on the dashboard, I can display, for example, the number of cities not related to a metroarea, so the admin will be able to manually walk through after the crawler runs and correct the info.
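A sketch of that dashboard check, assuming a nullable `metroarea_id` column on `cities` (the column name is a guess):

```php
<?php
// Count cities that were crawled but never linked to a metroarea.
$orphan_cities = $this->db
    ->where('metroarea_id IS NULL', null, false)
    ->count_all_results('cities');
// $orphan_cities would be 610 at the time of this comment.
```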
We will take a look at 1).
For 2) it is a bit trickier, because the list of venues we have in the database is based only on what we get from StubHub crawling. There could be a lot of other venues we might want to add via the manual admin steps that do not exist on StubHub; that is why I suggested the free-text form field (the dropdown will still help when the venue already exists, so we don't have to worry about spelling it differently, etc.).
Some of the enhancements you mention sound good to list: things that need to be checked, like cities without metroareas, etc.
Also, for the crawler page, could we add the last time it ran and the number of events added? I am also not sure if there is a way to check whether the crawler fully succeeded. Say the crawler starts running, adds 2 new events, but then fails for some reason and does not add the remaining 1000 events that StubHub has. Is there a way for us to check that it scanned all pages and finished, or whether it broke and failed somewhere in between?
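One possible mechanism would be a run-log table the crawler writes to at start and at successful completion. This is purely a hypothetical sketch; the `crawler_runs` table and all of its columns are invented for illustration:

```php
<?php
// At crawler start: open a run record.
$this->db->insert('crawler_runs', array(
    'started_at' => date('Y-m-d H:i:s'),
    'status'     => 'running',
));
$run_id = $this->db->insert_id();

$events_added  = 0;
$pages_scanned = 0;
// ... crawl every page, incrementing the two counters ...

// Only reached if nothing fatal interrupted the loop above.
$this->db->update('crawler_runs', array(
    'finished_at'   => date('Y-m-d H:i:s'),
    'status'        => 'finished',
    'events_added'  => $events_added,
    'pages_scanned' => $pages_scanned,
), array('id' => $run_id));
// Any old row still marked 'running' with no finished_at means a run
// died partway through, which is exactly the failure case asked about.
```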
I'm writing the code to manage entities. Currently there are lists: users (with full management), events (with adding a new event), and complete management of metroareas. I did not answer the question about the crawler because I'm still thinking, when I have spare time, about how to improve how it works. I've got a couple of ideas, but I need to check them. As for checking whether the crawler ran to completion or not, I have not figured out how to do that yet.
Roma, we need to create a dashboard for admins: all the functionality a regular "member" has in the top navbar, plus a dashboard tab (I sort of started that in the old version with the listing of all users).
I kind of like the layout of this: http://getbootstrap.com/examples/dashboard/ without the left sidebar, so we can start with it.
The dashboard will have a few numbers in the top circles: the number of users, then the number of total events (all in the future, so events with dates that have passed are not counted), then perhaps the number of events added by users, etc. The numbers in the circles should be links, so that once the admin clicks on a number, the table under the "section title" populates with the appropriate data.
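For example, the first two circle numbers could be computed like this (a sketch reusing the table names from this thread; `event_datetime` is the column name assumed earlier in the discussion):

```php
<?php
// Circle 1: total registered users.
$user_count = $this->db->count_all('users');

// Circle 2: events in the future only, per the note about past dates.
$future_event_count = $this->db
    ->where('event_datetime >=', date('Y-m-d H:i:s'))
    ->count_all_results('events');
```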
The events one is the more critical at this point. We need the ability (as an admin) to add public events to the main list of events for all users. Basically, the crawler does not cover every event under the sun, so if we find something really interesting in a certain metroarea, we want to be able to add it to the list. This will probably be a bit more complicated than user-entered events, because we will need the ability to specify the city, metroarea, venue, and other details, so we will need to make sure we write the appropriate pieces of the input to the appropriate existing tables.

For a new event, there can be an empty row in the table of all events displayed on the page. This would be similar to what you did with the user account settings page: all events come back with editable fields, and each one has a save/delete button next to it. For new events the save will insert; for existing events it will edit/update. This also gives us the ability to edit all events (crawled and user-entered). Obviously the ids will stay the same, but if we find out about an event changing location or time before StubHub does (or before the crawler reruns), we need to be able to edit it on our site.

This gets complicated, because we have to make sure the next time the crawler runs it does not duplicate an event we edited (this depends on what the crawler matches against existing events in the database before writing what it thinks is a new event). Could you outline what it checks/matches before writing to the database?
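The save button behind each editable row could work along these lines (a sketch: an id means update, no id means the "empty row" insert case; field names are placeholders and validation is omitted):

```php
<?php
public function save_event()
{
    $fields = array(
        'name'           => $this->input->post('name'),
        'venue_id'       => $this->input->post('venue_id'),
        'event_datetime' => $this->input->post('event_datetime'),
    );
    $id = $this->input->post('id');

    if ($id) {
        // existing event (crawled or user-entered); the id stays the same
        $this->db->update('events', $fields, array('id' => (int) $id));
    } else {
        // new admin-entered event
        $this->db->insert('events', $fields);
    }
    redirect('admin/events');
}
```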
For the users list, we would need the first name, last name, and email, plus a few of the ion_auth functions like edit group, deactivate, etc.
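A sketch of the per-user actions: `users()`, `deactivate()`, and `add_to_group()` are standard ion_auth methods, while the controller wiring around them is hypothetical:

```php
<?php
public function users()
{
    // each ion_auth user row already carries first_name, last_name and email
    $data['users'] = $this->ion_auth->users()->result();
    $this->load->view('admin/users', $data);
}

public function deactivate_user($id)
{
    $this->ion_auth->deactivate((int) $id);
    redirect('admin/users');
}

public function set_group($user_id, $group_id)
{
    $this->ion_auth->add_to_group($group_id, $user_id);
    redirect('admin/users');
}
```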
This will probably require pagination (500-1000 per page?). Columns should be sortable.
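CodeIgniter's stock Pagination library could cover this; a sketch with the 500-1000 per-page figure from above (URLs and the URI segment index are placeholders):

```php
<?php
$this->load->library('pagination');

$config['base_url']   = site_url('admin/users/page'); // url helper assumed loaded
$config['total_rows'] = $this->db->count_all('users');
$config['per_page']   = 500;
$this->pagination->initialize($config);

$offset = (int) $this->uri->segment(4);
$users  = $this->db->get('users', $config['per_page'], $offset)->result();
```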
Let me know your thoughts. This will probably require more intimate work with the crawler, but I am hoping we can do this in 1-2 days.