Closed: pmackay closed this issue 9 years ago.
Is this a feature that would make sense for Ohana Web Search or the Admin interface in Ohana API? (Or conceivably both, I guess.)
A few questions and comments:
What is the expected output? A single flat CSV file that includes the associated tables as well? If so, note that the API only returns a small subset of the data in search results by default, as explained in the documentation. Also keep in mind that the API returns only 30 results at a time by default, but you can auto-paginate using the Ohanakapa Ruby wrapper.
Since the API already includes some of the associated tables in the JSON response, the easiest approach is to parse the JSON and create the CSV from that. If you need more information in the CSV file than the API returns, you will need to build a custom CSV file in the API app from what the database returns. Depending on what you're trying to do, this can get complicated and messy, but it should be doable with the csv_shaper gem that's already included in the API. Take a look at any of the files in https://github.com/codeforamerica/ohana-api/tree/master/app/views/admin/csv for an example of output customization.
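To illustrate the parse-the-JSON approach, here is a minimal stdlib-only sketch. The field names (`name`, `organization`, `email`) are assumptions for illustration, not the exact Ohana API schema; check the API documentation for the real response shape.

```ruby
require "json"
require "csv"

# Hypothetical excerpt of an Ohana API search response.
# The field names here are illustrative assumptions, not the real schema.
response = <<~JSON
  [
    {"name": "Food Pantry", "organization": {"name": "Example Org", "email": "info@example.org"}},
    {"name": "Job Center", "organization": {"name": "Another Org", "email": "contact@another.org"}}
  ]
JSON

locations = JSON.parse(response)

# Flatten each location (and its nested organization) into one CSV row.
csv = CSV.generate do |rows|
  rows << %w[location_name org_name org_email]
  locations.each do |loc|
    org = loc["organization"] || {}
    rows << [loc["name"], org["name"], org["email"]]
  end
end

puts csv
```

This is the "single flat CSV" shape: each nested association becomes one or more extra columns, which is manageable for a couple of fields but degrades quickly as more tables are pulled in (see the sample linked below).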
If all you need in the CSV file is what the API currently returns, then it makes most sense to add this feature in an app that consumes the API, such as ohana-web-search, or your own standalone project (for example a single-page app with a search form that downloads a CSV file).
If you require the CSV file to contain information from all the tables, it can get complicated and messy. For example, here is a sample CSV file based on the JSON response for the demo location: https://json-csv.com/c/PfnQ. I find that CSV file to be extremely hard to read.
If instead of just one CSV file, you require a set of CSV files (like an HSDS set), then that's even more complicated, and I would say outside the scope of this project (i.e. I'm not planning on building that on my own).
Since the API results are public, there's no need for any logins or permissions. Anyone and everyone should be able to get access to the data returned by the API.
I'm not entirely sold on the value of this feature for a majority of deployments (and don't have time to spend on it), so I would suggest you implement it in your deployment, or as a standalone project, see how it works, and if you think it makes sense to include in the project, submit a PR.
Many thanks for the responses.
The intent was to allow users to filter a list and then download a CSV of emails, potentially along with some other relevant info like names or org names (yet to dig into that). The key thing is having a list of emails that could be imported into a mailing list, CRM, etc.
Thanks for the good info on approaches. We need to consider further whether org emails alone would suffice or whether related contact information is also important (which would presumably affect the points about related tables). Initially the summary should be enough, as org emails are included.
I'm assuming this would be better in the web search UI, given that it has means to filter by categories, location, etc., whereas the admin UI is more focussed on editing.
If the intent is to collect org emails, I would not put this in the web search UI because that feature would only be relevant to a few people. It sounds like a better fit in the admin interface or as a standalone project or script. Presumably, the filters would be predetermined, so you could create a bunch of different queries and a link to download a CSV file for each query. It would be more efficient to query the DB directly and extract org emails than to make an API call.
Also, if the intent is to subscribe people to a mailing list, I would make sure to allow them to opt in as opposed to automatically subscribing them and making them opt out.
Thanks for explaining the intent. This is more of a custom need so I'm closing this.
FWIW, I suspect this is a feature that many would find useful, although perhaps not an urgent need for any of Open Referral's primary types of users (help seekers, providers, and researchers). @pmackay if you all do work on it, please do keep us posted!
@monfresh FYI, the filters would not be predetermined. That's why I think the web UI could be better for this: content can be filtered there by search or by taxonomy fields, which isn't supported in the admin UI.
The fact that the admin interface does not currently provide a search form does not mean it's not supported :smile: Adding search to the admin interface should be fairly straightforward. The hard work has already been done. At a basic level, what you would need to do is add some HTML for the fields you want to search on, and add a new controller action that calls Location.search(params).
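As a rough sketch of such an action: the snippet below stubs `Location.search` with an in-memory stand-in so it runs on its own (in ohana-api the real `Location.search(params)` hits the database), and the `search_csv` method name, route, and fields are all assumptions, not existing code. In a real Rails controller you would hand the generated string to `send_data`.

```ruby
require "csv"

# Stand-in for the real ActiveRecord model, so this sketch is self-contained.
# In ohana-api, Location.search(params) already exists and does the heavy lifting.
class Location
  Record = Struct.new(:name, :email)

  def self.search(params)
    all = [Record.new("Food Pantry", "info@example.org"),
           Record.new("Job Center", "contact@another.org")]
    return all unless params[:keyword]
    all.select { |loc| loc.name.downcase.include?(params[:keyword].downcase) }
  end
end

# Hypothetical controller-action body: run the search, shape results as CSV.
# In Rails you would finish with: send_data csv, filename: "locations.csv"
def search_csv(params)
  results = Location.search(params)
  CSV.generate do |csv|
    csv << %w[name email]
    results.each { |loc| csv << [loc.name, loc.email] }
  end
end

puts search_csv(keyword: "food")
```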
My argument against adding this to ohana-web-search wasn't so much because of predetermined queries, but rather because this feature is not meant for the typical visitor to ohana-web-search. As I understand it, it's a need expressed by the data administrators.
My point about the predetermined queries was that if there is only a small subset of queries, the admin UI could display buttons/links that download CSV files based on those queries, as opposed to a freeform search field that allows for ad hoc searching.
Now that you mention that the filters would not be predetermined, I'm even more intrigued. Could you please further explain the ultimate purpose of this feature? If CSV files can be downloaded based on any search query, you could end up with hundreds of CSV files with probable overlap (i.e. the same emails appearing in various CSV files). What are you planning to do with these emails that you collect? What is the value of having hundreds of different mailing lists?
The idea with emails is that grassroots community groups (such as those listed here: http://love.lambeth.gov.uk/mapping-lambeths-communities/) could be updating their info on community websites. We're experimenting with pulling this together using Ohana (hence the question about data import). People could get a CSV of groups and emails and merge it into their own lists. Many of these mailing lists maintained by the council are for different groups and audiences, but it is difficult and time-consuming (and thus expensive) to keep the info up-to-date.
Would appreciate comments on this idea. It's a feature request we have: once a filtered selection is made on the site, it could be possible to download a CSV file of the data. This might relate to https://github.com/codeforamerica/ohana-api/issues/326 and https://github.com/codeforamerica/ohana-api/issues/304.
This may need some level of login to the web search site, to support permissions for downloading the data. Has that ever been considered for the front end?