Conal-Tuohy opened this issue 10 years ago
This would be cool to have.
The "Add a new output format" documentation is for Omeka 1, but it might give you an idea of what you need to do.
Also, the `action_contexts` and `response_contexts` filters should be helpful.
I haven't actually added an Omeka content type, but IIRC, @clioweb has, and he might have more useful suggestions.
@erochest is right: the `action_contexts` and `response_contexts` filters are exactly what you need, in addition to creating the shared view for the output itself. In Neatline Time I did just this to generate a JSON output for each timeline, to use in the SIMILE timeline display.
Take a look at `NeatlineTimePlugin.php` and the view `items.neatlinetime-json.php` and see if that helps. Something similar for a Neatline exhibit, using KML instead of JSON, could be added to Neatline.
Meant to add this earlier. What would need to happen, I think, is:

1. Add `action_contexts` and `response_contexts` filters in `NeatlinePlugin.php`, where you'd add an output for 'kml'. I'd only do this on the "show" action. (A rough sketch of what the filters might look like follows below.)
2. Create `views/shared/show.kml.php` and build the KML output there. Here's where the fun work would happen.
3. The output would then be available at `examples.com/neatline-exhibits/show/exhibit-title/?output=kml`.

You might also look at some of the internal classes in Omeka for outputs.
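To make step 1 concrete, here's a minimal sketch of what the two filters might look like, assuming Omeka 2.x plugin conventions; the context key, content type, and controller class name are my assumptions, not Neatline's actual code:

```php
<?php
// Hypothetical sketch only -- not the existing NeatlinePlugin.php code.
class NeatlinePlugin extends Omeka_Plugin_AbstractPlugin
{
    // Register the two filters discussed above.
    protected $_filters = array('response_contexts', 'action_contexts');

    // Define a 'kml' response context (content type is an assumption).
    public function filterResponseContexts($contexts)
    {
        $contexts['kml'] = array(
            'suffix'  => 'kml',
            'headers' => array(
                'Content-Type' => 'application/vnd.google-earth.kml+xml',
            ),
        );
        return $contexts;
    }

    // Enable the 'kml' context on the exhibits "show" action only.
    // (The controller class name here is an assumption.)
    public function filterActionContexts($contexts, $args)
    {
        if ($args['controller'] instanceof NeatlineExhibitsController) {
            $contexts['show'][] = 'kml';
        }
        return $contexts;
    }
}
```

With something like that in place, Omeka's context switching should route `?output=kml` requests on the show action to a `show.kml.php` view script, as in step 2.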
+1. Would also be nice to be able to get everything out as GeoJSON.
Yup @davidmcclure! From a superficial glance, it seemed like GeoJSON and KML were pretty much the same in terms of data structure. Not sure if this is true, but if it is, it'd probably be better to just write an output class that could be used for both outputs, but one in JSON and one in XML.
Yeah, they are similar, and we could fully encode the Neatline data model (dates, colors, etc.), since both formats allow user-defined attributes (`<Data>` in KML, `properties` in GeoJSON). @clioweb, as for the implementation, I bet we could write an abstract `NeatlineExport` class, which would take care of all the format-independent stuff that would need to happen when exporting an exhibit (fetching records, looping over them, etc.). And then write concrete subclasses that would handle each record and build out the export document - `NeatlineExportKML`, `NeatlineExportGeoJSON`, etc.
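As a rough sketch of that idea (class and method names here are hypothetical, and the record-fetching query is an assumption rather than Neatline's actual API):

```php
<?php
// Hypothetical sketch of the abstract exporter idea -- not existing Neatline code.
abstract class NeatlineExport
{
    protected $exhibit;

    public function __construct($exhibit)
    {
        $this->exhibit = $exhibit;
    }

    // Format-independent driver: fetch the exhibit's records and hand
    // each one to the format-specific subclass.
    public function build()
    {
        // Assumed table and column names for fetching an exhibit's records.
        $records = get_db()->getTable('NeatlineRecord')
            ->findBySql('exhibit_id = ?', array($this->exhibit->id));
        foreach ($records as $record) {
            $this->addRecord($record);
        }
        return $this->getOutput();
    }

    // Format-specific: append one record (geometry, dates, colors, ...)
    // to the export document.
    abstract protected function addRecord($record);

    // Format-specific: serialize the finished document (KML or GeoJSON).
    abstract protected function getOutput();
}

// Concrete subclasses would then implement the two abstract methods:
// class NeatlineExportKML extends NeatlineExport { ... }
// class NeatlineExportGeoJSON extends NeatlineExport { ... }
```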
This would be an awesome feature, would make Neatline far more interoperable with other software.
Thanks for all these pointers!
I will follow up on `action_contexts` and `response_contexts` - but I have a question as to whether this approach is amenable to supporting not just HTTP GET but also PUT and/or POST. Are there analogous Neatline features for handling HTTP submission of different content types?
I had been looking at `Neatline_Controller_Rest` and thinking that dispatching on the HTTP Accept header belonged there (or in a subclass).
> but I have a question as to whether this approach is amenable to supporting not just HTTP GET but also PUT and/or POST? Are there analogous Neatline features for handling HTTP submission of different content types?
There aren't any features for this in Neatline per se, but there are in Omeka's API.
Omeka's REST API does look close to what I want! I can't immediately see how one can do HTTP content negotiation with the HTTP Accept header, but perhaps if I dig into it I will find something?
Anyway I have gone down a level and looked at Omeka's REST API. I read that plugins can extend this by defining new resource types, and I thought I would have a quick look at some of the Omeka instances in the showcase to see if Neatline resources appeared. I actually found that almost all the sites listed did not expose an API at all. All but one of the showcase sites returned 404s. Any ideas why that might be? There was just one site with an accessible API, and it did include Exhibits in the list of resource types, however, when I followed the link, the exhibits resource itself was broken, returning a 500 error. Is this supposed to work, currently? Or is Neatline's support for Omeka's REST interface not fully implemented?
So that does look promising, but on the other hand, I can't escape the feeling that this Neatline code is what's currently responsible for exchanging resource-representations of exhibits, and that this is where any KML I/O should properly go.
It's not obvious to me why Neatline's REST API is distinct from that of Omeka's, for that matter.
I'd appreciate a comment from someone familiar with the overall architecture, about how these APIs relate, if at all.
Incidentally, @davidmcclure, I don't think `kml:Data` is necessary to support dates and colours, as they are part of the KML native data model.
I have a question about the development process. Is this a good place to ask? Or is there a mailing list or something?
My question is how do I hack on Neatline: can I clone it directly into a running Omeka, install it, and then continue to edit it? Are there some kinds of edits which would require me to re-install it in Omeka?
@Conal-Tuohy You shouldn't need to reinstall Omeka; at worst you just need to create a new database. You should be able to just clone the repo into an Omeka instance and make edits. Depending on your workflow, @erochest has some scripts for setting up a dev environment (https://github.com/erochest/neatline.dev).
Just a heads up, this project relies on `composer` and `npm` to manage dependencies, and `grunt` for automation. There's a script (https://github.com/scholarslab/Neatline/blob/master/setup) to run to get the dependencies installed.
Hey @Conal-Tuohy,
Neatline's REST API is separate from Omeka's mainly for historical reasons - Omeka didn't have an API when Neatline 2.x was being developed. We might be able to fold it into Omeka at this point, but I'm not sure - Neatline has a couple of unusual requirements.
Either way, though, you shouldn't really need to do anything with the REST APIs - Omeka's `response_contexts` API should provide all the wiring you need to expose the KML/GeoJSON. Once the response context is set up, you can just provide a template that's named with the extension that you want to match (e.g., `kml` or `geojson`). In this case, since we just want to return a serialized version of the exhibit data, the templates will probably just need to contain a single call to a helper function that will construct the output document - `nl_serialize_exhibit('kml')`, or something to that effect.
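As a sketch of what such a template might look like (the path, the `$exhibit` variable, and `nl_serialize_exhibit()` itself are all hypothetical here):

```php
<?php
// views/shared/exhibits/show.kml.php -- hypothetical path and contents.
// The whole template is a single call to an (as yet unwritten) helper
// that serializes the exhibit into the requested format.
echo nl_serialize_exhibit($exhibit, 'kml');
```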
The only other plugin that we've worked on (to my knowledge) that does this is TEIDisplay, which might be a useful reference here. Check out the main plugin class for the action/response context definitions, and the `show.tei.php` file, which templates out the custom response format (in this case, a TEI document transformed into HTML).
... and while I'm at it, I want to view it as TEI and PDF as well. And I want the moon on a stick.
I am having a look at the issue myself. I don't know much about Zend, but I think the way to go is to dispatch on the HTTP "Accept" header to generate a representation of the exhibit in KML. KML has to be the most popular open data standard that closely matches the semantics of an Exhibit.
This should enable republishing of Neatline exhibits in Google Earth (to some degree), and in other forms, perhaps as PDFs? And it would enable Neatline exhibits to be archived with a well known content type.
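For what it's worth, a minimal sketch of Accept-header dispatch in a Zend Framework 1 style controller action might look like this; the controller class, action, view script name, and content type are assumptions, not existing Neatline code:

```php
<?php
// Hypothetical sketch of Accept-header content negotiation in a ZF1 action.
class NeatlineExhibitsController extends Neatline_Controller_Rest
{
    public function showAction()
    {
        // ... existing logic to load the exhibit would go here ...

        $accept = $this->getRequest()->getHeader('Accept');
        if ($accept && strpos($accept, 'application/vnd.google-earth.kml+xml') !== false) {
            // Serve the KML representation instead of the default HTML view.
            $this->getResponse()->setHeader(
                'Content-Type', 'application/vnd.google-earth.kml+xml'
            );
            return $this->render('show-kml'); // assumed view script name
        }

        // Otherwise fall through to the normal HTML rendering.
    }
}
```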