[Open] hlapp opened this issue 10 years ago
I've thought about this a bit and here is how I would see it. First, forget the uncontrolled beach setting and focus on the zoo or the botanical garden. Second, generate a QR code for each species and put it on the placards used to identify organisms in the zoo or garden. This way users can collect electronically encoded species names using their smart device and your web server. [But you need permission from the zoo. Another approach is to do OCR on images of signs and then parse out species names, which requires no permission but is way more complicated.] Third, send your list of species to an OpenTree service that returns a tree for those species. This would be a fun activity for a student field trip, e.g., instruct each student to get 1 insect, 1 lizard, 1 ape, 1 bird, 1 fish, 1 mammal, etc., and get a tree, then ask teams of students to compare trees and draw conclusions. For an alpha test, just set up a room with some photos of species on the walls, put the QR codes on the photos, and get kids to try it.
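The third step above can be sketched against the public Open Tree of Life v3 web API: match the collected names with the TNRS service, then request the induced subtree for the matched OTT ids. This is a minimal sketch, assuming the `tnrs/match_names` and `tree_of_life/induced_subtree` endpoints and response fields described in the OpenTree API docs; check field names against the live API before relying on them.

```python
import json
from urllib import request

OTL_API = "https://api.opentreeoflife.org/v3"

def tnrs_payload(names):
    """Request body for OpenTree TNRS name matching."""
    return {"names": names, "do_approximate_matching": True}

def ott_ids_from_tnrs(tnrs_response):
    """Take the best-match OTT id for each name that matched anything."""
    ids = []
    for result in tnrs_response.get("results", []):
        matches = result.get("matches")
        if matches:
            ids.append(matches[0]["taxon"]["ott_id"])
    return ids

def induced_subtree(ott_ids):
    """Ask OpenTree for the tree induced by a set of OTT ids (Newick string)."""
    body = json.dumps({"ott_ids": ott_ids}).encode()
    req = request.Request(OTL_API + "/tree_of_life/induced_subtree",
                          data=body,
                          headers={"Content-Type": "application/json"})
    with request.urlopen(req) as resp:
        return json.load(resp)["newick"]
```

A phone app would only need the QR-scanned name list; everything else is two POST requests.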
The UM Museum of Natural History has a great display of the fauna of Michigan that could be a good place to scope out how to incorporate this tool with a real collection. Many of these taxa sit in display cabinets, each of which could be represented as a species list associated with a single QR code. There would also be the opportunity to tie this into existing databases on where these taxa were collected in Michigan, so that the question of 'where else in Michigan does this live?' can be answered with actual vouchered specimens that exist in the 'behind the scenes' part of the collection. The 'Hall of Evolution' could also benefit from some phylogeny awareness.
http://www.lsa.umich.edu/ummnh/exhibits/permanentexhibits
I have been working here in Michigan on a db schema that would allow museum data to be stored in a framework exposed to RDF queries, allowing combined queries of specimens and trees.
http://igulo.blogspot.com/2014_01_01_archive.html
Unfortunately our current institutional focus has been on migrating specimen data into KE Emu which is not something I am all that interested in.
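The combined specimen-plus-tree query idea can be illustrated with a toy in-memory triple set. The specimen IDs and predicate names below are invented for illustration; a real version would use Darwin Core terms and SPARQL over an RDF store, but the join is the same shape.

```python
# Toy triples: (subject, predicate, object). Specimen numbers and
# predicates are made up; the taxa are real Michigan squirrels.
triples = {
    ("UMMZ:157000", "collectedIn", "Washtenaw County, MI"),
    ("UMMZ:157000", "identifiedAs", "Sciurus niger"),
    ("Sciurus niger", "memberOf", "Sciuridae"),
    ("UMMZ:201123", "collectedIn", "Marquette County, MI"),
    ("UMMZ:201123", "identifiedAs", "Tamiasciurus hudsonicus"),
    ("Tamiasciurus hudsonicus", "memberOf", "Sciuridae"),
}

def where_in_michigan(clade):
    """Counties with vouchered specimens of any taxon in `clade`:
    one query that joins tree membership to specimen records."""
    taxa = {s for s, p, o in triples if p == "memberOf" and o == clade}
    specimens = {s for s, p, o in triples
                 if p == "identifiedAs" and o in taxa}
    return sorted(o for s, p, o in triples
                  if p == "collectedIn" and s in specimens)
```

This is exactly the 'where else in Michigan does this live?' question from the museum comment above, answered from vouchered specimens.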
I think it would be useful to incorporate a rudimentary character-mapping feature, both in the mobile app and the main project. This could be useful to teachers designing lessons about groups of animals (e.g., mapping how many legs the members of each taxon have in a tree containing lots of divergent groups, say a tree with multiple representatives of insects, spiders, millipedes, etc., as a very simplistic example). I could also envision a teacher taking students on a field trip to a museum and having each student pick different characters to map, some of which would be homologous while others are convergent, allowing those concepts to be introduced in an easy-to-visualize way.
This is a great idea. I can also imagine kids taking pictures to submit to a database, and then being able to see pictures that other students have taken of the same species, genus, family, etc., in order to see how much variation there is in a group. Or perhaps submitting some basic morphology (e.g., how long is this insect?) and then being able to see measurements that other students have made on the same species, in order to see how much variation there is in body size, wing length, etc. In other words, trying to hook up trait information with phylogenetic information in the service of educating app users about diversity and variation.
@tcatanach, kids can diagnose some character states, e.g., number of legs in your arthropod example. I think it would be great to think of a question for phylogenetic character analysis where (1) the question (about what is ancestral, or whether state X came before state Y) is inherently interesting, (2) the states can be diagnosed easily, and (3) the answer is non-obvious. In the case of "what is the ancestral number of legs for arthropods?", I suppose that in most museums kids would mainly find insects with 6 legs, and it might be a learning experience (for kids) to find that the most popular number is not the same as the ancestral number.
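The "most popular is not ancestral" point can be demonstrated with textbook Fitch parsimony. A minimal sketch, using a cartoon arthropod tree (the topology and state coding here are invented for illustration, not a real phylogeny):

```python
def fitch_sets(tree, tip_states):
    """Fitch parsimony state sets, computed bottom-up from the tips:
    intersect the children's sets where possible, else take the union."""
    sets = {}
    def visit(node):
        if node in tip_states:                  # tip: observed state
            sets[node] = {tip_states[node]}
        else:                                   # internal node
            kids = [visit(child) for child in tree[node]]
            inter = set.intersection(*kids)
            sets[node] = inter if inter else set.union(*kids)
        return sets[node]
    visit("root")
    return sets

# Cartoon tree: centipede sister to (spider, (bee, (ant, beetle)))
tree = {"root": ["centipede", "A"],
        "A": ["spider", "B"],
        "B": ["bee", "C"],
        "C": ["ant", "beetle"]}
legs = {"centipede": "many", "spider": "8",
        "bee": "6", "ant": "6", "beetle": "6"}
sets = fitch_sets(tree, legs)
# Three of five tips say "6", yet the root set comes out as
# {"many", "8", "6"}: the ancestral state is ambiguous, not "6".
```

So even a handful of museum observations is enough to show kids that counting votes among living species is not how ancestral states are inferred.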
BTW, from what I have heard, curators don't want to put QR codes on their displays, because QR codes are new-fangled and ugly, whereas museums are austere and carefully designed for visual appeal. However, due to the limits of the human visual spectrum, it is possible to create QR codes that are invisible to the naked eye but visible to a phone camera (e.g., see http://qrbarcodegenerator.org/the-invisible-qr-code.html).
I'm not saying this because I think we should worry about using invisible QR codes for a prototype. I'm saying it because, when evaluating the strength of this idea, we should not reject it on the grounds that no one will want QR codes on their displays. When it comes time to implement in a museum, zoo, or botanical garden, we could hide invisible QR codes in some pretty icon (e.g., a camera icon) so that people know where to point their camera.
Also, even if the prototype works with QR codes, if the code is written modularly enough, the QR -> taxon module could be replaced with an iBeacon -> taxon module, a GPS location -> taxon module, or even something like Word Lens (https://en.wikipedia.org/wiki/Word_Lens) to go from printed text to name recognition to taxon.
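That module boundary can be made concrete with a small interface: every front end (QR, iBeacon, GPS, OCR'd sign text) just has to produce a taxon name, and the rest of the app never cares where the name came from. A sketch with invented class names, assuming the QR code simply encodes the binomial:

```python
from abc import ABC, abstractmethod

class TaxonResolver(ABC):
    """The swappable 'X -> taxon' boundary."""
    @abstractmethod
    def resolve(self, payload):
        """Map one raw sensor/scan payload to a taxon name."""

class QRResolver(TaxonResolver):
    def resolve(self, payload):
        # Assumption: the QR code encodes the binomial name directly.
        return payload.strip()

class GPSResolver(TaxonResolver):
    def __init__(self, enclosures):
        # enclosures: {(lat, lon): taxon} for each mapped exhibit
        self.enclosures = enclosures

    def resolve(self, payload):
        # payload is a (lat, lon) fix; pick the nearest mapped enclosure.
        lat, lon = payload
        nearest = min(self.enclosures,
                      key=lambda p: (p[0] - lat) ** 2 + (p[1] - lon) ** 2)
        return self.enclosures[nearest]

def collect_species(resolver, payloads):
    """Whatever the front end, the output is just a species list."""
    return [resolver.resolve(p) for p in payloads]
```

Swapping in iBeacon or Word Lens later then means writing one new `TaxonResolver` subclass, with no change to the tree-fetching side.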
I (Curt Lisle) am a remote participant this week, but I have some minimal experience making cross-device apps. In my opinion, as access to OpenTree and the machinery to turn species lists into trees matures, this project will become more feasible. Arbor's machinery allows for creating web pages relatively easily, so this might be easiest to prototype as a mobile-friendly web application. Get in touch if we should talk seriously about trying this.
I'm interested in being involved (if I can be useful in any way) if this gets off the ground. It's a neat idea and seems doable.
Not sure if this has got a group working on it this week at the hackathon, but I would be interested in working on this; I do a lot of Xcode, so I might be able to help with some of the app design.
Mark, there was not a group for that. The groups are here:
https://docs.google.com/document/d/1aAwS5mH4iCmRW4aRlVe3po5GO8hB9CQw3e25NWoSAFM
Sorry you couldn’t be here.
Arlin
On Sep 16, 2014, at 12:06 PM, Mark Westneat <notifications@github.com> wrote:
Arlin Stoltzfus (arlin@umd.edu), Research Biologist, NIST; Fellow, IBBR; Adj. Assoc. Prof., UMCP. IBBR, 9600 Gudelsky Drive, Rockville, MD 20850. tel: 240 314 6208; web: www.molevol.org
I still have this vision that if I go to the zoo, the botanic garden, or the beach with my children, I can have an app on my phone, or they can have an app on their iPad or iPod, that allows them to ask (and get answers to) questions such as: how are these things in neighboring cages related? When did their ancestors diverge, and what did their last common ancestor probably look like? Where do their close relatives live? Which related clades live near my home? What's the Tree of Life living at this zoo? At Duke Forest? At Smoky Mountain National Park? I want these things not when I'm at home at my big high-res screen computer, but when I'm there with my children, armed with a phone, an iPad, or an iPod.