anthonyselkowitz opened this issue 8 years ago
Nice idea.
With respect to analytics and keeping track of citations, Google Scholar does that well. It tells you how many citations an author or article has, and if you click the count, it pulls up those citations. With that, though, there is no commentary or organization except by author and article.
The data side of the analytics will be a breeze; our data model supports that very, very well. The only issue is that my visual design and front-end dev skills are slim, at best. We still need to identify someone to build the pretty part. I can do a first pass with D3 or something, but I'd not be comfortable launching with front-end stuff I've built.
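To give a feel for what a first pass could look like, here's a minimal sketch of a citation-count bar chart in D3 (TypeScript, recent D3 API assumed). The data shape and names are made up for illustration; the real numbers would come from our API:

```ts
import * as d3 from "d3";

// Hypothetical citation counts per article -- the real data would come from our API.
const data = [
  { article: "Smith et al. 2014", citations: 42 },
  { article: "Lee & Park 2015", citations: 17 },
  { article: "Chen 2016", citations: 5 },
];

const width = 400;
const barHeight = 24;

// Scale citation counts to pixel widths.
const x = d3.scaleLinear()
  .domain([0, d3.max(data, d => d.citations) ?? 0])
  .range([0, width]);

const svg = d3.select("body").append("svg")
  .attr("width", width)
  .attr("height", barHeight * data.length);

// One horizontal bar per article.
svg.selectAll("rect")
  .data(data)
  .enter().append("rect")
  .attr("y", (_d, i) => i * barHeight)
  .attr("height", barHeight - 4)
  .attr("width", d => x(d.citations));
```

Again, just a throwaway sketch so we have something concrete to react to, not launch-quality front-end work.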
Discussion systems are something I lack experience in, but I have pretty high confidence we can pull it off from a technical perspective. @anthonyselkowitz's points regarding real names and toxicity are consistent with my experiences as well, and I think that will be an easy win.
Feedback and learning systems are also an area where I lack expertise, but I have strong contacts who specialize in those areas. I can lean on them for help when we get to that point, and I think they'd find the project intriguing.
Do we feel comfortable adding analytics, discussion, feed systems, and feedback systems as features? I believe I have citations and dependencies covered already. We haven't yet started prioritizing things, so more on that later.
I'm okay with those features. I can start sketching up some wireframes to give us a starting point. Same here: it's the front-end implementation that I'm lacking. Is there one place listing all of the features we want to implement? With a feature list, I can start wireframing and organizing them. If that's not already compiled, I can look through the discussions and compile one.
I think a feed system could be particularly useful, especially if it's similar to Facebook's, where it could include things that users similar to you are commenting on or working on in a nearby area. This would let people see what kinds of feedback other researchers are getting. It'd be like a personalized area of Stack Exchange where you see topics tailored to your research area. Researchers would be better informed about the kinds of research others are performing, which would help them identify topics for future study. It would also help them contribute to discussion on articles similar to theirs and further others' research at the same time.
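To make that concrete, here's one rough way such a feed could be ranked: score how similar each commenter's research interests are to the viewer's, and surface activity from the most similar researchers first. Everything here, the types, the Jaccard overlap heuristic, is an assumption for discussion, not a settled design:

```ts
// Sketch: rank feed items by interest overlap between viewer and commenter.
interface User { id: string; interests: Set<string>; }
interface FeedItem { articleId: string; commenter: User; }

// Jaccard similarity of two interest-tag sets.
function similarity(a: Set<string>, b: Set<string>): number {
  const intersection = [...a].filter(t => b.has(t)).length;
  const union = new Set([...a, ...b]).size;
  return union === 0 ? 0 : intersection / union;
}

// Activity from researchers most similar to the viewer comes first.
function personalizedFeed(viewer: User, items: FeedItem[]): FeedItem[] {
  return [...items].sort(
    (x, y) =>
      similarity(viewer.interests, y.commenter.interests) -
      similarity(viewer.interests, x.commenter.interests)
  );
}
```

A real feed would obviously need more signals than tag overlap, but something this simple could work for a first iteration.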
Also, if a portion of the app/website is to be an expanded version of F1000 (since it seems like they only cover medicine-related articles), there should be a comprehensive set of publishing guidelines for the types and formats of articles accepted. These guidelines help reviewers know what to look for and how to judge an article. As a reviewer for conferences/journals, I've had difficulty reviewing papers that didn't follow the typical accepted format.
Hmmm, we need a way to track and prioritize features, i.e., what does and doesn't make the MVP and in what order. Suggestions appreciated. I'm not sure GitHub issues are the best medium for that...?
I'm definitely down for the feed system, but be aware that it will require a great deal of iteration to feel right. It'll be ongoing work.
Ahhh, the formatting is an interesting point. I haven't thought about that at all yet, except with regard to citations. I imagine that would just be some static content for each category... We should probably divvy things up into high-level categories: medicine, physics, etc. I lack the domain knowledge to know where to draw those lines, though.
Oh, derp. For feature tracking, we should use the Wiki. Maybe your analysis belongs there, too, since we'll want to reference it as we progress.
@anthonyselkowitz @DesireeVanHaute - While crawling through the links in the Vox article Anthony sent me, I came across this: https://github.com/CenterForOpenScience/osf.io + https://osf.io/
It seems like someone's already attempting what we are. If we find that our goals align well, I'd be comfortable taking the knowledge we have here and contributing it to their project.
What Vox article do you keep referring to?
@DesireeVanHaute http://www.vox.com/2016/7/14/12016710/science-challeges-research-funding-peer-review-process My bad. I tossed it in the wiki. Not sure where else I can put it for more visibility.
Hmmmm, that's interesting. I envisioned something similar to ResearchGate, but with an emphasis on replications, linking of projects, and null results. I'll have to dig into OSF more!
Yeah, lemme know. Since they're much farther along, I'd prefer to avoid additional fragmentation and join up with them, pending us selling them on our feature ideas.
Thoughts?
Let's start a thread of the competing products so that we can identify where the ecosystem currently is and what we would do to differentiate ourselves from these services.
http://f1000research.com/
"F1000Research is an original open science publishing platform for life scientists that offers immediate open access publication, transparent post-publication peer review by invited referees, and full data deposition and sharing. F1000Research accepts all scientifically sound articles, including single findings, case reports, protocols, replications, null/negative results, and more traditional articles."
https://www.academia.edu/
From my experience, it's a place where you put your articles, whether published or not, to showcase your work. You can upload your PDFs or link to an external source. They track article views and whether people look at your profile. They also have a news feed that essentially includes articles based on your listed interests. Here's an example of their analytics page (I'm sure someone with more citations than me would have a more interesting page, lol). Academia.edu also allows you to upload your papers and get feedback on them.
http://journals.plos.org/plosone/
PLOS ONE is essentially an open-access journal that functions the same way normal journals do, except it is entirely open access.
https://www.researchgate.net/
ResearchGate seems similar to Academia.edu in that you can publish articles and link coauthors to them. In addition, it appears you can request feedback from your network on the articles you put there, and there seems to be a discussion portion as well. They seem to avoid toxicity by using real names (as mentioned in another discussion) that link back to the authors' pages, so people's professional reputations help keep things civil.
In addition to their questions feed, they have a feedback option to build a better understanding of each user's profile so they can display better-targeted questions. For instance, of the two suggested questions in my screenshots, the first was on target with my expertise but the second was not; the feedback option appeared after clicking the 'x' in the top right.
Subtopic (will make a separate thread): this may help avoid toxicity while also providing a feedback option. We could have a feature similar to Stack Exchange where people can pose questions about an article and foster discussion of its null/reproduced results. That would be good from a reproduction standpoint, help evaluate null results, and give authors feedback that could help when they publish the article in a standard journal. It could also help us identify articles that are reproductions of results: people could suggest articles that serve as reproductions, and those suggestions could then be voted on as to whether they're good reproductions. A rough data-model sketch follows below.
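Here's one possible shape for that reproduction-suggestion data, sketched in TypeScript. All names, fields, and thresholds are hypothetical; this is just a starting point for the separate thread:

```ts
// Sketch of a data model for suggesting and voting on reproductions.
// Names and fields are hypothetical.
interface ReproductionSuggestion {
  originalArticleId: string;   // the article being reproduced
  candidateArticleId: string;  // the suggested reproduction
  suggestedBy: string;         // real-name user id, per the toxicity discussion
  upvotes: number;
  downvotes: number;
}

// A suggestion counts as a confirmed reproduction once the community
// agrees strongly enough -- both thresholds here are placeholders.
function isConfirmedReproduction(s: ReproductionSuggestion): boolean {
  const total = s.upvotes + s.downvotes;
  return total >= 10 && s.upvotes / total >= 0.8;
}
```

Whether confirmation should be a simple vote ratio or something weighted by reviewer reputation is exactly the kind of thing that thread should hash out.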