RoboCupAtHome / RuleBook

Rulebook for RoboCup @Home 2024
https://robocupathome.github.io/RuleBook/

Software sharing (EC decision for RoboCup 2019) #479

Closed: komeisugiura closed this issue 2 years ago

komeisugiura commented 6 years ago

Exec meeting on June 21, 2018
Participants (alphabetical order): Caleb, Komei, Luca, Mauricio, Sven, Tijn

Asking champion teams to publish their software has not been working well so far. This is partly because a team's incentive to do so is questionable once it has obtained the award, so publication has sometimes been delayed.

If a team does not publish the software, the TC can cancel the award. However, this is unrealistic considering the work that would be required.

EC decision

balkce commented 6 years ago

This could be considered as part of the qualification material for participating, as well.

LoyVanBeek commented 6 years ago

Requiring open source: awesome!

But publishing software is one thing (ours is all on GitHub, and would even be updated during the tournament if a local server weren't faster). It is not very useful when other teams do not know about some piece of software or cannot integrate it into their own systems. Perhaps some overview of which team uses what software for what purpose would be handy in that regard, e.g. a table that lists:

| Team | World Modelling | Face recognition | Pose recognition | Action Planning | Speech recognition |
|---|---|---|---|---|---|
| Homer | | | | | |
| PUMAS | | | | | |
| TechUnited | Ed | OpenFace | OpenPose | action_server | Dragonfly |
| Walking Machine | Wonderland | | | | |

Then, teams can see what other teams did well and use or fork that. Hopefully, that encourages some re-use as well. Filling in this table could be a requirement for qualification as well, if we have this on the wiki.

komeisugiura commented 6 years ago

@balkce Indeed, open-source activities have been counted as a plus in the qualification material. We will continue with this; however, we will not be able to make it mandatory because..

The idea is to require teams to publish the software N (=7?) days before the competition.

komeisugiura commented 6 years ago

By the way, the spirit of the standard platform is to make software reuse easier.

Indeed, in the AIBO league the software-sharing rule was very strict: every team was required to open-source all of its software.

kyordhel commented 6 years ago

What about making it a requirement for the Open Challenge? Typically teams are expected to present their cutting-edge research in the OC, but we could take a step further and force them to make the solution available to others.

This way, teams can see it running and also have access to it.

justinhart commented 6 years ago

I don't know that I can get on board with that. While I do plan a software release for what we displayed in the Open Challenge, I also plan to publish a paper on it before releasing the software.

komeisugiura commented 6 years ago

@kyordhel we are going to remove OC #478 ...

warp1337 commented 6 years ago

What about a "framework" that helps people to actually share their code? In my opinion, it is not enough to just make the code public somewhere. Usually it takes 20% of the time to find and compile "stuff" and then 80% to figure out how it works, get additional software that has (accidentally) not been published, e.g., is not in the same repo etc. Thus, having the same documented methodology etc, or even best practices would be great in this context.

moriarty commented 6 years ago

@warp1337: I agree, but if it is ROS based then I don't think we need to re-invent the wheel. Melodic now builds for Ubuntu, Debian and Fedora and can build docs, ROS-Industrial has a decent set of tools for ROS CI with Travis, and I would expect everyone to be familiar with the process.

If it's not ROS based... ¯\_(ツ)_/¯ ... which is unfortunate, because there is a bit of a monopoly there. At the very least, open-source code should contain a Dockerfile and be running on one of the CI platforms: GitHub + Travis.

I disagree with holding off on "open source" software releases because of "waiting to publish a paper", but that's a whole other debate... Fortunately the Open Challenge has been removed, so we don't need to have that debate.

warp1337 commented 6 years ago

@moriarty I agree with you, at least partially. For the sake of time I will not write down my opinion on this topic here :)

Please have a look at sections II and III here.

LoyVanBeek commented 6 years ago

@moriarty Being based on ROS does not mean I can easily integrate someone else's code. It helps a LOT, that's for sure, but if they use custom ROS messages, it's still not going to work.

Having agreed-upon interfaces would help a lot. When you use ROS, it's often move_base and probably MoveIt!, but anything above that has no obvious de-facto standard to use or build on. If, for example, all vision stuff in @Home adhered to https://wiki.ros.org/vision_msgs, that would help software exchange so much.
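To make that concrete, here is a minimal sketch (not taken from any team's code) of a detector node publishing its results as `vision_msgs/Detection2DArray` instead of a custom message type. It assumes ROS 1 with the Melodic-era `vision_msgs` package; the node, topic, frame and class ids are made up.

```python
#!/usr/bin/env python
# Minimal sketch: publish detections with the standard vision_msgs interface
# instead of a custom message. Field names follow the ROS 1 (Melodic-era)
# vision_msgs definitions; node, topic and frame names are placeholders.
import rospy
from std_msgs.msg import Header
from vision_msgs.msg import Detection2D, Detection2DArray, ObjectHypothesisWithPose


def publish_example_detections():
    rospy.init_node('example_detector')
    pub = rospy.Publisher('detections', Detection2DArray, queue_size=1)
    rate = rospy.Rate(1)

    while not rospy.is_shutdown():
        msg = Detection2DArray()
        msg.header = Header(stamp=rospy.Time.now(),
                            frame_id='camera_rgb_optical_frame')

        det = Detection2D()
        det.header = msg.header

        # One class hypothesis: numeric id 1 in whatever label map the team uses.
        hyp = ObjectHypothesisWithPose()
        hyp.id = 1
        hyp.score = 0.87
        det.results.append(hyp)

        # Bounding box in pixel coordinates: center (x, y) plus width/height.
        det.bbox.center.x = 320.0
        det.bbox.center.y = 240.0
        det.bbox.size_x = 80.0
        det.bbox.size_y = 200.0

        msg.detections.append(det)
        pub.publish(msg)
        rate.sleep()


if __name__ == '__main__':
    try:
        publish_example_detections()
    except rospy.ROSInterruptException:
        pass
```

Any consumer that understands `Detection2DArray` could then subscribe to this topic regardless of which detector produced it, which is exactly the kind of exchangeability being discussed here.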

I don't like to impose this on teams, but MSL and other leagues also have some rules for standardization, e.g. http://wiki.robocup.org/Middle_Size_League#Standardization. Standardization is a different, but very related, topic I think.

LoyVanBeek commented 6 years ago

BTW: I added the table I made earlier to https://github.com/RoboCupAtHome/AtHomeCommunityWiki/wiki/Software#Team_software_overview. Feel free to extend it @airglow @warp1337 @oscar-lima

raphaelmemmesheimer commented 6 years ago

I personally find the table of @LoyVanBeek more attractive than the last page that was added to the TDP, which for this year listed all the software used more than half a year before the RoboCup, when most teams hadn't even started their preparation. I updated the table; however, some repositories are linked but not yet set to public. This is often not a straightforward process, depending on the software licenses. Furthermore, the winners have the possibility to give a more in-depth description in a paper for the annual RoboCup book (whatever happened with it in 2017?).

LoyVanBeek commented 6 years ago

Thanks @airglow @oscar-lima. Interestingly, every entry in the 'pose recognition' column lists OpenPose, each with a different ROS wrapper. Perhaps 'my' table could be compiled from all the TDPs.

As to the main discussion: requiring robot software to be open-sourced before some challenge. I like the idea, but one issue I see with it from a referee's point of view is: how do we check and enforce this? @komeisugiura ?
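Purely as an illustration of one way such a check could be automated (this is not an agreed TC/EC process): if every team declares a repository URL, e.g. in the wiki table or the TDP, a small script could verify before the deadline that those URLs are publicly reachable. A hypothetical Python sketch, with placeholder team names and URLs:

```python
#!/usr/bin/env python3
# Hypothetical sketch of an automated visibility check; team names and
# repository URLs below are placeholders, not real declarations.
import requests

DECLARED_REPOS = {
    "Team A": "https://github.com/team-a/athome-software",
    "Team B": "https://github.com/team-b/robocup-at-home",
}


def is_publicly_reachable(url, timeout=10):
    """Return True if an unauthenticated GET on the repository URL succeeds.

    GitHub answers 404 for private or nonexistent repositories when no
    credentials are supplied, so a 200 response means the code is visible
    to everyone.
    """
    try:
        return requests.get(url, timeout=timeout).status_code == 200
    except requests.RequestException:
        return False


if __name__ == "__main__":
    for team, url in DECLARED_REPOS.items():
        status = "public" if is_publicly_reachable(url) else "NOT reachable"
        print("{}: {} -> {}".format(team, url, status))
```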

warp1337 commented 6 years ago

@komeisugiura @LoyVanBeek @airglow

We need a well-designed process and tools for that. Otherwise people will share stuff that is incomplete (e.g. requiring additional software that has not been mentioned), doesn't work, or something similar. I assume the goal is to give other teams the opportunity to re-use software.

I will think about this ... and already have an idea.

RemiFabre commented 6 years ago

As a brand-new team (currently preparing our first qualification video), what we did was read all of the recent TDPs. One of the outputs of this work is a big list of libraries and techniques used to solve the different tasks. The list itself is hard to organize; I would amend it with useful comments, GitHub links, tutorials and papers. This work definitely helped us get a good overview of what is used, but every time I talked to one of the @Home participants in Montreal, I learned practical/useful things that you wouldn't normally write in a TDP.

A few ideas to help incentivise knowledge/code/method sharing:

kyordhel commented 6 years ago

Thanks for the ideas @RemiFabre.

  1. Honestly, I see little chance that anything happens after the finals. I'm not sure about other leagues, but in @Home participation sadly drops off after the 2nd stage.
  2. If a team uses another team's contribution, good for them! We have an award for open-source sharing, and the more teams that use your solution, the better your chances of getting it. I wouldn't grant any points just for using something, since teams can declare they use X and in practice use Y. When something works well, teams naturally migrate to it.
  3. Let's think about a nice co-op setup or challenge in which robots can share or exchange knowledge in order to solve a task.

maximest-pierre commented 6 years ago

Since some teams qualify early in the year, shouldn't the committee ask for an updated TDP a month or two before the competition date? That would be a better representation of the hardware and software actually being used in the arena. As we know, the last-minute changes in Nagoya were only described in the TDP for Montréal.

balkce commented 6 years ago

I agree with @kyordhel. Maybe we could add some points if the test is solved via cooperation with another robot.

LoyVanBeek commented 6 years ago

@maximest-pierre May I ask you to also fill in the table?

maximest-pierre commented 6 years ago

I've added the world modelling for Walking Machine. I am going to dig up the rest later today.

johaq commented 4 years ago

So far we have not implemented this. Maybe we should prepare a software survey for Bordeaux to further fill out the table started by @LoyVanBeek.