Hi, the author here.
The paper didn’t cite EvoJAX since we don’t consider it to be a general EC library. I have used EvoJAX before, and to me its API feels heavily biased toward ES and RL. It’s a bit weird to place it together with general-purpose libraries like DEAP and pymoo. So even though we are both using JAX, comparing EvoX with EvoJAX feels like comparing apples to oranges.
For evosax, I apologize that I didn’t notice it earlier. I spent some time looking into the library, and it’s really exciting. Though considering both papers came out at a similar time and the difficulty of changing this paper now (especially considering that we don’t want to introduce technical details in the paper), I would still prefer to keep it as is. Still, your work is wonderful; I would certainly like to introduce your library on the README page, and I hope we can collaborate further.
For the technical differences, based on my understanding (I’m a noob to evosax, correct me if I’m wrong), the most fundamental one is that in EvoX the state is hierarchical and managed automatically. For example, we could have
Pipeline -+--algorithm-+-operator_1
          |            +-operator_2
          +--problem
Instead of having a separate state for each object, we automatically merge these states into a tree-like structure. So there is always just one state, which means you can write code like this:
population, state = self.algorithm.ask(state)
fitness, state = self.problem.evaluate(state, population)
Notice that the algorithm and the problem each have their own state, yet they behave as if there were only one. It may not look that exciting here, but by exploiting this feature it’s quite easy to build things like an algorithm that runs another algorithm inside it (e.g. PGPE + Adam), meta-learning (a problem that wraps an entire pipeline inside it), or a distributed pipeline.
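To make this more concrete, here is a toy, self-contained sketch of the pattern in plain NumPy. To be clear, this is not EvoX’s actual code; the class names (State, Pipeline, RandomSearch, Sphere) and their methods are made up purely for illustration, and the real library is of course more general than this.

import numpy as np


class State(dict):
    # One combined state: a plain dict mapping module name -> that module's sub-state.

    def replace(self, name, new_sub_state):
        # Return a new State with one sub-state swapped out (functional style).
        merged = State(self)
        merged[name] = new_sub_state
        return merged


class RandomSearch:
    # A toy "algorithm" module that owns its own sub-state.

    name = "algorithm"

    def init(self):
        return {"rng": np.random.default_rng(0), "best": None}

    def ask(self, state):
        sub = state[self.name]
        # 8 candidate solutions in 2-D; the NumPy Generator is mutated in place,
        # so the sub-state object itself does not need to change here.
        population = sub["rng"].normal(size=(8, 2))
        return population, state

    def tell(self, state, population, fitness):
        sub = dict(state[self.name])
        sub["best"] = population[np.argmin(fitness)]
        return state.replace(self.name, sub)


class Sphere:
    # A toy "problem" module; its sub-state just counts evaluations.

    name = "problem"

    def init(self):
        return {"num_evals": 0}

    def evaluate(self, state, population):
        fitness = np.sum(population ** 2, axis=1)
        sub = {"num_evals": state[self.name]["num_evals"] + len(population)}
        return fitness, state.replace(self.name, sub)


class Pipeline:
    # Merges the sub-states of its children into one hierarchical state.

    def __init__(self, algorithm, problem):
        self.algorithm = algorithm
        self.problem = problem

    def init(self):
        return State({self.algorithm.name: self.algorithm.init(),
                      self.problem.name: self.problem.init()})

    def step(self, state):
        # The same single `state` flows through both modules, mirroring the
        # two lines quoted above.
        population, state = self.algorithm.ask(state)
        fitness, state = self.problem.evaluate(state, population)
        return self.algorithm.tell(state, population, fitness)


pipeline = Pipeline(RandomSearch(), Sphere())
state = pipeline.init()
for _ in range(10):
    state = pipeline.step(state)
print("best:", state["algorithm"]["best"], "evals:", state["problem"]["num_evals"])

The point is only that each module owns its own sub-state, the pipeline merges them into one tree keyed by module, and the caller only ever threads a single state object around, so wrapping a whole pipeline inside another problem (as in the meta-learning case above) just means nesting one more sub-tree.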
On the other hand, evosax’s use of dataclasses feels really clean, and its separate modules for networks and restart strategies are inspiring. It’s also exciting to see how many algorithms are available in evosax.
I’d be glad to discuss this topic further if you’re open to it, to help both projects and grow the community.
As a regular reader of cs.NE, and a general enthusiast of new developments and tooling in the EC space, I was glad to see a new library enter the field. And, probably like many (if not most) readers of cs.NE and the domain it represents, we are aware of recent developments in the space.
I agree with @RobertTLange that the total omission of both EvoJAX and evosax from the EvoX paper was extraordinary.
From the abstract, when I read the claim:
To the best of our knowledge, this is the first library supporting distributed GPU computing in the EC literature.
I thought, well, EvoJAX was published a year prior and is well known in the community; evosax has been around even longer, and is mentioned in the EvoJAX README.
The paper didn’t cite EvoJAX since we don’t consider it to be a general EC library. I have used EvoJAX before, and to me its API feels heavily biased toward ES and RL. It’s a bit weird to place it together with general-purpose libraries like DEAP and pymoo. So even though we are both using JAX, comparing EvoX with EvoJAX feels like comparing apples to oranges.
But you mention nearly every other evolutionary library in the paper's Section II; Section II.B is even explicitly about neuroevolution, and Table I has a comparison including "GPU Computing" and "Neuroevolution Tasks" columns (at which readers current in this domain would naturally ask, "EvoJAX?"), yet EvoJAX is not only not compared or mentioned, it's not cited at all, and Section II.B concludes:
to the best of our knowledge, none of the existing EC libraries currently supports distributed GPU acceleration for neuroevolution.
No matter what, EvoJAX had well-established precedence here as a GPU-accelerated, JAX-based library for neuroevolution, even if it is not considered a general EC library, and so its complete omission seems odd.
If it's apples to oranges, you can still say something like: there is another JAX library for EC-related tooling, EvoJAX, but it's not general-purpose like ours.
For evosax, I apologize that I didn’t notice it earlier. I spent some time looking into the library, and it’s really exciting. Though considering both papers came out at a similar time and the difficulty of changing this paper now (especially considering that we don’t want to introduce technical details in the paper), I would still prefer to keep it as is.
There are some points here that are not academically defensible.
Given the otherwise fairly broad survey of EC tooling in the paper, missing evosax seems strange. Yes, the other libraries are older; but again, EvoJAX links to evosax, and it is easy enough to find otherwise.
Considering both papers came out at a similar time
Okay, fair, but while the evosax paper may have been published a few months ago, the repo was published much earlier in 2022.
the difficulty of changing this paper now (especially considering that we don’t want to introduce technical details in the paper), I would still prefer to keep it as is.
This point is perhaps the least defensible.
On arXiv, there are very few popular papers that do not have at least a v2.
Authors revise their papers all the time (not just for conference mentions, etc.).
I don't mean to disparage here; I want to encourage development and interest in EC, and I'm trying to provide helpful criticism that many readers have likely had as well. It's important for our community to encourage and acknowledge the efforts and achievements made in this space.
As such, for both the robustness of your paper and in the interest of promoting the community, I think it's important that EvoJAX and evosax be mentioned.
Hi Evan (Robert cc’ed),
Thank you for taking the time to provide insightful comments on our paper. We appreciate your efforts in helping us improve our work.
We would like to apologize for the oversight in not citing evosax and EvoJAX in our paper. This was due to our carelessness and was not intended to cause offense. We have the utmost respect for the authors and the JAX community. Based on your suggestions, we have made revisions to the paper and updated the version on arXiv. In addition, we hope the following clarifications will help clear up the misunderstanding and set the record straight.
Regarding the missing citation of evosax, we have already reached out to @RobertTLange privately via email to clarify the situation. We did not intend to overlook this work, and we will ensure that it is properly cited as a pioneering use of JAX for accelerating EC algorithms. We did not update the arXiv version immediately because the paper was under review and we wanted to maintain consistency.
Regarding EvoJAX, we fully agree with your feedback. It was inappropriate of us to omit it in the first place, even though it focuses on evolution strategies applied to neuroevolution tasks, whereas EvoX is a general EC library that supports various types of algorithms on various tasks. We acknowledge that EvoJAX has shed light on the potential of JAX for GPU computing in neuroevolution.
Our statement "To the best of our knowledge, this is the first library supporting distributed GPU computing in the EC literature" was not meant as a boast. By "EC library", we meant to encompass a broader range of algorithms, including single-/multi-objective algorithms and variants of GA, DE, PSO, etc. However, to avoid confusion, we have removed this statement as per your suggestion.
The libraries listed in Section II of our paper are well established, with a long history and significant impact in the EC community. We believe that evosax and EvoJAX, as emerging libraries focusing on evolution strategies and neuroevolution, could also benefit a wider audience. However, we decided not to mix them with the classic libraries in Section II.
Once again, we appreciate your valuable feedback and look forward to future opportunities to collaborate and contribute to the EC and JAX communities.
Best regards, EvoX Team
@BillHuang2001 & EvoX team, thank you for your prompt, thoughtful, and comprehensive response.
I read through your comment and appreciate you addressing each item. I'm impressed that you and the team pushed new versions of the EvoX paper with mentions of EvoJAX and evosax so fast!
Thank you for updating your literature review and sharing this great resource, EvoX, with the community!
🦎✨
Closing the issue because it's already resolved.
Hi there! Nice to see more enthusiasm for JAX and evolutionary optimization. Any reason why you don't mention EvoJAX and evosax in your paper? Would love to know how they compare.