Open jank324 opened 1 month ago
We kind of already talked about this a little bit, but I think this will basically require that the `survived_mask` is moved to be a permanent property of `ParticleBeam` ... and then all other non-aperture computations will have to be changed to respect the mask. Correct? Am I missing something?
Yes, the properties of the beam should be calculated only w.r.t. `ParticleBeam[survived_mask]`.
Then we need to think about how other elements handle the tracking. It's probably easier to still propagate the whole beam (for the vectorized implementation) even if some of the macroparticles are already lost.

I would suggest adding another property like `particle_lost_at` to keep track of where along the segment the particles are lost. But this probably requires keeping track of the s positions, as #216 proposed.
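For illustration, a minimal sketch of what mask-aware beam statistics could look like, assuming `particles` is an `(n_particles, 7)` phase-space tensor and `survived_mask` is a boolean tensor over particles; the names here are illustrative and not Cheetah's actual API:

```python
import torch

def masked_mu_x(particles: torch.Tensor, survived_mask: torch.Tensor) -> torch.Tensor:
    """Mean horizontal position, computed over surviving particles only."""
    return particles[survived_mask, 0].mean()

def masked_sigma_x(particles: torch.Tensor, survived_mask: torch.Tensor) -> torch.Tensor:
    """Horizontal beam size, computed over surviving particles only."""
    return particles[survived_mask, 0].std()
```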
So, I think we should possibly approach this PR slightly differently. A mask and the way apertures currently work are not gradient-friendly, because it's all step functions. Maybe we should instead consider a survival probability for each particle, and then compute beam properties as survival-probability-weighted. `Aperture` would then apply a smooth mirrored sigmoid-like function to the probabilities, which would give us nice gradients.
Another handy thing about this is that we could leave the masked implementation of `Aperture` in place for now and use the mask as probabilities in other computations. We then wouldn't have to implement the smooth masking now and could do that in a later PR if we wanted, but the rest of the code would already be able to work with it.
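As a sketch of the smooth variant (hypothetical names, assuming an elliptical aperture with half-sizes `x_max` and `y_max`): a mirrored sigmoid is close to 1 well inside the aperture and close to 0 outside, so gradients flow through it, and it approaches the hard 0/1 mask as the steepness grows.

```python
import torch

def survival_probability(x, y, x_max, y_max, steepness=100.0):
    # r < 1 inside the elliptical aperture, r > 1 outside.
    r = (x / x_max) ** 2 + (y / y_max) ** 2
    # Smooth step: ~1 inside, ~0 outside, differentiable everywhere.
    return torch.sigmoid(steepness * (1.0 - r))

# Example: probabilities for a Gaussian beam through a 2 mm aperture.
x = torch.randn(1000) * 1e-3
y = torch.randn(1000) * 1e-3
p = survival_probability(x, y, x_max=2e-3, y_max=2e-3)
```

Successive apertures would then compose by multiplying their probabilities, which also makes the hard mask a special case.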
This sounds good to me. I guess we'll see how this scales with collective effects later...

@jank324 will you finish this PR?
I actually think it will scale very nicely. Masking always flattens tensors, so you have to run a bunch of reshaping with Python-side operations. Multiplying with probabilities is a single multiplication that can run fully concurrently in machine code, so it should be super fast.
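Roughly, the difference looks like this (a toy comparison, not Cheetah code):

```python
import torch

probs = torch.rand(1_000_000)  # per-particle survival probabilities
x = torch.randn(1_000_000)     # some per-particle quantity

# Boolean masking produces a new tensor with a data-dependent shape, which
# breaks fixed shapes for vectorized tracking and forces a host sync on GPU.
x_surviving = x[probs > 0.5]

# Probability weighting is a single elementwise multiply with fixed shapes
# that runs entirely on the device.
weighted_mean = (probs * x).sum() / probs.sum()
```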
I meant to finish it myself, but right now I'm not on top enough of my submission schedule to dedicate time to Cheetah development. So if you want to work on this, go right ahead.
Description

Add a `particle_survival` attribute to `ParticleBeam`. Particles blocked by the `Aperture` will be marked with `0` and surviving particles with `1`. Right now, `Aperture` still blocks differentiability, but this can be addressed later using a sigmoid function for a smoothed blocking.

Motivation and Context

Fixes #241.
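For context, beam properties weighted by `particle_survival` could then look roughly like this (a sketch; `particles` and the helper names are illustrative, not the PR's actual code):

```python
import torch

def weighted_mu_x(particles: torch.Tensor, particle_survival: torch.Tensor):
    w = particle_survival
    return (w * particles[:, 0]).sum() / w.sum()

def weighted_sigma_x(particles: torch.Tensor, particle_survival: torch.Tensor):
    w = particle_survival
    mu = (w * particles[:, 0]).sum() / w.sum()
    return ((w * (particles[:, 0] - mu) ** 2).sum() / w.sum()).sqrt()
```

With a 0/1 `particle_survival` this reduces to the masked statistics; with smooth probabilities it stays differentiable.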
Types of changes

Checklist

- Code passes flake8 (required).
- All pytest tests pass (required).
- Ran pytest on a machine with a CUDA GPU and made sure all tests pass (required).

Note: We are using a maximum length of 88 characters per line.