JuliaMath / MeasureTheory.jl

"Distributions" that might not add to one.
MIT License

Add a `Uniform(a, b)` measure #182

Closed: jarredbarber closed this 2 years ago

jarredbarber commented 2 years ago

This PR adds a parameterized uniform measure on an interval (a, b).
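For reference, the measure being added has density 1/(b - a) with respect to Lebesgue measure on [a, b], so its log-density is the constant -log(b - a) inside the interval. Below is a minimal sketch of a direct implementation, not necessarily the PR's actual code; the names `logdensity_def`, `basemeasure`, and `Lebesgue(ℝ)` are assumed from MeasureTheory.

```julia
# Sketch only; the PR's actual implementation may be structured differently.
logdensity_def(d::Uniform{(:a,:b)}, x) = -log(d.b - d.a)  # constant density 1/(b - a)
basemeasure(d::Uniform{(:a,:b)}) = Lebesgue(ℝ)            # Lebesgue base measure on the reals
```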

jarredbarber commented 2 years ago

It might be cleaner to just add a function Uniform(a, b) that constructs an Affine; I did it this way because (1) I wasn't sure how that would interact with inbounds and (2) I was following the Normal distribution as an example and was trying to stick to how that worked.
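For concreteness, that alternative would look roughly like this (a sketch only, assuming MeasureTheory's `affine` combinator with location `μ` and scale `σ`):

```julia
# Sketch of the "just construct an Affine" alternative: shift and scale the
# standard Uniform() on (0, 1) so that it covers (a, b).
Uniform(a, b) = affine((μ = a, σ = b - a), Uniform())
```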

codecov[bot] commented 2 years ago

Codecov Report

Merging #182 (13745ef) into master (7102906) will increase coverage by 0.15%. The diff coverage is 70.58%.


@@            Coverage Diff             @@
##           master     #182      +/-   ##
==========================================
+ Coverage   45.07%   45.22%   +0.15%     
==========================================
  Files          32       32              
  Lines         670      681      +11     
==========================================
+ Hits          302      308       +6     
- Misses        368      373       +5     
| Impacted Files | Coverage Δ |
| --- | --- |
| src/parameterized/uniform.jl | 73.07% <70.58%> (-4.71%) ↓ |
| src/realized.jl | 49.18% <0.00%> (-2.39%) ↓ |
| src/parameterized/binomial.jl | 53.84% <0.00%> (-1.71%) ↓ |
| src/parameterized/negativebinomial.jl | 80.76% <0.00%> (-1.38%) ↓ |


github-actions[bot] commented 2 years ago
| Package name | latest | stable |
| --- | --- | --- |
| Mitosis.jl | | |
| Soss.jl | | |
cscherrer commented 2 years ago

Hey @jarredbarber, I have a few updates...

First, I'm punting on the AGPL thing, going back to MIT. Sales would be a ways off, and it hit me that I really want to avoid being in such a sales role anyway. And it was feeling kind of isolating, which isn't great either.

Also, tracking the support as part of the log-density wasn't working well, so I've pulled that out. insupport is now separate from logdensity_def, and logdensityof puts everything together.
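In other words, the split works roughly like this (a minimal sketch of the idea, not the package's actual definition of `logdensityof`):

```julia
using MeasureTheory

# Sketch: the support check is separate from the density kernel, and the
# user-facing log-density combines them, returning -Inf outside the support.
function logdensityof_sketch(d, x)
    insupport(d, x) || return -Inf
    return logdensity_def(d, x)  # the real logdensityof also accumulates base-measure terms
end
```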

So having another look at this today, I found it's now pretty easy to implement. We really just need:

@kwstruct Uniform(a,b)                    # struct with named parameters (a, b), via KeywordCalls.jl
Uniform(a,b) = Uniform((a=a,b=b))         # positional-argument constructor

insupport(d::Uniform{(:a,:b)}, x) = d.a ≤ x ≤ d.b   # support is the closed interval [a, b]

proxy(d::Uniform{(:a,:b)}) = affine((μ=d.a, σ=d.b - d.a), Uniform())   # shift/scale the standard Uniform()

@useproxy Uniform{(:a,:b)}                # delegate logdensity_def and basemeasure to the proxy
Base.rand(rng::Random.AbstractRNG, ::Type{T}, μ::Uniform) where {T} = rand(rng, T, proxy(μ))

Here @kwstruct is from KeywordCalls.jl, and the second line lets us call the constructor with positional (unnamed) arguments. The @useproxy line tells MeasureTheory to use the proxy for calls to logdensity_def and basemeasure. It doesn't take care of rand, so I define that method separately.
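A quick usage sketch of the resulting measure (hypothetical session; exact log-density values depend on base-measure conventions):

```julia
using MeasureTheory, Random

d = Uniform(2.0, 5.0)                       # same as Uniform((a = 2.0, b = 5.0))
x = rand(Random.default_rng(), Float64, d)  # draws from (2, 5) via the affine proxy
insupport(d, x)                             # true
logdensityof(d, 3.0)                        # ≈ log(1/3) for a width-3 interval
logdensityof(d, 6.0)                        # -Inf, outside the support
```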

This will make it into the dev branch soon.

cscherrer commented 2 years ago

Added support for Uniform(a,b) in version 0.15, so I'll close this now. Thanks @jarredbarber