probmods / probmods2

probmods 2: electric boogaloo
https://probmods.org/v2

Typos all over #108

Open Antipurity opened 5 years ago

Antipurity commented 5 years ago

Typos and syntax errors… such small things, collected just from reading through (chapter in parentheses):

- (conditioning) "the the mathematical formulation"
- (conditional dependence) "Both cold and lung disease are now far more likely that their baseline probability: …"
- (bayesian data analysis) "relavent"
- (inference algorithms) "htat"
- (inference algorithms) "we either accepting or reject"
- (inference algorithms) "acheive reasonabe result"
- (inference algorithms) "checkershaddow illusion"
- (process-level cognitive modeling) "a engineering detail"
- (learning as conditional inference) "However, constructing sequences in this way it is easy to …"
- (occam's razor) "This examples"
- (occam's razor) "until the data overwhelmingly favor it."
- (occam's razor) "How much data does it take tend to believe the polynomial is third order?"
- (mixture models) "In phonology this is been"
- (mixture models) "the perceptual magnet effect: Hearers regularize"
- (mixture models) "that is they are attracted"
- (mixture models) "In an infinite model we construct assume"
- (mixture models) "probability pf each category"
- (mixture models) "many researchers have explored infinite micture models."
- (social cognition) "this method of making a choices"
- (appendix js basics) "Multiple variable can be assigned"
- (Appendix - Useful distributions) "Hierarcical models"

(Also, the appendix on JS basics contains `3^2` in a comment. `^` is the bitwise XOR operator in JS, so that is misleading; use the `**` exponentiation operator there, or just spell it out in words.)
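A quick illustration (plain JS, runnable in any modern console):

```js
3 ^ 2            // => 1  (^ is bitwise XOR, not exponentiation)
3 ** 2           // => 9  (exponentiation operator, ES2016+)
Math.pow(3, 2)   // => 9  (older, universally supported alternative)
```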

(So it wasn't my imagination; overall quality really did drop significantly after about the first third of the book, starting around the Bayesian data analysis chapter.)

…And while you're here…

There seem to be a few similarities between probabilistic programming and numerical programming/analysis, between distributions and numbers (not in their basic structure, but in how they are created and used). Expertise in one area is likely to transfer well to the other.

In probabilistic programming, marginal distributions are reified/manifested with Infer.*, and are defined only by that: a collection of simulation/inference methods united by a common constraint, namely that simulation and distribution (input and output) converge in a basic sense. When simulated with forward inference at an increasing sample count, the difference from the marginal decreases.
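A minimal WebPPL sketch of what I mean (the toy model is hypothetical, and `viz` assumes the in-browser editor):

```js
// Hypothetical toy model: the sum of two fair coin flips.
var model = function() {
  var a = flip(0.5) ? 1 : 0;
  var b = flip(0.5) ? 1 : 0;
  return a + b;
};

// Forward simulation with a growing sample count approaches the marginal;
// the marginal distribution is characterized by exactly that limit.
var rough  = Infer({method: 'forward', samples: 100},   model);
var better = Infer({method: 'forward', samples: 10000}, model);

viz(rough);   // noisy estimate of P(sum)
viz(better);  // close to the true marginal: P(0)=0.25, P(1)=0.5, P(2)=0.25
```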

In mathematics, viewed computationally, numbers are manifested with Number.* (direct computation, Newton's approximation, and so on), and are defined only by that: a collection of approximation methods united by a common constraint, namely that computation and result (input and output) converge in a basic sense. When approximated by a float at increasing precision, the difference decreases.
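A minimal plain-JS sketch along the same lines (illustrative only):

```js
// sqrt(2) is available to a program only as the limit of an approximation
// procedure; more iterations means a smaller difference from the true value.
function newtonSqrt(x, iterations) {
  var guess = x;
  for (var i = 0; i < iterations; i++) {
    guess = (guess + x / guess) / 2;  // Newton step for f(g) = g*g - x
  }
  return guess;
}

console.log(newtonSqrt(2, 3));                        // ≈ 1.4142157
console.log(newtonSqrt(2, 6));                        // ≈ 1.4142136
console.log(Math.abs(newtonSqrt(2, 6) - Math.SQRT2)); // ≈ 0: difference shrinks
```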

In reinforcement learning (and, I would guess, in hormones), agents are manifested with RL.*, like temporal-difference learning (dopamine: expectation and reality converge over time) or Q-learning (serotonin: best action and reality converge over time), and are defined only by the constraint to converge, not by structure.
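For concreteness, a tiny TD(0)-style sketch in plain JS (a made-up one-state environment; the hormone mapping above is my speculation, not something this sketch shows):

```js
// The value estimate V is defined only by the constraint that prediction and
// observed reward converge as updates accumulate.
var V = 0;            // current prediction of the reward
var alpha = 0.1;      // learning rate
var trueMean = 5;     // hypothetical environment: rewards average 5

for (var t = 0; t < 1000; t++) {
  var reward = trueMean + (Math.random() - 0.5);  // noisy observed outcome
  var tdError = reward - V;                       // reality minus expectation
  V = V + alpha * tdError;                        // move prediction toward reality
}

console.log(V);  // ≈ 5: expectation has converged to observation
```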

In graphics, objects are manifested by rendering, and level-of-detail and rasterization methods can be seen as needing to converge to the same rasterization as resolution increases.

Shared between all of these: the ability to reverse the direction of computation for the concepts of their domain, and the ability to insert Infer/Number at any point (for example, around every built-in math function call, at a standard precision, as is effectively done today) without changing semantics, only precision. Probably caused by being constrained by conceptual convergence rather than by structure.

Somewhat nice, I guess.