Open gmagerma opened 4 years ago
One difference might be that the mathematical definition of "set" does not include ordering: a partially/totally ordered set is technically the members of the set *plus* an ordering relation on those members (https://en.wikipedia.org/wiki/Total_order, https://en.wikipedia.org/wiki/Partially_ordered_set). The computational tuple is just a data structure. It could hold an unordered collection (e.g. the GDP of each country, where there is no expectation that the order of entries matters), or it could be interpreted as an ordered one (e.g. in time-series analysis, where you'd hope the tuple of prices is delivered and handled sequentially).
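A quick sketch of that distinction in Python, whose built-in `set` and `tuple` types mirror it directly (the GDP and price figures below are made-up illustration values):

```python
# A set: unordered, and multiplicity is not preserved.
gdp_values = {21.4, 14.7, 5.1, 14.7}        # the duplicate 14.7 collapses
assert gdp_values == {5.1, 14.7, 21.4}      # order is irrelevant to equality
assert len(gdp_values) == 3                 # only distinct values remain

# A tuple: position carries meaning (e.g. time order), repeats are kept.
prices = (101.2, 101.5, 100.9, 101.5)
assert prices[0] == 101.2                   # "the first observation" is well-defined
assert prices != (101.5, 101.2, 100.9, 101.5)  # a reordering is a different tuple
```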
Probably semantics, but: https://en.wikipedia.org/wiki/Tuple https://en.wikipedia.org/wiki/Set_(mathematics)
Obviously the set of positive integers is N = {1, 2, ...}; there are no recurring elements. But say we flip 10 coins once; then one event/outcome is A = {H,H,H,T,T,H,T,T,H,H}. This is then not a set but a tuple? And what if we flip a coin as many times as needed to get a heads? Then the tuple is possibly of size n -> \infty?
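For what it's worth, Python makes the consequence of that choice concrete: storing the 10-flip outcome as a set loses information, because repeated elements collapse. A minimal sketch:

```python
# The 10-flip outcome from the question, stored as a tuple: repeats allowed.
flips = ('H', 'H', 'H', 'T', 'T', 'H', 'T', 'T', 'H', 'H')
assert len(flips) == 10                  # every flip is kept, in order

# Coercing it to a set collapses repeats: only the distinct faces survive.
assert set(flips) == {'H', 'T'}          # multiplicity (and order) is lost
assert len(set(flips)) == 2
```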
Potato, potahto? Or is there a difference between math and programming-language applications?
Bonus: the symmetric difference is an extension of the set complement: https://en.wikipedia.org/wiki/Symmetric_difference
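Python's `set` type supports this directly via the `^` operator (equivalently `set.symmetric_difference`); a small sketch showing it equals the union of the two relative complements:

```python
a = {1, 2, 3, 4}
b = {3, 4, 5, 6}

# Symmetric difference: elements in exactly one of the two sets.
assert a ^ b == {1, 2, 5, 6}

# Equivalently, the union of the two relative complements (A \ B) ∪ (B \ A).
assert a ^ b == (a - b) | (b - a)
```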