Closed StoneCypher closed 5 years ago
New functionals
Accepting three functions, `before(pred, pre, post, ctx1, ctx2)`, call `pred`. While `pred` is false, call and return `pre`. When `pred` returns true, call and memoize `post`, then forever return memoized `post`.

Because this is essentially a reverse `Promise`, I want to call this `threat`.
Nobody thinks that's funny but me.
```js
before(cheating_detected, handle_user_calls, permanent_cheater_lockdown);
```
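A minimal userland sketch of the idea; the context arguments (`ctx1`, `ctx2`) are omitted because their semantics aren't specified here, and the plain-function shape is an assumption:

```js
// Sketch of `before`: run `pre` until `pred` passes, then memoize
// and forever return `post`'s result.
const before = (pred, pre, post) => {
  let fired = false, memo;
  return (...args) => {
    if (!fired && pred(...args)) { fired = true; memo = post(...args); }
    return fired ? memo : pre(...args);
  };
};
```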
New globals
Language-level memoization, idempotence, and partial application (no, that isn't currying) could make a pretty significant contribution to engine implementers' ability to optimize
Object additions
What we always do for config stuff. Walk N objects-or-maps-or-whatever, either from an array or more likely as varargs, taking keys from the earliest present.
What you want for

```js
const Config = ukeymerge(userPreferences, themeSettings, editorDefaults);
```
So it'll be the non-terrible version of
```js
const ukeymerge = (...objects) => {
  const out = {};
  objects.forEach(o =>
    Object.keys(o).forEach(ok => {
      if (out[ok] === undefined) { out[ok] = o[ok]; }
    })
  );
  return out;
};
```
That's probably wrong and only 20% of us can confidently eyeball it
Can be done way more efficiently in-place and very often implemented (and botched) in userland code.
Which means I'd like
```js
const userPreferences = { indent: 'space' },
      themeSettings   = { indent: 'tab', size: 4 },
      editorDefaults  = { indent: 'tab', size: 2, smart: true };

const tabExample = userPreferences.ukeymerge(themeSettings, editorDefaults);
```
A sane array equals thing already. Jesus
```js
const a1 = [1,2,3],
      a2 = [1,2,3];

const o1 = {a:1, b:2},
      o2 = {a:1, b:2};

const n1 = [a1, o1, {z: [a1, o1]}],
      n2 = [a1, o1, {z: [a1, o1]}];

console.log(a1.deepEquals(a2) ? 'yay' : 'javascript');
console.log(o1.deepEquals(o2) ? 'yay' : 'javascript');
console.log(n1.deepEquals(n2) ? 'yay' : 'javascript');
```
Two objects are similar if they have the same keys.
```js
const similar = (o1, o2) =>
  Object.keys(o1).nsort().deepEquals(
    Object.keys(o2).nsort()
  );

// lol writing this in es3. programmer.hasOwnSanity

({ name: 'bob' }).similar({ name: 'carol' }); // true
({ name: 'bob' }).similar({ face: 'bob' });   // false
```
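A runnable approximation using only current JS, standing in for the proposed `nsort`/`deepEquals` methods:

```js
// Approximation of `similar`: two objects are similar if they have the
// same keys. Sorted key lists are compared via JSON serialization.
const similar = (o1, o2) =>
  JSON.stringify(Object.keys(o1).sort()) ===
  JSON.stringify(Object.keys(o2).sort());
```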
Like similar, but compared against a list of object keys instead of a second object.
The second argument is a boolean, default false, which when true enforces exhaustive keys (that is, not just everything listed, but also nothing else).
```js
const troll = {
  name       : 'Thog',
  gender     : 'male',
  class      : 'warrior',
  level      : 14,
  job_title  : 'marauder',
  attributes : { str: 16, wis: 11, int: 10, dex: 7, con: 19, cha: 11 },
  hit_points : 141,
  thac0      : 7,
  salary     : { value    : 15,
                 currency : 'gp',
                 timing   : { frame: 'monthly', dates: [1, 15] } }
};

const wizard   = [ 'level', 'hit_points', 'spell_list' ],
      employee = [ 'job_title', 'hit_points', 'salary' ];

troll.shaped_like(wizard);         // false - no spell list
troll.shaped_like(employee);       // true - has all three
troll.shaped_like(employee, true); // false - has other things also
```
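A plain-function sketch of the contract (`shapedLike` is a hypothetical name; the prototype-method form would hang off `Object.prototype`):

```js
// shapedLike(obj, keys, exhaustive): every listed key must be present;
// with exhaustive = true, no other keys may exist.
const shapedLike = (obj, keys, exhaustive = false) =>
  keys.every(k => k in obj) &&
  (!exhaustive || Object.keys(obj).every(k => keys.includes(k)));
```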
A natural for destructuring assignment with pattern matching to get through complex datastructures.
```js
({ a: "fonzie", b: "stringer", c: "your way out" }).pick(['a', 'b']);
// returns ['fonzie', 'stringer']
```
The converse of pick.
```js
({ a: "fonzie", b: "stringer", c: "your way out" }).pick_skip(['a', 'b']);
// returns ["your way out"]
```
Winnow is like pick, but it keeps the object's structure and (reduced) keys.
```js
({ a: "fonzie", b: "stringer", c: "your way out" }).winnow(['a', 'b']);
// returns { a: 'fonzie', b: 'stringer' }
```
Winnow can also take a predicate.
```js
const only_angry = key => key.toUpperCase() === key;

const bad_example = {
  grr:   'grumble',
  GRR:   'LOUDER',
  hello: 'greeting',
  HI:    'WHY ARE YOU SCARED'
};

bad_example.winnow(only_angry);
// returns { GRR: 'LOUDER', HI: 'WHY ARE YOU SCARED' }
```
This is startlingly useful for deep nested JSON validation, for example, which must be vetted for minimum key anger levels at all times to ensure integrity.
The converse of winnow, remove returns a new object with the noted keys omitted entirely.
```js
({ a: "fonzie", b: "stringer", c: "your way out" }).remove(['a', 'b']);
// returns { c: "your way out" }
```
Remove can also take a predicate.
```js
const only_angry = key => key.toUpperCase() === key;

const bad_example = {
  grr:   'grumble',
  GRR:   'LOUDER',
  hello: 'greeting',
  HI:    'WHY ARE YOU SCARED'
};

bad_example.remove(only_angry);
// returns { grr: 'grumble', hello: 'greeting' }
```
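For concreteness, the four object helpers above as plain functions. These are approximations of the examples, not a spec; `Object.fromEntries` does the heavy lifting:

```js
// pick:      values for the listed keys, as an array
// pick_skip: values for every key NOT listed
// winnow:    keep listed keys (or keys passing a predicate), keep structure
// remove:    drop listed keys (or keys passing a predicate), keep structure
const pick      = (o, keys) => keys.map(k => o[k]);
const pick_skip = (o, keys) =>
  Object.keys(o).filter(k => !keys.includes(k)).map(k => o[k]);
const winnow    = (o, sel) => Object.fromEntries(
  Object.entries(o).filter(([k]) =>
    typeof sel === 'function' ? sel(k) : sel.includes(k)));
const remove    = (o, sel) => Object.fromEntries(
  Object.entries(o).filter(([k]) =>
    typeof sel === 'function' ? !sel(k) : !sel.includes(k)));
```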
Objects want filter too, friends.
Details left as an exercise for the reader
Math additions
We all know this by heart for a reason.
A terrible, terrible reason.
```js
const seq = n => (new Array(n)).fill(false).map( (_, i) => i );
```
I triple-guarantee there's a better way to do that. Rainbow sprinkles on the contract.
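For the record, the better way already exists; `Array.from` with a length-bearing object skips the fill dance entirely:

```js
// seq(n): the integers 0 .. n-1
const seq = n => Array.from({ length: n }, (_, i) => i);
```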
Seq on steroids.
```js
Array.range(10, 20, 3); // [10, 13, 16, 19]
```
C'mon.
```js
// I called for implementation-controllable randomness elsewhere. Assume that's
// handled here, somehow. Thanks

const rnd_i = ({ min = 0, max, impl = 'PCG', seed }) =>
  // I probably got this wrong
  Math.floor(Math.random(impl, seed) * (max - min)) + min;

const rnd = (a, b) =>
  typeof a === 'object' ? rnd_i(a)
  : b === undefined     ? rnd_i({ max: a })
                        : rnd_i({ max: b, min: a });

Math.rnd(4);   // [0 .. 3]
Math.rnd(1,6); // [1 .. 6]
```
Probably you can't just call `random` like that because it'd trample other unrelated state, so maybe

```js
const rnd = Math.make_rnd({ impl: 'PCG', seed: ..., step: 408 });
```

or whatever, and then you could query for current state and what have you.
there's an enormous amount more that the js standard library legitimately needs
unfortunately all i got was a bunch of flak, so, i stopped trying
i think a big part of the reason we're on version nine of the language and still don't have deep copy is that the language's gatekeepers are nowhere near as inclusive as they think they are, and every time someone like me has something to add, the door gets closed in their face
@StoneCypher maybe it's just too much too quickly. because javascript has no real versions, features cannot be added willy-nilly.
If I'm not mistaken, the purpose of this repo is to track the addition of the possibility of a standard library. Until there is system/syntax for the standard library, I doubt tc39 will want to consider the actual additions of modules to the standard library.
Once there is a system in place, I assume each addition will have to go through the tc39 process.
@StoneCypher it seems like you're proposing what could be a library of your own, like Lodash. Keep in mind Lodash has very strong popularity and even so is not meant to become a standard library for JS.
Following a standards process requires some strategy and even some (team) engineering to convince different independent people and companies to agree with your proposal. A (max, min) strategy is unmet but highly required here. Nothing goes into the specs as imposed; the consensus process tries its best to avoid that.

A topic titled like this one, "Okay", reads as high trolling to the community in general. I'd like to ask you to consider developing some further understanding of the standards process.
This specific paper helped me a lot in understanding the process; I recommend reading it: http://wirfs-brock.com/allen/files/papers/standpats-asianplop2016.pdf

The books you might also find as references for this paper are great. I believe they are very educational and might provide a proper answer I can't really provide in such a short comment. I hope you understand that.
no, i am not proposing a library of my own, like lodash. it's unfortunate that you closed this without understanding it first
i am very clearly and repeatedly saying "the reason things like lodash exist is that the js standard library isn't good enough and is gatekept by people who resist change with no apparent actual reason"
it's been 20 years. getting the basics like deep copy is not "adding features willy nilly." that's just heel dragging
people have been releasing incompatible, bug-ridden libraries for these things since es2
we're coming up on es9. the language turned 24 this year.
it's time to have our six basic containers and our 39 basic functional predicates.
@obedm503 - we've been waiting on that system for 20 years
there's no reason to wait. most of these are one-liners
people act like the language having core, reliable, uniform implementations of its basics is somehow extremism
what are they actually adding to the language?
well, they care about syntax when it's cosmetic stuff like adding commas to the ends of arglists, but not when it's meaningful stuff like message dispatch, or pattern matching until it's proposed internally, or generators
well, they care about functions, when it's string padding, but not when it's functional predicates
we've got six different asynchronicity mechanisms now - promises, callbacks, async, generator yield, web workers, and timeouts. we're adding a seventh, async iteration, which is sufficiently difficult that i don't think it'll actually get used. none of them can safely be threads or processes, but if we want a very slight rephrasing to make threads or processes possible but not required by an implementor, it's out of scope
we've got shared memory and atomics but no processes or threads 🤣
we're making complex, deep alterations to tagged template strings (i even use them - they're pretty cool) - but just adding our functional predicate basics to array, or the rudimentary containers to the language, is "adding features willy nilly"
except flatmap
and flatten
, because those come from an insider, so those two go in.
sadly many gatekeepers derive their power from saying no, rather than from saying "yes, the language would be better with this trivial, easy change"
I am not saying that these features shouldn't be added. All I'm saying is that this is not the repo for such suggestions.
we've been waiting on that system for 20 years
and this is why this proposal is exciting! It's about time for a standard library system that doesn't depend on adding another global.
sadly many gatekeepers derive their power from saying no, rather than from saying "yes, the language would be better with this trivial, easy change"
once something is added to ES, it cannot really be removed. hence why "trivial changes" must be looked at with 20 years foresight.
`with` was removed

these are nearly all things that were in lisp-1. ample foresight is achieved
i mean if we want to work on them and plan them that's great. take one's time, do it right, that's awesome
but all we ever get is "this isn't the place."
I stopped writing this about halfway through because I got a response that said "go away, we don't want new features, put them in your own repo"
Array additions
If arrows are a revelation because they're dense and don't bind, then let's remember how big a deal these will be, plskthx
All of these are assumed to be members of the Array prototype, or however someone with technical skill would say that. Thus, if I write foo, I mean to be able to call `[1,2,3].foo()`.

Honestly, I also think all array members should have an in-place and a return variant.
These examples are chosen for clarity, not because they're good examples of why these matter
delete_first/1
Returns a new array missing the first element strict-matching the passed argument. Cumbersome to create on the fly, potentially far more efficient with an in-lib impl, and a cornerstone of many algs. Especially important if linked lists get introduced.
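A plain-function sketch of the contract (a native version could avoid the copies):

```js
// delete_first(arr, x): new array missing the first strict-equal match of x.
const delete_first = (arr, x) => {
  const i = arr.indexOf(x);
  return i === -1 ? arr.slice() : [...arr.slice(0, i), ...arr.slice(i + 1)];
};
```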
delete_each_first/1
This is ultra-useful in algorithms and can be implemented radically faster if you have access to pointers. Many languages call this "remove" or "subtract"
For each item in the arg's list, delete the first remaining match in the source list. Return the result.
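A plain-function sketch of that description:

```js
// delete_each_first(src, items): for each item in `items`, remove the
// first remaining match from a copy of `src`; return the result.
const delete_each_first = (src, items) => {
  const out = src.slice();
  for (const x of items) {
    const i = out.indexOf(x);
    if (i !== -1) out.splice(i, 1);
  }
  return out;
};
```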
pop_while/1
Pops elements from an array until the predicate passes. Efficiency and readability win.
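Reading the description literally (pop from the tail until the predicate passes; which end is popped is a guess), a sketch:

```js
// pop_while(arr, pred): pop elements off the end of a copy until the
// predicate passes for the new last element.
const pop_while = (arr, pred) => {
  const out = arr.slice();
  while (out.length && !pred(out[out.length - 1])) out.pop();
  return out;
};
```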
filter_map/1
This sounds silly until you fall in love with it.
Return `false` to filter, `{value: anything}` to map. Or `[anything]`, which is probably faster and definitely less aesthetic. The fundamental problem is that `false` needs to be a valid result, so a container is required.

flatlength/0
Much faster and clearer than that thing you write every March. Also 50% less buggy, and reliably uniform in all but one browser.
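A recursive plain-function sketch of what the native version would compute:

```js
// flatlength(arr): the length of the array flattened to any depth.
const flatlength = arr =>
  arr.reduce((n, x) => n + (Array.isArray(x) ? flatlength(x) : 1), 0);
```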
takewhile/1
Return the largest prefix of the list for which the passed predicate holds.
Again this is about readability and efficiency. But hey, that's what standard libraries are for :heart:
It's sort of the opposite of find?
Rework this for prototype plskthx. Also probably debug it because I'm garbage.
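A plain-function sketch of the contract:

```js
// takewhile(arr, pred): the largest prefix for which the predicate holds.
const takewhile = (arr, pred) => {
  const out = [];
  for (const x of arr) {
    if (!pred(x)) break;
    out.push(x);
  }
  return out;
};
```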
Uses `split/1` from below.

deepcopy/1
There's like 80 million of these and they're all subtly different in terrible ways.
Can we please just accept that the language needs to offer at least two of these, and that they need to be member methods of all the container types?
prefix/1
Much faster if natively implemented. Hugely alg-useful.
This, but on Array.prototype using all the high-neckbeard detail sorcery
suffix/1
Like prefix, but on the butt.
partition/1
Given a predicate, produce an array of two arrays: the first contains, in order, the passing members of the original list, and the second the failing members. Indices are not kept.
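A plain-function sketch:

```js
// partition(arr, pred): [passing members, failing members], order kept.
const partition = (arr, pred) => {
  const pass = [], fail = [];
  for (const x of arr) (pred(x) ? pass : fail).push(x);
  return [pass, fail];
};
```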
zip/1
Produce an array-of-arrays where members line up pairwise with the passed in argument by index. Super more efficientizable using magic parallel mumbo jumbo, under the hood. Ultra duper useful.
Keeps going until all the arrays peter out. Maybe produces holes? Dunno how holes work with this. Kind of a hole thing to bring it up.
Should take varargs, because `zip_n` is otherwise a needed thing and that's just varargs or an array anyway. But `zip_n` can be a thing if varargs are the devil this year, I guess.

unzip/0,1
Unzips an array into N arrays where N is the length of the first array, or the argument you pass if you need to force it because some hole put holes into your array.
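Plain-function sketches of the pair; shorter inputs are padded with `undefined` in place of holes, which is an assumption the hole bikeshed would need to settle:

```js
// zip(...arrays): pairwise-by-index array-of-arrays, running to the
// longest input. unzip(rows) inverts it (zip is its own inverse).
const zip = (...arrays) => {
  const len = Math.max(...arrays.map(a => a.length));
  return Array.from({ length: len }, (_, i) => arrays.map(a => a[i]));
};
const unzip = rows => zip(...rows);
```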
chunk/1
split/1
kunique/1
Filters down to objects with the first instance of a given value on the provided key.
If the array contains non-objects they are also filtered.
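A plain-function sketch, assuming non-objects are silently dropped as described:

```js
// kunique(arr, key): keep the first object seen for each value of `key`;
// non-objects are filtered out.
const kunique = (arr, key) => {
  const seen = new Set(), out = [];
  for (const x of arr) {
    if (typeof x !== 'object' || x === null) continue;
    if (!seen.has(x[key])) { seen.add(x[key]); out.push(x); }
  }
  return out;
};
```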
unique/0,1
Removes succeeding matching elements without otherwise reordering.
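A plain-function sketch, including the strict parameter described below (the linear scan is the obvious-not-fast version):

```js
// unique(arr, strict): drop later duplicates, preserving first-seen
// order; strict = false falls back to == comparison.
const unique = (arr, strict = true) =>
  arr.filter((x, i) =>
    (strict ? arr.findIndex(y => y === x)
            : arr.findIndex(y => y == x)) === i);
```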
Parameter is strict, default true, which when false determines equality with `==` instead of `===`.

usort
Like sort/0 and sort/1, except that matching elements are removed
nsort
Like sort/0 and sort/1, except that a new array is produced, rather than for the sort to be in-place.
nusort
Like sort/0 and sort/1, except that a new array is produced, rather than for the sort to be in-place, and matching elements are removed.
umerge
Basically

```js
const umerge = (...arrays) => unique([].concat(...arrays));
```

Can be done way more efficiently in-place and very often implemented (and botched) in userland code.
histo/0,1
Histo is useful AF, and I think it should be on `Array`.
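A plain-function sketch covering both the plain form and the bucketing-predicate form:

```js
// histo(arr, bucketer?): count occurrences; with a predicate, count
// under the bucket key the predicate produces.
const histo = (arr, bucketer = x => x) => {
  const out = {};
  for (const x of arr) {
    const k = bucketer(x);
    out[k] = (out[k] || 0) + 1;
  }
  return out;
};
```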
Which means I'd like `histo` to also be able to take a predicate that buckets.

without/1
Can be made far faster natively. Pretty big readability win.
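A plain-function sketch, assuming the argument is a list of elements to drop:

```js
// without(arr, items): drop every element that appears in `items`.
const without = (arr, items) => arr.filter(x => !items.includes(x));
```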
intersection
Contains the unique elements present in each list.
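A varargs plain-function sketch:

```js
// intersection(...arrays): unique elements present in every input list.
const intersection = (...arrays) =>
  [...new Set(arrays[0])].filter(x => arrays.every(a => a.includes(x)));
```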
decorate_sort_undecorate/2
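No description was given; assuming this is the classic Schwartzian transform (decorate, sort, undecorate), a sketch with a hypothetical key-function argument:

```js
// decorate_sort_undecorate(arr, key): compute a sort key once per
// element, sort on the keys, then strip the keys back off.
const decorate_sort_undecorate = (arr, key) =>
  arr.map(x => [key(x), x])
     .sort(([a], [b]) => (a < b ? -1 : a > b ? 1 : 0))
     .map(([, x]) => x);
```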
Varargify all/1
It would be nice to be able to write
Ramda calls this `allPass`.
Varargify any/1
It would be nice to be able to write
Ramda calls this `anyPass`.
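A hedged sketch of both varargs combinators as plain functions, with Ramda-style semantics assumed:

```js
// all(...preds): a predicate that passes when every given predicate passes.
// any(...preds): a predicate that passes when at least one passes.
const all = (...preds) => x => preds.every(p => p(x));
const any = (...preds) => x => preds.some(p => p(x));
```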
grams/1
Generate the n-grams of the array.
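A sketch, assuming consecutive n-grams:

```js
// grams(arr, n): the consecutive n-grams (sliding windows of size n).
const grams = (arr, n) =>
  arr.length < n
    ? []
    : Array.from({ length: arr.length - n + 1 }, (_, i) => arr.slice(i, i + n));
```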
bucket/1
Bucket's argument may either be a `string` or a `function`.

If bucket's argument is a string, bucket is applied to an array of objects, and the string names a member which all objects will be checked for. The value of that member will be used as the bucket key in the return object. The return object is an object of keys containing arrays, each holding, in order, the members of the original array whose member matches the bucket key.

If bucket's argument is a function, bucket is applied to any array, the function is a bucket-key-generating predicate, and the buckets will be generated similarly.
That's a bunch of opaque baloney.
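The baloney above, translated into a plain-function sketch (name and call shape assumed):

```js
// bucket(arr, by): group members under a key taken either from a member
// name (string) or a key-generating function.
const bucket = (arr, by) => {
  const keyOf = typeof by === 'function' ? by : x => x[by];
  const out = {};
  for (const x of arr) {
    const k = keyOf(x);
    (out[k] = out[k] || []).push(x);
  }
  return out;
};
```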
sample/0,1
Called without an argument, the numeric argument is assumed to be 1.

`sample` returns a fair, unbiased sample of the array it's called on. Array elements will be represented with original frequency.

If a number is passed to `sample`, that is the size of the counted sample that will be returned.

If you call for a counted `sample` on a unique list, the returned results will also be unique. You won't get `[5,5]` because there's only one `5` to get.

On the other hand, if you call `sample` on a list with repetitions, the result may have repetitions if warranted. For example, a list with ninety-nine `'a'`s and one `'b'`, sampled for three elements, will usually produce three `'a'`s, occasionally two `'a'`s and one `'b'`, but never two or three `'b'`s.
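A sketch matching the sampling-without-replacement semantics described above; plain `Math.random` stands in for the implementation-controllable randomness requested earlier:

```js
// sample(arr, n = 1): draw n elements without replacement, so repeats
// only appear if the source itself repeats. Exhausting the pool stops early.
const sample = (arr, n = 1) => {
  const pool = arr.slice(), out = [];
  for (let i = 0; i < n && pool.length; i++) {
    const j = Math.floor(Math.random() * pool.length);
    out.push(pool.splice(j, 1)[0]);
  }
  return out;
};
```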