tc39 / proposal-built-in-modules

BSD 2-Clause "Simplified" License
892 stars 25 forks

Okay #39

Closed StoneCypher closed 5 years ago

StoneCypher commented 5 years ago

I stopped writing this about halfway through because I got a response that said "go away, we don't want new features, put them in your own repo"

Array additions

If arrows are a revelation because they're dense and don't bind, then let's remember how big a deal these will be, plskthx

All of these are assumed to be members of the Array prototype, or however someone with technical skill would say that. Thus, if I write foo, I mean to be able to call [1,2,3].foo().

Honestly, I also think all array members should have an in-place and a return variant.

These examples are chosen for clarity, not because they're good examples of why these matter

delete_first/1

Returns a new array missing the first element strict-matching the passed argument. Cumbersome to create on the fly, potentially far more efficient with an in-lib impl, and a cornerstone of many algs. Especially important if linked lists get introduced.

[1,2,3,2,1].delete_first(2); // yields [1,3,2,1]
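
A minimal userland sketch of the idea, written as a free function rather than a prototype member; a native implementation could skip the intermediate copies.

```javascript
// Sketch only: returns a new array with the first strict match removed.
const delete_first = (list, value) => {
  const i = list.findIndex(x => x === value); // strict (===) match
  return i === -1 ? list.slice() : [...list.slice(0, i), ...list.slice(i + 1)];
};

delete_first([1,2,3,2,1], 2); // [1,3,2,1]
```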

delete_each_first/1

This is ultra-useful in algorithms and can be implemented radically faster if you have access to pointers. Many languages call this "remove" or "subtract"

For each item in the arg's list, delete the first remaining match in the source list. Return the result.

[1,2,3,2,1,2].delete_each_first([2,1,2]); // [ 3,1,2 ]
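
A free-function sketch of the semantics; the repeated indexOf scans are exactly the sort of thing a pointer-wielding native impl would avoid.

```javascript
// Sketch only: for each value in the argument, remove the first remaining match.
const delete_each_first = (list, values) => {
  const out = list.slice();
  values.forEach(v => {
    const i = out.indexOf(v);
    if (i !== -1) { out.splice(i, 1); }
  });
  return out;
};

delete_each_first([1,2,3,2,1,2], [2,1,2]); // [3,1,2]
```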

pop_while/1

Pops elements from an array while the predicate holds. Efficiency and readability win.

let c = [1,2,3,2,1];
c = c.pop_while(() => c.length > 2);  // c is now [1,2]

filter_map/1

This sounds silly until you fall in love with it.

Return false to filter, {value:anything} to map. Or [anything], which is probably faster and definitely less aesthetic. The fundamental problem is that false needs to be a valid result, so, a container is required.

let employees = [{ name: 'linda', lizard: false, salary: 100000 }, 
                 {name: 'bob', lizard: true, salary: 100000 }];

const the_system = () => employees = employees.filter_map(staff => {
  if (!(staff.lizard)) { return false; } // filthy human
  return { value: { name: staff.name, salary: Math.floor(staff.salary * 1.2) } };
});
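
A free-function sketch handling only the {value: x} container form described above (the [anything] array form is left out for brevity):

```javascript
// Sketch only: false filters the element out; {value: x} maps it to x.
const filter_map = (list, fn) => {
  const out = [];
  list.forEach(x => {
    const r = fn(x);
    if (r !== false) { out.push(r.value); }
  });
  return out;
};

filter_map([1,2,3,4], x => x % 2 ? false : { value: x * 10 }); // [20, 40]
```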

flatlength/0

Much faster and clearer than that thing you write every March. Also 50% less buggy, and reliably uniform in all but one browser.

[ [1,2,3], 4, [5, [6]], 7 ].flatlength();  // it's ... it's 7.  c'mon
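
One possible recursive sketch, as a free function:

```javascript
// Sketch only: counts leaf elements at any nesting depth.
const flatlength = list =>
  list.reduce((n, x) => n + (Array.isArray(x) ? flatlength(x) : 1), 0);

flatlength([ [1,2,3], 4, [5, [6]], 7 ]); // 7
```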

takewhile/1

Return the largest prefix of the list for which the passed predicate holds.

Again this is about readability and efficiency. But hey, that's what standard libraries are for :heart:

It's sort of the opposite of find?

Rework this for prototype plskthx. Also probably debug it because I'm garbage.

Uses split/1 from below; findIndex returns -1 when every element passes, so that case gets special handling.

const takewhile = (list, pred) => {
  const stop = list.findIndex(x => !pred(x));
  return stop === -1 ? list.slice() : list.split(stop)[0];
};

deepcopy/1

There's like 80 million of these and they're all subtly different in terrible ways.

Can we please just accept that the language needs to offer at least two of these, and that they need to be member methods of all the container types?

prefix/1

Much faster if natively implemented. Hugely alg-useful.

This, but on Array.prototype using all the high-neckbeard detail sorcery

const prefix = (ours, base) => ours.every( (item, i) => item === base[i] );

suffix/1

Like prefix, but on the butt.

partition/1

Given a predicate, produce an array of two arrays, the first of which in order are the passing and the second the failing members of the list prior. Indices are not kept.

function quicksort(List) {

  if (!(Array.isArray(List))) { throw new TypeError('Damnit, Dave.'); }
  if (List.length === 0)      { return List; }

  const [First, ... Remainder]  = List;
  const SmallerThanFirst        = Item => Item < First;
  const [ Smaller, LargerOrEq ] = Remainder.partition(SmallerThanFirst);

  return quicksort(Smaller).concat( [First], quicksort(LargerOrEq) );

}

zip/1

Produce an array-of-arrays where members line up pairwise with the passed in argument by index. Super more efficientizable using magic parallel mumbo jumbo, under the hood. Ultra duper useful.

Keeps going until all the arrays peter out. Maybe produces holes? Dunno how holes work with this. Kind of a hole thing to bring it up.

[1,2,3].zip(['a', 'b', 'c']);  
// produces [ [1,'a'], [2,'b'], [3,'c'] ]

Should take varargs because zip_n is otherwise a needed thing and that's just varargs or an array anyway. But zip_n can be a thing if varargs are the devil this year, I guess

[1,2,3].zip(['a', 'b', 'c'], ['do', 're', 'mi']);  
// produces [ [1,'a','do'], [2,'b','re'], [3,'c','mi'] ]
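
A varargs sketch as a free function; for simplicity it truncates to the first array's length instead of producing holes.

```javascript
// Sketch only: pairwise-aligns members by index across all passed arrays.
const zip = (first, ...rest) =>
  first.map((x, i) => [x, ...rest.map(r => r[i])]);

zip([1,2,3], ['a', 'b', 'c']); // [ [1,'a'], [2,'b'], [3,'c'] ]
```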

unzip/0,1

Unzips an array into N arrays where N is the length of the first array, or the argument you pass if you need to force it because some hole put holes into your array.

[ [1,'a'], [2,'b'], [3,'c'] ].unzip();
// produces [ [1,2,3], ['a','b','c'] ]

chunk/1

[1,2,3,4,5,6,7].chunk(3); // [ [1,2,3], [4,5,6], [7] ]
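
A simple slicing sketch, as a free function:

```javascript
// Sketch only: splits a list into consecutive runs of n, last run may be short.
const chunk = (list, n) => {
  const out = [];
  for (let i = 0; i < list.length; i += n) { out.push(list.slice(i, i + n)); }
  return out;
};

chunk([1,2,3,4,5,6,7], 3); // [ [1,2,3], [4,5,6], [7] ]
```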

split/1

[1,2,3,4,5,6,7].split(3); // [ [1,2,3], [4,5,6,7] ]

kunique/1

Filters down to objects with the first instance of a given value on the provided key.

[ { first: 'bob',     last: 'ross'  },
  { first: 'bob',     last: 'dobbs' },
  { first: 'william', last: 'ross'  },
  { first: 'william', last: 'riker' } ].kunique('last');

// [ { first: 'bob',     last: 'ross'  },
//   { first: 'bob',     last: 'dobbs' },
//   { first: 'william', last: 'riker' } ]; // second 'ross' omitted

If the array contains non-objects they are also filtered.

[ 1, 2, { last: 'ross' }, 3 ].kunique('last'); // [{ last: 'ross' }]
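
A free-function sketch covering both rules — first instance per key value wins, and non-objects (or objects missing the key) are dropped:

```javascript
// Sketch only: keeps the first object seen for each distinct value of `key`.
const kunique = (list, key) => {
  const seen = new Set();
  return list.filter(x =>
    x !== null && typeof x === 'object' && key in x &&
    !seen.has(x[key]) && (seen.add(x[key]), true));
};

kunique([ 1, 2, { last: 'ross' }, 3 ], 'last'); // [{ last: 'ross' }]
```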

unique/0,1

Removes later duplicate elements without otherwise reordering.

[1,2,3,6,2,4,1,5,6].unique(); // [1,2,3,6,4,5]

Parameter is strict, default true, which when false determines equality with == instead of ===.

[0, '0'].unique();      // [0, '0']
[0, '0'].unique(true);  // [0, '0']
[0, '0'].unique(false); // [0]
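
A quadratic free-function sketch; a native version would hash instead of rescanning.

```javascript
// Sketch only: keeps each element's first occurrence, === by default, == if loose.
const unique = (list, strict = true) =>
  list.filter((x, i) =>
    list.findIndex(y => (strict ? y === x : y == x)) === i);

unique([1,2,3,6,2,4,1,5,6]); // [1,2,3,6,4,5]
```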

usort

Like sort/0 and sort/1, except that matching elements are removed

let foo = [1,2,3,2,3,4];
    foo.usort(); // foo is now [1,2,3,4]

nsort

Like sort/0 and sort/1, except that a new array is produced, rather than for the sort to be in-place.

const foo = [1,2,3,2,3,4];
foo.nsort(); // returns [1,2,2,3,3,4].  foo is untouched

[1,2,3,2,3,4].nsort(); // returns [1,2,2,3,3,4] w/o the temporary

nusort

Like sort/0 and sort/1, except that a new array is produced, rather than for the sort to be in-place, and matching elements are removed.

const foo = [1,2,3,2,3,4];
foo.nusort(); // returns [1,2,3,4].  foo is untouched

[1,2,3,2,3,4].nusort(); // returns [1,2,3,4] w/o the temporary

umerge

Basically const umerge = (...arrays) => [].concat(...arrays).unique();

Can be done way more efficiently in-place and very often implemented (and botched)

histo/0,1

Histo is useful AF, and I think it should be on Array

const histo = List => List.reduce( 
  (acc, cur) => (acc[cur] === undefined
                          ? acc[cur] = 1
                          : acc[cur]++
                          , acc)
  , {} );

Which means I'd like

[10,20,30,40,30,20,10,20].histo(); // returns { 10: 2, 20: 3, 30: 2, 40: 1 }

histo should also be able to take a predicate that buckets.

// US grades are in five brackets: A, B, C, D, and F.  A through D cover the top
// four deciles; everything below 60 is an F, and F (and sometimes D) is
// considered failing.
//
// So, a 92 is an A, a 60 is a D, and a 59 is an F.

const us_grade = x =>
  String.fromCharCode(74 - (([90,80,70,60].find(tier => x >= tier) || 40) / 10));

us_grade(92); // 'A'
us_grade(71); // 'C'
us_grade(58); // 'F'

[91,84,71,54,81,98,67,81].histo(us_grade); // { A:2, B:3, C:1, D:1, F:1 }

without/1

Can be made far faster natively. Pretty big readability win.

const without = (list, nongrata) => list.filter(item => !nongrata.includes(item));

Which means I'd like

[1,2,3,4,3,2,1,2].without([2,3]); // returns [1,4,1]

intersection

Contains the unique elements present in each list.

Array.intersection([1,2,3], [2,3,4], [1,2,3,4]); // [2,3]
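
A free-function sketch: fold pairwise filters across the lists, then dedupe the survivors.

```javascript
// Sketch only: unique elements present in every passed list.
const intersection = (...lists) =>
  lists
    .reduce((acc, l) => acc.filter(x => l.includes(x)))
    .filter((x, i, a) => a.indexOf(x) === i);

intersection([1,2,3], [2,3,4], [1,2,3,4]); // [2,3]
```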

decorate_sort_undecorate/2

Varargify all/1

It would be nice to be able to write

employees.all(alive, not_incarcerated)

Ramda calls this allPass

Varargify any/1

It would be nice to be able to write

employees.any(dead, incarcerated)

Ramda calls this anyPass

grams/1

Generate the n-grams of the array.

[1,2,3,4,5,6].grams(3); // [ [1,2,3], [2,3,4], [3,4,5], [4,5,6] ]
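
A sliding-window sketch as a free function:

```javascript
// Sketch only: every length-n window of the list, in order.
const grams = (list, n) =>
  list.slice(0, list.length - n + 1).map((_, i) => list.slice(i, i + n));

grams([1,2,3,4,5,6], 3); // [ [1,2,3], [2,3,4], [3,4,5], [4,5,6] ]
```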

bucket/1

Bucket's argument may either be a string or a function.

If bucket's argument is a string, bucket is applied to an array of objects, and the string is a member which all objects will be checked for. The value of the member under the argument string will be used as the bucket key in the return object. The return object is an object of keys containing arrays, each with the members of the original object in order whose member matches the bucket key.

If bucket's argument is a function, bucket is applied to any array, the function is a bucket key generating predicate, and the bucket will be generated similarly.

That's a bunch of opaque baloney.

const Actors = [ { name: 'Idris Elba',       notable_genre: 'action' },
                 { name: 'Gal Gadot',        notable_genre: 'action' },
                 { name: 'Wanda Sykes',      notable_genre: 'comedy' },
                 { name: 'Chadwick Boseman', notable_genre: 'adventure' }
               ];

Actors.bucket('notable_genre');
// returns (full records, shown by name only for brevity)
// { action:    ['Idris Elba', 'Gal Gadot'],
//   comedy:    ['Wanda Sykes'],
//   adventure: ['Chadwick Boseman'] }

Actors.bucket( a => a.notable_genre.substring(0, 1) );
// returns { a: ['Idris Elba', 'Gal Gadot', 'Chadwick Boseman'], c: ['Wanda Sykes'] }
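
A free-function sketch covering both the string-key and predicate forms; each bucket holds the full original records.

```javascript
// Sketch only: groups records under the value produced by the key or predicate.
const bucket = (list, by) => {
  const key = typeof by === 'function' ? by : o => o[by];
  return list.reduce((acc, item) => {
    (acc[key(item)] = acc[key(item)] || []).push(item);
    return acc;
  }, {});
};

bucket([{g:'a'}, {g:'b'}, {g:'a'}], 'g');
// { a: [{g:'a'}, {g:'a'}], b: [{g:'b'}] }
```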

sample/0,1

Called without an argument, sample behaves as though 1 had been passed.

sample returns a fair, unbiased sample of the array it's called on. Array elements will be represented with their original frequency.

[1,2,3,4,5].sample();  // 2
[1,2,3,4,5].sample();  // 4
[1,2,3,4,5].sample();  // 1
[1,2,3,4,5].sample();  // 5

If a number is passed to sample, that is the size of the counted sample that will be returned.

[1,2,3,4,5].sample(2);  // [2,3]
[1,2,3,4,5].sample(2);  // [4,1]
[1,2,3,4,5].sample(2);  // [1,2]
[1,2,3,4,5].sample(2);  // [5,4]

If you call for a counted sample on a unique list, the returned results will also be unique. You won't get [5,5] because there's only one 5 to get.

On the other hand, if you call sample on a list with repetitions, the result may have repetitions if warranted.

By example, a list with ninety-nine 'a's and one 'b', sampled for three elements, will usually produce three 'a's, occasionally two 'a's and one 'b', but never two or three 'b's.

const base = new Array(99).fill('a').concat('b');
seq(1000000).map( () => base.sample(3).nsort().join('') ).histo();
// roughly { 'aaa': 970000, 'aab': 30000 }
StoneCypher commented 5 years ago

New functionals

before/3,4,5

Accepting three functions (plus optional contexts), before(pred, pre, post, ctx1, ctx2), call pred.
While pred returns false, call and return pre. When pred returns true, call and memoize post, then forever return the memoized post.

Because this is essentially a reverse Promise, I want to call this threat. Nobody thinks that's funny but me.

before(cheating_detected, handle_user_calls, permanent_cheater_lockdown);
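
A sketch omitting the ctx arguments; the names in the usage below are hypothetical stand-ins for the example above.

```javascript
// Sketch only: pre until pred fires, then post once, memoized forever.
const before = (pred, pre, post) => {
  let fired = false, memo;
  return (...args) => {
    if (!fired && pred(...args)) { fired = true; memo = post(...args); }
    return fired ? memo : pre(...args);
  };
};

let cheating_detected = false;  // hypothetical flag for illustration
const handler = before(() => cheating_detected,
                       () => 'handling user call',
                       () => 'locked down');

handler();                // 'handling user call'
cheating_detected = true;
handler();                // 'locked down', and forever after
```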
StoneCypher commented 5 years ago

New globals

Language level memoization, idempotence, and partial application (no that isn't currying) could have a pretty significant contribution to engine implementers' ability to optimize

StoneCypher commented 5 years ago

Object additions

ukeymerge

What we always do for config stuff. Walk N objects-or-maps-or-whatever, either from an array or more likely as varargs, taking keys from the earliest present.

What you want for

const Config = ukeymerge(userPreferences, themeSettings, editorDefaults);

So it'll be the non-terrible version of

const ukeymerge = (...objects) => {

  const out = {};

  objects.forEach(o =>
    Object.keys(o).forEach(k => {
      if (out[k] === undefined) { out[k] = o[k]; }
    })
  );

  return out;

};

That's probably wrong and only 20% of us can confidently eyeball it

Can be done way more efficiently in-place and very often implemented (and botched) in userland code.

Which means I'd like

const userPreferences = { indent: 'space' },
      themeSettings   = { indent: 'tab', size: 4 },
      editorDefaults  = { indent: 'tab', size: 2, smart: true };

const tabExample = userPreferences.ukeymerge(themeSettings, editorDefaults);
// { indent: 'space', size: 4, smart: true }

deepEquals/1

A sane array equals thing already. Jesus

const a1 = [1,2,3],
      a2 = [1,2,3];

const o1 = {a:1, b:2},
      o2 = {a:1, b:2};

const n1 = [a1, o1, {z: [a1, o1]}],
      n2 = [a1, o1, {z: [a1, o1]}];

console.log(a1.deepEquals(a2)? 'yay' : 'javascript');
console.log(o1.deepEquals(o2)? 'yay' : 'javascript');
console.log(n1.deepEquals(n2)? 'yay' : 'javascript');

similar/1

Two objects are similar if they have the same keys.

const similar = (o1, o2) =>
  Object.keys(o1).nsort().deepEquals(
    Object.keys(o2).nsort()
  );

// lol writing this in es3.  programmer.hasOwnSanity

({ name: 'bob' }).similar({ name: 'carol' });  // true
({ name: 'bob' }).similar({ face: 'bob'   });  // false

shaped_like/1,2

Like similar, but for a list of object keys in the first place.

The second argument is a boolean, default false, which when true enforces exhaustive keys (that is, not just everything, but also nothing else.)

const troll = {
  name       : 'Thog',
  gender     : 'male',
  class      : 'warrior',
  level      : 14,
  job_title  : 'marauder',
  attributes : { str: 16, 
                 wis: 11, 
                 int: 10, 
                 dex: 7, 
                 con: 19, 
                 cha: 11 },
  hit_points : 141,
  thac0      : 7,
  salary     : { value    : 15, 
                 currency : 'gp', 
                 timing   : { frame: 'monthly', 
                              dates: [1,15] } 
               }
};

const wizard   = [ 'level', 'hit_points', 'spell_list' ],
      employee = [ 'job_title', 'hit_points', 'salary' ];

troll.shaped_like(wizard);          // false - no spell list
troll.shaped_like(employee);        // true - has all three
troll.shaped_like(employee, true);  // false - has other things also
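
A free-function sketch of both modes:

```javascript
// Sketch only: every listed key must exist; exhaustive mode forbids extras.
const shaped_like = (obj, keys, exhaustive = false) =>
  keys.every(k => k in obj) &&
  (!exhaustive || Object.keys(obj).every(k => keys.includes(k)));

shaped_like({ a: 1, b: 2 }, ['a']);        // true
shaped_like({ a: 1, b: 2 }, ['a'], true);  // false - has other things also
```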

pick/1

A natural for destructuring assignment with pattern matching to get through complex datastructures.

({ a: "fonzie", b: "stringer", c: "your way out" }).pick(['a', 'b']);
// returns ['fonzie', 'stringer']

pick_skip/1

The converse of pick.

({ a: "fonzie", b: "stringer", c: "your way out" }).pick_skip(['a', 'b']);
// returns ["your way out"]

winnow/1

Winnow is like pick, but it keeps the object's structure and (reduced) keys.

({ a: "fonzie", b: "stringer", c: "your way out" }).winnow(['a', 'b']);
// returns { a: 'fonzie', b: 'stringer' }

Winnow can also take a predicate.

const only_angry = key => key.toUpperCase() === key;

const bad_example = { 
  grr: 'grumble', 
  GRR: 'LOUDER', 
  hello: 'greeting', 
  HI: 'WHY ARE YOU SCARED' 
};

bad_example.winnow(only_angry); 
// returns { GRR: 'LOUDER', HI: 'WHY ARE YOU SCARED' }
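
A free-function sketch covering both the key-list and predicate forms:

```javascript
// Sketch only: keep entries whose key is in the list, or passes the predicate.
const winnow = (obj, sel) => {
  const keep = typeof sel === 'function' ? sel : k => sel.includes(k);
  return Object.fromEntries(Object.entries(obj).filter(([k]) => keep(k)));
};

winnow({ a: 1, b: 2, c: 3 }, ['a', 'b']); // { a: 1, b: 2 }
```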

This is startlingly useful for deep nested JSON validation, by example, which must be vetted for minimum key anger levels at all times to ensure integrity.

remove/1

The converse of winnow, remove returns a new object with the noted keys omitted entirely.

({ a: "fonzie", b: "stringer", c: "your way out" }).remove(['a', 'b']);
// returns { c: "your way out" }

Remove can also take a predicate.

const only_angry = key => key.toUpperCase() === key;

const bad_example = { 
  grr: 'grumble', 
  GRR: 'LOUDER', 
  hello: 'greeting', 
  HI: 'WHY ARE YOU SCARED' 
};

bad_example.remove(only_angry); 
// returns { grr: 'grumble', hello: 'greeting' }

filter/1

Objects want filter too, friends.

Details left as an exercise for the reader

StoneCypher commented 5 years ago

Math additions

seq/1

We all know this by heart for a reason.

A terrible, terrible reason.

const seq = n => (new Array(n)).fill(false).map( (_, i) => i );

I triple-guarantee there's a better way to do that. Rainbow sprinkles on the contract.
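
For the record, one such better way already exists today, via Array.from's mapping callback:

```javascript
// Works today: builds [0, 1, ..., n-1] without the fill/map dance.
const seq = n => Array.from({ length: n }, (_, i) => i);

seq(5); // [0, 1, 2, 3, 4]
```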

range/3

Seq on steroids.

Array.range(10, 20, 3);   // [10, 13, 16, 19]
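
A half-open sketch matching the example: from (inclusive) up to to (exclusive), stepping by step.

```javascript
// Sketch only: half-open interval [from, to) in increments of step.
const range = (from, to, step = 1) => {
  const out = [];
  for (let i = from; i < to; i += step) { out.push(i); }
  return out;
};

range(10, 20, 3); // [10, 13, 16, 19]
```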

rnd/1,2

C'mon.

// I called for implementation-controllable randomness elsewhere.  Assume that's
// handled here, somehow.  Thanks

const rnd_i = ({ min=0, max, impl='PCG', seed }) =>
  // I probably got this wrong
  Math.floor(Math.random(impl, seed) * (max - min)) + min;

const rnd = (a, b) =>
  typeof(a) === 'object'
    ? rnd_i(a)
    : b === undefined
      ? rnd_i({ max: a })
      : rnd_i({ max: b, min: a });

Math.rnd(4);   // [0 .. 3]
Math.rnd(1,6); // [1 .. 6]

Probably you can't just call random like that because it'd trample other unrelated states, so maybe

const rnd = Math.make_rnd({ impl: 'PCG', seed: ..., step: 408 });

or whatever, and then you could query for current state and what have you

StoneCypher commented 5 years ago

there's an enormous amount more that the js standard library legitimately needs

unfortunately all i got was a bunch of flak, so, i stopped trying

StoneCypher commented 5 years ago

i think a big part of the reason that we're on version nine of the language and don't have deep copy yet is that the language's gatekeepers are nowhere near as inclusive as they think they are, and every time someone like me has something to add, the door gets closed in their face

obedm503 commented 5 years ago

@StoneCypher maybe it's just too much too quickly. because javascript has no real versions, features cannot be added willy-nilly.

If I'm not mistaken, the purpose of this repo is to track the addition of the possibility of a standard library. Until there is system/syntax for the standard library, I doubt tc39 will want to consider the actual additions of modules to the standard library.

Once there is a system in place, I assume each addition will have to go through the tc39 process.

leobalter commented 5 years ago

@StoneCypher it seems like you're proposing what could be a library of your own, like Lodash. Keep in mind lodash has very strong popularity, and even so it is not meant to become a standard library for JS.

Following a standards process requires some strategy, and even some (team) engineering, to convince different independent people and companies to agree with your proposal. A (max, min) strategy is unmet but highly required here. Nothing goes into the spec as imposed. The consensus process tries its best to avoid that.

A topic like this, "Okay", reads as high trolling to the community in general. I'd like to ask you to work on some further understanding of the standards process.

This specific paper helped me a lot understanding the process I recommend reading: http://wirfs-brock.com/allen/files/papers/standpats-asianplop2016.pdf

The books referenced in that paper are also great:

I believe they are very educational and might provide a proper answer I can't really provide in such a short comment. I hope you understand that.

StoneCypher commented 5 years ago

no, i am not proposing a library of my own, like lodash. it's unfortunate that you closed this without understanding it first

i am very clearly and repeatedly saying "the reason things like lodash exist is that the js standard library isn't good enough and is gatekept by people who resist change with no apparent actual reason"

it's been 20 years. getting the basics like deep copy is not "adding features willy nilly." that's just heel dragging

StoneCypher commented 5 years ago

people have been releasing incompatible, bug-ridden libraries for these things since es2

we're coming up on es9. the language turned 24 this year.

it's time to have our six basic containers and our 39 basic functional predicates.

StoneCypher commented 5 years ago

@obedm503 - we've been waiting on that system for 20 years

there's no reason to wait. most of these are one-liners

people act like the language having core, reliable, uniform implementations of its basics is somehow extremism

what are they actually adding to the language?

well, they care about syntax when it's cosmetic stuff like adding commas to the ends of arglists, but not when it's meaningful stuff like message dispatch, or pattern matching until it's proposed internally, or generators

well, they care about functions, when it's string padding, but not when it's functional predicates

we've got six different asynchronicity mechanisms now - promises, callbacks, async, generator yield, web workers, and timeouts. we're adding a seventh, async iteration, which is sufficiently difficult that i don't think it'll actually get used. none of them can safely be threads or processes, but if we want a very slight rephrasing to make threads or processes possible but not required by an implementor, it's out of scope

we've got shared memory and atomics but no processes or threads 🤣

we're making complex, deep alterations to tagged template strings (i even use them - they're pretty cool) - but just adding our functional predicate basics to array, or the rudimentary containers to the language, is "adding features willy nilly"

except flatmap and flatten, because those come from an insider, so those two go in.

sadly many gatekeepers derive their power from saying no, rather than from saying "yes, the language would be better with this trivial, easy change"

obedm503 commented 5 years ago

I am not saying that these features shouldn't be added. All I'm saying is that this is not the repo for such suggestions.

we've been waiting on that system for 20 years

and this is why this proposal is exciting! It's about time for a standard library system that doesn't depend on adding another global.

sadly many gatekeepers derive their power from saying no, rather than from saying "yes, the language would be better with this trivial, easy change"

once something is added to ES, it cannot really be removed. hence why "trivial changes" must be looked at with 20 years foresight.

StoneCypher commented 5 years ago

with was removed

these are nearly all things that were in lisp-1. ample foresight is achieved

StoneCypher commented 5 years ago

i mean if we want to work on them and plan them that's great. take one's time, do it right, that's awesome

but all we ever get is "this isn't the place."