darobin / formic

Playing with forms
MIT License

Sparse Arrays #15

Open · rodneyrehm opened this issue 10 years ago
<input name="mix[0]" value="alpha">
<input name="mix[5]" value="bravo">
<!--
  { mix: ["alpha", null, null, null, null, "bravo"] }
  mix[0]=alpha&mix[5]=bravo
-->

This has been pointed out before: this is not only "ugly", it's a trivial DoS waiting to happen. A tiny payload naming a single huge index forces the receiver to pad the array with millions of nulls.
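To make that concrete, here is a minimal sketch of a naive decoder; the parsing code is hypothetical, not formic's actual algorithm:

var name = "mix[999999999]";
var match = /^(\w+)\[(\d+)\]$/.exec(name);
var result = {};
result[match[1]] = [];
result[match[1]][Number(match[2])] = "x";
// The JS array object stays sparse, but any consumer with dense
// arrays (PHP, or a JSON encoding that pads holes with null) must
// materialize ~1e9 slots from a field that is under 20 bytes on the wire.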

While this was trying to be developer-friendly, a simple "fix" can be borrowed from PHP's json_encode, which distinguishes sequential from non-sequential arrays: if the keys of a map do not fulfill all of the following conditions, the map is not converted to an array but serialized as an object:

var data = {"0": "alpha", "1": "bravo", "2": "charlie"};
// all keys must be non-negative integers
var _notInteger = /[^0-9]/;
var _invalidKeys = Object.keys(data).some(function (key) {
  return _notInteger.test(key);
});
var keys = Object.keys(data).map(Number);
// lowest index must be 0
var _lowerBound = Math.min.apply(Math, keys) !== 0;
// highest index must be exactly keys.length - 1
var _upperBound = Math.max.apply(Math, keys) !== keys.length - 1;

if (_invalidKeys || _lowerBound || _upperBound) {
  // serialize to Object
} else {
  // serialize to Array
}
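Applied to the example at the top of this issue, the check falls through to the object branch:

var mix = {"0": "alpha", "5": "bravo"};
// keys map to [0, 5]: the minimum is 0, but the maximum (5) is not
// keys.length - 1 (which is 1), so _upperBound is true and we serialize
// to { mix: { "0": "alpha", "5": "bravo" } } with no padding nulls.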
darobin commented 10 years ago

The problem is that there are legitimate reasons for sending (moderately) sparse arrays. I really don't want to remove those.

I would think it reasonable for UAs to impose a limit on the number of null fields in an array. WDYT?
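A sketch of how such a cap could slot into the conversion step; the limit of 100, the helper name, and the fallback-to-object behaviour are all illustrative, not anything specified:

var MAX_HOLES = 100; // illustrative cap on padding nulls

function entriesToValue(entries) {
  // entries: [[index, value], ...] parsed from names like "mix[5]"
  var indexes = entries.map(function (e) { return e[0]; });
  var length = Math.max.apply(Math, indexes) + 1;
  if (length - entries.length > MAX_HOLES) {
    // too sparse: fall back to an object keyed by index
    var obj = {};
    entries.forEach(function (e) { obj[String(e[0])] = e[1]; });
    return obj;
  }
  var arr = [];
  for (var i = 0; i < length; i++) arr[i] = null;
  entries.forEach(function (e) { arr[e[0]] = e[1]; });
  return arr;
}

With this, entriesToValue([[0, "alpha"], [5, "bravo"]]) still yields the padded array, while [[0, "alpha"], [999999999, "x"]] degrades to a two-key object instead of a billion-slot array.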

macek commented 10 years ago

@darobin I'm going to agree with your latest comment. It's certainly worth some discussion, but the fact that (moderately) sparse arrays could be misused is not, by itself, a be-all-end-all reason to deny support for them.

emilv commented 9 years ago

This is not so much a UA issue as it is an issue with the server-side conversion algorithm for supporting older clients. Someone can send in a relatively small payload that blows up in server-side memory.

Enforcing a limit on null values might be surprising to a lot of developers.

We should specify this limit, either as a specific upper bound or at least by acknowledging that limits may exist and defining what should happen when one is hit. Otherwise we will quickly get a lot of different behaviours, which rather defeats the point of a standard.

I propose that we address this issue by:

  1. Mentioning that there might be such a limit, and
  2. Saying how to enforce this limit in the conversion algorithm
Boldewyn commented 8 years ago

Apparently, browsers do not yet (as of 2016-03) feature a security guard against massive sparse arrays: http://codingsight.com/how-to-hang-chrome-firefox-and-node-js-inside-native-func/
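The failure mode is easy to reproduce without any browser bug; a sketch (expect an out-of-memory error or a long hang, so don't run it casually):

var sparse = [];
sparse[1e9] = 1;        // cheap: the array object itself stays sparse
JSON.stringify(sparse); // expensive: visits indexes 0 through 1e9 - 1
                        // and emits "null," for every hole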