CityWebConsultants / Iris

Modular content management and web application framework built with Node.js and MongoDB
http://irisjs.org

Mongo indexes need correcting #130

Closed: adam-clarey closed this issue 8 years ago

adam-clarey commented 8 years ago

With our organisations entity type, a couple of fields are set as 'unique', but in Mongo an index ends up being created for every field. So even though those other fields are not set as 'unique', the fact that they are indexed means the entity will fail to save if any of their values isn't unique.

It will create errors on submission like:

E11000 duplicate key error index: mydatabase.collection1.$field_name dup key: { : null }

So even if the extra fields are empty, saving still fails, because the index already contains null entries from other documents in the database.
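For context, here is a minimal mongo shell sketch of the same failure (the collection and field names are placeholders, not our actual schema): a non-sparse unique index treats a missing field as null, so the second document saved without that field collides on { : null }.

db.collection1.createIndex({ "field_name": 1 }, { unique: true });
db.collection1.insert({ title: "first org" });  // field_name absent, indexed as null
db.collection1.insert({ title: "second org" }); // fails with E11000 ... dup key: { : null }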

I'm not sure how indexes are being set/updated so will need to investigate further.

You can view the indexes for your database with the command:

db.myCollection.getIndexes();
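On an affected collection the output looks something like this (a hypothetical example with illustrative names); the per-field entries alongside _id are the ones causing the clashes:

[
  { "v" : 1, "key" : { "_id" : 1 }, "name" : "_id_", "ns" : "mydatabase.organisations" },
  { "v" : 1, "key" : { "field_names.new" : 1 }, "unique" : true, "name" : "field_names.new_1", "ns" : "mydatabase.organisations" }
]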

Manually removing the unnecessary indexes, for example:

db.organisations.dropIndex({ "field_names.new": 1 });

will fix the problem.
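getIndexes() also reports a name for each index, so the same index can be dropped by name instead (assuming the default auto-generated name):

db.organisations.dropIndex("field_names.new_1");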

A proper solution needs to be built into the system.
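One possible direction, purely as a sketch and not how Iris currently builds its schemas: if the unique constraint stays on the Mongoose schema, marking it sparse keeps documents with an empty value out of the unique index, and fields that aren't meant to be unique should get no index at all. Field names below are illustrative.

var mongoose = require("mongoose");

var organisationSchema = new mongoose.Schema({
  // sparse: documents without a value are left out of the unique index,
  // so multiple organisations with this field empty no longer clash on null
  registrationNumber: { type: String, unique: true, sparse: true },
  // ordinary fields get no index at all
  description: String
});

var Organisation = mongoose.model("organisation", organisationSchema);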

FilipNest commented 8 years ago

So is this a problem with the unique field setting we've got? It is true that null counts as a value for unique fields that are required. I literally have no idea what any of the above means, so you're probably going to have to investigate it yourself! There's nothing built into the system specifically for indexes. I suppose we could take the Mongo part of the unique field out and rely on our own presave hook system.

Not sure. Currently, setting a unique field runs a special presave hook to check it, but it also adds the unique property to the Mongoose schema. You're probably doing something way beyond anything I've even considered, so I'll leave this one to you.
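For reference, the presave-only approach mentioned above could look roughly like this in plain Mongoose, i.e. drop the unique index entirely and do the check in code before saving. This is only a sketch with illustrative names, not Iris's actual hook API:

var mongoose = require("mongoose");

var organisationSchema = new mongoose.Schema({
  registrationNumber: String
});

// Manual uniqueness check instead of a unique index.
organisationSchema.pre("save", function (next) {
  var doc = this;

  // Skip the check when the field is empty; this is exactly the case
  // that produces the null duplicate key errors with a unique index.
  if (!doc.registrationNumber) {
    return next();
  }

  doc.constructor.findOne({
    registrationNumber: doc.registrationNumber,
    _id: { $ne: doc._id }
  }, function (err, existing) {
    if (err) {
      return next(err);
    }
    if (existing) {
      return next(new Error("registrationNumber must be unique"));
    }
    next();
  });
});

var Organisation = mongoose.model("organisation", organisationSchema);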