Closed marcomtz85 closed 6 years ago
Yes, this is possible, but I hadn't found any case for adding it to JsStore, as we can already query on multiple columns in JsStore.
Could you tell me your case? I mean, why do you need it?
I understand that adding a multiple-column index allows searching on multiple columns & it might be a little faster.
Thanks for the fast answer.
I'm working on a project that has a section for making offline surveys. To avoid a mess with people's addresses, I've downloaded my country's states/cities/neighborhoods/postal codes database and managed to load an abbreviated version of it via my webapp into the devices with JsStore's .bulkInsert very well. We're talking about a table with about 150,000 records.
The thing is, when the person answering the survey on the street knows his/her postal code, the search by this key is really, really fast, and we almost instantly get the neighborhood or neighborhoods that are under this zip code.
But if the person doesn't know this code, when we search for the neighborhoods inside a given state and city with .select on two or three columns, the search can take ~10-15 seconds in states with a lot of cities and neighborhoods. That's a lot for locally stored data, when an AJAX API call to a SQL server with compound indexes is much faster than that.
Maybe projects with this number of rows are really specific or specialized, but in some countries the ability that IndexedDB and JsStore give you to work truly offline on city streets, roads or little towns is really helpful; you don't depend on unreliable coverage or bandwidth, or on your volunteers'/collaborators' data plans.
Thank you again, and I hope this helps.
Ok, I got your point. I will add this feature. I need to find a way to feed it through JsStore. If you have any suggestion, feel free to talk.
Btw - could you give me those records in a json file along with the db structure? This will help me compare the performance. Maybe based on these cases, I can find other solutions 😀 .
@ujjwalguptaofficial of course.
You can find the JSON containing the data here. I'm sorry that it's in Spanish, but let me explain the structure of the file.
Under the "data" child, you'll find 3 more: estados (states), municipios (cities/counties), and colonias (neighborhoods). Each one of them contains a child called "catalogo", which is the catalog of the fields contained in another child called "datos" (data). I separated them so the "colonias" data was as light as possible, not repeating states' and cities' names, or the field names. Records contain only their ids (or "claves").
And the table's structure looks like this:
var db_name = 'gesdev';
var db_version = 2;

var tabla_geo_estados = {
  name: 'geo_estados',
  version: db_version,
  columns: [
    {
      name: 'inegi_entidad_clave',
      unique: true,
      primaryKey: true,
    },
  ],
};

var tabla_geo_municipios = {
  name: 'geo_municipios',
  version: db_version,
  columns: [
    {
      name: 'inegi_entidad_clave',
      notNull: true,
      dataType: JsStore.DATA_TYPE.Number,
    },
    {
      name: 'inegi_municipio_clave',
      notNull: true,
      dataType: JsStore.DATA_TYPE.Number,
    },
    {
      name: 'ine_municipio_id',
      notNull: true,
      dataType: JsStore.DATA_TYPE.Number,
    },
  ],
};

var tabla_geo_colonias = {
  name: 'geo_colonias',
  version: db_version,
  columns: [
    {
      name: 'inegi_entidad_clave',
      notNull: true,
      dataType: JsStore.DATA_TYPE.Number,
    },
    {
      name: 'inegi_municipio_clave',
      notNull: true,
      dataType: JsStore.DATA_TYPE.Number,
    },
    {
      name: 'codpostal',
      notNull: true,
      dataType: JsStore.DATA_TYPE.Number,
    },
    {
      name: 'asentamiento_id',
      notNull: true,
      dataType: JsStore.DATA_TYPE.Number,
    },
  ],
};

var database = {
  name: db_name,
  version: db_version,
  tables: [
    tabla_geo_estados,
    tabla_geo_municipios,
    tabla_geo_colonias,
  ],
};
Searching in "geo_estados" and "geo_municipios" isn't problematic because the first one has 32 records and the second ~2,500. I was thinking of a solution that merges inegi_entidad_clave and inegi_municipio_clave into one field, so city 192 of state 30 would be 30192, as none of the states reaches 1,000 cities; but I don't think it's the most convenient thing to treat local and server information differently.
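The merged-field workaround above can be sketched like this (hypothetical helpers, not part of JsStore): since no state reaches 1,000 cities, `state * 1000 + city` is fully reversible:

```javascript
// Hypothetical helpers for the merged-field workaround: pack the state and
// city ids into one number, assuming no state has 1,000 or more cities.
function packClave(entidad, municipio) {
  return entidad * 1000 + municipio;
}

function unpackClave(clave) {
  return {
    entidad: Math.floor(clave / 1000),
    municipio: clave % 1000,
  };
}

console.log(packClave(30, 192));   // 30192
console.log(unpackClave(30192));   // { entidad: 30, municipio: 192 }
```

The packed value could be stored in an extra indexed column, at the cost of the local schema diverging from the server's (which is exactly the drawback mentioned above).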
If you try searching "geo_colonias" for the neighborhoods that are inside state (inegi_entidad_clave) 30 and city (inegi_municipio_clave) 192 on a decent computer or a high-end mobile device, the query takes 3 or 4 seconds (not too bad), but on mid or low-end mobile devices that time increases.
I'm really convinced that multiple-column indexes would be a really cool feature for seamlessly working both offline and online, with the ability to use pretty similar code for both scenarios, giving the final user a high-speed experience.
Thank you, and I'll take a look at the source code hoping to make a good suggestion.
Not a problem with the Spanish. If I can't understand it, I will learn Spanish, mi amigo.
I am thinking about adding an indexes option in the table, so something like this -
var tabla_geo_colonias = {
  name: 'geo_colonias',
  version: db_version,
  indexes: [datas], // will find out what will be the format to get the data
  columns: [
  ],
};
What do you think?
Hi there - I am not able to figure out what to take from that json file & insert into the tables.
Could you please give me the code to insert data into all tables from that json file?
Also, I need the query that is taking time, with the respective values.
Hey there - I have added this feature in v2.8.0. You need to specify the keyPath option in a column.
e.g. -
var tabla_geo_municipios = {
  name: 'geo_municipios',
  columns: [{
    name: 'inegi_municipio_clave',
    notNull: true,
    dataType: JsStore.DATA_TYPE.Number,
  }, {
    name: 'ine_municipio_id',
    notNull: true,
    dataType: JsStore.DATA_TYPE.Number,
  }, {
    name: 'ine_municipio_id_inegi_municipio_clave',
    keyPath: ['ine_municipio_id', 'inegi_municipio_clave'] // this adds a multiple-column index named ine_municipio_id_inegi_municipio_clave
  }],
};
Now you can search using the multiple-column index. So let's say your municipio_id is 5454 & your municipio_clave is 1234556:
con.select({
  from: 'geo_municipios',
  where: {
    ine_municipio_id_inegi_municipio_clave: [5454, 1234556]
  }
})
Also make sure the order of the values is the same as you defined in keyPath.
Let me know if you have any questions.
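Since the value order has to match the keyPath declaration exactly, one way to avoid ordering mistakes is a small helper (hypothetical, not part of JsStore) that builds the compound value array from a plain object:

```javascript
// Hypothetical helper: build the compound value array in the exact order
// the keyPath was declared, so the caller never has to remember the order.
var keyPath = ['ine_municipio_id', 'inegi_municipio_clave'];

function compoundValue(keyPath, values) {
  return keyPath.map(function (column) {
    return values[column];
  });
}

// Property order in the input object doesn't matter:
console.log(compoundValue(keyPath, {
  inegi_municipio_clave: 1234556,
  ine_municipio_id: 5454
})); // [5454, 1234556]
```

The resulting array would then be passed as the where value for the compound index column.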
Btw - I tested with a table which has 155,570 records & it's not much faster. Check out the screenshot.
It might be faster in your case; give it a try and let me know.
Hi. I'm sorry for not answering before, but I've been very busy these last few days.
Hi there - I am not able to figure out what to take from that json file & insert into the tables.
Could you please give me the code to insert data into all tables from that json file?
var tablas_secuencia = [
  {
    tabla: 'geo_estados',
    objeto: 'estados',
  },
  {
    tabla: 'geo_municipios',
    objeto: 'municipios',
  },
  {
    tabla: 'geo_colonias',
    objeto: 'colonias',
  },
]; // This maps each table (objectstore) to the object from the JSON file that will feed it
var chunk_size = 250; // Size of 250 records for each .bulkInsert
var cuentasegtablas = 0; // Index of the table (objectstore) being processed
var cuentacampos = {}; // Number of fields of the table
var cuenta = {}; // This object will contain the number of records of each object inside the JSON file
var cuentatemporal = {}; // Object with the number of records to be processed inside each chunk
var cuentasegtemporal = {}; // Number of records already processed inside each chunk
var cuentaseg = {}; // Number of records already processed of each object
var cuentachunks = {}; // Total number of chunks of each object
var cuentasegchunks = {}; // Chunks already processed
var cuentasegcampos = {}; // Fields already processed inside each record
cuentaseg["estados"] = 0;
cuentaseg["municipios"] = 0;
cuentaseg["colonias"] = 0;

// We declare a function that iterates over each table
function inserta(tabla, objeto) {
  tabla = tabla || '';
  objeto = objeto || '';
  if (cuentaseg[objeto] == 0) {
    cuentacampos[objeto] = countInObject(data.data[objeto].catalogo);
    cuenta[objeto] = countInObject(data.data[objeto].datos);
    cuentachunks[objeto] = Math.ceil(cuenta[objeto] / chunk_size);
    console.log('Chunks ' + objeto + ': ' + cuentachunks[objeto]);
    cuentasegchunks[objeto] = 0;
  }
  if (cuentasegchunks[objeto] < cuentachunks[objeto]) {
    var valores = [];
    cuentasegtemporal[objeto] = 0;
    var diferencia = cuenta[objeto] - cuentaseg[objeto];
    if (diferencia < chunk_size) {
      cuentatemporal[objeto] = diferencia;
    } else {
      cuentatemporal[objeto] = chunk_size;
    }
    while (cuentasegtemporal[objeto] < cuentatemporal[objeto]) {
      valores[cuentasegtemporal[objeto]] = {};
      cuentasegcampos[objeto] = 0;
      while (cuentasegcampos[objeto] < cuentacampos[objeto]) {
        valores[cuentasegtemporal[objeto]][data.data[objeto].catalogo[cuentasegcampos[objeto]].titulo] = data.data[objeto].datos[cuentaseg[objeto]][cuentasegcampos[objeto]];
        ++cuentasegcampos[objeto];
      }
      ++cuentasegtemporal[objeto];
      ++cuentaseg[objeto];
    }
    db_con.bulkInsert({
      into: tabla,
      values: valores
    }).then(function(results) {
      ++cuentasegchunks[objeto];
      inserta(tabla, objeto);
    }).catch(function(error) {
      console.log('Error inserting ' + objeto + ' into DB:', error);
    });
  } else {
    ++cuentasegtablas;
    if (tablas_secuencia[cuentasegtablas]) {
      inserta(
        tablas_secuencia[cuentasegtablas].tabla,
        tablas_secuencia[cuentasegtablas].objeto
      );
    } else {
      $('#creditos_texto').empty();
      funciones_ventana3();
      console.log('Geographic load finished.');
      storage.set('geo_disponible', 1);
    }
  }
}
inserta(
  tablas_secuencia[cuentasegtablas].tabla,
  tablas_secuencia[cuentasegtablas].objeto
);
I think it could be cleaner, but you know, we're always in a rush. I also made an extra effort by separating the field metadata from the data records in the JSON file in order to make it as small as possible.
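For what it's worth, the record-building and chunking logic above can be isolated into one small pure helper (a sketch, assuming the same positional "catalogo"/"datos" shape; the actual db_con.bulkInsert() calls are omitted so the snippet stays self-contained):

```javascript
// Sketch of the same chunking logic as a pure function: turn the positional
// "datos" rows into record objects and slice them into chunks of a given
// size. Each chunk would then go to one db_con.bulkInsert() call.
function buildChunks(catalogo, datos, chunkSize) {
  var records = Object.keys(datos).map(function (key) {
    var record = {};
    catalogo.forEach(function (campo, i) {
      record[campo.titulo] = datos[key][i];
    });
    return record;
  });
  var chunks = [];
  for (var i = 0; i < records.length; i += chunkSize) {
    chunks.push(records.slice(i, i + chunkSize));
  }
  return chunks;
}

// Example with 3 rows and a chunk size of 2 -> 2 chunks.
var chunks = buildChunks(
  [{ titulo: 'inegi_entidad_clave' }],
  { 0: [30], 1: [9], 2: [14] },
  2
);
console.log(chunks.length); // 2
```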
Also, I need the query that is taking time, with the respective values.
The query with a noticeably slow response time is the one that gets the list of neighborhoods for a city id inside a given state:
db_con.select({
  from: 'geo_colonias',
  where: {
    inegi_entidad_clave: parseInt(inegi_entidad_clave),
    inegi_municipio_clave: parseInt(inegi_municipio_clave),
  }
}).then(function(results) {
  console.log(results);
}).catch(function(error) {
  console.error(error);
});
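To put a number on queries like the one above, a timing wrapper along these lines could be used (a sketch using the standard performance.now(); the JsStore promise is passed in as a factory, shown here with a stand-in):

```javascript
// Sketch: time any promise-returning query with the standard
// performance.now(), logging the elapsed milliseconds and passing
// the results through unchanged.
function timed(label, promiseFactory) {
  var start = performance.now();
  return promiseFactory().then(function (results) {
    console.log(label + ': ' + (performance.now() - start).toFixed(1) + ' ms');
    return results;
  });
}

// Usage with a stand-in for the JsStore select above:
timed('geo_colonias select', function () {
  return Promise.resolve([{ codpostal: 91000 }]);
}).then(function (results) {
  console.log(results.length); // 1
});
```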
Hey there - I have added this feature in v2.8.0. You need to specify the keyPath option in a column.
[...]
Now you can search using the multiple-column index. So let's say your municipio_id is 5454 & your municipio_clave is 1234556.
[...]
Also make sure the order of the values is the same as you defined in keyPath.
Let me know if you have any questions.
Thank you very much! It looks great. I'll give it a try and I'll let you know what I find!
¡Gracias amigo!
Thank you again!! Really great job!
I gave it a try, and both on my computer and on a low-end tablet the benchmark for this query dropped well below 1 second. Chrome, Firefox and the latest Microsoft Edge (44) performed consistently well!
The only little detail is that Microsoft Edge 41 (the one included with the October 2017 Windows 10 Update) logs an uncaught error:
{code: 0, message: "DataError", name: "DataError"}
Maybe that version didn't support compound keyPath indexes?
I'll check for a fallback for them, but maybe that browser has too few users to take much care about it.
Yep, seems like Edge had an issue with compound keys & multiEntry keys; they have fixed it in the latest version. For this, you can check the Edge version & run the older code.
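The version check could be sketched like this (a hypothetical user-agent parse; note that legacy Edge reports the EdgeHTML engine version in the "Edge/xx" UA token, which differs from the app version shown on Edge's About page, so the threshold to fall back on is something you'd verify):

```javascript
// Hypothetical fallback check: extract the EdgeHTML version from the UA
// string (legacy Edge reports "Edge/xx"); returns null for other browsers.
function edgeHtmlVersion(userAgent) {
  var match = /Edge\/(\d+)/.exec(userAgent);
  return match ? parseInt(match[1], 10) : null;
}

var oldEdgeUa = 'Mozilla/5.0 ... Edge/16.16299'; // abbreviated example UA
console.log(edgeHtmlVersion(oldEdgeUa)); // 16
console.log(edgeHtmlVersion('Mozilla/5.0 (X11; Linux) Firefox/60.0')); // null

// In the browser one would check navigator.userAgent and, below some
// threshold, run the older multi-column select instead of the compound index.
```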
Yes, that's it. Thank you very much, and again, congratulations on your great work!
Docs for keyPath are here - http://jsstore.net/tutorial/keypath/
Hi,
First of all, I would like to congratulate you all for contributing to this great tool that makes a complex API like IndexedDB really friendly for developers. Really good job.
Secondly, I've been reading your docs and other forums searching for the capability of creating tables (objectstores) with "multiple-column indexes" in JsStore. I've read that this is possible with the native IndexedDB API, but I couldn't find how to do it with this project.
If I accidentally missed it, I would be happy to be corrected; and if not, it would be a really cool feature.
Thank you very much!