AlaSQL / alasql

AlaSQL.js - JavaScript SQL database for browser and Node.js. Handles both traditional relational tables and nested JSON data (NoSQL). Export, store, and import data from localStorage, IndexedDB, or Excel.
http://alasql.org
MIT License

Schema Creation for Alasql takes time #409

Open seal713 opened 9 years ago

seal713 commented 9 years ago

Hi,

I am using AlaSQL to create a bunch of tables, around 70 in total, and I found it takes about 10 seconds to complete all of them in iPad Safari. Could you please let us know whether this is expected, or whether there is any way to improve the performance here?

Secondly, is there any method to persist the schema along with the data from AlaSQL to IndexedDB or WebSQL, so that we can create it only once and reuse it later?

Warm regards, Seal

seal713 commented 9 years ago

For now, we store/retrieve the data as a JSON string to/from the WebSQL database, and every time we have to rebuild the schema. Is there any way we can store and reuse the schema info as well?

mathiasrw commented 9 years ago

@seal713

Please provide an example of how you store the data before you put it into the tables, to help us identify the problem.

(Bonus info: if you already have the rows as an array per table, you can do: alasql.tables.tablename.data = mySourceObj)
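
For illustration, a minimal sketch of that shortcut (the table name and rows are made up, and the rows are shown here as plain objects keyed by column name):

// Create the table once, then assign pre-built rows directly,
// skipping the INSERT parsing step entirely (hypothetical table and data):
alasql('CREATE TABLE cities (id INT, city STRING)');

alasql.tables.cities.data = [
    {id: 1, city: 'Copenhagen'},
    {id: 2, city: 'Moscow'}
];

console.log(alasql('SELECT * FROM cities')); // -> the two rows above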

seal713 commented 9 years ago

@mathiasrw

Many thanks for your kind feedback. Here is the sample code we are using.

Assigning alasql.tables.tablename.data loads the data quickly. The major issue is that the schema creation takes a long time. We want to know whether the schema creation we do here can be reused, or whether there is any way to improve the performance on AlaSQL's end.

Here is the sample code for reference.

var alasqlDB = new alasql.Database("alasql_db");

// Creating the schema takes a long time for all 70 tables.
// Is it possible to quickly restore the schema (saved in localStorage) without executing SQL?
var tableSchemas = JSON.parse(window.localStorage.getItem('TableSchemas'));
for (var table in tableSchemas) {
    // Each entry is a CREATE TABLE statement read from local storage, for example:
    // 'CREATE TABLE [cities] ([id] INT, city TEXT DEFAULT "test", population INT DEFAULT 0, CONSTRAINT AB UNIQUE(id));'
    var createquery = tableSchemas[table];
    alasqlDB.exec(createquery); // this exec is what takes the time
}

// Data load (pseudocode: for each tableName, read its serialized content from storage)
for (/* each tableName with its stored content */) {
    var myarray = JSON.parse(content);
    alasqlDB.exec('INSERT INTO [' + tableName + '] SELECT * FROM ?', [myarray]);
}
mathiasrw commented 9 years ago

It does not make much sense. The schema should be the quick part.

If you would like to share your full tableSchemas, that would be awesome.

I am not aware of a way to copy the schemas, but I have a feeling something in them is blocking the normally speedy process.

seal713 commented 9 years ago

@mathiasrw

Yes, in a desktop browser it is fast, but in iPad Safari the performance degrades a lot. See the example below, where we create 100 tables (each with around 100 columns).

When you test on a desktop it takes around 3 seconds, but on an iPad it takes around 25 seconds. That's why we want to know whether AlaSQL has any way to store the schema so that, when needed, we can load it directly instead of creating it again.

Code sample:

    var NUM = 100;

    for(var i=0;i<NUM;i++)
    {
        var sql_createtbl = 'CREATE TABLE test' + i + ' (RecordNum INTEGER PRIMARY KEY AUTOINCREMENT ,  [col1] TEXT DEFAULT "",  [col2] TEXT DEFAULT "",  [col3] TEXT DEFAULT "",  [col4] TEXT DEFAULT "",  [col5] TEXT DEFAULT "",  [col6] TEXT DEFAULT "",  [col7] TEXT DEFAULT "",  [col8] TEXT DEFAULT "",  [col9] TEXT DEFAULT "",  [col10] TEXT DEFAULT "",  [col11] TEXT DEFAULT "",  [col12] TEXT DEFAULT "",  [col13] TEXT DEFAULT "",  [col14] TEXT DEFAULT "",  [col15] TEXT DEFAULT "",  [col16] TEXT DEFAULT "",  [col17] TEXT DEFAULT "",  [col18] TEXT DEFAULT "",  [col19] TEXT DEFAULT "",  [col20] TEXT DEFAULT "",  [col21] TEXT DEFAULT "",  [col22] TEXT DEFAULT "",  [col23] TEXT DEFAULT "",  [col24] TEXT DEFAULT "",  [col25] TEXT DEFAULT "",  [col26] TEXT DEFAULT "",  [col27] TEXT DEFAULT "",  [col28] TEXT DEFAULT "",  [col29] TEXT DEFAULT "",  [col30] TEXT DEFAULT "",  [col31] TEXT DEFAULT "",  [col32] TEXT DEFAULT "",  [col33] TEXT DEFAULT "",  [col34] TEXT DEFAULT "",  [col35] TEXT DEFAULT "",  [col36] TEXT DEFAULT "",  [col37] TEXT DEFAULT "",  [col38] TEXT DEFAULT "",  [col39] TEXT DEFAULT "",  [col40] TEXT DEFAULT "",  [col41] TEXT DEFAULT "",  [col42] TEXT DEFAULT "",  [col43] TEXT DEFAULT "",  [col44] TEXT DEFAULT "",  [col45] TEXT DEFAULT "",  [col46] TEXT DEFAULT "",  [col47] TEXT DEFAULT "",  [col48] TEXT DEFAULT "",  [col49] TEXT DEFAULT "",  [col50] TEXT DEFAULT "",  [col51] TEXT DEFAULT "",  [col52] TEXT DEFAULT "",  [col53] TEXT DEFAULT "",  [col54] TEXT DEFAULT "",  [col55] TEXT DEFAULT "",  [col56] TEXT DEFAULT "",  [col57] TEXT DEFAULT "",  [col58] TEXT DEFAULT "",  [col59] TEXT DEFAULT "",  [col60] TEXT DEFAULT "",  [col61] TEXT DEFAULT "",  [col62] TEXT DEFAULT "",  [col63] TEXT DEFAULT "",  [col64] TEXT DEFAULT "",  [col65] TEXT DEFAULT "",  [col66] TEXT DEFAULT "",  [col67] TEXT DEFAULT "",  [col68] TEXT DEFAULT "",  [col69] TEXT DEFAULT "",  [col70] TEXT DEFAULT "",  [col71] TEXT DEFAULT "",  [col72] TEXT DEFAULT "",  [col73] TEXT DEFAULT "",  [col74] TEXT DEFAULT "",  [col75] TEXT DEFAULT "",  [col76] TEXT DEFAULT "",  [col77] TEXT DEFAULT "",  [col78] TEXT DEFAULT "",  [col79] TEXT DEFAULT "",  [col80] TEXT DEFAULT "",  [col81] TEXT DEFAULT "",  [col82] TEXT DEFAULT "",  [col83] TEXT DEFAULT "",  [col84] TEXT DEFAULT "",  [col85] TEXT DEFAULT "",  [col86] TEXT DEFAULT "",  [col87] TEXT DEFAULT "",  [col88] TEXT DEFAULT "",  [col89] TEXT DEFAULT "",  [col90] TEXT DEFAULT "",  [col91] TEXT DEFAULT "",  [col92] TEXT DEFAULT "",  [col93] TEXT DEFAULT "",  [col94] TEXT DEFAULT "",  [col95] TEXT DEFAULT "",  [col96] TEXT DEFAULT "",  [col97] TEXT DEFAULT "",  [col98] TEXT DEFAULT "",  [col99] TEXT DEFAULT "",  [col100] TEXT DEFAULT "",  [col101] TEXT DEFAULT "",  [col102] TEXT DEFAULT "",  [col103] TEXT DEFAULT "")';
        db.exec(sql_createtbl);
    }
mathiasrw commented 9 years ago

Thank you for the demo code.

Interesting.

I will investigate...

seal713 commented 9 years ago

@mathiasrw ,

In recent days, we've tried one option: store the properties below for each table in local storage and reload them from there when needed.

var keysImdbTable = ['columns', 'xcolumns', 'defaultfns', 'identities', 'inddefs', 'indices', 'uniqdefs', 'checkfn'];
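
For illustration, a sketch of that save/restore approach (the table name and storage key are hypothetical, and this is not an official API):

// Save: copy the listed properties of an existing table into local storage.
var src = alasql.tables.myTable;
var snapshot = {};
keysImdbTable.forEach(function (k) { snapshot[k] = src[k]; });
window.localStorage.setItem('myTableSchema', JSON.stringify(snapshot));

// Restore: copy them back onto a fresh table object.
var restored = new alasql.Table();
var saved = JSON.parse(window.localStorage.getItem('myTableSchema'));
keysImdbTable.forEach(function (k) { restored[k] = saved[k]; });

// Note: function-valued properties (e.g. checkfn) are silently dropped by
// JSON.stringify, which is consistent with the constraint problems described below.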

But we hit one issue: the 'UNIQUE' constraint is not working properly, and there may be other potential issues as well.

Hence we wonder whether there is any way on AlaSQL's end to make the DDL execution faster, or a more robust API, similar to the above, that we can use to store/load the schema.

Warm regards, Seal

mathiasrw commented 9 years ago

@seal713

I have not found a good way. Maybe @agershun can help out - but he has not been very active these weeks.

seal713 commented 9 years ago

@mathiasrw @agershun ,

Many thanks for your kind support. Please let us know once there is any update from your side.

Regards, Seal

nickdeis commented 8 years ago

A potential workaround that I've been using is to define the schema once, dump it as JSON, and save it. Then you can simply load the schema from that:

var myDB = JSON.stringify(alasql.databases[dbid])

Save the output (localStorage, on the server, IndexedDB). Then, when the page loads, instead of running the DDL, you can simply do the following:

alasql.databases[dbid] = JSON.parse(myDB);

But it seems like you've already tried that (not sure)
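
Putting the two lines together, a minimal sketch of that round trip via localStorage (dbid and the storage key are hypothetical):

var dbid = 'alasql_db'; // hypothetical database id

// After the schema (and optionally the data) has been created once:
window.localStorage.setItem('alasqlDump', JSON.stringify(alasql.databases[dbid]));

// On a later page load, skip the DDL and restore the serialized database:
alasql.databases[dbid] = JSON.parse(window.localStorage.getItem('alasqlDump'));

As the rest of the thread shows, anything function-valued (constraint checks, the generated insert/update helpers) does not survive plain JSON, which appears to be where the UNIQUE/PRIMARY KEY problems discussed below come from.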

The best way to improve serialization/deserialization robustness is probably to add specialized toJSON methods. See: Object#toJSON

Then add specialized static fromJSON methods and recursively handle each prototype.
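
For what it's worth, a minimal sketch of that generic pattern (illustrative only, not AlaSQL's actual classes):

function MyTable(name, columns) {
    this.name = name;
    this.columns = columns;
    this.data = [];
}

// Picked up automatically by JSON.stringify(): keep only the serializable state.
MyTable.prototype.toJSON = function () {
    return {name: this.name, columns: this.columns, data: this.data};
};

// Static counterpart: re-attach the prototype and rebuild anything that cannot
// be expressed in JSON (indexes, constraint checkers, ...).
MyTable.fromJSON = function (json) {
    var t = new MyTable(json.name, json.columns);
    t.data = json.data || [];
    return t;
};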

mathiasrw commented 8 years ago

Interesting

Would you mind sharing the table definitions? I would love to test which part of them affects the time.

seal713 commented 8 years ago

@nickdeis @mathiasrw, thanks a lot for your assistance. But now the issue is that once we reload the database with alasql.databases[dbid] = JSON.parse(myDB);, some constraint schema such as UNIQUE is missing.

It looks like that kind of constraint information is lost or cannot be re-initialized during loading. This may need a change in the AlaSQL framework.

agershun commented 8 years ago

Yes, there are some things to be done after the database is loaded from JSON (rough sketch below):

  1. Each table should be recreated as a new alasql.Table() object.
  2. We need to reindex all tables in the database with the table.indexColumns() function.

Some ideas can be borrowed from here

I will add a special function to AlaSQL for storing/restoring a whole database from a JSON object. We also need this functionality for TRANSACTION support. Sorry for the late answer.
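
A rough sketch of those two steps, based only on the names mentioned in this thread (dbid and myDB are hypothetical, and the property copying is simplified):

var dump = JSON.parse(myDB);
var db = alasql.databases[dbid];

for (var tname in dump.tables) {
    var t = new alasql.Table();               // 1. a real Table object with its prototype
    for (var k in dump.tables[tname]) {
        t[k] = dump.tables[tname][k];         //    copy the serialized fields back on
    }
    db.tables[tname] = t;
    t.indexColumns();                         // 2. rebuild indexes and constraint structures
}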

mathiasrw commented 8 years ago

I will add a special function to AlaSQL for storing/restoring a whole database from a JSON object

Would really be a killer feature


Some ideas can be borrowed from here

Interesting comment on https://github.com/agershun/alasql/blob/develop/src/92localstorage.js#L71 :)

agershun commented 8 years ago

We also need to not forget about 'objects'... This property is used by the graph functionality.

nickdeis commented 8 years ago

:+1: This would help my alasql-loader and any build-time/server-side technologies in the future.

agershun commented 8 years ago

Absolutely! I am starting...

mathiasrw commented 8 years ago

How are you imagining the syntax will look?

I was thinking something simple like

IMPORT('./myData.db.json');
...
update abc set bla = 45 where xxx=123;
...
EXPORT('./myData.db.json');

or maybe

LOAD('./myData.db.json');
...
update abc set bla = 45 where xxx=123;
...
SAVE('./myData.db.json');
nickdeis commented 8 years ago

Might be of some use: Object#toJSON

I'd rather see an alasql.load(json) and alasql.dump(db) function. We could always have both, I suppose.

mathiasrw commented 8 years ago

Good point. It must be accessible as a property of the alasql object.

I actually prefer your words .load( and .dump(

I believe it's good to also have it as a "SQL" command syntax - so it can be used from the command line.

agershun commented 8 years ago

Just dove into the issue... we will have a problem restoring the insert/delete/update functions...

You can see some temporary files:

... will work on it...

agershun commented 8 years ago

  1. Another pair: STORE/RESTORE :) or serialize/deserialize. But it does not matter...
  2. To minimize the loading time we need to provide a pair of native functions, like:
var obj = alasql.storeDatabase(databaseid)
alasql.restoreDatabase(obj,databaseid,forceOverwrite)
mathiasrw commented 8 years ago

I would hope that there was (also) a method to export all of my databases at the same time, so it's only one operation when I want to save all of my AlaSQL data on the client.


var obj = alasql.storeDatabase(databaseid); alasql.restoreDatabase(obj,databaseid,forceOverwrite)

In my perception it does not make sense not to force overwrite. I can't see when you would NOT want to force an overwrite...

mathiasrw commented 8 years ago

What about if

mathiasrw commented 8 years ago

Just dove into the issue... we will have a problem restoring the insert/delete/update functions...

Are you talking about triggers?

As long as the triggers are made as a function that observes (so it's not SQL stored in AlaSQL), I see no reason to keep this as part of the dump.

agershun commented 8 years ago

No, triggers do not use observers. So the problem is not here, but in the createtable.js file. AlaSQL creates three functions for each table. Probably I can rewrite this procedure to unlink them from the table...

mathiasrw commented 8 years ago

The import/export/load/dump/store/restore/serialize/deserialize feature is also a bit related to #54 and #110.

agershun commented 8 years ago

#110 - yes, #54 - ... no (I thought only about SQL text with INSERTs).

agershun commented 8 years ago

BTW, some people need a function to 'flatten a JSON object' (there are some XLS-related questions about it), like:

var obj = {a:{b:{c:5}, d:1}, c:2};
var res = flat(obj);
// res = {'a.b.c':5,'a.d':1,c:2};

Have you come across this kind of function before?

We can add it to the library with parameters:

function FLAT(obj, pointer) {}
function UNFLAT(obj, pointer) {}

By default the pointer is equal to ".". So we can use it like this:

var data = [{a:{b:{c:1}}}, {a:{d:5}}];
alasql('SELECT * INTO XLSX("my.xlsx",{header:true}) FROM FLAT(?,"->")',[data]);
// save data in the XLSX file:
//     a->b->c    a->d
//     1
//                5
mathiasrw commented 8 years ago

Nope. I have a feeling it's a cute little recursive function. I think the hard part is to make sure one does not go into a loop with objects somehow pointing back to a parent.
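
A minimal sketch of such a guard, for illustration (a plain recursive flatten that remembers which objects it has already visited):

function flatSafe(obj, prefix, out, seen) {
    out = out || {};
    seen = seen || [];
    if (seen.indexOf(obj) !== -1) return out;   // already visited: break the cycle
    seen.push(obj);
    for (var prop in obj) {
        if (!obj.hasOwnProperty(prop)) continue;
        var name = prefix ? prefix + '.' + prop : prop;
        if (obj[prop] && typeof obj[prop] === 'object') {
            flatSafe(obj[prop], name, out, seen);
        } else {
            out[name] = obj[prop];
        }
    }
    return out;
}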

mathiasrw commented 8 years ago

What if there is an object with an array of objects - do they expect the flatten to create elements like a.b[5].foo?

agershun commented 8 years ago

Yes. a.b[5] or a.b.5. This can be an option. It is not hard (a little bit slow to check for nested recursion - but it is solvable). I am still thinking about this StackOverflow issue: http://stackoverflow.com/questions/34527959/parse-a-jsonwith-array-and-objects-and-export-the-data-into-excel-file-in-node/34771859#34771859

mathiasrw commented 8 years ago

Made the flat function: http://jsfiddle.net/40oq02g4/

I'm expanding it with the ability to configure the joining chars.

mathiasrw commented 8 years ago

Yes. a.b[5] or a.b.5.

If it's .5, one does not know whether it was an array or an object property named "5".

agershun commented 8 years ago

You are a great programmer!

a[5] or a.5 - who cares... :) Anyway, it can be reached via a['b']['5'].

We can make it a special option or leave it as a.b[5]. Let's use [ ] and then ask users about it.

Could you write a FROM function from the FLAT() function like:

alasql.stdfn.FLAT = function() { 
   // Your function goes here
}
alasql.from.FLAT = function(filename, opts, cb, idx, query) {
    var res = alasql.stdfn.FLAT(filename,opts);
    if(cb){
        res = cb(res, idx, query);
    }
    return res;
 };
agershun commented 8 years ago

After that we can use this function:

SELECT FLAT({a:{b:1}})
SELECT * FROM FLAT({a:{b:1}})
mathiasrw commented 8 years ago

Updated version: http://jsfiddle.net/40oq02g4/1/


var flat = function(obj, objDevider, levels, showArrayAsObjec, arrayDevider){
    levels = levels | 0 || 100;
    objDevider = objDevider || '.';
    arrayDevider = arrayDevider || [];
    var arrayDeviderPre = arrayDevider[0] || '[';
    var arrayDeviderPost = arrayDevider[1] || ']';
    showArrayAsObjec = !!showArrayAsObjec;

    function flatten(obj, path, result, level, parrentIsObj) {
        if(level > levels){
            return;
        }

        for (var prop in obj) {
            if (obj.hasOwnProperty(prop)) {
                var name;
                if(0 === level){
                    name = prop;
                } else if (parrentIsObj || showArrayAsObjec){
                    name = path + objDevider + prop;
                } else {
                    name = path + arrayDeviderPre + prop + arrayDeviderPost;
                }

                if (typeof obj[prop] == "object"){
                    flatten(obj[prop], name, result, level+1, obj[prop].constructor !== Array);
                } else {
                    result[name] = obj[prop];
                }
            }
        }
        return result;
    }

    return flatten(obj, '', {}, 0);
}

Default settings are the same as flat(obj, '.', 100, false, ['[',']']);


Having

var x = {
    'name': 'a',
    'kid': {
        'level': 2,
        'sun': [{
            'name': 'c',
            'level': 3
        },{
        'name': 'b',
        'level': 2,
        'sun': [{
            'name': 'c',
            'level': 3
        }]
     }]}
   }

flat(x, '->');
gives
{
    "name": "a",
    "kid->level": 2,
    "kid->sun[0]->name": "c",
    "kid->sun[0]->level": 3,
    "kid->sun[1]->name": "b",
    "kid->sun[1]->level": 2,
    "kid->sun[1]->sun[0]->name": "c",
    "kid->sun[1]->sun[0]->level": 3
}
flat(x, '_', 3, true)
gives
{
    "name": "a",
    "kid_level": 2,
    "kid_sun_0_name": "c",
    "kid_sun_0_level": 3,
    "kid_sun_1_name": "b",
    "kid_sun_1_level": 2
}
mathiasrw commented 8 years ago

Could you write a FROM function from the FLAT() function like:

I expect that 'filename' can be renamed to object and that the options are an object with settings?

agershun commented 8 years ago

We were (I was) wrong. A little bit. I will describe it in an hour.

agershun commented 8 years ago

For arrays we need to generate a separate row for each array element, with all the parent properties repeated.
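
For illustration (made-up data), that would mean expanding something like:

var row = {id: 7, tags: [{name: 'a'}, {name: 'b'}]};

// into one flat row per array element, with the parent fields repeated:
var rows = [
    {id: 7, 'tags->name': 'a'},
    {id: 7, 'tags->name': 'b'}
];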

mathiasrw commented 8 years ago

Try this from http://jsfiddle.net/40oq02g4/2/

alasql.stdfn.FLAT = function(obj, opts) {

    opts = opts || {};
    var levels = opts.levels | 0 || 100;
    var objDevider = opts.objDevider || '.';
    var arrayDevider = opts.arrayDevider || [];
    var showArrayAsObjec = !!opts.showArrayAsObjec;
    var arrayDeviderPre = arrayDevider[0] || '[';
    var arrayDeviderPost = arrayDevider[1] || ']';

    function flatten(obj, path, result, level, parrentIsObj) {

        if(level > levels){
            return;
        }

        for (var prop in obj) {
            if (obj.hasOwnProperty(prop)) {
                var name;
                if(0 === level){
                    name = prop;
                } else if (parrentIsObj || showArrayAsObjec){
                    name = path + objDevider + prop;
                } else {
                    name = path + arrayDeviderPre + prop + arrayDeviderPost;
                }

                if (typeof obj[prop] == "object"){
                    flatten(obj[prop], name, result, level+1, obj[prop].constructor !== Array);
                } else {
                    result[name] = obj[prop];
                }
            }
        }
        return result;
    }

    return flatten(obj, '', {}, 0);
}

alasql.from.FLAT = function(obj, opts, cb, idx, query) {
    var res = alasql.stdfn.FLAT(obj, opts);
    if(cb){
        res = cb(res, idx, query);
    }
    return res;
};
agershun commented 8 years ago

Let's create a separate issue for FLAT()

xiaqizhi commented 8 years ago

@mathiasrw @agershun

Hi, any update on the new import/export/load/dump/store/restore/serialize/deserialize feature? My team is using AlaSQL and facing the same issue where the schema creation takes a long time for 60 tables on iPad. We need an API to load/import the DB schema in one operation instead of creating the tables one by one. I searched the new version of AlaSQL, 0.2.5, but could not find these APIs. Is there a plan to support the import/export in the near future? Thank you.

mathiasrw commented 8 years ago

Hi @xiaqizhi

Glad to hear you find good use for this.

There is no plan for what to improve - only individual people contributing what they like working on.

If you need this urgently you might want to contact @agershun and see if you can buy some of his time - or look into the matter yourself and contribute to the library.

nkviru commented 8 years ago

@agershun, I did some investigation to find the missing details between creating a table via CREATE TABLE vs. loading the schema from JSON. In the screenshot below, "window._memDB" is the one created with CREATE TABLE and "myContact" is the new one loaded from local storage (JSON).

[screenshot: table created via CREATE TABLE vs. schema loaded from JSON]

nkviru commented 8 years ago

Also, with the schema loaded from JSON (local storage), the primary key doesn't work either. @agershun, we need your help to resolve this issue. In case you are busy, please let me know the probable fix area for loading the schema through JSON so that we can try our best to fix it, but given the complexity it may turn out to be difficult on our end.

[screenshot: PRIMARY KEY constraint not enforced after loading the schema from JSON]

mathiasrw commented 8 years ago

Thanks for digging into this !

probable fix area for loading schema through JSON

All code interaction for the JSON storage file is located in https://github.com/agershun/alasql/blob/develop/src/94filestorage.js

xiaqizhi commented 8 years ago

@mathiasrw Thanks for your quick response. Currently we're trying to load the table schema from JSON stored in local storage. We have stored the properties columns, xcolumns, defaultfns, inddefs, uniqdefs, and pk.

Code Sample:

var contactUniqdefs = JSON.stringify(window._memDB.tables.Sync.uniqdefs);
window.localStorage.setItem("contactUniqdefs", contactUniqdefs);
var myContact = new alasql.Table();
myContact.uniqdefs = JSON.parse(window.localStorage.getItem("contactUniqdefs"));

And then the issue occurs as @nkviru mentioned above.

We're wondering why some properties of the UNIQUE and PRIMARY KEY constraints are lost after this kind of conversion. Is there any API that can be invoked to set the UNIQUE or PRIMARY KEY constraint?

Would you please provide a fix for this issue, or let us know the probable area to fix?

Any help/hint would be highly appreciated.

Drashij commented 3 months ago

Hi,

I am using AlaSQL in my Ionic Capacitor app for web-based SQL operations on IndexedDB. The tables are created correctly, and data is inserted in the expected format initially. However, after refreshing the browser, the schema appears to be lost, and while data continues to be inserted, it no longer follows the correct format.

Is there a way to persist the schema, or am I overlooking something?

Here is how I’m initializing the database:

async initDB(): Promise<void> {
  try {
    const result = await alasql.promise(
      `CREATE INDEXEDDB DATABASE IF NOT EXISTS ${this.dbName}`
    );
    await alasql.promise(`ATTACH INDEXEDDB DATABASE ${this.dbName}`);
    await alasql.promise(`USE ${this.dbName}`);

    console.log(`IndexedDB '${this.dbName}' initialized successfully`);
  } catch (error) {
    console.error('Error initializing IndexedDB:', error);
  }
}

After this, I run the CREATE TABLE queries and then perform data insertions.

mathiasrw commented 3 months ago

@Drashij Please open things like this on stack overflow or in https://github.com/AlaSQL/alasql/discussions

Below is the response from the AlaSQL bot


Your issue is likely due to not explicitly persisting the schema in IndexedDB after the initial page load. In AlaSQL, the schema (table definitions, indexes, etc.) is stored in memory, which is why it gets lost after a browser refresh. To resolve this, you should explicitly persist the schema in IndexedDB.

Here's how you can adjust your initialization function to ensure the schema persists across page loads:

async initDB(): Promise<void> {
  try {
    const dbExists = await alasql.promise(`SHOW TABLES FROM ${this.dbName}`);

    if (dbExists.length === 0) {
      // Create tables only if they don't exist
      await alasql.promise(`
        CREATE TABLE IF NOT EXISTS myTable (
          id INT PRIMARY KEY, 
          name STRING
        )
      `);
      // Additional CREATE TABLE queries go here

      console.log(`Tables created in IndexedDB '${this.dbName}'`);
    }

    await alasql.promise(`ATTACH INDEXEDDB DATABASE ${this.dbName}`);
    await alasql.promise(`USE ${this.dbName}`);

    console.log(`IndexedDB '${this.dbName}' initialized successfully`);
  } catch (error) {
    console.error('Error initializing IndexedDB:', error);
  }
}

Key Adjustments:

  1. Check for Existing Tables: Before creating tables, check if they already exist using SHOW TABLES.
  2. Create Tables Conditionally: Only create tables if they don't exist, which prevents schema loss after refresh.

With these adjustments, the schema should persist across page loads, and your data insertions should continue to follow the correct format.