Closed emmacasolin closed 2 years ago
The `ErrorEncryptedEFSError` has additional properties on top of those from `AbstractError`, so its `toJSON` and `fromJSON` need to be adjusted accordingly. This also involves modifying the constructor, since it previously didn't take a `message` parameter. The constructor is now:
```ts
constructor(
  {
    errno,
    message,
    path,
    dest,
    syscall,
  }: {
    errno: {
      errno: number;
      code: string;
      description: string;
    };
    message?: string;
    path?: string;
    dest?: string;
    syscall?: string;
  },
  options: {
    timestamp?: Date;
    data?: POJO;
    cause?: T;
  } = {},
) {
  if (message == null) {
    message = errno.code + ': ' + errno.description;
    if (path != null) {
      message += ', ' + path;
      if (dest != null) message += ' -> ' + dest;
    }
  }
  super(message, options);
  this._errno = errno.errno;
  this._code = errno.code;
  this._description = errno.description;
  this._syscall = syscall;
}
```
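The default-message behaviour can be illustrated standalone. The `makeErrnoMessage` helper below is hypothetical, extracted from the constructor logic above purely for illustration:

```typescript
// Hypothetical helper mirroring the constructor's default-message logic above.
interface ErrnoEntry {
  errno: number;
  code: string;
  description: string;
}

function makeErrnoMessage(errno: ErrnoEntry, path?: string, dest?: string): string {
  let message = errno.code + ': ' + errno.description;
  if (path != null) {
    message += ', ' + path;
    if (dest != null) message += ' -> ' + dest;
  }
  return message;
}

const enoent = { errno: -2, code: 'ENOENT', description: 'no such file or directory' };
console.log(makeErrnoMessage(enoent));
// ENOENT: no such file or directory
console.log(makeErrnoMessage(enoent, './a', './b'));
// ENOENT: no such file or directory, ./a -> ./b
```

Passing an explicit `message` skips this defaulting entirely, which is what allows `fromJSON` to round-trip the original message.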
So if we're reconstructing the error inside `fromJSON`, we can pass in the `message` property. And the `toJSON`/`fromJSON` are as follows:
```ts
public toJSON(): any {
  const json = super.toJSON();
  json.data._errno = this._errno;
  json.data._code = this._code;
  json.data._description = this._description;
  json.data._syscall = this._syscall;
  return json;
}

public static fromJSON<T extends Class<any>>(
  this: T,
  json: any,
): InstanceType<T> {
  if (
    typeof json !== 'object' ||
    json.type !== this.name ||
    typeof json.data !== 'object' ||
    typeof json.data.message !== 'string' ||
    isNaN(Date.parse(json.data.timestamp)) ||
    typeof json.data.data !== 'object' ||
    typeof json.data._errno !== 'number' ||
    typeof json.data._code !== 'string' ||
    typeof json.data._description !== 'string' ||
    !('cause' in json.data) ||
    ('stack' in json.data && typeof json.data.stack !== 'string') ||
    ('_syscall' in json.data && typeof json.data._syscall !== 'string')
  ) {
    throw new TypeError(`Cannot decode JSON to ${this.name}`);
  }
  const e = new this(
    {
      errno: {
        errno: json.data._errno,
        code: json.data._code,
        description: json.data._description,
      },
      message: json.data.message,
      syscall: json.data._syscall,
    },
    {
      timestamp: new Date(json.data.timestamp),
      data: json.data.data,
      cause: json.data.cause,
    },
  );
  e.stack = json.data.stack;
  return e;
}
```
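As a sanity check on the guard above, here is a hypothetical standalone validator mirroring the same shape checks (illustrative only, not the actual library code):

```typescript
// Hypothetical validator mirroring the fromJSON guard above (illustrative).
function isValidErrorJSON(json: any, typeName: string): boolean {
  return (
    typeof json === 'object' && json !== null &&
    json.type === typeName &&
    typeof json.data === 'object' && json.data !== null &&
    typeof json.data.message === 'string' &&
    !isNaN(Date.parse(json.data.timestamp)) &&
    typeof json.data.data === 'object' &&
    typeof json.data._errno === 'number' &&
    typeof json.data._code === 'string' &&
    typeof json.data._description === 'string' &&
    'cause' in json.data &&
    (!('stack' in json.data) || typeof json.data.stack === 'string') &&
    (!('_syscall' in json.data) || typeof json.data._syscall === 'string')
  );
}

const sample = {
  type: 'ErrorEncryptedEFSError',
  data: {
    message: 'ENOENT: no such file or directory',
    timestamp: '2022-05-05T00:00:00.000Z',
    data: {},
    _errno: -2,
    _code: 'ENOENT',
    _description: 'no such file or directory',
    cause: undefined,
  },
};
console.log(isValidErrorJSON(sample, 'ErrorEncryptedEFSError')); // true
```

Note that `cause` only needs to be present as a key (it may be `undefined`), while `stack` and `_syscall` are optional but must be strings when present.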
I think you need `json.data._description` in your `fromJSON`. And `_code`.
Yep, good catch - fixed now
There are quite a lot of test failures, I'm assuming because of the upgrade to Node 16. Working through them now.
The first thing was that a lot of the tests were not finishing and not timing out - they would just run forever. I narrowed it down to a while loop in `INodeManager.fileGetBlocks` where the condition was `inodesUtils.unbufferId(k as BufferId) !== blockCount`, but if whatever value is returned by `unbufferId` can never equal `blockCount` then we enter an infinite loop. Changed the `!==` to a `<` to solve this.
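A minimal illustration of why a bounded comparison terminates where `!==` may not (the helper is hypothetical, not the actual `fileGetBlocks` code):

```typescript
// Hypothetical helper: counts the zeroed blocks the while loop would
// synthesize before blockCount catches up to the parsed id.
function countSyntheticBlocks(parsedId: number, blockCount: number): number {
  let synthesized = 0;
  // `blockCount < parsedId` terminates even when parsedId is NaN or is
  // already behind blockCount; `parsedId !== blockCount` would loop
  // forever in both of those cases.
  while (blockCount < parsedId) {
    synthesized++;
    blockCount++;
  }
  return synthesized;
}

console.log(countSyntheticBlocks(5, 2)); // 3
console.log(countSyntheticBlocks(NaN, 2)); // 0: NaN comparisons are false
console.log(countSyntheticBlocks(1, 2)); // 0: already past the target
```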
I feel like this isn't the issue though, since this is the sort of thing that should have been affecting the tests before Node 16 if it was a problem... I'm also seeing a test failure that suggests another related issue... my current theory is that perhaps `lexicographic-integer` works differently with Node 16, but I'll have to experiment a bit more.
So it appears that `unbufferId` is returning `undefined`, so it could indeed be a problem with `lexicographic-integer`.
Ok, I think it's actually a problem with the db, or at least with `tran.iterator`, because the keys coming out of

```ts
const options = endIdx
  ? {
      gte: inodesUtils.bufferId(startIdx),
      lt: inodesUtils.bufferId(endIdx),
    }
  : { gte: inodesUtils.bufferId(startIdx) };
let blockCount = startIdx;
for await (const [k, v] of tran.iterator(options, [
  ...this.dataDbPath,
  ino.toString(),
])) {
  console.log(k.toJSON());
```

are `{ type: 'Buffer', data: [] }` for every `k`.
The `tran.iterator` shouldn't be giving you back empty buffers. That would be a bug. The `js-db` hasn't been updated to Node v16 yet, so I'm not sure why that would occur. Can you get 2 repos cloned, and compare between the master version and the current PR, and see whether the buffers are empty?

Note that `Buffer.toJSON()` is not necessary, you can just log out the buffer directly.

Note that `js-db` on master has these dependencies:
```json
"dependencies": {
  "@matrixai/async-init": "^1.7.0",
  "@matrixai/errors": "^1.0.1",
  "@matrixai/logger": "^2.1.0",
  "@matrixai/resources": "^1.0.0",
  "@matrixai/workers": "^1.3.0",
  "@types/abstract-leveldown": "^7.2.0",
  "level": "7.0.1",
  "threads": "^1.6.5"
},
```
None of those dependencies should affect that... so maybe there's a bug in js-db when run with node v16. You'll want to branch off in js-db later for investigation if you change to node v16.
Unrelated, but note that `shell.nix` should also be changed to this:
```nix
{ pkgs ? import ./pkgs.nix {} }:

with pkgs;

pkgs.mkShell {
  nativeBuildInputs = [
    nodejs
    nodePackages.node2nix
  ];
  shellHook = ''
    echo 'Entering js-encryptedfs'
    set -o allexport
    . ./.env
    set +o allexport
    set -v

    mkdir --parents "$(pwd)/tmp"

    # Built executables and NPM executables
    export PATH="$(pwd)/dist/bin:$(npm bin):$PATH"

    # Enables npm link
    export npm_config_prefix=~/.npm

    npm install
    set +v
  '';
}
```
I've tested in a new repo with everything the same except using Node 14 instead of 16, using the old `pkgs.nix`, and using es2020 instead of es2021, and the Buffers are no longer empty. So there's something going on with one of these changes.
It's actually a problem with the `pkgs.nix` - changing only this causes the issue with the buffers being empty. The old rev (`53caacaf56640d04180775aee016d2f16d6f083c`) is fine but the new one (`a5774e76bb8c3145eac524be62375c937143b80c`) is not.
I can't seem to replicate the issue inside `js-db` after upgrading it to Node 16. I even tried seeing if adding the options `gte` and `lt` made a difference but nothing... the keys are their expected values.
Ok, at this point you should try using the debugger, or you can go into the `node_modules` inside EFS and add `console.log` tracing deeper into the stack.
Make sure that EFS is actually using the same version of js-db that you have in the master branch of js-db.
Additional debugging support can be done by using `npm link`. This allows you to "link" the js-db implementation to js-encryptedfs, so that changes in js-db can be immediately reflected in js-encryptedfs. @tegefaulkes has used this before, you can ask him how to set that up.
It appears that by upgrading EFS and/or db to Node v16, we end up with an issue where the `INodeManager` name is missing from the key path. This results in some problems in our testing. Strangely this seems to occur because `this.constructor.name` is empty. Investigating why.
Running:

```ts
class X {
  static g() {
    return this.name;
  }
  prop = [this.constructor.name];
  f() {
    return this.constructor.name;
  }
}

const x = new X();
console.log(x.f());
console.log(x.prop);
console.log(X.g());
```

works fine inside `npm test -- ./tests/random.test.ts`.
Narrowed it down to the `CreateDestroyStartStop` decorator. It appears that in Node v16, the decorator ends up making `this.name` and `this.constructor.name` into empty strings. This didn't happen in Node v14.
This may be due to node changes, or it may be due to changes in https://github.com/MatrixAI/js-async-init/pull/9.
Confirmed it was in `js-async-init`.
I have released 1.7.3 of js-async-init, please update here accordingly.
The culprit was that the class decorators used anonymous classes, and these do not preserve the name. In fact, the correct behaviour per the spec was to set the class name to an empty string. See: https://github.com/microsoft/TypeScript/issues/37157
Node v16 must have fixed this, and so the fix was to use `Object.defineProperty` to always set the class name from the extended class.
This landed in https://github.com/MatrixAI/js-async-init/commit/02d441352de5dd23f738dec488b96924398d394f.
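The mechanism can be reproduced in isolation. The decorators below are illustrative only (not the real `CreateDestroyStartStop`), showing how a mixin returning an anonymous class loses the name, and how `Object.defineProperty` restores it:

```typescript
type Constructor = new (...args: any[]) => object;

// Illustrative bug: a mixin-style decorator returning an anonymous class
// expression yields a class whose .name is the empty string.
function brokenDecorator<T extends Constructor>(Base: T) {
  return class extends Base {};
}

// Illustrative fix (the approach described above): copy the original
// class name onto the wrapper explicitly.
function fixedDecorator<T extends Constructor>(Base: T) {
  const Wrapped = class extends Base {};
  Object.defineProperty(Wrapped, 'name', { value: Base.name });
  return Wrapped;
}

class INodeManager {}

console.log(brokenDecorator(INodeManager).name); // ''
console.log(fixedDecorator(INodeManager).name); // 'INodeManager'
```

This also explains why `this.constructor.name` came back empty on instances: an instance's `constructor` is the anonymous wrapper class, not the original.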
BTW @emmacasolin I noticed that there were some missing changes in the v16 update, so I also ended up updating js-resources, js-async-locks, and js-async-init. Also js-errors got an update too. Please review the commits I made to all the libraries to see what changes were missing.
So I recommend updating ALL libraries again.
Some new changes that `js-async-init` had in the eslintrc were merged into TypeScript-Demo-Lib https://github.com/MatrixAI/TypeScript-Demo-Lib/commit/758a9d21333209700bd7addb098d1a2c080b6e4d
@emmacasolin the js-db has to be updated before js-encryptedfs.
You'll need to bring js-db to nodev16 updating all the libraries there, and then come back to this.
Also `js-workers` too. So the pending updates to v16 are (in order):

Make sure not to bring in `typescript-cached-transpile` into those libraries, and `shell.nix` should be updated.
`js-logger` doesn't have any docs - should I add that in?
Also `js-async-init` is still missing the change to `shell.nix` and `.gitlab-ci.yml`, and `js-async-locks` is missing the change to `.gitlab-ci.yml`. Those changes can be done without doing a new release.
> `js-logger` doesn't have any docs - should I add that in?

Sure
A couple of test failures I'm currently debugging:

The first one is random, and it's in `EncryptedFS.concurrent.test.ts` for the test `EncryptedFS.copyFile and EncryptedFS.createWriteStream`. Here we have a concurrent write stream which writes `'AAAAA'` ten times (using a for loop) alongside a copy of the file being written to some other file. The test then checks the contents of the file, which the test expects to either be empty (the copy was done before writing anything) or to contain 10x `'AAAAA'` (the copy was done after the write stream ended).

This test fails occasionally with the copied file containing 9x `'AAAAA'`. I'm assuming this just means that the copy was made in the middle of the write stream, and that this is expected behaviour (i.e. the test is too rigid)? I've run the failing test around 10-20 times and for the most part the copied file contents are 10x `'AAAAA'` (it's never empty).
The second one is in `INodeManager.file.test.ts` for `read sparse blocks from a file`, and this one occurs every time the test is run. The test expects there to be zeroed blocks in the created file, however these aren't present. I haven't looked into this one too much yet.
Regarding the second test, @tegefaulkes mentioned he was still debugging that. Sparsity can occur either between blocks or within blocks.
When sparsity is within blocks, it is expected that zero bytes are written to the sparse locations within the block.
When sparsity is between blocks, then no blocks are written; instead the block indexes are completely skipped, and when you are reading the blocks, `fileGetBlocks` and related functions are supposed to synthetically create zeroed-out blocks for missing block indexes.
The above is what should be happening; you may need to use `db.dump` or `tran.dump` to inspect the contents to see what is happening. I don't think this area of the code has been adequately tested.
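A hypothetical sketch of that between-block behaviour (the `Map` stands in for the DB iterator, and the names are illustrative, not the real `fileGetBlocks` signature):

```typescript
// Sketch: stored blocks are keyed by index; indexes missing from
// [startIdx, endIdx) are synthesized as zero-filled buffers in order.
function* getBlocksSketch(
  stored: Map<number, Buffer>,
  startIdx: number,
  endIdx: number,
  blockSize: number,
): Generator<Buffer> {
  let blockCount = startIdx;
  const keys = [...stored.keys()]
    .filter((k) => k >= startIdx && k < endIdx)
    .sort((a, b) => a - b);
  for (const k of keys) {
    // Between-block sparsity: synthesize zeroed blocks for skipped indexes
    while (blockCount < k) {
      yield Buffer.alloc(blockSize);
      blockCount++;
    }
    yield stored.get(k) ?? Buffer.alloc(blockSize);
    blockCount++;
  }
  // Trailing sparsity up to endIdx
  while (blockCount < endIdx) {
    yield Buffer.alloc(blockSize);
    blockCount++;
  }
}

// Blocks 1 and 2 are missing; reading yields zeroed buffers for them.
const sparse = new Map([
  [0, Buffer.from('AAAA')],
  [3, Buffer.from('BBBB')],
]);
const blocks = [...getBlocksSketch(sparse, 0, 4, 4)];
console.log(blocks.map((b) => b.toString('hex')));
// [ '41414141', '00000000', '00000000', '42424242' ]
```

The key point is that the zeroed blocks never hit the DB; they are materialized on read.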
The second failure was due to something I had changed while debugging yesterday. The logic here

```ts
while (inodesUtils.unbufferId(k as BufferId) !== blockCount) {
  yield Buffer.alloc(blockSize);
  blockCount++;
}
```

was confusing for two reasons. Firstly because the term on the right of the condition is the one that's changing (usually it would go on the left), and secondly because `!==` makes it unclear what we expect to happen to `blockCount` (i.e. how is it changing every time we loop? Increasing? Decreasing?). I had changed the `!==` to a `<`, but it should have been the other way around. The condition here is now

```ts
while (blockCount < inodesUtils.unbufferId(k as BufferId)) {
  yield Buffer.alloc(blockSize);
  blockCount++;
}
```

which is both logically correct and easier to understand at a glance.
As for the other failing test, expecting either the copy to be empty or the copy to contain the full stream is too rigid, as it could also be anything in between. It now checks for the contents to be all `A`, to have a length that is a multiple of 5, and to have a size between 0 and 50.

```ts
// All A, in multiples of 5
expect(contents).toMatch(RegExp('^[^A]*((AAAAA)+[^A]*)$'));
// Contents length between 0 and 10*5
expect(contents.length).toBeGreaterThanOrEqual(0);
expect(contents.length).toBeLessThanOrEqual(50);
expect(stat.size).toBeGreaterThanOrEqual(0);
expect(stat.size).toBeLessThanOrEqual(50);
```
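The regex can be exercised in isolation (sample strings here, not the actual test fixtures):

```typescript
// The concurrency check in isolation: contents must consist of complete
// 'AAAAA' writes, so a torn partial write fails the match.
const pattern = RegExp('^[^A]*((AAAAA)+[^A]*)$');
console.log(pattern.test('AAAAA'.repeat(9))); // true: 9 complete writes
console.log(pattern.test('AAAA')); // false: a torn, partial write
```

One caveat worth noting: the `(AAAAA)+` group requires at least one complete write, so a file copied before any write (empty contents) would not match this pattern; per the observations above, the empty case never occurred in practice.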
All tests passing. Just need to re-generate the docs and then I'll merge and release.
That `blockCount` solution does seem clearer. Nice.
## Description

`@matrixai/errors` was upgraded to include `toJSON` and `fromJSON` to help with error serialisation/deserialisation. This has been updated here (along with other dependencies), together with an upgrade to Node 16 to align with other Matrix AI packages.

## Tasks

## Final checklist