til-schneider opened 5 years ago
Maybe I'm missing something, but this doesn't quite work for me. An empty message gives this:

```ts
export const Empty = {
    read(pbf: Pbf, end?: number): IEmpty {
        return pbf.readFields(Empty._readField, {}, end);
    },
    _readField(tag: number, obj: any, pbf: Pbf): void {
    },
    write(obj: IEmpty, pbf: Pbf): void {
    }
};
```
There is a type error because `_readField()` is `(tag: number, obj: any, pbf: Pbf) => void`, but the first parameter of `pbf.readFields` is `(tag: number, result?: T, pbf?: Pbf) => void`. Note that this is according to the `@types/pbf` package.
There is a similar problem for `write` here:

```ts
write(obj: IDirectoryListing, pbf: Pbf): void {
    if (obj.entries) for (var i = 0; i < obj.entries.length; i++) pbf.writeMessage(1, DirectoryEntry.write, obj.entries[i]);
}
```
If only there were some type-safe (ish) language that PBF could have been written in to prevent these errors... :-P
I think you are referring to the typings of `@types/pbf`, which are defined here?

I didn't change the internal implementation of `read` and `write` - it is the same as in the current master. The problem is that their implementation relies on certain premises - e.g. that array attributes for repeated fields are already there. If you change to a stronger typing than `any`, you'll get type errors in the generated code. You can test this by changing the generated signature and then running the tests.
But I'm wondering why you get type errors. `T` will be resolved to `any`, so `(tag: number, result?: T, pbf?: Pbf) => void` should be compatible with `(tag: number, obj: any, pbf: Pbf) => void`.
The problem is the `?`s - `_readField` needs to be able to handle the case where `obj` and `pbf` are `undefined` (for some reason).

I think the error might actually be in `@types/pbf` - I can't see any reason why `pbf` would ever be `undefined`, and the generated `_readField()` code assumes that `obj` is not `undefined`.
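For what it's worth, here is a minimal sketch of the mismatch as I understand it (simplified stand-in declarations assuming `strictNullChecks`; not the real `@types/pbf` file):

```ts
// Simplified stand-ins, just to show the parameter issue.
declare class Pbf {}
declare function readFields<T>(
    readField: (tag: number, result?: T, pbf?: Pbf) => void,
    result: T
): T;

// The generated handler declares pbf as required and non-undefined...
function _readField(tag: number, obj: any, pbf: Pbf): void {}

// ...so under strictNullChecks this call is rejected:
// "Type 'Pbf | undefined' is not assignable to type 'Pbf'."
readFields(_readField, {});
```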
Apart from that type issue (which I may resolve by just uninstalling `@types/pbf`), this works great for me!

I have a 10 MB message that takes 10 seconds to decode with the official implementation, but with this it takes 170 milliseconds! I thought Google engineers were meant to be good.

The only thing I wish for is that it translated the type names from snake case to camel case properly.
Yes, I think so, too. This is clearly a bug in `@types/pbf`. As you wrote, `obj` and `pbf` are not optional.
Unfortunately, if I remove `@types/pbf` then the import statement does not work:

```ts
import Pbf from 'pbf';
```

I tried creating a `pbf.d.ts` containing `declare module 'pbf';`, but then I get errors like `Cannot use namespace 'Pbf' as a type`.

How did you get this to work?
Ah, I found the magic "shut up typescript" incantation:

```ts
declare module 'pbf' {
    type Pbf = any;
    export = Pbf;
}
```
Ugh, no, that doesn't quite work because then you can't do `new Pbf`. :-(
Ok, finally got it to work by copy/pasting the `@types/pbf` file into `pbf.d.ts` and fixing the `?`s:
```ts
declare module 'pbf' {
    class Pbf {
        static readonly Varint: 0;
        static readonly Fixed64: 1;
        static readonly Bytes: 2;
        static readonly Fixed32: 5;
        buf: Uint8Array;
        pos: number;
        type: number;
        length: number;
        constructor(buffer?: Uint8Array|ArrayBuffer);
        destroy(): void;
        readFields<T>(readField: (tag: number, result: T, pbf: Pbf) => void, result: T, end?: number): T;
        readMessage<T>(readField: (tag: number, result: T, pbf: Pbf) => void, result: T): T;
        readFixed32(): number;
        readSFixed32(): number;
        readFixed64(): number;
        readSFixed64(): number;
        readFloat(): number;
        readDouble(): number;
        readVarint(isSigned?: boolean): number;
        readVarint64(): number;
        readSVarint(): number;
        readBoolean(): boolean;
        readString(): string;
        readBytes(): Uint8Array;
        readPackedVarint(arr?: number[], isSigned?: boolean): number[];
        readPackedSVarint(arr?: number[]): number[];
        readPackedBoolean(arr?: boolean[]): boolean[];
        readPackedFloat(arr?: number[]): number[];
        readPackedDouble(arr?: number[]): number[];
        readPackedFixed32(arr?: number[]): number[];
        readPackedSFixed32(arr?: number[]): number[];
        readPackedFixed64(arr?: number[]): number[];
        readPackedSFixed64(arr?: number[]): number[];
        skip(val: number): void;
        writeTag(tag: number, type: number): void;
        realloc(min: number): void;
        finish(): Uint8Array;
        writeFixed32(val: number): void;
        writeSFixed32(val: number): void;
        writeFixed64(val: number): void;
        writeSFixed64(val: number): void;
        writeVarint(val: number): void;
        writeSVarint(val: number): void;
        writeBoolean(val: boolean): void;
        writeString(str: string): void;
        writeFloat(val: number): void;
        writeDouble(val: number): void;
        writeBytes(buffer: Uint8Array): void;
        writeRawMessage<T>(fn: (obj: T, pbf: Pbf) => void, obj?: T): void;
        writeMessage<T>(tag: number, fn: (obj: T, pbf: Pbf) => void, obj?: T): void;
        writePackedVarint(tag: number, arr: number[]): void;
        writePackedSVarint(tag: number, arr: number[]): void;
        writePackedBoolean(tag: number, arr: boolean[]): void;
        writePackedFloat(tag: number, arr: number[]): void;
        writePackedDouble(tag: number, arr: number[]): void;
        writePackedFixed32(tag: number, arr: number[]): void;
        writePackedSFixed32(tag: number, arr: number[]): void;
        writePackedFixed64(tag: number, arr: number[]): void;
        writePackedSFixed64(tag: number, arr: number[]): void;
        writeBytesField(tag: number, buffer: Uint8Array): void;
        writeFixed32Field(tag: number, val: number): void;
        writeSFixed32Field(tag: number, val: number): void;
        writeFixed64Field(tag: number, val: number): void;
        writeSFixed64Field(tag: number, val: number): void;
        writeVarintField(tag: number, val: number): void;
        writeSVarintField(tag: number, val: number): void;
        writeStringField(tag: number, str: string): void;
        writeFloatField(tag: number, val: number): void;
        writeDoubleField(tag: number, val: number): void;
        writeBooleanField(tag: number, val: boolean): void;
    }
    export = Pbf;
}
```
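With this declaration in place, `import Pbf from 'pbf'` and `new Pbf` both work again. A quick smoke test I used to convince myself (my own snippet, not part of the PR; it only uses methods declared above):

```ts
import Pbf from 'pbf';

// writing: tag 1 as a varint field
const writer = new Pbf();
writer.writeVarintField(1, 42);
const bytes: Uint8Array = writer.finish();

// reading it back via readFields
const result: { x?: number } = {};
new Pbf(bytes).readFields((tag, obj, pbf) => {
    if (tag === 1) obj.x = pbf.readVarint();
}, result);
// result.x === 42
```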
Could you open a pull request with this change in `@types/pbf`?
Yep, though I'm not really sure about all the types. You might want to double-check it yourself. E.g. are the `?`s here correct?

```ts
writeRawMessage<T>(fn: (obj: T, pbf: Pbf) => void, obj?: T): void;
writeMessage<T>(tag: number, fn: (obj: T, pbf: Pbf) => void, obj?: T): void;
```

And is `arr` really optional in all the `readPacked` functions?
Ok, I've been using this pull request for a little while and it seems to be working very well. My only suggestion is to add `/* tslint:disable */` at the start of the file, because there are quite a few warnings otherwise (empty blocks, unbraced `if`s, line length).
Actually there is one other issue - sub-messages. Consider:

```proto
message Foo {
    Bar b = 1;
}

message Bar {
    uint32 a = 1;
}
```
This will give the following interfaces:

```ts
export interface IFoo {
    b?: IBar;
}

export interface IBar {
    a?: number;
}
```

So far, so good. However, the read code sets `b` to `null` by default, rather than `undefined`, so this causes a type error:
```ts
export const Foo = {
    read(pbf: Pbf, end?: number): IFoo {
        return pbf.readFields(Foo._readField, {b: null}, end);
    },
```
I'm pretty sure this is just a bug in pbf though, not in this change. This line should return `undefined` instead of `null`.
> Yep, though I'm not really sure about all the types. You might want to double-check it yourself. E.g. are the `?`s here correct?
> `writeRawMessage<T>(fn: (obj: T, pbf: Pbf) => void, obj?: T): void;`
> `writeMessage<T>(tag: number, fn: (obj: T, pbf: Pbf) => void, obj?: T): void;`

No, I think it should be `obj: T` (without `?`) in both cases (see the sketch below):

- `writeMessage` simply passes `fn` and `obj` to `writeRawMessage` (see line 345 of index.js).
- `writeRawMessage` passes `obj` to `fn` (see line 332 of index.js).
- `fn` is a generated `write` method, which has generated code like `obj.value` without any null/undefined check. So `obj` can't be optional.
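A minimal sketch of those two declarations with the `?` removed (my paraphrase of the conclusion above, not the published `@types/pbf` signatures):

```ts
writeRawMessage<T>(fn: (obj: T, pbf: Pbf) => void, obj: T): void;
writeMessage<T>(tag: number, fn: (obj: T, pbf: Pbf) => void, obj: T): void;
```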
> And is `arr` really optional in all the `readPacked` functions?
`arr` might be optional:

- If `this.type !== Pbf.Bytes`, the `readPacked` functions all call `arr.push` without any null/undefined check. So in those cases `arr` is not optional.
- If `this.type === Pbf.Bytes`, the `readPacked` functions do `arr = arr || [];`. So this would work if `arr` is null/undefined. But I'm not sure whether this really happens.

> My only suggestion is to add `/* tslint:disable */` at the start of the file because there are quite a few warnings otherwise (empty blocks, unbraced `if`s, line length).
14addeb adds that comment.
Regarding `null` vs. `undefined` in sub-messages, I agree with you: this should be changed in pbf.

Thanks for your pull request!
The change in the "normal mode" - I mean "each top-level message is generated into a single object literal" - is already covered by the existing tests.

Testing the output of the new modes `--es6` and `--typescript` would need some big dev dependencies, which you probably don't want. These modes work pretty much the same way as the "normal mode" - the internals of the generated code, which do all the reading and writing, are the same. The new modes only change the surrounding code - like ES6 exports or TypeScript types.

So I would propose not to add extra tests for this.
Are there any updates on this? It would be a pretty nifty feature to have...
From my perspective it's done...
@til-schneider Minor note: it looks like the generated code isn't compatible with `--strictNullChecks` (along with the aforementioned `@types/pbf` issue). Would you mind me PRing a fix to your repo?
Can you post the code snippet that TypeScript complains about?

Sure, you can send me a PR if you already have a fix.
I'd love to see a release with this functionality.

For tests, would it help to add a set of tests asserting the expected compiled output for each format (js, es6, ts)? I don't see any tests like that for the standard js output, but with the addition of other formats it might make sense?

I'd be happy to add tests if it helps this PR move forward :)
@til-schneider @mourner What is the status here?

I would love to see this PR land - I could help if something is missing or needs to be updated.

Please let me know.
This whole project looks pretty dead to me. The last release was in Nov 2019. This pull request is soon one year old and none of the project owners really seems to care about it.

I can't judge the proposed changes of @UlysseM by simply looking at them, so I would have to create a test covering these cases. But since this pull request could already have been merged months ago, I don't see it getting merged at any point - so I don't see why I should invest any more effort into this.
For those who want to use this TypeScript support, you can use my forked repo in your `package.json` directly:

```json
"dependencies": {
    "pbf": "github:junghans-schneider/pbf#master"
}
```
> This whole project looks pretty dead to me. The last release was in Nov 2019. This pull request is soon one year old and none of the project owners really seems to care about it.
Gosh. The project is very much alive and works perfectly for the purpose it was built for, and I could respond in detail on why this particular PR slipped through and what would have helped maintainers land this sooner, but with this kind of toxic attitude, I'm glad we never did. Have a nice day.
I'm sorry, I really didn't want to insult you. I wanted to reply to @vicb's question "What is the status here". It was just my impression that this project could be dead, because of the lack of any reaction for so many months.
> I could respond in detail on why this particular PR slipped through and what would have helped maintainers land this sooner
Actually, this would be very helpful. What is still missing from your perspective?
@mourner

> Gosh. The project is very much alive and works perfectly for the purpose it was built for, and I could respond in detail on why this particular PR slipped through and what would have helped maintainers land this sooner, but with this kind of toxic attitude, I'm glad we never did. Have a nice day.
I maintain and contribute to quite a few open source projects (including Mapbox projects) and I know it is tough, as most people only complain and are negative.

However, here people are offering help, and I think that keeping this PR open for so long without commenting on what needs to be done is no less disrespectful than the comment from @til-schneider. To be clear, I don't see either as toxic. Let's say that this PR has slipped, but let's try to revive and merge it.

I tend to agree with @til-schneider when it comes to Mapbox repos. I use vector-tile-js, pbf and tippecanoe for one of my projects. Those packages are super helpful, but sometimes it feels like they are not maintained. By no means do I want to be toxic by saying that, but maybe Mapbox should clarify which PRs are accepted (bug fixes only vs. features, ...). If it doesn't fit with what people want to do with the code, they can fork it. BTW, my project also uses togeojson - but the tmcw fork, because of the nice fixes/evolutions he made.
To come back to this CL:

- I don't think there is a need to generate ES5, ES6 and TypeScript. This makes the code more complex, more bug-prone and harder to test. Generating only one version should be ok.
- I would favor generating TypeScript and letting the TS compiler generate ES-whatever if needed. From the issue I created it seems like you don't agree. Then what about generating ES6 only and a `.d.ts` file for TypeScript? (The source code need not be in TypeScript when there is a `.d.ts`.)
- I think ES6 would allow for a nice simplification of the code (i.e. reduce the line count, make it easier to understand). For example, the context could use class inheritance instead of directly manipulating the (JS) prototypes.

I hope we can move this forward together. Have a nice day!

(Shameless plug: could you chime in on mapbox/vector-tile-js#74?)
> I would favor generating TypeScript and letting the TS compiler generate ES-whatever if needed. From the issue I created it seems like you don't agree.

Yes, right, I don't agree. As said in #122, I think complexity would go up, not down. But I think we should not duplicate this discussion here. Let's keep the discussion in #122.

> Then what about generating ES6 only and a `.d.ts` file for TypeScript?

I don't see the point about the complexity of generating ES5, ES6 and TypeScript. This only affects the glue code around the actual logic (which stays the same for all flavors). I count 8 `if` statements in `compile.js` which do the trick. That's not an amount of complexity which needs to be reduced. I do like having the choice between generating ES5, ES6 or TypeScript. And maybe there are still users who need ES5 and don't like to transpile ES6 to ES5.

However, I do like the idea of generating a `.d.ts` file. Having a `.d.ts` would also help JavaScript users, since many editors understand them and can provide assistance like code completion. But I see this as an extension which could be done after this pull request is merged. Therefore I think we should have this discussion in an extra issue.
> Let's keep the discussion in #122.

> I don't see the point about the complexity of generating ES5, ES6 and TypeScript.

Check @UlysseM's comments on your CL. They all have subtle differences and you will eventually end up with 3 different code paths. While I'm sure you can handle that, it will be a burden to fix bugs and maintain this code - on top of having to write similar code and tests 3 times.

There are very good tools to transpile code today; no need to re-invent them in this repo, in my opinion.

> users who need ES5 and don't like to transpile ES6 to ES5

The idea would not be to ask users to transpile anything. The idea would still be that the compiler generates whatever output you tell it to generate, but with only 2 lines of code, conceptually: `npm i transpiler`, then `write(transpiler(generateCode, targetLanguage))`. That would be transparent for the users and remove unneeded complexity here. The simpler the better - even more so for maintenance.
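To make the "generate once, transpile on demand" idea concrete, here is a rough sketch using the TypeScript compiler as the transpiler (purely illustrative; the function name and option set are my own, not part of this PR):

```ts
import * as ts from "typescript";

// Take the ES6 code the generator already produces and downlevel it on demand.
function emitCode(es6Code: string, target: "es6" | "es5"): string {
    if (target === "es6") return es6Code;
    return ts.transpileModule(es6Code, {
        compilerOptions: {
            target: ts.ScriptTarget.ES5,
            module: ts.ModuleKind.CommonJS,
        },
    }).outputText;
}
```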
I am very interested in ES6 output, not so much TypeScript.
Maybe the tests suggested nearly a year ago (even if that adds a lot of dev dependencies) would help land this.
@mourner: I already asked this a few months ago: What is still missing from your perspective?
@til-schneider my philosophy for open source tools like this is to keep them minimal, simple, and as narrowly focused as possible, while being meticulously covered with tests, so that they can keep being useful for years without regular manual maintenance — that's the only way for me to manage maintaining 50+ projects at once, especially during challenging years like 2020.
If we merge the PR in its current state, there will be no way to verify whether the TypeScript or ES6 version continues to work after any subsequent PR, however small, without manually testing it. Same with any TS version upgrade. So covering it with tests is essential, even if it means bringing in `typescript` as a dev dependency.
The other thing that I'm worried about is that neither I nor the other maintainers use TypeScript, so I don't feel qualified enough to decide in favor of a certain design decision (such as the ones people have disagreements about above) without spending a long time carefully considering each option, which I unfortunately don't have spare time for at the moment due to other commitments. Intuitively I like the option of providing `.d.ts` annotations instead of `.ts`, because this feels like a simpler and lower-maintenance solution, but without a strong consensus so far, I'm hesitant to choose.
@mourner: Very well explained! Thank you for this.
I think it should be possible to test the ES6 version just with plain node (since node natively supports ES6 by now).
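For example, such a plain-node test might look roughly like this (a sketch only; the `Envelope` message and the generated module path are hypothetical, and the real test harness may differ):

```js
// test-es6-output.mjs - round-trip a message through the generated ES6 code
import assert from 'assert';
import Pbf from 'pbf';
import { Envelope } from './fixtures/envelope.proto.js'; // hypothetical generated ES6 module

const writer = new Pbf();
Envelope.write({ type: 1 }, writer);

const decoded = Envelope.read(new Pbf(writer.finish()));
assert.strictEqual(decoded.type, 1);
```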
What about this: generate a `.d.ts` file instead of a combined `.ts`. As already mentioned, the `.d.ts` file would also help developers not using TypeScript, since many editors are able to use them to provide code assistance. And the code that actually runs would have test coverage.
@mourner: Would you accept this PR with these changes?
Maybe I should then drop the `--typescript` option and always generate a `.d.ts` file in `--es6` mode?
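To give an idea, the generated `.d.ts` for the `Foo`/`Bar` example above might look something like this (just a sketch of the shape, not actual compiler output):

```ts
// foo.proto.d.ts - hypothetical declarations accompanying the generated ES6 module
import Pbf from 'pbf';

export interface IBar {
    a?: number;
}

export interface IFoo {
    b?: IBar;
}

export const Foo: {
    read(pbf: Pbf, end?: number): IFoo;
    write(obj: IFoo, pbf: Pbf): void;
};
```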
@til-schneider even if the PR isn't accepted, with those changes I'd probably start using your branch instead of the main project. It's pretty stable and shouldn't need much maintenance.
> Maybe I should then drop the `--typescript` option and always generate a `.d.ts` file in `--es6` mode?

Yes 👍
I use https://github.com/timostamm/protobuf-ts, which is a solid and well-maintained TS implementation. You might want to check it out.
> I use https://github.com/timostamm/protobuf-ts, which is a solid and well-maintained TS implementation. You might want to check it out.
Thanks, but I don't actually need TS support, just ES6, and my aim is to use the library with the smallest bundle size. In my exploration `pbf` was easily the winner in that area.
That matches what I observed. Pbf is probably the best if bundle size is your primary concern.
Before I do any more work on this PR, I need to know from @mourner at which point he would accept it. I could implement the suggestions from my last comment - but what would that be good for if @mourner still didn't accept the PR?
I'm also interested in this, to some extent. It would be extremely cool if the types for this project were part of it. A simpler solution would be to simply include the types from @types/pbf in this repo and later on generate them automatically instead of editing them manually. I can open a PR with the simple approach if anyone is interested, but I guess this PR is better as it does the second part already...
My two cents: this could be very helpful. The only reason we don't use pbf is the lack of TypeScript definition generation.
This could also be very helpful for me, as I'm trying to migrate my site, which depends on pbf, to TypeScript. Will there be any progress on this PR?
You can simply use `@types/pbf`...
Apologies for commenting on an old neglected PR. My last few years haven't been easy, to put it mildly — I'm a Ukrainian living in Kyiv, and there's only so much energy to work on non-critical open source maintenance.
I finally just got my hands on overhauling and modernizing the project, adding ES codegen by default, first-class TS types for the package itself, updating dev tools and code practices, and greatly extending test coverage — check out https://github.com/mapbox/pbf/pull/134. I'll keep this PR open for reference when we're ready to add TS codegen support.
This adds two options, `--es6` and `--typescript`, which allow generating ES6 or TypeScript code.

Example ES6 output:

Example TypeScript output:

Furthermore, each top-level message is generated into a single object literal:

before:

after:
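For readers without the attached examples, a rough illustration of the object-literal change (my paraphrase based on the snippets quoted earlier in this thread, not the exact generated output):

```js
// before (classic codegen style, paraphrased): read/write assigned as separate properties
var Empty = exports.Empty = {};
Empty.read = function (pbf, end) {
    return pbf.readFields(Empty._readField, {}, end);
};
Empty._readField = function (tag, obj, pbf) {};
Empty.write = function (obj, pbf) {};

// after (paraphrased): one object literal per top-level message
export const Empty = {
    read(pbf, end) {
        return pbf.readFields(Empty._readField, {}, end);
    },
    _readField(tag, obj, pbf) {},
    write(obj, pbf) {}
};
```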