RyanCavanaugh opened this issue 8 years ago
Not sure if it's related to this issue; anyway, this is what happens with my code:
let resolver;
new Promise(resolve => (resolver = resolve));
console.log(resolver); // error: Variable 'resolver' is used before being assigned.ts(2454)
I'm posting an issue here which I think is related to this topic. I don't really understand why the compiler is not able to infer `arg: c` in my last function (tell me if I should open a new issue instead):
type t = number[] | c;
class c {
constructor(public data: number[]) {}
m(n: number): void { console.log(n); }
}
function f(arg: t): void {
if (!(arg instanceof c)) {
arg = new c(arg);
}
arg.m(0); // expected behavior: arg is inferred as c
}
function g(arg: t): void {
if (!(arg instanceof c)) {
arg = new c(arg);
}
(() => {
arg.m(0); // expected behavior: arg is inferred as c
})();
}
function h(arg: t): void {
if (!(arg instanceof c)) {
arg = new c(arg);
}
for (let n = 0; n < 3; n++) {
arg.m(n); // expected behavior: arg is inferred as c
}
}
function i(arg: t): void {
if (!(arg instanceof c)) {
arg = new c(arg);
}
[0, 1, 2].forEach(n => {
arg.m(n); // actual behavior: arg is not inferred as c !
});
}
@basickarl Hey, you can use overloading:
interface Something {
maybe?: () => void;
}
function run(isTrue: true): { maybe: () => void };
function run(isTrue: boolean): Something {
const object: Something = {};
if (isTrue) {
object.maybe = (): void => {
console.log('maybe');
};
}
return object;
}
run(true).maybe(); // no error here anymore
Since #40860 was closed automatically...
interface ValidationErrorMap {
readonly [errorCode: string]: string | ValidationErrorMap;
}
type ValidationResult = ValidationErrorMap | "success";
function combine(results: ReadonlyArray<ValidationResult>): ValidationResult {
let finalResult: ValidationResult = "success";
for(const result of results) {
finalResult = (result === "success") ? finalResult
: (finalResult === "success") ? result
: { ...finalResult, ...result }; // Spread types may only be created from object types.
}
return finalResult;
}
Intuitively, if the compiler is smart enough to recognize that `finalResult` is initialized with the specific value `"success"`, it should be smart enough to see that the value changes inside the loop scope and that the narrowing can't be assumed to hold.
Since #41045 was closed automatically...
type Thing = { data: any };
const things: Map<string, Thing> = new Map();
function add_thing (id: string, data: any) {
let thing = things.get(id);
if (typeof thing === 'undefined') {
thing = { data };
things.set(id, thing);
}
// Object is possibly 'undefined'
return () => thing.data;
}
It is not possible for "thing" to be undefined based on the above code.
From #41113, I'll add another duplicate to the pile:
function doSomething(callback: () => void) {
callback();
}
let result: 'foo' | 'bar' = 'bar';
doSomething(() => {
result = 'foo';
})
// This should work, but fails because type is always "bar"
if (result === 'foo') {
}
You can fix it by casting to the appropriate type:
let result: 'foo' | 'bar' = 'bar' as 'foo' | 'bar';
This can also happen with promises, although it may be possible to detect the case where the type needs to be expanded. Consider the following:
enum State {
Start = 0,
Foo = 1,
Middle = 2,
End = 3
}
class TestClass {
public state: State = State.Start;
async go(): Promise<void> {
this.state = State.End
const promise = new Promise<void>(resolve => {
// doSomething();
this.state = State.Middle;
resolve();
});
await promise;
if (this.state === State.Middle) { // Compiler Error
}
}
}
const testClass = new TestClass()
console.log(testClass.state);
testClass.go();
console.log(testClass.state);
Simple case in test code:
const a = ["a"];
assert(a.length === 1);
a.push("b");
// error
assert(a.length === 2);
Is there any way to force a pessimistic compiler that assumes all specified types are possible, disabling automatic narrowing? This is quite frustrating when I have to use `as Type` statements to cast variables into the types I've already specified for them. It also leads to very error-prone code when types change, because I have to bypass the compiler to keep it from erroring out on simple loops and callbacks. I think a short-term solution (seeing that this has been open since 2016) would be to add a configuration option to let the user choose between the two methods while a more permanent solution is developed.
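For reference, here is a minimal sketch of the kind of `as Type` cast being described, reusing the 'foo' | 'bar' callback example from earlier in the thread. This only illustrates the workaround (and the cost of bypassing the checker), not a fix for the underlying issue:
let result: 'foo' | 'bar' = 'bar';
[1, 2, 3].forEach(() => {
    result = 'foo';
});
// Without the cast, the comparison is rejected with ts(2367) because `result`
// is still narrowed to 'bar' after the forEach call; casting back to the
// declared union silences the error.
if ((result as 'foo' | 'bar') === 'foo') {
    console.log('result was changed by the callback');
}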
TypeScript has different types for the same variable in the same scope: `value` here has two different types, one is `string` and the other is `string | string[]`.
function getValue(value: string | (()=>string)):void{}
function test(value: string | string[]): void{
if(value instanceof Array){
return
}
// commenting out this line resolves the issue, but I don't know why
// TS considers `value` here as string | string[], but it should be string only
value = 'string'
// `value` here is string
getValue(value)
// but here is string | string[]
getValue(()=>value)
}
issue #44921
@eng-dibo If you read the issue description, it’s fully explained why that happens.
I'm trying to get the idea from the issue description, but I couldn't. If you could explain what exactly causes this issue for my code snippet, it would be nice of you @fatcerberus
An `out` operator, applicable to both arguments and external identifiers, would be interesting.
For arguments:
// Operator `out` in `result`.
declare function tryDoSomething(x: string, out result: { success: boolean; value: number; }): void;
function myFunc(x: string) {
let result = { success: false, value: 0 };
tryDoSomething(x, result);
if (result.success === true) { // %%
return result.value;
}
tryDoSomething(x.trim(), result);
if (result.success === true) { // Ok - `result` is `out`
return result.value;
}
return -1;
}
For external identifiers:
let token = SyntaxKind.ExportKeyword
declare function nextToken(): SyntaxKind out(token) // A comma-separated list - `out(a,b,c)`
if (token === SyntaxKind.ExportKeyword) {
nextToken();
if (token === SyntaxKind.DefaultKeyword) { // OK - `nextToken` affects `token`
...
}
...
}
In addition, if a function calls another function with side effects, it will need to propagate the same `out` statements.
For arguments:
declare function foo(out point: Point): void
function bar(point: Point) {
foo(point) // Error - function `bar` needs to declare `point` as `out` too
}
For external identifiers:
let x = 0
declare function incX(): void out x
function foo() {
incX() // Error - function `foo` needs to declare `out x` too.
}
Maybe a `&` operator instead of `out`:
declare function foo(&x: number): void
declare function bar(): void &(x,y)
Or any other more elegant syntax with the same functionality.
An out operator, applicable to both arguments and external identifiers would be interesting.
Initially, "out variables" could be entirely optional. Changing the value of a variable without declaring it as out
is acceptable. The restriction only applies when there are calls to functions that already contain out
declarations (need to be propagated by the calling function). Next, there could be a strict
option to not allow modifications to external variables without declaring them as out
in the function definition.
I find adding modifiers to specific members of interfaces excessive, also because it is the function that must define whether the member of the interface provided as an argument may change, not the interface itself. Consequently, marking an argument of type `point: { x, y }` as `out` in a function makes the compiler understand that any member of `point` may have changed after the function returns.
BorrowScript is a Rust-inspired borrow checker with TypeScript-inspired syntax. It adds ownership operators that look like a related approach to the suggested `out` operator (see the `write` operator, and this issue regarding a current typo in the `write` example in the readme). Keep in mind that BorrowScript is overall a different language that compiles to static binaries; for example, `string` is mutable. Hence, the example of the `write` operator looks like this:
function writeFoo(write foo: string) {
foo.push('bar')
console.log(foo) // "foobar"
}
As far as I understand, this should work for objects, too:
function changeState(write data: { state: 'stopped' | 'running' }) {
// toggle state
if (data.state === 'stopped') {
data.state = 'running';
} else {
data.state = 'stopped';
}
}
The last section of the readme, HTTP Server with State, contains a draft of how closures might look, but the syntax is not final. See `[copy counterRef]` in this excerpt of the example:
const counterRef = new Mutex(0)
server.get('/', (req, res)[copy counterRef] => {
let value = counterRef.lock()
value.increment()
res.send()
})
@RyanCavanaugh This could be applied to your example Optimistic: Bad behavior on locals in the issue description. I add `[write token]` to the signatures of `nextToken` and `maybeNextToken`, which results in:
enum Token { Alpha, Beta, Gamma }
let token = Token.Alpha;
function nextToken()[write token] {
token = Token.Beta;
}
function maybeNextToken()[write token] {
if (... something ...) {
nextToken();
}
}
function doSomething() {
if (token !== Token.Alpha) {
maybeNextToken();
}
// is this possible? yes, since `maybeNextToken` with annotation `[write token]` may have been called
if (token === Token.Alpha) {
// something happens
}
}
where I answered the question in the comment in `doSomething`.
Note: I guess `[write token]` may be inferred from the body of the function (e.g. if an assignment `token = ...` exists). If it cannot be inferred, the proper default should be `read`.
This should also work for Optimistic: Bad behavior on fields, but `write result` has to be added explicitly to the declaration of `tryDoSomething`, since the body of the function is not known:
// Function somewhere else
declare function tryDoSomething(x: string, write result: { success: boolean; value: number; }): void;
function myFunc(x: string) {
let result = { success: false, value: 0 };
// result.success is known to be false
tryDoSomething(x, result);
// result.success may have changed, since result may have changed by call of tryDoSomething,
// which has `write result`
if (result.success === true) { // %%
return result.value;
}
tryDoSomething(x.trim(), result);
// result.success may have changed, since result may have changed by call of tryDoSomething,
// which has `write result`
if (result.success === true) { // ??
return result.value;
}
return -1;
}
In Pessimistic: Bad behavior on locals, `log` should have no `write` kind, i.e. no changes are required.
In Pessimistic: Bad behavior on fields, example 1, `alert` may have `[write a]`, but it is (still) not known in `fn` whether the passed `arg` is `a`.
function fn(arg: { x: string | null }) {
if (arg.x !== null) {
alert('All is OK!');
// Flow: Not OK, arg.x could be null
console.log(arg.x.substr(3));
}
}
let a = { x: 'ok' };
function alert()[write a] {
a.x = null;
}
fn(a);
You could assume pessimistically that `arg.x` might have changed, since `a` has a property `x` (as well as `arg`), which might be changed by `alert`, since it has `write a`. If the (names of the) properties were different, you could assume that `arg` may not be the same object as `a`. However, I guess it is better to be optimistic in general, i.e. not assume that `write a` might change a different variable `arg`, even though that means you would not detect the bug in this example.
In Pessimistic: Bad behavior on fields, example 2, I would argue in the same way for an optimistic approach: even if `visit` has `write node`, the expressions passed to `visit` are not the same variable as the parameter `node` of `visitChildren`. To highlight this, I renamed the parameter of `visit` (from `node` to `child`) to avoid ambiguity with the parameter `node` of `visitChildren` when referring to the variable name, i.e. `visit` has `write child` instead of `write node`, since `node` refers to `visitChildren`. Changed example:
function visitChildren(node: Node, visit: (child: Node) => void) {
switch(node.kind) {
case SyntaxKind.BinaryExpression:
visit(node.left);
visit(node.right); // Unsafe? No, since node.left is passed and not node
break;
case SyntaxKind.PropertyAccessExpression:
visit(node.expr);
visit(node.name); // Unsafe? No, since node.expr is passed and not node
break;
}
}
As in the previous example, Pessimistic: Bad behavior on fields, example 1, you may not detect bugs in case `visit` can change `node` anyway, for example if you can access and change the parent node of `node.left`, i.e. `node`. You would not know that `node.left.parent === node`.
In Mitigating with (shallow) inlining / analysis, I added `write x` to `check1` and `check2`. The first example leads to a false positive, but the second works fine:
// Non-null assignment can still trigger null warnings
function fn(x: string | null) {
function check1()[write x] {
x = 'still OK';
}
if (x !== null) {
check1();
// False positive: Error, x could be null, since `write x` does not tell what changed
console.log(x.substr(0));
}
}
function fn(x: string | null) {
function check1()[write x] {
check2();
}
function check2()[write x] {
x = null;
}
if (x !== null) {
check1();
// error, since check1 has write x
console.log(x.substr(0));
}
}
I'd like a `//@ts-no-type-narrowing` override (or some way to mark up a variable as not being able to be type-narrowed), because it's annoying having to work around this issue (and it's much clearer for my colleagues than `// @ts-expect-error`).
I have a callback function which is passed to a class. Various methods on the class are awaited, any of which may call the callback function, and that callback function changes a variable. Between each method I want to check the value of the variable, but TypeScript is throwing a hissy fit because "enum with a long name" value 0 can never be equal to "enum with a long name" value 3, and the amount of bloat I have to add to check that same variable three times when TypeScript thinks it hasn't changed is making my code unreadable.
e.g.:
var myvariable: no_narrow MyReallyLongEnumName = MyReallyLongEnumName.Value0;
That way we could mark up variables passed in as function parameters and such. We'd still get typing information for the variable and type checking and autocomplete on our checks and assignments but we'd be able to override the compiler's type narrowing for an individual variable that we know is going to change.
This is just absurd. Can you not turn off ts(2339) for JavaScript or something? This flags millions of lines of JavaScript code as broken.
One more example where current behavior is problematic:
let init: Array<number> | false = false
const setInit = (...args: Array<number>) => {
init = args
}
setInit(1, 2, 3)
// init is now = [1, 2, 3]
// but TypeScript still thinks it's `false`, so inside below 'if' refinement it thinks it's of type `never`
if (init) setInit(...init)
// ^^^^ error, because `init` is of type: `never`
Playground link with relevant code
A variable that is a union of two or more types is being assumed to only be the initialized type and refined to it, even though the actual type might change as a side effect of another function that's executed in the same scope.
I am coming here via #37415 and #27900 due to discussion being aggregated here, but those others are more specific. Here is an illustrative code sample where two structures that are the same in JavaScript are treated differently by TypeScript, which is quite confusing:
const examplePromiseConstructor = function(generallyAString ?: string) {
return new Promise(function(resolve, reject) {
//At the start of this function, the default value of the param is set,
//but the logic for doing so is more complex than what can readily fit into
//the function signature (e.g. depending on a complex combination of
//other parameters omitted from this minimal example)
//and may reference another function in the same object,
//which prevents automatic inference of that object's type
generallyAString = 'a string';
console.log(generallyAString); //generallyAString is of type 'string'
Promise.resolve()
.then(function() {
//generallyAString should still be of type 'string'
console.log(generallyAString); //generallyAString is of type 'string | undefined'
resolve(generallyAString)
}).catch(function(err: unknown) {
console.error(err);
reject(err)
});
});
}
//From a JavaScript perspective, this is exactly the same as the above,
//but Typescript treats it differently and maintains the type narrowing
//through the resolution of the promise which demonstrably cannot affect
//the type of the value.
const exampleAsync = async function(generallyAString ?: string) {
generallyAString = 'a string';
console.log(generallyAString);
try {
await Promise.resolve();
console.log(generallyAString); //generallyAString is still of type 'string'
return generallyAString;
} catch(err: unknown) {
console.error(err);
throw err;
};
}
The log lines are mostly there so you can hover over the variable name and see the type.
In the real example which sent me here, the async call is more complex than `Promise.resolve()`, but it doesn't take the local `generallyAString` as a parameter and doesn't affect its value or type.
@wbt The two examples are not exactly the same. IMO TS is exactly right here due to the async behaviour of promises, as proven by changing your example:
const examplePromiseConstructor = function(generallyAStringParam ?: string) {
return new Promise(function(resolve, reject) {
generallyAStringParam = 'not const';
const generallyAString = 'a string';
console.log(generallyAString); //generallyAString is of type 'string'
Promise.resolve()
.then(function() {
console.log(generallyAString); //generallyAString is of type 'string'
console.log(generallyAStringParam); //generallyAStringParam is of type 'string | undefined'
});
//generallyAString = undefined // not allowed
generallyAStringParam = undefined // oh!
});
}
examplePromiseConstructor()
@HolgerJeromin I don't agree with your comment. TypeScript can see the code as it actually is, which is NOT as in your hypothetical, and see that there is no change in the parameter type which could precede arrival at the 'then' block.
Hi, I was pointed to report this here, would it be part of this issue?
JS type checking: recognize globalThis.vars set in functions
In vscode Settings, enable JS/TS: Check JS
Create app.js
globalThis.uti = {}
function init() {
globalThis.gigel = 1
}
function plugin(options) {
console.log('ready on port', process.env.PORT)
console.log(uti) // this works
console.log(gigel) // Error: cannot find name
}
init()
plugin()
`gigel` should be recognized as a global variable, even though it was set in a function.
Thanks!
I did not read everyone's responses as there are a lot, but why can't TypeScript look inside the function to see whether it does change the variable, and what the possible outcomes are given that?
I have a different suggestion to solve this problem. Instead of trying to get TypeScript to correctly know what the type is, give us a way to override the type in a given scope, to say "this variable is this type now". Something like I'd imagine it could look:
const chars = 'Hello // Cake\nPotato'.split('')
while (chars.length) {
if (chars[0] === '/') {
now chars[0] = string // Telling TypeScript that chars[0] is now to be seen as a string and not '/' within the given scope.
do
chars.shift()
while (chars[0] !== '\n') // Currently throws an error saying there is no overlap between chars[0] and '\n'
chars.shift()
}
console.log(chars.shift())
}
I don't know if this is the correct place to address this, but it would be great if TypeScript could do some flow analysis for assignments. Recently I've been writing a lot of mocks of API calls for our tests. Let's say you have some resource which can have a lot of different statuses. There are a lot of shared properties, but also status-specific properties. And of course you can transition from one status to another.
Currently I write it like this:
type Example = { status: 'a' } | { status: 'b', field: string };
// generated at some point
const example = { status: 'a' } as Example;
// later update data
example.status = 'b';
if (example.status === 'b')
example.field = '123';
But it would be great if I could write it like this:
type Example = { status: 'a' } | { status: 'b', field: string };
// generated at some point
const example = { status: 'a' } as Example;
// later update data
example.status = 'b';
example.field = '123'; // this currently errors
Furthermore it would be great if TypeScript would complain on this:
type Example = { status: 'a' } | { status: 'b', field: string };
// generated at some point
const example = { status: 'a' } as Example;
// later update data
example.status = 'b';
// FORGOT TO SET "field"
Basically requiring me to set all mandatory fields immediately after setting `status` to `'b'`.
@donaldpipowitch But what do you mean by "immediately"? What type does `example` have at the beginning of `example.field = '123'`?
This seems like a better fit for
let example: Example = {status: 'a'};
//later
example = {...example, status: 'b', field: '123'};
unless there's some reason it's vital that the object referenced by `example` not change.
Yeah, it's actually vital that the object reference doesn't change.
"immediately": Was just thinking about something similar to assigning properties in a constructor (only that there is no constructor and the assignment needs to happen directly after setting the status
in the example).
Hey!
The code below gives the error "This comparison appears to be unintentional because the types 'false' and 'true' have no overlap." Replacing `setIt(oo);` with `oo.cancel = true;` makes sense.
Is this related to this issue or a new one? If related, any suggestion other than `ts-ignore`? Maybe some internal hinting decorator/flag or comment that would mark `setIt` as mutating the passed value?
function setIt(o: { cancel: boolean }): void {
o.cancel = true;
}
function doIt(): void {
const oo = { cancel: false };
if (oo.cancel === true) {
console.log('1');
return;
}
setIt(oo);
if (oo.cancel === true) {
console.log('2');
return;
}
console.log('3');
}
doIt();
Just encountered a bug in control flow for a `forEach` vs a `for` loop:
let x: string | number | null = null;
for (let i = 0; i < 1; i++) {
x = 'string';
}
[1].forEach(i => {
x = 1;
});
x;
Expected: `x` should be inferred as type `string | number | null`.
Actual: `x` is inferred as type `string | null` (it seems to ignore the forEach control flow).
Reproduced on TS v5.0.3.
I just had an issue the other way round with `async` and `await` (adding it here for better searchability). Happy to create a separate issue if needed.
TypeScript did not detect a possible type change after an `async` function call.
The code after `await` could be called many minutes later (same as code inside a `.then` Promise method), so the detected type information should be reset.
class Foo{
static prop: string | number = '';
}
// working
Foo.prop = '';
Promise.resolve()
.then(()=>{
Foo.prop;
//^?
// detected as string | number
});
// not working
async function context() {
Foo.prop = '';
async function foo (){
Foo.prop = 1;
}
await foo()
Foo.prop;
//^?
// detected as string only but should be string | number
}
Re: "Mitigating with const parameters". That would simplify CFA and therefore speed up type checking, and of course help users. Seems essential.
In case it's not been covered, here's an example that doesn't deal with objects whatsoever, so there's no concern with mutation, only reassignment: https://tsplay.dev/m3K1qm Note that it works if I change the `var` to `const`, but I'd expect TS to be able to determine that the var isn't actually reassigned anywhere and pretend it's a `const`.
FYI The behavior seems to be inconsistent depending on how the function is defined
Here's a simple playground https://tsplay.dev/w2gbVN
const element = new Node();
if (element instanceof HTMLElement) {
function one() {
element;
// ^? Node
}
const two = function () {
element;
// ^? HTMLElement
};
const three = () => {
element;
// ^? HTMLElement
};
}
Ideally it would be preferable if it kept the type narrowing here, since we're narrowing on a const variable, not on a let or some mutable field. But even if that's not going to be implemented due to complications, I think it would be better to at least keep it consistent.
FYI The behavior seems to be inconsistent depending on how the function is defined
Your example looks like correct behaviour according to the JS hoisting rules: the inner `function` gets pulled up to the top of the parent function scope.
A workaround for this type check:
let x: 1 | boolean = false
setTimeout(() => { x=1 })
await waitUntilXChanged(...)
// wrong
if (x === 1) { ... }
const xIsNumber = (res: typeof x): res is 1 => x === 1
// passed
if (xIsNumber(x)) { ... }
@borisovg
This playground behavior seems to indicate that `one` is not hoisted to outside the `if` scope:
class Bob { name(){ return "bob"}};
class Alice { name() { return "alice"}};
const element = Math.random() > 0.5 ? new Alice() : new Bob();
one(); // error - Cannot find name 'one'.(2304), also JS runtime error "one is not defined"
if (element instanceof Bob) {
one();
function one() {
console.log(element.name())
element;
// ^? Bob | Alice
}
const two = function () {
element;
// ^? Bob
};
const three = () => {
element;
// ^? Bob
};
}
Therefore, in the above example, because `element` is `const`, `element` inside the `if` will always be an instance of `Bob`.
@craigphicks You are right that I don't actually fully understand how hoisting of function definitions works! 😂 There is definitely some hoisting going on in non-strict mode but I don't know how that works with TS:
if (1 > 0) {
function foo() {
console.log("FOO");
}
}
foo()
// prints 'FOO' unless 'use strict' is set
@borisovg It's hoisted if that branch gets executed. So in the example before, I would assume it is safe to consider `element` to be `Bob`, since it would be impossible to call this function without `element` being `Bob`.
In your example, replace `1 > 0` with `1 < 0` and you will get an error even without 'use strict'.
@HugeLetters Looking at your original example, it is interesting that if you change `const element` to `let element`, all three inner functions will type it as `Node` despite the guard. There is definitely a TS dimension being added here.
Yup, they all change since with `let` there's no guarantee it won't be reassigned without some overly complicated checks, and it's practically impossible to guarantee once you introduce `async`.
I just ran into a problem with this for the first time despite using TypeScript for years.
Here's a simplified example to demonstrate. Essentially, my class has a property. That property can be changed by a method, but TypeScript doesn't think it can.
class Foo {
state: 'online' | 'offline' = 'online'
foo() {
if (this.state !== 'online') return
this.bar()
if (this.state === 'offline') return // This line gives an error because TypeScript thinks the value must be `'online'`.
console.log('still online')
}
bar() {
this.state = 'offline'
}
}
The error occurs from L8, `if (this.state !== 'online') return`; after that line, the type of `this.state` must be `'online'`.
@dhlolo The error occurs on line 7:
This comparison appears to be unintentional because the types '"online"' and '"offline"' have no overlap. (2367)
Line 5 contains `if (this.state !== 'online') return`, not line 8.
While TypeScript is correct that `this.state` must be `'online'` immediately after line 5, line 6 (`this.bar()`) changes that, so TypeScript should not be so confident that `this.state` is still `'online'` after line 6.
Note that it would be a different story if `state` was a variable instead of a property, since `this.bar()` could not change it. However, it's a property of a class, which can be changed by methods in between checks.
It seems tsc will not detect the effect of the method `bar`, which is to change `this.state`; it can only infer from obvious, shallow syntax. If you replace `this.bar()` with `this.state = 'offline'`, the error disappears. By the way, you can use asserts to avoid the error: https://www.typescriptlang.org/play/?ts=5.4.5#code/MYGwhgzhAEBiD29oG8BQ1oQC5iwUwC5oByeAOxAEsy9joAfE+AM2aproF4mLrbV00ZogAUAShSCMlZtBFYAFpQgA6bLjzQAhJ26leHCQCc8WAK5GyU6IuUqARmCPiA3NZlzbq9fmi69LGx8xBLQJuaW1sDkEPAgeCog8ADmIsTYlCAg0OTstGKCAL6Cjs5iRJAQeEZYMF7QyiiYOPhEpKx5dIWSGBheai2aAR3BbhjFxUA
@dhlolo The problem isn't that TypeScript doesn't detect that `bar()` changes `this.state`; I would never expect it to do that. The problem is that TypeScript assumes that `this.state` can't have been changed at all. This means that you are writing code with the false assumption that `this.state` must be `'online'` and potentially using it in places where `'offline'` would be invalid.
class Foo {
state: 'online' | 'offline' = 'online'
foo() {
if (this.state !== 'online') return
this.bar()
// @ts-ignore: TypeScript raises an error because it assumes `this.state` must be `'online'`.
if (this.state === 'offline') {
console.log('offline');
}
// qux only accepts `'online'` or `'not-online'`, but `this.state` could be `'offline'`.
// TypeScript doesn't raise an error because it assumes `this.state` must be `'online'`.
this.qux(this.state);
}
bar() {
this.state = 'offline'
}
qux(input: 'online' | 'not-online') {
if(input !== 'online' && input !== 'not-online') {
// This will throw because `input` is actually `'offline'`.
throw new Error(`Expected 'online' or 'not-online'. Got '${input}'`);
}
}
}
Honestly, you are right. It seems to be too optimistic in your case about which class method may set a class property. But the other choices would be to assume that any method could change the property (which seems too pessimistic), or to analyse deeper (which may make things much more complicated).
@MartinJohns You seem to dislike this whole exchange. Care to chime in as to why?
@jordanbtucker Because it's just a repetition of the comments before and the linked issues. Your case is literally the first example on this issue. To me it's spam. And by responding to your question I'm unfortunately doing the same, adding another comment that will make everyone subscribed get a notification, without any real content or change.
@MartinJohns I didn't see many, if any, examples in this discussion that specifically dealt with simple class properties and methods, so I thought I was adding to the conversation. Maybe I missed it, but it just seemed like it hadn't been discussed specifically.
If this isn't the right place to have discussions about code flow analysis, then maybe it should just be locked.
@jordanbtucker - I'm glad you brought up volatility in the context of class properties and methods, because ...
It's not uncommon to see member functions named `XXXMutating()` or `XXXNonMutating()` or similar, especially in a garbage-collected language like JS. That's because there may be multiple agents referencing the same object that depend upon that object not changing. If an agent wants to mutate an object being referenced by other agents, then it should clone the object first. In that scenario, it would be helpful to have member functions typed as "self-mutating" or "not self-mutating", and have that enforced for the actions within the member functions. Typing could help with always cloning when necessary, and never cloning when not necessary.
If member functions could be so marked, and `bar` was marked as "self-mutating", then it could be (with proper design) a simple O(1) operation to reset `this` to its widened state. (Useful, even if it isn't the only reason for such marking.)
I encountered this use case; I believe it belongs here.
What happens: trying to make a type guard based on the shape of children props, but the type is not inferred correctly.
This happens on 5.4.5, so it is not part of the #57465 fix, but the same workaround can be applied (`x is Type`).
type ChildItem = {id: number} | string
type Item = {
children: ChildItem[] // can be objects or strings here
}
type NarrowedItem = {
children: {id: number}[] // only objects here
}
const allItems: Item[] = [] // initially accept with any children
const filteredItems: NarrowedItem[] = allItems.filter(item => // and here only accept items
item.children.every(child => typeof child !== 'string') // if all its children are objects
) // it should be correct in runtime, but typescript thinks it is not narrowed
const filteredItemsWorkaround: NarrowedItem[] = allItems.filter((item): item is NarrowedItem => // same code, but with "item is NarrowedItem"
item.children.every(child => typeof child !== 'string')
) // it works here
Error on line 13 (the `filteredItems` declaration):
Type 'Item[]' is not assignable to type 'NarrowedItem[]'.
Type 'Item' is not assignable to type 'NarrowedItem'.
Types of property 'children' are incompatible.
Type 'ChildItem[]' is not assignable to type '{ id: number; }[]'.
Type 'ChildItem' is not assignable to type '{ id: number; }'.
Type 'string' is not assignable to type '{ id: number; }'.
Some rough notes from a conversation @ahejlsberg and I had earlier about trade-offs in the control flow analysis work based on running the real-world code (RWC) tests. For obvious reasons I'll be comparing what Flow does with similar examples to compare and contrast possible outcomes.
The primary question is: When a function is invoked, what should we assume its side effects are?
One option is to be pessimistic and reset all narrowings, assuming that any function might mutate any object it could possibly get its hands on. Another option is to be optimistic and assume the function doesn't modify any state. Both of these seem to be bad.
This problem spans both locals (which might be subject to some "closed over or not" analysis) and object fields.
Optimistic: Bad behavior on locals
The TypeScript compiler has code like this:
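(The code block itself is not included in this excerpt; the following is a reconstruction based on the annotated copy quoted earlier in the thread, with the hypothetical `[write token]` annotations removed.)
enum Token { Alpha, Beta, Gamma }
let token = Token.Alpha;
function nextToken() {
    token = Token.Beta;
}
function maybeNextToken() {
    if (... something ...) {
        nextToken();
    }
}
function doSomething() {
    if (token !== Token.Alpha) {
        maybeNextToken();
    }
    // is this possible?
    if (token === Token.Alpha) {
        // something happens
    }
}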
Optimistically assuming `token` isn't modified by `maybeNextToken` incorrectly flags `token === Token.Alpha` as an impossibility. However, in other cases, this is a good check to do! See later examples.
Optimistic: Bad behavior on fields
The RWC suite picked up a "bug" that looked like this:
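(The code block is not included in this excerpt; reconstructed here from the annotated copies quoted earlier in the thread, without the `out`/`write` annotations.)
// Function somewhere else
declare function tryDoSomething(x: string, result: { success: boolean; value: number; }): void;
function myFunc(x: string) {
    let result = { success: false, value: 0 };
    tryDoSomething(x, result);
    if (result.success === true) { // %%
        return result.value;
    }
    tryDoSomething(x.trim(), result);
    if (result.success === true) { // ??
        return result.value;
    }
    return -1;
}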
The `??` line here is not a bug in the user code, but we thought it was, because after the `%%` block runs, the only remaining value in `result.success`'s domain is `false`.
Pessimistic: Bad behavior on locals
We found actual bugs (several!) in partner code that looked like this:
Here, we detected the bug that `Kind.Good` (which has the falsy value `0`) is not in the domain of `kind` at the point of the `case` label. However, if we were fully pessimistic, we couldn't know that the global function `log` doesn't modify the global variable `kind`, thus incorrectly allowing this broken code.
Pessimistic: Bad behavior on fields, example 1
A question on flowtype SO is a good example of this.
A smaller example that demonstrates the behavior:
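(Code block not included in this excerpt; reconstructed from the copy quoted earlier in the thread.)
function fn(arg: { x: string | null }) {
    if (arg.x !== null) {
        alert('All is OK!');
        // Flow: Not OK, arg.x could be null
        console.log(arg.x.substr(3));
    }
}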
The problem here is that, pessimistically, something like this might be happening:
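(Also reconstructed from the copy quoted earlier in the thread, without the hypothetical `[write a]` annotation; the `s: string` parameter and the type annotation on `a` are additions here so the sketch type-checks under strictNullChecks.)
let a: { x: string | null } = { x: 'ok' };
function alert(s: string) {
    a.x = null;
}
fn(a);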
Pessimistic: Bad behavior on fields, example 2
The TS compiler has code that looks like this (simplified):
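(Code block not included in this excerpt; reconstructed from the copy quoted earlier in the thread. Per that comment, the callback parameter was originally named `node` rather than `child`, and the comments posed the question rather than answering it.)
function visitChildren(node: Node, visit: (node: Node) => void) {
    switch (node.kind) {
        case SyntaxKind.BinaryExpression:
            visit(node.left);
            visit(node.right); // Unsafe?
            break;
        case SyntaxKind.PropertyAccessExpression:
            visit(node.expr);
            visit(node.name); // Unsafe?
            break;
    }
}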
Here, we discriminated the `Node` union type by its `kind`. A pessimistic behavior would say that the second invocations are unsafe, because the call to `visit` may have mutated `node.kind` through a secondary reference and invalidated the discrimination.
Mitigating with (shallow) inlining / analysis
Flow does some assignment analysis to improve the quality of these errors, but it's obviously short of a full inlining solution, which wouldn't be even remotely practical. Some examples of how to defeat the analysis:
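(The example code is not included in this excerpt; the two functions below are reconstructed from the annotated copies quoted earlier in the thread, with the hypothetical `[write x]` annotations removed. The second function is renamed `fn2` so the two can coexist in one snippet, and some comments are adapted to describe the behavior discussed in the text.)
// Non-null assignment can still trigger null warnings
function fn(x: string | null) {
    function check1() {
        x = 'still OK';
    }
    if (x !== null) {
        check1();
        // A shallow assignment analysis sees the assignment in check1 and
        // reports that x could be null here, even though it never is.
        console.log(x.substr(0));
    }
}
// An assignment two calls deep defeats the shallow analysis
function fn2(x: string | null) {
    function check1() {
        check2();
    }
    function check2() {
        x = null;
    }
    if (x !== null) {
        check1();
        // No error is reported here, even though x really can be null.
        console.log(x.substr(0));
    }
}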
Mitigating with `const` parameters
A low-hanging piece of fruit is to allow a `const` modifier on parameters. This would allow a much faster fix for code that looks like this:
Mitigating with `readonly` fields
The `visitChildren` example above might be mitigated by saying that `readonly` fields retain their narrowing effects even in the presence of intervening function calls. This is technically unsound, as you may have both a `readonly` and a non-`readonly` alias to the same property, but in practice this is probably very rare.
Mitigating with other options
Random ideas that got thrown out (will add to this list) but are probably bad?
A `pure` modifier on functions that says this function doesn't modify anything. This is a bit impractical, as we'd realistically want this on the vast majority of all functions, and it doesn't really solve the problem since lots of functions only modify one thing, so you'd really want to say "`pure` except for `m`".
A `volatile` property modifier that says "this property will change without notice". We're not C++ and it's perhaps unclear where you'd apply this and where you wouldn't.