Eventual-Inc / Daft

Distributed DataFrame for Python designed for the cloud, powered by Rust
https://getdaft.io
Apache License 2.0

if_else() expression requires both sides to exist regardless of if the logical expression evaluates to true or false #2190

Closed daveqs closed 2 weeks ago

daveqs commented 3 weeks ago

Describe the bug When applying the if_else expression following a logical expression, both sides of the if_else must exist regardless of whether the logical evaluates to true or false. If one side exists only when the logical is true but doesn't exist when it is false, or vice versa, daft returns ValueError: DaftError::External Unable to create logical plan node.

To Reproduce Run the following Python example

import daft

my_keys = ['key_A', 'key_B', 'key_C']

# this works
test1_df = daft.from_pydict({"struct_col": [{'key_A':1, 'key_B':2}, {'key_A':3, 'key_B':4}, {'key_A':5, 'key_B':6}]})
for key in my_keys:
    idx = test1_df.schema()['struct_col'].dtype.to_arrow_dtype().get_field_index(key)
    if idx > -1:
        test1_df = test1_df.with_column(key, test1_df['struct_col'].struct.get(key))
    else:
        test1_df = test1_df.with_column(key, daft.lit(None))

test1_df.show()

# this fails
test2_df = daft.from_pydict({"struct_col": [{'key_A':1, 'key_B':2}, {'key_A':3, 'key_B':4}, {'key_A':5, 'key_B':6}]})
for key in my_keys:
    idx = test2_df.schema()['struct_col'].dtype.to_arrow_dtype().get_field_index(key)
    test2_df = test2_df.with_column(
        key,
        (daft.lit(idx) > daft.lit(-1)).if_else(test2_df['struct_col'].struct.get(key), daft.lit(None))
    )

test2_df.show()

Expected behavior In the above example, I expect test2_df to be identical to test1_df. This would be the case if daft evaluated only the left side of the if_else() expression when the logical it is applied to (in the example, (daft.lit(idx) > daft.lit(-1))) is true, and only the right side when that logical is false.


samster25 commented 2 weeks ago

@colin-ho can you take a look?

colin-ho commented 2 weeks ago

Hey @daveqs ! The reason for your error is that Daft is trying to create a schema from the if_else expression, and to do that it needs to check the datatypes of both the left and right sides of the if_else to determine the result type. For the specific expression test2_df['struct_col'].struct.get('key_C'), the datatype cannot be determined because the key does not exist, and therefore the error is thrown.
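To make the failure mode concrete, here is a minimal pure-Python sketch of this kind of planning-time type inference. It is illustrative only (these are not Daft's internal APIs): the result type of an if_else must cover whichever branch is taken at runtime, so both branch dtypes are resolved during planning, before any predicate is evaluated.

```python
# Hypothetical sketch of planning-time type inference for if_else.
# Not Daft's actual planner code; names are illustrative.

def infer_struct_get(struct_fields, key):
    """Return the dtype of struct.get(key), or raise if the key is absent."""
    if key not in struct_fields:
        raise ValueError(f"field {key!r} not found in struct")
    return struct_fields[key]

def infer_if_else(left_dtype, right_dtype):
    """The result dtype is computed from BOTH branches, unconditionally."""
    if left_dtype == right_dtype or right_dtype == "null":
        return left_dtype
    if left_dtype == "null":
        return right_dtype
    raise TypeError(f"cannot unify {left_dtype} and {right_dtype}")

fields = {"key_A": "int64", "key_B": "int64"}

# key_B: both branch dtypes resolve, so planning succeeds.
ok = infer_if_else(infer_struct_get(fields, "key_B"), "null")
print(ok)  # int64

# key_C: the left branch's dtype cannot be determined, so planning
# fails before the predicate is ever evaluated.
try:
    infer_if_else(infer_struct_get(fields, "key_C"), "null")
    failed = False
except ValueError:
    failed = True
print(failed)  # True
```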

That being said, we can make your desired behavior work, i.e. modify if_else to selectively check the datatype of only the left or right side based on the result of the predicate. However, this is only possible for predicates that can be evaluated at planning time: something like daft.lit(idx > -1).if_else(...) will work, but (daft.col("col") > 1).if_else(...) won't be possible, as that expression requires knowledge of the column's data.
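The proposed behavior can be sketched as constant-folding the predicate during planning. Again a hypothetical illustration, not Daft's internals: when the predicate is a literal known at plan time, only the taken branch needs to type-check; a column-dependent predicate still forces both branches to be checked.

```python
# Hedged sketch of the proposed if_else planning rule. Illustrative only.

def plan_if_else(predicate_value, infer_left, infer_right):
    """predicate_value: a Python bool when the predicate folds to a
    constant at planning time, else None (column-dependent).
    infer_left / infer_right: callables returning a dtype or raising."""
    if predicate_value is True:
        return infer_left()   # right branch is never type-checked
    if predicate_value is False:
        return infer_right()  # left branch is never type-checked
    # Predicate depends on column data: both branches must type-check.
    left, right = infer_left(), infer_right()
    return right if left == "null" else left

def missing_key():
    raise ValueError("field 'key_C' not found")

# For key_C, idx is -1, so the literal predicate folds to False and the
# branch that would fail type inference is skipped entirely.
dtype = plan_if_else(-1 > -1, missing_key, lambda: "null")
print(dtype)  # null
```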

Let me know if this is ok for you!

daveqs commented 2 weeks ago

Hi @colin-ho , in my opinion what you described is the preferred behavior for if_else, and it would solve my use case (though I understand the limitation that the logical must be evaluated during query planning).

Do you see any downside to implementing this behavior?

colin-ho commented 2 weeks ago

None I can think of, I can make a PR for this this week.

daveqs commented 2 weeks ago

Great, thank you!

colin-ho commented 2 weeks ago

Hey @daveqs ! Just merged in the PR for this fix, should be available in the upcoming release next week!