Hi everyone!

I've recently started using trackpy and it's been amazing. I started with the link_df function on cropped datasets (centroids from segmented data) to work out the linking parameters, and it worked a charm. However, when I tried the same pipeline on my full timelapse data (~7000 labels over 60 frames) it took quite a long time, so I turned to link_df_iter, which I'd read links much faster, and used the same parameters as with link_df. However, whenever I try to run this line:

linked_df = pd.concat(tp.link_df_iter(df, t_column='frame', pos_columns=['z', 'y', 'x'], search_range=20, memory=8), ignore_index=True)

I've been getting this TypeError, which points to the t_column of my dataframe:
File ~\anaconda3\envs\tracking\lib\site-packages\pandas\core\reshape\concat.py:443, in _Concatenator.__init__(self, objs, axis, join, keys, levels, names, ignore_index, verify_integrity, copy, sort)
440 self.verify_integrity = verify_integrity
441 self.copy = copy
--> 443 objs, keys = self._clean_keys_and_objs(objs, keys)
445 # figure out what our result ndim is going to be
446 ndims = self._get_ndims(objs)
File ~\anaconda3\envs\tracking\lib\site-packages\pandas\core\reshape\concat.py:502, in _Concatenator._clean_keys_and_objs(self, objs, keys)
500 objs_list = [objs[k] for k in keys]
501 else:
--> 502 objs_list = list(objs)
504 if len(objs_list) == 0:
505 raise ValueError("No objects to concatenate")
File ~\anaconda3\envs\tracking\lib\site-packages\trackpy\linking\linking.py:278, in link_df_iter(f_iter, search_range, pos_columns, t_column, **kwargs)
274 coords_iter = coords_from_df_iter(f_coords_iter, pos_columns, t_column)
276 ids_iter = (_ids for _i, _ids in
277 link_iter(coords_iter, search_range, **kwargs))
--> 278 for df, ids in zip(f_iter, ids_iter):
279 df_linked = df.copy()
280 df_linked['particle'] = ids
File ~\anaconda3\envs\tracking\lib\site-packages\trackpy\linking\linking.py:276, in <genexpr>(.0)
273 f_iter, f_coords_iter = itertools.tee(f_iter)
274 coords_iter = coords_from_df_iter(f_coords_iter, pos_columns, t_column)
--> 276 ids_iter = (_ids for _i, _ids in
277 link_iter(coords_iter, search_range, **kwargs))
278 for df, ids in zip(f_iter, ids_iter):
279 df_linked = df.copy()
File ~\anaconda3\envs\tracking\lib\site-packages\trackpy\linking\linking.py:88, in link_iter(coords_iter, search_range, **kwargs)
85 coords_iter = iter(coords_iter)
87 # interpret the first element of the iterable
---> 88 val = next(coords_iter)
89 if isinstance(val, np.ndarray):
90 # the iterable was not enumerated, so enumerate the remainder
91 coords_iter = enumerate(coords_iter, start=1)
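In case it helps, here's a minimal stand-in for how my data is shaped and the two calls I'm comparing. The values below are made up (my real df has ~7000 labels over 60 frames), but the column names and parameters are the ones I'm actually using:

import pandas as pd
import trackpy as tp

# Made-up stand-in for my data: one row per centroid, with the frame
# index and the 3D position columns used for linking.
df = pd.DataFrame({
    'frame': [0, 0, 1, 1],
    'z': [10.0, 40.0, 11.0, 41.0],
    'y': [5.0, 20.0, 6.0, 21.0],
    'x': [7.0, 30.0, 8.0, 31.0],
})

# This is what worked fine on the cropped datasets:
linked = tp.link_df(df, search_range=20, memory=8,
                    pos_columns=['z', 'y', 'x'], t_column='frame')

# And this is the line that raises the TypeError on the full data:
linked_df = pd.concat(
    tp.link_df_iter(df, t_column='frame', pos_columns=['z', 'y', 'x'],
                    search_range=20, memory=8),
    ignore_index=True)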
When I also run just the tp.link_df_iter(df, t_column='frame', pos_columns=['z', 'y', 'x'], search_range=20, memory=8) line on its own, there's no error and I get this output:
<generator object link_df_iter at 0x0000018D83335A10>
Which makes me think the problem isn't link_df_iter itself but how it interacts with pd.concat?
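To spell out what I mean, a small sketch (same imports and df as in the stand-in above): the bare call only hands back a lazy generator, and nothing is actually linked until something iterates it, which pd.concat does via the list(objs) call shown in the traceback.

# Continuing from the sketch above (same imports and df).
gen = tp.link_df_iter(df, t_column='frame', pos_columns=['z', 'y', 'x'],
                      search_range=20, memory=8)

# No linking has happened yet, so no error here; this just prints the
# generator object, like the output above.
print(gen)

# The failure only shows up once the generator is consumed, e.g. by
# pd.concat internally or by list() directly:
frames = list(gen)   # this is where the error above is actually raised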
I've checked the original df going into the call: the values in my 'frame' column are all ints and there are no NaN values either (roughly the checks sketched below). I was just wondering if anyone else has experienced anything like this and would know how I should troubleshoot it? Thanks for any help!
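For completeness, these are roughly the checks I ran on the input df, using the same df as in the sketch above (just the idea, not my exact code):

# Frame column should be integer-typed and nothing should be missing.
print(df['frame'].dtype)     # int64 in my case
print(df.dtypes)             # z, y, x are floats; frame is int
print(df.isna().any())       # False for every column, so no NaNs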