Closed: elichienxD closed this issue 2 years ago.
Unfortunately, if the data is far from being a tree, the algorithm can produce negative weights. I believe this is an algorithmic issue rather than an implementation bug, but I am not completely sure right now; I would need to work through the math again to confirm.
Here it doesn't remove those edges; instead it sets their weights to 0. The tree structure is stored in G2, and that graph still has the edge (at least in the Julia code).
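For concreteness, here is a minimal sketch of that post-processing step, assuming the recovered tree is held as a networkx graph (the helper name and the use of networkx are my own, not the repository's code):

```python
import networkx as nx

def clamp_negative_weights(G2: nx.Graph) -> nx.Graph:
    """Keep the tree topology of G2 but set any negative edge weight to 0.

    The edge itself is kept, so distances between leaves stay finite;
    only the offending weight is replaced.
    """
    for u, v, data in G2.edges(data=True):
        if data.get("weight", 0.0) < 0:
            data["weight"] = 0.0
    return G2
```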
Hi @rsonthal,
After some research on neighbor joining (NJ) methods, I found that it is common practice to replace negative edge weights (branch lengths) with 0. However, in theory the difference should be transferred to the adjacent branch. See here.
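For illustration, a small sketch of that convention, assuming the tree lives in a networkx graph; `transfer_negative_branch` is a hypothetical helper, and the choice of which adjacent branch absorbs the difference is left to the caller:

```python
import networkx as nx

def transfer_negative_branch(tree: nx.Graph, u, v, w) -> None:
    """If edge (u, v) has a negative length, set it to 0 and add that
    (negative) amount to the adjacent edge (v, w), so the path length
    from u to w through v is preserved."""
    length = tree[u][v]["weight"]
    if length < 0:
        tree[u][v]["weight"] = 0.0
        tree[v][w]["weight"] += length  # length < 0, so the adjacent branch shrinks
```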
I wonder whether your method works on the same principle as NJ-based methods. Is it also okay to simply replace negative weights with 0?
Thanks, Eli
I agree that setting the weights to 0 (during the algorithm) seems to be an okay thing to do, as would distributing the edge weight. I am curious how that would affect the algorithm.
My method has some similarities with NJ: mine is top-down rather than bottom-up like NJ, but I think it is built on similar concepts.
Hi @rsonthal,
I found that when the input metric is the pairwise distance matrix of many random points (n=128) in the 3D unit ball, TreeRep often returns a tree with negative edge weights. Is this an implementation bug?
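For reference, a rough sketch of how that input metric can be generated; the call into TreeRep is left as a commented placeholder, since the exact way the Julia code is invoked from Python is not shown here:

```python
import numpy as np
from scipy.spatial.distance import pdist, squareform

rng = np.random.default_rng(0)

# Sample n = 128 points uniformly from the 3D unit ball:
# uniform direction (normalized Gaussian) scaled by radius ~ U^(1/3).
n, d = 128, 3
directions = rng.normal(size=(n, d))
directions /= np.linalg.norm(directions, axis=1, keepdims=True)
points = directions * rng.uniform(size=(n, 1)) ** (1.0 / d)

# Pairwise Euclidean distances used as the input metric.
D = squareform(pdist(points))

# Hypothetical call; replace with however TreeRep is actually invoked.
# tree = treerep(D)
# print(sum(1 for _, _, w in tree.edges(data="weight") if w < 0), "negative-weight edges")
```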
Also, I found that in src/Tree Rep.ipynb,
Does that mean TreeRep is known to produce negative edge weights? I think simply removing those negative edges is not reasonable, as doing so would induce infinite distances between leaf nodes, right?
Below is a minimal working example (in Python) that reproduces my issue. Please let me know if I have misunderstood something.