vgteam / sequenceTubeMap

displays multiple genomic sequences in the form of a tube map
MIT License

I want to learn more about your projects #372

Closed — Szhiha closed this issue 10 months ago

Szhiha commented 11 months ago

Would you mind telling me how you generate a horizontal piece of base sequence like yours? And most importantly, how did you generate the horizontal base sequences for these segments and nodes and match them up one-to-one? Thank you!

Szhiha commented 11 months ago

And I am wondering what `from_length` and `to_length` mean in your JSON file.

adamnovak commented 11 months ago

> Would you mind telling me how you generate a horizontal piece of base sequence like yours? And most importantly, how did you generate the horizontal base sequences for these segments and nodes and match them up one-to-one?

I'm not really sure what you mean here. I think we draw the horizontal pieces of sequence using normal SVG text in the SVG that we use to render the visualization. A lot of the logic to work out where to put the nodes in the visualization looks to be in `generateNodeOrder()`, but honestly the person who really understood the algorithm no longer works on the project.
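To make the "normal SVG text" idea concrete, here is a minimal, hypothetical sketch (not sequenceTubeMap's actual code) of laying out one base per monospace glyph along a horizontal node:

```javascript
// Hypothetical sketch: render a node's base sequence as one SVG <text>
// element per base, spaced evenly along the x axis. charWidth is an
// assumed fixed glyph width for a monospace font.
function nodeToSvgText(sequence, x, y, charWidth = 8) {
  return [...sequence]
    .map(
      (base, i) =>
        `<text x="${x + i * charWidth}" y="${y}" font-family="monospace">${base}</text>`
    )
    .join("\n");
}

console.log(nodeToSvgText("ACGT", 10, 20));
```

Because each base gets its own `x` coordinate, the glyphs line up one-to-one with positions along the node, which is presumably what makes the sequence appear "inside" the horizontal tube.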

We work out what the sequences should be and what nodes to make by using pangenomics tools to make graphs. Usually we would use vg or The Minigraph-Cactus Pangenome Pipeline or PanGenome Graph Builder.

> And I am wondering what `from_length` and `to_length` mean in your JSON file.

`from_length` and `to_length` are from the vg data model: https://github.com/vgteam/libvgio/blob/45d8ada05ee1d1405ef44d93f2ac00a5a097dd09/deps/vg.proto#L51-L52

`from_length` is the number of bases of the graph node involved, and `to_length` is the number of bases of the aligned read that correspond to them. So if `from_length` is 0 and `to_length` is, say, 5, then you are looking at 5 bases inserted in the read that weren't present in the graph.
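The field names `from_length`, `to_length`, and `sequence` follow vg.proto, but this little classifier is just an illustrative sketch of the semantics described above, not part of vg itself:

```javascript
// Illustrative helper (not vg code): interpret a vg-style Edit object.
// from_length = bases consumed on the graph node,
// to_length   = bases consumed on the aligned read.
function classifyEdit(edit) {
  const from = edit.from_length || 0;
  const to = edit.to_length || 0;
  if (from === to) {
    // Equal lengths: a match, or a substitution if sequence differs.
    return edit.sequence ? "substitution" : "match";
  }
  if (from === 0) return "insertion"; // bases only in the read
  if (to === 0) return "deletion";    // bases only in the graph
  return "complex";
}

classifyEdit({ from_length: 0, to_length: 5, sequence: "ACGTA" }); // "insertion"
```

So the example from the comment above, `from_length: 0` with `to_length: 5`, comes out as an insertion: five read bases with no counterpart in the graph.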

Szhiha commented 11 months ago

Thank you very much!

adamnovak commented 10 months ago

If that helps, I'm going to go ahead and close this issue. Thanks!