Closed: R-Alexandre closed this issue 1 year ago.
Zeroth issue: As you probably already figured out, the Ptolemy module uses Ptolemy coordinates natively and can convert to shapes (see https://arxiv.org/abs/1207.6711v1) parameterizing a PGL(3,C)-representation. is_pu_2_1_representation checks the conditions outlined in https://arxiv.org/abs/1307.669 (Proposition 3.5) to see whether the given PGL(3,C)-representation factors through a PU(2,1)-representation under some embedding PU(2,1)->PGL(3,C). The Ptolemy module cannot give you the PU(2,1)-matrices.
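For orientation, here is a minimal sketch of that pipeline, using only calls that appear further down in this thread; the manifold, obstruction class and tolerance are illustrative choices rather than recommendations:

```python
# Minimal sketch: one Ptolemy solution -> shapes (cross ratios) -> PU(2,1) test.
# The manifold, obstruction class and tolerance 1e-6 are illustrative choices.
from snappy import Manifold

M = Manifold("m004")
sol = M.ptolemy_variety(3, 1).retrieve_solutions(numerical=True)[0][0]  # one numerical Ptolemy solution
z = sol.cross_ratios()                   # shapes parameterizing the PGL(3,C)-representation
print(z.is_pu_2_1_representation(1e-6))  # True if the PU(2,1) conditions hold numerically
```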
First issue: The running out of stack might just be because the exact solutions involve polynomials with gigantic coefficients, since the code computes the solution by taking field extensions whenever there is a polynomial in the lexicographic Groebner basis that is non-linear in more than one variable. I didn't know about the "Rational Univariate Representation" (RUR) back then. giac now supports RUR, so if you need exact solutions, I recommend giving it a try.
As for the matrix determinant assertion error: that seems like a bug. What manifold did that occur with - so that I can further investigate?
You can also give `numerical=True` to `retrieve_solutions` to avoid computing exact solutions. The only caveat is that you sometimes need to use high precision (with pari.set_real_precision) because evaluating a polynomial near a root of another one is unstable. Ideally one would use interval arithmetic to know whether the chosen precision was good enough. Here is some experimental code that computes the Groebner basis and the RUR and uses interval arithmetic to obtain intervals for the complex volumes: https://github.com/3-manifolds/SnapPy/blob/master/dev/extended_ptolemy/complexVolumesClosed.py
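For example, a minimal sketch of the numerical route (assuming `pari` can be imported from snappy; the value 100 is just an illustrative choice):

```python
# Raise the pari precision before asking for numerical solutions, since
# evaluating one polynomial near a root of another is unstable at low precision.
from snappy import Manifold, pari   # assumption: pari is importable from snappy

pari.set_real_precision(100)        # illustrative choice; increase if needed
sols = Manifold("m004").ptolemy_variety(3, 'all').retrieve_solutions(numerical=True)
```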
Second issue: Sometimes there are multiple ways of parametrizing the same PSL(2,C)-representation by Ptolemy coordinates - in other words (assuming the representations are not boundary-degenerate), the same shape coordinates can be lifted to Ptolemy coordinates in multiple ways (see https://www.math.uic.edu/t3m/SnapPy/ptolemy_classes.html#snappy.ptolemy.ptolemyVariety.PtolemyVariety.degree_to_shapes). When generating the webpages, I think I was counting the number of PGL-representations rather than the number of solutions to the Ptolemy variety, by computing the shapes from the Ptolemy coordinates and comparing the shapes numerically. The code to do the latter is not part of what ships with SnapPy, though.
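The idea is roughly the following. This is a hedged sketch and not the code that generated the webpages; it assumes the numerical cross-ratio solutions behave like dictionaries mapping variable names to complex values:

```python
# Rough sketch of "comparing the shapes numerically": treat two shape solutions
# as the same PGL-representation if all their cross ratios agree up to epsilon.
# NOT the code used for the webpages, just an illustration of the idea.
def count_distinct_shape_solutions(cross_ratio_solutions, epsilon=1e-9):
    representatives = []
    for sol in cross_ratio_solutions:
        values = [complex(sol[key]) for key in sorted(sol.keys())]   # assumes dict-like solutions
        already_seen = any(
            max(abs(a - b) for a, b in zip(values, other)) < epsilon
            for other in representatives)
        if not already_seen:
            representatives.append(values)
    return len(representatives)
```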
Thank you very much for your answer
> Zeroth issue: As you probably already figured out, the Ptolemy module uses Ptolemy coordinates natively and can convert to shapes (see https://arxiv.org/abs/1207.6711v1) parameterizing a PGL(3,C)-representation. is_pu_2_1_representation checks the conditions outlined in https://arxiv.org/abs/1307.669 (Proposition 3.5) to see whether the given PGL(3,C)-representation factors through a PU(2,1)-representation under some embedding PU(2,1)->PGL(3,C). The Ptolemy module cannot give you the PU(2,1)-matrices.
So I have to think of those matrices as living only in PGL(3,C), and assume some conjugation?
> The running out of stack might just be because the exact solutions involve polynomials with gigantic coefficients, since the code computes the solution by taking field extensions whenever there is a polynomial in the lexicographic Groebner basis that is non-linear in more than one variable. I didn't know about the "Rational Univariate Representation" (RUR) back then. giac now supports RUR, so if you need exact solutions, I recommend giving it a try.
Can you please give me a link to see how to do this?
> As for the matrix determinant assertion error: that seems like a bug. What manifold did that occur with - so that I can further investigate?
m027, for example. I think I got the same issue with other manifolds with 4 tetrahedra.
> You can also give `numerical=True` to `retrieve_solutions` to avoid computing exact solutions. The only caveat is that you sometimes need to use high precision (with pari.set_real_precision) because evaluating a polynomial near a root of another one is unstable. Ideally one would use interval arithmetic to know whether the chosen precision was good enough. Here is some experimental code that computes the Groebner basis and the RUR and uses interval arithmetic to obtain intervals for the complex volumes: https://github.com/3-manifolds/SnapPy/blob/master/dev/extended_ptolemy/complexVolumesClosed.py
Indeed, I use pari.set_real_precision to get better results.
> Second issue: Sometimes there are multiple ways of parametrizing the same PSL(2,C)-representation by Ptolemy coordinates - in other words (assuming the representations are not boundary-degenerate), the same shape coordinates can be lifted to Ptolemy coordinates in multiple ways (see https://www.math.uic.edu/t3m/SnapPy/ptolemy_classes.html#snappy.ptolemy.ptolemyVariety.PtolemyVariety.degree_to_shapes). When generating the webpages, I think I was counting the number of PGL-representations rather than the number of solutions to the Ptolemy variety, by computing the shapes from the Ptolemy coordinates and comparing the shapes numerically. The code to do the latter is not part of what ships with SnapPy, though.
How do you compare those shapes? I would be interested in a way to do so.
That explains why I sometimes get larger numbers. But why do I sometimes get smaller numbers than on your webpage?
> So I have to think of those matrices as living only in PGL(3,C), and assume some conjugation?
The shapes only give you a PGL(3,C)-representation up to conjugation to begin with. The Ptolemy module is making some choices when giving you the matrices with, e.g.,

```python
M = Manifold("m004")
M.ptolemy_variety(3, 1).retrieve_solutions(numerical=True)[0][0].evaluate_word('a', M.fundamental_group())
```
Those choices are most likely not the ones that make it so that a representation that can be conjugated into PU(2,1) will be in PU(2,1).
I guess there is some way of using the arguments in https://arxiv.org/abs/1307.669 to explicitly get a PU(2,1)-decoration of the tetrahedra and compute the matrices, but I am not sure anyone ever worked that out.
> Those choices are most likely not the ones that make it so that a representation that can be conjugated into PU(2,1) will be in PU(2,1).
>
> I guess there is some way of using the arguments in https://arxiv.org/abs/1307.669 to explicitly get a PU(2,1)-decoration of the tetrahedra and compute the matrices, but I am not sure anyone ever worked that out.
Ok that is now clear to me. Thanks
I am looking at the second issue now, in particular, at `M = Manifold("m011")`. We have that `M.ptolemy_variety(3,0).degree_to_shapes()` is 1, so a (non-boundary-degenerate) PGL(3,C)-representation is parameterized by only one and not multiple points in the (reduced) Ptolemy variety - so the potential issue I mentioned earlier is not a problem for this particular manifold.
However, I noticed that you really need to use higher precision to get reliable results.
To get the number of PU(2,1)-representations, you can do

```python
len([x
     for x in M.ptolemy_variety(3, 'all').retrieve_solutions(numerical=True).flatten(2).cross_ratios()
     if x.is_pu_2_1_representation(1e-10)])
```

or (which is much slower)

```python
len([x
     for x in M.ptolemy_variety(3, 'all').retrieve_solutions().numerical().flatten(2).cross_ratios()
     if x.is_pu_2_1_representation(1e-10)])
```
Remark: Your code looks fine, but I am using `flatten`, which turns a list of lists (of lists...) into just a list, here to avoid writing nested `for`-loops. I have some more remarks below.
With `pari.set_real_precision(15)`, I get 15, and with `pari.set_real_precision(100)`, I get 21, which is the correct answer (well, modulo the fact that the gluing equations are triangulation-dependent and might miss some representations).
The problem is that the numerical method is evaluating polynomials near a root of another polynomial, which is unstable.
To increase your confidence, you can do a sanity check that the numerical values are solving the gluing equations up to a small epsilon:
```python
sols = M.ptolemy_variety(3, 'all').retrieve_solutions(numerical=True).flatten(2).cross_ratios()
dummy = sols.check_against_manifold(epsilon=1e-50)
len([x for x in sols if x.is_pu_2_1_representation(1e-10)])
```
The second line throws an exception if any gluing equation is violated by more than 1e-50.
Remark: I recommend using `dimension` to skip the higher-dimensional components of a Ptolemy variety, e.g.,

```python
[[component
  for component in per_obstruction       # go through all components in the variety
  if component.dimension == 0]           # that are zero-dimensional
 for per_obstruction                     # go through all obstruction classes
 in M.ptolemy_variety(3, 'all').retrieve_solutions(numerical=True)]
```
Remark: Instead of doing

```python
for i in range(len(myList)):
    myElement = myList[i]
    ...
```

you can do

```python
for i, myElement in enumerate(myList):
    ...
```
About the first issue: The original exception was that pari was running out of memory, but it was then masked by the AssertionError, which is misleading. I just pushed a commit fixing that. I could finish the exact computation for m027 for PGL(3,C) after increasing the pari memory to 16007168 bytes by calling pari.allocatemem() often enough.
I'll try to figure out how difficult it would be to get giac's RUR to process the Groebner basis we already computed.
Hi,
Thanks a lot for the time you spent on this. I reproduced your computations and it works fine. (And thanks for the Python advice!)
Now I have a doubt: when I use `retrieve_solutions(numerical=True)`, are my matrices less accurate or not?
(Since I reproduced the table: note that I needed a pari precision of 200 for m060.)
My guess is that the matrices will be more accurate if you do `retrieve_solutions().numerical()` instead of `retrieve_solutions(numerical=True)` given the same `pari.set_real_precision()` setting. But either way, it is probably much cheaper to increase `pari.set_real_precision()` to make both precise enough than to compute the intermediate exact solution, which requires computing the number field from the lexicographic Groebner basis and expressing everything in that number field.
The HTML-generating code ran a loop that started with a precision of 100 and doubled it until `check_against_manifold(epsilon=1e-50)` succeeded. So 200 is probably also what was used for m060 in its table.
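For reference, a rough sketch of such a loop (a hypothetical helper, not the actual HTML-generating code; it assumes `pari` has been imported, e.g. via `from snappy import pari`, and reuses the `check_against_manifold` sanity check from above):

```python
# Hypothetical helper mimicking the precision-doubling loop described above:
# double the pari precision until the numerical cross ratios satisfy the
# gluing equations to within 1e-50.
def retrieve_with_enough_precision(M, N=3, start_precision=100):
    precision = start_precision
    while True:
        pari.set_real_precision(precision)
        sols = M.ptolemy_variety(N, 'all').retrieve_solutions(
            numerical=True).flatten(2).cross_ratios()
        try:
            sols.check_against_manifold(epsilon=1e-50)   # raises if a gluing equation is violated
            return sols
        except Exception:
            precision *= 2                               # not precise enough yet; double and retry
```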
> > giac now supports RUR, so if you need exact solutions, I recommend giving it a try.
>
> Can you please give me a link to see how to do this?
You might have a look at this one: https://github.com/3-manifolds/SnapPy/blob/master/dev/extended_ptolemy/giac_rur.py
It might be a bit fiddly to convert the way the Ptolemy module represents a Groebner basis into giac's representation and then convert the way giac represents the RUR back to the Ptolemy module. If I find time, I'll have a look - but I can't promise that I'll find time soon.
> My guess is that the matrices will be more accurate if you do `retrieve_solutions().numerical()` instead of `retrieve_solutions(numerical=True)` given the same `pari.set_real_precision()` setting. But either way, it is probably much cheaper to increase `pari.set_real_precision()` to make both precise enough than to compute the intermediate exact solution, which requires computing the number field from the lexicographic Groebner basis and expressing everything in that number field.
>
> The HTML-generating code ran a loop that started with a precision of 100 and doubled it until `check_against_manifold(epsilon=1e-50)` succeeded. So 200 is probably also what was used for m060 in its table.
Oh yeah, much much much cheaper indeed. So that's great, thank you.
> You might have a look at this one: https://github.com/3-manifolds/SnapPy/blob/master/dev/extended_ptolemy/giac_rur.py
>
> It might be a bit fiddly to convert the way the Ptolemy module represents a Groebner basis into giac's representation and then convert the way giac represents the RUR back to the Ptolemy module. If I find time, I'll have a look - but I can't promise that I'll find time soon.
I'll try to take a look, thanks.
Also, it has come to my attention that someone has already done the work of computing the conjugation matrices to get a (true) PU(2,1)-representation. I'll keep you posted when I know more about it.
Hi again,
I have an issue with some manifolds: m044, m052, m055, m060, m069, m070.
I get the same error message for each one of them. Probably just missing files:
```
Traceback (most recent call last):
  File "/users/home/alexandrer/.local/lib/python3.6/site-packages/snappy/ptolemy/ptolemyVariety.py", line 905, in _retrieve_url
    s = urlopen(url)
  File "/usr/lib/python3.6/urllib/request.py", line 223, in urlopen
    return opener.open(url, data, timeout)
  File "/usr/lib/python3.6/urllib/request.py", line 532, in open
    response = meth(req, response)
  File "/usr/lib/python3.6/urllib/request.py", line 642, in http_response
    'http', request, response, code, msg, hdrs)
  File "/usr/lib/python3.6/urllib/request.py", line 570, in error
    return self._call_chain(*args)
  File "/usr/lib/python3.6/urllib/request.py", line 504, in _call_chain
    result = func(*args)
  File "/usr/lib/python3.6/urllib/request.py", line 650, in http_error_default
    raise HTTPError(req.full_url, code, msg, hdrs, fp)
urllib.error.HTTPError: HTTP Error 404: Not Found

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "rph-knot.py", line 94, in <module>
    for x in M.ptolemy_variety(3,'all').retrieve_solutions(numerical=True,verbose=False).flatten(2).cross_ratios()
  File "/users/home/alexandrer/.local/lib/python3.6/site-packages/snappy/ptolemy/manifoldMethods.py", line 165, in retrieve_solutions
    for p in self ])
  File "/users/home/alexandrer/.local/lib/python3.6/site-packages/snappy/ptolemy/manifoldMethods.py", line 165, in <listcomp>
    for p in self ])
  File "/users/home/alexandrer/.local/lib/python3.6/site-packages/snappy/ptolemy/ptolemyVariety.py", line 558, in retrieve_solutions
    verbose = verbose)
  File "/users/home/alexandrer/.local/lib/python3.6/site-packages/snappy/ptolemy/ptolemyVariety.py", line 520, in _retrieve_solution_file
    return _retrieve_url(url)
  File "/users/home/alexandrer/.local/lib/python3.6/site-packages/snappy/ptolemy/ptolemyVariety.py", line 913, in _retrieve_url
    "%s" % (url, e))
RuntimeError: Problem connecting to server while retrieving http://ptolemy.unhyperbolic.org/data/pgl3/OrientableCuspedCensus/04_tetrahedra/m052__sl3_c0.magma_out: HTTP Error 404: Not Found

PARI stack size set to 10000000000 bytes, maximum size set to 10000003072
```
I'll focus on m044, but the other cases should be similar. For some of the obstruction classes, the Groebner basis of the radical decomposition of the Ptolemy variety was so hard to compute that magma couldn't do it. Fabrice Rouillier has a couple more tricks up his sleeve and provided us with the rational univariate representation for the radical decomposition instead.
Thus, you can get the solutions with

```python
M.ptolemy_variety(3, 0).retrieve_solutions(prefer_rur=True, numerical=True)   # Provided by Fabrice
M.ptolemy_variety(3, 1).retrieve_solutions(prefer_rur=False, numerical=True)  # Computed by Magma
```
Ideally, you wouldn't need to try the two truth values of `prefer_rur`; Ptolemy should automatically try both - but, alas, the behavior of python's `urllib` changed and this automatic behavior broke between python 2.x and python 3.x. I just pushed a commit to fix it. Unless you want to compile SnapPy from source or wait until the release of SnapPy 2.8, you would need to do something like
```python
try:
    sol = M.ptolemy_variety(3, 0).retrieve_solutions(prefer_rur=False, numerical=True)
except:
    sol = M.ptolemy_variety(3, 0).retrieve_solutions(prefer_rur=True, numerical=True)
```
to do it automatically.
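If you want to do this separately for each obstruction class (a single missing file only affects that class), here is a hedged sketch of a helper; it relies on the fact that the object returned by `ptolemy_variety(N, 'all')` is iterable over the individual varieties, as the traceback above suggests:

```python
# Hypothetical helper: try the magma solutions first and fall back to the RUR
# provided by Fabrice Rouillier, separately for each obstruction class.
def retrieve_all_with_fallback(M, N=3, numerical=True):
    all_solutions = []
    for variety in M.ptolemy_variety(N, 'all'):
        try:
            sols = variety.retrieve_solutions(prefer_rur=False, numerical=numerical)
        except Exception:
            sols = variety.retrieve_solutions(prefer_rur=True, numerical=numerical)
        all_solutions.append(sols)
    return all_solutions
```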
`M.ptolemy_variety(3,0).retrieve_solutions(prefer_rur=True)` returns exact solutions, but as rational functions in the generator of the number field instead of polynomials in it. You can do `M.ptolemy_variety(3,0).retrieve_solutions(prefer_rur=True).to_PUR()` to convert them to the latter, but doing the division over the number field will give huge polynomials.
Thanks for the help. The timing is great, I'm talking with Fabrice this afternoon ;)
Strangely, I can do `M.ptolemy_variety(3,'all').retrieve_solutions(prefer_rur=True)` with Python 2 but not with Python 3 ... is that expected?
Also, about precision: does something change with the choice of the RUR?
> Strangely, I can do `M.ptolemy_variety(3,'all').retrieve_solutions(prefer_rur=True)` with Python 2 but not with Python 3 ... is that expected?

Yes:

> the behavior of python's `urllib` changed and this automatic behavior broke between python 2.x and python 3.x

> Also, about precision: does something change with the choice of the RUR?
It probably gets better. The number-field defining polynomial you get from RUR should have smaller coefficients than the one from the lexicographic Groebner basis.
> The timing is great, I'm talking with Fabrice this afternoon ;)
Say hi!
Closing as this is an old issue and appears to have been resolved. We can reopen if you are still experiencing problems.
Hi (again),
Full disclosure: I'm currently working on PU(2,1) representations furnished by the Ptolemy variety (http://ptolemy.unhyperbolic.org/html/summary.html). Thank you for this work ... it helps me a lot with my research.
I have some issues with the reproducibility of the tables for PGL(3,C) representations.
Zeroth issue: coordinates
This is not really an issue, but a question whose answer I didn't find in the documentation: what is the choice of coordinates when talking about PU(2,1) representations?
First issue: retrieve_solutions()
First, I had difficulty retrieving the solutions. The code

```python
snappy.Manifold('m027').ptolemy_variety(3, 'all').retrieve_solutions()
```

raises an AssertionError.
It depends on the choice of the manifold. I didn't have any issues with manifolds requiring 2 or 3 tetrahedra. But when I tried manifolds with 4 tetrahedra, lots of them give me something like the following:

```
Warning: increasing stack size to 8003584.
```

which is not an issue. But for others, I have to increase the stack size considerably ... and it bothers me a bit (is it truly necessary?).
Second issue: is_pu_2_1_representation()
Thanks to the method _is_pu_2_1_representation(), I can test whether a representation is in PU(2,1). Unfortunately, I sometimes don't retrieve the same number of solutions as listed at http://ptolemy.unhyperbolic.org/html/pgl3/OrientableCuspedCensus/03_tetrahedra/summary.html.
I count the number of solutions in PU(2,1) with the following code (which also gives the coordinates of the representations in PU(2,1), in order to compare with the table at the preceding link).
Probably my code is not right ... but since I'm not always wrong, I don't understand what is not right.
Thanks for your help :)