Closed xiephysics closed 2 years ago
Hi,
Please can you clarify the sparsity of the matrix, what format it is in, and where exactly the "Maximum allowed dimension exceeded" error is thrown? A sample of the code you ran would be ideal.
Also, try running the auction_solve function with cardinality_check=False
to skip the cardinality check if you know that your matrix will have a valid solution.
Thanks.
Thank you for your kind reply. Here is my code, which gets the "Maximum allowed dimension exceeded" error even when I run the auction_solve function with cardinality_check=False.
```python
import numpy as np
from sslap import auction_solve

mat = np.random.uniform(0, 1, (60000, 60000)).astype(np.float64)
sol = auction_solve(mat, problem='min', cardinality_check=False)
col_ind = sol['sol']
```
I guess it is because your sslap library is for super sparse matrices, am I right? Do you have any suggestions on how to solve large dense matrix assignment problems? Thanks.
Hi,
At float64 size, this matrix is incredibly large: almost 30 GB of memory is required! You should definitely consider changing the data type, or rethinking whether you need a matrix this large.
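For context, the back-of-the-envelope arithmetic behind that figure (a minimal sketch, not part of the library):

```python
# Memory needed to hold a dense 60000 x 60000 matrix of float64 values.
rows = cols = 60_000
bytes_per_float64 = 8  # a C double is 8 bytes
total_bytes = rows * cols * bytes_per_float64
print(total_bytes / 1e9)  # 28.8 (GB)
```

And that is for one copy; any intermediate copies made while building or converting the matrix multiply the requirement.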
Regarding the algorithm, yes it works better for super sparse matrices. You'll find that most solvers will struggle with matrices that large. Do you definitely require every value in your matrix?
What I am trying to do is similar to assigning 60000 jobs to 60000 workers. If I make the matrix sparse, the precision will decrease; what do you think? I tried converting the dataset to float32, but when I run it, I get a "Buffer dtype mismatch, expected 'double' but got 'float'" error.
```python
import numpy as np
from sslap import auction_solve

mat = np.random.uniform(0, 1, (60000, 60000)).astype(np.float32)
sol = auction_solve(mat, problem='min')  # raises: Buffer dtype mismatch
col_ind = sol['sol']
```
Yes, the code is written for float64, so it won't work with float32. Even so, a float32 matrix of your size would still be far too large for the algorithm to run effectively.
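If memory were not the blocker, the dtype mismatch itself would be fixed by casting back to float64 before calling the solver. A minimal sketch (the small shape here is just for illustration):

```python
import numpy as np

mat32 = np.random.uniform(0, 1, (4, 4)).astype(np.float32)
# The Cython solver expects C doubles ('double'), so cast first:
mat64 = mat32.astype(np.float64)
print(mat64.dtype)  # float64
```

Note that `astype` allocates a new array, so at 60000 x 60000 this cast would itself need the full ~29 GB on top of the float32 copy.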
What do you mean by "precision will decrease"? Can you explain exactly what the 'costs' are for each worker being assigned a job? Maybe there is a way to disregard large numbers of possible assignments?
The cost for each worker being assigned a job is a log conditional probability, i.e. log p(worker | job); all of the costs range from 540 to 550. For now, I can't see a way to disregard large numbers of possible assignments.
Okay. Since this is a dense problem, I would advise using a library or solver better suited to large dense matrices, likely requiring GPUs for the computation.
It might be worth considering making your matrix sparse by, for example, disregarding any cost above 545. But that is up to you and your specific problem.
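A minimal sketch of that thresholding idea, using plain NumPy. The threshold 545 comes from the discussion above; the use of -1 as a "forbidden assignment" marker is an assumption about sslap's input convention, so verify it against the project's README before relying on it:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical stand-in for the 60000 x 60000 cost matrix;
# real costs were said to range from 540 to 550.
costs = rng.uniform(540.0, 550.0, size=(100, 100))

# Disregard expensive assignments above the threshold.
# Assumption: sslap treats negative entries as "no edge" --
# check the library's documentation before using this convention.
threshold = 545.0
masked = np.where(costs <= threshold, costs, -1.0)

# The masked matrix would then be passed to the solver, e.g.
#   sol = auction_solve(masked, problem='min')
print((masked < 0).mean())  # fraction disregarded, roughly 0.5 here
```

Since the costs are uniform over [540, 550], a cutoff at 545 discards about half the entries; a tighter cutoff would make the problem sparser still, at the risk of leaving some workers with no feasible job.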
I will mark this as closed as it is not a sparse problem - I hope you manage to solve it!
Hi, when I use auction_solve to run a 60000 x 60000 matrix, I get a "Maximum allowed dimension exceeded" error. I checked your code in sslap/auction.pyx, but I didn't find any limit on the maximum dimension. Would you kindly tell me how to solve it? Thank you so much.