Open · tsung-wei-huang opened this issue 2 years ago
Not started yet.
I just finished reading the data line by line and working on creating the data structure for net<->cells.
Current progress:
I have parsed the file into a dictionary with NETNAME as the key and the cells connected to the net as the value. I have also parsed the r value from the file and used it to calculate the balance criterion. From there, I split the dictionary into two distinct dictionaries, one per partition.
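For reference, a minimal sketch of how the balance criterion could be computed from r, assuming the common FM formulation where each partition must hold between (1 - r)/2 and (1 + r)/2 of the total cells (the exact formula in the assignment may differ):

```python
def balance_bounds(num_cells, r):
    """Return (lo, hi) bounds on one partition's size for ratio factor r.

    Assumes the usual FM criterion: (1 - r)/2 * n <= |A| <= (1 + r)/2 * n.
    """
    lo = (1 - r) / 2 * num_cells
    hi = (1 + r) / 2 * num_cells
    return lo, hi

def is_balanced(size_a, num_cells, r):
    """Check whether a partition of size_a cells satisfies the criterion."""
    lo, hi = balance_bounds(num_cells, r)
    return lo <= size_a <= hi
```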
Future plans: use the FM algorithm discussed in class to calculate the gain for each cell and move cells accordingly. My plan is to iterate over both partitions to compute each cell's gain and store it in an array indexed by gain, where each index holds a linked list of the cells with that gain. The algorithm will choose any cell that meets the balance criterion and has the highest gain. Once a cell moves to the other partition, it will be locked, and the algorithm will check whether a cell has been locked before choosing it.
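The gain-array-of-lists idea above can be sketched roughly as follows. `GainBucket`, `pmax`, and the plain-list buckets are illustrative stand-ins, not the assignment's required interface; a real FM bucket list would use doubly linked lists so removal is O(1):

```python
from collections import defaultdict

class GainBucket:
    """Bucket array keyed by gain; each bucket holds the free cells at that gain.

    pmax is the maximum possible gain (the most nets on any one cell), so
    valid gains lie in [-pmax, pmax]. Locked cells are skipped when popping.
    """
    def __init__(self, pmax):
        self.pmax = pmax
        self.buckets = defaultdict(list)  # gain -> cells (stand-in for a linked list)
        self.gain = {}                    # cell -> current gain
        self.locked = set()

    def insert(self, cell, gain):
        self.gain[cell] = gain
        self.buckets[gain].append(cell)

    def update(self, cell, delta):
        """Move a cell to the bucket for its new gain."""
        old = self.gain[cell]
        self.buckets[old].remove(cell)
        self.insert(cell, old + delta)

    def pop_max(self):
        """Return (cell, gain) for the free cell with the highest gain, locking it."""
        for g in range(self.pmax, -self.pmax - 1, -1):
            for cell in self.buckets[g]:
                if cell not in self.locked:
                    self.buckets[g].remove(cell)
                    self.locked.add(cell)
                    return cell, g
        return None, None
```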
I have parsed the nets/cells and use them to create a simple initial solution.
TODO: implement the actual algorithm and bucket data structure
Currently: working on updating the bucket after every tentative move.
Current progress:
Implemented my own data structure to store a given circuit.
TODOs:
I have parsed the nets and cells, calculated and updated the gain, and finished the initial partition.
The file is now parsed into a map from net names to cells. The next step is to implement data structures for nets and cells and begin running FM passes.
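The net-name-to-cells map could look something like the sketch below; the `NET <name> <cell> ...` line format is a guess for illustration only and should be adapted to the actual input syntax:

```python
def parse_nets(path):
    """Parse a netlist file into {net_name: [cell, ...]}.

    Assumes a hypothetical format where each net occupies one line:
    "NET <name> <cell> <cell> ...". Lines not starting with NET are skipped.
    """
    nets = {}
    with open(path) as f:
        for line in f:
            tokens = line.split()
            if not tokens or tokens[0] != "NET":
                continue
            name, cells = tokens[1], tokens[2:]
            nets[name] = cells
    return nets
```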
I have checked all the input files and tried to implement the methods introduced in class. I attempted the partitioning and the gain calculation, but haven't made much progress yet.
Sorry for missing the timeline. I have read the file into the vector. Will work on the algorithm implementation.
I have finished most parts, but data_0, data_4, and data_5 have very long running times.
I ran 1 pass for each input and measured runtime on my desktop. The results are shown below:
Input | cut_size | runtime |
---|---|---|
input_0.dat | 76906 | 22.12s |
input_1.dat | 3121 | 0.02s |
input_2.dat | 5778 | 0.02s |
input_3.dat | 51064 | 2.26s |
input_4.dat | 114202 | 1.40s |
input_5.dat | 333685 | 8.88s |
input_6.dat | 2 | 0.01s |
I randomly partition the cells and apply the FM algorithm. The cutsize does not reduce much no matter how many passes I run. It seems the initial partition matters a lot.
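A balanced random initial partition like the one described can be produced by shuffling and splitting at the midpoint; this is a generic sketch, not any particular poster's code:

```python
import random

def random_partition(cells, seed=0):
    """Randomly split cells into two halves of (near-)equal size.

    Shuffling and cutting at the midpoint keeps the two sides within one
    cell of each other, which satisfies any balance criterion with r >= 0
    for even cell counts (and is off by at most one otherwise).
    """
    rng = random.Random(seed)  # fixed seed for reproducible experiments
    order = list(cells)
    rng.shuffle(order)
    half = len(order) // 2
    return set(order[:half]), set(order[half:])
```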
===UPDATE=== I found a bug in my code; the table below shows the updated results (one pass):
Input | cut_size | runtime |
---|---|---|
input_0.dat | 42239 | 7.69s |
input_1.dat | 2569 | 0.01s |
input_2.dat | 4283 | 0.02s |
input_3.dat | 35052 | 0.35s |
input_4.dat | 99017 | 0.72s |
input_5.dat | 290729 | 2.26s |
input_6.dat | 2 | 0.003s |
I also found that running several passes does indeed improve the cutsize.
@all: ideally, your program should finish all test cases within one minute if you have a decent implementation.
Input | cutsize | runtime |
---|---|---|
input_0.dat | 34879 | 3m31.1s |
input_1.dat | 1474 | 0.028s |
input_2.dat | 2636 | 0.148s |
input_3.dat | 32102 | 18.363s |
input_4.dat | 54724 | 3m2.52s |
input_5.dat | 170105 | 35m42.398s |
input_6.dat | 1 | 0.001s |
The initial partition is random; one pass gives the results above. The runtime is not ideal. According to profiling, updating the gains of the cells connected to a moved cell is the bottleneck. I will work on that part to reduce the runtime.
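For what it's worth, the standard FM remedy for this bottleneck is to keep a per-net count of cells on each side and touch neighbor gains only when a net is critical (a side count hits or leaves 0 or 1). A rough sketch with illustrative names (`count[net]` is a two-element list of side populations, `side` maps cells to 0/1):

```python
def apply_move(base, cell_nets, net_cells, side, count, gains, locked):
    """Move `base` to the other side, updating only critical-net neighbors.

    Only nets whose side counts cross 0 or 1 change any gains, which keeps a
    pass near O(total pins) instead of recomputing every gain from scratch.
    """
    f, t = side[base], 1 - side[base]
    locked.add(base)  # the base cell is locked before any gain updates
    for net in cell_nets[base]:
        # critical check BEFORE the move (to-side)
        if count[net][t] == 0:
            for c in net_cells[net]:
                if c not in locked:
                    gains[c] += 1
        elif count[net][t] == 1:
            for c in net_cells[net]:
                if side[c] == t and c not in locked:
                    gains[c] -= 1
        count[net][f] -= 1
        count[net][t] += 1
        # critical check AFTER the move (from-side)
        if count[net][f] == 0:
            for c in net_cells[net]:
                if c not in locked:
                    gains[c] -= 1
        elif count[net][f] == 1:
            for c in net_cells[net]:
                if side[c] == f and c not in locked:
                    gains[c] += 1
    side[base] = t
```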
Finished implementing FS / TE calculation, working on the gain bucket data structure.
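FS/TE as described reduces to gain(c) = FS(c) - TE(c); a small sketch with assumed dict-based structures (`cell_nets`, `net_cells`, and `side` are illustrative names):

```python
def cell_gain(cell, cell_nets, net_cells, side):
    """Compute gain(c) = FS(c) - TE(c).

    FS: nets where `cell` is the only cell on its own side (moving uncuts them).
    TE: nets whose cells all sit on `cell`'s side (moving cuts them).
    `side` maps each cell to partition 0 or 1.
    """
    fs = te = 0
    s = side[cell]
    for net in cell_nets[cell]:
        same = sum(1 for c in net_cells[net] if side[c] == s)
        if same == 1:
            fs += 1  # cell is alone on its side of this net
        if same == len(net_cells[net]):
            te += 1  # net lies entirely on cell's side
    return fs - te
```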
Finished parsing and creating the bucket list. Now working on implementing the FM partition process.
Finished the initial partition part. Will work on implementing the one-pass process.
These are the times for finding the best cut size across all passes combined. I am working on making it more efficient; the plan is to make switching partitions faster and to change the way cell gain is calculated.
===UPDATE=== These are the runtimes for one pass only. The initial partition matters a lot; e.g., input_0.dat gets a really bad initial partition, so it takes much longer.
Input | cutsize | runtime |
---|---|---|
input_0.dat | 101601 | 13m 35 s |
input_1.dat | 2959 | 0.1 s |
input_2.dat | 5723 | 0.19 s |
input_3.dat | 59725 | 1.95 s |
input_4.dat | 110468 | 4.14 s |
input_5.dat | 321443 | 12.13 s |
input_6.dat | 1 | 0.000s |
Finished the partition initialization part, still working on the bucket list cell gain update part.
Input | cutsize | runtime |
---|---|---|
input_0.dat | 35669 | 81.74 s |
input_1.dat | 1826 | 0.08 s |
input_2.dat | 3461 | 0.15 s |
input_3.dat | 40293 | 1.54 s |
input_4.dat | 71171 | 3.17 s |
input_5.dat | 215954 | 10.10 s |
input_6.dat | 2 | 0.000 s |
I did not pass test case 0; the cutsize was reported as incorrect. I will work on fixing that.
All data structures should be complete now, and an initial partition is created. The cutsize calculation, though, is off somehow, so it fails the checker. Searching for that bug now.
EDIT @ 12:44 AM: I figured out what was going wrong with the cutsize calculation (I had some conditionals in the wrong order). I can now report my times for the initial random pass, ensuring balance, as follows:
Input | cutsize | runtime |
---|---|---|
input_0.dat | 99381 | 2.030 s |
input_1.dat | 3046 | 0.034 s |
input_2.dat | 5913 | 0.062 s |
input_3.dat | 60109 | 0.795 s |
input_4.dat | 110819 | 1.528 s |
input_5.dat | 328572 | 4.527 s |
input_6.dat | 2 | 0.006 s |
I have switched to C++ and finished parsing, along with the bucket list and the algorithm. Currently having issues with partitioning, since the cells and their nets are not being stored properly.
UPDATE:
Algorithm implemented; the table is as follows (for all passes, not just one):
SECOND UPDATE: updating the table to time one pass. (-O2 helped so much with optimization, thanks TW! Almost like optimizing assembly code leads to more optimal solutions xD) Multiple passes currently result in a balance issue, as well as some tests not finishing at all (inputs 0, 3, 4, 5); 4/6 tests won't finish with multiple passes. Bug fixed.

input | cutsize | runtime |
---|---|---|
input_0.dat | 101608 | 3.23s |
input_1.dat | 2960 | 0.03s |
input_2.dat | 5725 | 0.06s |
input_3.dat | 59727 | 0.70s |
input_4.dat | 110470 | 1.57s |
input_5.dat | 321443 | 5.57s |
input_6.dat | 2 | 0.001s |
All passes:

input | cutsize | runtime |
---|---|---|
input_0.dat | ? | ? |
input_1.dat | 2524 | 0.68s |
input_2.dat | 5303 | 0.95s |
input_3.dat | ? | ? |
input_4.dat | ? | ? |
input_5.dat | ? | ? |
input_6.dat | 1 | 0.001s |
Please respond to this issue page with your current results using the following table format:
You should only report results that passed the checker program.