bdusell/semiring-einsum
Generic PyTorch implementation of einsum that supports different semirings
https://bdusell.github.io/semiring-einsum/
MIT License · 43 stars · 8 forks
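For context on the description above: an einsum "over a different semiring" swaps the usual sum and product for another semiring's addition and multiplication, e.g. (logsumexp, +) in the log semiring, which keeps probability computations in log space. A minimal pure-Python sketch of the idea for a matrix product (an illustration only, not this library's actual API):

```python
import math

def semiring_matmul(A, B, plus, times, zero):
    """Matrix product where sum/product are replaced by a semiring's plus/times."""
    n, k, m = len(A), len(B), len(B[0])
    C = [[zero] * m for _ in range(n)]
    for i in range(n):
        for j in range(m):
            acc = zero
            for l in range(k):
                acc = plus(acc, times(A[i][l], B[l][j]))
            C[i][j] = acc
    return C

def logaddexp(x, y):
    # Numerically stable log(exp(x) + exp(y)).
    if x == -math.inf:
        return y
    if y == -math.inf:
        return x
    hi, lo = max(x, y), min(x, y)
    return hi + math.log1p(math.exp(lo - hi))

A = [[0.1, 0.9], [0.5, 0.5]]
B = [[0.8, 0.2], [0.3, 0.7]]

# Ordinary real semiring: plus is +, times is *, additive identity 0.
real = semiring_matmul(A, B, lambda a, b: a + b, lambda a, b: a * b, 0.0)

# Log semiring on log-space inputs: plus is logaddexp, times is +, identity -inf.
logA = [[math.log(x) for x in row] for row in A]
logB = [[math.log(x) for x in row] for row in B]
logC = semiring_matmul(logA, logB, logaddexp, lambda a, b: a + b, -math.inf)

# Exponentiating the log-semiring result recovers the ordinary product.
assert all(abs(math.exp(logC[i][j]) - real[i][j]) < 1e-12
           for i in range(2) for j in range(2))
```

Several of the issues below (block_size, AutomaticBlockSize, log_viterbi_einsum) concern how the library evaluates such products in memory-bounded blocks rather than all at once.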
Issues
| #   | Title | Author | Status | Comments |
|-----|-------|--------|--------|----------|
| #44 | Pytorch 2 Compatible | AutomaticHourglass | opened 4 months ago | 0 |
| #43 | Doesn't work with PyTorch 2? | davidweichiang | closed 1 week ago | 3 |
| #42 | Bump urllib3 from 1.26.12 to 1.26.18 | dependabot[bot] | opened 8 months ago | 0 |
| #41 | Bump urllib3 from 1.26.12 to 1.26.17 | dependabot[bot] | closed 8 months ago | 1 |
| #40 | Bump certifi from 2022.6.15.1 to 2023.7.22 | dependabot[bot] | opened 11 months ago | 0 |
| #39 | Bump future from 0.18.2 to 0.18.3 | dependabot[bot] | opened 11 months ago | 0 |
| #38 | Bump certifi from 2022.6.15.1 to 2022.12.7 | dependabot[bot] | closed 11 months ago | 1 |
| #37 | Subtract used memory from max_cpu_bytes in AutomaticBlockSize | ccshan | opened 1 year ago | 0 |
| #36 | In log forward, include tensors saved for backward in size calculation | bdusell | opened 1 year ago | 0 |
| #35 | Use minimum block size when memory is not enough | ccshan | closed 1 year ago | 1 |
| #34 | Make AutomaticBlockSize work with torch.bool tensors | ccshan | closed 1 year ago | 0 |
| #33 | Avoid the call to permute in LookupInfo.lookup | ccshan | opened 1 year ago | 1 |
| #32 | (feat): Support For Complex Numbers | ilan-gold | closed 1 year ago | 5 |
| #31 | More than 26 indices? | davidweichiang | opened 1 year ago | 3 |
| #30 | Document log_viterbi_einsum | davidweichiang | closed 1 year ago | 0 |
| #29 | Correct amax(dim=()) behavior | davidweichiang | closed 2 years ago | 0 |
| #28 | `log_viterbi_einsum_forward` won't work if there are no summed variables | bdusell | closed 1 year ago | 3 |
| #27 | Bump pygments from 2.5.1 to 2.7.4 | dependabot[bot] | closed 2 years ago | 1 |
| #26 | Bump babel from 2.7.0 to 2.9.1 | dependabot[bot] | closed 2 years ago | 1 |
| #25 | Bump urllib3 from 1.25.7 to 1.26.5 | dependabot[bot] | closed 2 years ago | 1 |
| #24 | Bump pyyaml from 5.1.2 to 5.4 | dependabot[bot] | closed 2 years ago | 1 |
| #23 | Bump numpy from 1.17.4 to 1.21.0 | dependabot[bot] | closed 2 years ago | 1 |
| #22 | out= option | davidweichiang | opened 2 years ago | 2 |
| #21 | Better error messages | davidweichiang | opened 2 years ago | 0 |
| #20 | Use torch.amax if available | davidweichiang | closed 2 years ago | 0 |
| #19 | Add py.typed | davidweichiang | closed 1 year ago | 2 |
| #18 | Fix bug in log_viterbi_einsum_forward when result has 0 dims | davidweichiang | closed 1 year ago | 2 |
| #17 | Zerodim argmax | davidweichiang | closed 1 year ago | 3 |
| #16 | log_viterbi_einsum_forward raises exception with no summed-out variables | davidweichiang | closed 1 year ago | 0 |
| #15 | Setuptools | davidweichiang | closed 2 years ago | 0 |
| #14 | Change clip_max_values to check for +inf as well. | davidweichiang | closed 2 years ago | 1 |
| #13 | log_einsum returns nan instead of inf | davidweichiang | closed 1 year ago | 0 |
| #12 | Boolean semiring | davidweichiang | opened 2 years ago | 3 |
| #11 | Derived functions | davidweichiang | opened 2 years ago | 1 |
| #10 | Broadcasting | davidweichiang | opened 2 years ago | 3 |
| #9  | Allow repeated and unmatched output indices? | davidweichiang | opened 2 years ago | 2 |
| #8  | Allow zero-dimensional arguments | davidweichiang | closed 2 years ago | 0 |
| #7  | Zero-dimensional tensors | davidweichiang | closed 2 years ago | 0 |
| #6  | Default block_size | davidweichiang | closed 1 year ago | 5 |
| #5  | document log_einsum and (renamed) log_viterbi_einsum | davidweichiang | closed 1 year ago | 4 |
| #4  | Folding trick | bdusell | opened 3 years ago | 0 |
| #3  | [feature request] support multiple simultaneous right-hand-sides | teichert | opened 3 years ago | 10 |
| #2  | sync examples regarding specifying block_size | teichert | closed 3 years ago | 1 |
| #1  | Question about how to choose block_size | speedcell4 | closed 3 years ago | 2 |