adsabs / export_service

Export service to output ADS records in various formats, including BibTeX, AASTeX, and multiple tagged and XML options
MIT License

Missing fields in RIS format #74

Closed: aaccomazzi closed this issue 5 years ago

aaccomazzi commented 6 years ago

We are currently not outputting the eprint ID and the ISSN. See the classic output for e.g. 2018ITGRS..56.3300L.

golnazads commented 6 years ago

There is no ISSN for this bibcode in Solr. Am I missing something?

golnazads commented 6 years ago

The eprint ID currently works only for arXiv records; once the identifier fix goes to production, it should work for every record. (A rough extraction sketch follows the sample record below.)

TY  - Preprint
T1  - New Convergence Aspects of Stochastic Gradient Algorithms
A1  - Nguyen, Lam M.
A1  - Nguyen, Phuong Ha
A1  - Richtárik, Peter
A1  - Scheinberg, Katya
A1  - Takáč, Martin
A1  - van Dijk, Marten
JO  - eprint arXiv:1811.12403
Y1  - 2018/11/1
SP  - arXiv:1811.12403
KW  - Mathematics - Optimization and Control
KW  - Computer Science - Machine Learning
UR  - https://ui.adsabs.harvard.edu/#abs/2018arXiv181112403N
N2  - The classical convergence analysis of SGD is carried out under the
assumption that the norm of the stochastic gradient is uniformly
bounded. While this might hold for some loss functions, it is violated
for cases where the objective function is strongly convex. In Bottou et
al. (2016), a new analysis of convergence of SGD is performed under the
assumption that stochastic gradients are bounded with respect to the
true gradient norm. We show that for stochastic problems arising in
machine learning such bound always holds; and we also propose an
alternative convergence analysis of SGD with diminishing learning rate
regime, which results in more relaxed conditions than those in Bottou et
al. (2016). We then move on the asynchronous parallel setting, and prove
convergence of Hogwild! algorithm in the same regime in the case of
diminished learning rate. It is well-known that SGD converges if a
sequence of learning rates $\{\eta_t\}$ satisfies $\sum_{t=0}^\infty
\eta_t \rightarrow \infty$ and $\sum_{t=0}^\infty \eta^2_t < \infty$.
We show the convergence of SGD for strongly convex objective function
without using bounded gradient assumption when $\{\eta_t\}$ is a
diminishing sequence and $\sum_{t=0}^\infty \eta_t \rightarrow \infty$.
In other words, we extend the current state-of-the-art class of learning
rates satisfying the convergence of SGD.
C1  - eprint: arXiv:1811.12403
ER  -  
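For illustration, a rough sketch of how the eprint ID might be pulled out of a Solr record once the identifier fix lands. The identifier field name and record layout here are assumptions, not the service's actual schema:

def get_eprint_id(solr_doc):
    # Sketch only: assumes the doc carries an 'identifier' list and
    # that arXiv eprint IDs are prefixed with 'arXiv:'.
    for ident in solr_doc.get('identifier', []):
        if ident.startswith('arXiv:'):
            return ident
    return None

For the record above, this would return 'arXiv:1811.12403', the value that feeds the JO, SP, and C1 tags.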
golnazads commented 6 years ago

Fixed the ISSN; it works if there is an ISSN in Solr.

TY  - JOUR
T1  - On the energetic spectrum of a Mott exciton in ionic cristals
A1  - Moskalenko, S. A.
A1  - Tolpygo, K. B.
JO  - Zhurnal Eksperimental'noi i Teoreticheskoi Fiziki (ISSN 0044-4510), vol. 36,  1959, p. 149. In Russian
VL  - 36
Y1  - 1959/12/1
SP  - 149
KW  - IONIC CRYSTALS
KW  - MOOT's EXCITONS
KW  - ENERGIES SPECTRUM
UR  - https://ui.adsabs.harvard.edu/#abs/1959ZhETF..36..149M
SN  - 0044-4510
ER  -  
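As a point of reference, a minimal sketch of the SN-tag logic, assuming the Solr field is named issn and may hold a list (both assumptions; the service's real schema may differ):

def format_sn_tag(solr_doc):
    # Sketch: emit an 'SN  - <issn>' RIS line, or '' when no ISSN exists.
    issn = solr_doc.get('issn')
    if isinstance(issn, list):
        issn = issn[0] if issn else None
    return 'SN  - %s' % issn if issn else ''

With {'issn': ['0044-4510']}, this yields the SN line shown in the record above.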
golnazads commented 5 years ago

There are 177 records in Solr that include an ISSN indication in the publication string but have no ISSN field. Carolyn fixed 13 of them. I updated the RE in adspy to capture the rest.
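Something along these lines, as a sketch of the kind of RE that can capture an ISSN embedded in pub_raw; the actual pattern in adspy may differ:

import re
# An ISSN is two four-character groups joined by a hyphen; the last
# character may be an 'X' check digit.
ISSN_RE = re.compile(r'ISSN[:\s]*(\d{4}-\d{3}[\dX])', re.IGNORECASE)
pub_raw = ("Zhurnal Eksperimental'noi i Teoreticheskoi Fiziki "
           "(ISSN 0044-4510), vol. 36, 1959, p. 149. In Russian")
match = ISSN_RE.search(pub_raw)
print(match.group(1) if match else None)  # -> 0044-4510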

golnazads commented 5 years ago

Just checked to make sure that 2018ITGRS..56.3300L, the bibcode you noted, has been fixed. However, there is no ISSN in pub_raw: "bibcode":"2018ITGRS..56.3300L", "pub_raw":"IEEE Transactions on Geoscience and Remote Sensing, vol. 56, issue 6, pp. 3300-3310"

I guess I asked about this earlier.

golnazads commented 5 years ago

ISSN fixed in Solr.