jpfielding / gorets

RETS client in Go
GNU General Public License v3.0

Error Request #46

Closed · leonardoo closed this issue 7 years ago

leonardoo commented 7 years ago

Hi, I'm making some requests to a RETS service and after some time I get this error:

Post http://imls.rets.fnismls.com/rets/fnisrets.aspx/IMLS/search: http: ContentLength=766 with Body length

jpfielding commented 7 years ago

Hmm, not sure how I can help given the information provided. Any chance you can provide a wire log of the RETS conversation?

for instance:

```
POST /rets/fnisrets.aspx/IMLS/search HTTP/1.1
Host: imls.rets.paragonrels.com
User-Agent: Top Producer/1.0
Content-Length: 250
Accept: */*
Authorization: Digest username="clownpants", realm="IMLS", nonce="2017-09-12T17:37:21", uri="/rets/fnisrets.aspx/IMLS/search", response="bang", qop=auth, nc=00000007, cnonce="bang", algorithm="MD5"
Content-Type: application/x-www-form-urlencoded
Cookie: RETS-Session-ID=-bang-
Rets-Version: RETS/1.5

Class=VL_2&Format=COMPACT-DECODED&Limit=2500&Query=%28%28L_Status%3D%7C1_0%2C1_1%2C1_2%2C1_3%2C1_5%2C1_6%2C1_7%2C1_8%2C1_9%29%7C%28L_Status%3D%7C3_0%2C3_1%29%29%2C%28L_UpdateDate%3D2017-09-12T17%3A18%3A00.000%2B%29&QueryType=DMQL2&SearchType=Property

HTTP/1.1 200 OK
Cache-Control: private
Transfer-Encoding: chunked
Content-Type: text/xml; charset=utf-8
Expires: Mon, 01 Jan 0001 00:00:00 GMT
RETS-Server: RETS-Paragon/1.0
RETS-Version: RETS/1.5
X-SERVER: A101
Date: Tue, 12 Sep 2017 17:27:32 GMT
Vary: Accept-Encoding

332b
....
```

If I had to _guess_, given your message, I'd say you set `SearchRequest.HTTPMethod="POST"` without having set `SearchParams.HTTPFormEncodedValues=true`.
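
Roughly, the combination I mean looks like this (a sketch only; the URL is a placeholder and the other values are loosely based on the wire log above):

```go
// Sketch restating the guess above: with HTTPMethod set to "POST", the search
// parameters should also be form-encoded into the request body via
// HTTPFormEncodedValues, otherwise the declared Content-Length can end up
// disagreeing with the body that actually gets written.
params := rets.SearchParams{
	HTTPFormEncodedValues: true, // encode the params into the POST body
	SearchType:            "Property",
	Class:                 "VL_2",
	Format:                "COMPACT-DECODED",
	QueryType:             "DMQL2",
	Query:                 "(L_UpdateDate=2017-09-12T17:18:00.000+)",
	Limit:                 2500,
}
req := rets.SearchRequest{
	URL:          "http://example.com/rets/search", // placeholder URL
	SearchParams: params,
	HTTPMethod:   "POST",
}
```
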
leonardoo commented 7 years ago

Do you know a way to obtain this with gorets? The error occurs after I successfully download more than 10k properties; somewhere between 14k and 16k I get the error. This is part of the code I use to build the request and make the request to the RETS server:

```go
var query string
if _last != "" {
    query = fmt.Sprintf("(L_UpdateDate=%v+)", _last)
} else {
    query = "(L_UpdateDate=1950-01-01T00:00:00+)"
}
_columns := make([]string, 0)
for k := range columns {
    _columns = append(_columns, k)
}
s := strings.Join(_columns, ",")
params := rets.SearchParams{
    HTTPFormEncodedValues: true,
    Select:     s,
    Query:      query,
    SearchType: resource.ResourceID,
    Class:      meta.ClassName,
    Format:     "COMPACT",
    QueryType:  "DMQL2",
    Count:      1,
    Limit:      configuration.LIMIT,
    Offset:     0,
}
req := rets.SearchRequest{
    URL:          urls.Search,
    SearchParams: params,
    HTTPMethod:   "POST",
}
ctx, cancel := context.WithTimeout(context.Background(), 10*time.Minute)
defer cancel() // release the timeout context when done
result, err := rets.SearchCompact(ctx, sess, req)
if err != nil {
    fmt.Println(err.Error())
    importLogDetails := models.ImportLogDetails{
        ID:          bson.NewObjectId().Hex(),
        ImportLogID: importLog,
        Data:        "",
        Error:       err.Error(),
    }
    hostConn.Insert(importLogDetails)
}
```
jpfielding commented 7 years ago

An example of the built-in wire logging is available here:

https://github.com/jpfielding/gorets/blob/master/cmds/common/config.go

If you aren't connecting over HTTPS, Wireshark can be used too.
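
If it helps, a generic alternative (not the gorets helper from config.go, just a plain net/http sketch) is to wrap whatever transport the session uses in a RoundTripper that dumps each request and response:

```go
package main

import (
	"net/http"
	"net/http/httputil"
	"os"
)

// wireLogger is a minimal http.RoundTripper that writes every request and
// response (headers and bodies) to a file, producing a wire log of the
// conversation.
type wireLogger struct {
	next http.RoundTripper
	out  *os.File
}

func (w wireLogger) RoundTrip(req *http.Request) (*http.Response, error) {
	// Dump the outgoing request, including the body.
	if b, err := httputil.DumpRequestOut(req, true); err == nil {
		w.out.Write(append(b, '\n'))
	}
	resp, err := w.next.RoundTrip(req)
	if err != nil {
		return nil, err
	}
	// DumpResponse buffers and restores the body, so the caller can still read it.
	if b, err := httputil.DumpResponse(resp, true); err == nil {
		w.out.Write(append(b, '\n'))
	}
	return resp, nil
}

func main() {
	f, err := os.Create("wire.log")
	if err != nil {
		panic(err)
	}
	defer f.Close()
	// Hand this client (or just the transport) to whatever builds the RETS session.
	client := &http.Client{Transport: wireLogger{next: http.DefaultTransport, out: f}}
	_ = client
}
```

Where exactly the wrapped transport plugs in depends on how you construct the session, so treat this as a sketch rather than the project's API.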

leonardoo commented 7 years ago

This is the last part of the wire log:

https://pastebin.com/9Ms5aggB

jpfielding commented 7 years ago

Looks like you are running out of time. You eventually get a

Unauthorized: Access is denied due to invalid credentials

even though your creds are valid.

IIRC, that vendor has a time limit on how long you can stay logged into a session. You have 350k records you are trying to extract in batches (Limit) of 100; that's a lot of time you'll spend making requests.

I'd suggest you make your Limit bigger, or better yet remove it altogether and simply keep your own counter (so you know the next Offset): take note of the max-rows value after reading the results, and if it's true, send a request for the next page. Between the counter and the max-rows flag, you'll know where to set your Offset for the follow-up page, and it will likely be much bigger than 100. Sometimes vendors even drop these limits at night. If you are only querying the covering index (see their metadata for KeyFieldIndex), you can 'force' the limit to NONE (Limit = -1).
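
Something like this, roughly (same style as your snippet; `readPage` is a hypothetical placeholder for however you already consume the compact result, returning how many rows came back and whether the server flagged that the row limit was hit):

```go
// Paging sketch: keep your own offset counter and keep asking for the next
// page until the server stops reporting that the row limit was reached.
offset := 1 // RETS offsets are 1-based
for {
    req.SearchParams.Offset = offset
    // Optionally: req.SearchParams.Limit = -1 ("NONE") if you are only
    // querying the covering index (see the metadata KeyFieldIndex).
    result, err := rets.SearchCompact(ctx, sess, req)
    if err != nil {
        fmt.Println(err.Error())
        break
    }
    rows, limited := readPage(result) // rows read, and whether max-rows was hit
    if !limited || rows == 0 {
        break // the server returned everything remaining
    }
    offset += rows // the next page starts right after what we just read
}
```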

Since this isn't an issue with gorets, I'm going to close this for now. Feel free to hit me up with questions at gophers.slack.com in the #gorets channel.