teamgram / teamgram-server

Unofficial open source mtproto server written in golang with compatible telegram client.
https://teamgram.net
Apache License 2.0

Connect webogram to nebula-chat / nebulaim server ? #22

Closed ekorudi closed 2 years ago

ekorudi commented 5 years ago

The https://github.com/nebulaim/webogram client is easy to set up with Node.js 10 and runs fine, and the server is also easy to install and run. Could you tweak webogram a little so it can talk to the server? It looks like the server cannot recognize the client. That way, anyone interested in the project can jump in quickly.

Here is some of the browser log:

```
push_worker.js:85   [SW] on message {type: "notifications_clear"}
push_worker.js:85   [SW] on message {type: "notifications_clear"}
                    [SW] on message {type: "notifications_clear"}
                      J @ jquery.min.js:2
mtproto_wrapper.js:143  Get networker error {code: 406, type: "NETWORK_BAD_RESPONSE", url: "http://127.0.0.1:8800/apiw1", originalError: {…}} undefined
mtproto_wrapper.js:143  Get networker error {code: 406, type: "NETWORK_BAD_RESPONSE", url: "http://127.0.0.1:8800/apiw1", originalError: {…}} undefined
controllers.js:217  sendCode error {code: 406, type: "NETWORK_BAD_RESPONSE", url: "http://127.0.0.1:8800/apiw1", originalError: {…}, handled: true, …}
services.js:4044    notify {title: "Telegram", message: "Your authorization key was successfully generated! Open the app to log in.", tag: "auth_key"} true true false
angular.js:11881    POST http://127.0.0.1:8800/apiw1 net::ERR_EMPTY_RESPONSE
                      (anonymous) @ angular.js:11881
                      sendReq @ angular.js:11642
push_worker.js:85   [SW] on message {type: "notifications_clear"}
```

Here is the server log:

```
I0730 14:52:59.923364 35437 server.go:301] onServerUnencryptedRawMessage - receive data: {peer: {connID: 3@frontend80-(127.0.0.1:8800->127.0.0.1:53683)}, ctx: &{{%!s(int32=0) %!s(uint32=0)} %!s(int=256) %!s(*mtproto.TLHandshakeData=&{0xc0006f9950 {} [] 0}) %!s(int64=0)}, msg: {conn_type: 2, auth_key_id: 0, quick_ack_id: 0, payload_len: 40}}
I0730 14:52:59.924172 35437 server.go:340] sendMessage - handshake: {peer: {connID: 3@frontend80-(127.0.0.1:8800->127.0.0.1:53683)}, md: request:<service_name:"handshake" method_name:"mtproto.TLHandshakeData" log_id:1156110426904203264 trace_id:1156110426904203266 span_id:1156110426904203267 > correlation_id:1156110426904203265 attachment_size:40 mtproto_meta:<auth_key_id:0 server_id:1 client_conn_id:3 client_addr:"127.0.0.1:53683" from:"frontend" receive_time:1564473179924 > , msg: data2:<state:513 client_conn_id:3 ctx:<constructor:CRC32_handshakeContext data2:<> > > }
I0730 14:52:59.927902 35401 server.go:95] onServerMessageDataArrived - msg: data2:<state:513 client_conn_id:3 ctx:<constructor:CRC32_handshakeContext data2:<> > >
I0730 14:52:59.931836 35401 decode.go:320] newTLObjectByClassID, classID: 0x60469778
I0730 14:52:59.932478 35401 handshake.go:193] req_pq#60469778 - state: {data2:<state:513 client_conn_id:3 ctx:<constructor:CRC32_handshakeContext data2:<> > > }, request: {"nonce":"12ODQvHpDXvLMzzzbxRA8w=="}
I0730 14:52:59.934285 35401 cache_state_manager.go:71] put state key: (salts_d7638342f1e90d7bcb333cf36f1440f3@a8fc3d2f1e509ae57c615f9258d16f3b)
I0730 14:52:59.935936 35401 handshake.go:217] req_pq#60469778 - state: {data2:<state:513 client_conn_id:3 ctx:<constructor:CRC32_handshakeContext data2:<nonce:"\327c\203B\361\351\r{\3133<\363o\024@\363" server_nonce:"\250\374=/\036P\232\345|a_\222X\321o;" > > > }, reply: {"data2":{"nonce":"12ODQvHpDXvLMzzzbxRA8w==","server_nonce":"qPw9Lx5QmuV8YV+SWNFvOw==","pq":"\u0017\ufffdH\ufffd\u001a\u0008\ufffd\ufffd","server_public_key_fingerprints":[-6205835210776354611]}}
I0730 14:52:59.937031 35437 server.go:223] onClientHandshakeMessage - handshake: peer({connID: 1@handshake-(127.0.0.1:53672->127.0.0.1:10005)}), state: {data2:<state:514 res_state:1 client_conn_id:3 ctx:<constructor:CRC32_handshakeContext data2:<nonce:"\327c\203B\361\351\r{\3133<\363o\024@\363" server_nonce:"\250\374=/\036P\232\345|a_\222X\321o;" > > > }
W0730 14:52:59.938390 35437 server.go:230] conn closed, handshake: data2:<state:514 res_state:1 client_conn_id:3 ctx:<constructor:CRC32_handshakeContext data2:<nonce:"\327c\203B\361\351\r{\3133<\363o\024@\363" server_nonce:"\250\374=/\036P\232\345|a_\222X\321o;" > > >
I0730 14:52:59.965383 35437 zrpc_client.go:229] sendPing: ping_id:6785692651167134319
I0730 14:52:59.965448 35437 zrpc_client.go:236] &{request:<service_name:"zrpc" method_name:"mtproto.TLPing" > [8 239 228 174 222 185 240 231 149 94] []}
I0730 14:52:59.965518 35437 server.go:218] onClientTimer
E0730 14:59:30.879001 35437 mtproto_http_proxy_codec.go:51] read tcp 127.0.0.1:8800->127.0.0.1:53728: i/o timeout
I0730 14:59:30.879075 35437 mtproto_server.go:119] onConnectionClosed - 127.0.0.1:53728
I0730 14:59:30.879094 35437 server.go:176] onServerConnectionClosed - {peer: {connID: 4@frontend80-(127.0.0.1:8800->127.0.0.1:53728)}}
E0730 14:59:30.879206 35437 tcp_server.go:192] conn {{connID: 4@frontend80-(127.0.0.1:8800->127.0.0.1:53728)}} recv error: read tcp 127.0.0.1:8800->127.0.0.1:53728: i/o timeout
I0730 14:59:58.453999 35377 zrpc_client.go:229] sendPing: ping_id:1354515734731151583
I0730 14:59:58.454092 35377 zrpc_client.go:236] &{request:<service_name:"zrpc" method_name:"mtproto.TLPing" > [8 223 177 207 218 165 156 141 230 18] []}
I0730 14:59:58.455277 35377 zrpc_client.go:188] recv pong: constructor:CRC32_pong data2:<ping_id:1354515734731151583 >
I0730 15:00:00.033277 35437 zrpc_client.go:229] sendPing: ping_id:1679611750482287486
I0730 15:00:00.033372 35437 zrpc_client.go:236] &{request:<service_name:"zrpc" method_name:"mtproto.TLPing" > [8 254 182 198 160 207 191 203 167 23] []}
I07
```
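The first server line decodes an unencrypted MTProto frame with `auth_key_id: 0` and `payload_len: 40`, which matches the `req_pq#60469778` handshake request seen a few lines later: the standard unencrypted layout is `auth_key_id` (8 bytes) + `message_id` (8) + `message_len` (4) + body, and the `req_pq` body is the constructor id (4) plus a 128-bit nonce (16), i.e. 8 + 8 + 4 + 20 = 40. A minimal sketch of that framing, assuming the standard MTProto layout (the struct and helper names are illustrative, not teamgram's):

```go
package main

import (
	"bytes"
	"encoding/binary"
	"fmt"
	"io"
)

// unencryptedFrame is the MTProto plaintext framing used during the handshake:
//   auth_key_id int64  (0 means "unencrypted message")
//   message_id  int64
//   message_len int32  (length of the body that follows)
//   body        []byte (here: req_pq = constructor id + 128-bit nonce)
type unencryptedFrame struct {
	AuthKeyID int64
	MessageID int64
	Body      []byte
}

func parseUnencrypted(raw []byte) (*unencryptedFrame, error) {
	r := bytes.NewReader(raw)
	var f unencryptedFrame
	if err := binary.Read(r, binary.LittleEndian, &f.AuthKeyID); err != nil {
		return nil, err
	}
	if err := binary.Read(r, binary.LittleEndian, &f.MessageID); err != nil {
		return nil, err
	}
	var bodyLen int32
	if err := binary.Read(r, binary.LittleEndian, &bodyLen); err != nil {
		return nil, err
	}
	f.Body = make([]byte, bodyLen)
	if _, err := io.ReadFull(r, f.Body); err != nil {
		return nil, err
	}
	return &f, nil
}

// buildReqPQFrame builds a fake req_pq frame: 20-byte body, 40 bytes total,
// matching the payload_len: 40 in the log above.
func buildReqPQFrame() []byte {
	var buf bytes.Buffer
	binary.Write(&buf, binary.LittleEndian, int64(0))           // auth_key_id = 0
	binary.Write(&buf, binary.LittleEndian, int64(1))           // message_id (placeholder)
	binary.Write(&buf, binary.LittleEndian, int32(20))          // body length
	binary.Write(&buf, binary.LittleEndian, uint32(0x60469778)) // req_pq constructor
	buf.Write(make([]byte, 16))                                 // 128-bit client nonce (zeroed)
	return buf.Bytes()
}

func main() {
	frame := buildReqPQFrame()
	f, err := parseUnencrypted(frame)
	if err != nil {
		panic(err)
	}
	fmt.Printf("frame=%d bytes, auth_key_id=%d, body=%d bytes\n",
		len(frame), f.AuthKeyID, len(f.Body))
}
```

So the server does receive and answer `req_pq` correctly; the warning `conn closed, handshake: ...` right after the reply is where the exchange breaks down.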

wubenqi commented 5 years ago

webogram is not supported.