cosmos / cosmos-sdk

⛓️ A Framework for Building High Value Public Blockchains ✨
https://cosmos.network/
Apache License 2.0

gRPC endpoint for connecting to Cosmos node #16457

Closed · lalittanna closed this issue 1 year ago

lalittanna commented 1 year ago

Hi all, I am trying to interact with the Cosmos mainnet chain over gRPC, but I am getting endpoint-related errors when doing so. I have tried many endpoints from providers like allthatnode and getblock. Could someone point me to where I can find a working gRPC endpoint?

This is the code I am executing:

package main

import (
    "context"
    "fmt"

    banktypes "github.com/cosmos/cosmos-sdk/x/bank/types"
    "google.golang.org/grpc"
)

func main() {
    // Open a plaintext (non-TLS) gRPC connection. This only works against
    // endpoints that serve gRPC without TLS, such as a local simapp node.
    grpcConn, err := grpc.Dial(
        "", // Add endpoint here
        grpc.WithInsecure(),
    )
    if err != nil {
        fmt.Printf("GRPC connection error: %v\n", err)
        return
    }
    defer grpcConn.Close()

    // Query the bank module for the address's uatom balance.
    bankClient := banktypes.NewQueryClient(grpcConn)
    grpcRes, err := bankClient.Balance(
        context.Background(),
        &banktypes.QueryBalanceRequest{
            Address: "cosmos1wze8mn5nsgl9qrgazq6a92fvh7m5e6psjcx2du",
            Denom:   "uatom",
        },
    )
    if err != nil {
        // Return on error; falling through would dereference a nil response
        // below, which matches the nil pointer panic reported further down.
        fmt.Printf("Get balance error: %v\n", err)
        return
    }
    fmt.Println(grpcRes.Balance.Amount.Uint64())
}

When I run a local simapp and use its endpoint, the script works, but whenever I use any online endpoint, it does not.
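
For context: the simapp endpoint works because the SDK's built-in gRPC server listens in plaintext, which is what grpc.WithInsecure() expects. It is configured in the node's app.toml; the relevant section looks roughly like this (exact comments and defaults vary by SDK version):

[grpc]
# Enable defines if the gRPC server should be enabled.
enable = true
# Address defines the gRPC server address to bind to.
address = "localhost:9090"

Public providers, by contrast, usually front their gRPC endpoints with TLS.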

tac0turtle commented 1 year ago

Is the endpoint you are trying to use actually working? The code looks okay.

lalittanna commented 1 year ago

No, I am getting this error:

Get balance error: rpc error: code = Unavailable desc = connection error: desc = "error reading server preface: http2: frame too large"
panic: runtime error: invalid memory address or nil pointer dereference
[signal SIGSEGV: segmentation violation code=0x2 addr=0x0 pc=0x101492184]

The error only occurs when I try to interact with an existing chain; when I connect to a local simapp, it works.
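
For reference, "error reading server preface: http2: frame too large" is the typical symptom of opening a plaintext connection to a TLS-terminated endpoint (usually port 443): the client tries to parse the server's TLS handshake bytes as an HTTP/2 frame and gives up. The trailing panic is a follow-on failure from dereferencing the nil query response after the RPC has already errored, which is why the snippet above returns on error. A minimal sketch of a TLS dial, assuming the endpoint serves gRPC behind TLS (grpc.example.com:443 is a placeholder, not a real endpoint):

package main

import (
    "crypto/tls"
    "fmt"

    "google.golang.org/grpc"
    "google.golang.org/grpc/credentials"
)

func main() {
    // Dial with TLS transport credentials instead of grpc.WithInsecure();
    // the handshake then matches what a TLS-fronted endpoint expects.
    grpcConn, err := grpc.Dial(
        "grpc.example.com:443", // placeholder; substitute a real gRPC endpoint
        grpc.WithTransportCredentials(credentials.NewTLS(&tls.Config{MinVersion: tls.VersionTLS12})),
    )
    if err != nil {
        fmt.Printf("GRPC connection error: %v\n", err)
        return
    }
    defer grpcConn.Close()
    // ... use grpcConn with the bank query client as in the snippet above.
}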

tac0turtle commented 1 year ago

Can you share the complete stack trace, please?

Rome314 commented 1 year ago

I have the same issue. However, Postman works fine with the same node. Here are the gRPC logs:

2023/06/12 12:15:48 INFO: [core] [Channel #1] Channel created
2023/06/12 12:15:48 INFO: [core] [Channel #1] original dial target is: "grpc.quicksilver.zone:443"
2023/06/12 12:15:48 INFO: [core] [Channel #1] parsed dial target is: {Scheme:grpc.quicksilver.zone Authority: URL:{Scheme:grpc.quicksilver.zone Opaque:443 User: Host: Path: RawPath: OmitHost:false ForceQuery:false RawQuery: Fragment: RawFragment:}}
2023/06/12 12:15:48 INFO: [core] [Channel #1] fallback to scheme "passthrough"
2023/06/12 12:15:48 INFO: [core] [Channel #1] parsed dial target is: {Scheme:passthrough Authority: URL:{Scheme:passthrough Opaque: User: Host: Path:/grpc.quicksilver.zone:443 RawPath: OmitHost:false ForceQuery:false RawQuery: Fragment: RawFragment:}}
2023/06/12 12:15:48 INFO: [core] [Channel #1] Channel authority set to "grpc.quicksilver.zone:443"
2023/06/12 12:15:48 INFO: [core] [Channel #1] Resolver state updated: {
  "Addresses": [
    {
      "Addr": "grpc.quicksilver.zone:443",
      "ServerName": "",
      "Attributes": null,
      "BalancerAttributes": null,
      "Type": 0,
      "Metadata": null
    }
  ],
  "ServiceConfig": null,
  "Attributes": null
} (resolver returned new addresses)
2023/06/12 12:15:48 INFO: [core] [Channel #1] Channel switches to new LB policy "pick_first"
2023/06/12 12:15:48 INFO: [core] [Channel #1 SubChannel #2] Subchannel created
2023/06/12 12:15:48 INFO: [core] [Channel #1] Channel Connectivity change to CONNECTING
2023/06/12 12:15:48 INFO: [core] [Channel #1 SubChannel #2] Subchannel Connectivity change to CONNECTING
2023/06/12 12:15:48 INFO: [core] [Channel #1 SubChannel #2] Subchannel picks a new address "grpc.quicksilver.zone:443" to connect
2023/06/12 12:15:48 INFO: [core] pickfirstBalancer: UpdateSubConnState: 0x140005049d8, {CONNECTING <nil>}
2023/06/12 12:15:48 INFO: [transport] [client-transport 0x14000034fc0] Closing: connection error: desc = "error reading server preface: http2: frame too large"
2023/06/12 12:15:48 INFO: [transport] [client-transport 0x14000034fc0] loopyWriter exiting with error: transport closed by client
2023/06/12 12:15:48 INFO: [core] Creating new client transport to "{\n  \"Addr\": \"grpc.quicksilver.zone:443\",\n  \"ServerName\": \"grpc.quicksilver.zone:443\",\n  \"Attributes\": null,\n  \"BalancerAttributes\": null,\n  \"Type\": 0,\n  \"Metadata\": null\n}": connection error: desc = "error reading server preface: http2: frame too large"
2023/06/12 12:15:48 WARNING: [core] [Channel #1 SubChannel #2] grpc: addrConn.createTransport failed to connect to {
  "Addr": "grpc.quicksilver.zone:443",
  "ServerName": "grpc.quicksilver.zone:443",
  "Attributes": null,
  "BalancerAttributes": null,
  "Type": 0,
  "Metadata": null
}. Err: connection error: desc = "error reading server preface: http2: frame too large"
2023/06/12 12:15:48 INFO: [core] [Channel #1 SubChannel #2] Subchannel Connectivity change to TRANSIENT_FAILURE, last error: connection error: desc = "error reading server preface: http2: frame too large"
2023/06/12 12:15:48 INFO: [core] pickfirstBalancer: UpdateSubConnState: 0x140005049d8, {TRANSIENT_FAILURE connection error: desc = "error reading server preface: http2: frame too large"}
2023/06/12 12:15:48 INFO: [core] [Channel #1] Channel Connectivity change to TRANSIENT_FAILURE
2023/06/12 12:15:48 query failed: rpc error: code = Unavailable desc = connection error: desc = "error reading server preface: http2: frame too large"
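
Logs at this verbosity come from grpc-go's internal logger; they can be enabled with the GRPC_GO_LOG_SEVERITY_LEVEL=info and GRPC_GO_LOG_VERBOSITY_LEVEL=99 environment variables, or programmatically, as in this minimal sketch (the writer choices are arbitrary):

package main

import (
    "os"

    "google.golang.org/grpc/grpclog"
)

func init() {
    // Route grpc-go's info/warning/error logs to the console so transport
    // failures like the server-preface error above are visible. This must be
    // set before any gRPC calls are made.
    grpclog.SetLoggerV2(grpclog.NewLoggerV2(os.Stdout, os.Stdout, os.Stderr))
}

Note that this trace fails while reading the server preface when dialing grpc.quicksilver.zone:443, which is consistent with the plaintext-versus-TLS mismatch described above.
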
lalittanna commented 1 year ago

@tac0turtle @Rome314 Morning guys, I have fixed the issue. The error was due to the endpoint being wrong. I have found one that works, and @Rome314, you can get it from here if you like: https://services.kjnodes.com/home/mainnet/cosmoshub

@tac0turtle thanks for the help and being patient with my late replies 😅 Cheers!
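
If you want to sanity-check a candidate endpoint from such a list before wiring it into a larger program, a blocking dial with a short timeout fails fast instead of deferring the error to the first query. A minimal sketch, again assuming a TLS endpoint (the host is a placeholder):

package main

import (
    "context"
    "crypto/tls"
    "fmt"
    "time"

    "google.golang.org/grpc"
    "google.golang.org/grpc/credentials"
)

func main() {
    ctx, cancel := context.WithTimeout(context.Background(), 5*time.Second)
    defer cancel()

    // grpc.WithBlock() makes DialContext wait until the connection is ready,
    // so a bad endpoint surfaces here as a deadline error rather than as an
    // Unavailable error on the first RPC.
    conn, err := grpc.DialContext(
        ctx,
        "grpc.example.com:443", // placeholder; substitute the candidate endpoint
        grpc.WithTransportCredentials(credentials.NewTLS(&tls.Config{MinVersion: tls.VersionTLS12})),
        grpc.WithBlock(),
    )
    if err != nil {
        fmt.Printf("endpoint check failed: %v\n", err)
        return
    }
    defer conn.Close()
    fmt.Println("endpoint reachable")
}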

tac0turtle commented 1 year ago

Amazing, thank you for commenting that you have it fixed ❤️