Closed vigodeltoro closed 2 years ago
Hello,
// Q_Vlan
uint32 Dot1q_Vlan = 243;
uint32 Post_Dot1q_Vlan = 254;
uint32 Dot1q_Cvlan = 245;
uint32 Post_Dot1q_Cvlan = 255;
results in the following Go struct fields:
Dot1Q_Vlan uint32 `protobuf:"varint,243,opt,name=Dot1q_Vlan,json=Dot1qVlan,proto3" json:"Dot1q_Vlan,omitempty"`
Post_Dot1Q_Vlan uint32 `protobuf:"varint,254,opt,name=Post_Dot1q_Vlan,json=PostDot1qVlan,proto3" json:"Post_Dot1q_Vlan,omitempty"`
Dot1Q_Cvlan uint32 `protobuf:"varint,245,opt,name=Dot1q_Cvlan,json=Dot1qCvlan,proto3" json:"Dot1q_Cvlan,omitempty"`
Post_Dot1Q_Cvlan uint32 `protobuf:"varint,255,opt,name=Post_Dot1q_Cvlan,json=PostDot1qCvlan,proto3" json:"Post_Dot1q_Cvlan,omitempty"`
It might be a good enhancement to use a mapping based on the name as well (semi-related to #110); I will think about it.
The following should work
mapping:
  - field: 252
    destination: InIf
  - field: 253
    destination: OutIf
  - field: 243
-   destination: Dot1q_Vlan
+   destination: Dot1Q_Vlan
  - field: 254
-   destination: Post_Dot1q_Vlan
+   destination: Post_Dot1Q_Vlan
  - field: 245
-   destination: Dot1q_Cvlan
+   destination: Dot1Q_Cvlan
  - field: 255
-   destination: Post_Dot1q_Cvlan
+   destination: Post_Dot1Q_Cvlan
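For reference, applying that diff gives a mapping file like the following (the ipfix: top-level key is assumed from the configuration shown later in this thread):

```yaml
ipfix:
  mapping:
    - field: 252
      destination: InIf
    - field: 253
      destination: OutIf
    - field: 243
      destination: Dot1Q_Vlan
    - field: 254
      destination: Post_Dot1Q_Vlan
    - field: 245
      destination: Dot1Q_Cvlan
    - field: 255
      destination: Post_Dot1Q_Cvlan
```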
You need to make sure ClickhouseDB also has the correct proto if you are doing direct insertion.
Hi Louis,
thanks a lot.. I tested it, but without success; the values are still zero.. This is my current configuration:
flow.proto:

// Q_Vlan
uint32 Dot1Q_Vlan = 243;
uint32 Post_Dot1Q_Vlan = 254;
uint32 Dot1Q_Cvlan = 245;
uint32 Post_Dot1Q_Cvlan = 255;
mapping.yaml:
ipfix:
  mapping:
    - field: 255
      destination: Post_Dot1Q_Cvlan
Clickhouse Table:
CREATE TABLE IF NOT EXISTS sisr_dev.kafka_goflow2_ipfix_proto_consumer ON CLUSTER "sisr-dev-ch-cluster" (
  SrcMac UInt64,
  DstMac UInt64,
  TCPFlags UInt32,
  TimeReceived UInt64,
  TimeFlowStartMs UInt64,
  TimeFlowEndMs UInt64,
  SamplerAddress FixedString(16),
  SrcAddr FixedString(16),
  DstAddr FixedString(16),
  InIf UInt32,
  OutIf UInt32,
  IngressVrfID UInt32,
  Dot1Q_Vlan UInt32,
  Post_Dot1Q_Vlan UInt32,
  Dot1Q_Cvlan UInt32,
  Post_Dot1Q_Cvlan UInt32,
  EType UInt32,
  Proto UInt32,
  SrcPort UInt32,
  DstPort UInt32,
  Bytes UInt64,
  Packets UInt64
) ENGINE = Kafka() SETTINGS
  kafka_broker_list = 'BROKERS',
  kafka_topic_list = 'goflow2_raw_ipfix_proto',
  kafka_group_name = 'goflow2_proto_raw_ch_kafka',
  kafka_num_consumers = 8,
  kafka_thread_per_consumer = 4,
  kafka_format = 'Protobuf',
  kafka_schema = 'flow.proto:FlowMessage';
Do you maybe have another idea? On the other hand, it is hard to debug, because the only point where I can read the protobuf at the moment is in the ClickHouse DB. I can't verify whether it's broken at another point. Is there a possibility to add those fields to the JSON output as well? I tried to start a container with the mapping above and JSON output, but I can't find the fields in the output..
I think you are right, mapping on the name could be very nice 👍
Thanks a lot for your help.. very appreciated :)
Just to make sure: did you recompile the proto after changing it? Using make proto.
To visualize it as JSON, until my next PR, you need to edit the fields mapped in the code (should be in the format folder).
Great :).. you got it.. I'm getting the values for Dot1q and so on now. Sorry for that dumb mistake on my side :/.. I only did a docker-compose build and forgot the make proto.
I had a look at the changes needed for JSON.. but I didn't manage it because I wasn't totally sure where to make them, and my Go skills are limited... I think making that easier is a good step 👍
Thanks a lot for your help :)
Fixed ... :)
Hi Louis,
I have another question.. maybe you would be so kind to help out..
In my IPFIX flows I see the fields Dot1q_Vlan, Post_Dot1q_Vlan, Dot1q_Cvlan, Post_Dot1q_Cvlan in PMACCT and tcpdump. I followed your manual for adding exotic fields via mapping.
That is my mapping:
ipfix:
  mapping:
Furthermore, I added the following to flow.proto:

// Q_Vlan
uint32 Dot1q_Vlan = 243;
uint32 Post_Dot1q_Vlan = 254;
uint32 Dot1q_Cvlan = 245;
uint32 Post_Dot1q_Cvlan = 255;
After that I rebuilt the container with "docker build ." and started the goflow2 container.
But I don't get values for those 4 new fields in ClickHouse, even though, as you can see in the data I sent in the last issue, the data is being exported. In the ClickHouse DB the columns are zero..
Am I configuring something wrong?
Do you have an idea ?
Thanks a lot again..
Best regards Christian