data received by kibana is unreadable #1
gRPC relies on Protobuf (GPB) for modeling data, and GPB is binary. Something we can do is add an option to convert the data into JSON prior to sending it to Kafka. That way, there won't be issues with Elasticsearch/Kibana. I've done that for other projects, so it is something we can have without too much effort. That said, the way Cisco decided to model the telemetry data is not the best in my personal opinion, so traversing the data can be difficult regardless of the format of the payload (GPB or JSON). For this reason, an additional app, maybe based on Kafka Streams, would be useful to pre-process the data and reformat it in a way that can be easily digested by applications like Elasticsearch.
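As a rough sketch of that conversion option (the class name is hypothetical, and it assumes the `protobuf-java-util` dependency is on the classpath), turning a compiled GPB message into JSON before producing it to Kafka can be as small as:

```java
import com.google.protobuf.Message;
import com.google.protobuf.util.JsonFormat;

public final class GpbToJson {

    // Hypothetical helper: serialize any compiled protobuf Message to a JSON string.
    // includingDefaultValueFields() keeps zero/empty fields visible, which makes the
    // resulting documents more uniform for Elasticsearch/Kibana.
    public static String toJson(Message message) throws Exception {
        return JsonFormat.printer()
                .includingDefaultValueFields()
                .print(message);
    }
}
```

The producer would then send this string payload instead of the raw GPB bytes, so Kibana renders readable JSON rather than binary.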
Alejandro, thank you for your reply. Converting to JSON is definitely something that would help; it is what pipeline does, for example. Our goal is to get telemetry and then be able to visualize it from a time-series DB. Today, other than using pipeline's export to InfluxDB (which doesn't work for NX-OS: cisco-ie/pipeline-gnmi#1), or manually parsing data from Kafka before sending it to a time-series DB, we have no other way (AFAIK) to do this. Do you have any suggestions on what we should do, end to end, from a telemetry receiver to data visualization in a tool like Grafana?
At OpenNMS, we have a post-processing module that runs prior to storing the data provided by the Nexus device in a TSDB. This module basically executes a provided Groovy script to transform the GPB data at runtime and passes the resulting content to the persistence layer. For Elasticsearch, the approach might be similar, but to keep this application simple, I would do that externally. For example, by writing a Kafka Streams application that will take the data from the Kafka topic that this application is populating, apply some customization, and put the results on another topic. This resulting topic is the one that would feed Elasticsearch. Certainly, I could add support to convert the GPB data to JSON, but that might not be enough. I'll create a branch with this change soon. Converting GPB to JSON is not that hard. Here is an example. With Java, besides compiling the

The idea is to see how the data looks in JSON format, but please keep an open mind here, because the NX-OS data is not nicely formatted, so the content might be hard to traverse in order to get what's actually needed. This is why I believe having a pre-processor is not optional.
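The Kafka Streams idea described above can be sketched roughly as follows. This is not the project's code; the topic names (`telemetry-raw`, `telemetry-es`), the broker address, and the `reformat` transform are all hypothetical placeholders, and it assumes the `kafka-streams` dependency:

```java
import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

public final class TelemetryReformatter {

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "telemetry-reformatter");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        // "telemetry-raw" stands in for the topic grpc2kafka populates;
        // "telemetry-es" is the reshaped topic that would feed Elasticsearch.
        KStream<String, String> raw = builder.stream("telemetry-raw");
        raw.mapValues(TelemetryReformatter::reformat).to("telemetry-es");

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }

    // Placeholder transform: a real implementation would flatten and rename the
    // awkward NX-OS structures here so downstream tools can digest them.
    public static String reformat(String json) {
        return json;
    }
}
```

The point of the sketch is the topology shape (consume, transform per record, produce to a second topic); the actual reshaping logic depends entirely on which NX-OS paths you collect.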
I've created a PR with the required changes to optionally convert the GPB payload to JSON. To give it a try, please check out the branch called
Thank you, looking into it today.
It is working nicely. Thank you!
You're very welcome!
Hello.
We are trying to get telemetry data into Kibana, but what we receive is unreadable, as it includes what seems to be binary information. This is what Kibana sees:
Some logs from grpc2kafka: