
How to increase throughput for Vector consuming from Kafka #12085

Answered by jszwedko
pkirubak asked this question in Q&A

Hi @pkirubak!

We are rolling out a native (and native_json) codec that I think will let you avoid using the Lua transform to convert JSON to metrics here, which, as you note, appears to be the bottleneck. We expect this to be complete by v0.22.0.
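As a rough sketch of what that would look like (the option name and placement are assumptions until the codec actually ships), the kafka source would decode events straight into Vector's native event representation, with no Lua step in between:

```toml
# Hypothetical sketch: assumes the new codec is exposed as `decoding.codec`
# on the kafka source once it lands (expected around v0.22.0).
# Broker, group, and topic values are placeholders.
[sources.kafka_in]
type              = "kafka"
bootstrap_servers = "kafka-broker:9092"
group_id          = "vector-consumer"
topics            = ["metrics-json"]
decoding.codec    = "native_json"   # decode directly to Vector's native format
```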

For now, you could try splitting up the processing by partitioning the data and running multiple identical Lua transforms, each in its own task. This would involve using the route transform with one route per parallel Lua transform, partitioning across some field from the metric value that distributes fairly (maybe the microsecond portion of the timestamp?), and then fanning back into the prom_metric sink; a sketch of that layout is below. Let me know…
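Here is a minimal sketch of that layout with two parallel shards. The source and sink names, the sink type, the partitioning field (timestamp_us), and the Lua body are all placeholders; the real routing condition would depend on the shape of your JSON events.

```toml
# Split events across two shards with the route transform.
# `.timestamp_us` is a hypothetical numeric field; any value that is
# spread evenly across events would work as the partition key.
[transforms.partition]
type   = "route"
inputs = ["kafka_in"]
route.shard0 = '(to_int(.timestamp_us) ?? 0) % 2 == 0'
route.shard1 = '(to_int(.timestamp_us) ?? 0) % 2 == 1'

# One identical Lua transform per shard, each running in its own task.
[transforms.json_to_metric_0]
type    = "lua"
version = "2"
inputs  = ["partition.shard0"]
hooks.process = "process"
source = '''
function process(event, emit)
  -- existing JSON-to-metric conversion logic goes here
  emit(event)
end
'''

[transforms.json_to_metric_1]
type    = "lua"
version = "2"
inputs  = ["partition.shard1"]
hooks.process = "process"
source = '''
function process(event, emit)
  -- existing JSON-to-metric conversion logic goes here
  emit(event)
end
'''

# Fan both shards back into the existing sink (assuming a Prometheus
# exporter sink here; use whatever your prom_metric sink already is).
[sinks.prom_metric]
type    = "prometheus_exporter"
inputs  = ["json_to_metric_0", "json_to_metric_1"]
address = "0.0.0.0:9598"
```

More shards can be added the same way; the Lua transforms stay identical, only their inputs differ.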

Answer selected by pkirubak