
Add observability samples (self managed) for Istio addon #4744

Merged
5 commits merged into Azure:master on Jan 15, 2025

Conversation

Contributor

@biefy biefy commented Jan 14, 2025

Add observability samples (self managed) for Istio addon.
The YAML files are adapted from https://github.com/istio/istio/blob/master/samples/addons/.
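
For context, a rough sketch of how samples like these are typically applied and checked. The directory path is the one under review in this PR, but the namespace and service name below are assumptions rather than values confirmed by the change:

```sh
# Apply the sample manifests; the directory path is the one under review in this PR.
kubectl apply -f examples/istio-based-service-mesh/observability/

# Port-forward Grafana to browse the Istio dashboards locally.
# The namespace and service name are assumptions, not confirmed by this PR.
kubectl -n aks-istio-system port-forward svc/grafana 3000:3000
```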

@biefy biefy requested review from palma21 and a team as code owners January 14, 2025 02:33
@biefy biefy requested a review from gambtho January 14, 2025 02:33
Contributor

@chasewilson chasewilson left a comment


Minor updates for grammar and readability.

Review comments on examples/istio-based-service-mesh/observability/README.md (outdated, resolved)
@shashankbarsin
Contributor

  • One problem with having the sample deployments of Prometheus and Grafana themselves in the samples is that we'd need to keep tabs on when newer versions of Prometheus/Grafana are available and then update these samples (without which deployment of older samples could result in vulnerabilities). To address this, what if we instead document a step each on self-installing Prometheus or Grafana with reference links to their docs (for example, https://github.com/prometheus-operator/kube-prometheus/tree/main?tab=readme-ov-file#quickstart) and remove the sample deployment files?
  • I observed that the Istio Grafana dashboards from the community are included directly as a ConfigMap. That poses the same maintenance problem as above. Instead, can we point to the Istio community Grafana dashboards here (https://github.com/istio/istio/tree/master/manifests/addons/dashboards) and document steps on how to import these dashboards into Grafana after it's deployed? (A rough sketch of both steps follows this list.)
  • The one part I think we should retain is the scrape config.
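
A minimal sketch of what those two documented steps could look like, assuming kube-prometheus for the Prometheus/Grafana install and the Grafana HTTP API for the dashboard import. The dashboard file name and the Grafana URL/credentials are placeholders, not values from this PR:

```sh
# 1. Self-install Prometheus and Grafana via kube-prometheus
#    (see its quickstart README for the authoritative steps).
git clone https://github.com/prometheus-operator/kube-prometheus
cd kube-prometheus
kubectl apply --server-side -f manifests/setup
kubectl wait --for condition=Established --all CustomResourceDefinition --namespace=monitoring
kubectl apply -f manifests/

# 2. Import an Istio community dashboard into Grafana through its HTTP API.
#    The dashboard file name and the Grafana URL/credentials are placeholders.
curl -s https://raw.githubusercontent.com/istio/istio/master/manifests/addons/dashboards/istio-mesh-dashboard.json \
  | jq '{dashboard: ., overwrite: true}' \
  | curl -s -X POST -H 'Content-Type: application/json' -d @- \
      http://admin:admin@localhost:3000/api/dashboards/db
```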

@biefy
Contributor Author

biefy commented Jan 15, 2025


These are great questions.

On refreshing these files: looking through the upstream file history, the changes are mostly image version bumps. We could do the same, maybe twice a year?

The scrape configs are taken directly from the upstream files, with some namespace replacement.
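
As an illustration, a rough sketch of pulling the upstream scrape config and re-pointing its namespace. The upstream file name and both namespace values are assumptions here; the PR's actual edits may differ:

```sh
# Fetch the upstream Prometheus addon sample that carries the Istio scrape config.
curl -sLO https://raw.githubusercontent.com/istio/istio/master/samples/addons/prometheus.yaml

# Swap the upstream namespace for the one the addon samples target.
# Both namespace values are assumptions, not taken from this PR's diff.
sed 's/namespace: istio-system/namespace: aks-istio-system/g' prometheus.yaml > prometheus-namespaced.yaml
```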

Contributor

@Azure/aks-pm issue needs labels

@biefy biefy merged commit e4a6c9b into Azure:master Jan 15, 2025
2 checks passed