Merge ~spencerrunde/landscape-charm:update-cos-integration into landscape-charm:main

Proposed by Spencer Runde
Status: Merged
Merged at revision: 2054958e5a9e9fe1eb6205edbfa0336af0996435
Proposed branch: ~spencerrunde/landscape-charm:update-cos-integration
Merge into: landscape-charm:main
Diff against target: 1093 lines (+1031/-1)
6 files modified
lib/charms/grafana_agent/LICENSE (+201/-0)
lib/charms/grafana_agent/v0/cos_agent.py (+819/-0)
metadata.yaml (+2/-0)
requirements-dev.txt (+4/-0)
src/charm.py (+3/-0)
tests/test_charm.py (+2/-1)
Reviewer Review Type Date Requested Status
Bill Kronholm Approve
Review via email: mp+459842@code.launchpad.net

Commit message

Add integration with the Grafana machine agent for log scraping.

Description of the change

Grafana Agent now recurses log directories, so it is no longer necessary to symlink the /var/log/landscape/ log files.

We still set log directory permissions during installation to address a possible issue with the Jammy Juju base image.
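The permission fix amounts to making the log directory readable and traversable by non-owners so a co-located scraper can reach the files inside. A minimal stdlib-only sketch of that idea, using a temporary directory rather than the real /var/log/landscape path (the exact mode and helper the charm uses are not shown in this diff):

```python
import os
import stat
import tempfile

# Illustrative stand-in for /var/log/landscape; the real charm
# operates on the actual log directory during the install hook.
log_dir = os.path.join(tempfile.mkdtemp(), "landscape")
os.makedirs(log_dir)

# Make the directory readable/traversable by non-owners so a
# co-located log scraper (e.g. the Grafana Agent) can read it.
os.chmod(log_dir, 0o755)

mode = stat.S_IMODE(os.stat(log_dir).st_mode)
```

Without the execute bit for group/other, the agent could list nothing under the directory even if the individual log files were readable.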

Testing:
- Set up COS Lite if you haven't already: https://charmhub.io/prometheus-k8s/docs/deploy-cos-lite?channel=edge#heading--introduction
- `charmcraft pack` in the repo to create a new landscape server charm
- `juju deploy ./bundle.yaml` to deploy a landscape bundle with the created charm
- Integrate the landscape server charm into COS: https://charmhub.io/topics/canonical-observability-stack/tutorials/instrumenting-machine-charms#preview
- Check the Grafana dashboards to ensure landscape logs are being scraped

Please feel free to reach out to me if you haven't set up COS Lite before. I'm happy to walk through the setup to make reviewing this easier.
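For reviewers unfamiliar with the `cos_agent` interface: the vendored library serializes the provider's scrape jobs, alert rules, dashboards, and snap log slots as JSON under a single `config` key in the unit databag. A stdlib-only sketch of that round trip (field names mirror `CosAgentProviderUnitData` in the diff; the values are placeholders, and the library's pydantic validation is omitted):

```python
import json

# Placeholder payload shaped like CosAgentProviderUnitData in the
# vendored library; real values come from the charm's rules/dashboards.
unit_data = {
    "metrics_alert_rules": {},
    "log_alert_rules": {},
    "dashboards": [],
    "subordinate": False,
    "metrics_scrape_jobs": [
        {"metrics_path": "/metrics", "static_configs": [{"targets": ["localhost:80"]}]},
    ],
    "log_slots": [],
}

# The provider dumps the whole model as JSON under its KEY ("config"),
# because subordinate relations can only exchange unit data.
databag = {"config": json.dumps(unit_data)}

# The requirer (grafana-agent) side decodes it back from the databag.
round_tripped = json.loads(databag["config"])
```

This is only a sketch of the wire format; the actual classes involved are in the diff below.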

3ea7f73... by Spencer Runde

remove ensure_log_dir

Revision history for this message
Bill Kronholm (wck0) wrote :

LGTM modulo the following:

Check that the LICENSE file mentioned in cos_agent.py is identical to the one already in the project. Otherwise, add that LICENSE file to the grafana_agent/v0 directory.

Rebase with main.

53f37e5... by Spencer Runde

include Apache 2.0 license from grafana-agent-operator

62ea84a... by Spencer Runde

Merge branch 'upstream_main' into update-cos-integration

Revision history for this message
Bill Kronholm (wck0) :
review: Approve

Preview Diff

1diff --git a/lib/charms/grafana_agent/LICENSE b/lib/charms/grafana_agent/LICENSE
2new file mode 100644
3index 0000000..f49a4e1
4--- /dev/null
5+++ b/lib/charms/grafana_agent/LICENSE
6@@ -0,0 +1,201 @@
7+ Apache License
8+ Version 2.0, January 2004
9+ http://www.apache.org/licenses/
10+
11+ TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
12+
13+ 1. Definitions.
14+
15+ "License" shall mean the terms and conditions for use, reproduction,
16+ and distribution as defined by Sections 1 through 9 of this document.
17+
18+ "Licensor" shall mean the copyright owner or entity authorized by
19+ the copyright owner that is granting the License.
20+
21+ "Legal Entity" shall mean the union of the acting entity and all
22+ other entities that control, are controlled by, or are under common
23+ control with that entity. For the purposes of this definition,
24+ "control" means (i) the power, direct or indirect, to cause the
25+ direction or management of such entity, whether by contract or
26+ otherwise, or (ii) ownership of fifty percent (50%) or more of the
27+ outstanding shares, or (iii) beneficial ownership of such entity.
28+
29+ "You" (or "Your") shall mean an individual or Legal Entity
30+ exercising permissions granted by this License.
31+
32+ "Source" form shall mean the preferred form for making modifications,
33+ including but not limited to software source code, documentation
34+ source, and configuration files.
35+
36+ "Object" form shall mean any form resulting from mechanical
37+ transformation or translation of a Source form, including but
38+ not limited to compiled object code, generated documentation,
39+ and conversions to other media types.
40+
41+ "Work" shall mean the work of authorship, whether in Source or
42+ Object form, made available under the License, as indicated by a
43+ copyright notice that is included in or attached to the work
44+ (an example is provided in the Appendix below).
45+
46+ "Derivative Works" shall mean any work, whether in Source or Object
47+ form, that is based on (or derived from) the Work and for which the
48+ editorial revisions, annotations, elaborations, or other modifications
49+ represent, as a whole, an original work of authorship. For the purposes
50+ of this License, Derivative Works shall not include works that remain
51+ separable from, or merely link (or bind by name) to the interfaces of,
52+ the Work and Derivative Works thereof.
53+
54+ "Contribution" shall mean any work of authorship, including
55+ the original version of the Work and any modifications or additions
56+ to that Work or Derivative Works thereof, that is intentionally
57+ submitted to Licensor for inclusion in the Work by the copyright owner
58+ or by an individual or Legal Entity authorized to submit on behalf of
59+ the copyright owner. For the purposes of this definition, "submitted"
60+ means any form of electronic, verbal, or written communication sent
61+ to the Licensor or its representatives, including but not limited to
62+ communication on electronic mailing lists, source code control systems,
63+ and issue tracking systems that are managed by, or on behalf of, the
64+ Licensor for the purpose of discussing and improving the Work, but
65+ excluding communication that is conspicuously marked or otherwise
66+ designated in writing by the copyright owner as "Not a Contribution."
67+
68+ "Contributor" shall mean Licensor and any individual or Legal Entity
69+ on behalf of whom a Contribution has been received by Licensor and
70+ subsequently incorporated within the Work.
71+
72+ 2. Grant of Copyright License. Subject to the terms and conditions of
73+ this License, each Contributor hereby grants to You a perpetual,
74+ worldwide, non-exclusive, no-charge, royalty-free, irrevocable
75+ copyright license to reproduce, prepare Derivative Works of,
76+ publicly display, publicly perform, sublicense, and distribute the
77+ Work and such Derivative Works in Source or Object form.
78+
79+ 3. Grant of Patent License. Subject to the terms and conditions of
80+ this License, each Contributor hereby grants to You a perpetual,
81+ worldwide, non-exclusive, no-charge, royalty-free, irrevocable
82+ (except as stated in this section) patent license to make, have made,
83+ use, offer to sell, sell, import, and otherwise transfer the Work,
84+ where such license applies only to those patent claims licensable
85+ by such Contributor that are necessarily infringed by their
86+ Contribution(s) alone or by combination of their Contribution(s)
87+ with the Work to which such Contribution(s) was submitted. If You
88+ institute patent litigation against any entity (including a
89+ cross-claim or counterclaim in a lawsuit) alleging that the Work
90+ or a Contribution incorporated within the Work constitutes direct
91+ or contributory patent infringement, then any patent licenses
92+ granted to You under this License for that Work shall terminate
93+ as of the date such litigation is filed.
94+
95+ 4. Redistribution. You may reproduce and distribute copies of the
96+ Work or Derivative Works thereof in any medium, with or without
97+ modifications, and in Source or Object form, provided that You
98+ meet the following conditions:
99+
100+ (a) You must give any other recipients of the Work or
101+ Derivative Works a copy of this License; and
102+
103+ (b) You must cause any modified files to carry prominent notices
104+ stating that You changed the files; and
105+
106+ (c) You must retain, in the Source form of any Derivative Works
107+ that You distribute, all copyright, patent, trademark, and
108+ attribution notices from the Source form of the Work,
109+ excluding those notices that do not pertain to any part of
110+ the Derivative Works; and
111+
112+ (d) If the Work includes a "NOTICE" text file as part of its
113+ distribution, then any Derivative Works that You distribute must
114+ include a readable copy of the attribution notices contained
115+ within such NOTICE file, excluding those notices that do not
116+ pertain to any part of the Derivative Works, in at least one
117+ of the following places: within a NOTICE text file distributed
118+ as part of the Derivative Works; within the Source form or
119+ documentation, if provided along with the Derivative Works; or,
120+ within a display generated by the Derivative Works, if and
121+ wherever such third-party notices normally appear. The contents
122+ of the NOTICE file are for informational purposes only and
123+ do not modify the License. You may add Your own attribution
124+ notices within Derivative Works that You distribute, alongside
125+ or as an addendum to the NOTICE text from the Work, provided
126+ that such additional attribution notices cannot be construed
127+ as modifying the License.
128+
129+ You may add Your own copyright statement to Your modifications and
130+ may provide additional or different license terms and conditions
131+ for use, reproduction, or distribution of Your modifications, or
132+ for any such Derivative Works as a whole, provided Your use,
133+ reproduction, and distribution of the Work otherwise complies with
134+ the conditions stated in this License.
135+
136+ 5. Submission of Contributions. Unless You explicitly state otherwise,
137+ any Contribution intentionally submitted for inclusion in the Work
138+ by You to the Licensor shall be under the terms and conditions of
139+ this License, without any additional terms or conditions.
140+ Notwithstanding the above, nothing herein shall supersede or modify
141+ the terms of any separate license agreement you may have executed
142+ with Licensor regarding such Contributions.
143+
144+ 6. Trademarks. This License does not grant permission to use the trade
145+ names, trademarks, service marks, or product names of the Licensor,
146+ except as required for reasonable and customary use in describing the
147+ origin of the Work and reproducing the content of the NOTICE file.
148+
149+ 7. Disclaimer of Warranty. Unless required by applicable law or
150+ agreed to in writing, Licensor provides the Work (and each
151+ Contributor provides its Contributions) on an "AS IS" BASIS,
152+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
153+ implied, including, without limitation, any warranties or conditions
154+ of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
155+ PARTICULAR PURPOSE. You are solely responsible for determining the
156+ appropriateness of using or redistributing the Work and assume any
157+ risks associated with Your exercise of permissions under this License.
158+
159+ 8. Limitation of Liability. In no event and under no legal theory,
160+ whether in tort (including negligence), contract, or otherwise,
161+ unless required by applicable law (such as deliberate and grossly
162+ negligent acts) or agreed to in writing, shall any Contributor be
163+ liable to You for damages, including any direct, indirect, special,
164+ incidental, or consequential damages of any character arising as a
165+ result of this License or out of the use or inability to use the
166+ Work (including but not limited to damages for loss of goodwill,
167+ work stoppage, computer failure or malfunction, or any and all
168+ other commercial damages or losses), even if such Contributor
169+ has been advised of the possibility of such damages.
170+
171+ 9. Accepting Warranty or Additional Liability. While redistributing
172+ the Work or Derivative Works thereof, You may choose to offer,
173+ and charge a fee for, acceptance of support, warranty, indemnity,
174+ or other liability obligations and/or rights consistent with this
175+ License. However, in accepting such obligations, You may act only
176+ on Your own behalf and on Your sole responsibility, not on behalf
177+ of any other Contributor, and only if You agree to indemnify,
178+ defend, and hold each Contributor harmless for any liability
179+ incurred by, or claims asserted against, such Contributor by reason
180+ of your accepting any such warranty or additional liability.
181+
182+ END OF TERMS AND CONDITIONS
183+
184+ APPENDIX: How to apply the Apache License to your work.
185+
186+ To apply the Apache License to your work, attach the following
187+ boilerplate notice, with the fields enclosed by brackets "[]"
188+ replaced with your own identifying information. (Don't include
189+ the brackets!) The text should be enclosed in the appropriate
190+ comment syntax for the file format. We also recommend that a
191+ file or class name and description of purpose be included on the
192+ same "printed page" as the copyright notice for easier
193+ identification within third-party archives.
194+
195+ Copyright [yyyy] [name of copyright owner]
196+
197+ Licensed under the Apache License, Version 2.0 (the "License");
198+ you may not use this file except in compliance with the License.
199+ You may obtain a copy of the License at
200+
201+ http://www.apache.org/licenses/LICENSE-2.0
202+
203+ Unless required by applicable law or agreed to in writing, software
204+ distributed under the License is distributed on an "AS IS" BASIS,
205+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
206+ See the License for the specific language governing permissions and
207+ limitations under the License.
208\ No newline at end of file
209diff --git a/lib/charms/grafana_agent/v0/cos_agent.py b/lib/charms/grafana_agent/v0/cos_agent.py
210new file mode 100644
211index 0000000..259a901
212--- /dev/null
213+++ b/lib/charms/grafana_agent/v0/cos_agent.py
214@@ -0,0 +1,819 @@
215+# Copyright 2023 Canonical Ltd.
216+# See LICENSE file for licensing details.
217+
218+r"""## Overview.
219+
220+This library can be used to manage the cos_agent relation interface:
221+
222+- `COSAgentProvider`: Use in machine charms that need to have a workload's metrics
223+ or logs scraped, or forward rule files or dashboards to Prometheus, Loki or Grafana through
224+ the Grafana Agent machine charm.
225+
226+- `COSAgentConsumer`: Used in the Grafana Agent machine charm to manage the requirer side of
227+ the `cos_agent` interface.
228+
229+
230+## COSAgentProvider Library Usage
231+
232+Grafana Agent machine Charmed Operator interacts with its clients using the cos_agent library.
233+Charms seeking to send telemetry must do so using the `COSAgentProvider` object from
234+this charm library.
235+
236+Using the `COSAgentProvider` object only requires instantiating it,
237+typically in the `__init__` method of your charm (the one which sends telemetry).
238+
239+The constructor of `COSAgentProvider` has only one required and nine optional parameters:
240+
241+```python
242+ def __init__(
243+ self,
244+ charm: CharmType,
245+ relation_name: str = DEFAULT_RELATION_NAME,
246+ metrics_endpoints: Optional[List[_MetricsEndpointDict]] = None,
247+ metrics_rules_dir: str = "./src/prometheus_alert_rules",
248+ logs_rules_dir: str = "./src/loki_alert_rules",
249+ recurse_rules_dirs: bool = False,
250+ log_slots: Optional[List[str]] = None,
251+ dashboard_dirs: Optional[List[str]] = None,
252+ refresh_events: Optional[List] = None,
253+ scrape_configs: Optional[Union[List[Dict], Callable]] = None,
254+ ):
255+```
256+
257+### Parameters
258+
259+- `charm`: The instance of the charm that instantiates `COSAgentProvider`, typically `self`.
260+
261+- `relation_name`: If your charmed operator uses a relation name other than `cos-agent` to use
262+ the `cos_agent` interface, this is where you have to specify that.
263+
264+- `metrics_endpoints`: In this parameter you can specify the metrics endpoints that Grafana Agent
265+ machine Charmed Operator will scrape. The configs of this list will be merged with the configs
266+ from `scrape_configs`.
267+
268+- `metrics_rules_dir`: The directory in which the Charmed Operator stores its metrics alert rules
269+ files.
270+
271+- `logs_rules_dir`: The directory in which the Charmed Operator stores its logs alert rules files.
272+
273+- `recurse_rules_dirs`: This parameter sets whether the Grafana Agent machine Charmed Operator
274+  should search the previous two directories recursively for alert rule files.
275+
276+- `log_slots`: Snap slots to connect to for scraping logs in the form ["snap-name:slot", ...].
277+
278+- `dashboard_dirs`: List of directories where the dashboards are stored in the Charmed Operator.
279+
280+- `refresh_events`: List of events on which to refresh relation data.
281+
282+- `scrape_configs`: List of standard scrape_configs dicts or a callable that returns the list in
283+ case the configs need to be generated dynamically. The contents of this list will be merged
284+ with the configs from `metrics_endpoints`.
285+
286+
287+### Example 1 - Minimal instrumentation:
288+
289+In order to use this object the following should be in the `charm.py` file.
290+
291+```python
292+from charms.grafana_agent.v0.cos_agent import COSAgentProvider
293+...
294+class TelemetryProviderCharm(CharmBase):
295+ def __init__(self, *args):
296+ ...
297+ self._grafana_agent = COSAgentProvider(self)
298+```
299+
300+### Example 2 - Full instrumentation:
301+
302+In order to use this object the following should be in the `charm.py` file.
303+
304+```python
305+from charms.grafana_agent.v0.cos_agent import COSAgentProvider
306+...
307+class TelemetryProviderCharm(CharmBase):
308+ def __init__(self, *args):
309+ ...
310+ self._grafana_agent = COSAgentProvider(
311+ self,
312+ relation_name="custom-cos-agent",
313+ metrics_endpoints=[
314+ # specify "path" and "port" to scrape from localhost
315+ {"path": "/metrics", "port": 9000},
316+ {"path": "/metrics", "port": 9001},
317+ {"path": "/metrics", "port": 9002},
318+ ],
319+ metrics_rules_dir="./src/alert_rules/prometheus",
320+ logs_rules_dir="./src/alert_rules/loki",
321+            recurse_rules_dirs=True,
322+ log_slots=["my-app:slot"],
323+ dashboard_dirs=["./src/dashboards_1", "./src/dashboards_2"],
324+ refresh_events=["update-status", "upgrade-charm"],
325+ scrape_configs=[
326+ {
327+ "job_name": "custom_job",
328+ "metrics_path": "/metrics",
329+ "authorization": {"credentials": "bearer-token"},
330+ "static_configs": [
331+ {
332+                        "targets": ["localhost:9003"],
333+ "labels": {"key": "value"},
334+ },
335+ ],
336+ },
337+ ]
338+ )
339+```
340+
341+### Example 3 - Dynamic scrape configs generation:
342+
343+Pass a function to the `scrape_configs` to decouple the generation of the configs
344+from the instantiation of the COSAgentProvider object.
345+
346+```python
347+from charms.grafana_agent.v0.cos_agent import COSAgentProvider
348+...
349+
350+class TelemetryProviderCharm(CharmBase):
351+ def generate_scrape_configs(self):
352+ return [
353+ {
354+ "job_name": "custom",
355+ "metrics_path": "/metrics",
356+ "static_configs": [{"targets": ["localhost:9000"]}],
357+ },
358+ ]
359+
360+ def __init__(self, *args):
361+ ...
362+ self._grafana_agent = COSAgentProvider(
363+ self,
364+ scrape_configs=self.generate_scrape_configs,
365+ )
366+```
367+
368+## COSAgentConsumer Library Usage
369+
370+This object may be used by any Charmed Operator which gathers telemetry data by
371+implementing the consumer side of the `cos_agent` interface.
372+For instance, the Grafana Agent machine Charmed Operator.
373+
374+For this purpose the charm needs to instantiate the `COSAgentConsumer` object with one mandatory
375+and two optional arguments.
376+
377+### Parameters
378+
379+- `charm`: A reference to the parent (Grafana Agent machine) charm.
380+
381+- `relation_name`: The name of the relation that the charm uses to interact
382+  with its clients that provide telemetry data using the `COSAgentProvider` object.
383+
384+ If provided, this relation name must match a provided relation in metadata.yaml with the
385+ `cos_agent` interface.
386+ The default value of this argument is "cos-agent".
387+
388+- `refresh_events`: List of events on which to refresh relation data.
389+
390+
391+### Example 1 - Minimal instrumentation:
392+
393+In order to use this object the following should be in the `charm.py` file.
394+
395+```python
396+from charms.grafana_agent.v0.cos_agent import COSAgentRequirer
397+...
398+class GrafanaAgentMachineCharm(GrafanaAgentCharm):
399+ def __init__(self, *args):
400+ ...
401+ self._cos = COSAgentRequirer(self)
402+```
403+
404+
405+### Example 2 - Full instrumentation:
406+
407+In order to use this object the following should be in the `charm.py` file.
408+
409+```python
410+from charms.grafana_agent.v0.cos_agent import COSAgentRequirer
411+...
412+class GrafanaAgentMachineCharm(GrafanaAgentCharm):
413+ def __init__(self, *args):
414+ ...
415+ self._cos = COSAgentRequirer(
416+ self,
417+ relation_name="cos-agent-consumer",
418+ refresh_events=["update-status", "upgrade-charm"],
419+ )
420+```
421+"""
422+
423+import json
424+import logging
425+from collections import namedtuple
426+from itertools import chain
427+from pathlib import Path
428+from typing import TYPE_CHECKING, Any, Callable, ClassVar, Dict, List, Optional, Set, Union
429+
430+import pydantic
431+from cosl import GrafanaDashboard, JujuTopology
432+from cosl.rules import AlertRules
433+from ops.charm import RelationChangedEvent
434+from ops.framework import EventBase, EventSource, Object, ObjectEvents
435+from ops.model import Relation, Unit
436+from ops.testing import CharmType
437+
438+if TYPE_CHECKING:
439+ try:
440+ from typing import TypedDict
441+
442+ class _MetricsEndpointDict(TypedDict):
443+ path: str
444+ port: int
445+
446+ except ModuleNotFoundError:
447+ _MetricsEndpointDict = Dict # pyright: ignore
448+
449+LIBID = "dc15fa84cef84ce58155fb84f6c6213a"
450+LIBAPI = 0
451+LIBPATCH = 7
452+
453+PYDEPS = ["cosl", "pydantic < 2"]
454+
455+DEFAULT_RELATION_NAME = "cos-agent"
456+DEFAULT_PEER_RELATION_NAME = "peers"
457+DEFAULT_SCRAPE_CONFIG = {
458+ "static_configs": [{"targets": ["localhost:80"]}],
459+ "metrics_path": "/metrics",
460+}
461+
462+logger = logging.getLogger(__name__)
463+SnapEndpoint = namedtuple("SnapEndpoint", "owner, name")
464+
465+
466+class CosAgentProviderUnitData(pydantic.BaseModel):
467+ """Unit databag model for `cos-agent` relation."""
468+
469+ # The following entries are the same for all units of the same principal.
470+ # Note that the same grafana agent subordinate may be related to several apps.
471+ # this needs to make its way to the gagent leader
472+ metrics_alert_rules: dict
473+ log_alert_rules: dict
474+ dashboards: List[GrafanaDashboard]
475+ subordinate: Optional[bool]
476+
477+ # The following entries may vary across units of the same principal app.
478+ # this data does not need to be forwarded to the gagent leader
479+ metrics_scrape_jobs: List[Dict]
480+ log_slots: List[str]
481+
482+ # when this whole datastructure is dumped into a databag, it will be nested under this key.
483+ # while not strictly necessary (we could have it 'flattened out' into the databag),
484+ # this simplifies working with the model.
485+ KEY: ClassVar[str] = "config"
486+
487+
488+class CosAgentPeersUnitData(pydantic.BaseModel):
489+ """Unit databag model for `peers` cos-agent machine charm peer relation."""
490+
491+ # We need the principal unit name and relation metadata to be able to render identifiers
492+ # (e.g. topology) on the leader side, after all the data moves into peer data (the grafana
493+ # agent leader can only see its own principal, because it is a subordinate charm).
494+ principal_unit_name: str
495+ principal_relation_id: str
496+ principal_relation_name: str
497+
498+ # The only data that is forwarded to the leader is data that needs to go into the app databags
499+ # of the outgoing o11y relations.
500+ metrics_alert_rules: Optional[dict]
501+ log_alert_rules: Optional[dict]
502+ dashboards: Optional[List[GrafanaDashboard]]
503+
504+ # when this whole datastructure is dumped into a databag, it will be nested under this key.
505+ # while not strictly necessary (we could have it 'flattened out' into the databag),
506+ # this simplifies working with the model.
507+ KEY: ClassVar[str] = "config"
508+
509+ @property
510+ def app_name(self) -> str:
511+ """Parse out the app name from the unit name.
512+
513+ TODO: Switch to using `model_post_init` when pydantic v2 is released?
514+ https://github.com/pydantic/pydantic/issues/1729#issuecomment-1300576214
515+ """
516+ return self.principal_unit_name.split("/")[0]
517+
518+
519+class COSAgentProvider(Object):
520+ """Integration endpoint wrapper for the provider side of the cos_agent interface."""
521+
522+ def __init__(
523+ self,
524+ charm: CharmType,
525+ relation_name: str = DEFAULT_RELATION_NAME,
526+ metrics_endpoints: Optional[List["_MetricsEndpointDict"]] = None,
527+ metrics_rules_dir: str = "./src/prometheus_alert_rules",
528+ logs_rules_dir: str = "./src/loki_alert_rules",
529+ recurse_rules_dirs: bool = False,
530+ log_slots: Optional[List[str]] = None,
531+ dashboard_dirs: Optional[List[str]] = None,
532+ refresh_events: Optional[List] = None,
533+ *,
534+ scrape_configs: Optional[Union[List[dict], Callable]] = None,
535+ ):
536+ """Create a COSAgentProvider instance.
537+
538+ Args:
539+ charm: The `CharmBase` instance that is instantiating this object.
540+ relation_name: The name of the relation to communicate over.
541+ metrics_endpoints: List of endpoints in the form [{"path": path, "port": port}, ...].
542+ This argument is a simplified form of the `scrape_configs`.
543+ The contents of this list will be merged with the contents of `scrape_configs`.
544+ metrics_rules_dir: Directory where the metrics rules are stored.
545+ logs_rules_dir: Directory where the logs rules are stored.
546+ recurse_rules_dirs: Whether to recurse into rule paths.
547+ log_slots: Snap slots to connect to for scraping logs
548+ in the form ["snap-name:slot", ...].
549+ dashboard_dirs: Directory where the dashboards are stored.
550+ refresh_events: List of events on which to refresh relation data.
551+ scrape_configs: List of standard scrape_configs dicts or a callable
552+ that returns the list in case the configs need to be generated dynamically.
553+ The contents of this list will be merged with the contents of `metrics_endpoints`.
554+ """
555+ super().__init__(charm, relation_name)
556+ dashboard_dirs = dashboard_dirs or ["./src/grafana_dashboards"]
557+
558+ self._charm = charm
559+ self._relation_name = relation_name
560+ self._metrics_endpoints = metrics_endpoints or []
561+ self._scrape_configs = scrape_configs or []
562+ self._metrics_rules = metrics_rules_dir
563+ self._logs_rules = logs_rules_dir
564+ self._recursive = recurse_rules_dirs
565+ self._log_slots = log_slots or []
566+ self._dashboard_dirs = dashboard_dirs
567+ self._refresh_events = refresh_events or [self._charm.on.config_changed]
568+
569+ events = self._charm.on[relation_name]
570+ self.framework.observe(events.relation_joined, self._on_refresh)
571+ self.framework.observe(events.relation_changed, self._on_refresh)
572+ for event in self._refresh_events:
573+ self.framework.observe(event, self._on_refresh)
574+
575+ def _on_refresh(self, event):
576+ """Trigger the class to update relation data."""
577+ relations = self._charm.model.relations[self._relation_name]
578+
579+ for relation in relations:
580+ # Before a principal is related to the grafana-agent subordinate, we'd get
581+ # ModelError: ERROR cannot read relation settings: unit "zk/2": settings not found
582+ # Add a guard to make sure it doesn't happen.
583+ if relation.data and self._charm.unit in relation.data:
584+ # Subordinate relations can communicate only over unit data.
585+ try:
586+ data = CosAgentProviderUnitData(
587+ metrics_alert_rules=self._metrics_alert_rules,
588+ log_alert_rules=self._log_alert_rules,
589+ dashboards=self._dashboards,
590+ metrics_scrape_jobs=self._scrape_jobs,
591+ log_slots=self._log_slots,
592+ subordinate=self._charm.meta.subordinate,
593+ )
594+ relation.data[self._charm.unit][data.KEY] = data.json()
595+ except (
596+ pydantic.ValidationError,
597+ json.decoder.JSONDecodeError,
598+ ) as e:
599+ logger.error("Invalid relation data provided: %s", e)
600+
601+ @property
602+ def _scrape_jobs(self) -> List[Dict]:
603+ """Return a prometheus_scrape-like data structure for jobs.
604+
605+ https://prometheus.io/docs/prometheus/latest/configuration/configuration/#scrape_config
606+ """
607+ if callable(self._scrape_configs):
608+ scrape_configs = self._scrape_configs()
609+ else:
610+ # Create a copy of the user scrape_configs, since we will mutate this object
611+ scrape_configs = self._scrape_configs.copy()
612+
613+ # Convert "metrics_endpoints" to standard scrape_configs, and add them in
614+ for endpoint in self._metrics_endpoints:
615+ scrape_configs.append(
616+ {
617+ "metrics_path": endpoint["path"],
618+ "static_configs": [{"targets": [f"localhost:{endpoint['port']}"]}],
619+ }
620+ )
621+
622+ scrape_configs = scrape_configs or [DEFAULT_SCRAPE_CONFIG]
623+
624+ # Augment job name to include the app name and a unique id (index)
625+ for idx, scrape_config in enumerate(scrape_configs):
626+ scrape_config["job_name"] = "_".join(
627+ [self._charm.app.name, str(idx), scrape_config.get("job_name", "default")]
628+ )
629+
630+ return scrape_configs
631+
632+ @property
633+ def _metrics_alert_rules(self) -> Dict:
634+ """Use (for now) the prometheus_scrape AlertRules to initialize this."""
635+ alert_rules = AlertRules(
636+ query_type="promql", topology=JujuTopology.from_charm(self._charm)
637+ )
638+ alert_rules.add_path(self._metrics_rules, recursive=self._recursive)
639+ return alert_rules.as_dict()
640+
641+ @property
642+ def _log_alert_rules(self) -> Dict:
643+ """Use (for now) the loki_push_api AlertRules to initialize this."""
644+ alert_rules = AlertRules(query_type="logql", topology=JujuTopology.from_charm(self._charm))
645+ alert_rules.add_path(self._logs_rules, recursive=self._recursive)
646+ return alert_rules.as_dict()
647+
648+ @property
649+ def _dashboards(self) -> List[GrafanaDashboard]:
650+ dashboards: List[GrafanaDashboard] = []
651+ for d in self._dashboard_dirs:
652+ for path in Path(d).glob("*"):
653+ dashboard = GrafanaDashboard._serialize(path.read_bytes())
654+ dashboards.append(dashboard)
655+ return dashboards
656+
657+
658+class COSAgentDataChanged(EventBase):
659+ """Event emitted by `COSAgentRequirer` when relation data changes."""
660+
661+
662+class COSAgentValidationError(EventBase):
663+ """Event emitted by `COSAgentRequirer` when there is an error in the relation data."""
664+
665+ def __init__(self, handle, message: str = ""):
666+ super().__init__(handle)
667+ self.message = message
668+
669+ def snapshot(self) -> Dict:
670+ """Save COSAgentValidationError source information."""
671+ return {"message": self.message}
672+
673+ def restore(self, snapshot):
674+ """Restore COSAgentValidationError source information."""
675+ self.message = snapshot["message"]
676+
677+
678+class COSAgentRequirerEvents(ObjectEvents):
679+ """`COSAgentRequirer` events."""
680+
681+ data_changed = EventSource(COSAgentDataChanged)
682+ validation_error = EventSource(COSAgentValidationError)
683+
684+
685+class MultiplePrincipalsError(Exception):
686+ """Custom exception for when there are multiple principal applications."""
687+
688+ pass
689+
690+
691+class COSAgentRequirer(Object):
692+ """Integration endpoint wrapper for the Requirer side of the cos_agent interface."""
693+
694+ on = COSAgentRequirerEvents() # pyright: ignore
695+
696+ def __init__(
697+ self,
698+ charm: CharmType,
699+ *,
700+ relation_name: str = DEFAULT_RELATION_NAME,
701+ peer_relation_name: str = DEFAULT_PEER_RELATION_NAME,
702+ refresh_events: Optional[List[str]] = None,
703+ ):
704+ """Create a COSAgentRequirer instance.
705+
706+ Args:
707+ charm: The `CharmBase` instance that is instantiating this object.
708+ relation_name: The name of the relation to communicate over.
709+ peer_relation_name: The name of the peer relation to communicate over.
710+ refresh_events: List of events on which to refresh relation data.
711+ """
712+ super().__init__(charm, relation_name)
713+ self._charm = charm
714+ self._relation_name = relation_name
715+ self._peer_relation_name = peer_relation_name
716+ self._refresh_events = refresh_events or [self._charm.on.config_changed]
717+
718+ events = self._charm.on[relation_name]
719+ self.framework.observe(
720+ events.relation_joined, self._on_relation_data_changed
721+ ) # TODO: do we need this?
722+ self.framework.observe(events.relation_changed, self._on_relation_data_changed)
723+ for event in self._refresh_events:
724+ self.framework.observe(event, self.trigger_refresh) # pyright: ignore
725+
726+ # Peer relation events
727+ # A peer relation is needed as it is the only mechanism for exchanging data across
728+ # subordinate units.
729+ # self.framework.observe(
730+ # self.on[self._peer_relation_name].relation_joined, self._on_peer_relation_joined
731+ # )
732+ peer_events = self._charm.on[peer_relation_name]
733+ self.framework.observe(peer_events.relation_changed, self._on_peer_relation_changed)
734+
735+ @property
736+ def peer_relation(self) -> Optional["Relation"]:
737+ """Helper function for obtaining the peer relation object.
738+
739+ Returns: peer relation object
740+ (NOTE: would return None if called too early, e.g. during install).
741+ """
742+ return self.model.get_relation(self._peer_relation_name)
743+
744+ def _on_peer_relation_changed(self, _):
745+ # Peer data is used for forwarding data from principal units to the grafana agent
746+ # subordinate leader, for updating the app data of the outgoing o11y relations.
747+ if self._charm.unit.is_leader():
748+ self.on.data_changed.emit() # pyright: ignore
749+
750+ def _on_relation_data_changed(self, event: RelationChangedEvent):
751+ # Peer data is the only means of communication between subordinate units.
752+ if not self.peer_relation:
753+ event.defer()
754+ return
755+
756+ cos_agent_relation = event.relation
757+ if not event.unit or not cos_agent_relation.data.get(event.unit):
758+ return
759+ principal_unit = event.unit
760+
761+ # Coherence check
762+ units = cos_agent_relation.units
763+ if len(units) > 1:
764+ # should never happen
765+ raise ValueError(
766+ f"unexpected error: subordinate relation {cos_agent_relation} "
767+ f"should have exactly one unit"
768+ )
769+
770+ if not (raw := cos_agent_relation.data[principal_unit].get(CosAgentProviderUnitData.KEY)):
771+ return
772+
773+ if not (provider_data := self._validated_provider_data(raw)):
774+ return
775+
776+ # Copy data from the principal relation to the peer relation, so the leader could
777+ # follow up.
778+ # Save the originating unit name, so it could be used for topology later on by the leader.
779+ data = CosAgentPeersUnitData( # peer relation databag model
780+ principal_unit_name=event.unit.name,
781+ principal_relation_id=str(event.relation.id),
782+ principal_relation_name=event.relation.name,
783+ metrics_alert_rules=provider_data.metrics_alert_rules,
784+ log_alert_rules=provider_data.log_alert_rules,
785+ dashboards=provider_data.dashboards,
786+ )
787+ self.peer_relation.data[self._charm.unit][
788+ f"{CosAgentPeersUnitData.KEY}-{event.unit.name}"
789+ ] = data.json()
790+
791+ # We can't easily tell if the data that was changed is limited to only the data
792+ # that goes into peer relation (in which case, if this is not a leader unit, we wouldn't
793+ # need to emit `on.data_changed`), so we're emitting `on.data_changed` either way.
794+ self.on.data_changed.emit() # pyright: ignore
795+
796+ def _validated_provider_data(self, raw) -> Optional[CosAgentProviderUnitData]:
797+ try:
798+ return CosAgentProviderUnitData(**json.loads(raw))
799+ except (pydantic.ValidationError, json.decoder.JSONDecodeError) as e:
800+ self.on.validation_error.emit(message=str(e)) # pyright: ignore
801+ return None
802+
803+ def trigger_refresh(self, _):
804+ """Trigger a refresh of relation data."""
805+ # FIXME: Figure out what we should do here
806+ self.on.data_changed.emit() # pyright: ignore
807+
808+ @property
809+ def _principal_unit(self) -> Optional[Unit]:
810+ """Return the principal unit for a relation.
811+
812+ Assumes that the relation is of type subordinate.
813+ Relies on the fact that, for subordinate relations, the only remote unit visible to
814+ *this unit* is the principal unit that this unit is attached to.
815+ """
816+ if relations := self._principal_relations:
817+ # Technically it's a list, but for subordinates there can only be one relation
818+ principal_relation = next(iter(relations))
819+ if units := principal_relation.units:
820+ # Technically it's a list, but for subordinates there can only be one
821+ return next(iter(units))
822+
823+ return None
824+
825+ @property
826+ def _principal_relations(self):
827+ relations = []
828+ for relation in self._charm.model.relations[self._relation_name]:
829+ if not json.loads(relation.data[next(iter(relation.units))]["config"]).get(
830+            "subordinate", False
831+ ):
832+ relations.append(relation)
833+ if len(relations) > 1:
834+ logger.error(
835+ "Multiple applications claiming to be principal. Update the cos-agent library in the client application charms."
836+ )
837+ raise MultiplePrincipalsError("Multiple principal applications.")
838+ return relations
839+
840+ @property
841+ def _remote_data(self) -> List[CosAgentProviderUnitData]:
842+ """Return a list of remote data from each of the related units.
843+
844+ Assumes that the relation is of type subordinate.
845+ Relies on the fact that, for subordinate relations, the only remote unit visible to
846+ *this unit* is the principal unit that this unit is attached to.
847+ """
848+ all_data = []
849+
850+ for relation in self._charm.model.relations[self._relation_name]:
851+ if not relation.units:
852+ continue
853+ unit = next(iter(relation.units))
854+ if not (raw := relation.data[unit].get(CosAgentProviderUnitData.KEY)):
855+ continue
856+ if not (provider_data := self._validated_provider_data(raw)):
857+ continue
858+ all_data.append(provider_data)
859+
860+ return all_data
861+
862+ def _gather_peer_data(self) -> List[CosAgentPeersUnitData]:
863+ """Collect data from the peers.
864+
865+ Returns a trimmed-down list of CosAgentPeersUnitData.
866+ """
867+ relation = self.peer_relation
868+
869+ # Ensure that whatever context we're running this in, we take the necessary precautions:
870+ if not relation or not relation.data or not relation.app:
871+ return []
872+
873+ # Iterate over all peer unit data and only collect every principal once.
874+ peer_data: List[CosAgentPeersUnitData] = []
875+ app_names: Set[str] = set()
876+
877+ for unit in chain((self._charm.unit,), relation.units):
878+ if not relation.data.get(unit):
879+ continue
880+
881+ for unit_name in relation.data.get(unit): # pyright: ignore
882+ if not unit_name.startswith(CosAgentPeersUnitData.KEY):
883+ continue
884+ raw = relation.data[unit].get(unit_name)
885+ if raw is None:
886+ continue
887+ data = CosAgentPeersUnitData(**json.loads(raw))
888+ # Have we already seen this principal app?
889+ if (app_name := data.app_name) in app_names:
890+ continue
891+ peer_data.append(data)
892+ app_names.add(app_name)
893+
894+ return peer_data
895+
896+ @property
897+ def metrics_alerts(self) -> Dict[str, Any]:
898+ """Fetch metrics alerts."""
899+ alert_rules = {}
900+
901+ seen_apps: List[str] = []
902+ for data in self._gather_peer_data():
903+ if rules := data.metrics_alert_rules:
904+ app_name = data.app_name
905+ if app_name in seen_apps:
906+ continue # dedup!
907+ seen_apps.append(app_name)
908+ # This is only used for naming the file, so be as specific as we can be
909+ identifier = JujuTopology(
910+ model=self._charm.model.name,
911+ model_uuid=self._charm.model.uuid,
912+ application=app_name,
913+ # For the topology unit, we could use `data.principal_unit_name`, but that unit
914+ # name may not be very stable: `_gather_peer_data` de-duplicates by app name so
915+ # the exact unit name that turns up first in the iterator may vary from time to
916+ # time. So using the grafana-agent unit name instead.
917+ unit=self._charm.unit.name,
918+ ).identifier
919+
920+ alert_rules[identifier] = rules
921+
922+ return alert_rules
923+
924+ @property
925+ def metrics_jobs(self) -> List[Dict]:
926+ """Parse the relation data contents and extract the metrics jobs."""
927+ scrape_jobs = []
928+ for data in self._remote_data:
929+ for job in data.metrics_scrape_jobs:
930+ # In #220, relation schema changed from a simplified dict to the standard
931+ # `scrape_configs`.
932+ # This is to ensure backwards compatibility with Providers older than v0.5.
933+ if "path" in job and "port" in job and "job_name" in job:
934+ job = {
935+ "job_name": job["job_name"],
936+ "metrics_path": job["path"],
937+ "static_configs": [{"targets": [f"localhost:{job['port']}"]}],
938+ # We include insecure_skip_verify because we are always scraping localhost.
939+ # Even if we have the certs for the scrape targets, we'd rather specify the scrape
940+ # jobs with localhost rather than the SAN DNS the cert was issued for.
941+ "tls_config": {"insecure_skip_verify": True},
942+ }
943+
944+ scrape_jobs.append(job)
945+
946+ return scrape_jobs
947+
948+ @property
949+ def snap_log_endpoints(self) -> List[SnapEndpoint]:
950+ """Fetch logging endpoints exposed by related snaps."""
951+ plugs = []
952+ for data in self._remote_data:
953+ targets = data.log_slots
954+ if targets:
955+ for target in targets:
956+ if target in plugs:
957+ logger.warning(
958+ f"plug {target} already listed. "
959+ "The same snap is being passed from multiple "
960+ "endpoints; this should not happen."
961+ )
962+ else:
963+ plugs.append(target)
964+
965+ endpoints = []
966+ for plug in plugs:
967+ if ":" not in plug:
968+ logger.error(f"invalid plug definition received: {plug}. Ignoring...")
969+ else:
970+ endpoint = SnapEndpoint(*plug.split(":"))
971+ endpoints.append(endpoint)
972+ return endpoints
973+
974+ @property
975+ def logs_alerts(self) -> Dict[str, Any]:
976+ """Fetch log alerts."""
977+ alert_rules = {}
978+ seen_apps: List[str] = []
979+
980+ for data in self._gather_peer_data():
981+ if rules := data.log_alert_rules:
982+ # This is only used for naming the file, so be as specific as we can be
983+ app_name = data.app_name
984+ if app_name in seen_apps:
985+ continue # dedup!
986+ seen_apps.append(app_name)
987+
988+ identifier = JujuTopology(
989+ model=self._charm.model.name,
990+ model_uuid=self._charm.model.uuid,
991+ application=app_name,
992+ # For the topology unit, we could use `data.principal_unit_name`, but that unit
993+ # name may not be very stable: `_gather_peer_data` de-duplicates by app name so
994+ # the exact unit name that turns up first in the iterator may vary from time to
995+ # time. So using the grafana-agent unit name instead.
996+ unit=self._charm.unit.name,
997+ ).identifier
998+
999+ alert_rules[identifier] = rules
1000+
1001+ return alert_rules
1002+
1003+ @property
1004+ def dashboards(self) -> List[Dict[str, str]]:
1005+ """Fetch dashboards as encoded content.
1006+
1007+ Dashboards are assumed not to vary across units of the same primary.
1008+ """
1009+ dashboards: List[Dict[str, Any]] = []
1010+
1011+ seen_apps: List[str] = []
1012+ for data in self._gather_peer_data():
1013+ app_name = data.app_name
1014+ if app_name in seen_apps:
1015+ continue # dedup!
1016+ seen_apps.append(app_name)
1017+
1018+ for encoded_dashboard in data.dashboards or ():
1019+ content = GrafanaDashboard(encoded_dashboard)._deserialize()
1020+
1021+ title = content.get("title", "no_title")
1022+
1023+ dashboards.append(
1024+ {
1025+ "relation_id": data.principal_relation_id,
1026+ # We have the remote charm name - use it for the identifier
1027+ "charm": f"{data.principal_relation_name}-{app_name}",
1028+ "content": content,
1029+ "title": title,
1030+ }
1031+ )
1032+
1033+ return dashboards
1034diff --git a/metadata.yaml b/metadata.yaml
1035index efc7563..61ddaad 100644
1036--- a/metadata.yaml
1037+++ b/metadata.yaml
1038@@ -31,6 +31,8 @@ provides:
1039 nrpe-external-master:
1040 interface: nrpe-external-master
1041 scope: container
1042+ cos-agent:
1043+ interface: cos_agent
1044
1045 peers:
1046 replicas:
1047diff --git a/requirements-dev.txt b/requirements-dev.txt
1048index 4f2a3f5..671f33c 100644
1049--- a/requirements-dev.txt
1050+++ b/requirements-dev.txt
1051@@ -1,3 +1,7 @@
1052 -r requirements.txt
1053 coverage
1054 flake8
1055+
1056+# Grafana Agent Library
1057+cosl
1058+pydantic < 2
1059diff --git a/src/charm.py b/src/charm.py
1060index eb3a64d..9b4f2de 100755
1061--- a/src/charm.py
1062+++ b/src/charm.py
1063@@ -24,6 +24,7 @@ from charms.operator_libs_linux.v0 import apt
1064 from charms.operator_libs_linux.v0.apt import PackageError, PackageNotFoundError
1065 from charms.operator_libs_linux.v0.passwd import group_exists, user_exists
1066 from charms.operator_libs_linux.v0.systemd import service_reload
1067+from charms.grafana_agent.v0.cos_agent import COSAgentProvider
1068
1069 from ops.charm import (
1070 ActionEvent,
1071@@ -182,6 +183,8 @@ class LandscapeServerCharm(CharmBase):
1072 self.landscape_uid = user_exists("landscape").pw_uid
1073 self.root_gid = group_exists("root").gr_gid
1074
1075+ self._grafana_agent = COSAgentProvider(self)
1076+
1077 def _on_config_changed(self, _) -> None:
1078 prev_status = self.unit.status
1079
1080diff --git a/tests/test_charm.py b/tests/test_charm.py
1081index 785d011..e44e8ab 100644
1082--- a/tests/test_charm.py
1083+++ b/tests/test_charm.py
1084@@ -24,7 +24,8 @@ from charms.operator_libs_linux.v0.apt import (
1085
1086 from charm import (
1087 DEFAULT_SERVICES, HAPROXY_CONFIG_FILE, LANDSCAPE_PACKAGES, LEADER_SERVICES, LSCTL,
1088- NRPE_D_DIR, SCHEMA_SCRIPT, HASH_ID_DATABASES, LandscapeServerCharm)
1089+ NRPE_D_DIR, SCHEMA_SCRIPT, HASH_ID_DATABASES, LandscapeServerCharm,
1090+ )
1091
1092
1093 class TestCharm(unittest.TestCase):
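
For reviewers unfamiliar with the vendored library: the backwards-compatibility shim in `COSAgentRequirer.metrics_jobs` (diff lines 930-945) rewrites the pre-v0.5 simplified job dict into a standard Prometheus `scrape_configs` entry. A minimal stand-alone sketch of that conversion — `normalize_metrics_job` is an illustrative name, not part of the library:

```python
def normalize_metrics_job(job: dict) -> dict:
    """Return a standard scrape_configs entry for a legacy {path, port, job_name} dict.

    Jobs already in scrape_configs form are passed through unchanged, mirroring
    the behavior of COSAgentRequirer.metrics_jobs.
    """
    if "path" in job and "port" in job and "job_name" in job:
        return {
            "job_name": job["job_name"],
            "metrics_path": job["path"],
            # The agent always scrapes localhost on the advertised port.
            "static_configs": [{"targets": [f"localhost:{job['port']}"]}],
            # Skip TLS hostname verification since the target is localhost,
            # not the SAN DNS name any cert would have been issued for.
            "tls_config": {"insecure_skip_verify": True},
        }
    return job


legacy = {"job_name": "landscape", "path": "/metrics", "port": 9090}
normalized = normalize_metrics_job(legacy)
```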
