Merge ~rharding/charm-grafana:add-mysql-datasource into ~prometheus-charmers/charm-grafana:master

Proposed by Richard Harding
Status: Superseded
Proposed branch: ~rharding/charm-grafana:add-mysql-datasource
Merge into: ~prometheus-charmers/charm-grafana:master
Diff against target: 204 lines (+108/-35)
1 file modified
reactive/grafana.py (+108/-35)
Reviewer Review Type Date Requested Status
Jamon Camisso (community) Needs Fixing
Paul Collins Needs Fixing
Tom Haddon Needs Fixing
Review via email: mp+329855@code.launchpad.net

This proposal has been superseded by a proposal from 2017-10-18.

Description of the change

This change adds support for a MySQL-based data source.

This is needed to allow building grafana dashboards on top of MySQL data that is generated as part of Juju KPIs.

In order to do this, the current grafana-source endpoint needs to be made more generic. There are also workarounds in place for issues in the current interface, which auto-prefixes URLs with http:// and pulls the IP address from the relation unit rather than having it fed to the system.

The idea is that I've got a charm that pulls log data, parses it, and stores it in a MySQL DB.

mysql -> db relation -> my application charm -> feeds the data source to grafana for dashboards.
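
For illustration, the kind of record the application charm ends up handing to grafana looks roughly like the following. The keys mirror what reactive/grafana.py in the diff below reads from the relation; the values are made-up placeholders, not a real deployment, and the exact payload depends on the interface layer:

# Hypothetical example of a mysql datasource record forwarded over the
# grafana-source relation; all values are placeholders.
datasource = {
    'type': 'mysql',
    'url': '10.0.0.42:3306',          # host:port, no http:// prefix
    'database': 'gypsydanger',        # database holding the parsed KPI data
    'username': 'grafana_ro',         # same credentials the app charm received
    'password': 'example-password',
    'description': 'Juju KPI data',   # used when building the datasource name
}
print(datasource)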

Revision history for this message
Richard Harding (rharding) wrote :

Current test to make this work:

juju deploy mysql
juju deploy cs:~rharding/gypsy-danger-1
juju relate mysql:db gypsy-danger
juju deploy grafana (this updated charm)
juju relate grafana:grafana-source gypsy-danger

Log into Grafana and see the data source. Clicking the button to verify the data source should report that it works.

https://usercontent.irccloud-cdn.com/file/3KOzJdGO/Selection_010.png
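
If you want to double-check without the UI, one option is to read the datasource straight out of Grafana's sqlite database on the grafana unit. This is just a convenience check, assuming the same database path and table that the charm code in the diff below uses:

# Run on the grafana unit: list the datasources the charm has inserted.
import sqlite3

conn = sqlite3.connect('/var/lib/grafana/grafana.db', timeout=30)
for row in conn.execute('SELECT name, type, url, is_default FROM DATA_SOURCE'):
    print(row)
conn.close()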

Revision history for this message
Richard Harding (rharding) wrote :

I also wanted to make sure the prometheus grafana-source relation was still in working order, and validated that in a GCE model:

https://usercontent.irccloud-cdn.com/file/fS9oUqzx/Selection_011.png

Revision history for this message
Tom Haddon (mthaddon) wrote :

Could we have a documentation update for this too explaining the feature?

review: Needs Fixing
Revision history for this message
Jacek Nykis (jacekn) wrote :

Thanks! It's a nice addition. I only had a quick look, but here is some early feedback and a few questions inline. We filed RT#105512 to review and test properly.

Could you clarify what the gypsy-danger charm is? Is it a mysql charm fork? Or will it proxy to the DB?

My first instinct, without knowing what gypsy-danger is, is that if you want to talk to the DB you should relate grafana to mysql.

Revision history for this message
Richard Harding (rharding) wrote :

Thanks for the initial review.

gypsy-danger is a Python project that parses the Apache log files from the charmstore that are tossed into Swift, looking for important Juju KPI data. It then loads that data into several tables in a MySQL database, which allows building queries against the parsed data.

The python code project is: https://github.com/canonicalltd/gypsy-danger
and I started a charm that will eventually deploy it and set up a cron job to fetch new Apache logs from Swift each day (updating the MySQL database with new log data daily).

I've gotten this working by hand coding/deploying at kpi.jujugui.org, where it talks to a MySQL server and the gypsy-danger Python parser code runs on its own server, so I'm working to charm this up.

If I build a new relation for mysql->grafana directly then it adds a second path to add a new grafana datasource. It also breaks the usability. My code needs a database to store parsed data from the Apache logs. It needs to be the exact same database, with the same user/credentials, that grafana needs to build a dashboard upon. In this way the theory is that:

gypsy-danger needs a database to store data in. It requests a fresh DB, username, and password from mysql. MySQL provides that, and gypsy-danger then starts to populate the database with tables and records of data. Then gypsy-danger wants a visual front end for that data and sends the data source information to grafana for visualization.
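
As a rough sketch of what the gypsy-danger side of that flow could look like with the updated grafana-source interface, something along these lines; the state names and interface methods (host(), port(), database(), provide_datasource(), etc.) are placeholders rather than the real mysql or grafana-source layer APIs:

from charms.reactive import when
from charmhelpers.core import hookenv


# Hypothetical handler in the gypsy-danger charm: once mysql has handed
# out a database and grafana is related, republish the very same
# credentials as a mysql datasource for grafana to visualise.
@when('db.available', 'grafana-source.available')
def publish_datasource(mysql, grafana):
    grafana.provide_datasource(
        source_type='mysql',
        url='{}:{}'.format(mysql.host(), mysql.port()),   # no http:// prefix
        database=mysql.database(),
        username=mysql.user(),
        password=mysql.password(),
        description='Juju KPI data',
    )
    hookenv.log('Published mysql datasource to grafana')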

In fact, one long-term item on my wishlist is that gypsy-danger could actually send over a saved dashboard to grafana, so that both the datasource and the dashboard are loaded and ready to go out of the box.

I hope that clears up what I'm trying to do with this. Let me know if I'm unclear or I've taken a dark path of wrongness that I should stay away from.

b558079... by Richard Harding

Rebase away the initial lint work

Revision history for this message
Paul Collins (pjdc) wrote :

A couple of remarks inline. As well as this, I think it'd be valuable to have a Mojo spec to deploy grafana etc. along with gypsy-danger and mysql and exercise the new grafana functionality so that we can CI it, although I don't know if that should be a blocker. There are a couple of likely-looking jobs on Mojo CI Jenkins, although neither passes currently:

https://jenkins.canonical.com/is-mojo-ci/job/live-is-charm-test-grafana-xenial/
https://jenkins.canonical.com/is-mojo-ci/job/live-is-prometheus/

review: Needs Fixing
Revision history for this message
Jacek Nykis (jacekn) wrote :

> Thanks for the initial review.

Thank you for your contribution!

> If I build a new relation for mysql->grafana directly then it adds a second
> path to add a new grafana datasource. It also breaks the usability. My code
> needs a database to store parsed data from apache logs. It needs to be the
> same exact database with the same user/credential as grafana needs to build a
> dashboard upon.

I see, if it needs to be the same DB then the way you want to do it is probably the only sensible one.

> I hope that clears up what I'm trying to do with this. Let me know if I'm
> unclear or I've taken a dark path of wrongness that I should stay away from.

Thanks, I think it's clear why a direct grafana<->mysql relation will not work.

What I'd say, though, is that it's still worth moving some of the mysql source handling logic into the grafana-source interface. mysql is a valid grafana source, and IMO it makes logical sense for the interface to handle it without the need for grafana to mangle relation data.
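
For illustration, one possible shape for doing that in the interface layer itself, rather than in the grafana charm, is sketched below. This is not the actual interface-grafana-source code; the conversation keys are assumed to match what the remote charm sets:

from charms.reactive import RelationBase, scopes


class GrafanaSourceRequires(RelationBase):
    scope = scopes.SERVICE

    def datasources(self):
        # Collect one datasource dict per related service.
        sources = []
        for conv in self.conversations():
            ds = {
                'service_name': conv.scope.split('/')[0],
                'type': conv.get_remote('type'),
                'url': conv.get_remote('url'),
                'description': conv.get_remote('description'),
            }
            # Pass the mysql-specific details straight through so the
            # grafana charm does not have to mangle relation data.
            for key in ('database', 'username', 'password'):
                value = conv.get_remote(key)
                if value:
                    ds[key] = value
            if ds['type'] and ds['url']:
                sources.append(ds)
        return sources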

Revision history for this message
Richard Harding (rharding) wrote :

The new interface MP is located here:
https://code.launchpad.net/~rharding/interface-grafana-source/+git/interface-grafana-source/+merge/330041

I will push an updated MP here shortly that supports using it.

18aa0c5... by Richard Harding

Use the updated interface layer

ec70312... by Richard Harding

Revert the status_set lint cleanup

5bc388d... by Richard Harding

More cleanup from the rebase

Revision history for this message
Richard Harding (rharding) wrote :

I've updated the interface and updated this MP, removing much of the lint cleanup previously done.

I can look into the Mojo stuff, but that'll be a longer row to hoe as I've not got that set up, and since those jobs aren't currently passing I'd prefer to add a test/support for gypsy-danger as part of something that already works to test the current Grafana/Prometheus, if required.

7fa684a... by Richard Harding

Add hook to trigger when relation broken to remove record of datasource

Revision history for this message
Richard Harding (rharding) wrote :

Updated the testing bundle to:

 juju deploy cs:~rharding/gypsy-danger-testing-2

This includes the changes that add the hook to properly remove the datasource when the relation is broken, so that it can be re-added a second time.

Tested with mysql and prometheus and verified the datasources are removed and re-added successfully.

This goes with the interface changes in:

https://code.launchpad.net/~rharding/interface-grafana-source/+git/interface-grafana-source/+merge/330041

Revision history for this message
Jamon Camisso (jamon) wrote :

Looks good to me; it is working well every way that I try it along with the grafana-source charm.

Just two tiny flake8 issues and it's ready to merge and promulgate to the charmstore.

review: Approve
e2c9dc0... by Richard Harding

Flake8 cleanup per reviewj

Revision history for this message
Richard Harding (rharding) wrote :

Fixed the flake8 issues per the review.

Revision history for this message
Jamon Camisso (jamon) wrote :

Ah, so close, but when I go to merge this I see that upstream has progressed ahead a bit in terms of admin-password, Google Analytics, etc.

Could you please rebase your branch against master and push again? Sorry about that, I didn't realize the upstream charm had those changes.

review: Needs Fixing
d78ca59... by Richard Harding

Merge branch 'master' of git+ssh://git.launchpad.net/grafana-charm into add-mysql-datasource

Unmerged commits

d78ca59... by Richard Harding

Merge branch 'master' of git+ssh://git.launchpad.net/grafana-charm into add-mysql-datasource

e2c9dc0... by Richard Harding

Flake8 cleanup per reviewj

7fa684a... by Richard Harding

Add hook to trigger when relation broken to remove record of datasource

5bc388d... by Richard Harding

More cleanup from the rebase

ec70312... by Richard Harding

Revert the status_set lint cleanup

18aa0c5... by Richard Harding

Use the updated interface layer

b558079... by Richard Harding

Rebase away the initial lint work

37eba3c... by Richard Harding

Fix the various cleanup required from the layer

031a16d... by Richard Harding

First stab at getting the mysql source to load into the db correctly

b44f4d5... by Richard Harding

Add first bit of mysql relation definition

Preview Diff

diff --git a/reactive/grafana.py b/reactive/grafana.py
index 8b3b126..f63ee5a 100644
--- a/reactive/grafana.py
+++ b/reactive/grafana.py
@@ -1,19 +1,33 @@
-import six
-import os
+import base64
+import datetime
 import glob
 import hashlib
-import datetime
-import requests
 import json
-import subprocess
-import base64
+import os
+import requests
 import shutil
-from charmhelpers import fetch
-from charmhelpers.core import host, hookenv, unitdata
-from charmhelpers.core.templating import render
+import six
+import subprocess
+
 from charmhelpers.contrib.charmsupport import nrpe
-from charms.reactive import hook, remove_state, set_state, when, when_not
-from charms.reactive.helpers import any_file_changed, data_changed, is_state
+from charmhelpers.core import (
+    hookenv,
+    host,
+    unitdata,
+)
+from charmhelpers.core.templating import render
+from charmhelpers import fetch
+from charms.reactive.helpers import (
+    any_file_changed,
+    is_state,
+)
+from charms.reactive import (
+    hook,
+    remove_state,
+    set_state,
+    when,
+    when_not,
+)
 
 import time
 
@@ -284,20 +298,38 @@ def wipe_nrpe_checks():
 @when('grafana-source.available')
 def configure_sources(relation):
     sources = relation.datasources()
-    if not data_changed('grafana.sources', sources):
-        return
     for ds in sources:
         hookenv.log('Found datasource: {}'.format(str(ds)))
         # Ensure datasource is configured
         check_datasource(ds)
 
 
-@when('grafana.started')
-@when_not('grafana-source.available')
-def sources_gone():
-    # Last datasource gone, remove as needed
-    # TODO implementation
-    pass
+@hook('grafana-source-relation-departed')
+def sources_gone(relation):
+    hookenv.log('sources gone')
+    hookenv.log(relation)
+    # Update the datasources now with the new information that something is
+    # not available.
+    conv = relation.conversation()
+    ds = {
+        'service_name': conv.scope.split('/')[0],
+        'type': conv.get_remote('type'),
+        'url': conv.get_remote('url'),
+        'description': conv.get_remote('description')
+    }
+    hookenv.log('Removing datasource:')
+    hookenv.log(ds)
+
+    if ds:
+        conn = sqlite3.connect('/var/lib/grafana/grafana.db', timeout=30)
+        cur = conn.cursor()
+        cur.execute(
+            'DELETE FROM DATA_SOURCE WHERE type=? AND url=?',
+            (ds['type'], ds['url'])
+        )
+        conn.commit()
+        cur.close()
+        relation.set_local = None
 
 
 @when('website.available')
@@ -352,14 +384,11 @@ def check_datasource(ds):
     # 'username': 'username,
     # 'password': 'password
     # }
-
     conn = sqlite3.connect('/var/lib/grafana/grafana.db', timeout=30)
     cur = conn.cursor()
     query = cur.execute('SELECT id, type, name, url, is_default FROM DATA_SOURCE')
     rows = query.fetchall()
     ds_name = '{} - {}'.format(ds['service_name'], ds['description'])
-    print(ds_name)
-    print(rows)
     for row in rows:
         if (row[1] == ds['type'] and row[2] == ds_name and row[3] == ds['url']):
             hookenv.log('Datasource already exist, updating: {}'.format(ds_name))
@@ -371,7 +400,6 @@ def check_datasource(ds):
             return
     hookenv.log('Adding new datasource: {}'.format(ds_name))
     stmt, values = generate_query(ds, 0)
-    print(stmt, values)
     cur.execute(stmt, values)
     conn.commit()
     conn.close()
@@ -379,15 +407,20 @@ def check_datasource(ds):
 
 def generate_query(ds, is_default, id=None):
     if not id:
-        stmt = 'INSERT INTO DATA_SOURCE (org_id, version, type, name' + \
-            ', access, url, is_default, created, updated, basic_auth'
-        if 'username' in ds and 'password' in ds:
-            stmt += ', basic_auth_user, basic_auth_password)' + \
-                ' VALUES (?,?,?,?,?,?,?,?,?,?,?,?)'
-        else:
-            stmt += ') VALUES (?,?,?,?,?,?,?,?,?,?)'
+        fields = [
+            'org_id',
+            'version',
+            'type',
+            'name',
+            'access',
+            'url',
+            'is_default',
+            'created',
+            'updated',
+        ]
+
         dtime = datetime.datetime.today().strftime("%F %T")
-        values = (1,
+        values = [1,
                   0,
                   ds['type'],
                   '{} - {}'.format(ds['service_name'], ds['description']),
@@ -395,11 +428,51 @@ def generate_query(ds, is_default, id=None):
                   ds['url'],
                   is_default,
                   dtime,
-                  dtime)
-        if 'username' in ds and 'password' in ds:
-            values = values + (1, ds['username'], ds['password'])
+                  dtime]
+
+        # Used in connecting a mysql source
+        if 'database' in ds:
+            hookenv.log('Adding fields for the database')
+            fields.append('database')
+            values.append(ds['database'])
+
+            if ds['type'] == 'mysql':
+                # There's only one username/password on the interface to we coopt
+                # it for the mysql login in the mysql case
+                fields.append('user')
+                values.append(ds['username'])
+                fields.append('password')
+                values.append(ds['password'])
+
+        if 'username' in ds and 'password' in ds and ds['type'] != 'msyql':
+            # We use the username and password in a backward compatible way
+            # for basic_auth when it's not a mysql data source.
+            hookenv.log('Adding basic_auth info')
+            fields.append('basic_auth')
+            fields.append('basic_auth_user')
+            fields.append('basic_auth_password')
+            values.append(1)
+            values.append(ds['username'])
+            values.append(ds['password'])
         else:
-            values = values + (0,)
+            fields.append('basic_auth')
+            values.append(0)
+
+        values_placeholders = '?, ' * len(values)
+        # Remove the space [, ] at the end of the generated string.
+        values_placeholders = values_placeholders[:-2]
+
+        stmt = """
+        INSERT INTO DATA_SOURCE ({})
+        VALUES ({})
+        """.format(
+            ", ".join(fields),
+            values_placeholders
+        )
+
+        hookenv.log('Statement is: \n{}'.format(stmt))
+        hookenv.log(values)
+
     else:
         if 'username' in ds and 'password' in ds:
             stmt = 'UPDATE DATA_SOURCE SET basic_auth_user = ?, basic_auth_password = ?, basic_auth = 1'
