Merge lp:~openerp-community/server-env-tools/7.0-modules-from-openobject-extension into lp:~server-env-tools-core-editors/server-env-tools/7.0

Status: Superseded
Proposed branch: lp:~openerp-community/server-env-tools/7.0-modules-from-openobject-extension
Merge into: lp:~server-env-tools-core-editors/server-env-tools/7.0
Diff against target: 841 lines (+764/-0)
13 files modified
base_external_dbsource/__init__.py (+24/-0)
base_external_dbsource/__openerp__.py (+61/-0)
base_external_dbsource/base_external_dbsource.py (+175/-0)
base_external_dbsource/base_external_dbsource_demo.xml (+15/-0)
base_external_dbsource/base_external_dbsource_view.xml (+54/-0)
base_external_dbsource/security/ir.model.access.csv (+2/-0)
base_external_dbsource/test/dbsource_connect.yml (+5/-0)
import_odbc/__init__.py (+24/-0)
import_odbc/__openerp__.py (+83/-0)
import_odbc/import_odbc.py (+218/-0)
import_odbc/import_odbc_demo.xml (+15/-0)
import_odbc/import_odbc_view.xml (+86/-0)
import_odbc/security/ir.model.access.csv (+2/-0)
To merge this branch: bzr merge lp:~openerp-community/server-env-tools/7.0-modules-from-openobject-extension
Reviewer Review Type Date Requested Status
Yannick Vaucher @ Camptocamp code review, no tests Needs Fixing
Nicolas Bessi - Camptocamp (community) code review, no tests Needs Fixing
Review via email: mp+183539@code.launchpad.net

This proposal has been superseded by a proposal from 2014-01-07.

Description of the change

[ADD] import_odbc and base_external_dbsource from lp:openobject-extension/7.0

Revision history for this message
Nicolas Bessi - Camptocamp (nbessi-c2c-deactivatedaccount) wrote :

Hello,

Thanks for the review.

Some comments below:

Please use explicit relative import: from . import

In __openerp__.py, please remove the 'init' key; it is unused.

In base_external_dbsource.py:

Just a detail, but the import order is not standard:
- standard library imports at the top,
- external (third-party) modules below,
- locally imported modules last.

Please use the OpenERP/community convention for the model:
from openerp.osv import orm, fields

class base_external_dbsource(orm.Model):

Line 71: the backslash is not needed.
Line 82: the backslash and '+' are not needed.

in conn_open
            if '%s' not in data.conn_string:
                connStr += ';PWD=%s'
I'm not sure this is a judicious choice.
What if the password is set in the FreeTDS config files?
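The password handling being questioned can be isolated as a stand-alone sketch (build_conn_string is a hypothetical helper, not part of the module; it reproduces the logic of conn_open above):

```python
def build_conn_string(conn_string, password):
    """Reproduce conn_open()'s password handling.

    If the connection string has no '%s' placeholder, an ODBC-style
    ';PWD=%s' suffix is appended -- which, as noted, may conflict with
    a password already configured in the FreeTDS/ODBC config files.
    """
    if password:
        if '%s' not in conn_string:
            conn_string += ';PWD=%s'
        conn_string = conn_string % password
    return conn_string

# No placeholder: ';PWD=...' is appended blindly
odbc = build_conn_string(
    "DRIVER={FreeTDS};SERVER=db.local;Database=mydb;UID=sa", "secret")
# With a placeholder, the password is substituted in place instead
oracle = build_conn_string("username/%s@//db.local:1521/orcl", "secret")
```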

In the connection_test function you raise an exception to say that the connection works.
It would be better to return an ir.actions.act_window.
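For illustration, a success path returning an act_window dict could look like this (a hedged sketch only; the wizard model name 'connection.test.result' is hypothetical, not something this module defines):

```python
def connection_test_result(title, message):
    # Return a client action dict instead of raising except_orm on
    # success; 'connection.test.result' is a hypothetical transient
    # wizard model that would display the message in a popup.
    return {
        'type': 'ir.actions.act_window',
        'name': title,
        'res_model': 'connection.test.result',
        'view_type': 'form',
        'view_mode': 'form',
        'target': 'new',
        'context': {'default_message': message},
    }

action = connection_test_result("Connection test succeeded!",
                                "Everything seems properly set up!")
```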

A quick pass with a linter to fix PEP8 issues would be nice.

In the demo data used for the YAML test:

        <record model="base.external.dbsource" id="demo_postgre">
            <field name="name">PostgreSQL local</field>
            <field name="conn_string">dbname='postgres' password=%s</field>
            <field name="password">postgresql</field>
            <field name="connector">postgresql</field>
        </record>

This depends on the PostgreSQL configuration. I have no better idea, but a comment in the test should say that this needs to be validated first.

Thanks for the patch. I will see if I can integrate it into connector_odbc.

Regards

Nicolas

review: Needs Fixing (code review, no tests)
Revision history for this message
Daniel Reis (dreis-pt) wrote :

Maxime: I can work on the cleanup of this code, if you like.

Revision history for this message
Maxime Chambreuil (http://www.savoirfairelinux.com) (max3903) wrote :

Thanks Daniel. Please go ahead.

52. By Maxime Chambreuil (http://www.savoirfairelinux.com)

[FIX] PEP8 compliance and review comments

Revision history for this message
Maxime Chambreuil (http://www.savoirfairelinux.com) (max3903) wrote :

Daniel?

Revision history for this message
Yannick Vaucher @ Camptocamp (yvaucher-c2c) wrote :

Please remove the executable bit from these files (chmod -x):

base_external_dbsource/base_external_dbsource_demo.xml
base_external_dbsource/__init__.py
base_external_dbsource/security/ir.model.access.csv
base_external_dbsource/test/dbsource_connect.yml
import_odbc/import_odbc_demo.xml
import_odbc/__init__.py
import_odbc/security/ir.model.access.csv

base_external_dbsource.py|33 col 1| F401 'pymssql' imported but unused
base_external_dbsource.py|40 col 1| F811 redefinition of unused 'sqlalchemy' from line 32
base_external_dbsource.py|41 col 1| F401 'MySQLdb' imported but unused
base_external_dbsource.py|65 col 1| F811 redefinition of unused 'sqlalchemy' from line 40

base_external_dbsource.py|166 col 1| F821 undefined name 'osv'
base_external_dbsource.py|177 col 1| F821 undefined name 'osv'

Use orm.except_orm instead.
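The substitution being asked for, sketched with a stand-in exception class (openerp is not importable outside a server, so this stub only mirrors except_orm's two-argument signature):

```python
# Stand-in for openerp.osv.orm.except_orm, which takes a title and a message:
class except_orm(Exception):
    def __init__(self, name, value):
        super(except_orm, self).__init__(name, value)
        self.name = name
        self.value = value

# Before (F821 -- `osv` is never imported in base_external_dbsource.py):
#     raise osv.except_osv(_('Error!'), msg)
# After:
#     raise orm.except_orm(_('Error!'), msg)
try:
    raise except_orm('Error!', 'connection failed')
except except_orm as e:
    caught = (e.name, e.value)
```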

And at least split line 43 in import_odbc/import_odbc.py.

Cheers

review: Needs Fixing (code review, no tests)
53. By Maxime Chambreuil (http://www.savoirfairelinux.com)

[FIX] PEP8 and file permission issues

Revision history for this message
Maxime Chambreuil (http://www.savoirfairelinux.com) (max3903) wrote :

Thanks Yannick.

I made the changes according to your review.

54. By Maxime Chambreuil (http://www.savoirfairelinux.com)

[FIX] Menu, views and version number

Unmerged revisions

Preview Diff

1=== added directory 'base_external_dbsource'
2=== added file 'base_external_dbsource/__init__.py'
3--- base_external_dbsource/__init__.py 1970-01-01 00:00:00 +0000
4+++ base_external_dbsource/__init__.py 2013-12-20 18:51:45 +0000
5@@ -0,0 +1,24 @@
6+# -*- coding: utf-8 -*-
7+##############################################################################
8+#
9+# Daniel Reis
10+# 2011
11+#
12+# This program is free software: you can redistribute it and/or modify
13+# it under the terms of the GNU Affero General Public License as
14+# published by the Free Software Foundation, either version 3 of the
15+# License, or (at your option) any later version.
16+#
17+# This program is distributed in the hope that it will be useful,
18+# but WITHOUT ANY WARRANTY; without even the implied warranty of
19+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
20+# GNU Affero General Public License for more details.
21+#
22+# You should have received a copy of the GNU Affero General Public License
23+# along with this program. If not, see <http://www.gnu.org/licenses/>.
24+#
25+##############################################################################
26+
27+from . import base_external_dbsource
28+
29+# vim:expandtab:smartindent:tabstop=4:softtabstop=4:shiftwidth=4:
30
31=== added file 'base_external_dbsource/__openerp__.py'
32--- base_external_dbsource/__openerp__.py 1970-01-01 00:00:00 +0000
33+++ base_external_dbsource/__openerp__.py 2013-12-20 18:51:45 +0000
34@@ -0,0 +1,61 @@
35+# -*- coding: utf-8 -*-
36+##############################################################################
37+#
38+# Daniel Reis, 2011
39+# Additional contributions by Maxime Chambreuil, Savoir-faire Linux
40+#
41+# This program is free software: you can redistribute it and/or modify
42+# it under the terms of the GNU Affero General Public License as
43+# published by the Free Software Foundation, either version 3 of the
44+# License, or (at your option) any later version.
45+#
46+# This program is distributed in the hope that it will be useful,
47+# but WITHOUT ANY WARRANTY; without even the implied warranty of
48+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
49+# GNU Affero General Public License for more details.
50+#
51+# You should have received a copy of the GNU Affero General Public License
52+# along with this program. If not, see <http://www.gnu.org/licenses/>.
53+#
54+##############################################################################
55+
56+{
57+ 'name': 'External Database Sources',
58+ 'version': '61.3',
59+ 'category': 'Tools',
60+ 'description': """
61+This module allows you to define connections to foreign databases using ODBC,
62+Oracle Client or SQLAlchemy.
63+
64+Database sources can be configured in Settings > Configuration -> Data sources.
65+
66+Depending on the database, you need:
67+ * to install unixodbc and python-pyodbc packages to use ODBC connections.
68+ * to install FreeTDS driver (tdsodbc package) and configure it through ODBC to
69+ connect to Microsoft SQL Server.
70+ * to install and configure Oracle Instant Client and cx_Oracle python library
71+ to connect to Oracle.
72+ """,
73+ 'author': 'Daniel Reis',
74+ 'website': 'http://launchpad.net/addons-tko',
75+ 'images': [
76+ 'images/screenshot01.png',
77+ ],
78+ 'depends': [
79+ 'base',
80+ ],
81+ 'data': [
82+ 'base_external_dbsource_view.xml',
83+ 'security/ir.model.access.csv',
84+ ],
85+ 'demo': [
86+ 'base_external_dbsource_demo.xml',
87+ ],
88+ 'test': [
89+ 'dbsource_connect.yml',
90+ ],
91+ 'installable': True,
92+ 'active': False,
93+}
94+
95+# vim:expandtab:smartindent:tabstop=4:softtabstop=4:shiftwidth=4:
96
97=== added file 'base_external_dbsource/base_external_dbsource.py'
98--- base_external_dbsource/base_external_dbsource.py 1970-01-01 00:00:00 +0000
99+++ base_external_dbsource/base_external_dbsource.py 2013-12-20 18:51:45 +0000
100@@ -0,0 +1,175 @@
101+# -*- coding: utf-8 -*-
102+##############################################################################
103+#
104+# Daniel Reis
105+# 2011
106+#
107+# This program is free software: you can redistribute it and/or modify
108+# it under the terms of the GNU Affero General Public License as
109+# published by the Free Software Foundation, either version 3 of the
110+# License, or (at your option) any later version.
111+#
112+# This program is distributed in the hope that it will be useful,
113+# but WITHOUT ANY WARRANTY; without even the implied warranty of
114+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
115+# GNU Affero General Public License for more details.
116+#
117+# You should have received a copy of the GNU Affero General Public License
118+# along with this program. If not, see <http://www.gnu.org/licenses/>.
119+#
120+##############################################################################
121+
122+import os
123+import logging
124+from openerp.osv import orm, fields
125+from openerp.tools.translate import _
126+import openerp.tools as tools
127+_logger = logging.getLogger(__name__)
128+
129+CONNECTORS = []
130+
131+try:
132+ import sqlalchemy
133+ CONNECTORS.append(('sqlite', 'SQLite'))
134+ try:
135+ import pymssql
136+ CONNECTORS.append(('mssql', 'Microsoft SQL Server'))
137+ except:
138+ _logger.info('MS SQL Server not available. Please install "pymssql"\
139+ python package.')
140+ try:
141+ import MySQLdb
142+ CONNECTORS.append(('mysql', 'MySQL'))
143+ except:
144+ _logger.info('MySQL not available. Please install "mysqldb"\
145+ python package.')
146+except:
147+ _logger.info('SQL Alchemy not available. Please install "slqalchemy"\
148+ python package.')
149+try:
150+ import pyodbc
151+ CONNECTORS.append(('pyodbc', 'ODBC'))
152+except:
153+ _logger.info('ODBC libraries not available. Please install "unixodbc"\
154+ and "python-pyodbc" packages.')
155+
156+try:
157+ import cx_Oracle
158+ CONNECTORS.append(('cx_Oracle', 'Oracle'))
159+except:
160+ _logger.info('Oracle libraries not available. Please install "cx_Oracle"\
161+ python package.')
162+
163+import psycopg2
164+CONNECTORS.append(('postgresql', 'PostgreSQL'))
165+
166+
167+class base_external_dbsource(orm.Model):
168+ _name = "base.external.dbsource"
169+ _description = 'External Database Sources'
170+ _columns = {
171+ 'name': fields.char('Datasource name', required=True, size=64),
172+ 'conn_string': fields.text('Connection string', help="""
173+Sample connection strings:
174+- Microsoft SQL Server:
175+ mssql+pymssql://username:%s@server:port/dbname?charset=utf8
176+- MySQL: mysql://user:%s@server:port/dbname
177+- ODBC: DRIVER={FreeTDS};SERVER=server.address;Database=mydb;UID=sa
178+- ORACLE: username/%s@//server.address:port/instance
179+- PostgreSQL:
180+ dbname='template1' user='dbuser' host='localhost' port='5432' password=%s
181+- SQLite: sqlite:///test.db
182+"""),
183+ 'password': fields.char('Password', size=40),
184+ 'connector': fields.selection(CONNECTORS, 'Connector',
185+ required=True,
186+ help="If a connector is missing from the\
187+ list, check the server log to confirm\
188+ that the required components were\
189+ detected."),
190+ }
191+
192+ def conn_open(self, cr, uid, id1):
193+ #Get dbsource record
194+ data = self.browse(cr, uid, id1)
195+ #Build the full connection string
196+ connStr = data.conn_string
197+ if data.password:
198+ if '%s' not in data.conn_string:
199+ connStr += ';PWD=%s'
200+ connStr = connStr % data.password
201+ #Try to connect
202+ if data.connector == 'cx_Oracle':
203+ os.environ['NLS_LANG'] = 'AMERICAN_AMERICA.UTF8'
204+ conn = cx_Oracle.connect(connStr)
205+ elif data.connector == 'pyodbc':
206+ conn = pyodbc.connect(connStr)
207+ elif data.connector in ('sqlite', 'mysql', 'mssql'):
208+ conn = sqlalchemy.create_engine(connStr).connect()
209+ elif data.connector == 'postgresql':
210+ conn = psycopg2.connect(connStr)
211+
212+ return conn
213+
214+ def execute(self, cr, uid, ids, sqlquery, sqlparams=None, metadata=False,
215+ context=None):
216+ """Executes SQL and returns a list of rows.
217+
218+ "sqlparams" can be a dict of values, that can be referenced in
219+ the SQL statement using "%(key)s" or, in the case of Oracle,
220+ ":key".
221+ Example:
222+ sqlquery = "select * from mytable where city = %(city)s and
223+ date > %(dt)s"
224+ params = {'city': 'Lisbon',
225+ 'dt': datetime.datetime(2000, 12, 31)}
226+
227+ If metadata=True, it will instead return a dict containing the
228+ rows list and the columns list, in the format:
229+ { 'cols': [ 'col_a', 'col_b', ...]
230+ , 'rows': [ (a0, b0, ...), (a1, b1, ...), ...] }
231+ """
232+ data = self.browse(cr, uid, ids)
233+ rows, cols = list(), list()
234+ for obj in data:
235+ conn = self.conn_open(cr, uid, obj.id)
236+ if obj.connector in ["sqlite", "mysql", "mssql"]:
237+ #using sqlalchemy
238+ cur = conn.execute(sqlquery, sqlparams)
239+ if metadata:
240+ cols = cur.keys()
241+ rows = [r for r in cur]
242+ else:
243+ #using other db connectors
244+ cur = conn.cursor()
245+ cur.execute(sqlquery, sqlparams)
246+ if metadata:
247+ cols = [d[0] for d in cur.description]
248+ rows = cur.fetchall()
249+ conn.close()
250+ if metadata:
251+ return{'cols': cols, 'rows': rows}
252+ else:
253+ return rows
254+
255+ def connection_test(self, cr, uid, ids, context=None):
256+ for obj in self.browse(cr, uid, ids, context):
257+ conn = False
258+ try:
259+ conn = self.conn_open(cr, uid, obj.id)
260+ except Exception, e:
261+ raise orm.except_orm(_("Connection test failed!"),
262+ _("Here is what we got instead:\n %s")
263+ % tools.ustr(e))
264+ finally:
265+ try:
266+ if conn:
267+ conn.close()
268+ except Exception:
269+ # ignored, just a consequence of the previous exception
270+ pass
271+ #TODO: if OK a (wizard) message box should be displayed
272+ raise orm.except_orm(_("Connection test succeeded!"),
273+ _("Everything seems properly set up!"))
274+
275+#EOF
276
277=== added file 'base_external_dbsource/base_external_dbsource_demo.xml'
278--- base_external_dbsource/base_external_dbsource_demo.xml 1970-01-01 00:00:00 +0000
279+++ base_external_dbsource/base_external_dbsource_demo.xml 2013-12-20 18:51:45 +0000
280@@ -0,0 +1,15 @@
281+<?xml version="1.0"?>
282+<openerp>
283+ <data>
284+
285+ <record model="base.external.dbsource" id="demo_postgre">
286+ <field name="name">PostgreSQL local</field>
287+ <field name="conn_string">dbname='postgres' password=%s</field>
288+ <field name="password">postgresql</field>
289+ <field name="connector">postgresql</field>
290+ </record>
291+
292+ </data>
293+</openerp>
294+
295+
296
297=== added file 'base_external_dbsource/base_external_dbsource_view.xml'
298--- base_external_dbsource/base_external_dbsource_view.xml 1970-01-01 00:00:00 +0000
299+++ base_external_dbsource/base_external_dbsource_view.xml 2013-12-20 18:51:45 +0000
300@@ -0,0 +1,54 @@
301+<?xml version="1.0"?>
302+<openerp>
303+ <data>
304+
305+ <!-- DBSource -->
306+
307+ <record model="ir.ui.view" id="view_dbsource_tree">
308+ <field name="name">base.external.dbsource.tree</field>
309+ <field name="model">base.external.dbsource</field>
310+ <field name="type">tree</field>
311+ <field name="arch" type="xml">
312+ <tree string="External DB Sources">
313+ <field name="name"/>
314+ <field name="connector"/>
315+ <field name="conn_string"/>
316+ </tree>
317+ </field>
318+ </record>
319+
320+ <record model="ir.ui.view" id="view_dbsource_form">
321+ <field name="name">base.external.dbsource.form</field>
322+ <field name="model">base.external.dbsource</field>
323+ <field name="type">form</field>
324+ <field name="arch" type="xml">
325+ <form string="External DB Source">
326+ <field name="name"/>
327+ <field name="password" password="True"/>
328+ <newline/>
329+ <field name="connector" colspan="2"/>
330+ <newline/>
331+ <field name="conn_string" colspan="4"/>
332+ <newline/>
333+ <button name="connection_test" string="Test Connection" type="object" icon="gtk-network" colspan="4"/>
334+ </form>
335+ </field>
336+ </record>
337+
338+ <record model="ir.actions.act_window" id="action_dbsource">
339+ <field name="name">External Database Sources</field>
340+ <field name="res_model">base.external.dbsource</field>
341+ <field name="view_type">form</field>
342+ <field name="view_mode">tree,form</field>
343+ <field name="view_id" ref="view_dbsource_tree"/>
344+ </record>
345+
346+ <menuitem name="Database Sources"
347+ id="menu_dbsource"
348+ parent="base.next_id_9"
349+ action="action_dbsource"/>
350+
351+ </data>
352+</openerp>
353+
354+
355
356=== added directory 'base_external_dbsource/images'
357=== added file 'base_external_dbsource/images/screenshot01.png'
358Binary files base_external_dbsource/images/screenshot01.png 1970-01-01 00:00:00 +0000 and base_external_dbsource/images/screenshot01.png 2013-12-20 18:51:45 +0000 differ
359=== added directory 'base_external_dbsource/security'
360=== added file 'base_external_dbsource/security/ir.model.access.csv'
361--- base_external_dbsource/security/ir.model.access.csv 1970-01-01 00:00:00 +0000
362+++ base_external_dbsource/security/ir.model.access.csv 2013-12-20 18:51:45 +0000
363@@ -0,0 +1,2 @@
364+id,name,model_id:id,group_id:id,perm_read,perm_write,perm_create,perm_unlink
365+access_base_external_dbsource_group_system,bae_external_dbsource_group_system,model_base_external_dbsource,base.group_system,1,1,1,1
366
367=== added directory 'base_external_dbsource/test'
368=== added file 'base_external_dbsource/test/dbsource_connect.yml'
369--- base_external_dbsource/test/dbsource_connect.yml 1970-01-01 00:00:00 +0000
370+++ base_external_dbsource/test/dbsource_connect.yml 2013-12-20 18:51:45 +0000
371@@ -0,0 +1,5 @@
372+-
373+ Connect to local Postgres.
374+-
375+ !python {model: base.external.dbsource}: |
376+ self.connection_test(cr, uid, [ref("demo_postgresql")]
377
378=== added directory 'import_odbc'
379=== added file 'import_odbc/__init__.py'
380--- import_odbc/__init__.py 1970-01-01 00:00:00 +0000
381+++ import_odbc/__init__.py 2013-12-20 18:51:45 +0000
382@@ -0,0 +1,24 @@
383+# -*- coding: utf-8 -*-
384+##############################################################################
385+#
386+# Daniel Reis
387+# 2011
388+#
389+# This program is free software: you can redistribute it and/or modify
390+# it under the terms of the GNU Affero General Public License as
391+# published by the Free Software Foundation, either version 3 of the
392+# License, or (at your option) any later version.
393+#
394+# This program is distributed in the hope that it will be useful,
395+# but WITHOUT ANY WARRANTY; without even the implied warranty of
396+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
397+# GNU Affero General Public License for more details.
398+#
399+# You should have received a copy of the GNU Affero General Public License
400+# along with this program. If not, see <http://www.gnu.org/licenses/>.
401+#
402+##############################################################################
403+
404+from . import import_odbc
405+
406+# vim:expandtab:smartindent:tabstop=4:softtabstop=4:shiftwidth=4:
407
408=== added file 'import_odbc/__openerp__.py'
409--- import_odbc/__openerp__.py 1970-01-01 00:00:00 +0000
410+++ import_odbc/__openerp__.py 2013-12-20 18:51:45 +0000
411@@ -0,0 +1,83 @@
412+# -*- coding: utf-8 -*-
413+##############################################################################
414+#
415+# Daniel Reis
416+# 2011
417+#
418+# This program is free software: you can redistribute it and/or modify
419+# it under the terms of the GNU Affero General Public License as
420+# published by the Free Software Foundation, either version 3 of the
421+# License, or (at your option) any later version.
422+#
423+# This program is distributed in the hope that it will be useful,
424+# but WITHOUT ANY WARRANTY; without even the implied warranty of
425+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
426+# GNU Affero General Public License for more details.
427+#
428+# You should have received a copy of the GNU Affero General Public License
429+# along with this program. If not, see <http://www.gnu.org/licenses/>.
430+#
431+##############################################################################
432+
433+{
434+ 'name': 'Import data from SQL and ODBC data sources.',
435+ 'version': '61.3',
436+ 'category': 'Tools',
437+ 'description': """
438+Import data directly from other databases.
439+
440+Installed in the Administration module, menu Configuration -> Import from SQL.
441+
442+Features:
443+ * Fetched data from the databases are used to build lines equivalent to regular import files. These are imported using the standard "import_data()" ORM method, benefiting from all its features, including xml_ids.
444+ * Each table import is defined by an SQL statement, used to build the equivalent for an import file. Each column's name should match the column names you would use in an import file. The first column must provide an unique identifier for the record, and will be used to build its xml_id.
445+ * SQL columns named "none" are ignored. This can be used for the first column of the SQL, so that it's used to build the XML Id but it's not imported to any OpenERP field.
446+ * The last sync date is the last successfull execution can be used in the SQL using "%(sync)s", or ":sync" in the case of Oracle.
447+ * When errors are found, only the record with the error fails import. The other correct records are commited. However, the "last sync date" will only be automaticaly updated when no errors are found.
448+ * The import execution can be scheduled to run automatically.
449+
450+Examples:
451+ * Importing suppliers to res.partner:
452+ SELECT distinct
453+ [SUPPLIER_CODE] as "ref"
454+ , [SUPPLIER_NAME] as "name"
455+ , 1 as "is_supplier"
456+ , [INFO] as "comment"
457+ FROM T_SUPPLIERS
458+ WHERE INACTIVE_DATE IS NULL and DATE_CHANGED >= %(sync)s
459+
460+ * Importing products to product.product:
461+ SELECT PRODUCT_CODE as "ref"
462+ , PRODUCT_NAME as "name"
463+ , 'res_partner_id_'+SUPPLIER_ID as "partner_id/id"
464+ FROM T_PRODUCTS
465+ WHERE DATE_CHANGED >= %(sync)s
466+
467+Improvements ideas waiting for a contributor:
468+ * Allow to import many2one fields (currently not supported). Done by adding a second SQL sentence to get child record list?
469+ * Allow "import sets" that can be executed at different time intervals using different scheduler jobs.
470+ * Allow to inactivate/delete OpenERP records when not present in an SQL result set.
471+ """,
472+ 'author': 'Daniel Reis',
473+ 'website': 'http://launchpad.net/addons-tko',
474+ 'images': [
475+ 'images/snapshot1.png',
476+ 'images/snapshot2.png',
477+ ],
478+ 'depends': [
479+ 'base',
480+ 'base_external_dbsource',
481+ ],
482+ 'data': [
483+ 'import_odbc_view.xml',
484+ 'security/ir.model.access.csv',
485+ ],
486+ 'demo': [
487+ 'import_odbc_demo.xml',
488+ ],
489+ 'test': [],
490+ 'installable': True,
491+ 'active': False,
492+}
493+
494+# vim:expandtab:smartindent:tabstop=4:softtabstop=4:shiftwidth=4:
495
496=== added directory 'import_odbc/images'
497=== added file 'import_odbc/images/snapshot1.png'
498Binary files import_odbc/images/snapshot1.png 1970-01-01 00:00:00 +0000 and import_odbc/images/snapshot1.png 2013-12-20 18:51:45 +0000 differ
499=== added file 'import_odbc/images/snapshot2.png'
500Binary files import_odbc/images/snapshot2.png 1970-01-01 00:00:00 +0000 and import_odbc/images/snapshot2.png 2013-12-20 18:51:45 +0000 differ
501=== added file 'import_odbc/import_odbc.py'
502--- import_odbc/import_odbc.py 1970-01-01 00:00:00 +0000
503+++ import_odbc/import_odbc.py 2013-12-20 18:51:45 +0000
504@@ -0,0 +1,218 @@
505+# -*- coding: utf-8 -*-
506+##############################################################################
507+#
508+# Daniel Reis
509+# 2011
510+#
511+# This program is free software: you can redistribute it and/or modify
512+# it under the terms of the GNU Affero General Public License as
513+# published by the Free Software Foundation, either version 3 of the
514+# License, or (at your option) any later version.
515+#
516+# This program is distributed in the hope that it will be useful,
517+# but WITHOUT ANY WARRANTY; without even the implied warranty of
518+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
519+# GNU Affero General Public License for more details.
520+#
521+# You should have received a copy of the GNU Affero General Public License
522+# along with this program. If not, see <http://www.gnu.org/licenses/>.
523+#
524+##############################################################################
525+
526+import sys
527+from datetime import datetime
528+from openerp.osv import orm, fields
529+import logging
530+_logger = logging.getLogger(__name__)
531+_loglvl = _logger.getEffectiveLevel()
532+SEP = '|'
533+
534+
535+class import_odbc_dbtable(orm.Model):
536+ _name = "import.odbc.dbtable"
537+ _description = 'Import Table Data'
538+ _order = 'exec_order'
539+ _columns = {
540+ 'name': fields.char('Datasource name', required=True, size=64),
541+ 'enabled': fields.boolean('Execution enabled'),
542+ 'dbsource_id': fields.many2one('base.external.dbsource', 'Database source', required=True),
543+ 'sql_source': fields.text('SQL', required=True, help='Column names must be valid "import_data" columns.'),
544+ 'model_target': fields.many2one('ir.model', 'Target object'),
545+ 'noupdate': fields.boolean('No updates', help="Only create new records; disable updates to existing records."),
546+ 'exec_order': fields.integer('Execution order', help="Defines the order to perform the import"),
547+ 'last_sync': fields.datetime('Last sync date',
548+ help="Datetime for the last succesfull sync."
549+ "\nLater changes on the source may not be replicated on the destination"),
550+ 'start_run': fields.datetime('Time started', readonly=True),
551+ 'last_run': fields.datetime('Time ended', readonly=True),
552+ 'last_record_count': fields.integer('Last record count', readonly=True),
553+ 'last_error_count': fields.integer('Last error count', readonly=True),
554+ 'last_warn_count': fields.integer('Last warning count', readonly=True),
555+ 'last_log': fields.text('Last run log', readonly=True),
556+ 'ignore_rel_errors': fields.boolean('Ignore relationship errors',
557+ help="On error try to reimport rows ignoring relationships."),
558+ 'raise_import_errors': fields.boolean('Raise import errors',
559+ help="Import errors not handled, intended for debugging purposes."
560+ "\nAlso forces debug messages to be written to the server log."),
561+ }
562+ _defaults = {
563+ 'enabled': True,
564+ 'exec_order': 10,
565+ }
566+
567+ def _import_data(self, cr, uid, flds, data, model_obj, table_obj, log):
568+ """Import data and returns error msg or empty string"""
569+
570+ def find_m2o(field_list):
571+ """"Find index of first column with a one2many field"""
572+ for i, x in enumerate(field_list):
573+ if len(x) > 3 and x[-3:] == ':id' or x[-3:] == '/id':
574+ return i
575+ return -1
576+
577+ def append_to_log(log, level, obj_id='', msg='', rel_id=''):
578+ if '_id_' in obj_id:
579+ obj_id = '.'.join(obj_id.split('_')[:-2]) + ': ' + obj_id.split('_')[-1]
580+ if ': .' in msg and not rel_id:
581+ rel_id = msg[msg.find(': .')+3:]
582+ if '_id_' in rel_id:
583+ rel_id = '.'.join(rel_id.split('_')[:-2]) + ': ' + rel_id.split('_')[-1]
584+ msg = msg[:msg.find(': .')]
585+ log['last_log'].append('%s|%s\t|%s\t|%s' % (level.ljust(5), obj_id, rel_id, msg))
586+ _logger.debug(data)
587+ cols = list(flds) # copy to avoid side effects
588+ errmsg = str()
589+ if table_obj.raise_import_errors:
590+ model_obj.import_data(cr, uid, cols, [data], noupdate=table_obj.noupdate)
591+ else:
592+ try:
593+ model_obj.import_data(cr, uid, cols, [data], noupdate=table_obj.noupdate)
594+ except:
595+ errmsg = str(sys.exc_info()[1])
596+ if errmsg and not table_obj.ignore_rel_errors:
597+ #Fail
598+ append_to_log(log, 'ERROR', data, errmsg)
599+ log['last_error_count'] += 1
600+ return False
601+ if errmsg and table_obj.ignore_rel_errors:
602+ #Warn and retry ignoring many2one fields...
603+ append_to_log(log, 'WARN', data, errmsg)
604+ log['last_warn_count'] += 1
605+ #Try ignoring each many2one (tip: in the SQL sentence select more problematic FKs first)
606+ i = find_m2o(cols)
607+ if i >= 0:
608+ #Try again without the [i] column
609+ del cols[i]
610+ del data[i]
611+ self._import_data(cr, uid, cols, data, model_obj, table_obj, log)
612+ else:
613+ #Fail
614+ append_to_log(log, 'ERROR', data, 'Removed all m2o keys and still fails.')
615+ log['last_error_count'] += 1
616+ return False
617+ return True
618+
619+ def import_run(self, cr, uid, ids=None, context=None):
620+ db_model = self.pool.get('base.external.dbsource')
621+ actions = self.read(cr, uid, ids, ['id', 'exec_order'])
622+ actions.sort(key=lambda x: (x['exec_order'], x['id']))
623+
624+ #Consider each dbtable:
625+ for action_ref in actions:
626+ obj = self.browse(cr, uid, action_ref['id'])
627+ if not obj.enabled:
628+ continue # skip
629+
630+ _logger.setLevel(obj.raise_import_errors and logging.DEBUG or _loglvl)
631+ _logger.debug('Importing %s...' % obj.name)
632+
633+ #now() microseconds are stripped to avoid problem with SQL smalldate
634+ #TODO: convert UTC Now to local timezone
635+ #http://stackoverflow.com/questions/4770297/python-convert-utc-datetime-string-to-local-datetime
636+ model_name = obj.model_target.model
637+ model_obj = self.pool.get(model_name)
638+ xml_prefix = model_name.replace('.', '_') + "_id_"
639+ log = {'start_run': datetime.now().replace(microsecond=0),
640+ 'last_run': None,
641+ 'last_record_count': 0,
642+ 'last_error_count': 0,
643+ 'last_warn_count': 0,
644+ 'last_log': list()}
645+ self.write(cr, uid, [obj.id], log)
646+
647+ #Prepare SQL sentence; replace "%s" with the last_sync date
648+ if obj.last_sync:
649+ sync = datetime.strptime(obj.last_sync, "%Y-%m-%d %H:%M:%S")
650+ else:
651+ sync = datetime.datetime(1900, 1, 1, 0, 0, 0)
652+ params = {'sync': sync}
653+ res = db_model.execute(cr, uid, [obj.dbsource_id.id],
654+ obj.sql_source, params, metadata=True)
655+
656+ #Exclude columns titled "None"; add (xml_)"id" column
657+ cidx = [i for i, x in enumerate(res['cols']) if x.upper() != 'NONE']
658+ cols = [x for i, x in enumerate(res['cols']) if x.upper() != 'NONE'] + ['id']
659+
660+ #Import each row:
661+ for row in res['rows']:
662+ #Build data row; import only columns present in the "cols" list
663+ data = list()
664+ for i in cidx:
665+ #TODO: Handle imported datetimes properly - convert from localtime to UTC!
666+ v = row[i]
667+ if isinstance(v, str):
668+ v = v.strip()
669+ data.append(v)
670+ data.append(xml_prefix + str(row[0]).strip())
671+
672+ #Import the row; on error, write line to the log
673+ log['last_record_count'] += 1
674+ self._import_data(cr, uid, cols, data, model_obj, obj, log)
675+ if log['last_record_count'] % 500 == 0:
676+ _logger.info('...%s rows processed...' % (log['last_record_count']))
677+
678+ #Finished importing all rows
679+ #If there were no errors or warnings, write the new sync date
680+ if not (log['last_error_count'] or log['last_warn_count']):
681+ log['last_sync'] = log['start_run']
682+ level = logging.DEBUG
683+ if log['last_warn_count']:
684+ level = logging.WARN
685+ if log['last_error_count']:
686+ level = logging.ERROR
687+ _logger.log(level, 'Imported %s, %d rows, %d errors, %d warnings.' % (
688+ model_name, log['last_record_count'], log['last_error_count'],
689+ log['last_warn_count']))
690+ #Write the run log, whether the table import is enabled or not
691+ if log['last_log']:
692+ log['last_log'].insert(0, 'LEVEL|== Line ==|== Relationship ==|== Message ==')
693+ log.update({'last_log': '\n'.join(log['last_log'])})
694+ log.update({'last_run': datetime.now().replace(microsecond=0)})
695+ self.write(cr, uid, [obj.id], log)
696+
697+ #Finished
698+ _logger.debug('Import job FINISHED.')
699+ return True
700+
701+ def import_schedule(self, cr, uid, ids, context=None):
702+ cron_obj = self.pool.get('ir.cron')
703+ new_create_id = cron_obj.create(cr, uid, {
704+ 'name': 'Import ODBC tables',
705+ 'interval_type': 'hours',
706+ 'interval_number': 1,
707+ 'numbercall': -1,
708+ 'model': 'import.odbc.dbtable',
709+ 'function': 'import_run',
710+ 'doall': False,
711+ 'active': True
712+ })
713+ return {
714+ 'name': 'Import ODBC tables',
715+ 'view_type': 'form',
716+ 'view_mode': 'form,tree',
717+ 'res_model': 'ir.cron',
718+ 'res_id': new_create_id,
719+ 'type': 'ir.actions.act_window',
720+ }
721+
722+#EOF
723
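Two TODOs in the hunk above flag the missing timezone handling around `datetime.now()` and the imported datetimes. A minimal standard-library sketch of a UTC-to-local conversion (an illustration only, not part of the proposed module; `utc_to_local` is a hypothetical helper name):

```python
import calendar
from datetime import datetime

def utc_to_local(utc_dt):
    """Convert a naive UTC datetime to local time, preserving microseconds."""
    # timegm() interprets the tuple as UTC; fromtimestamp() renders
    # the resulting POSIX timestamp in the local timezone.
    timestamp = calendar.timegm(utc_dt.timetuple())
    return datetime.fromtimestamp(timestamp).replace(microsecond=utc_dt.microsecond)
```

The reverse direction (localtime to UTC, needed when storing imported values) can use `time.mktime()` with `datetime.utcfromtimestamp()` in the same fashion.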
724=== added file 'import_odbc/import_odbc_demo.xml'
725--- import_odbc/import_odbc_demo.xml 1970-01-01 00:00:00 +0000
726+++ import_odbc/import_odbc_demo.xml 2013-12-20 18:51:45 +0000
727@@ -0,0 +1,15 @@
728+<?xml version="1.0"?>
729+<openerp>
730+ <data>
731+
732+ <record model="import.odbc.dbtable" id="demo_postgresql_users">
733+ <field name="name">Users from PostgreSQL</field>
734+ <field name="dbsource_id" ref="base_external_dbsource.demo_postgre"/>
735+ <field name="sql_source">select usename as "login", usename as "name" from pg_catalog.pg_user</field>
736+ <field name="model_target" ref="base.model_res_users"/>
737+ </record>
738+
739+ </data>
740+</openerp>
741+
742+
743
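The demo query above re-imports every row on each run. Since `import_run` passes `params = {'sync': sync}` to `execute()`, a `sql_source` can opt into incremental loads through a pyformat placeholder. A sketch of the mechanism, assuming a DB-API driver with `pyformat` paramstyle; the `valuntil` filter is purely illustrative, and the plain `%` formatting below only stands in for the driver's own safe binding:

```python
from datetime import datetime

# Hypothetical incremental variant of the demo query; %(sync)s is
# filled in from the params dict that import_run builds.
sql_source = ('select usename as "login", usename as "name" '
              'from pg_catalog.pg_user where valuntil > %(sync)s')

# import_run falls back to this epoch when no previous sync exists.
params = {'sync': datetime(1900, 1, 1, 0, 0, 0)}

# A real driver quotes and escapes the value itself; this formatting
# only shows where the parameter lands in the final statement.
bound = sql_source % {'sync': "'%s'" % params['sync']}
```

With this pattern, a successful run writes `last_sync`, and the next run only picks up rows newer than that date.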
744=== added file 'import_odbc/import_odbc_view.xml'
745--- import_odbc/import_odbc_view.xml 1970-01-01 00:00:00 +0000
746+++ import_odbc/import_odbc_view.xml 2013-12-20 18:51:45 +0000
747@@ -0,0 +1,86 @@
748+<?xml version="1.0"?>
749+<openerp>
750+ <data>
751+
752+ <!-- Table form -->
753+
754+ <record model="ir.ui.view" id="view_import_dbtable_form">
755+ <field name="name">import.odbc.dbtable.form</field>
756+ <field name="model">import.odbc.dbtable</field>
757+ <field name="type">form</field>
758+ <field name="arch" type="xml">
759+ <form>
760+ <field name="name" search="1"/>
761+ <field name="exec_order"/>
762+ <field name="model_target"/>
763+ <field name="dbsource_id" search="1"/>
764+ <field name="noupdate"/>
765+ <field name="enabled"/>
766+ <field name="ignore_rel_errors"/>
767+ <field name="raise_import_errors"/>
768+ <field name="last_sync"/>
769+ <group colspan="2">
770+ <button name="import_run" string="Run Import" type="object" icon="gtk-execute"/>
771+ <button name="import_schedule" string="Schedule Import" type="object" icon="gtk-paste"/>
772+ </group>
773+ <field name="sql_source" colspan="4"/>
774+ <separator string="Last execution" colspan="4"/>
775+ <field name="last_record_count"/>
776+ <field name="start_run"/>
777+ <field name="last_warn_count"/>
778+ <field name="last_run"/>
779+ <field name="last_error_count"/>
780+ <field name="last_log" colspan="4"/>
781+ </form>
782+ </field>
783+ </record>
784+
785+ <!-- Table Tree -->
786+
787+ <record id="view_import_dbtable_tree" model="ir.ui.view">
788+ <field name="name">import.odbc.dbtable.tree</field>
789+ <field name="model">import.odbc.dbtable</field>
790+ <field name="type">tree</field>
791+ <field name="arch" type="xml">
792+ <tree colors="grey: enabled==False; red:last_error_count&gt;0; blue:last_warn_count&gt;0">
793+ <field name="exec_order"/>
794+ <field name="name"/>
795+ <field name="model_target"/>
796+ <field name="dbsource_id"/>
797+ <field name="enabled"/>
798+ <field name="last_run"/>
799+ <field name="last_sync"/>
800+ <field name="last_record_count"/>
801+ <field name="last_error_count"/>
802+ <field name="last_warn_count"/>
803+ </tree>
804+ </field>
805+ </record>
806+
807+
808+ <!-- Tree Search -->
809+ <record id="view_import_dbtable_filter" model="ir.ui.view">
810+ <field name="name">import.odbc.dbtable.filter</field>
811+ <field name="model">import.odbc.dbtable</field>
812+ <field name="type">search</field>
813+ <field name="arch" type="xml">
814+ <search string="Search ODBC Imports">
815+ <field name="name"/>
816+ <field name="dbsource_id"/>
817+ <field name="model_target"/>
818+ </search>
819+ </field>
820+ </record>
821+
822+ <!-- Menu -->
823+ <record model="ir.actions.act_window" id="action_import_dbtable">
824+ <field name="name">Import from SQL</field>
825+ <field name="res_model">import.odbc.dbtable</field>
826+ <field name="view_type">form</field>
827+ </record>
828+ <menuitem name="Import from SQL" id="menu_import_dbtable" parent="base.next_id_15" action="action_import_dbtable"/>
829+ </data>
830+</openerp>
831+
832+
833+
834
835=== added directory 'import_odbc/security'
836=== added file 'import_odbc/security/ir.model.access.csv'
837--- import_odbc/security/ir.model.access.csv 1970-01-01 00:00:00 +0000
838+++ import_odbc/security/ir.model.access.csv 2013-12-20 18:51:45 +0000
839@@ -0,0 +1,2 @@
840+id,name,model_id:id,group_id:id,perm_read,perm_write,perm_create,perm_unlink
841+access_import_odbc_dbsource_group_system,import_odbc_dbtable_group_system,model_import_odbc_dbtable,base.group_system,1,1,1,1