Merge lp:~openerp-community/openobject-extension/import_odbc_base_external_dbsource into lp:openobject-extension/oerp6.1-stable

Status: Needs review
Proposed branch: lp:~openerp-community/openobject-extension/import_odbc_base_external_dbsource
Merge into: lp:openobject-extension/oerp6.1-stable
Diff against target: 823 lines (+746/-0)
13 files modified
base_external_dbsource/__init__.py (+24/-0)
base_external_dbsource/__openerp__.py (+62/-0)
base_external_dbsource/base_external_dbsource.py (+159/-0)
base_external_dbsource/base_external_dbsource_demo.xml (+15/-0)
base_external_dbsource/base_external_dbsource_view.xml (+54/-0)
base_external_dbsource/security/ir.model.access.csv (+2/-0)
base_external_dbsource/test/dbsource_connect.yml (+5/-0)
import_odbc/__init__.py (+24/-0)
import_odbc/__openerp__.py (+84/-0)
import_odbc/import_odbc.py (+214/-0)
import_odbc/import_odbc_demo.xml (+15/-0)
import_odbc/import_odbc_view.xml (+86/-0)
import_odbc/security/ir.model.access.csv (+2/-0)
To merge this branch: bzr merge lp:~openerp-community/openobject-extension/import_odbc_base_external_dbsource
Reviewer Review Type Date Requested Status
extra-addons-commiter Pending
Review via email: mp+118290@code.launchpad.net

Description of the change

[ADD] import_odbc and base_external_dbsource from lp:~dreis-pt/addons-tko/reis
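
For reviewers' reference, a minimal usage sketch of the simplified dbsource.execute() API introduced by this branch; "dbsource_id" is a placeholder for the id of a configured base.external.dbsource record:

    dbsource_obj = self.pool.get('base.external.dbsource')
    # Default behaviour: returns a plain list of row tuples
    rows = dbsource_obj.execute(cr, uid, [dbsource_id],
        "select * from mytable where city = %(city)s", {'city': 'Lisbon'})
    # With metadata=True: returns a dict with the column names and the rows
    res = dbsource_obj.execute(cr, uid, [dbsource_id],
        "select * from mytable", metadata=True)
    # res['cols'] -> ['col_a', 'col_b', ...]  res['rows'] -> [(a0, b0, ...), ...]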

355. By webtec <webtec@webtec-dev>

ADD: new execute method that also returns the column list; execute can accept a parameter list; added demo connection to local PostgreSQL

356. By webtec <webtec@webtec-dev>

CHG: adapted to base_external_dbsource; code refactored

357. By Daniel Reis

[CHG] simplified dbsource.execute api to a single method with two possible behaviors

Unmerged revisions

357. By Daniel Reis

[CHG] simplified dbsource.execute api to a single method with two possible behaviors

356. By webtec <webtec@webtec-dev>

CHG: adapted to base_external_dbsource; code refactored

355. By webtec <webtec@webtec-dev>

ADD: new execute method that also returns the column list; execute can accept a parameter list; added demo connection to local PostgreSQL

354. By Maxime Chambreuil (http://www.savoirfairelinux.com)

[ADD] import_odbc and base_external_dbsource from addons-tko

Preview Diff

1=== added directory 'base_external_dbsource'
2=== added file 'base_external_dbsource/__init__.py'
3--- base_external_dbsource/__init__.py 1970-01-01 00:00:00 +0000
4+++ base_external_dbsource/__init__.py 2012-09-13 10:12:28 +0000
5@@ -0,0 +1,24 @@
6+# -*- coding: utf-8 -*-
7+##############################################################################
8+#
9+# Daniel Reis
10+# 2011
11+#
12+# This program is free software: you can redistribute it and/or modify
13+# it under the terms of the GNU Affero General Public License as
14+# published by the Free Software Foundation, either version 3 of the
15+# License, or (at your option) any later version.
16+#
17+# This program is distributed in the hope that it will be useful,
18+# but WITHOUT ANY WARRANTY; without even the implied warranty of
19+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
20+# GNU Affero General Public License for more details.
21+#
22+# You should have received a copy of the GNU Affero General Public License
23+# along with this program. If not, see <http://www.gnu.org/licenses/>.
24+#
25+##############################################################################
26+
27+import base_external_dbsource
28+
29+# vim:expandtab:smartindent:tabstop=4:softtabstop=4:shiftwidth=4:
30
31=== added file 'base_external_dbsource/__openerp__.py'
32--- base_external_dbsource/__openerp__.py 1970-01-01 00:00:00 +0000
33+++ base_external_dbsource/__openerp__.py 2012-09-13 10:12:28 +0000
34@@ -0,0 +1,62 @@
35+# -*- coding: utf-8 -*-
36+##############################################################################
37+#
38+# Daniel Reis, 2011
39+# Additional contributions by Maxime Chambreuil, Savoir-faire Linux
40+#
41+# This program is free software: you can redistribute it and/or modify
42+# it under the terms of the GNU Affero General Public License as
43+# published by the Free Software Foundation, either version 3 of the
44+# License, or (at your option) any later version.
45+#
46+# This program is distributed in the hope that it will be useful,
47+# but WITHOUT ANY WARRANTY; without even the implied warranty of
48+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
49+# GNU Affero General Public License for more details.
50+#
51+# You should have received a copy of the GNU Affero General Public License
52+# along with this program. If not, see <http://www.gnu.org/licenses/>.
53+#
54+##############################################################################
55+
56+{
57+ 'name': 'External Database Sources',
58+ 'version': '61.3',
59+ 'category': 'Tools',
60+ 'description': """
61+This module allows you to define connections to foreign databases using ODBC,
62+Oracle Client or SQLAlchemy.
63+
64+Database sources can be configured in Settings > Configuration > Data sources.
65+
66+Depending on the database, you need:
67+ * to install the unixodbc and python-pyodbc packages to use ODBC connections.
68+ * to install the FreeTDS driver (tdsodbc package) and configure it through ODBC to
69+ connect to Microsoft SQL Server.
70+ * to install and configure the Oracle Instant Client and the cx_Oracle python library
71+ to connect to Oracle.
72+ """,
73+ 'author': 'Daniel Reis',
74+ 'website': 'http://launchpad.net/addons-tko',
75+ 'images': [
76+ 'images/screenshot01.png',
77+ ],
78+ 'depends': [
79+ 'base',
80+ ],
81+ 'init': [],
82+ 'data': [
83+ 'base_external_dbsource_view.xml',
84+ 'security/ir.model.access.csv',
85+ ],
86+ 'demo': [
87+ 'base_external_dbsource_demo.xml',
88+ ],
89+ 'test': [
90+ 'dbsource_connect.yml',
91+ ],
92+ 'installable': True,
93+ 'active': False,
94+}
95+
96+# vim:expandtab:smartindent:tabstop=4:softtabstop=4:shiftwidth=4:
97
98=== added file 'base_external_dbsource/base_external_dbsource.py'
99--- base_external_dbsource/base_external_dbsource.py 1970-01-01 00:00:00 +0000
100+++ base_external_dbsource/base_external_dbsource.py 2012-09-13 10:12:28 +0000
101@@ -0,0 +1,159 @@
102+# -*- coding: utf-8 -*-
103+##############################################################################
104+#
105+# Daniel Reis
106+# 2011
107+#
108+# This program is free software: you can redistribute it and/or modify
109+# it under the terms of the GNU Affero General Public License as
110+# published by the Free Software Foundation, either version 3 of the
111+# License, or (at your option) any later version.
112+#
113+# This program is distributed in the hope that it will be useful,
114+# but WITHOUT ANY WARRANTY; without even the implied warranty of
115+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
116+# GNU Affero General Public License for more details.
117+#
118+# You should have received a copy of the GNU Affero General Public License
119+# along with this program. If not, see <http://www.gnu.org/licenses/>.
120+#
121+##############################################################################
122+
123+import os
124+from osv import fields, osv
125+from openerp.tools.translate import _
126+import openerp.tools as tools
127+import logging
128+_logger = logging.getLogger(__name__)
129+
130+CONNECTORS = []
131+
132+try:
133+ import sqlalchemy
134+ import pymssql
135+ CONNECTORS.append( ('mssql', 'Microsoft SQL Server') )
136+except:
137+ _logger.info('MS SQL Server not available. Please install the "sqlalchemy" and "pymssql" python packages.')
138+
139+try:
140+ import sqlalchemy
141+ import MySQLdb
142+ CONNECTORS.append( ('mysql', 'MySQL') )
143+except:
144+ _logger.info('MySQL not available. Please install the "sqlalchemy" and "MySQLdb" python packages.')
145+
146+try:
147+ import pyodbc
148+ CONNECTORS.append( ('pyodbc', 'ODBC') )
149+except:
150+ _logger.info('ODBC libraries not available. Please install "unixodbc" and "python-pyodbc" packages.')
151+
152+try:
153+ import cx_Oracle
154+ CONNECTORS.append( ('cx_Oracle', 'Oracle') )
155+except:
156+ _logger.info('Oracle libraries not available. Please install "cx_Oracle" python package.')
157+
158+import psycopg2
159+CONNECTORS.append( ('postgresql', 'PostgreSQL') )
160+
161+try:
162+ import sqlalchemy
163+ CONNECTORS.append( ('sqlite', 'SQLite') )
164+except:
165+ _logger.info('SQLAlchemy not available. Please install the "sqlalchemy" python package.')
166+
167+class base_external_dbsource(osv.osv):
168+ _name = "base.external.dbsource"
169+ _description = 'External Database Sources'
170+ _columns = {
171+ 'name': fields.char('Datasource name', required=True, size=64),
172+ 'conn_string': fields.text('Connection string', help="""\
173+Sample connection strings:
174+- Microsoft SQL Server: mssql+pymssql://username:%s@server:port/dbname?charset=utf8
175+- MySQL: mysql://user:%s@server:port/dbname
176+- ODBC: DRIVER={FreeTDS};SERVER=server.address;Database=mydb;UID=sa
177+- ORACLE: username/%s@//server.address:port/instance
178+- PostgreSQL: dbname='template1' user='dbuser' host='localhost' port='5432' password=%s
179+- SQLite: sqlite:///test.db
180+"""),
181+ 'password': fields.char('Password' , size=40),
182+ 'connector': fields.selection(CONNECTORS, 'Connector', required=True,
183+ help = "If a connector is missing from the list, check the " \
184+ + "server log to confirm that the required components were detected."),
185+ }
186+
187+ def conn_open(self, cr, uid, id1):
188+ #Get dbsource record
189+ data = self.browse(cr, uid, id1)
190+ #Build the full connection string
191+ connStr = data.conn_string
192+ if data.password:
193+ if '%s' not in data.conn_string:
194+ connStr += ';PWD=%s'
195+ connStr = connStr % data.password
196+ #Try to connect
197+ if data.connector == 'cx_Oracle':
198+ os.environ['NLS_LANG'] = 'AMERICAN_AMERICA.UTF8'
199+ conn = cx_Oracle.connect(connStr)
200+ elif data.connector == 'pyodbc':
201+ conn = pyodbc.connect(connStr)
202+ elif data.connector in ('sqlite','mysql','mssql'):
203+ conn = sqlalchemy.create_engine(connStr).connect()
204+ elif data.connector == 'postgresql':
205+ conn = psycopg2.connect(connStr)
206+
207+ return conn
208+
209+ def execute(self, cr, uid, ids, sqlquery, sqlparams=None, metadata=False, context=None):
210+ """Executes SQL and returns a list of rows.
211+
212+ "sqlparams" can be a dict of values that can be referenced in the SQL statement
213+ using "%(key)s" or, in the case of Oracle, ":key".
214+ Example:
215+ sqlquery = "select * from mytable where city = %(city)s and date > %(dt)s"
216+ params = {'city': 'Lisbon', 'dt': datetime.datetime(2000, 12, 31)}
217+
218+ If metadata=True, it will instead return a dict containing the rows list and the columns list,
219+ in the format:
220+ { 'cols': [ 'col_a', 'col_b', ...]
221+ , 'rows': [ (a0, b0, ...), (a1, b1, ...), ...] }
222+ """
223+ data = self.browse(cr, uid, ids)
224+ rows, cols = list(), list()
225+ for obj in data:
226+ conn = self.conn_open(cr, uid, obj.id)
227+ if obj.connector in ["sqlite","mysql","mssql"]:
228+ #using sqlalchemy
229+ cur = conn.execute(sqlquery, sqlparams)
230+ if metadata: cols = cur.keys()
231+ rows = [r for r in cur]
232+ else:
233+ #using other db connectors
234+ cur = conn.cursor()
235+ cur.execute(sqlquery, sqlparams)
236+ if metadata: cols = [d[0] for d in cur.description]
237+ rows = cur.fetchall()
238+ conn.close()
239+ if metadata:
240+ return {'cols': cols, 'rows': rows}
241+ else:
242+ return rows
243+
244+ def connection_test(self, cr, uid, ids, context=None):
245+ for obj in self.browse(cr, uid, ids, context):
246+ conn = False
247+ try:
248+ conn = self.conn_open(cr, uid, obj.id)
249+ except Exception, e:
250+ raise osv.except_osv(_("Connection test failed!"), _("Here is what we got instead:\n %s") % tools.ustr(e))
251+ finally:
252+ try:
253+ if conn: conn.close()
254+ except Exception:
255+ # ignored, just a consequence of the previous exception
256+ pass
257+ #TODO: if OK a (wizard) message box should be displayed
258+ raise osv.except_osv(_("Connection test succeeded!"), _("Everything seems properly set up!"))
259+
260+base_external_dbsource()
261
262=== added file 'base_external_dbsource/base_external_dbsource_demo.xml'
263--- base_external_dbsource/base_external_dbsource_demo.xml 1970-01-01 00:00:00 +0000
264+++ base_external_dbsource/base_external_dbsource_demo.xml 2012-09-13 10:12:28 +0000
265@@ -0,0 +1,15 @@
266+<?xml version="1.0"?>
267+<openerp>
268+ <data>
269+
270+ <record model="base.external.dbsource" id="demo_postgre">
271+ <field name="name">PostgreSQL local</field>
272+ <field name="conn_string">dbname='postgres' password=%s</field>
273+ <field name="password">postgresql</field>
274+ <field name="connector">postgresql</field>
275+ </record>
276+
277+ </data>
278+</openerp>
279+
280+
281
282=== added file 'base_external_dbsource/base_external_dbsource_view.xml'
283--- base_external_dbsource/base_external_dbsource_view.xml 1970-01-01 00:00:00 +0000
284+++ base_external_dbsource/base_external_dbsource_view.xml 2012-09-13 10:12:28 +0000
285@@ -0,0 +1,54 @@
286+<?xml version="1.0"?>
287+<openerp>
288+ <data>
289+
290+ <!-- DBSource -->
291+
292+ <record model="ir.ui.view" id="view_dbsource_tree">
293+ <field name="name">base.external.dbsource.tree</field>
294+ <field name="model">base.external.dbsource</field>
295+ <field name="type">tree</field>
296+ <field name="arch" type="xml">
297+ <tree>
298+ <field name="name"/>
299+ <field name="connector"/>
300+ <field name="conn_string"/>
301+ </tree>
302+ </field>
303+ </record>
304+
305+ <record model="ir.ui.view" id="view_dbsource_form">
306+ <field name="name">base.external.dbsource.form</field>
307+ <field name="model">base.external.dbsource</field>
308+ <field name="type">form</field>
309+ <field name="arch" type="xml">
310+ <form>
311+ <field name="name"/>
312+ <field name="password" password="True"/>
313+ <newline/>
314+ <field name="connector" colspan="2"/>
315+ <newline/>
316+ <field name="conn_string" colspan="4"/>
317+ <newline/>
318+ <button name="connection_test" string="Test Connection" type="object" icon="gtk-network" colspan="4"/>
319+ </form>
320+ </field>
321+ </record>
322+
323+ <record model="ir.actions.act_window" id="action_dbsource">
324+ <field name="name">External Database Sources</field>
325+ <field name="res_model">base.external.dbsource</field>
326+ <field name="view_type">form</field>
327+ <field name="view_mode">tree,form</field>
328+ <field name="view_id" ref="view_dbsource_tree"/>
329+ </record>
330+
331+ <menuitem name="Database Sources"
332+ id="menu_dbsource"
333+ parent="base.next_id_15"
334+ action="action_dbsource"/>
335+
336+ </data>
337+</openerp>
338+
339+
340
341=== added directory 'base_external_dbsource/images'
342=== added file 'base_external_dbsource/images/screenshot01.png'
343Binary files base_external_dbsource/images/screenshot01.png 1970-01-01 00:00:00 +0000 and base_external_dbsource/images/screenshot01.png 2012-09-13 10:12:28 +0000 differ
344=== added directory 'base_external_dbsource/security'
345=== added file 'base_external_dbsource/security/ir.model.access.csv'
346--- base_external_dbsource/security/ir.model.access.csv 1970-01-01 00:00:00 +0000
347+++ base_external_dbsource/security/ir.model.access.csv 2012-09-13 10:12:28 +0000
348@@ -0,0 +1,2 @@
349+id,name,model_id:id,group_id:id,perm_read,perm_write,perm_create,perm_unlink
350+access_base_external_dbsource_group_system,base_external_dbsource_group_system,model_base_external_dbsource,base.group_system,1,1,1,1
351
352=== added directory 'base_external_dbsource/test'
353=== added file 'base_external_dbsource/test/dbsource_connect.yml'
354--- base_external_dbsource/test/dbsource_connect.yml 1970-01-01 00:00:00 +0000
355+++ base_external_dbsource/test/dbsource_connect.yml 2012-09-13 10:12:28 +0000
356@@ -0,0 +1,5 @@
357+-
358+ Connect to local Postgres.
359+-
360+ !python {model: base.external.dbsource}: |
361+ self.connection_test(cr, uid, [ref("demo_postgre")])
362
363=== added directory 'import_odbc'
364=== added file 'import_odbc/__init__.py'
365--- import_odbc/__init__.py 1970-01-01 00:00:00 +0000
366+++ import_odbc/__init__.py 2012-09-13 10:12:28 +0000
367@@ -0,0 +1,24 @@
368+# -*- coding: utf-8 -*-
369+##############################################################################
370+#
371+# Daniel Reis
372+# 2011
373+#
374+# This program is free software: you can redistribute it and/or modify
375+# it under the terms of the GNU Affero General Public License as
376+# published by the Free Software Foundation, either version 3 of the
377+# License, or (at your option) any later version.
378+#
379+# This program is distributed in the hope that it will be useful,
380+# but WITHOUT ANY WARRANTY; without even the implied warranty of
381+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
382+# GNU Affero General Public License for more details.
383+#
384+# You should have received a copy of the GNU Affero General Public License
385+# along with this program. If not, see <http://www.gnu.org/licenses/>.
386+#
387+##############################################################################
388+
389+import import_odbc
390+
391+# vim:expandtab:smartindent:tabstop=4:softtabstop=4:shiftwidth=4:
392
393=== added file 'import_odbc/__openerp__.py'
394--- import_odbc/__openerp__.py 1970-01-01 00:00:00 +0000
395+++ import_odbc/__openerp__.py 2012-09-13 10:12:28 +0000
396@@ -0,0 +1,84 @@
397+# -*- coding: utf-8 -*-
398+##############################################################################
399+#
400+# Daniel Reis
401+# 2011
402+#
403+# This program is free software: you can redistribute it and/or modify
404+# it under the terms of the GNU Affero General Public License as
405+# published by the Free Software Foundation, either version 3 of the
406+# License, or (at your option) any later version.
407+#
408+# This program is distributed in the hope that it will be useful,
409+# but WITHOUT ANY WARRANTY; without even the implied warranty of
410+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
411+# GNU Affero General Public License for more details.
412+#
413+# You should have received a copy of the GNU Affero General Public License
414+# along with this program. If not, see <http://www.gnu.org/licenses/>.
415+#
416+##############################################################################
417+
418+{
419+ 'name': 'Import data from SQL and ODBC data sources.',
420+ 'version': '61.3',
421+ 'category': 'Tools',
422+ 'description': """
423+Import data directly from other databases.
424+
425+Installed in the Administration module, under the menu Configuration > Import from SQL.
426+
427+Features:
428+ * Data fetched from the databases is used to build lines equivalent to regular import files. These are imported using the standard "import_data()" ORM method, benefiting from all its features, including xml_ids.
429+ * Each table import is defined by an SQL statement, used to build the equivalent of an import file. Each column's name should match the column names you would use in an import file. The first column must provide a unique identifier for the record, and will be used to build its xml_id.
430+ * SQL columns named "none" are ignored. This can be used for the first column of the SQL, so that it is used to build the xml_id but is not imported into any OpenERP field.
431+ * The last sync date (the date of the last successful execution) can be used in the SQL using "%(sync)s", or ":sync" in the case of Oracle.
432+ * When errors are found, only the records with errors fail to import; the correct records are still committed. However, the "last sync date" will only be automatically updated when no errors are found.
433+ * The import execution can be scheduled to run automatically.
434+
435+Examples:
436+ * Importing suppliers to res.partner:
437+ SELECT distinct
438+ [SUPPLIER_CODE] as "ref"
439+ , [SUPPLIER_NAME] as "name"
440+ , 1 as "is_supplier"
441+ , [INFO] as "comment"
442+ FROM T_SUPPLIERS
443+ WHERE INACTIVE_DATE IS NULL and DATE_CHANGED >= %(sync)s
444+
445+ * Importing products to product.product:
446+ SELECT PRODUCT_CODE as "ref"
447+ , PRODUCT_NAME as "name"
448+ , 'res_partner_id_'+SUPPLIER_ID as "partner_id/id"
449+ FROM T_PRODUCTS
450+ WHERE DATE_CHANGED >= %(sync)s
451+
452+Improvement ideas waiting for a contributor:
453+ * Allow importing one2many fields (currently not supported), perhaps by adding a second SQL statement to fetch the child record list?
454+ * Allow "import sets" that can be executed at different time intervals using different scheduler jobs.
455+ * Allow inactivating/deleting OpenERP records that are no longer present in the SQL result set.
456+ """,
457+ 'author': 'Daniel Reis',
458+ 'website': 'http://launchpad.net/addons-tko',
459+ 'images': [
460+ 'images/snapshot1.png',
461+ 'images/snapshot2.png',
462+ ],
463+ 'depends': [
464+ 'base',
465+ 'base_external_dbsource',
466+ ],
467+ 'init': [],
468+ 'data': [
469+ 'import_odbc_view.xml',
470+ 'security/ir.model.access.csv',
471+ ],
472+ 'demo': [
473+ 'import_odbc_demo.xml',
474+ ],
475+ 'test': [],
476+ 'installable': True,
477+ 'active': False,
478+}
479+
480+# vim:expandtab:smartindent:tabstop=4:softtabstop=4:shiftwidth=4:
481
482=== added directory 'import_odbc/images'
483=== added file 'import_odbc/images/snapshot1.png'
484Binary files import_odbc/images/snapshot1.png 1970-01-01 00:00:00 +0000 and import_odbc/images/snapshot1.png 2012-09-13 10:12:28 +0000 differ
485=== added file 'import_odbc/images/snapshot2.png'
486Binary files import_odbc/images/snapshot2.png 1970-01-01 00:00:00 +0000 and import_odbc/images/snapshot2.png 2012-09-13 10:12:28 +0000 differ
487=== added file 'import_odbc/import_odbc.py'
488--- import_odbc/import_odbc.py 1970-01-01 00:00:00 +0000
489+++ import_odbc/import_odbc.py 2012-09-13 10:12:28 +0000
490@@ -0,0 +1,214 @@
491+# -*- coding: utf-8 -*-
492+##############################################################################
493+#
494+# Daniel Reis
495+# 2011
496+#
497+# This program is free software: you can redistribute it and/or modify
498+# it under the terms of the GNU Affero General Public License as
499+# published by the Free Software Foundation, either version 3 of the
500+# License, or (at your option) any later version.
501+#
502+# This program is distributed in the hope that it will be useful,
503+# but WITHOUT ANY WARRANTY; without even the implied warranty of
504+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
505+# GNU Affero General Public License for more details.
506+#
507+# You should have received a copy of the GNU Affero General Public License
508+# along with this program. If not, see <http://www.gnu.org/licenses/>.
509+#
510+##############################################################################
511+
512+import sys
513+from datetime import datetime
514+from osv import fields, osv
515+import logging
516+_logger = logging.getLogger(__name__)
517+_loglvl = _logger.getEffectiveLevel()
518+SEP = '|'
519+
520+class import_odbc_dbtable(osv.osv):
521+ _name="import.odbc.dbtable"
522+ _description = 'Import Table Data'
523+ _order = 'exec_order'
524+ _columns = {
525+ 'name': fields.char('Datasource name', required=True, size=64),
526+ 'enabled': fields.boolean('Execution enabled'),
527+ 'dbsource_id': fields.many2one('base.external.dbsource', 'Database source', required=True),
528+ 'sql_source': fields.text('SQL', required=True, help='Column names must be valid "import_data" columns.'),
529+ 'model_target': fields.many2one('ir.model','Target object'),
530+ 'noupdate': fields.boolean('No updates', help="Only create new records; disable updates to existing records."),
531+ 'exec_order': fields.integer('Execution order', help="Defines the order to perform the import"),
532+ 'last_sync': fields.datetime('Last sync date', help="Datetime of the last successful sync. Later changes on the source may not be replicated to the destination."),
533+ 'start_run': fields.datetime('Time started', readonly=True),
534+ 'last_run': fields.datetime('Time ended', readonly=True),
535+ 'last_record_count': fields.integer('Last record count', readonly=True),
536+ 'last_error_count': fields.integer('Last error count', readonly=True),
537+ 'last_warn_count': fields.integer('Last warning count', readonly=True),
538+ 'last_log': fields.text('Last run log', readonly=True),
539+ 'ignore_rel_errors': fields.boolean('Ignore relationship errors',
540+ help = "On error try to reimport rows ignoring relationships."),
541+ 'raise_import_errors': fields.boolean('Raise import errors',
542+ help = "Import errors not handled, intended for debugging purposes."
543+ + "\nAlso forces debug messages to be written to the server log."),
544+ }
545+ _defaults = {
546+ 'enabled': True,
547+ 'exec_order': 10,
548+ }
549+
550+ def _import_data(self, cr, uid, flds, data, model_obj, table_obj, log):
551+ """Import one data row; log any error and return False on failure, True otherwise."""
552+
553+ def find_m2o(field_list):
554+ """"Find index of first column with a one2many field"""
555+ for i, x in enumerate(field_list):
556+ if len(x)>3 and x[-3:] == ':id' or x[-3:] == '/id':
557+ return i
558+ return -1
559+
560+ def append_to_log(log, level, obj_id = '', msg = '', rel_id = ''):
561+ if '_id_' in obj_id:
562+ obj_id = '.'.join(obj_id.split('_')[:-2]) + ': ' + obj_id.split('_')[-1]
563+ if ': .' in msg and not rel_id:
564+ rel_id = msg[msg.find(': .')+3:]
565+ if '_id_' in rel_id:
566+ rel_id = '.'.join(rel_id.split('_')[:-2]) + ': ' + rel_id.split('_')[-1]
567+ msg = msg[:msg.find(': .')]
568+ log['last_log'].append('%s|%s\t|%s\t|%s' % (level.ljust(5), obj_id, rel_id, msg))
569+
570+
571+ _logger.debug( data )
572+ cols = list(flds) #copy to avoid side effects
573+ errmsg = str()
574+ if table_obj.raise_import_errors:
575+ model_obj.import_data(cr, uid, cols, [data], noupdate=table_obj.noupdate)
576+ else:
577+ try:
578+ model_obj.import_data(cr, uid, cols, [data], noupdate=table_obj.noupdate)
579+ except:
580+ errmsg = str(sys.exc_info()[1])
581+
582+ if errmsg and not table_obj.ignore_rel_errors:
583+ #Fail
584+ append_to_log(log, 'ERROR', data, errmsg )
585+ log['last_error_count'] += 1
586+ return False
587+ if errmsg and table_obj.ignore_rel_errors:
588+ #Warn and retry ignoring many2one fields...
589+ append_to_log(log, 'WARN', data, errmsg )
590+ log['last_warn_count'] += 1
591+ #Try ignoring each many2one (tip: in the SQL sentence select more problematic FKs first)
592+ i = find_m2o(cols)
593+ if i >= 0:
594+ #Try again without the [i] column
595+ del cols[i]
596+ del data[i]
597+ return self._import_data(cr, uid, cols, data, model_obj, table_obj, log)
598+ else:
599+ #Fail
600+ append_to_log(log, 'ERROR', data, 'Removed all m2o keys and still fails.' )
601+ log['last_error_count'] += 1
602+ return False
603+
604+ return True
605+
606+
607+ def import_run(self, cr, uid, ids=None, context=None):
608+
609+ db_model = self.pool.get('base.external.dbsource')
610+ actions = self.read(cr, uid, ids, ['id', 'exec_order'])
611+ actions.sort(key = lambda x:(x['exec_order'], x['id']))
612+
613+ #Consider each dbtable:
614+ for action_ref in actions:
615+
616+ obj = self.browse(cr, uid, action_ref['id'])
617+ if not obj.enabled: continue #skip
618+
619+ _logger.setLevel(obj.raise_import_errors and logging.DEBUG or _loglvl)
620+ _logger.debug('Importing %s...' % obj.name)
621+
622+ #now() microseconds are stripped to avoid problem with SQL smalldate
623+ #TODO: convert UTC Now to local timezone (http://stackoverflow.com/questions/4770297/python-convert-utc-datetime-string-to-local-datetime)
624+ model_name = obj.model_target.model
625+ model_obj = self.pool.get(model_name)
626+ xml_prefix = model_name.replace('.', '_') + "_id_"
627+ log = {'start_run': datetime.now().replace(microsecond=0),
628+ 'last_run': None,
629+ 'last_record_count': 0,
630+ 'last_error_count': 0,
631+ 'last_warn_count': 0,
632+ 'last_log': list()}
633+ self.write(cr, uid, [obj.id], log)
634+
635+ #Prepare SQL statement; the "%(sync)s" placeholder is filled with the last_sync date
636+ if obj.last_sync: sync = datetime.strptime(obj.last_sync, "%Y-%m-%d %H:%M:%S")
637+ else: sync = datetime(1900, 1, 1, 0, 0, 0)
638+ params = {'sync': sync}
639+ res = db_model.execute(cr, uid, [obj.dbsource_id.id], obj.sql_source, params, metadata=True)
640+
641+ #Exclude columns titled "None"; add (xml_)"id" column
642+ cidx = [i for i, x in enumerate(res['cols']) if x.upper() != 'NONE']
643+ cols = [x for i, x in enumerate(res['cols']) if x.upper() != 'NONE'] + ['id']
644+
645+ #Import each row:
646+ for row in res['rows']:
647+ #Build data row; import only columns present in the "cols" list
648+ data = list()
649+ for i in cidx:
650+ #TODO: Handle imported datetimes properly - convert from localtime to UTC!
651+ v = row[i]
652+ if isinstance(v, str): v = v.strip()
653+ data.append(v)
654+ data.append( xml_prefix + str(row[0]).strip() )
655+
656+ #Import the row; on error, write line to the log
657+ log['last_record_count'] += 1
658+ self._import_data(cr, uid, cols, data, model_obj, obj, log)
659+ if log['last_record_count'] % 500 == 0:
660+ _logger.info('...%s rows processed...' % (log['last_record_count']) )
661+
662+ #Finished importing all rows
663+ #If no errors, write new sync date
664+ if not (log['last_error_count'] or log['last_warn_count']):
665+ log['last_sync'] = log['start_run']
666+ level = logging.DEBUG
667+ if log['last_warn_count']: level = logging.WARN
668+ if log['last_error_count']: level = logging.ERROR
669+ _logger.log(level, 'Imported %s , %d rows, %d errors, %d warnings.' % (
670+ model_name, log['last_record_count'], log['last_error_count'] ,
671+ log['last_warn_count'] ) )
672+ #Write the run log, whether the table import is active or inactive
673+ if log['last_log']:
674+ log['last_log'].insert(0, 'LEVEL|== Line == |== Relationship ==|== Message ==')
675+ log.update( {'last_log': '\n'.join(log['last_log'])} )
676+ log.update({ 'last_run': datetime.now().replace(microsecond=0) }) #second=0,
677+ self.write(cr, uid, [obj.id], log)
678+ #Finished
679+ _logger.debug('Import job FINISHED.')
680+ return True
681+
682+
683+ def import_schedule(self, cr, uid, ids, context=None):
684+ cron_obj = self.pool.get('ir.cron')
685+ new_create_id = cron_obj.create(cr, uid, {
686+ 'name': 'Import ODBC tables',
687+ 'interval_type': 'hours',
688+ 'interval_number': 1,
689+ 'numbercall': -1,
690+ 'model': 'import.odbc.dbtable',
691+ 'function': 'import_run',
692+ 'doall': False,
693+ 'active': True
694+ })
695+ return {
696+ 'name': 'Import ODBC tables',
697+ 'view_type': 'form',
698+ 'view_mode': 'form,tree',
699+ 'res_model': 'ir.cron',
700+ 'res_id': new_create_id,
701+ 'type': 'ir.actions.act_window',
702+ }
703+
704+import_odbc_dbtable()
705
706=== added file 'import_odbc/import_odbc_demo.xml'
707--- import_odbc/import_odbc_demo.xml 1970-01-01 00:00:00 +0000
708+++ import_odbc/import_odbc_demo.xml 2012-09-13 10:12:28 +0000
709@@ -0,0 +1,15 @@
710+<?xml version="1.0"?>
711+<openerp>
712+ <data>
713+
714+ <record model="import.odbc.dbtable" id="demo_postgresql_users">
715+ <field name="name">Users from PostgreSQL </field>
716+ <field name="dbsource_id" ref="base_external_dbsource.demo_postgre"/>
717+ <field name="sql_source">select usename as "login", usename as "name" from pg_catalog.pg_user</field>
718+ <field name="model_target" ref="base.model_res_users"/>
719+ </record>
720+
721+ </data>
722+</openerp>
723+
724+
725
726=== added file 'import_odbc/import_odbc_view.xml'
727--- import_odbc/import_odbc_view.xml 1970-01-01 00:00:00 +0000
728+++ import_odbc/import_odbc_view.xml 2012-09-13 10:12:28 +0000
729@@ -0,0 +1,86 @@
730+<?xml version="1.0"?>
731+<openerp>
732+ <data>
733+
734+ <!-- Table form -->
735+
736+ <record model="ir.ui.view" id="view_import_dbtable_form">
737+ <field name="name">import.odbc.dbtable.form</field>
738+ <field name="model">import.odbc.dbtable</field>
739+ <field name="type">form</field>
740+ <field name="arch" type="xml">
741+ <form>
742+ <field name="name" search="1"/>
743+ <field name="exec_order"/>
744+ <field name="model_target"/>
745+ <field name="dbsource_id" search="1"/>
746+ <field name="noupdate"/>
747+ <field name="enabled"/>
748+ <field name="ignore_rel_errors"/>
749+ <field name="raise_import_errors"/>
750+ <field name="last_sync"/>
751+ <group colspan="2">
752+ <button name="import_run" string="Run Import" type="object" icon="gtk-execute"/>
753+ <button name="import_schedule" string="Schedule Import" type="object" icon="gtk-paste"/>
754+ </group>
755+ <field name="sql_source" colspan="4"/>
756+ <separator string="Last execution" colspan="4"/>
757+ <field name="last_record_count"/>
758+ <field name="start_run"/>
759+ <field name="last_warn_count"/>
760+ <field name="last_run"/>
761+ <field name="last_error_count"/>
762+ <field name="last_log" colspan="4"/>
763+ </form>
764+ </field>
765+ </record>
766+
767+ <!-- Table Tree -->
768+
769+ <record id="view_import_dbtable_tree" model="ir.ui.view">
770+ <field name="name">import.odbc.dbtable.tree</field>
771+ <field name="model">import.odbc.dbtable</field>
772+ <field name="type">tree</field>
773+ <field name="arch" type="xml">
774+ <tree colors="grey: enabled==False; red:last_error_count&gt;0; blue:last_warn_count&gt;0">
775+ <field name="exec_order"/>
776+ <field name="name"/>
777+ <field name="model_target"/>
778+ <field name="dbsource_id"/>
779+ <field name="enabled"/>
780+ <field name="last_run"/>
781+ <field name="last_sync"/>
782+ <field name="last_record_count"/>
783+ <field name="last_error_count"/>
784+ <field name="last_warn_count"/>
785+ </tree>
786+ </field>
787+</record>
788+
789+
790+<!-- Tree Search -->
791+ <record id="view_import_dbtable_filter" model="ir.ui.view">
792+ <field name="name">import.odbc.dbtable.filter</field>
793+ <field name="model">import.odbc.dbtable</field>
794+ <field name="type">search</field>
795+ <field name="arch" type="xml">
796+ <search string="Search ODBC Imports">
797+ <field name="name"/>
798+ <field name="dbsource_id"/>
799+ <field name="model_target"/>
800+ </search>
801+ </field>
802+ </record>
803+
804+<!--Menu-->
805+ <record model="ir.actions.act_window" id="action_import_dbtable">
806+ <field name="name">Import from SQL</field>
807+ <field name="res_model">import.odbc.dbtable</field>
808+ <field name="view_type">form</field>
809+ </record>
810+ <menuitem name="Import from SQL" id="menu_import_dbtable" parent="base.next_id_15" action="action_import_dbtable"/>
811+</data>
812+</openerp>
813+
814+
815+
816
817=== added directory 'import_odbc/security'
818=== added file 'import_odbc/security/ir.model.access.csv'
819--- import_odbc/security/ir.model.access.csv 1970-01-01 00:00:00 +0000
820+++ import_odbc/security/ir.model.access.csv 2012-09-13 10:12:28 +0000
821@@ -0,0 +1,2 @@
822+id,name,model_id:id,group_id:id,perm_read,perm_write,perm_create,perm_unlink
823+access_import_odbc_dbsource_group_system,import_odbc_dbtable_group_system,model_import_odbc_dbtable,base.group_system,1,1,1,1